Hey guys, thanks for the replies and suggestions, and thank you kindly vade for your terrifying yet very informative and enthusiastic solution!
Let me explain why I am doing this so it doesn’t seem so insane:
I am working with an editor who only works in FCP, who is cutting an edit for projection onto a 3D object. They want to see it projected in real time, at the installation site, as they are editing it.
In order for the projection to work, I have to perform a 3D meshwarp in Quartz Composer, which warps the video so it aligns properly with the roughly spherical object they are projecting onto. Since the projector setup is very wacky, involving mirrors, etc., this warp cannot be baked into the source footage; it has to happen on the way to the projector.
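For anyone curious what a meshwarp actually does, here is a minimal CPU sketch of the idea in Python/NumPy. This is not the Quartz Composer implementation described above, just an illustration under one common formulation: a coarse control grid stores, for each control point, the source coordinate it should sample from; the grid is bilinearly interpolated up to full output resolution and the source frame is sampled through it. A real-time version would do the same interpolation on the GPU with a textured mesh, which is essentially what a Quartz Composer mesh patch drives.

```python
import numpy as np

def mesh_warp(src, grid):
    """Warp a 2D image through a coarse control grid.

    src:  (H, W) source image.
    grid: (gh, gw, 2) array; grid[i, j] is the (y, x) source
          coordinate that control point (i, j) samples from.

    The control grid is bilinearly interpolated to the full output
    resolution, then src is sampled with nearest-neighbour lookup.
    """
    H, W = src.shape[:2]
    gh, gw = grid.shape[:2]

    # Position of every output pixel, measured in grid-cell units.
    ys = np.linspace(0, gh - 1, H)
    xs = np.linspace(0, gw - 1, W)
    gy, gx = np.meshgrid(ys, xs, indexing="ij")

    # Integer cell index and fractional position inside the cell.
    y0 = np.clip(np.floor(gy).astype(int), 0, gh - 2)
    x0 = np.clip(np.floor(gx).astype(int), 0, gw - 2)
    fy = gy - y0
    fx = gx - x0

    # Bilinear blend of the four surrounding control points.
    c00 = grid[y0, x0]
    c01 = grid[y0, x0 + 1]
    c10 = grid[y0 + 1, x0]
    c11 = grid[y0 + 1, x0 + 1]
    coord = (c00 * ((1 - fy) * (1 - fx))[..., None]
             + c01 * ((1 - fy) * fx)[..., None]
             + c10 * (fy * (1 - fx))[..., None]
             + c11 * (fy * fx)[..., None])

    # Sample the source image at the warped coordinates.
    sy = np.clip(np.round(coord[..., 0]).astype(int), 0, H - 1)
    sx = np.clip(np.round(coord[..., 1]).astype(int), 0, W - 1)
    return src[sy, sx]
```

Pulling the corner coordinates inward distorts the image toward the centre, which is the same mechanism used to pre-distort a frame so it lands flat on a curved surface; with an identity grid (corners mapped to the image corners) the warp is a no-op.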
Right now I have it set up so that I run the signal out of FCP at 1920×1080 via DVI-to-HDMI, into a Matrox MXO2 Mini on another computer. That computer warps the video in Quartz and outputs it with the resolution kept intact. There have been some color space problems with the MXO2 Mini, though, and they would really like it all to run on one computer, which is how I came across you guys and Syphon as a possible frame-sharing solution.
Anyhow, I hope that explains what I am doing. It's not so much crazy as rare and ahead of the curve. If this could be developed it would be a great tool for real-time projection mapping: you could do the mapping in Quartz and create all the content layers and elements in other video applications, on the spot!
I never get enough time to render my mapping projects, and the content suffers for it; with this real-time approach it could be much better.