I actually just mocked this up via FX Factory using the Quartz Composer plugin and, being employed by Noise Industries and knowing enough about FCP and FX Factory, I am actually surprised, nay… amazed, that this appears to remotely, possibly, kind of work at all.
This is a huge bag of worms. If you don’t want to deal with oddness, walk away now.
This has only been tested in FCP. I suspect AE will fail horribly. Wait. Fuck me. It actually kind of works. Maybe. Sort of. Motion will probably behave similarly. Notice all the caveats with this?
Install the Syphon QC plugin: http://syphon.v002.info
Install FX Factory: http://noiseindustries.com (get the latest).
Install the FX Pack: http://syphon.v002.info/downloads/testing/Syphon.fxpack.zip
Now, in order to ensure that this works (this is beyond alpha, not supported, not sanctioned by us, Noise Industries, logic, or reality at all, and thus will probably break in 30 seconds, butttt… that said):
1) Ensure that FCP is using Dynamic RT on your sequence (in the tab all the way on the left):
Video Playback Quality should be set to Dynamic
Video Playback Frame Rate should be set to Full
To get video *in* to FCP:
Generators -> Syphon Client -> drop it into the timeline. Go to the controls and set the App name and Server name appropriately. Hit play, and you should see a preview of what is coming in from Syphon. If you want to do the equivalent of 'capturing', just hit Render, and it will grab the frames from your other app in realtime. It *will drop frames*, since rendering is not *capturing* from a frame source, but the act of rendering will effectively cache the video (until you render again, or change something). This is so incredibly stupid and fragile and I don't know why you would want to do it, but there it is.
To get video out of FCP:
If you want to “publish” a whole sequence, make a *new* sequence, and then nest the sequence you want to publish into this new sequence, effectively making a long clip out of it. Then add Effects -> Syphon -> Syphon Server. Play your sequence back and you should be able to see it in a client. Again… no idea why you would want to do that either.
The same logic / workflow basically works for AE, except that AE seems to destroy the plugin instance (and thus stop the sending of data) when playback stops. See what I meant about ‘kind of works’?
But you can, kind of, in the most hacky way possible, barely get something functional out of these apps, which are not, remotely, designed for this sort of thing.
A *much* better way of doing this would be to have some sort of Syphon QuickTime output and capture component, so that a Syphon server would look like a video output device to FCP, and a published source from another app would look like a capture source/camera to FCP. These are, most likely, not things we ourselves will get around to making.
That said, praise the lord, this kind of works. Again: I, Tom, the project, the team, Jesus in heaven (who does not even exist) and $deity do not support this. This is the work of the devil. Who is also imaginary. Don’t ask. I almost already regret sharing, and creating, this abomination…