Syphon for Final Cut Pro?


Viewing 12 posts - 1 through 12 (of 12 total)

    Hey guys, I am working on a projection design for a video mapping project. I would like to use Syphon to capture my Final Cut Pro output so we can edit content while simultaneously processing that same content in Quartz Composer.

    Is there any way to frame-share via Syphon between Quartz Composer and Final Cut Pro?

    Much appreciated!

    Ryan Uzi


    What do you mean by “output” from Final Cut Pro? FCP’s output is usually a file, often rendered at less than real-time.

    If you’d like to send clips and sequences to QC through Syphon, it is probably possible if you do some coding and write an FxPlug that renders in real time. But it would only work while you were playing those clips/sequences in FCP, and only if FCP decided to render in real time (which depends on the system, the codec, and other variables).

    FCP is an awkward choice for real-time video editing. Do you have to use it?


    I actually just mocked this up via FX Factory using the Quartz Composer plugin and, being employed by Noise Industries and knowing enough about FCP and FX Factory, I am actually surprised, nay… amazed, that this appears to even remotely, possibly, kind of work at all.

    This is a huge bag of worms. If you don’t want to deal with oddness, walk away now.

    This has only been tested in FCP. I suspect AE will fail horribly. Wait. Fuck me. It actually kind of works. Maybe. Sort of. Motion will probably behave similarly. Notice all the caveats with this?

    Install the Syphon QC plugin:
    Install FX Factory: (get the latest).
    Install the FX Pack:

    Now, in order to ensure that this works (this is beyond alpha; not supported, not sanctioned by us, Noise Industries, logic, or reality at all, and thus will probably break in 30 seconds, butttt… that said):

    1) Ensure that FCP is using Dynamic RT on your sequence (in the tab all the way on the left):
    Video Playback Quality should be set to Dynamic
    Video Playback Frame Rate should be set to Full

    To get video *in* to FCP:
    Generators -> Syphon Client -> drop into the timeline. Go to Controls and set the App name and Server name appropriately. Hit play and you should see a preview of what is coming in from Syphon. If you want to do the equivalent of ‘capturing’, just hit Render, and it will grab the frames from your other app in real time. It *will drop frames*, since rendering is not *capturing* from a frame source. But the act of rendering will effectively cache the video (until you render again, or change something). This is so incredibly stupid and fragile, and I don’t know why you would want to do it, but there it is.

    To get video out of FCP:
    If you want to “publish” a whole sequence, you should make a *new* sequence, and then nest the sequence you want to publish into this new sequence, effectively making one long clip out of it. Then add Effects -> Syphon -> Syphon Server. Play your sequence back and you should be able to see it in a client. Also… no idea why you would want to do that.

    The same logic / workflow basically works for AE, except that AE seems to destroy the plugin instance (and thus the sending of data) when it stops playing. See what I meant about ‘kind of works’ ?

    But you can, kind of, in the most hacky way possible, barely get something functional out of these apps, which are not remotely designed for this sort of thing.

    A *much* better way of doing this would be to have some sort of Syphon QuickTime capture and camera component, so that a Syphon server would look like a video output device to FCP, and a published source from another app would look like a capture source/camera to FCP. These are, most likely, not things we ourselves will get around to making.

    That said, praise the lord, this kind of works. Again, I, Tom, the project, the team, jesus in heaven (who does not even exist) and $deity do not support this. This is the work of the devil. Who also is imaginary. Don’t ask. I almost already regret sharing, and creating this abomination…


    Hey guys, thanks for the reply and suggestions, and thank you kindly, vade, for your terrifying yet very informative and enthusiastic solution!

    Let me explain why I am doing this so it doesn’t seem so insane:

    I am working with an editor who only works in FCP, who is cutting an edit for a projection onto a 3D object. They want to see it projected as they are editing it, in realtime at the site of the installation.

    In order for the projection to work, I have to perform a 3D meshwarp in Quartz Composer which warps the video so it aligns properly with the kind of spherical object they are projecting on. Since the projector setup is very wacky, involving mirrors, etc., this warp cannot be baked into the source footage; it has to happen on its way to the projector.
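    The meshwarp idea described here can be sketched in a few lines: a coarse control grid maps output coordinates back to source coordinates, and each output pixel is bilinearly interpolated through that grid. The sketch below is a hypothetical, minimal illustration of the technique (plain Python, illustrative names), not the actual Quartz Composer patch:

```python
# Minimal mesh-warp sketch: a control grid maps normalized output
# coordinates (u, v) back to normalized source coordinates.
# Names and grid layout here are illustrative, not from the QC patch.

def bilerp(a, b, c, d, fx, fy):
    """Bilinear interpolation between four corner values."""
    top = a + (b - a) * fx
    bot = c + (d - c) * fx
    return top + (bot - top) * fy

def warp(u, v, grid):
    """Map output (u, v) in [0,1]^2 through a rows x cols control grid.

    grid[r][c] is an (x, y) source coordinate; adjacent grid points
    define the quad that output coordinates inside that cell sample from.
    """
    rows, cols = len(grid), len(grid[0])
    # Locate the grid cell containing (u, v), clamped to the last cell.
    gx = min(u * (cols - 1), cols - 1 - 1e-9)
    gy = min(v * (rows - 1), rows - 1 - 1e-9)
    c0, r0 = int(gx), int(gy)
    fx, fy = gx - c0, gy - r0
    p00, p10 = grid[r0][c0], grid[r0][c0 + 1]
    p01, p11 = grid[r0 + 1][c0], grid[r0 + 1][c0 + 1]
    x = bilerp(p00[0], p10[0], p01[0], p11[0], fx, fy)
    y = bilerp(p00[1], p10[1], p01[1], p11[1], fx, fy)
    return x, y

# An identity 2x2 grid leaves coordinates unchanged; perturbing its
# points is what bends the image to fit a curved projection surface.
identity = [[(0.0, 0.0), (1.0, 0.0)],
            [(0.0, 1.0), (1.0, 1.0)]]
print(warp(0.5, 0.5, identity))  # -> (0.5, 0.5)
```

    In practice a denser grid (and GPU interpolation) gives the smooth warp; the point is only that the warp is a per-pixel coordinate lookup, which is why it must sit between the source footage and the projector rather than inside the footage.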

    Right now I have it set up where I am running the signal out of FCP at 1920×1080 via DVI to HDMI, into a Matrox MXO2 Mini on another computer. That computer warps and outputs the video via Quartz, keeping the resolution intact. There have been some color space problems with the MXO2 Mini, and they would really like it all to work on one computer, so that is how I came across you guys and Syphon as a possible solution for frame sharing.

    Anyhow, I hope that explains what I am doing. It’s not so much crazy as rare and ahead of the curve – and if this could be developed, it could be a great tool for real-time projection mapping, where you do the mapping in Quartz and create all the content layers and elements in other video applications on the spot!

    I never get enough time to render my mapping projects and it makes the content suffer – but with this real-time approach it could be much better.

    Any thoughts?

    Ryan Uzi


    I don’t see why the MXO2 Mini would not work in that situation. I do something similar with mine to capture input from another machine.

    Ensure you are capturing via HDMI, and that you shut the MXO2 down and do a full reboot if you have issues. That seems to fix random YUV vs. RGB interpretation issues. That said, the Syphon setup above ought to work going from FCP to QC. Have you tried it? I suspected you would want it for something like mapping. Seems appropriate. I think it should work in that light. Follow those instructions and apply the effect to a nested sequence.
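    The “YUV vs. RGB interpretation” problem mentioned here comes down to which conversion matrix the capture path applies to the incoming frames. As a rough illustration only (full-range BT.709 coefficients; the MXO2’s actual pipeline and ranges may well differ), here is what one such conversion looks like:

```python
def yuv709_to_rgb(y, cb, cr):
    """Convert one full-range BT.709 Y'CbCr sample to R'G'B' (0-255).

    If hardware hands over BT.601-flavoured Y'CbCr and the software
    applies a BT.709 matrix (or vice versa, or treats YUV data as RGB),
    colors shift -- one plausible cause of the 'random YUV vs RGB'
    symptoms described above. Coefficients are standard BT.709 values.
    """
    r = y + 1.5748 * (cr - 128)
    g = y - 0.1873 * (cb - 128) - 0.4681 * (cr - 128)
    b = y + 1.8556 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Neutral chroma survives any matrix, which is why grays look fine
# even when saturated colors come out visibly wrong:
print(yuv709_to_rgb(128, 128, 128))  # -> (128, 128, 128)
```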


    Hey, thanks very much for that. The nested sequence and the Final Cut Pro plugin you have for Syphon are a great idea. I’m going to try it out!

    One problem with the computer-to-computer video patch from vade’s MXO2 Mini tutorial is that I have not yet been able to get the Matrox MXO2 recognized in Quartz Composer 4.0; the video input source is stuck on default and grayed out… it’s very sad. Would anyone have any tips on that? Very much appreciated!

    Ryan Uzi


    Run QC in 32-bit mode (Google to find out how). 64-bit QuickTime “sees” fewer video digitizers, and the MXO may show up only in 32-bit mode.



    I just found this topic.

    I would like to do something similar:
    Output an FCP (7) sequence into Max/Jitter from Cycling ’74.
    I put the Syphon Server effect onto my sequence in FCP, but cannot see it inside Max. Is there anything I have not taken care of?



    You have to play the sequence – and this plugin isn’t supported – so you’re kind of on your own. See if Simple Client picks it up.


    Hi Vade.

    I’ve used the FCP Syphon plugin, and it has been invaluable for doing stereoscopic 3D on-stage editing via QC.

    I can’t see any FCPX version.
    Any chance of one, or a way to bake one myself?

    Only found this reference online:
    “If you are using FCPX, you are out of luck”

    Thanks for your work (and help) through the years!


    This really isn’t supported. You might be able to roll something yourself via FX Factory and the Syphon QC plugin, but it’s not going to work as well as something authored to be persistent as a Syphon output.

    It is, frankly, not on our radar at all right now. You could request some development from a 3rd party, or attempt to write something yourself.

    But, to be short, we don’t plan on supporting this or even attempting to test it. The current FCP plugin is a jury-rigged affair that does not work as well as it should, and thus, as documented, is to be avoided. Avoid it unless you want to pull your hair out.

    Sorry about that.


    Thanks for following up.

    I know about the lack of support, but I’ve spent hundreds of hours with the FCP 7 plugin, and I would not have gotten done the stuff I’ve done the last couple of years without it, even with its kinks.

    I work with multiple stereoscopic rigs on the floor and back wall, producing stage plays with actors standing in the 3D landscape. There is a lot of adjustment that would have taken *loads* more rendering if I had not at least had the frame preview I get out of the plugin.

    How else could I edit across 6 projectors on stage directly from FCP?

    Back when you first released it, I guessed the mapping community would have immense use for it: editing over multiple outputs right onto a stage, wall, objects, or buildings.

