vade

Forum Replies Created

Viewing 20 posts - 161 through 180 (of 529 total)
    in reply to: Mirroring texture horizontally and vertically #26478
    vade
    Keymaster

    Just bind the shader while you're drawing in the FBO, and unbind it when you're done. There's no reason at all to hit the CPU for any of what you need.
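
    A minimal sketch of what I mean, with placeholder names: mirrorProgram is a GLSL program you've already compiled that flips the texture coordinates, server is your SyphonServer, and frameSize is your output size. None of this beyond the bind/unbind calls is Syphon API.

        [server bindToDrawFrameOfSize:frameSize];       // now drawing into Syphon's FBO

        GLint previousProgram = 0;
        glGetIntegerv(GL_CURRENT_PROGRAM, &previousProgram);
        glUseProgram(mirrorProgram);                    // bind the mirroring shader
        // ... draw your textured quad as usual ...
        glUseProgram((GLuint)previousProgram);          // unbind / restore whatever was bound before

        [server unbindAndPublish];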

    in reply to: Mirroring texture horizontally and vertically #26282
    vade
    Keymaster

    This is easily done in OpenGL via the server's

    bindToDrawFrameOfSize: and unbindAndPublish

    methods (see http://syphon.v002.info/FrameworkDocumentation/interface_syphon_server.html#ab5da335ea3e45903eceae51adb363240 ).

    When you call bindToDrawFrameOfSize:, the Syphon framework binds an internally used and managed frame buffer object, which is attached to the 'texture' / surface it will share. You are then responsible for drawing your scene with OpenGL as normal (assuming you are drawing directly into the frame buffer object we've attached). You then call unbindAndPublish on the server, and any drawing you've done will be sent off via Syphon.

    You can then get the SyphonImage from the server you just made, and draw it to your own OpenGL view like you would any texture.

    Essentially you would:

    * Set up your OpenGL context.
    * Set up your Syphon server.
    * Set up your resources.

    In your render loop:

    * Attach / make your context current.

    * Call bindToDrawFrameOfSize: on the server (you are now drawing "into" Syphon).

    * Draw your OpenGL content, modifying the vertices and texture coordinates of your geometry to achieve the desired effect, or use a GLSL shader, or any number of appropriate methods.

    * Call unbindAndPublish (you've now notified any listening clients that your drawing is done and is ready to be seen elsewhere). This unbinds the frame buffer object and synchronizes the contents of the shared texture to other applications.

    If you want, you may now:

    * Get the SyphonImage from the server (it's the most recent thing you've drawn above).
    * Draw that image as normal (no effects) to your own scene for a live preview of your application's output that will be seen by others, etc. (sketched below).
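
    For concreteness, here is a minimal sketch of that flow in Objective-C. It assumes you already have a current CGL context (cgl_ctx), and the names frameSize and drawMyScene are placeholders for your own sizing and drawing code:

        #import <Syphon/Syphon.h>
        #import <OpenGL/gl.h>
        #import <OpenGL/glext.h>

        // Setup (once):
        SyphonServer *server = [[SyphonServer alloc] initWithName:@"My Output"
                                                           context:cgl_ctx
                                                           options:nil];

        // Render loop (every frame), with the context current:
        if ([server bindToDrawFrameOfSize:frameSize])   // you are now drawing "into" Syphon
        {
            drawMyScene();                              // your GL drawing: geometry, shaders, flipped texcoords, etc.
            [server unbindAndPublish];                  // unbind the FBO and publish the frame to clients
        }

        // Optional local preview: the most recent frame you published, as a texture.
        SyphonImage *image = [server newFrameImage];
        if (image)
        {
            glEnable(GL_TEXTURE_RECTANGLE_EXT);
            glBindTexture(GL_TEXTURE_RECTANGLE_EXT, image.textureName);
            // ... draw a textured quad of image.textureSize into your own view ...
            glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 0);
            glDisable(GL_TEXTURE_RECTANGLE_EXT);
            // (release the image when you're done with it, if you're not using ARC)
        }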

    I hope that helps.

    We highly suggest avoiding pixel readback to the CPU – it defeats the entire purpose of using Syphon to begin with: keeping things fast on the GPU, where they belong.

    One thing to note: we try to do a good job of isolating OpenGL state before and after our OpenGL calls into your context. Make sure you leave things as they were if you've altered state between the bind and unbind calls on the server.
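
    A small sketch of one way to do that in a compatibility-profile context (note that glPushAttrib does not cover everything, e.g. the current shader program or buffer bindings, so restore those yourself if you change them):

        [server bindToDrawFrameOfSize:frameSize];

        glPushAttrib(GL_ENABLE_BIT | GL_COLOR_BUFFER_BIT);  // snapshot enables, blend func, etc.
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        // ... draw ...
        glPopAttrib();                                       // leave things as they were

        [server unbindAndPublish];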

    in reply to: issues connecting jit.gl.nod to jit.gl.syphonserver #25984
    vade
    Keymaster

    Hi. You've hooked up the outlet of jit.gl.node to the videoplane. That's incorrect. It looks like you want to hook the second outlet of the jit.gl.node to the jit.gl.mesh, so that the mesh draws into the internal texture.

    So it goes:

    * node captures the mesh to a texture.
    * node outputs the texture to the videoplane, so you see it in the Jitter app.
    * node outputs the texture to Syphon, so Syphon sees it.

    That one fix seems to solve it for me.

    in reply to: Syphon support seems broken for Processing 2.0.1 #25872
    vade
    Keymaster
    in reply to: Syphon Virtual Screen (again!) #24887
    vade
    Keymaster

    Just so you know, the latest QLab has native Syphon integration both in and out.

    in reply to: Dragonframe to QC #24383
    vade
    Keymaster

    Or you could use something like Syphoner to get a screen capture into Syphon. It's not the best solution, but it might be helpful in a pinch?

    http://syphoner.sigma6.ch

    in reply to: Syphon server in a Chromium/webkit browser #23876
    vade
    Keymaster

    It would definitely be faster, in the sense that it would not need to render unless the scroll position or content changed.

    However, it's a non-trivial task, integrating into a codebase like that :\

    in reply to: Making the QC Plugin "safe" #21935
    vade
    Keymaster

    Additionally, all native QC plugins are 'unsafe' – this is not an issue with Syphon, but an issue with how applications not designed to leverage Quartz Composer can still sort of load compositions via QuickTime.

    in reply to: Making the QC Plugin "safe" #21934
    vade
    Keymaster

    Read this thread:

    http://forum.garagecube.com/viewtopic.php?f=1&t=5212&sid=3566417d09eef1666404f123a0857e67&start=15#wrap

    You want a specific build of QCRehab.

    What you really want is for the vendor to add Syphon support. Ask them.

    in reply to: Jitter many jt.gl.mesh to syphon #21098
    vade
    Keymaster

    Jit.gl.node is your friend.

    in reply to: MaxMSP-JItter 6.1 problem #20824
    vade
    Keymaster

    How about you give us more information on what doesn't work, what your environment is, and what you expect, rather than have us reverse-troubleshoot 🙂

    I'm guessing you are attempting to run 64-bit? The MXO right now is 32-bit only.

    in reply to: MaxMSP-JItter 6.1 problem #20822
    vade
    Keymaster

    Appears to work here – just installed 6.1.1 and Syphon works as expected.

    in reply to: no frame source menu in simple client #19953
    vade
    Keymaster

    Oh, and c) send the Jitter patch you are using, along with the version of Max/MSP you are using.

    in reply to: no frame source menu in simple client #19952
    vade
    Keymaster

    Hi Phil.

    Just to be sure, you are:

    a) running both Simple Client and Simple Server, and you do not "see" the Server rendering in the Client?

    b) running a supported operating system, i.e. Mac OS X 10.6+ (10.7 and 10.8 are supported)?

    Let us know.

    in reply to: Syphon server and Mesh Renderer #19949
    vade
    Keymaster

    Are you running the latest Syphon for Quartz Composer Beta 2, and have you ensured you do not have any older versions loading?

    If so, post a composition.

    in reply to: QC crash #19884
    vade
    Keymaster

    Great to hear it was an incompatibility and not a “new” issue!

    in reply to: VMWare, Virtual Box or similar #19781
    vade
    Keymaster

    I know quite a few people who have successfully run Syphon on a Hackintosh. As far as VMs go, I'm unaware of anyone who has even tried.

    in reply to: Have an idea, need help choosing the pieces. #19779
    vade
    Keymaster

    “Unity 3D pro canโ€™t stream live output of the 3D actor.”

    Sure it can.

    We had a demo of a Unity3D rig set up at Framestore (a very large VFX company) that streams live mocap data from London to NYC into Unity3D.

    Using Syphon for Unity3D, you could capture the scene live, and send it to any compositing or other software you’d like.

    in reply to: Simple Syphon video player on video card output #19778
    vade
    Keymaster

    Hi ced,

    So a few things to know.

    Most broadcast video capture and output cards (i.e. HD-SDI cards from Matrox, BlackMagic, AJA, etc.) do not have a GPU (a graphics processing unit) in the sense of being OpenGL-compatible. I know that sounds sort of odd, but those cards work differently from the video cards used for 3D computing / gaming, etc.

    Syphon relies on the latter (a GPU) and needs OpenGL. Since the AJA/BlackMagic etc. cards *do not* have a GPU or OpenGL support, special software has to be written to take anything from OpenGL (like Syphon) and output and sync it.

    That sounds horribly complicated, but BlackMagic has an example that uses OpenGL to output to their cards; no one has made a Syphon-"enabled" version of it yet. That sounds like what you would need to solve your problem.

    Unfortunately I have a pretty full plate, so I can't whip anything up soon – but you might see if anyone elsewhere has done it.

    You might also find a commercial piece of software that has both Syphon and BlackMagic support, and sort of "roll your own" solution via 3rd-party software.
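
    To make that bridging step concrete, here is a rough, hypothetical sketch (the helper name is made up, and the card-SDK side is omitted because it is vendor-specific): read the current Syphon frame back to CPU memory, then hand those pixels to the card's own SDK. This is the one case where a readback is the point, since the card itself has no GPU.

        #import <Syphon/Syphon.h>
        #import <OpenGL/gl.h>
        #import <OpenGL/glext.h>
        #import <stdlib.h>

        // Hypothetical helper (not part of Syphon): copy a SyphonImage back to CPU
        // memory so a capture/output card's SDK can consume it. The GL context that
        // owns the frame must be current when this is called.
        static void *CopyFrameBGRA(SyphonImage *frame)
        {
            NSSize size = frame.textureSize;
            void *pixels = malloc((size_t)size.width * (size_t)size.height * 4);

            glBindTexture(GL_TEXTURE_RECTANGLE_EXT, frame.textureName);
            glGetTexImage(GL_TEXTURE_RECTANGLE_EXT, 0, GL_BGRA,
                          GL_UNSIGNED_INT_8_8_8_8_REV, pixels);
            glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 0);

            return pixels;  // caller frees this after handing it to BlackMagic/AJA/etc.
        }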

    in reply to: Syphon Recorder Beta 10. Crash #19731
    vade
    Keymaster

    Awesome that both issues are solved. Thanks for getting back to us – it's good to know when fixes actually fix!
