vade

Forum Replies Created

  • in reply to: Pure Data / GEM implementation #5023
    vade
    Keymaster

Yea, there is.

I am referencing headers from PD-extended. You can find the path in the Xcode project via Targets -> syphon.pd_darwin, Get Info -> Build, then Header Search Paths, which are currently set to:

    "/Applications/Pd-extended.app/Contents/Resources/include"

I also link to Gem.pd_darwin, which is also found in the PD-extended app bundle, at:

    "/Applications/Pd-extended.app/Contents/Resources/extra/Gem/Gem.pd_darwin"

That ought to be it; the resulting target bundle is set to not require anything else. Unless I am forgetting something 🙂

Basically, install PD-extended and you should be OK: the Xcode project should find what it needs to compile, or you can hack at the code.

Are you interested in contributing to the PD-GEM / Syphon plugin?

    in reply to: jit.gl.mesh issues? #4610
    vade
    Keymaster

The read me and the help patch are all you should need.

How about you post the issue you are having?

Please read:

    http://www.mikeash.com/getting_answers.html

    Help us help you.

    in reply to: Pure Data / GEM implementation #5021
    vade
    Keymaster

Hi Jesse,

I linked to a compiled test binary of the plugin, hosted on my Dropbox, on the PD mailing list. You can find a working *server only* implementation here:

    http://dl.dropbox.com/u/42612525/pd-syphonserver.zip

    in reply to: Rutt Etra 3 Link? #4307
    vade
    Keymaster
    in reply to: Glitch plugins cause blank screen w/VDMX #4278
    vade
    Keymaster

Can you try removing all other plugins, including the Kineme skank plugins, and see if it works then?

Anything in QC Patches and QC Plug-Ins? I suspect an old version of some Kineme plugin, whose name I forget, which does some weird stuff.

    Thanks.

    in reply to: v002 Model importer / no model #4303
    vade
    Keymaster

    Remove all other 3rd party plugins, and ensure you supply a full path to the model file.

    in reply to: Virtual WebCam Device #5234
    vade
    Keymaster

I believe using CamTwist with a Quartz Composer composition should allow you to do exactly that. Make a Syphon Client, and send it into CamTwist. Voila?

    in reply to: HD Syphon stream #4945
    vade
    Keymaster

    Oh. Wait.

You are using *TWO* GPUs, eh?

Do you have QC rendering on one, and the other running MadMapper?

Try using two displays on a single GPU. You will avoid expensive readback from one GPU to the other.

    in reply to: HD Syphon stream #4944
    vade
    Keymaster

    HD video playback is totally doable. However, there are a few things you need to ensure you do in Quartz Composer (if you want to use that) for HD playback via Syphon.

#1) With the Apple Movie Importer patch, ensure you use Asynchronous mode (select the Movie Importer patch in the editor, hit Apple-2 to get to settings, and enable the check box).

    That ensures that the Movie playback is buffered and reads ahead of time, rather than “just in time”.

    #2) What codec are you using? What frame rate movie?

#3) Are you using the QC Syphon Server (Public release 2) with the Image Input, or are you capturing the scene?

#4) Are you running the Quartz Composer editor in 32 or 64 bit mode? Due to some nuances in QuickTime, 32 bit mode will be faster for movie playback and VRAM usage. To do this, Get Info on Quartz Composer.app and select “Open in 32 bit”.

    Let us know.

    in reply to: Multiple Syphon streams from VDMX to Unity 3D? #4938
    vade
    Keymaster

So, right now, the Unity Syphon client plugin simply returns the first server found. We have to add some logic to tell it which server to use. Basically, this means bridging C# strings to Objective-C string objects so that the internal search mechanism can find the specified server.

    We’ve not gone there yet because we’ve been busy bootstrapping the other implementations, and working on other bugs. If you want to lend a hand and attempt that code, that would be great!
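For anyone who wants to take a stab at it, here is a minimal sketch of what the native side of that bridge could look like. To be clear, every name here (functions, storage, the plugin binary name mentioned below) is hypothetical; this is not code from the plugin, just an illustration of the C# to native string hand-off:

```c
#include <string.h>

/* Hypothetical native-plugin entry points: the C# side would P/Invoke
   SyphonSetServerName to pass the desired server name down. None of
   these names exist in the real plugin yet. */

static char g_serverName[256] = ""; /* requested Syphon server name */

void SyphonSetServerName(const char *name)
{
    /* copy the marshalled UTF-8 string into plugin-owned storage,
       guarding against NULL and overflow */
    strncpy(g_serverName, name ? name : "", sizeof(g_serverName) - 1);
    g_serverName[sizeof(g_serverName) - 1] = '\0';
}

const char *SyphonGetServerName(void)
{
    /* the Objective-C side would wrap this in an NSString
       (e.g. via stringWithUTF8String:) and match it against
       the published server descriptions */
    return g_serverName;
}
```

On the managed side, a declaration along the lines of `[DllImport("SyphonPlugin")] static extern void SyphonSetServerName(string name);` would let Mono marshal the managed string to the `const char *` automatically.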

    in reply to: syphon recorder #4678
    vade
    Keymaster

Just FYI, you can totally do broadcast-quality rendering in Jitter. You just need to know how to patch so you save frames out non-realtime, drive your metro/qmetro/rendering at non-realtime speeds, and ensure your timing/keyframes work non-realtime.

    Totally doable.
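The core idea is that each frame's timestamp comes from its index, not from a wall clock, so a frame can take as long as it likes to render and save. A rough sketch of that timing in C (the real patching lives in Max/Jitter; these function names are mine):

```c
#include <stdio.h>

/* Non-realtime frame stepping: derive time from the frame index
   rather than a real clock, so rendering is never paced by vsync. */

double frame_time(int frame_index, double fps)
{
    return (double)frame_index / fps; /* timeline seconds for this frame */
}

void render_offline(int total_frames, double fps)
{
    for (int i = 0; i < total_frames; i++) {
        double t = frame_time(i, fps);
        /* ...drive animation/keyframes from t, render, and write
           frame i to disk; repeat regardless of how long it took... */
        printf("frame %d at t=%.4fs\n", i, t);
    }
}
```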

    in reply to: Frame rate issues in a client with Unity Syphon server #4930
    vade
    Keymaster

Also, both Simple Server and Simple Client render on the main thread, so you can’t really rely on them for looking at hiccups and determining the cause. Main thread / runloop issues will interrupt rendering for both (just hold a menu down). It is “Simple” for a reason – just a fairly straightforward implementation.

Try loading a composition with QCPlayer.app (a developer example that uses CVDisplayLink to drive rendering); you should see far fewer hiccups.

    in reply to: Frame rate issues in a client with Unity Syphon server #4929
    vade
    Keymaster

What does timescale control in Unity?

Are you ensuring vertical sync for your Unity rendering? Does Unity render at 60Hz? What may be happening is that the additional overhead of rendering the texture to Syphon is pushing it over a 1/60th refresh, causing apparent dropped frames.

    You may want to throw a timer in there and look at how long a frame takes to render with and without a client attached, and adjust the timescale programmatically to compensate?
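To sketch the arithmetic behind that suggestion (illustrative C, not Unity code; the 60Hz refresh and the compensation formula are my assumptions, and in Unity you would measure with something like Time.realtimeSinceStartup and write to Time.timeScale from C#):

```c
/* Compare per-frame render time against the vsync budget and derive
   a compensating timescale. Purely illustrative arithmetic. */

double frame_budget(double refresh_hz)
{
    return 1.0 / refresh_hz; /* seconds available per refresh */
}

int misses_vsync(double frame_seconds, double refresh_hz)
{
    /* if rendering (plus the Syphon publish) exceeds the budget, the
       frame slips to the next refresh: an apparent dropped frame */
    return frame_seconds > frame_budget(refresh_hz);
}

double compensating_timescale(double frame_seconds, double refresh_hz)
{
    /* slow game time in proportion to the overrun, so motion stays
       consistent even when a frame takes longer than one refresh */
    double budget = frame_budget(refresh_hz);
    return frame_seconds <= budget ? 1.0 : budget / frame_seconds;
}
```

Measuring with and without a client attached, as suggested above, tells you how much of the overrun is the Syphon publish itself.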

    in reply to: Frame rate issues in a client with Unity Syphon server #4926
    vade
    Keymaster

glFlushRenderAPPLE != CGLFlushDrawable. It’s not just for single-buffered outputs: when FBOs and other buffers are rendered to, or when *secondary OpenGL threads update the contents of buffers*, flushing needs to happen so resources can be synced. IOSurface is one such instance where an additional flush is needed to synchronize a shared resource.

Look at the “Using glFlush Effectively” section.

The CGLFlushDrawable issue applies only within a single application, to a single window you own. You can totally call CGLFlushDrawable on multiple windows, more than once, during the period between refreshes.

You also have to understand that GL Profiler “hides” much of the complexity of the programmable pipeline and “collapses” GLSL shader programs, ARB programs, etc. into some of the calls it lists in the stats. So when you see call usage for CGLFlushDrawable or glFlush go “up”, it’s most likely due to either fill rate issues or to using the programmable pipeline. Notice there is no higher GPU usage or rate reported for shader calls, even though Unity and many apps use them. Much stuff gets “rolled” into other calls, which makes identifying some performance issues more nuanced than relying solely on GL Profiler and trusting its output.

Secondly, IOSurfaceLock/Unlock is never used in Syphon because we never hit the CPU. Those calls are, as you guessed, only for when a surface is changed in main memory and needs to be synchronized and re-flushed to the GPU. The locking is so that no GPU-based app can read from the texture while it is being updated by another app, potentially causing issues, incomplete reads, etc.

I think you are over-thinking this issue, to be honest. This sounds suspiciously like a fill rate issue, in that you simply can’t draw that much on screen. What size frames are you sending back and forth? I think it’s 2400×600? (triple-head 800×600, no?)

Understand that, because you are using Unity’s render buffer and Syphon’s publish-frame-image API, drawing to the screen in Unity, and then drawing in another app, you end up drawing 2400×600 a few times:

    1) Render Unity Scene to a texture
    2) Render the resulting Unity texture to Syphon’s IOSurface via publishFrameImage
    3) Unity renders the resulting texture in the Unity app.
    4) App using Syphon renders the Syphon texture.

Due to the lack of CPU spikes, VRAM usage, etc., this seriously looks like a fill rate limitation to me.
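To put rough numbers on those four passes, assuming 4-byte RGBA pixels and 60 fps (both are my assumptions, not measurements from this setup), each pass writes the full 2400×600 frame once:

```c
/* Back-of-the-envelope write traffic for N full-frame passes,
   assuming 4-byte RGBA pixels (an assumption, not a measurement). */

double megabytes_per_second(int width, int height, int passes,
                            int bytes_per_pixel, double fps)
{
    double frame_bytes = (double)width * height * bytes_per_pixel;
    return frame_bytes * passes * fps / (1024.0 * 1024.0);
}
```

Under those assumptions, `megabytes_per_second(2400, 600, 4, 4, 60.0)` comes out to roughly 1.3 GB/s of writes, four times what a single pass would cost, before any shader or blending work is counted.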

You might find that you get better performance if you simply do a glCopyTexSubImage2D on the front buffer at the end of the Unity rendering, getting *everything* rendered at that point into a texture. You can also cheat this way and copy that directly into the SyphonServer’s texture. This eliminates three render passes (rendering the Unity scene to its own internal texture, Unity rendering that texture, and rendering that texture to Syphon’s internal texture) and adds a GL copy/blit pass; and of course Unity then renders the scene once, rather than rendering it to a texture and then rendering that texture (both of which use up available fill rate).

This is what we do in the (yet to be finalized/fixed for 10.7) Screen Capture app for getting the entire scene. Look at:

SyphonScreenCaptureAppDelegate.m, line 404 (that’s amusing