VLCKit to Syphon


Viewing 5 posts - 1 through 5 (of 5 total)


    I’m trying to hook up VLCKit to Syphon on OS X. I can get a basic example playing, but VLCKit doesn’t give me a handle on the actual pixel data. The player is passed an NSView, to which it adds a layer to draw the video onto.

    Can I do something with that layer and publish that data to Syphon? Anybody have any example code? I would need to do this for three separate streams.

    Alternatively, if anybody knows of anything that either lets a VLC/RTSP stream go to Syphon directly, or a way to read VLC/RTSP into Quartz Composer (which is the actual goal of this exercise), I’d appreciate it.



    I haven’t done any direct work with VLCKit, but I did look into it. If I recall correctly, it has both lower-level APIs and higher-level APIs that give you a view to draw to.

    Consider that you need:

    * An OpenGL context to initialize Syphon – preferably one that is owned by, or ‘shared’ with, a GL context used by the host app you are getting frames from

    * Either the OpenGL texture ID, or a handle on the CPU-side bytes of the frame data, from which you make a texture yourself to pass to Syphon
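    The Syphon side of that is small once you have a texture. A minimal Objective-C sketch – the texture name, dimensions, and the shared CGL context are all assumptions here, not things VLCKit hands you:

    ```objc
    #import <Syphon/Syphon.h>
    #import <OpenGL/OpenGL.h>

    // Assumes cglContext (a CGLContextObj) is shared with whatever
    // context the video frames are uploaded into.
    SyphonServer *server = [[SyphonServer alloc] initWithName:@"VLC Stream 1"
                                                      context:cglContext
                                                      options:nil];

    // Per decoded frame, once textureName (a GLuint) holds the video data:
    [server publishFrameTexture:textureName
                  textureTarget:GL_TEXTURE_RECTANGLE_EXT
                    imageRegion:NSMakeRect(0, 0, width, height)
              textureDimensions:NSMakeSize(width, height)
                        flipped:NO];
    ```

    Each SyphonServer publishes one stream, so the three streams mentioned above would each need their own server instance.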

    I would suggest that you forego the higher-level convenience APIs and try to find a VLCKit example that:

    * Draws to an OpenGL context
    * Has an obvious texture-upload component, where VLCKit submits CPU-side data to be drawn

    You could then shim Syphon into that pipeline much more easily than going via views and layer scraping, which is slower, less efficient, and more prone to weird issues.
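    If VLCKit’s Objective-C layer doesn’t expose the decoded frames, the underlying libvlc C API does, via its video callbacks. A rough sketch of the CPU-side route – buffer management and the eventual texture upload are simplified, and getting at the libvlc_media_player_t from VLCKit is an assumption:

    ```objc
    #include <vlc/vlc.h>

    static uint8_t *frameBuffer; // width * height * 4 bytes, allocated elsewhere

    // libvlc asks where to decode the next frame.
    static void *lockCB(void *opaque, void **planes) {
        planes[0] = frameBuffer;
        return NULL; // picture identifier, unused here
    }

    // Decoding finished; frameBuffer now holds one 32-bit RGB frame.
    static void unlockCB(void *opaque, void *picture, void *const *planes) {
        // Upload to a GL texture here (e.g. glTexSubImage2D) and publish
        // via Syphon, or signal a render thread to do so.
    }

    static void displayCB(void *opaque, void *picture) {}

    void attachCallbacks(libvlc_media_player_t *mp, unsigned width, unsigned height) {
        libvlc_video_set_callbacks(mp, lockCB, unlockCB, displayCB, NULL);
        libvlc_video_set_format(mp, "RV32", width, height, width * 4);
    }
    ```

    With VLCKit you may have to reach down to the C API for the player handle; with raw libvlc you own it from the start, which may be the simpler route for three streams.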


    I don’t think that what vade describes is possible with VLCKit – I’d suggest you find an alternative library.

    I’m surprised there isn’t a Syphon-capable app that receives RTSP already – if you get this working, it would be great if you shared your results.


    Max/MSP is RTSP-capable too.
