Forum Replies Created
May 24, 2018 at 1:09 pm in reply to: Nothing coming through as output except simple server #59470
You should contact the makers of MadMapper and VPT for support with their products, but…
Have you enabled Syphon output in MadMapper and VPT? I think both require steps to enable it – in MadMapper it is a toggle on each Projector in your output.
It works. IIRC it’s 32-bit only, as the 64-bit FFGL API wasn’t formalised at the time.
What are you using it for? Most FFGL hosts have native support for Syphon nowadays (without having to use FFGL).
Hi – this isn’t an issue with Syphon itself, but with TCP Syphon Server – you should contact its developer directly for support.
Best of luck.
The attached movie can’t be from Syphon Recorder – it doesn’t produce .mp4 files.
If the recorded dimensions exactly match the Syphon source, no filtering is applied in Syphon Recorder.
Can you post a picture of the Preferences window in Syphon Recorder?
Sounds like you’re missing a Copy Files build phase to copy Syphon.framework to the Frameworks folder of your app bundle.
If you give implementation files which mix Objective-C and C++ the .mm extension, Xcode will handle them properly – just be careful about Objective-C types in headers you want to include in any pure C++ files.
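To sketch what that means in practice (the file and class names here are hypothetical, and the context variable is assumed to exist): keep Objective-C types out of any header a pure C++ file will include – for example behind an opaque pointer – and confine the Objective-C to the .mm file:

```objc
// Renderer.h – hypothetical header, safe to #include from pure C++ files
// because no Objective-C types appear in it.
class Renderer {
public:
    Renderer();
    ~Renderer();
    void publishFrame();
private:
    void *_server; // really a SyphonServer *, typed opaquely so C++ compiles
};

// Renderer.mm – compiled as Objective-C++ because of the .mm extension.
#import <Syphon/Syphon.h>
#include "Renderer.h"

Renderer::Renderer()
{
    _server = [[SyphonServer alloc] initWithName:@"Example"
                                         context:myContext // your CGLContextObj
                                         options:nil];
}

Renderer::~Renderer()
{
    [(SyphonServer *)_server release]; // MRC; under ARC, bridge-cast instead
}

void Renderer::publishFrame()
{
    SyphonServer *server = (SyphonServer *)_server;
    // ... publish a texture via server here ...
}
```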
I don’t think the Powershots have the capability in hardware, unfortunately.
Glad it’s otherwise useful though.
IIRC you can limit VLC’s buffering somehow – that’s probably the cause of the delay.
Right now, the following Canon cameras are supported:
EOS-1D Mark III
EOS-1Ds Mark III
EOS DIGITAL REBEL XSi / 450D / Kiss X2
EOS DIGITAL REBEL XS / 1000D / KISS F
EOS 50D
EOS 5D Mark II
EOS Kiss X3 / EOS REBEL T1i / EOS 500D
EOS 7D
EOS-1D Mark IV
EOS Kiss X4 / EOS REBEL T2i / EOS 550D
EOS 60D
EOS Kiss X5 / EOS REBEL T3i / EOS 600D
EOS Kiss X50 / EOS REBEL T3 / EOS 1100D
EOS 5D Mark III
EOS-1D X
EOS Kiss X6i / EOS 650D / EOS REBEL T4i
EOS Kiss X7i / EOS 700D / EOS REBEL T5i
EOS Kiss X7 / EOS 100D / EOS REBEL SL1
EOS Kiss X70 / EOS 1200D / EOS REBEL T5 / EOS Hi
EOS 7D Mark II
EOS 5DS / EOS 5DS R
EOS REBEL T6s / EOS 760D / EOS 8000D
EOS REBEL T6i / EOS 750D / EOS Kiss
EOS-1D X Mark II
EOS 80D
EOS Rebel T6 / EOS 1300D / EOS Kiss X80
EOS 5D Mark IV
EOS Kiss X9i / EOS Rebel T7i / EOS 800D
EOS 9000D / EOS 77D
EOS 6D Mark II
EOS Kiss X9 / EOS Rebel SL2 / EOS 200D
Syphon shouldn’t be adding significant overhead – the parts of the process that generate and affect your content are likely to be the drain on resources. Only worry about it if you start hitting limits (eg dropped frames). In that case, try each part of your pipeline independently to identify the part that is struggling. If you still see problems without Syphon being involved, then Syphon isn’t the problem.
There is no simple maths you can do. Syphon treats all content the same – the only significant hit Syphon will cause is when you resize the Syphon frames, in which case the underlying surface is rebuilt. Otherwise the only limitation is, as with any graphics work, that the more pixels you are working with the more work needs to be done.
Have you tried? All available channels will be recorded.
Sounds like a question for the MadMapper guys – http://www.madmapper.com/support/
April 6, 2017 at 5:30 am in reply to: Camera Live – Multiple Cameras / Instances Simultaneously #59249
Multiple connections aren’t supported by the underlying Canon SDK. If running two instances is working for you, it isn’t going to do any harm.
A Syphon Client takes a moment to connect to its server, so that may be the issue you see with blanking in OBS. If there’s any way to keep both sources “active” but one invisible, then try that. I don’t know anything about OBS, I’m afraid.
January 22, 2017 at 8:22 am in reply to: Syphon from one part to another within the same app? Error Reporting? #59243
Using Syphon internally – there’s nothing wrong with doing that, and your per-frame drawing shouldn’t suffer, though the SyphonClient will take a moment to find the server initially. You could use a shared context and the server’s -newFrameImage to get the texture, without the involvement of a SyphonClient.
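A minimal sketch of that second approach – using the server’s frame directly, without a SyphonClient. SyphonServer, SyphonImage, -newFrameImage, textureName and textureSize are the real Syphon API; the surrounding code, and the assumption that your current GL context is shared with the server’s, are illustrative:

```objc
// In a GL context shared with the one the SyphonServer was created with:
SyphonImage *frame = [server newFrameImage];
if (frame)
{
    GLuint texture = frame.textureName;  // a GL_TEXTURE_RECTANGLE_ARB texture
    NSSize size = frame.textureSize;
    // ... bind texture and draw with it ...
    [frame release]; // a new* method returns an owned object (under MRC)
}
```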
Syphon shouldn’t cause a crash, ever. The documentation tells you when a method can fail, and what the result of failure will be (eg bindToDrawFrameOfSize: returns a BOOL to indicate success). Check return values and write code to deal with the failure case.
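For example (a sketch only – `server` is assumed to be an existing SyphonServer, and the drawing in the middle is up to you):

```objc
if ([server bindToDrawFrameOfSize:NSMakeSize(1920, 1080)])
{
    // ... draw this frame into the server's FBO here ...
    [server unbindAndPublish];
}
else
{
    // Binding failed – handle it rather than drawing into nowhere.
    NSLog(@"Syphon: bindToDrawFrameOfSize: failed");
}
```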
Your frame step is missing a draw-the-texture-into-the-FBO line.
A peculiarity of the IOSurface-backed textures which Syphon uses is that they are reluctant to return pixel data by the usual means. You will likely have to draw the texture into an FBO and then get the pixels from the FBO’s texture backing.
Rough example code in a previous post here.
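In case it helps, here is a rough sketch of that readback path, assuming fixed-function OpenGL, a current context, and a SyphonImage named `image` (everything apart from the SyphonImage properties is illustrative):

```objc
#import <OpenGL/gl.h>
#include <stdlib.h>

GLsizei w = (GLsizei)image.textureSize.width;
GLsizei h = (GLsizei)image.textureSize.height;

// An ordinary texture to receive the drawn frame, attached to an FBO.
GLuint tex, fbo;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
             GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, NULL);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);

// Draw the Syphon texture into the FBO. Note that rectangle textures
// use pixel coordinates for texcoords, not 0..1.
glViewport(0, 0, w, h);
glMatrixMode(GL_PROJECTION); glLoadIdentity(); glOrtho(0, w, 0, h, -1, 1);
glMatrixMode(GL_MODELVIEW);  glLoadIdentity();
glEnable(GL_TEXTURE_RECTANGLE_ARB);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, image.textureName);
glBegin(GL_QUADS);
glTexCoord2f(0, 0); glVertex2f(0, 0);
glTexCoord2f(w, 0); glVertex2f(w, 0);
glTexCoord2f(w, h); glVertex2f(w, h);
glTexCoord2f(0, h); glVertex2f(0, h);
glEnd();
glDisable(GL_TEXTURE_RECTANGLE_ARB);

// Now the pixels come from the FBO's ordinary texture attachment.
uint8_t *pixels = malloc((size_t)w * h * 4);
glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);
// ... use pixels, then free(pixels) ...

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glDeleteFramebuffers(1, &fbo);
glDeleteTextures(1, &tex);
```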
Syphon shares a texture on the GPU between processes. I’m not familiar with v4loop, but I’m guessing it shares regular memory, which is never going to come anywhere near the performance of a shared GPU resource. Yes, it is technically possible to get an image out of one process and into another; no, it isn’t the same as Syphon (or Spout).