Forum Replies Created
vade (Keymaster)
That's not quite correct –
OpenGL access has to be *serialized* – meaning you cannot make concurrent requests to OpenGL from multiple threads, or you will corrupt the command stream submitted to the GPU and cause issues.
OpenGL is a state machine: you twiddle state on and off, submit objects, and then render with the current configuration. Changes you make persist, and the order of operations matters.
So in terms of multithreading OpenGL, you have to serialize access to an OpenGL context.
Something like:

Thread A:
    Lock for OpenGL
    Draw / resize viewport
    Render / flush to the screen
    Unlock for OpenGL

Thread B:
    Do something to prepare expensive calculations for OpenGL
    Lock for OpenGL
    Submit OpenGL commands
    Unlock for OpenGL
    Continue on your way

Hope that helps.
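To make that locking pattern concrete, here is a minimal sketch in Python. The `gl_submit` calls are hypothetical stand-ins for real OpenGL work (this is not an OpenGL binding); the point is simply that every touch of the shared context happens under one lock, while expensive prep stays outside it.

```python
import threading

gl_lock = threading.Lock()   # one lock guarding the shared OpenGL context
command_stream = []          # stand-in for the GPU command stream

def gl_submit(cmd):
    """Hypothetical placeholder for issuing an OpenGL command."""
    command_stream.append(cmd)

def render_thread():
    # Thread A: draw and present, all under the lock.
    with gl_lock:
        gl_submit("resize viewport")
        gl_submit("draw")
        gl_submit("flush")

def worker_thread():
    # Thread B: do expensive prep OUTSIDE the lock...
    data = sum(range(1000))  # stand-in for expensive CPU-side work
    # ...then take the lock only for the time it takes to submit.
    with gl_lock:
        gl_submit(f"upload {data}")

a = threading.Thread(target=render_thread)
b = threading.Thread(target=worker_thread)
a.start(); b.start()
a.join(); b.join()
# Each thread's commands land contiguously in the stream: because both
# threads submit only while holding gl_lock, they can never interleave.
```

Whichever thread wins the lock goes first, but the command stream is never corrupted by interleaving, which is the whole game with a shared context.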
vade (Keymaster)
Unfortunately, that's not quite how it works.
Changing the blend mode really only affects the ‘drawing’ – it doesn't change the alpha values or anything in the texture.
You might need to run a jit.gl.slab to change the texture from premultiplied to unpremultiplied (or, perhaps, vice versa).
You’d do something like:
jit.gl.syphonclient
|
jit.gl.slab (some slab shader here that does alpha changing)
|
jit.gl.syphonserver

(in shorthand)
That could be helpful perhaps!
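The slab shader itself would be GLSL, but the per-pixel math it needs to do is simple. Here is a sketch of both directions in Python/NumPy – the function names are mine, purely for illustration of what "premultiplied to unpremultiplied" means numerically:

```python
import numpy as np

def unpremultiply(rgba):
    """Convert premultiplied-alpha pixels back to straight (unpremultiplied)
    alpha. rgba: float array in [0, 1] with shape (..., 4)."""
    out = rgba.copy()
    a = rgba[..., 3:4]
    # Divide RGB by alpha; skip pixels with alpha == 0 to avoid dividing
    # by zero (in properly premultiplied data their RGB is already 0).
    np.divide(out[..., :3], a, out=out[..., :3], where=a > 0)
    return out

def premultiply(rgba):
    """The inverse: multiply RGB by alpha."""
    out = rgba.copy()
    out[..., :3] *= out[..., 3:4]
    return out
```

A half-transparent mid-grey stored premultiplied as (0.25, 0.25, 0.25, 0.5) comes back as (0.5, 0.5, 0.5, 0.5) after `unpremultiply` – which is exactly the adjustment a jit.gl.slab pass would make per fragment.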
vade (Keymaster)
Check this thread out:
http://v002.info/forums/topic/jitter-alpha-not-showing-in-syphon-client/
Should be a blend mode switch somewhere in VPT 🙂
vade (Keymaster)
So – alpha can be premultiplied or unpremultiplied.
Syphon makes NO adjustments to the incoming texture's alpha channel; it's up to applications to handle alpha. Sadly, different applications can use different conventions.
It sounds like one app is sending premultiplied alpha, and the other is assuming unpremultiplied.
Graphics is hard and annoying. I'd get in contact with the VPT people and see if they are assuming an alpha format. That looks like premultiplied to my eyes; I've not seen that with any other apps, and I recall I had similar issues in Max/MSP/Jitter without changing the blend mode in Jitter.
March 2, 2015 at 9:09 am in reply to: Anamorphic Stereo, dual Syphon output with kinect head tracking … #58938
vade (Keymaster)
You are overwriting the rendering from one eye with the second.
You need two Syphon textures, one each for the left and right eye, or you need a render pass to blit both left and right into a single texture and then output that one texture to Unity.
vade (Keymaster)
Another option a lot of people don't realize: much of OpenCV's functionality is still usable at lower resolutions. Try scaling your video down to 320×240-ish or less – way less CPU, and less shit to shunt between CPU and GPU.
You can do analysis on downscaled frames, and then scale the values back up for tracking on top of the full res original image from the camera.
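A sketch of that scale-back-up step in Python – the resolutions and the bounding box here are made-up numbers, and the "detection" stands in for whatever OpenCV analysis you actually run on the small frame:

```python
# Analyze on a downscaled frame, then map results back to full resolution
# for drawing/tracking on top of the original camera image.

ANALYSIS_W, ANALYSIS_H = 320, 240    # small frame used for CV analysis
FULL_W, FULL_H = 1280, 960           # original camera resolution

def to_full_res(x, y, w, h):
    """Scale a bounding box found on the small frame up to
    full-resolution coordinates."""
    sx = FULL_W / ANALYSIS_W
    sy = FULL_H / ANALYSIS_H
    return (x * sx, y * sy, w * sx, h * sy)

# Suppose the tracker found a blob at (80, 60) sized 40x30 on the
# 320x240 analysis frame:
box = to_full_res(80, 60, 40, 30)
# box is now in 1280x960 coordinates: (320.0, 240.0, 160.0, 120.0)
```

The detection pass runs on roughly 1/16th the pixels, and the only thing you pay to get full-resolution results back is a couple of multiplies per value.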
vade (Keymaster)
Yeah – the readback from the GPU to OpenCV, plus the added latency of doing the computer vision work, is going to massively drop your frame rate.
Syphon is optimized for running on the GPU – so this pipeline ping-pongs from the camera (CPU) to Syphon (GPU), to Quartz Composer (GPU), to PS3 Eye / OpenCV (CPU), to MadMapper (GPU).
Now, if all of those steps were on the GPU you'd likely have few problems (I'm aware of users running content for shows with more servers in realtime sans issue).
Some suggestions:
See if you can run the camera at lower resolutions – and remove the Quartz Composer step to resize frames.
Run an OpenFrameworks OpenCV app instead, which gives much more control over camera -> OpenCV, resizing, and hinting to OpenCV which frames to use. It's more work (and if you don't program, yeah, it's likely out of the question) and will likely take longer.
You could cut it down to:
Camera (CPU) -> OpenFrameworks converting to OpenCV (CPU) -> Syphon, to whomever – and keep it on the GPU from there.
Notice there is no ping ponging happening there.
Optimizing video can be difficult, especially when you don’t always have control of the black boxes you run it through.
January 21, 2015 at 2:33 pm in reply to: Crash using Syphon in Processing // Long Term Installation #58910
vade (Keymaster)
Looking at this crash log, it's not immediately clear that Syphon is the root cause.
The crash is:
Graphics hardware encountered an error and was reset: 0x00000000 caused by the Apple graphics libraries (gpusSubmitDataBuffers).
– and, more importantly, the crashed thread has no Syphon calls at all. It looks as though this is an issue with the AMD GPU, from what I can tell.
I can only imagine that either:
a) Syphon has a very small leak somewhere for some resource that is causing this issue. However, a leak in OpenGL resources is usually very, very obvious.
b) the kernel assigns some values for IOSurfaces that aren't being cleaned up properly somewhere, and eventually it runs out of IOSurface IDs because it doesn't want to re-use them. No idea – total conjecture.
Can you test on different hardware?
January 2, 2015 at 6:02 pm in reply to: Crash using Syphon in Processing // Long Term Installation #58904
vade (Keymaster)
Thanks, however, for doing that work!
January 2, 2015 at 6:01 pm in reply to: Crash using Syphon in Processing // Long Term Installation #58903
vade (Keymaster)
Hi. As far as I can tell from the images, the most important portion of the log is missing.
Is there a reason you cannot mail the crash logs themselves? They are saved and accessible via Console.app (in Applications/Utilities), in the sidebar. You can attach or copy-paste the text of one or two. We'd need the ‘crashed thread’ info, which I don't see in the images.
vade (Keymaster)
Oh – also, using the QC Editor to power your Syphon capture is super sub-optimal. The QC Editor renders using the main queue, and therefore you cannot guarantee a consistent FPS at all.
vade (Keymaster)
Hm. Out of curiosity, do you get the same stutter if you use a more lightweight codec like Apple Intermediate Codec or Photo JPEG?
vade (Keymaster)
Try running OpenEmu. OpenEmu has Syphon support built in.
There are a lot of issues here.
1: Video games don't run at 60Hz like your monitor does, but typically at 29.97fps.
2: Game emulators don't always do the right things to compensate for the frame rate discrepancy, so it can be hit or miss in terms of frame rate just on its own (I know, it seems weird, but it's still sometimes the case just due to phasing).
3: Everything Tom said about Syphon Recorder being meant for realtime, on-demand, just-in-time frame arrival means it has a VERY hard time meeting the FPS you desire. Even if the host app runs at a smooth 60Hz, frames don't arrive at *exactly* the same intervals – there is, for lack of a better word, ‘clock drift’.
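You can see the phasing in point 1 with a few lines of arithmetic. This Python sketch simulates one second of an idealized 60Hz recorder sampling a 29.97fps source (perfect clocks, no jitter – real capture is messier than this):

```python
# One second of a 60 Hz capture clock sampling a 29.97 fps source.
# Because the rates aren't an exact 1:2 ratio, the "current" source
# frame drifts against the capture ticks and frames get duplicated.

SOURCE_FPS = 30000 / 1001   # NTSC 29.97 fps
CAPTURE_HZ = 60.0

samples = []
for tick in range(60):                    # 60 capture ticks = one second
    t = tick / CAPTURE_HZ                 # wall-clock time of this tick
    samples.append(int(t * SOURCE_FPS))   # source frame showing at time t

unique_frames = len(set(samples))
# 60 captured frames, but only ~30 distinct source frames: every source
# frame is recorded roughly twice, with the duplicates landing unevenly
# as the two clocks slide past each other.
```

That uneven duplication is exactly the stutter you see even when both apps claim to be running "at full rate".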
Try OpenEmu. I believe it has native recording, and if that doesn't suffice, try its native Syphon support and see if that helps.
Thanks for taking the time to write that up.
vade (Keymaster)
Pull from the repo – I've updated the code and verified (in QC at least) that the color is fixed, via Digital Color Meter.
Thanks for the report and good eyes!
vade (Keymaster)
Like I said, it's only on GitHub – you have to build it; we aren't packaging releases.
Install the Syphon QC Plugin.
vade (Keymaster)
v002 Screen Capture 2.0 (on our https://github.com/v002/v002-Media-Tools/ repository) works with Yosemite. I just tested it and made an RGB 4444 ProRes recording via Quartz Composer -> Syphon Recorder at 60fps.
We’ve been lax about updating official releases of the plugins because of the number of bugs and unfixed issues in QC, so we aren’t putting a lot of dev and personal time into it. Sorry about the confusion.
vade (Keymaster)
Yeah, the texture coordinates can be tricky. Make sure the vertex shader is sending the appropriate coordinate as a varying, and ensure that your geometry provides viable texture coordinates if you aren't calculating them by hand in the vertex shader. I recall QC had some weird gotchas in the past about which gl_MultiTexCoord you sent.
vade (Keymaster)
This has been in place since Quartz Composer 3rd-party patches were first introduced. It's not new.
Just ask on Facebook or on Kineme.
vade (Keymaster)
You could perhaps use Syphoner from Sigma6 to do Syphon-enabled program capture, and then use TCPSyphon from one machine to another if you really needed to do that.
vadeKeymasterQuicktime doesnt load non-safe patches as a security policy. So any 3rd party patches won’t load in Applications that load QTZ’s via old Quicktime calls, and don’t support Quartz Composer’s native API directly.
I know, its nuances and complicated.
If I recall, Kineme makes a “Safe Mode” hack patch, I forget the name, but it might work under 10.10. I’ve not tried it.
I hope I am recalling the details correctly. Let me know if that information is helpful. I'd ping the Quartz Composer Facebook group for details on that Kineme hack to enable non-safe patches in “safe mode”.
Annoying :X