Forum Replies Created
vadeKeymaster
That crash is not in the Syphon framework or any deep internals of the app; it's a loading issue with the Nib. Can you re-download the app? I note you are on 10.6.3; perhaps this is an issue with 10.6.3, or with you being on a hackintosh?
Can you copy the Syphon Recorder to a real Mac running a current OS and try it there, please?
vadeKeymaster
Well, part of the issue is that Jitter is in fact jittery for normal QT playback, and all of this is also contingent on how you are sending frames to OpenGL. If you continuously bang the renderer, or bang the texture sent to the Syphon server, you will cause additional and possibly duplicate frames to be sent from Jitter to Syphon, even though you have @unique 1 specified on your jit.qt.movie. Frame duplication is also affected by how GL is drawn and handled. Suffice it to say, it is nuanced.
vadeKeymaster
Well, in theory, yes, but it greatly depends on how the model is set up and rigged. A general solution is difficult because knowing which bones are which, what the bind pose is, etc., is hard to do generically.
That said, I have some ideas, and plan on some things. Once some work stuff gets out of the way I hope I can delve into it.
vadeKeymaster
Yea. I had done some looking into this, and there is an RFC for sending uncompressed YUV frames. I think VLC and GStreamer support RFC 4175, which I think is what you would want to handle.
http://www.rfc-editor.org/rfc/rfc4175.txt
See: rtpvrawpay and rtpvrawdepay in GStreamer.
Would be very curious / exciting to see that indeed!
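For anyone who wants to experiment with the rtpvrawpay/rtpvrawdepay elements mentioned above, here is a minimal gst-launch sketch, assuming GStreamer 1.x is installed. The host, port, and video caps values are illustrative choices, not requirements:

```shell
# Sender: payload raw UYVY 4:2:2 video per RFC 4175 and push it out over RTP/UDP.
# (videotestsrc stands in for a real capture source; host/port are examples.)
gst-launch-1.0 videotestsrc ! \
    video/x-raw,format=UYVY,width=640,height=480,framerate=30/1 ! \
    rtpvrawpay ! udpsink host=127.0.0.1 port=5004

# Receiver: raw RTP video carries no format description in-band, so the caps
# (sampling, depth, width, height) must be restated explicitly on udpsrc.
gst-launch-1.0 udpsrc port=5004 \
    caps="application/x-rtp,media=video,encoding-name=RAW,sampling=YCbCr-4:2:2,depth=(string)8,width=(string)640,height=(string)480" ! \
    rtpvrawdepay ! videoconvert ! autovideosink
```

Uncompressed 640x480 YUV at 30 fps is roughly 140 Mbit/s, so this is really only practical on a LAN or localhost.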
vadeKeymaster
Actually, Jitter is named appropriately. It really can't handle smooth playback under load, due to how the scheduler, threading, and frame dropping are handled. Yeah. I know.
I suggest you use something other than Jitter if smooth playback is your goal. Honestly.
vadeKeymaster
Of course, that's the whole idea of Syphon. Just hide the application; the Syphon Server will still send frames.
vadeKeymaster
What? No. Try again. The screen capture, for all clients, captures whatever is on screen in the area directly above its window. If you occlude the window, you will get the occluding surface. It does not have x-ray vision, so to speak (for speed reasons).
vadeKeymaster
Oh sure, blame me!
Ha.
Sandro, do you have more than one GPU in your system? Is it two displays on one GPU, or two displays on two GPUs?
Also, I probably need to handle the second-monitor case, as I may be assuming some things regarding capture and fullscreen contexts.
Thanks for your patience. This is why it's in SVN, eh? 🙂
vadeKeymaster
Hrm, that's odd. Are you also building the Jitter and QC Syphon implementations from the SVN? I have only tested it with the latest SVN builds, not the public beta 1 or 2 (1 will 100% not work).
Are you also sure that Syphon.framework is properly being referenced and copied into the Screen Capture.app Contents/Frameworks folder?
Yea, SVN stuff is fun, no? We ought to polish that up and do a new release soon. Hrm.
vadeKeymaster
Look in your menu bar. Syphon Screen Capture shows a capture area, and you can select what to capture from the menu bar, or quit the app. This is so it does not take up screen real estate while capturing, with the UI "in the way".
Again, menu bar 🙂
vadeKeymaster
I don’t see why not. Why don’t you give some specifics? I can only guess at what it is you are or are not doing.
This might be of help: http://tinyurl.com/mwztw
vadeKeymaster
Don’t get me wrong, these sorts of situations are definitely a ‘trap’, and I find myself spending more time making “tools” than actual art, but for whatever reason I still find it satisfying, so I continue to do it. Maybe one drive has taken over the other.
That was not meant to be harsh, more of a kick in the ass 🙂
There is time for everything, but maybe not all at once. Good luck on your projects!
vadeKeymaster
If you are using Max/MSP, you are a programmer whether you like it or not. Any of these environments requires the application of logic and critical problem solving, along with some creative inspiration, to make original and interesting work.
Being a programmer is like having another tool in your tool-belt, better to have even a dull knife than not have one at all. If you don’t want to be a programmer at all, I suggest not using Max.
Seriously. Use something else (and if programming turns you off, don't work in the new media scene, or get a grant so you can have someone do the work for you).
Patching environments (Max, Quartz, PD, VVVV, etc.) are still programming environments, and you will, for better or worse, be forced to learn the ins and outs of the system, the caveats and assumptions the environment has, and the best, most agile ways of using it, regardless of which environment you choose to work in. It's part of learning how the tool works and how to get it to do what *you* want it to.
Check out Andrew Benson's "Jitter Recipes" on the C74 page; they are well-written overviews of the thought process and patching process. Andrew is a great Jitter programmer and makes nice work, so the patches are approachable and understandable, as well as interesting. I have also shared some work on the forums describing how it works, but not in 'article' format.
vadeKeymaster
The web player cannot load 3rd-party plugins, so Syphon and other tools won't work. Basically, the Web Player is a separate runtime for Unity that is kind of sandboxed, so Syphon can't work in there. Also, you would require the user to have a separate app running that is either a Syphon Client or Server, independent of the web page they are viewing, which I don't see the point of.
vadeKeymaster
Honestly, Vizzie is never going to be fast. If optimization is what you want, you should be researching how to effectively use jit.gl.slab, make GL contexts, and send textures around using send and receive. Vizzie uses only CPU-based matrix operators, and will never be as fast as the *graphics* card for doing visual effects.
Get your head around basic GL with Jitter: use @automatic 0 and learn about trigger order for drawing GL, then use slabs / textures and even render-to-texture (@capture, and the jit.gl.sketch workaround using forced drawing to capture whole scenes to a single texture). These techniques are more complicated and nuanced, but will give you an amazing amount of flexibility that Vizzie simply will never be able to give you.
The fact that vizzie was not done in GL is actually kind of a shame, as it makes a lot of sense to have it there.
vadeKeymaster
I don't see *any* Syphon code in there, so I'm not sure what to tell you. If you want to send GL to Syphon, use @capture with a named texture, and send that texture to Syphon's jit.gl.syphonserver.
Syphon likes textures; send it those, especially if you are processing in OpenGL. Is that a patch you made?
December 30, 2010 at 3:44 pm in reply to: Emotion software (http://www.adrienm.net/emotion/eMotion.html) #4590
vadeKeymaster
Hi!
Yes, I did a very, very rough Syphon integration for eMotion for someone, and it works. I've not released it because there are some issues (I also do not know how to use eMotion very well). I ought to email the sources back to Adrian and let him take it from there, as I don't know enough of the internal eMotion engine to know what's what.
Be sure to thank bangnoise too; he pulled more than his weight on this. I'll ping Adrian and give him my changes.
vadeKeymaster
Nice work!
vadeKeymaster
Where are you dead-ending? What's the wall, so to speak?
vadeKeymaster
Well, clearly you have to copy the externals into your app; how else is it going to find jit.gl.syphonserver, etc., if you don't include it in the build script? Unless I am missing something, this seems like a small inconvenience for application packagers: add a framework to the runtime app bundle.