Frame rate issues in a client with Unity Syphon server


Viewing 20 posts - 1 through 20 (of 26 total)
  • #4905
    WeivCo
    Participant

    One of the issues we’ve been trying to track down as we near our 1.0 release is a behavior we notice when we connect a Syphon client: a highly variable framerate that either looks like dropped frames or actually drops them.

    I’ve done a little debugging and was able to note the following:
    The effect only happens when a client is connected.
    The effect happens both with our own 2nd display client and with Simple Client, with the window on either display (though moving SC to display 2 cuts the frame rate in half!).
    The effect seems to occur more frequently in our own display client.
    The effect is visible both in our own “live view” RenderTexture and in the client.
    Not a CPU issue: usage peaks at 14–15%, with 4 of 8 cores evenly loaded.
    Not a RAM issue, I have 8GB with most of it free and our app uses about 70MB of it.
    According to OpenGL profiler, seemingly not a GPU issue, at least in terms of overall load – plenty of texture memory left, CPU wait varies a bit but averages 40-50%.
    The effect is less noticeable at lower resolutions like 640×480, but it still happens.
    The effect occurs less frequently with VSync off as opposed to on.

    We’re using a Syphon build that’s roughly 4–5 weeks old, and here’s our code for serving frames:

    using UnityEngine;
    using System.Collections;
    using WeivSDK;
    
    /* Interop services is required for external DLL linkage. */
    using System.Runtime.InteropServices;
    
    public class SyphonFrame : MonoBehaviour {
    
    	public float fade = 1f;
    
    	protected float currentFade = 0f;
    	float fadeTime = 0.3f;
    	private float fadeVel = 0f;
    	float lastTime = 0f; // for independent dt when Time.timeScale is 0
    
    	[DllImport ("SyphonUnityPlugin")]
    	private static extern void syphonServerPublishTexture(int nativeTexture, int width, int height);
    
    	[DllImport ("SyphonUnityPlugin")]
    	private static extern void syphonServerDestroyResources();
    
    	public RenderTexture syphonRT = null;
    
    	public Material material;
    
    	public void Setup( int width, int height )
    	{
    		syphonRT = new RenderTexture( width, height, 24, RenderTextureFormat.ARGB32 );
    		syphonRT.filterMode = FilterMode.Point;
    		syphonRT.isPowerOfTwo = false;
    		syphonRT.isCubemap = false;
    		syphonRT.wrapMode = TextureWrapMode.Clamp;
    
    		currentFade = 0f;
    		lastTime = Time.realtimeSinceStartup;
    
    		syphonRT.Create();
    	}
    
    	void OnRenderImage(RenderTexture lastFXComponentRT, RenderTexture liveViewRT)
    	{
    		if ( !Mathf.Approximately(currentFade, fade) )
    		{
    			if ( material != null )
    			{
    				Color color = new Color(currentFade, currentFade, currentFade, 1f);
    				material.SetColor("_Fade", color);
    			}
    		}
    
    		if (material != null)
    		{
    			// render from the pre-GUI screen to the live view using the Weiv Camera Material
    			Graphics.Blit(lastFXComponentRT, liveViewRT, material);
    		}
    		else
    		{
    			Debug.Log("ERR ::::: Weiv Camera Material was null!");
    			// render from the pre-GUI screen to the live view
    			Graphics.Blit(lastFXComponentRT, liveViewRT);
    		}
    
    		if ( Application.platform == RuntimePlatform.OSXEditor || Application.platform == RuntimePlatform.OSXPlayer )
    		{
    			// render from the live view to syphon
    			Graphics.Blit(liveViewRT, syphonRT);
    
    			// Syphon "render one pixel" bug fix
    			// TODO: investigate this more
    			Graphics.DrawTexture(new Rect(0, 0, 0, 0), syphonRT);
    
    			/* Grab a copy for Syphon */
    			syphonServerPublishTexture(syphonRT.GetNativeTextureID(), syphonRT.width, syphonRT.height);
    		}
    	}
    
    	void FixedUpdate ()
    	{
    		// get dt independent of Time.timeScale
    		float dt = Time.realtimeSinceStartup - lastTime;
    
    		float newFade = Mathf.SmoothDamp(currentFade, fade, ref fadeVel, fadeTime, Mathf.Infinity, dt);
    		currentFade = newFade;
    
    		lastTime = Time.realtimeSinceStartup;
    	}
    
    	void OnDestroy( )
    	{
    		if ( syphonRT != null )
    		{
    			DestroyImmediate( syphonRT );
    		}
    	}
    
    	// Also called in the editor when play is stopped
    	void OnDisable ()
    	{
    		Destroy(syphonRT);
    		GL.InvalidateState();
    	}
    
    	public void ChangeResolution( int width, int height )
    	{
    		if ( syphonRT != null )
    		{
    			DestroyImmediate( syphonRT );
    
    		}
    
    		syphonRT = new RenderTexture( width, height, 24, RenderTextureFormat.ARGB32 );
    		syphonRT.filterMode = FilterMode.Point;
    		syphonRT.isPowerOfTwo = false;
    		syphonRT.isCubemap = false;
    		syphonRT.wrapMode = TextureWrapMode.Clamp;
    
    		syphonRT.Create( );
    	}
    }

    If I were to guess, I’d say this is an issue with the server, the Unity server specifically, the Unity plug-in system, or our implementation. In other words, not an issue with the client. Any ideas?

    #4906
    Brian Chasalow
    Participant

    dunno. are you having the same issue in the editor and the runtime?

    #4907
    WeivCo
    Participant

    Ok, further data that points to some kind of Syphon serving issue. I figured out how to attach the Unity profiler to our standalone build and profiled three runs: VSync on with a client connected, VSync off with a client connected, and VSync off with no client, where the visual result is good. Most of our time is spent in Device.Present, which seems to mean waiting for the GPU. It runs pretty well, except for the occasional spike, which seems to correspond directly with a “chunking” effect that looks like 1-2 dropped frames. And sometimes the spikes are more frequent, resulting in a “grinding” effect.

    Outputting to client, VSync ON: http://img846.imageshack.us/img846/3762/screenshot20111011at112o.png

    Outputting to client, VSync OFF: http://img684.imageshack.us/img684/2909/screenshot20111011at124.png

    Not outputting, VSync OFF: http://img17.imageshack.us/img17/2909/screenshot20111011at124.png

    Brian, thanks for the response, yeah same behavior happening in both.

    #4908
    Brian Chasalow
    Participant

    well, this line in the plugin has me concerned, because the context init’d from CGLGetCurrentContext() may not exist by the time the texture is published later on. When you resize/fullscreen/etc. a Unity window, it destroys and recreates the context. I’m not sure if this could lead to that grinding effect – usually if something like that were to cause an issue, you would simply see nothing on screen. But it’s something I noticed in another plugin I’ve been working on, and I need to figure out a good solution for this one as well.

    if (_unitySyphonServer == nil)
    {
        _unitySyphonServer = [[SyphonServer alloc] initWithName:@"Demo" context:CGLGetCurrentContext() options:nil];
        NSLog(@"creating Syphon Server");
    }
    #4909
    WeivCo
    Participant

    Thanks Brian, I appreciate your feedback.

    I guess I forgot to mention that our app runs windowed at 1024×640, and we use RenderTextures to send frames at various resolutions to our live view and then to Syphon, as the code above demonstrates. Our resolution settings are saved, so when I launch the app it loads at a set resolution and just stays there. But could it be a problem that the app’s resolution differs from the one sent to Syphon, with Syphon mistakenly thinking the resolution has changed and recreating the context every frame? It seems like that would slow things down a lot more than what we’re seeing right now.

    And I forgot to mention what is almost the most important thing! We’ve found an odd pseudo-workaround.

    The effect pretty much goes away completely when you change the timeScale below 1. Even just 0.5 makes things look much better.

    #4910
    vade
    Keymaster

    I’m honestly not as familiar with Unity internals as Brian is (much, much less so), so I’m a bit hesitant to comment on the Unity side of things. What I do know from working with Brian on the initial plugin is that, indeed, Unity bounces contexts to and fro, which means the plugin cannot be guaranteed to have the same context active as when it was initialized – thus those checks.

    Might I suggest compiling a debug version of the Syphon Unity plugin and ensuring that it’s not re-creating resources due to context switches during the hiccups?

    I am personally not very satisfied with the state of the Unity plugin, only because Unity is such a closed black box that it’s next to impossible to know which state assumptions and which areas in the pipeline are the most suitable to ‘abuse’.

    As far as a ‘serving’ issue is concerned, know this:

    Syphon, internally, has to add an additional render-to-texture phase *if* it is publishing a texture from a pre-existing texture (an alternative is to bind, then unbind and publish, which renders directly to Syphon’s texture; however, this is only useful in some scenarios).

    This additional render-to-texture pass copies the incoming texture to the IOSurface-backed texture for display. This is usually quite fast, fully hardware accelerated, and, judging from your benchmarks, seems unlikely to be the cause for concern.

    However, *making a new internal IOSurface-backed texture* is *non-trivial* and *quite slow*. One early optimization we did in Syphon was to ensure we only rarely create new internal IOSurfaces, and only when needed. If your context is changing behind the scenes from time to time, and you init a new server, you will init a new internal IOSurface-backed texture, which might account for some of the slowdown and grinding – as I understand it, that also has to init resources on the GPU *and* in the OS kernel.

    Run a debug build of the Unity plugin. Put logs in places where checks are done (check if contexts match, check if servers are being re-created, etc), and see what internally is happening. It might be revealing and help narrow it down.
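    For example, the publish path could log context changes along these lines (just a sketch – the _serverContext variable is hypothetical, not something in the actual plugin):

    	CGLContextObj current = CGLGetCurrentContext();
    	/* Log if the context has changed since the server was created */
    	if (_unitySyphonServer != nil && current != _serverContext)
    	{
    		NSLog(@"Context changed: server created on %p, now on %p", _serverContext, current);
    	}
    	if (_unitySyphonServer == nil)
    	{
    		NSLog(@"(Re)creating Syphon server on context %p", current);
    		_serverContext = current;
    		_unitySyphonServer = [[SyphonServer alloc] initWithName:@"Demo" context:current options:nil];
    	}

    If the first log line fires around the same time as the hiccups, that would point at context churn; if the second fires more than once per run, you’re re-creating the server (and its IOSurface) behind your own back.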

    Sorry I cannot be of more immediate help! I wish Unity was not quite as black box’ed as it is…

    #4911
    WeivCo
    Participant

    Thank you gentlemen, I can’t tell you how much I appreciate your suggestions and speedy responses. We are trying to solve this problem during work hours, so I definitely am grateful. 🙂 Our app is a Unity app whose only output is via Syphon, so I guess there’s an incentive for us, heh.

    I also want to say that Syphon is awesome, and you guys have done a great job sticking with it. We are also thankful and impressed by the unified support it has received within the VJ software space. I’m glad we’ll be built around Syphon from the get-go!

    Anyway, we’ll look into this more and keep the thread updated.

    Side note, I also found a couple interesting links while searching around for IOSurface stuff regarding web browsers using it:
    http://src.chromium.org/svn/trunk/src/content/browser/renderer_host/accelerated_surface_container_mac.cc
    https://bugzilla.mozilla.org/show_bug.cgi?id=598425

    #4912
    Brian Chasalow
    Participant

    In runtime builds or when the scene view is closed, there is only one context. When the window is updated/resized/fullscreen’d etc, the context is destroyed and recreated.

    if you have a scene view open in the Unity Editor, it would appear that you have two shared contexts. Calling CGLGetCurrentContext() in Update will bounce between the two contexts, which leads me to believe that methods like Update() are called every other frame by the scene view and the game view- an unlikely although possible scenario. I have an email out to Aras requesting more information about that.

    The Syphon plugin does not currently take context destruction/recreation into account, and should, but I’m not sure how to go about this yet. Previously, I had it such that if the screen width or height changed, it would destroy and recreate the server instance, effectively doing the same thing. I do not see this in your code, and I do not see you calling syphonServerDestroyResources anywhere, which is very important on server destruction.
    I doubt this is where the grinding is coming from though, with what Anton was saying.

    This line of code always had me a little concerned:
    Graphics.DrawTexture(new Rect(0, 0, 0, 0), syphonRT);
    as it remains a mystery why it is needed for Syphon to function properly in Unity.

    #4913
    WeivCo
    Participant

    All right, well I came out alive after jumping into the deep end of Xcode without really knowing it or Obj-C, so that’s good 😛

    I added NSLog lines in the Unity plug-in when publishing, and in Syphon in bindToDrawFrameOfSize, and got these first few lines (2400×600 is just the current test resolution; my problem is occurring at 640×480 as well):

    2011-10-11 19:06:25.031 Weiv 1.0 Demo[6692:207] creating Syphon Server
    2011-10-11 19:06:25.032 Weiv 1.0 Demo[6692:207] Publishing to texture 2400, 600
    2011-10-11 19:06:25.033 Weiv 1.0 Demo[6692:207] Rebuilding surface from 0.000000, 0.000000 to 2400.000000, 600.000000
    2011-10-11 19:06:25.064 Weiv 1.0 Demo[6692:207] Publishing to texture 2400, 600

    And I keep publishing till the server is destroyed. So the surface is only being rebuilt once, it seems, which is good. Though why do I publish once before rebuilding?

    Yeah, that line of code seems to fix some kind of weird UV issue or something. The effect is that one pixel of the image is sampled and drawn for the entire screen. A couple of our scenes triggered it, one of them when a camera was in a specific place and/or some sprites were being instantiated at a specific moment. Other scenes seemed to have it all the time. Changing to VertexLit rendering was a workaround, but I looked at your example script and reduced the fix down to that one line while maintaining a Unity-friendly OnRenderImage. I’m assuming that’s why the surface is being rebuilt from 0,0?

    BTW our destroying happens here:

    using UnityEngine;
    using System.Collections;
    using System.Runtime.InteropServices;
    
    public class SyphonServer : MonoBehaviour {
    
    	/* Nothing to be done for initialization */
    	/* DLL is programmed to auto-initialize with "Weiv" upon first publish */
    
    	/* Cleanup */
    	[DllImport ("SyphonUnityPlugin")]
    	private static extern void syphonServerDestroyResources();
    
    	/* To help handle resizing */
    	private float mScreenWidth;
    	private float mScreenHeight;
    
    	void Start ()
    	{
    		mScreenWidth = Screen.width;
    		mScreenHeight = Screen.height;
    	}
    
    	void LateUpdate()
    	{
    		if ( Application.platform == RuntimePlatform.OSXEditor || Application.platform == RuntimePlatform.OSXPlayer )
    		{
    			if ( mScreenWidth != Screen.width || mScreenHeight != Screen.height )
    			{
    				mScreenWidth = Screen.width;
    				mScreenHeight = Screen.height;
    				syphonServerDestroyResources( );
    			}
    		}
    	}
    
    	void OnApplicationQuit( )
    	{
    		if ( Application.platform == RuntimePlatform.OSXEditor || Application.platform == RuntimePlatform.OSXPlayer )
    		{
    			/* Cleanup everything and close the server */
    			syphonServerDestroyResources( );
    		}
    	}
    
    }

    #4914
    Brian Chasalow
    Participant

    "I'm assuming that's why the surface is being rebuilt from 0,0?"

    EDIT: i thought you were referring to the Graphics.DrawTexture call, but you were talking about something else. my bad.

    anyway,
    calling Graphics.DrawTexture(new Rect(0, 0, 0, 0), syphonRT) has nothing to do with IOSurface/Syphon/logic/sanity. It is a wonky Unity workaround to basically force GL to clean up its act, so that by the time it serves the texture to Syphon, its matrices are popped. This way, it will be completely done rendering regardless of whatever magic Unity is doing behind the scenes by the time you serve the screen texture.

    I found that fix accidentally after days/weeks of fighting with Syphon/Unity, and it only still exists because I don’t know of any other way to make it work.

    #4915
    vade
    Keymaster

    The reason the surface is rebuilt from 0,0 is that when you initialize a Syphon Server Obj-C object, it has an NSSize struct that is zero-sized, and it will only init the texture when you first call publishFrameBlahBlah, because it needs to make a texture to match.

    If your scene size is changing at all, then a rebuild is triggered, because textures cannot be resized, only destroyed and re-created anew at a different size.

    #4916
    WeivCo
    Participant

    Thanks guys, good to know more about what’s going on. I’ll keep digging and try to get some more information.

    #4917
    WeivCo
    Participant

    Here’s a run-through where I load a scene that creates a server, let it sit there for a few seconds, then open a client, then let it sit there a while longer, then quit:

    19:35:27.771 Weiv 1.0 Demo[14489:207] creating Syphon Server
    19:35:27.778 Weiv 1.0 Demo[14489:207] pimpedVersionForSyphon didn't find an icon key
    19:35:27.793 Weiv 1.0 Demo[14489:207] Made a new appImage
    19:35:27.793 Weiv 1.0 Demo[14489:207] NEW server description in handleServerAnnounce
    19:35:27.794 Weiv 1.0 Demo[14489:207] uuid found in indexOfDescriptionForSyphonServerUUID
    19:35:27.794 Weiv 1.0 Demo[14489:207] index not found in handleServerAnnounce, but didChange
    19:35:27.798 Weiv 1.0 Demo[14489:207] Sizes didn't match, rebuilding surface from 0.000000, 0.000000 to 2400.000000, 600.000000
    19:35:27.799 Weiv 1.0 Demo[14489:207] Destroying IOSurface now
    19:35:27.799 Weiv 1.0 Demo[14489:207] Setting up IOSurface
    19:35:27.802 Weiv 1.0 Demo[14489:207] Initializing
    19:35:27.803 Weiv 1.0 Demo[14489:207] _pushPending, so glFlush-ing...kawooooossshhhh
    
    Attempting to open second display.
    
    Display showing!
    
    19:36:06.871 Weiv 1.0 Demo[14489:207] pimpedVersionForSyphon didn't find an icon key
    19:36:06.874 Weiv 1.0 Demo[14489:207] Made a new appImage
    19:36:06.874 Weiv 1.0 Demo[14489:207] NEW server description in handleServerAnnounce
    19:36:06.874 Weiv 1.0 Demo[14489:207] uuid found in indexOfDescriptionForSyphonServerUUID
    19:36:06.874 Weiv 1.0 Demo[14489:207] index found in handleServerAnnounce
    19:36:08.569 Weiv 1.0 Demo[14489:1a903] No response to pings
    19:36:08.570 Weiv 1.0 Demo[14489:1a903] All servers responded to announce request.
    
    Bluetooth Adapter looking stopped.
    
    19:36:38.760 Weiv 1.0 Demo[14489:1aa03] Draining queue
    19:36:38.760 Weiv 1.0 Demo[14489:207] Destroying IOSurface now
    19:36:38.760 Weiv 1.0 Demo[14489:207] Deleting texture
    19:36:38.761 Weiv 1.0 Demo[14489:207] Releasing surface and context
    19:36:38.761 Weiv 1.0 Demo[14489:207] Destroying IOSurface now
    19:36:38.762 Weiv 1.0 Demo[14489:207] destroying Syphon Server
    #4918
    Brian Chasalow
    Participant

    I’m not entirely sure what you’re doing there without seeing the respective code but it doesn’t look to be doing anything strange.

    #4919
    WeivCo
    Participant

    Yeah, sorry, I guess I could have added more “in thisFunction” to the output. But my conclusion is the same. It looks pretty normal to me.

    The only thing I’m confused about is the “NEW server description” happening twice. I would have thought that the second time it would have updated instead of making a new one again. But there could be a good reason for that, I don’t have that great of an understanding of everything yet. Still, nothing seems to be getting destroyed and rebuilt every couple seconds or anything. So next up is digging through the client I suppose.

    #4920
    vade
    Keymaster

    out of curiosity, are you using a system with two GPUs, or a single GPU, dual head ?

    #4921
    WeivCo
    Participant

    I’m on a Rev A quad-core MacBook Pro i7. Two GPUs, the AMD/ATI 6750M and the HD 3000. But I’m using gfxCardStatus to force Discrete Only. Earlier Thunderbolt drivers would not allow output at all while using the HD 3000, so I’m assuming that the AMD is driving both. But there was a driver update recently, so I’m not sure if that has changed or not.

    I did notice for the clients that they run much better on the primary display. Do you think it’s one of those CVDisplayLink-set-to-primary issues or that this is actually using the HD 3000 for the secondary display? Because that would be unfortunate in either case…

    #4922
    Brian Chasalow
    Participant

    No idea on this end, but it does seem like a potential source of problems. I would try to take the hardware out of the equation before poking too hard at the software- whatever that takes. Maybe someone else has better advice.

    #4923
    WeivCo
    Participant

    I’ve done quite a bit more research here, including testing just on my native display with a Simple Client where I know for sure that I’m using just my AMD card. I still get the issue.

    I’ve managed to fire up OpenGL Profiler and attach it to our app to trace calls. It renders very fast normally; only when I get a “grinding” type of effect do I notice a 10x increase in the amount of time spent in CGLFlushDrawable, i.e. copying from the back buffer to the front. Based on my research, I wonder if this has to do with mixing single- and double-buffering?

    From what I understand, glFlushRenderAPPLE is for single-buffering, and I don’t see any usage of a fence, which means Unity -> Syphon server -> Syphon client would be double -> single -> double buffering.
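    For reference, the GL_APPLE_fence extension I mean works roughly like this (just a sketch of the API shape, not code from Syphon or the plugin):

    	GLuint fence;
    	glGenFencesAPPLE(1, &fence);
    
    	/* Producer: mark the end of the draw commands into the shared texture */
    	glSetFenceAPPLE(fence);
    	glFlushRenderAPPLE(); /* single-buffered flush */
    
    	/* Consumer: make sure those commands have completed before sampling */
    	if (!glTestFenceAPPLE(fence))  /* non-blocking poll */
    		glFinishFenceAPPLE(fence); /* or block until the commands finish */
    
    	glDeleteFencesAPPLE(1, &fence);

    Without something like this, I’d guess the client can only rely on the implicit ordering the driver gives it, which is where my “out of phase” suspicion comes from.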

    I found on one post:

    Calling flushBuffer more than once within the refresh period should actually block your application (lots of time spent in CGLFlushDrawable appearing in a Shark profile).

    If the rendering process is “out of phase” (as you call it in a Nov 2009 quartz-dev thread), maybe a flush is effectively being called too many times in a refresh period?

    #4924
    WeivCo
    Participant

    Just found this:
    http://fedora.cis.cau.edu/~pmolnar/CIS200P11/Pong/OpenGL.framework/Headers/CGLIOSurface.h

    Does “changing the IOSurface on the CPU” only mean CPU rendering? Cuz I guess there’s an IOSurfaceLock and unlock. Not sure if that would help sync things when it comes to weird issues like this?

    EDIT: Oh you seem to already know about it. Quite a frequenter on the Apple lists 😛
