dirrrk

Forum Replies Created

    in reply to: Texture from NSBitmapImageRep to SyphonServer? #5200
    dirrrk
    Participant

    Thanks bangnoise!
    That was it. I did not set

    CGLSetCurrentContext( syServer.context );

    at the top of my rendering function. Thanks for the help!!!
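
    For anyone who finds this thread later, the corrected start of the renderToSyphon method from my post below looks roughly like this (just a sketch of the relevant lines):

    - (void) renderToSyphon {
    	// Make the SyphonServer context current *before* any gl* calls,
    	// otherwise the texture ends up in whatever context happens to be current.
    	CGLSetCurrentContext(syServer.context);
    	CGLLockContext(syServer.context);
    
    	// ... texture upload and publishFrameTexture: exactly as before ...
    
    	CGLUnlockContext(syServer.context);
    }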

    And also thanks, Vade, for the CoGeWebKit QC plugin. It behaves much more smoothly than the trick I devised!
    (I’m polling the NSView’s - (BOOL)needsDisplay method to see whether anything changed and whether I should push an image to the Syphon server, but the results are not nearly as smooth as the CoGeWebKit plugin’s. The double buffering and the asynchronicity between the two sides are a really good guide for improving mine, roughly as sketched below.)
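
    What I mean by polling is, very roughly, this (just a sketch; the webView property and the 30 fps timer are placeholders, while renderToSyphon is the method from my earlier post below):

    // Sketch only: poll the web view and publish a frame when it reports a change.
    - (void) startPolling {
    	// Keep a reference to the timer if you need to invalidate it later.
    	[NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0
    									 target:self
    								   selector:@selector(pollWebView:)
    								   userInfo:nil
    									repeats:YES];
    }
    
    - (void) pollWebView:(NSTimer *)timer {
    	if ([self.webView needsDisplay]) {
    		[self renderToSyphon];	// only push a new frame when something changed
    	}
    }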

    Thanks again!
    Dirk

    in reply to: Texture from NSBitmapImageRep to SyphonServer? #5198
    dirrrk
    Participant

    Cool!
    Thanks for the link to that project. That’s very useful!! This is almost exactly what I need to do.

    However, I’m still struggling to get this to work outside of Quartz Composer. When I look at the code of the CoGe WebKit plugin, I think the core of it is in - (BOOL) execute:(id<QCPlugInContext>)context atTime:(NSTimeInterval)time withArguments:(NSDictionary*)arguments. This is also where the conversion from the NSBitmapImageRep (webBitmap) to a GLuint texture (webTexture1) happens. From line 846 in CoGeWebKitPlugIn.m (in svn r15):

    //NSLog(@"rendering...");

    In the plugin a Quartz Composer outputImage is created from the texture:

    #if __BIG_ENDIAN__
    #define CogePrivatePlugInPixelFormat QCPlugInPixelFormatARGB8
    #else
    #define CogePrivatePlugInPixelFormat QCPlugInPixelFormatBGRA8
    #endif
    
    	self.outputImage =  [context outputImageProviderFromTextureWithPixelFormat:CogePrivatePlugInPixelFormat
    					pixelsWide:width
    					pixelsHigh:height
    					name:webTexture1
    					flipped:YES
    					releaseCallback:_TextureReleaseCallback
    					releaseContext:NULL
    					colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB)
    					shouldColorMatch:YES];

    However, if I replace the Quartz Composer-specific snippet above with the Syphon server publishing code:

    CGLLockContext(syServer.context);
    [syServer publishFrameTexture:webTexture1
    				textureTarget:GL_TEXTURE_RECTANGLE_EXT
    				  imageRegion:NSMakeRect(0, 0, [self.webBitmap pixelsWide], [self.webBitmap pixelsHigh])
    			textureDimensions:[self.webBitmap size]
    					  flipped:YES];
    CGLUnlockContext(syServer.context);

    I still only get artifacts in my Syphon client. 🙁

    My complete renderToSyphon method currently looks like this:

    - (void) renderToSyphon {
    	//NSLog(@"rendering...");
    	CGLLockContext(syServer.context);
    
    	glPushAttrib(GL_COLOR_BUFFER_BIT | GL_TRANSFORM_BIT | GL_VIEWPORT_BIT);
    
    	// create our texture
    	glEnable(GL_TEXTURE_RECTANGLE_EXT);
    	glGenTextures(1, &webTexture1);
    	glBindTexture(GL_TEXTURE_RECTANGLE_EXT, webTexture1);
    
    	if (self.webBitmap != NULL) {
    		@synchronized(self.webBitmap) {
    
    			glPixelStorei(GL_UNPACK_ROW_LENGTH, [self.webBitmap bytesPerRow] / [self.webBitmap samplesPerPixel]);
    			glPixelStorei (GL_UNPACK_ALIGNMENT, 1); 
    
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_S, GL_CLAMP);
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_T, GL_CLAMP);
    
    			glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0,
    				[self.webBitmap samplesPerPixel] == 4 ? GL_RGBA8 : GL_RGB8,
    				[self.webBitmap pixelsWide],
    				[self.webBitmap pixelsHigh],
    				0,
    				[self.webBitmap samplesPerPixel] == 4 ? GL_RGBA : GL_RGB,
    				GL_UNSIGNED_BYTE, [self.webBitmap bitmapData]);
    
    		}
    	}
    	glFlushRenderAPPLE();
    
    	[syServer publishFrameTexture:webTexture1
    					textureTarget:GL_TEXTURE_RECTANGLE_EXT
    					  imageRegion:NSMakeRect(0, 0, [self.webBitmap pixelsWide], [self.webBitmap pixelsHigh])
    				textureDimensions:[self.webBitmap size]
    						  flipped:YES];
    
    	glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 0);
    	glPopAttrib();
    
    	CGLUnlockContext(syServer.context);
    }

    Could I be creating the wrong type of context for the SyphonServer? How would you create a safe offscreen context?
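
    For reference, what I mean by an offscreen context is something along these lines (a sketch, not my actual code, and the pixel format attributes are a guess on my part):

    // Sketch: build an offscreen CGL context and hand it to the SyphonServer.
    CGLPixelFormatAttribute attribs[] = {
    	kCGLPFAAccelerated,
    	kCGLPFAColorSize, (CGLPixelFormatAttribute)32,
    	kCGLPFADepthSize, (CGLPixelFormatAttribute)24,
    	(CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pixelFormat = NULL;
    GLint formatCount = 0;
    CGLChoosePixelFormat(attribs, &pixelFormat, &formatCount);
    
    CGLContextObj context = NULL;
    CGLCreateContext(pixelFormat, NULL, &context);
    CGLDestroyPixelFormat(pixelFormat);
    
    // "Web View" is just a placeholder server name.
    syServer = [[SyphonServer alloc] initWithName:@"Web View"
    									  context:context
    									  options:nil];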

    Or is there some really simple render-to-texture example that I could put between my

    CGLLockContext(syServer.context);

    and

    [syServer publishFrameTexture:someTexture
    				textureTarget:GL_TEXTURE_RECTANGLE_EXT
    				  imageRegion:NSMakeRect(0, 0, w, h)
    			textureDimensions:NSMakeSize(w, h)
    					  flipped:NO];
    CGLUnlockContext(syServer.context);

    to see whether I have the other parts of my application in order, i.e. how I create the SyphonServer and its context? The sketch below is the kind of thing I’m imagining.
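
    For example, clearing a texture to a solid colour through an FBO and publishing that, so any remaining artifacts would point at my context/server setup rather than at the bitmap upload (just a sketch; it assumes the server’s context is current and w/h hold the frame size):

    // Sanity-check sketch: publish a solid-colour texture instead of the web bitmap.
    GLuint fbo = 0, testTexture = 0;
    glGenTextures(1, &testTexture);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, testTexture);
    glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA8, w, h, 0,
    	GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
    	GL_TEXTURE_RECTANGLE_EXT, testTexture, 0);
    
    glViewport(0, 0, w, h);
    glClearColor(1.0f, 0.0f, 0.0f, 1.0f);	// solid red
    glClear(GL_COLOR_BUFFER_BIT);
    
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    // ... publishFrameTexture: with testTexture here, then delete the FBO and texture ...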

    … or, more generally: how does one debug an OpenGL project when the output is just artifacts from the graphics card? For instance, should I just be checking glGetError after every call, with a helper like the one sketched below?
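
    (Just a crude debugging sketch, nothing Syphon-specific; I would call it right after a suspect GL call, e.g. CheckGLError("glTexImage2D") after the upload.)

    static void CheckGLError(const char *label)
    {
    	GLenum err;
    	while ((err = glGetError()) != GL_NO_ERROR) {
    		NSLog(@"GL error 0x%04X after %s", err, label);
    	}
    }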

    (Again, please forgive my OpenGL inexperience and ignorance.)
    Thanks,

    Dirk
