Texture from NSBitmapImageRep to SyphonServer?

  • #5196
    dirrrk
    Participant

    Hi,

    First of all, thank you for building an awesome tool for OS X!

    And my apologies for being an OpenGL newbie.

    In the process of building a Syphon Server into a transparent web browser, I’m trying to put an offscreen texture into the SyphonServer. (To understand what I mean by “transparent web browser”, see the TransparentWebView git project for my borderless, clear-color web browser window.)

    Because WebKit is not OpenGL based, I’m looking to convert the image data (with alpha) from a WebKit WebView (an NSView) into an offscreen OpenGL texture which I can publish with the SyphonServer. To get the image data from the NSView I’m using -[NSBitmapImageRep initWithFocusedViewRect:]. However, due to my lack of understanding of OpenGL, pixel formats, contexts and textures, I’m failing to get a proper texture into the Syphon Server.

    My results look like this:

    [screenshot: garbled, noisy texture output in the Syphon client]

    I used the code of SyphonScreenCapture as inspiration for how to set up a SyphonServer, and I also tried the official/documented way in combination with a snippet that creates a texture from an NSBitmapImageRep.

    Excerpts from my render method. Creating the NSBitmapImageRep (this successfully creates an image with alpha):

    NSRect webViewRect = [self bounds];
    
    // Capture the view's current contents into a bitmap (keeps the alpha channel)
    [self lockFocus];
    NSBitmapImageRep *bitmapData = [[NSBitmapImageRep alloc] initWithFocusedViewRect:webViewRect];
    [self unlockFocus];

    Texture creation method one (fails; see screenshot above):

    // Upload the bitmap's pixels into the server's own frame image,
    // then publish via bind/unbind
    SyphonImage *serverImage = [syServer newFrameImage];
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, [serverImage textureName]);
    glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA, webViewRect.size.width, webViewRect.size.height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, [bitmapData bitmapData]);
    [syServer bindToDrawFrameOfSize:webViewRect.size];
    [syServer unbindAndPublish];

    Texture creation method two (fails; gives me similar artifacts):

    // Pick the upload format based on whether the bitmap has alpha
    GLenum format = [bitmapData hasAlpha] ? GL_RGBA : GL_RGB;
    NSSize image_size = [bitmapData size];
    
    GLuint tex_id;
    glGenTextures(1, &tex_id);
    glBindTexture(GL_TEXTURE_2D, tex_id);
    
    // Upload the pixels and build mipmaps (needs <OpenGL/glu.h>)
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA8, image_size.width, image_size.height, format, GL_UNSIGNED_BYTE, [bitmapData bitmapData]);
    
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    
    [syServer publishFrameTexture:tex_id textureTarget:GL_TEXTURE_2D imageRegion:webViewRect textureDimensions:webViewRect.size flipped:YES];
    
    glBindTexture(GL_TEXTURE_2D, 0);

    And this is the NSOpenGLPixelFormat I am using. I also tried other pixel formats, with similar results. When I tried to add NSOpenGLPFAOffScreen (since I’m not drawing my texture to screen), I got an error telling me that such a pixel format cannot be created.

    const NSOpenGLPixelFormatAttribute attr[] = {
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFAAccelerated,
        NSOpenGLPFAColorSize, 32,   // attribute values are integers, not floats
        NSOpenGLPFADepthSize, 32,
        0
    };
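
    For reference, I create the GL context and the SyphonServer from these attributes roughly like this (a sketch; pf, ctx and the server name are illustrative, not from the excerpts above):

    NSOpenGLPixelFormat *pf = [[NSOpenGLPixelFormat alloc] initWithAttributes:attr];
    NSOpenGLContext *ctx = [[NSOpenGLContext alloc] initWithFormat:pf shareContext:nil];
    // Hand the underlying CGL context to the server
    syServer = [[SyphonServer alloc] initWithName:@"TransparentWebView"
                                          context:(CGLContextObj)[ctx CGLContextObj]
                                          options:nil];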

    Hopefully somebody can explain what I’m doing wrong and how I can fix it.
    Thanks,
    Dirk

    #5197
    vade
    Keymaster

    You should take a look at the CoGe WebKit QC plugin, which has methods to take a WebKit-based view and turn it into an OpenGL texture:

    http://code.google.com/p/cogewebkit/

    I helped write the rendering, so let me know if anything in there needs explaining 🙂

    One thing I’m noticing: you should be calling glEnable(GL_TEXTURE_RECTANGLE_EXT) (or glEnable(GL_TEXTURE_2D)) before using the textures.
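
    i.e. something like this (a minimal sketch, reusing the names from your first method):

    glEnable(GL_TEXTURE_RECTANGLE_EXT);   // enable the target before touching the texture
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, [serverImage textureName]);
    glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA, webViewRect.size.width, webViewRect.size.height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, [bitmapData bitmapData]);
    glDisable(GL_TEXTURE_RECTANGLE_EXT);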

    That might be it?

    #5198
    dirrrk
    Participant

    Cool!
    Thanks for the link to that project. That’s very useful! It’s almost exactly what I need to do.

    However, I’m still struggling with getting this to work outside of Quartz Composer. When I look at the code of the CoGe WebKit plugin, I think the core of the code is in - (BOOL) execute:(id<QCPlugInContext>)context atTime:(NSTimeInterval)time withArguments:(NSDictionary*)arguments. This is also where the conversion from NSBitmapImageRep (webBitmap) to GLuint texture (webTexture1) is happening. From line 846 in CoGeWebKitPlugIn.m (in svn r15):

    //NSLog(@"rendering...");

    In the plugin a Quartz Composer outputImage is created from the texture:

    #if __BIG_ENDIAN__
    #define CogePrivatePlugInPixelFormat QCPlugInPixelFormatARGB8
    #else
    #define CogePrivatePlugInPixelFormat QCPlugInPixelFormatBGRA8
    #endif
    
    	self.outputImage =  [context outputImageProviderFromTextureWithPixelFormat:CogePrivatePlugInPixelFormat
    					pixelsWide:width
    					pixelsHigh:height
    					name:webTexture1
    					flipped:YES
    					releaseCallback:_TextureReleaseCallback
    					releaseContext:NULL
    					colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB)
    					shouldColorMatch:YES];

    However, if I replace the Quartz Composer-specific snippet above with the Syphon Server publishing stuff:

    CGLLockContext(syServer.context);
    [syServer publishFrameTexture:webTexture1
    				textureTarget:GL_TEXTURE_RECTANGLE_EXT
    				  imageRegion:NSMakeRect(0, 0, [self.webBitmap pixelsWide], [self.webBitmap pixelsHigh])
    			textureDimensions:[self.webBitmap size]
    					  flipped:YES];
    CGLUnlockContext(syServer.context);

    I still only get artifacts in my Syphon client. 🙁

    My complete renderToSyphon method now looks like this:

    - (void) renderToSyphon {
    	//NSLog(@"rendering...");
    	CGLLockContext(syServer.context);
    
    	glPushAttrib(GL_COLOR_BUFFER_BIT | GL_TRANSFORM_BIT | GL_VIEWPORT_BIT);
    
    	// create our texture
    	glEnable(GL_TEXTURE_RECTANGLE_EXT);
    	glGenTextures(1, &webTexture1);
    	glBindTexture(GL_TEXTURE_RECTANGLE_EXT, webTexture1);
    
    	if (self.webBitmap != NULL) {
    		@synchronized(self.webBitmap) {
    
    			glPixelStorei(GL_UNPACK_ROW_LENGTH, [self.webBitmap bytesPerRow] / [self.webBitmap samplesPerPixel]);
    			glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_S, GL_CLAMP);
    			glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_T, GL_CLAMP);
    
    			glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0,
    				[self.webBitmap samplesPerPixel] == 4 ? GL_RGBA8 : GL_RGB8,
    				[self.webBitmap pixelsWide],
    				[self.webBitmap pixelsHigh],
    				0,
    				[self.webBitmap samplesPerPixel] == 4 ? GL_RGBA : GL_RGB,
    				GL_UNSIGNED_BYTE, [self.webBitmap bitmapData]);
    
    		}
    	}
    	glFlushRenderAPPLE();
    
    	[syServer publishFrameTexture:webTexture1
    					textureTarget:GL_TEXTURE_RECTANGLE_EXT
    					  imageRegion:NSMakeRect(0, 0, [self.webBitmap pixelsWide], [self.webBitmap pixelsHigh])
    				textureDimensions:[self.webBitmap size]
    						  flipped:YES];
    
    	glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 0);
    	glPopAttrib();
    
    	CGLUnlockContext(syServer.context);
    }

    Could I be creating the wrong type of context for the Syphon Server? How would you create a safe offscreen context?

    Or is there some really simple render-to-texture example which I could put between my

    CGLLockContext(syServer.context);

    and

    [syServer publishFrameTexture:someTexture
    				textureTarget:GL_TEXTURE_RECTANGLE_EXT
    				  imageRegion:NSMakeRect(0, 0, w, h)
    			textureDimensions:NSMakeSize(w, h)
    					  flipped:NO];
    CGLUnlockContext(syServer.context);

    to check whether I have the other parts of my application in order, i.e. how I create the SyphonServer and the context?

    … or, more generally: how does one debug an OpenGL project when the output is just artifacts from the graphics card?

    (Again, please forgive my OpenGL inexperience and ignorance.)
    Thanks,

    Dirk

    #5199
    bangnoise
    Keymaster

    Is your GL context the current context when you render? i.e. are you either setting it explicitly using CGLSetCurrentContext (or its NS alternative), or better, using CGLMacro.h?

    That would be my first guess.
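
    For example (a minimal sketch, using the syServer.context from your posts):

    // Make the server's context current before issuing any gl* commands
    CGLSetCurrentContext(syServer.context);
    CGLLockContext(syServer.context);

    // ... glGenTextures / glTexImage2D / publishFrameTexture: as before ...

    CGLUnlockContext(syServer.context);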

    or, more generally: how does one debug an OpenGL project when the output is just artifacts from the graphics card?

    Attach OpenGL Profiler to your app and set it to break on any OpenGL error.
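
    You can also sprinkle glGetError() checks through your render code to narrow down where things go wrong; a minimal sketch (CheckGL is not from the code above):

    #import <OpenGL/gl.h>

    // Log and drain any pending GL errors; call after a suspect block of commands
    static void CheckGL(const char *label)
    {
        GLenum err;
        while ((err = glGetError()) != GL_NO_ERROR) {
            NSLog(@"GL error 0x%04X after %s", err, label);
        }
    }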

    Any help?

    #5200
    dirrrk
    Participant

    Thanks bangnoise!
    That was it. I did not set

    CGLSetCurrentContext( syServer.context );

    at the top of my rendering function. Thanks for the help!!!

    And also thanks, Vade, for the CoGeWebKit QC plugin. It behaves much more smoothly than the trick I devised!
    (I’m watching the NSView’s -(BOOL)needsDisplay method to see if anything has changed and whether I should push an image to the Syphon Server, but the results are not nearly as smooth as the CoGeWebKit plugin’s. The double buffering and asynchronicity between the two is a really good guide for improving mine; see the sketch below.)
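
    The polling trick I mentioned is essentially this (a minimal sketch; pollWebView: and webView are illustrative names, driven by a repeating NSTimer):

    // Called by a repeating NSTimer; only capture and publish when the view is dirty
    - (void)pollWebView:(NSTimer *)timer
    {
        if ([self.webView needsDisplay]) {
            [self renderToSyphon];
        }
    }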

    Thanks again!
    Dirk

    #5201
    bangnoise
    Keymaster

    Cool – glad it’s working. If the only GL work you do is this, you should be able to set the context once after you create it, and it will remain set for subsequent renders. Alternatively, look into using CGLMacro.h, which is the fastest possible way to direct commands to a context.
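
    With CGLMacro.h it looks something like this (a minimal sketch; the macros route every gl* call through a local variable that must be named cgl_ctx):

    #import <OpenGL/CGLMacro.h>

    - (void)renderToSyphon {
        CGLContextObj cgl_ctx = syServer.context; // CGLMacro.h expects this exact name
        CGLLockContext(cgl_ctx);

        // ... texture upload and publishFrameTexture: as before; no
        // CGLSetCurrentContext needed, every gl* call targets cgl_ctx directly ...

        CGLUnlockContext(cgl_ctx);
    }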
