Drawing a CIImage to an NSOpenGLView (without hitting main memory)


I feel like I must be missing something here. I've subclassed NSOpenGLView, and I'm attempting to draw a CIImage in the drawRect: call.

    override func drawRect(dirtyRect: NSRect) {
        super.drawRect(dirtyRect)
        let start = CFAbsoluteTimeGetCurrent()

        openGLContext!.makeCurrentContext()
        let cglContext = openGLContext!.CGLContextObj
        let pixelFormat = openGLContext!.pixelFormat.CGLPixelFormatObj
        let colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB)!
        let options: [String: AnyObject] = [kCIContextOutputColorSpace: colorSpace]
        let context = CIContext(CGLContext: cglContext, pixelFormat: pixelFormat, colorSpace: colorSpace, options: options)

        context.drawImage(inputImage, inRect: self.bounds, fromRect: inputImage.extent)

        openGLContext!.flushBuffer()
        let end = CFAbsoluteTimeGetCurrent()
        Swift.print("Updated view in \(end - start)")
    }

I'm apparently under the mistaken impression that an NSOpenGLContext (and its underlying CGLContext) can be wrapped in a CIContext, and that rendering into it will produce an image in the view. While work is clearly being done by the above code, I have no idea where the actual pixels are ending up (because the view ends up blank).

If I just grab the current NSGraphicsContext and render into that, I do get an image in the NSOpenGLView, but the rendering seems to take about 10x as long (i.e. changing the CIContext declaration to this):

    // works, but slow
    let context = NSGraphicsContext.currentContext()!.CIContext!

I also tried this, which is both slow and doesn't display the image, making it a double fail:

    // doesn't work, and it's slow
    inputImage.drawInRect(bounds, fromRect: inputImage.extent, operation: .CompositeDestinationAtop, fraction: 1.0)

The simple solution would be to render out a CGImage and pop that onto the screen (via CGImage -> NSImage and an NSImageView, or a backing CALayer), but that's not performant enough for my case. In my app, I'm looking to render a couple dozen thumbnail-sized images, each with its own different chain of CIFilters, and refresh them in realtime as the underlying base image changes. While they each render in a few milliseconds (with my current CGImage-bracketed pathway), the view updates are still on the order of a few frames per second.

I do have a working path that looks like CGImageRef -> CIImage -> a bunch of CIFilters -> CGImage -> assign to a CALayer for display. As far as I can tell, having CGImages at both ends of the rendering chain is what's killing performance.
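For reference, here's a rough sketch of that pathway. My actual filter chains and layers are more involved; the file path, the single CISepiaTone filter, and thumbnailLayer below are just placeholders:

    import Cocoa
    import QuartzCore
    import CoreImage

    // Hypothetical sketch of the slow, CGImage-bracketed pathway described above.
    // The file path, the CISepiaTone filter, and thumbnailLayer are placeholders.
    let thumbnailLayer = CALayer()
    let baseImage = NSImage(contentsOfFile: "/tmp/base.jpg")!                     // placeholder source
    let baseCGImage = baseImage.CGImageForProposedRect(nil, context: nil, hints: nil)!

    let ciContext = CIContext()                                                   // default CIContext
    let input = CIImage(CGImage: baseCGImage)                                     // CGImage -> CIImage

    let filter = CIFilter(name: "CISepiaTone")!                                   // stand-in for a filter chain
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)
    let output = filter.outputImage!

    // CIImage -> CGImage pulls the filtered result back into main memory...
    let rendered = ciContext.createCGImage(output, fromRect: output.extent)

    // ...and assigning it to the CALayer (presumably) pushes it back to the GPU for display.
    thumbnailLayer.contents = rendered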

After profiling, it appears that most of the time is being spent copying memory around, which I suppose is expected, but it's not very efficient. The backing CGImage needs to be shuttled to the GPU, filtered, then comes back to main memory as a CGImage, and then (presumably) goes right back to the GPU to be scaled and displayed by the CALayer. Ideally, the root images (before filtering) would just sit on the GPU, and the results would be rendered directly to video memory, but I have no idea how to accomplish this. My current rendering pathway does the pixel smashing on the GPU (that's fantastic!), but it's swamped by shuttling memory around just to display the darned thing.

So, can anyone enlighten me on how to set up a Core Image filtering pathway that keeps things on the GPU end-to-end? (Or at least only has to swap the data in once?) If I have, say, an IOSurface-backed CIImage, how do I draw that directly to the UI without hitting main memory? Any hints?

After a day of banging my head against the wall on this, still no dice with NSOpenGLView... But, I think I'm able to do what I want via CAOpenGLLayer instead, which is fine with me:

    class GLLayer: CAOpenGLLayer {
        var image: CIImage?

        override func drawInCGLContext(ctx: CGLContextObj, pixelFormat pf: CGLPixelFormatObj, forLayerTime t: CFTimeInterval, displayTime ts: UnsafePointer<CVTimeStamp>) {
            if image == nil {
                let url = NSURL(fileURLWithPath: "/Users/doug/Desktop/test.jpg")
                image = CIImage(contentsOfURL: url)!
            }

            let colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB)!
            let options: [String: AnyObject] = [kCIContextOutputColorSpace: colorSpace]
            let context = CIContext(CGLContext: ctx, pixelFormat: pf, colorSpace: colorSpace, options: options)
            let targetRect = CGRectMake(-1, -1, 2, 2)
            context.drawImage(image!, inRect: targetRect, fromRect: image!.extent)
        }
    }
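For what it's worth, here's roughly how the layer gets hosted; this is just a hedged sketch, and hostView / newlyFilteredCIImage are placeholders for whatever view and filtered image you actually have:

    // Hypothetical hosting sketch: hostView and newlyFilteredCIImage are placeholders.
    let glLayer = GLLayer()
    glLayer.frame = hostView.bounds
    glLayer.asynchronous = false          // only redraw when asked, not every frame
    hostView.layer = glLayer              // set the layer first, then wantsLayer,
    hostView.wantsLayer = true            // so the view becomes layer-hosting

    // Later, when the underlying base image (or filter chain) changes:
    glLayer.image = newlyFilteredCIImage
    glLayer.setNeedsDisplay()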

One thing of note - the coordinate system for the CAOpenGLLayer is different. The drawing surface is 2.0 x 2.0 units, with the origin at 0,0 (i.e. it spans -1 to 1 on each axis; that took a while to figure out). Other than that, it's basically the same non-working code from the original question (except, you know, it works here). Perhaps the NSOpenGLView class isn't returning the proper context. Who knows. I'm still interested in the why, but at least I can move on now.
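To make that 2.0 x 2.0 coordinate space concrete, here's a small helper (my own addition, not part of the code above) that maps a rect given in the layer's bounds into the unit space drawInCGLContext expects, assuming the layer spans -1 to 1 on each axis:

    // Hypothetical helper: convert a rect in the layer's bounds (points) into the
    // 2.0 x 2.0 unit drawing space used inside drawInCGLContext.
    func targetRect(forBoundsRect rect: CGRect, inBounds bounds: CGRect) -> CGRect {
        let x = (rect.origin.x / bounds.width) * 2.0 - 1.0
        let y = (rect.origin.y / bounds.height) * 2.0 - 1.0
        let w = (rect.size.width / bounds.width) * 2.0
        let h = (rect.size.height / bounds.height) * 2.0
        return CGRectMake(x, y, w, h)
    }

    // Drawing into the full layer is targetRect(forBoundsRect: bounds, inBounds: bounds),
    // which yields CGRectMake(-1, -1, 2, 2) as in the code above.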
