Saturday, February 6, 2016

TECHNOLOGY | Updating Apple Developer Connection OpenGL ES sample projects from 2.0 to 3.0

This post is one possible way for me to connect with the Apple Developer Community: by filling a huge gap in its collective understanding of how to update and otherwise develop iPhone apps that use the latest GPU API, OpenGL ES 3.0. Most OpenGL/iOS developers customize the OpenGL ES sample projects from the Apple Developer Connection website, which use OpenGL ES 2.0, a far inferior version. The web is littered with posts on developer message boards in which iOS developers ask for help with OpenGL ES 3.0. When posting a question on, say, stackoverflow.com, nearly all of them confess to using the sample code from Apple; the ones who don't give themselves away by leaving Apple's comments intact whenever they show "their" code.

It's no small feat to do this; here are two reasons why:
  1. Answers to common questions are rare. I literally stumbled onto the answer to the problem described below (the one with the -6683 error constant) by clicking the wrong link in Xcode's documentation viewer; and I didn't even recognize it as the answer until I implemented it in my project on a lark, and then mistyped a key combination, causing the app to run instead of just build. (I hadn't intended to test it until I had tried one other thing I thought would fix the problem, which, ironically, didn't.)
  2. Solutions to problems are not always logical. You'll never guess your way to a solution for an OpenGL problem. Never (not even if demons weren't screaming in your ears the whole time, either). For example, the solution to the above problem was a matter of specifying different color formats for a YUV texture for 3.0 compatibility than the ones specified in 2.0: in the calls to CVOpenGLESTextureCacheCreateTextureFromImage, the formats went from GL_RED_EXT and GL_RG_EXT to GL_LUMINANCE (for Y) and GL_LUMINANCE_ALPHA (for UV), respectively (a sketch of the fix follows below). Luminance makes sense for the Y channel, much more than red ever did; but alpha (transparency) for the UV channels baffles me. What's even more confusing is that, in the shaders, Y is still read from the red channel and UV from the red and green (but not blue) channels. So it makes no sense, and it's now different in two places. And that's just one example. It's like that all day (and with demons screaming in my ears, to boot).
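For concreteness, here's a minimal sketch of that change, assuming a bi-planar YUV pixel buffer (pixelBuffer), an existing texture cache (_videoTextureCache), and output texture variables of my own naming; it shows the shape of the fix, not Apple's exact code:

    // Sketch of the 2.0 -> 3.0 texture-format fix; variable names are illustrative.
    #import <CoreVideo/CoreVideo.h>
    #import <OpenGLES/ES3/gl.h>

    size_t width  = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);

    // Y plane (plane 0): GL_RED_EXT under 2.0 becomes GL_LUMINANCE under 3.0
    CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D,
        GL_LUMINANCE,                        // was GL_RED_EXT
        (GLsizei)width, (GLsizei)height,
        GL_LUMINANCE,                        // was GL_RED_EXT
        GL_UNSIGNED_BYTE,
        0,                                   // plane index 0 = Y
        &_lumaTexture);

    // UV plane (plane 1): GL_RG_EXT under 2.0 becomes GL_LUMINANCE_ALPHA under 3.0
    if (err == kCVReturnSuccess) {
        err = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL,
            GL_TEXTURE_2D,
            GL_LUMINANCE_ALPHA,              // was GL_RG_EXT
            (GLsizei)(width / 2), (GLsizei)(height / 2),
            GL_LUMINANCE_ALPHA,              // was GL_RG_EXT
            GL_UNSIGNED_BYTE,
            1,                               // plane index 1 = UV
            &_chromaTexture);
    }

    if (err != kCVReturnSuccess) {
        // -6683 is kCVReturnPixelBufferNotOpenGLCompatible
        NSLog(@"CVOpenGLESTextureCacheCreateTextureFromImage failed: %d", err);
    }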
2.0 fragment shader: Some changes made in 3.0 do make sense...
2.0 vertex shader: ...but the subtle few can significantly impact ease of deployment...
Same vertex shader (above, right), revised for 3.0: ...Developers updating from 2.0 must comb their shaders line by line to ensure compatibility...
Same fragment shader (above, left), revised for 3.0: ...while divining the intended meaning of the plethora of vague error messages that can result from any error, great or small
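To make that line-by-line combing concrete, here's a generic sketch of the mechanical GLSL changes between 2.0 and 3.0 (the version directive, the attribute/varying keywords, gl_FragColor, and texture2D); the shader logic is illustrative, not copied from Apple's projects:

    // Generic GLSL 2.0 -> 3.0 substitutions, embedded as source strings.
    static const char *kVertexShader300 =
        "#version 300 es\n"                   // new: 3.0 shaders must declare a version
        "in vec4 position;\n"                 // 2.0: attribute vec4 position;
        "in vec2 texCoord;\n"                 // 2.0: attribute vec2 texCoord;
        "out vec2 texCoordVarying;\n"         // 2.0: varying vec2 texCoordVarying;
        "void main()\n"
        "{\n"
        "    gl_Position = position;\n"
        "    texCoordVarying = texCoord;\n"
        "}\n";

    static const char *kFragmentShader300 =
        "#version 300 es\n"
        "precision mediump float;\n"
        "in vec2 texCoordVarying;\n"          // 2.0: varying vec2 texCoordVarying;
        "uniform sampler2D SamplerY;\n"
        "uniform sampler2D SamplerUV;\n"
        "out vec4 fragColor;\n"               // 2.0: wrote to the built-in gl_FragColor
        "void main()\n"
        "{\n"
        "    float y = texture(SamplerY, texCoordVarying).r;\n"          // 2.0: texture2D()
        "    vec2 uv = texture(SamplerUV, texCoordVarying).rg - 0.5;\n"  // red/green, as noted above
        "    vec3 rgb = mat3(1.0,    1.0,    1.0,\n"                     // YUV -> RGB (BT.601)
        "                    0.0,   -0.344,  1.772,\n"
        "                    1.402, -0.714,  0.0) * vec3(y, uv);\n"
        "    fragColor = vec4(rgb, 1.0);\n"
        "}\n";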
Anyway, by posting this coveted solution for bedraggled developers, I'm hoping to find reciprocal help in other areas of iOS app development as I continue to write Chroma, should I need it. Software development, at least the kind that "makes a difference in [one's] day," is not a one-man job, and anyone thinking the contrary is unrealistic and naive. This revised code, then, is a general overture of sorts to the world of iOS developers, whose assistance I may desperately need in the future (in order to prevent more blinding of the eyes by demons, one problem Chroma is purposed to begin solving [search for blindness on this blog]).

Today, I revised three sample code projects published by Apple Developer Connection, namely, VideoSnake, Real-timeVideoProcessingUsingAVPlayerItemVideoOutput, and GLCameraRipple, all of which, in their original form, use OpenGL ES 2.0 for their GLSL shaders and for the API calls made through EAGLContext and CAEAGLLayer-backed views. These customized versions now use OpenGL ES 3.0.

VideoSnakeOpenGLES3.zip

Real-timeVideoProcessingUsingAVPlayerItemVideoOutput_GLKit.zip
GLCameraRipple.zip

Modified to use OpenGL ES 3.0 (vs. 2.0), this last sample is ideal for beginner iOS developers who want to use the latest OpenGL implementation in their apps, in that it meets only the minimum requirements for doing so, making it the simplest to follow. It is also the fastest OpenGL implementation of the three, achieving over 60 FPS on an iPhone 6 Plus consistently, even under constant, heavy use.
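That minimum requirement boils down to a single call: asking EAGLContext for the 3.0 API instead of 2.0. A minimal sketch, assuming the common fallback pattern for pre-A7 devices (the fallback is my addition, not necessarily how the samples handle it):

    // Request an ES 3.0 context; fall back to 2.0 on GPUs that predate the A7.
    #import <OpenGLES/EAGL.h>

    EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
    if (!context) {
        NSLog(@"OpenGL ES 3.0 unavailable; falling back to 2.0");
        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    }
    [EAGLContext setCurrentContext:context];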
Device color space settings changed dramatically in OpenGL ES 3.0 (top-left), and it's not always evident how to restore them when converting from 2.0; in the new sample, the GL_RED_EXT and GL_RG_EXT values in the Y and UV texture-creation commands were replaced with GL_LUMINANCE and GL_LUMINANCE_ALPHA, respectively. Developers unfamiliar with iOS-specific OpenGL implementations struggle to solve the unintended "false color" output of their upgraded code

The speed increase is appreciable, especially in apps that use both OpenGL and Core Image. For example, my app, which processes video in real-time using OpenGL, displays a histogram using Core Image, also in real-time:
Rendering video on the GPU...
...and user-interface animation on the CPU (rotate, scale, and histogram)...
...enables smooth playback and animation
Any developer who has ever mixed the two is familiar with the resulting performance impact. Not so with 3.0: I was able to watch videos at full speed (30 fps) and display a histogram for each of those frames without dropping frames or slowing playback:
An OpenGL ES Profile in Instruments, displaying performance statistics for the OpenGL ES 3.0 version of Real-timeVideoProcessingUsingAVPlayerItemVideoOutput with an added CIHistogramDisplayFilter, updating with each frame
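For anyone curious how a per-frame histogram like that is wired up, here's a hedged sketch using Core Image's built-in filters; the filter names are real, but the setup is mine, not the sample's, and pixelBuffer stands in for the current video frame:

    // Per-frame histogram: CIAreaHistogram computes the data, and
    // CIHistogramDisplayFilter renders it as a drawable image.
    #import <CoreImage/CoreImage.h>

    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    CIFilter *histogram = [CIFilter filterWithName:@"CIAreaHistogram"];
    [histogram setValue:frame forKey:kCIInputImageKey];
    [histogram setValue:[CIVector vectorWithCGRect:frame.extent] forKey:kCIInputExtentKey];
    [histogram setValue:@256 forKey:@"inputCount"];
    [histogram setValue:@1.0 forKey:kCIInputScaleKey];

    CIFilter *display = [CIFilter filterWithName:@"CIHistogramDisplayFilter"];
    [display setValue:histogram.outputImage forKey:kCIInputImageKey];

    CIImage *histogramImage = display.outputImage; // render with a CIContext each frame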
The code posted here will be especially useful to anyone receiving the following error message (the most common of all):
Failed to create IOSurface image (texture)
Error at CVOpenGLESTextureCacheCreateTextureFromImage -6683
In addition to helping developers through sticking points like that one, the noticeable performance improvement should also encourage anyone developing software for the GPU on the iPhone 5s, iPhone 6, iPhone 6 Plus, iPhone 6s, and iPhone 6s Plus to revise their existing OpenGL ES 2.0 code for far faster performance. Such developers will be impressed by the benefits of migrating from 2.0 to 3.0, as will their customers; I'm not talking about geek-rated enhancements. Everyone will notice.
NOTE | To cover all bases, each sample project uses GLKit differently: one employs GLKit view-related delegates; the other, subclasses of view-related GLKit objects. In VideoSnake GLK, a UIViewController and UIView serve as delegates for GLKViewController and GLKView, respectively; in Real-timeVideoProcessingUsingAVPlayerItemVideoOutput GLK, GLKViewController and GLKView are subclassed in lieu of UIViewController and UIView.
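In skeleton form, the two approaches look like this (class names are mine, for illustration; neither is copied from the samples):

    // The two GLKit integration styles from the note above.
    #import <GLKit/GLKit.h>
    #import <OpenGLES/ES3/gl.h>

    // Style 1 (VideoSnake GLK): a plain UIViewController adopts GLKViewDelegate
    // and serves as the GLKView's delegate.
    @interface DelegateStyleViewController : UIViewController <GLKViewDelegate>
    @end

    @implementation DelegateStyleViewController
    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
    {
        glClear(GL_COLOR_BUFFER_BIT); // per-frame GL drawing goes here
    }
    @end

    // Style 2 (Real-timeVideoProcessing... GLK): GLKViewController is subclassed;
    // it owns the render loop and is its GLKView's delegate by default.
    @interface SubclassStyleViewController : GLKViewController
    @end

    @implementation SubclassStyleViewController
    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
    {
        glClear(GL_COLOR_BUFFER_BIT); // same drawing hook, inherited run loop
    }
    @end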
Why the effort?
If you're new to this blog, you may be wondering why I would make such an extensive effort to develop an app like this one to the highest standard. See that thing that looks like a Cheerio on my left eye?
I never get Cheerios in my left eye (or the right one, for that matter), so I'm going to assume that's the thing demons are using to damage it (cloaked matter damages human tissue)
That's how demons blind (and torment) people. They put these things on your eyes in order to damage them slowly. You can't see them because they're usually placed on your eye for no more than 1/30th of a second at a time, and they're cloaked:

Not all eye-damaging cloaked entities look the same; but, really, if you've seen one, you've seen them all
Cloaked is a descriptor for the state of molecules that makes them (mostly) permeable, invisible, and timeless. Because there is no perfect cloak, some light does reflect from a cloaked molecule. As a result, you can see cloaked objects (and people and demons and anything else cloaked) using a CCD digital imaging sensor, which senses a wider spectrum of light than the human eye. It renders the portion of the spectrum you cannot see by substituting human-visible colors for the invisible ones in the image it produces. By shooting video with it, you can capture the use of cloaked objects, in that taking at least 30 images per second compensates for the faster-than-time movement of the cloaked entity placing them on (and removing them from), say, your eyes.
