├── Makefile
├── README.md
├── glDmaTexture.c
└── include
    └── glhelp.h

--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
CC = gcc
CFLAGS ?= -Iinclude -I/usr/include/libdrm -W -Wall -Wextra -g -O2 -std=c11
LDFLAGS ?=
LIBS := -lGLESv2 -lglfw -lEGL

%.o : %.c
	$(CC) $(CFLAGS) -c -o $@ $<

all: glDmaTexture

glDmaTexture: glDmaTexture.o
	$(CC) $(LDFLAGS) -o $@ $^ $(LIBS)

clean:
	-rm -f *.o
	-rm -f glDmaTexture

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# V4L2 Video to GL Hardware Path example for Raspberry Pi

I'll be putting my Raspberry Pi V4L2 tests here if they are half decent.
The idea is to have examples that are simple to follow and read, not to be well-structured code.
I had trouble finding good examples on the web that did this, so I thought it might help others if I made one.

I am working on a Raspberry Pi 4B with 32-bit Raspbian Buster OS.
Kernel version:
Linux sensitpi 5.10.17-v7l+ #1403 SMP Mon Feb 22 11:33:35 GMT 2021 armv7l GNU/Linux
**The camera I'm using is the RaspiCam v2, aka Sony IMX219.**

The first (and currently only) demo does camera video capture using V4L2 and sets up DMA buffers to create a 'zero copy' hardware pipeline all the way to a GL texture.
I ended up using GLFW as that seemed to need the least amount of code to get a GL window thrown up. There were a few non-obvious tricks to getting this to work, which I describe below.

Actually, first you might as well try to build and run, as it's possible the issues I had have since been fixed.

**0. Try to just build and run.**
Install some dependencies
`sudo apt install libglfw3-dev libgles2-mesa-dev`

Install & build the test program
`git clone https://github.com/Fredrum/rpi_v4l2_tests.git`
`cd rpi_v4l2_tests`
`make`

If all seems good, run
`./glDmaTexture`

If that didn't work you might want to read on...


**1. The camera driver (at the time) didn't allow exporting its capture buffers (VIDIOC_EXPBUF).**
If you run:
`> v4l2-compliance`

near the top you should have a line saying
`Compliance test for device /dev/video0:`

then if you go to the bottom of the test printout you might see a line that looks something like this,
`test VIDIOC_EXPBUF: OK (Not Supported)`

If you get this 'Not Supported' message, then you'll have the same problem as I had: the system's camera driver won't let you export its capture buffers as DMA-BUFs, which this zero-copy path depends on. The solution involves making a small addition to the Raspberry Pi kernel camera driver, and if you haven't done anything like that before (I hadn't) you'll need to read up a little on how to build and install one of the drivers. It's totally doable though! You'll start by downloading the full kernel tree, which might take a while, so you might want to kick that download off soon! :)
The source file you must edit is
`drivers/staging/vc04_services/bcm2835-camera/bcm2835-camera.c`
add a line like this
`.vidioc_expbuf = vb2_ioctl_expbuf,`
in `struct v4l2_ioctl_ops camera0_ioctl_ops`, e.g. just after `.vidioc_dqbuf = vb2_ioctl_dqbuf,`, somewhere around line 1481.
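In context the edit looks roughly like this (just a sketch; the surrounding entries and the exact line number will differ between kernel versions):

```c
static const struct v4l2_ioctl_ops camera0_ioctl_ops = {
	/* ... existing entries ... */
	.vidioc_dqbuf = vb2_ioctl_dqbuf,
	.vidioc_expbuf = vb2_ioctl_expbuf,	/* <-- the added line */
	/* ... existing entries ... */
};
```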
Then build and install it on your system.
A good tip I got was to also make a small edit to the module info found at the bottom of the driver source; after install+reboot you can then look for that edit with
`modinfo bcm2835-v4l2`
You can also run the `v4l2-compliance` command again, and if it's working the line near the bottom should instead say
`test VIDIOC_EXPBUF: OK`

NOTE: If you run a general update on your Raspberry Pi system it will probably overwrite this fix and you'll have to re-do or re-install it.

Here's a forum thread about this: https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=291940

**2. It sort of works but you only get 3-4 fps!**
I didn't realise that for the camera to be able to operate as a streaming video camera I also had to add a special config file.
So if this happens for you too, then try making a file here,
`/etc/modprobe.d/bcm2835-v4l2.conf`
containing the text
`options bcm2835-v4l2 max_video_width=1920`
`options bcm2835-v4l2 max_video_height=1080`

To verify that it now works you can run these commands,
`> v4l2-ctl -v width=1280,height=720,pixelformat=RX24`
`> v4l2-ctl --stream-mmap --stream-count=-1 -d /dev/video0 --stream-to=/dev/null`

My forum thread about this one: https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=298040


**3. Don't use the imx219 dt overlay!**
For this particular setup we don't want to use that driver, i.e. don't have this line in /boot/config.txt:
`dtoverlay=imx219`
I tried it out and seemed to get low latency and a high frame rate, but it was a red herring. That driver is made for capturing raw Bayer sensor data and is meant for the 'libcamera' library.
My forum thread here: https://www.raspberrypi.org/forums/viewtopic.php?f=107&t=293712


**4. Goto 0.**



## Some known issues
Strangely, the RGB pixel color order sometimes seems to switch to BGR and I don't understand why yet. If this happens you can change the pixel format in the eglCreateImageKHR() call from DRM_FORMAT_XRGB8888 to DRM_FORMAT_XBGR8888, or vice versa.
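The line to change is in the attribute list passed to eglCreateImageKHR() in glDmaTexture.c (just the relevant excerpt, not a standalone snippet):

```c
EGL_LINUX_DRM_FOURCC_EXT, DRM_FORMAT_XBGR8888,  /* or DRM_FORMAT_XRGB8888, whichever gives correct colors */
```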
There's tearing in the playback. I haven't bothered trying to fix this yet as there's an as-of-yet unsolved Raspberry Pi OS problem with the tearing. Also I didn't want to add double buffering, as my goal is low latency.

Talking about latency, I think it might be faster, and probably lower on system/bus bandwidth, to set the camera to produce some YUV format. I didn't want to do that either, as I think it would have involved setting up an ISP unit and I wanted to keep it simple to start with. The issue here is that you are limited in which pixel formats are compatible with the RPi's 3D graphics core, V3D.



Please let me know if you can spot bits that are incorrect or that can be improved!

Cheers!

--------------------------------------------------------------------------------
/glDmaTexture.c:
--------------------------------------------------------------------------------
///-----------------------------------------------///
/// Example program for setting up a hardware     ///
/// path for camera/video over V4L2 to an OpenGL  ///
/// texture.                                      ///
///-----------------------------------------------///
//
// sudo apt install libglfw3-dev libgles2-mesa-dev

#define GLFW_INCLUDE_ES2
#include <GLFW/glfw3.h>
#define GLFW_EXPOSE_NATIVE_EGL
#include <GLFW/glfw3native.h>   // declares glfwGetEGLDisplay()

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <drm_fourcc.h>         // from libdrm; the Makefile adds -I/usr/include/libdrm

#include "glhelp.h"

static const GLuint WIDTH = 1280;
static const GLuint HEIGHT = 720;

static const GLchar* vertex_shader_source =
    "#version 300 es\n"
    "in vec3 position;\n"
    "in vec2 tx_coords;\n"
    "out vec2 v_texCoord;\n"
    "void main() { \n"
    "  gl_Position = vec4(position, 1.0);\n"
    "  v_texCoord = tx_coords;\n"
    "}\n";

static const GLchar* fragment_shader_source =
    "#version 300 es\n"
    "#extension GL_OES_EGL_image_external : require\n"
    "precision mediump float;\n"
    "uniform samplerExternalOES texture;\n"
    "in vec2 v_texCoord;\n"
    "out vec4 out_color;\n"
    "void main() { \n"
    "  out_color = texture2D( texture, v_texCoord );\n"
    "}\n";

/// negative x,y is bottom left and first vertex
static const GLfloat vertices[][4][3] =
{
    { {-1.0, -1.0, 0.0}, { 1.0, -1.0, 0.0}, {-1.0, 1.0, 0.0}, {1.0, 1.0, 0.0} }
};
static const GLfloat uv_coords[][4][2] =
{
    { {0.0, 0.0}, {1.0, 0.0}, {0.0, 1.0}, {1.0, 1.0} }
};

GLint common_get_shader_program(const char *vertex_shader_source, const char *fragment_shader_source) {
    enum Consts {INFOLOG_LEN = 512};
    GLchar infoLog[INFOLOG_LEN];
    GLint fragment_shader;
    GLint shader_program;
    GLint success;
    GLint vertex_shader;

    /* Vertex shader */
    vertex_shader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertex_shader, 1, &vertex_shader_source, NULL);
    glCompileShader(vertex_shader);
    glGetShaderiv(vertex_shader, GL_COMPILE_STATUS, &success);
    if (!success) {
        glGetShaderInfoLog(vertex_shader, INFOLOG_LEN, NULL, infoLog);
        printf("ERROR::SHADER::VERTEX::COMPILATION_FAILED\n%s\n", infoLog);
    }

    /* Fragment shader */
    fragment_shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragment_shader, 1, &fragment_shader_source, NULL);
    glCompileShader(fragment_shader);
    glGetShaderiv(fragment_shader, GL_COMPILE_STATUS, &success);
    if (!success) {
        glGetShaderInfoLog(fragment_shader, INFOLOG_LEN, NULL, infoLog);
        printf("ERROR::SHADER::FRAGMENT::COMPILATION_FAILED\n%s\n", infoLog);
    }

    /* Link shaders */
    shader_program = glCreateProgram();
    glAttachShader(shader_program, vertex_shader);
    glAttachShader(shader_program, fragment_shader);
    glLinkProgram(shader_program);
    glGetProgramiv(shader_program, GL_LINK_STATUS, &success);
    if (!success) {
        glGetProgramInfoLog(shader_program, INFOLOG_LEN, NULL, infoLog);
        printf("ERROR::SHADER::PROGRAM::LINKING_FAILED\n%s\n", infoLog);
    }

    glDeleteShader(vertex_shader);
    glDeleteShader(fragment_shader);
    return shader_program;
}

int main(void) {

    GLuint shader_program, vbo;
    GLint pos;
    GLint uvs;
    GLFWwindow* window;

    glfwInit();
    glfwWindowHint(GLFW_CLIENT_API, GLFW_OPENGL_ES_API);
    glfwWindowHint(GLFW_CONTEXT_CREATION_API, GLFW_EGL_CONTEXT_API);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 0);
    glfwWindowHint(GLFW_RESIZABLE, GL_FALSE);
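    /// Note on the hints above: asking GLFW for an OpenGL ES context created
    /// through EGL (rather than GLX) is what makes glfwGetEGLDisplay() below
    /// return a usable EGLDisplay, which we later need for eglCreateImageKHR().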
    window = glfwCreateWindow(WIDTH, HEIGHT, __FILE__, NULL, NULL);
    if(!window) {
        printf("error: glfwCreateWindow failed\n");
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    EGLDisplay egl_display = glfwGetEGLDisplay();
    if(egl_display == EGL_NO_DISPLAY) {
        printf("error: glfwGetEGLDisplay no EGLDisplay returned\n");
    }

    printf("GL_VERSION : %s\n", glGetString(GL_VERSION) );
    printf("GL_RENDERER : %s\n", glGetString(GL_RENDERER) );

    shader_program = common_get_shader_program(vertex_shader_source, fragment_shader_source);
    pos = glGetAttribLocation(shader_program, "position");
    uvs = glGetAttribLocation(shader_program, "tx_coords");

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glViewport(0, 0, WIDTH, HEIGHT);

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices)+sizeof(uv_coords), 0, GL_STATIC_DRAW);
    glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(vertices), vertices);
    glBufferSubData(GL_ARRAY_BUFFER, sizeof(vertices), sizeof(uv_coords), uv_coords);
    glEnableVertexAttribArray(pos);
    glEnableVertexAttribArray(uvs);
    glVertexAttribPointer(pos, 3, GL_FLOAT, GL_FALSE, 0, (GLvoid*)0);
    glVertexAttribPointer(uvs, 2, GL_FLOAT, GL_FALSE, 0, (GLvoid*)sizeof(vertices)); /// last arg is the byte offset into the buffer
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    /// END OpenGL setup -------------------------------------
    /// BEGIN V4L2 setup -------------------------------------

    int fd = -1;
    const char* camera_device = "/dev/video0";

    fd = open(camera_device, O_RDWR);
    if(fd == -1){
        printf("Cannot open device '%s'\n", camera_device);
        return -1;
    }
    printf("Camera device opened: %s with fd: %d\n", camera_device, fd);

    struct v4l2_capability cap;
    memset(&cap, 0, sizeof(cap));
    if( ioctl(fd, VIDIOC_QUERYCAP, &cap) == -1)
    {
        if(EINVAL == errno){
            printf("VIDIOC_QUERYCAP: This isn't a V4L2 device\n");
        } else {
            perror("VIDIOC_QUERYCAP");
        }
        return -1;
    }
    if(!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE))
    {
        printf("This is not a video capture device\n");
        return -1;
    }
    if(!(cap.capabilities & V4L2_CAP_STREAMING))
    {
        printf("No streaming i/o support\n");
        return -1;
    }

    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = WIDTH;
    fmt.fmt.pix.height = HEIGHT;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_BGRX32; /// a YUV format might be faster but would probably need the ISP unit
    fmt.fmt.pix.field = V4L2_FIELD_NONE;
    /// try setting this; the driver may adjust the fields to what it can actually do
    ioctl(fd, VIDIOC_S_FMT, &fmt);
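    /// Optional diagnostic: list the pixel formats the driver offers.
    /// (Handy if the S_FMT request above gets refused; not required for the demo.)
    struct v4l2_fmtdesc fmtdesc;
    memset(&fmtdesc, 0, sizeof(fmtdesc));
    fmtdesc.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    while(ioctl(fd, VIDIOC_ENUM_FMT, &fmtdesc) == 0)
    {
        printf("Driver offers fourcc: %c%c%c%c (%s)\n",
               fmtdesc.pixelformat,
               fmtdesc.pixelformat >> 8,
               fmtdesc.pixelformat >> 16,
               fmtdesc.pixelformat >> 24,
               (char*)fmtdesc.description);
        fmtdesc.index++;
    }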
    /// check what was actually set
    if(fmt.fmt.pix.pixelformat != V4L2_PIX_FMT_BGRX32)
    {
        printf("Libv4l2 didn't accept the suggested pixel format. Can't proceed.\n");
        return -1;
    }
    if((fmt.fmt.pix.width != WIDTH) || (fmt.fmt.pix.height != HEIGHT))
    {
        printf("Warning: driver is sending image at %dx%d\n", fmt.fmt.pix.width, fmt.fmt.pix.height);
    }

    printf("Device accepted fourcc: %c%c%c%c\n", /// RX24 == BGRX32
           fmt.fmt.pix.pixelformat,
           fmt.fmt.pix.pixelformat >> 8,
           fmt.fmt.pix.pixelformat >> 16,
           fmt.fmt.pix.pixelformat >> 24);
    printf("Device accepted resolution: %dx%d\n", fmt.fmt.pix.width, fmt.fmt.pix.height);

    /// VIDIOC_REQBUFS
    int buffer_count = 0;
    struct v4l2_requestbuffers reqbuf;
    memset(&reqbuf, 0, sizeof(reqbuf));
    reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    reqbuf.memory = V4L2_MEMORY_MMAP;
    reqbuf.count = 1;
    int res;
    res = ioctl(fd, VIDIOC_REQBUFS, &reqbuf);
    if(res == -1 && errno == EINVAL)
    {
        reqbuf.count = 1;
        res = ioctl(fd, VIDIOC_REQBUFS, &reqbuf);
    }
    if(res == -1)
    {
        if(errno == EINVAL){
            printf("Video capturing or MMAP streaming is not supported\n");
        } else {
            perror("VIDIOC_REQBUFS");
        }
        return -1;
    }
    buffer_count = reqbuf.count;
    printf("V4L2 Buffer Count: %d\n", buffer_count);

    /// VIDIOC_EXPBUF
    /// You must do this for each of your V4L2 buffers (requested above),
    /// e.g. if you do double buffering.
    int expBuf_fd = -1;
    struct v4l2_exportbuffer expbuf;
    memset(&expbuf, 0, sizeof(expbuf));
    expbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    expbuf.index = 0; /// index 0 as we only requested one buffer
    expbuf.flags = O_RDONLY;
    if(ioctl(fd, VIDIOC_EXPBUF, &expbuf) == -1)
    {
        perror("VIDIOC_EXPBUF"); // I solved this with a custom kernel fix - https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=291940
        return -1;
    }
    expBuf_fd = expbuf.fd;
    printf("Dmabuf fd: %d\n", expBuf_fd);

    /// Kick off the streaming
    if(ioctl(fd, VIDIOC_STREAMON, &(enum v4l2_buf_type){V4L2_BUF_TYPE_VIDEO_CAPTURE}))
    {
        perror("VIDIOC_STREAMON");
        return -1;
    }
    printf("Camera streaming turned ON\n");

    /// END V4L2 setup -----------------------------------------------
    /// BEGIN create DMA texture -------------------------------------

    EGLImageKHR dma_image;
    dma_image = eglCreateImageKHR(
        egl_display,
        EGL_NO_CONTEXT,
        EGL_LINUX_DMA_BUF_EXT,
        NULL,
        (EGLint[])
        {
            EGL_WIDTH, fmt.fmt.pix.width,
            EGL_HEIGHT, fmt.fmt.pix.height,
            EGL_LINUX_DRM_FOURCC_EXT, DRM_FORMAT_XRGB8888, /// takes 16 or 32 bits per pixel (or 8 probably)
            EGL_DMA_BUF_PLANE0_FD_EXT, expBuf_fd,
            EGL_DMA_BUF_PLANE0_OFFSET_EXT, 0,
            EGL_DMA_BUF_PLANE0_PITCH_EXT, fmt.fmt.pix.bytesperline,
            EGL_NONE
        });

    if(dma_image == EGL_NO_IMAGE_KHR)
    {
        printf("error: eglCreateImageKHR failed\n");
        return -1;
    }


    GLuint dma_texture;
    glGenTextures(1, &dma_texture);
    glEnable(GL_TEXTURE_EXTERNAL_OES);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, dma_texture);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, dma_image);
    glGetUniformLocation(shader_program, "texture"); /// location not actually needed; the sampler defaults to texture unit 0

    /// END create DMA texture ---------------------------------------

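    /// What follows is the single-buffer capture cycle: queue the one V4L2
    /// buffer, then each frame dequeue it (VIDIOC_DQBUF blocks until a capture
    /// is done), re-attach the EGL image to the texture and queue the buffer
    /// again. With more than one buffer you would queue each index here
    /// before entering the loop.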
    struct v4l2_buffer buf;
    memset(&buf, 0, sizeof(buf));
    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    buf.index = 0;

    //if(ioctl(fd, VIDIOC_QUERYBUF, &buf) == -1)
    //{
    //    perror("VIDIOC_QUERYBUF");
    //    return -1;
    //}

    /// Kick off the queue-dequeue cycle
    ioctl(fd, VIDIOC_QBUF, &buf);

    /// Main Program Loop
    ///
    while(!glfwWindowShouldClose(window))
    {
        glfwPollEvents();

        ioctl(fd, VIDIOC_DQBUF, &buf);
        glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, dma_image);
        ioctl(fd, VIDIOC_QBUF, &buf);

        glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
        glUseProgram(shader_program);
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
        glfwSwapBuffers(window);
    }

    glDeleteBuffers(1, &vbo);
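    /// Optional teardown sketch: stop streaming and release the dmabuf/EGL/GL
    /// objects before exiting (process exit would also clean these up).
    ioctl(fd, VIDIOC_STREAMOFF, &(enum v4l2_buf_type){V4L2_BUF_TYPE_VIDEO_CAPTURE});
    glDeleteTextures(1, &dma_texture);
    eglDestroyImageKHR(egl_display, dma_image);
    close(expBuf_fd);
    close(fd);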
    glfwTerminate();
    return EXIT_SUCCESS;
}

--------------------------------------------------------------------------------
/include/glhelp.h:
--------------------------------------------------------------------------------
#ifndef GLHELP_H
#define GLHELP_H

#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

/// Some GL/EGL extension functions that don't seem to be exported by the
/// Raspberry Pi GL libraries. I think you could also get these from other
/// places like GLAD or libepoxy.
///
EGLImageKHR eglCreateImageKHR(EGLDisplay dpy, EGLContext ctx, EGLenum target, EGLClientBuffer buffer, const EGLint *attrib_list) __attribute__((weak)); // May not be in the libEGL symbol table, resolve manually :(
EGLImageKHR eglCreateImageKHR(EGLDisplay dpy, EGLContext ctx, EGLenum target, EGLClientBuffer buffer, const EGLint *attrib_list)
{
    static PFNEGLCREATEIMAGEKHRPROC createImageProc = 0;
    if(!createImageProc) {
        createImageProc = (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    }
    return createImageProc(dpy, ctx, target, buffer, attrib_list);
}

EGLBoolean eglDestroyImageKHR(EGLDisplay dpy, EGLImageKHR image) __attribute__((weak)); // May not be in the libEGL symbol table, resolve manually :(
EGLBoolean eglDestroyImageKHR(EGLDisplay dpy, EGLImageKHR image)
{
    static PFNEGLDESTROYIMAGEKHRPROC destroyImageProc = 0;
    if(!destroyImageProc) {
        destroyImageProc = (PFNEGLDESTROYIMAGEKHRPROC)eglGetProcAddress("eglDestroyImageKHR");
    }
    return destroyImageProc(dpy, image);
}

void glDebugMessageCallbackKHR(GLDEBUGPROCKHR callback, const void *userParam) __attribute__((weak)); // May not be in the GL symbol table, resolve manually :(
void glDebugMessageCallbackKHR(GLDEBUGPROCKHR callback, const void *userParam)
{
    static PFNGLDEBUGMESSAGECALLBACKKHRPROC debugMessageCallbackProc = 0;
    if(!debugMessageCallbackProc) {
        debugMessageCallbackProc = (PFNGLDEBUGMESSAGECALLBACKKHRPROC)eglGetProcAddress("glDebugMessageCallbackKHR");
    }
    debugMessageCallbackProc(callback, userParam);
}


void glEGLImageTargetTexture2DOES(GLenum target, GLeglImageOES image) __attribute__((weak)); // May not be in the GL symbol table, resolve manually :(
void glEGLImageTargetTexture2DOES(GLenum target, GLeglImageOES image)
{
    static PFNGLEGLIMAGETARGETTEXTURE2DOESPROC imageTargetTexture2DOES = 0;
    if(!imageTargetTexture2DOES) {
        imageTargetTexture2DOES = (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)eglGetProcAddress("glEGLImageTargetTexture2DOES");
    }
    imageTargetTexture2DOES(target, image);
}
/// END GL Extensions --------------------------------------------------




#endif
--------------------------------------------------------------------------------