├── .gitignore ├── README.md ├── build.gradle ├── cgedemo ├── .gitignore ├── build.gradle ├── proguard-rules.pro ├── project.properties └── src │ └── main │ ├── AndroidManifest.xml │ ├── assets │ └── hehe.jpg │ ├── java │ └── org │ │ └── wysaid │ │ ├── cgeDemo │ │ ├── CameraDemoActivity.java │ │ ├── FaceDemoActivity.java │ │ ├── ImageDemoActivity.java │ │ ├── MainActivity.java │ │ ├── SimplePlayerDemoActivity.java │ │ └── VideoPlayerDemoActivity.java │ │ └── demoViews │ │ └── FaceDemoView.java │ └── res │ ├── drawable-nodpi │ ├── adjust_face_chin.png │ ├── adjust_face_eye.png │ ├── adjust_face_mouth.png │ ├── bgview.png │ ├── face0.jpg │ ├── face1.jpg │ └── mask1.png │ ├── layout │ ├── activity_camera_demo.xml │ ├── activity_face_demo.xml │ ├── activity_filter_demo.xml │ ├── activity_main.xml │ ├── activity_simple_player_demo.xml │ └── activity_video_player_demo.xml │ ├── menu │ ├── menu_camera_demo.xml │ ├── menu_filter_demo.xml │ ├── menu_main.xml │ └── menu_video_player.xml │ ├── mipmap-nodpi │ └── ic_launcher.png │ ├── raw │ └── test.mp4 │ ├── values-w820dp │ └── dimens.xml │ └── values │ ├── dimens.xml │ ├── strings.xml │ └── styles.xml ├── demoRelease └── cgeDemo.apk ├── gradle.properties ├── gradlew ├── gradlew.bat ├── library ├── .gitignore ├── build.gradle ├── proguard-rules.pro ├── project.properties ├── so │ └── libgpuimage-library.so └── src │ └── main │ ├── AndroidManifest.xml │ ├── genJNIHeaders │ ├── java │ └── org │ │ └── wysaid │ │ ├── algorithm │ │ ├── Matrix2x2.java │ │ ├── Matrix3x3.java │ │ ├── Matrix4x4.java │ │ ├── Vector2.java │ │ ├── Vector3.java │ │ └── Vector4.java │ │ ├── camera │ │ └── CameraInstance.java │ │ ├── common │ │ ├── Common.java │ │ ├── FrameBufferObject.java │ │ ├── ProgramObject.java │ │ ├── SharedContext.java │ │ └── TextureDrawer.java │ │ ├── myUtils │ │ ├── FileUtil.java │ │ └── ImageUtil.java │ │ ├── nativePort │ │ ├── CGEFaceFunctions.java │ │ ├── CGEFrameRecorder.java │ │ ├── CGEFrameRenderer.java │ │ ├── 
CGEImageHandler.java │ │ ├── CGENativeLibrary.java │ │ ├── FFmpegNativeLibrary.java │ │ └── NativeLibraryLoader.java │ │ ├── texUtils │ │ ├── TextureRenderer.java │ │ ├── TextureRendererBlur.java │ │ ├── TextureRendererDrawOrigin.java │ │ ├── TextureRendererEdge.java │ │ ├── TextureRendererEmboss.java │ │ ├── TextureRendererLerpBlur.java │ │ ├── TextureRendererMask.java │ │ └── TextureRendererWave.java │ │ └── view │ │ ├── CameraGLSurfaceView.java │ │ ├── CameraRecordGLSurfaceView.java │ │ ├── ImageGLSurfaceView.java │ │ ├── SimplePlayerGLSurfaceView.java │ │ └── VideoPlayerGLSurfaceView.java │ ├── libs │ ├── arm64-v8a │ │ └── libCGE.so │ ├── x86 │ │ └── libCGE.so │ └── x86_64 │ │ └── libCGE.so │ └── res │ └── values │ └── strings.xml ├── screenshots ├── 0.jpg ├── 1.jpg ├── 2.jpg └── alipay.jpg └── settings.gradle /.gitignore: -------------------------------------------------------------------------------- 1 | .gradle 2 | /local.properties 3 | .idea 4 | *.iml 5 | .DS_Store 6 | /build 7 | /captures 8 | target/ 9 | tmp 10 | objs/ 11 | armeabi*/ 12 | obj/ 13 | jni/ -------------------------------------------------------------------------------- /README.md: --------------------------------------------------------------------------------

# Android-GPUImage-plus

GPU accelerated filters for Android based on OpenGL.

## Introduction ##

1. This repo is an Android Studio project comprising two sub-modules: "cgeDemo" and "library". All the Java code is provided, along with "libCGE.so", the CGE core module written in C++ & OpenGL with the NDK. Hundreds of built-in filters are available in the demo. 😋 If you'd like to add your own filter, please refer to the "Effect String Definition Rule" documentation below.

2. The demo and library are updated as needed. Questions and PRs are welcome.

3.
This project is for study purposes only; no free tech support is provided for now.

4. iOS version: [https://github.com/wysaid/ios-gpuimage-plus](https://github.com/wysaid/ios-gpuimage-plus "http://wysaid.org")

5. Extra features such as real-time video recording with GPU filters can be provided to donors. A precompiled apk demonstrating this feature is available here: [https://github.com/wysaid/android-gpuimage-plus/tree/master/demoRelease](https://github.com/wysaid/android-gpuimage-plus/tree/master/demoRelease "http://wysaid.org")

## Screen Shots ##

![Android-GPUImage-plus](https://raw.githubusercontent.com/wysaid/android-gpuimage-plus/master/screenshots/0.jpg "Android-GPUImage-plus")
Single Image Filter

![Android-GPUImage-plus](https://raw.githubusercontent.com/wysaid/android-gpuimage-plus/master/screenshots/1.jpg "Android-GPUImage-plus")
Camera Filter (with photo & video capture)

![Android-GPUImage-plus](https://raw.githubusercontent.com/wysaid/android-gpuimage-plus/master/screenshots/2.jpg "Android-GPUImage-plus")
Live Video Filter

## Donate ##

Donations are welcome. As thanks, donors' issues may receive extra attention.

Alipay:

![Alipay](https://raw.githubusercontent.com/wysaid/android-gpuimage-plus/master/screenshots/alipay.jpg "alipay")


## Documentation ##

This lib is simple to use, and filters can be fully customized through plain-text configuration. (No editor is provided for now.)

The text parser is designed so that people who know nothing about GLSL, or even about programming, can still quickly add new effects.

EffectString parsing rules and common usage:

1. Each processing step starts with '@', followed by the method name; the space in between is optional.
   E.g. "@ method" and "@method" are both valid. The method name may be followed by any number of arguments, separated by spaces. Everything after the method name (up to the next '@' or the end of the string) is passed as arguments to that method's Parser.

2. curve method: the format is "@curve <arg1> <arg2> ... <argN>"
   Each <argN> takes one of two forms: "RGB (x1,y1) ... (xn,yn)", where each xn and yn is a number in 0~255,
   or "R (rx1,ry1) ... (rxn,ryn) G (gx1,gy1) ... (gxn,gyn) B (bx1,by1) ... (bxn,byn)",
   where R, G, B denote the corresponding channels and the points following each letter apply to that channel.
   Spaces between the parentheses and the arguments are optional. Inside the parentheses, x and y may be separated by any delimiter, such as a space or a comma.

   Examples:
   Curve A: RGB channel points (0, 0) (100, 200) (255, 255)
   Format: "@curve RGB(0,0) (100, 200) (255, 255)"

   Curve B: R channel (0, 0) (50, 25) (255, 255),
            G channel (0, 0) (100, 150) (255, 255),
            RGB channels (0, 0) (200, 150) (255, 255)
   Format: "@curve R(0,0) (50, 25) (255, 255) G(0, 0) (100,150) (255, 255) RGB(0, 0) (200, 150) (255, 255)". PS (WangYang): To simplify our work, I wrote a curve-format generator tool; see the tool directory.

   Note: when several curve adjustments are applied in a row, "@curve" only needs to be written once. Taking the example above: if there is no other operation between curves A and B, they can be merged into a single curve instruction, which saves overhead and speeds up execution.
   E.g. "@curve RGB(0,0) (100, 200) (255, 255) R(0,0) (50, 25) (255, 255) G(0, 0) (100,150) (255, 255) RGB(0, 0) (200, 150) (255, 255)" (equivalent to applying curve A and then curve B as above)

3.
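To make the curve rule concrete, here is a minimal plain-Java sketch that assembles a curve rule string in the format described above. The `CurveRuleBuilder` class and its `channel` helper are made up for this illustration and are not part of the library:

```java
import java.util.Locale;

// Hypothetical helper: builds "@curve" rule fragments from channel/point data.
// Only the output format follows the EffectString rules; the class itself
// is not part of android-gpuimage-plus.
public class CurveRuleBuilder {
    // points are {x, y} pairs with 0 <= x, y <= 255
    public static String channel(String name, int[][] points) {
        StringBuilder sb = new StringBuilder(name);
        for (int[] p : points) {
            sb.append(String.format(Locale.US, "(%d, %d) ", p[0], p[1]));
        }
        return sb.toString().trim();
    }

    public static void main(String[] args) {
        // Curves A and B from the examples above, merged into one instruction:
        String rule = "@curve "
                + channel("RGB", new int[][]{{0, 0}, {100, 200}, {255, 255}}) + " "
                + channel("R", new int[][]{{0, 0}, {50, 25}, {255, 255}}) + " "
                + channel("G", new int[][]{{0, 0}, {100, 150}, {255, 255}});
        System.out.println(rule);
    }
}
```

Building the string programmatically like this avoids hand-counting parentheses when many points are involved.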
blend method: the format is "@blend <function> <texture> <intensity>"
   Current <function> values:

   normal: mix
   dissolve: dissolve[dsv]

   darken: darken[dk]
   multiply: multiply[mp]
   color burn: colorburn[cb]
   linear burn: linearburn[lb]
   darker color: darkercolor[dc]

   lighten: lighten[lt]
   screen: screen[sr]
   color dodge: colordodge[cd]
   linear dodge: lineardodge[ld]
   lighter color: lightercolor[lc]

   overlay: overlay[ol]
   soft light: softlight[sl]
   hard light: hardlight[hl]
   vivid light: vividlight[vvl]
   linear light: linearlight[ll]
   pin light: pinlight[pl]
   hard mix: hardmix[hm]

   difference: difference[dif]
   exclusion: exclude[ec]
   subtract: subtract[sub]
   divide: divide[div]

   hue: hue
   saturation: saturation[sat]
   color: color[cl]
   luminosity: luminosity[lum]

   add: add
   add reverse: addrev
   black & white: colorbw //replaces the src color with black & white based solely on the texture's alpha channel.

   Note: the value in [] is an abbreviation that can be used in place of the longer name.

   <texture> is the name of the resource file to use, including the file extension!

   Note: enhanced usage (added 2016-1-5): <texture> may instead be a texture id, width and height wrapped in square brackets and separated by commas, with no spaces in between. E.g. [10,1024,768] means texture id 10, 1024 pixels wide and 768 pixels high. Since square brackets cannot appear in file names, this form does not conflict with ordinary file names. Note: this usage is not meant for editors and the like; it is an enhanced feature for programmers!
   Special warning: in the enhanced usage, a wrongly entered texture id that conflicts with another texture id can easily break the whole app!

   <intensity> is the blending strength (opacity), an integer in (0, 100].

   Example: blend with the resource image src.jpg at 80% strength
   Format: "@blend overlay src.jpg 80" or "@blend ol src.jpg 80"

4. krblend method: the @krblend format is the same as @blend; see the @blend method.
   All krblend parameters and usage are identical to blend. The difference is that when processing the texture, krblend keeps the texture's aspect ratio and scales it uniformly so that it ends up covering the whole image.

5. pixblend method: the format is "@pixblend <function> <color> <intensity>"
   <function> is the same as in the blend method; see blend.
   <intensity> has the same meaning as in the blend method; see blend.
   <color> consists of four floats representing the r, g, b, a of the color, each in [0, 1] or [0, 255].
   Example: blend with pure red at 90% strength
   Format: "@pixblend overlay 1 0 0 0 90" --> Note: the color values may be written as decimals. If the alpha value is greater than 1, all color components are interpreted in [0, 255]; otherwise they are interpreted in [0, 1].

6.
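As a concrete illustration of the pixblend color rule, the following plain-Java sketch mimics the documented range interpretation (alpha greater than 1 switches all four components to the [0, 255] range). The `PixblendColor` class is hypothetical and not part of the library:

```java
// Sketch of the pixblend <color> interpretation rule described above:
// if alpha > 1, all four components are treated as [0, 255] and scaled down;
// otherwise they are already in [0, 1]. Not code from android-gpuimage-plus.
public class PixblendColor {
    public static float[] normalize(float r, float g, float b, float a) {
        float scale = (a > 1.0f) ? 255.0f : 1.0f;
        return new float[]{r / scale, g / scale, b / scale, a / scale};
    }

    public static void main(String[] args) {
        // "@pixblend overlay 255 0 0 255 90" -> alpha > 1, so values are read as [0, 255]
        float[] c = normalize(255, 0, 0, 255);
        System.out.printf("%.2f %.2f %.2f %.2f%n", c[0], c[1], c[2], c[3]);
    }
}
```

The same four numbers written as "1 0 0 1" would be left untouched, since alpha does not exceed 1.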
selfblend method: the format is "@selfblend <function> <intensity>"
   Note: the parameters are the same as in the blend method, except that there is no <texture> parameter; the image being processed is blended with its own colors.

7. adjust method: the format is "@adjust <function> <arg1>...<argN>"
   Current <function> values:
   brightness: one parameter, intensity, range [-1, 1]

   contrast: one parameter, intensity, with intensity > 0; intensity = 0 yields a gray image, intensity = 1 has no effect, and intensity > 1 increases contrast.

   saturation: one parameter, intensity, with intensity > 0; intensity = 0 yields a black & white image, intensity = 1 has no effect, and intensity > 1 increases saturation.

   monochrome: six parameters, range [-2, 3], consistent with Photoshop. Parameter order: red, green, blue, cyan, magenta, yellow

   sharpen: one parameter, intensity, range [0, 10]; 0 has no effect
   blur: one parameter, intensity, range [0, 1]; 0 has no effect

   whitebalance: two parameters, temperature (range [-1, 1], 0 = no effect) and tint (range [0, 5], 1 = no effect)

   shadowhighlight[shl] (shadow & highlight): two parameters, shadow (range [-200, 100], 0 = no effect) and highlight (range [-100, 200], 0 = no effect)

   hsv: six parameters, red, green, blue, magenta, yellow, cyan, each in [-1, 1]
   hsl: three parameters, hue, saturation, luminance, each in [-1, 1]

   level: three parameters, dark, light, gamma, each in [0, 1], with dark < light

   exposure: one parameter, intensity, range [-10, 10]

   colorbalance: three parameters, redShift [-1, 1], greenShift [-1, 1], blueShift [-1, 1]. (added 2015-3-30)

   Note: the value in [] is an abbreviation that can be used in place of the longer name.
   <arg*> are the parameters required by the method; see the relevant class for exact ranges. The number of <arg*>s depends on the specific <function>.

8. cvlomo method: includes the curve sub-method. Format: "@cvlomo <vignetteStart> <vignetteEnd> <colorScaleLow> <colorScaleRange> <saturation> <curve>"
   <vignetteStart> and <vignetteEnd> are decimals greater than 0, usually less than 1.
   <colorScaleLow> and <colorScaleRange> are decimals greater than 0, usually less than 1, used to adjust the image.
   <saturation> is a decimal between 0 and 1 that sets the image saturation.
   <curve> is a complete curve method, but without the @curve tag.
   Example: "@cvlomo 0.2 0.8 0.1 0.2 1 RGB(0, 0) (255, 255)"

9.
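Since every step starts with '@', several rules can be chained into one effect string by simple concatenation. A minimal plain-Java sketch follows; the `EffectChain` class is made up for illustration, and the `CGENativeLibrary.filterImage_MultipleEffects` call mentioned in the comment is, to the best of my reading of the demo sources, the typical consumer of such strings in this project:

```java
import java.util.Arrays;
import java.util.List;

// Chains several effect-string steps into one configuration string.
// Each step begins with '@' as described in rule 1 above.
public class EffectChain {
    public static String chain(List<String> steps) {
        return String.join(" ", steps);
    }

    public static void main(String[] args) {
        List<String> steps = Arrays.asList(
                "@adjust brightness 0.2",
                "@adjust contrast 1.2",
                "@blend overlay src.jpg 80");
        // The resulting string would typically be passed to something like
        // CGENativeLibrary.filterImage_MultipleEffects(bitmap, config, 1.0f);
        // (check the demo sources for the exact call site).
        System.out.println(chain(steps));
    }
}
```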
colorscale method: the format is "@colorscale <low> <range> <saturation>"
    Note: colorscale requires CPU-side computation and is relatively slow.

10. vignette method: the format is "@vignette <low> <range> <centerX> <centerY>"
    Note: low and range are required; centerX and centerY are optional and default to 0.5 when omitted. centerX and centerY only take effect when both are present.
    Examples: "@vignette 0.1 0.9", "@vignette 0.1 0.9 0.5 0.5"

11. colormul method: the format is "@colormul <function> <arg1> ... <argN>"
    Current <function> values: "flt", "vec" and "mat".
    With flt, one parameter <arg> follows, and every pixel is multiplied by it.
    With vec, three parameters <arg1> <arg2> <arg3> follow, and the r, g, b components of every pixel are multiplied component-wise.
    With mat, nine parameters <arg1>...<argN> follow, and the rgb components of every pixel are multiplied by the resulting matrix.

12. special method: the format is "@special <N>"
    <N> is the number of the effect. This method covers effects that are too specific to be expressed generically; each one is handled by a dedicated, purpose-written processor.

13. lomo method: format "@lomo <vignetteStart> <vignetteEnd> <colorScaleLow> <colorScaleRange> <saturation> <isLinear>"
    <vignetteStart> and <vignetteEnd> are decimals greater than 0, usually less than 1.
    <colorScaleLow> and <colorScaleRange> are decimals greater than 0, usually less than 1, used to adjust the image.
    <saturation> is a decimal between 0 and 1 that sets the image saturation.
    <isLinear> is 0 or 1 and sets whether the vignette grows linearly; it defaults to 0 when omitted.
    Example: "@lomo 0.2 0.8 0.1 0.2 1 0"

====== Effects numbered up to 13 depend on library version 0.2.1.x =========

14. vigblend method: the format is "@vigblend <function> <color> <intensity> <low> <range> <centerX> <centerY> [kind]"
    [kind] is an optional parameter, defaulting to 0:
    0: linear growth; the vignette itself has no alpha channel (alpha is 1)
    1: linear growth; the vignette fades via its own alpha channel
    2: quadratic growth; the vignette itself has no alpha channel (alpha is 1)
    3: quadratic growth; the vignette fades via its own alpha channel
    Examples: "@vigblend ol 0 0 0 1 50 0.02 0.45 0.5 0.5 0"
        "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 0",
        "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 1",
        "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 2",
        "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 3",
    Note: for the meaning and usage of the remaining parameters, see the pixblend and vignette methods.


====== Effects numbered above this note depend on library version 0.3.2.1

15.
selcolor (selective color) method: the format is "@selcolor <color1> <colorargs1> ... <colorN> <colorargsN>"
    <colorN> is the selected color; valid values are:
    red, green, blue, cyan, magenta, yellow, white, gray, black.

    <colorargsN> is the adjustment applied to the selected color, in the form
    (cyan, magenta, yellow, key), each in [-1, 1].
    Each <colorargsN> is four parameters wrapped in parentheses; write 0 for any component that is not adjusted.

====== Added 2014-11-12

16. style method: the format is "@style <function> <arg1> ... <argN>"
    Current <function> values:

    crosshatch: two parameters, spacing in [0, 0.1] and lineWidth in (0, 0.01]

    edge (sobel edge detection): two parameters, mix in [0, 1] and stride in (0, 5]

    emboss: three parameters, mix in [0, 1], stride in [1, 5] and angle in [0, 2π]

    halftone: one parameter, dotSize >= 1

    haze: three parameters, distance in [-0.5, 0.5], slope in [-0.5, 0.5] and color (three components r, g, b, each in [0, 1])

    polkadot: one parameter, dotScaling in (0, 1]

    sketch: one parameter, intensity in [0, 1]

    max (max value): no parameters for now

    min (min value): no parameters for now

    mid (median): no parameters for now

====== Added 2015-2-5

17. beautify method: the format is "@beautify <function> <arg1>...<argN>"

    Current <function> values:

    bilateral (bilateral filter): three parameters, blur radius (blurScale) in [-100, 100], color tolerance (distanceFactor) in [1, 20] and repeat times >= 1.
    repeat times is optional and defaults to 1 when omitted.

====== Added 2015-3-19

18. New functional method unpack: add #unpack at the very beginning of the configuration.
    Effect: removes the MultipleEffects wrapper and adds all parsed effects directly to the handler.

====== Added 2015-8-7

19. blur method
    The @blur method provides various blur algorithms. Format: "@blur <function> <arg1> ... <argN>"

    Current <function> values:

    lerp (lerp blur): two parameters, blur level in [0, 1] and blur base in [0.6, 2.0]


====== Added 2015-8-8

20. dynamic method
    The @dynamic method provides various dynamic filter effects. Format: "@dynamic <function> <arg1> ... <argN>"
    Note: the effect editor does not support editing dynamic methods.

    wave (dynamic ripple, added 2015-11-18): followed by 1, 3 or 4 parameters
    ① with one parameter, the ripple is animated; the parameter is autoMotionSpeed, i.e. the automatic update speed (see the wave filter definition)
    ② with three parameters, the ripple is static; the first is motion, the second angle, the third strength (see the wave filter definition)
    ③ with four parameters, the ripple is animated; the first three are as in ②, and the fourth is autoMotionSpeed

    In other words, with a single parameter, motion, angle and strength all use their default values.


====== Added 2015-11-18 -------------------------------------------------------------------------------- /build.gradle: -------------------------------------------------------------------------------- 1 | // Top-level build file where you can add configuration options common to all sub-projects/modules. 2 | 3 | buildscript { 4 | repositories { 5 | jcenter() 6 | } 7 | dependencies { 8 | classpath 'com.android.tools.build:gradle:2.1.0-alpha1' 9 | 10 | // NOTE: Do not place your application dependencies here; they belong 11 | // in the individual module build.gradle files 12 | } 13 | } 14 | 15 | allprojects { 16 | repositories { 17 | jcenter() 18 | } 19 | } 20 | -------------------------------------------------------------------------------- /cgedemo/.gitignore: -------------------------------------------------------------------------------- 1 | /build 2 | -------------------------------------------------------------------------------- /cgedemo/build.gradle: -------------------------------------------------------------------------------- 1 | apply plugin: 'com.android.application' 2 | 3 | android { 4 | compileSdkVersion 23 5 | buildToolsVersion "22.0.1" 6 | 7 | defaultConfig { 8 | applicationId "org.wysaid.cgedemo" 9 | minSdkVersion 14 10 | targetSdkVersion 23 11 | versionCode 1 12 | versionName "1.0" 13 | } 14 | buildTypes { 15 | release { 16 | minifyEnabled false 17 | proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro' 18 | } 19 | } 20 | 21 | } 22 | 23 | dependencies { 24 | compile fileTree(dir: 'libs', include: ['*.jar']) 25 | compile
'com.android.support:appcompat-v7:22.2.0' 26 | compile project(':library') 27 | } 28 | -------------------------------------------------------------------------------- /cgedemo/proguard-rules.pro: -------------------------------------------------------------------------------- 1 | # Add project specific ProGuard rules here. 2 | # By default, the flags in this file are appended to flags specified 3 | # in /Users/wysaid/android_develop/android-sdk-macosx/tools/proguard/proguard-android.txt 4 | # You can edit the include path and order by changing the proguardFiles 5 | # directive in build.gradle. 6 | # 7 | # For more details, see 8 | # http://developer.android.com/guide/developing/tools/proguard.html 9 | 10 | # Add any project specific keep options here: 11 | 12 | # If your project uses WebView with JS, uncomment the following 13 | # and specify the fully qualified class name to the JavaScript interface 14 | # class: 15 | #-keepclassmembers class fqcn.of.javascript.interface.for.webview { 16 | # public *; 17 | #} 18 | -------------------------------------------------------------------------------- /cgedemo/project.properties: -------------------------------------------------------------------------------- 1 | target=android-16 2 | proguard.config=${sdk.dir}/tools/proguard/proguard-android.txt:proguard-project.txt 3 | android.library.reference.1=../library -------------------------------------------------------------------------------- /cgedemo/src/main/AndroidManifest.xml: -------------------------------------------------------------------------------- -------------------------------------------------------------------------------- /cgedemo/src/main/assets/hehe.jpg: --------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/assets/hehe.jpg -------------------------------------------------------------------------------- /cgedemo/src/main/java/org/wysaid/cgeDemo/FaceDemoActivity.java: -------------------------------------------------------------------------------- 1 | package org.wysaid.cgeDemo; 2 | 3 | import android.content.Intent; 4 | import android.graphics.Bitmap; 5 | import android.graphics.BitmapFactory; 6 | import android.graphics.drawable.BitmapDrawable; 7 | import android.os.Bundle; 8 | import android.support.v7.app.AppCompatActivity; 9 | import android.util.Log; 10 | import android.view.View; 11 | import android.widget.ImageView; 12 | import android.widget.SeekBar; 13 | import android.widget.Toast; 14 | 15 | import org.wysaid.common.Common; 16 | import org.wysaid.demoViews.FaceDemoView; 17 | import org.wysaid.myUtils.ImageUtil; 18 | import org.wysaid.nativePort.CGEFaceFunctions; 19 | 20 | public class FaceDemoActivity extends AppCompatActivity { 21 | 22 | public static final int CHOOSE_PHOTO = 1; 23 | 24 | private FaceDemoView mFaceDemoView; 25 | private ImageView mResultView; 26 | private CGEFaceFunctions.FaceFeature mFirstFaceFeature; 27 | private Bitmap mFirstFaceImage; 28 | private CGEFaceFunctions.FaceFeature mSecondFaceFeature; 29 | private Bitmap mSecondFaceImage; 30 | 31 | @Override 32 | protected void onCreate(Bundle savedInstanceState) { 33 | super.onCreate(savedInstanceState); 34 | setContentView(R.layout.activity_face_demo); 35 | 36 | mFaceDemoView = (FaceDemoView)findViewById(R.id.faceDemoView); 37 | mResultView = (ImageView)findViewById(R.id.resultView); 38 | mResultView.setVisibility(View.GONE); 39 | 40 | Bitmap eye = BitmapFactory.decodeResource(getResources(), R.drawable.adjust_face_eye); 41 | Bitmap mouth = BitmapFactory.decodeResource(getResources(), R.drawable.adjust_face_mouth); 42 | Bitmap chin = 
BitmapFactory.decodeResource(getResources(), R.drawable.adjust_face_chin); 43 | 44 | mFaceDemoView.loadResources(eye, mouth, chin); 45 | 46 | SeekBar seekBar = (SeekBar)findViewById(R.id.seekBar); 47 | seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() { 48 | @Override 49 | public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { 50 | mFaceDemoView.setFaceImageZoom(progress / 20.0f + 0.5f); 51 | } 52 | 53 | @Override 54 | public void onStartTrackingTouch(SeekBar seekBar) { 55 | 56 | } 57 | 58 | @Override 59 | public void onStopTrackingTouch(SeekBar seekBar) { 60 | 61 | } 62 | }); 63 | } 64 | 65 | public void setAsFirstPhoto(View view) { 66 | Log.i(Common.LOG_TAG, "setAsFirstPhoto..."); 67 | mFirstFaceFeature = mFaceDemoView.getFeature(); 68 | mFirstFaceImage = mFaceDemoView.getFaceimage(); 69 | Toast.makeText(this, "当前画面设置为第一张图", Toast.LENGTH_SHORT).show(); 70 | } 71 | 72 | public void setAsSecondPhoto(View view) { 73 | Log.i(Common.LOG_TAG, "setAsSecondPhoto..."); 74 | 75 | mSecondFaceFeature = mFaceDemoView.getFeature(); 76 | mSecondFaceImage = mFaceDemoView.getFaceimage(); 77 | Toast.makeText(this, "当前画面设置为第二张图", Toast.LENGTH_SHORT).show(); 78 | } 79 | 80 | public void showResult(View view) { 81 | Log.i(Common.LOG_TAG, "showResult..."); 82 | Bitmap bmp = CGEFaceFunctions.blendFaceWidthFeatures(mFirstFaceImage, mFirstFaceFeature, mSecondFaceImage, mSecondFaceFeature, CGEFaceFunctions.AutoLumAdjustMode.LumAdjust_OnlyBrightness, null); 83 | mResultView.setImageBitmap(bmp); 84 | mResultView.setVisibility(View.VISIBLE); 85 | mFaceDemoView.setVisibility(View.GONE); 86 | } 87 | 88 | public void choosePhoto(View view) { 89 | mResultView.setVisibility(View.GONE); 90 | mFaceDemoView.setVisibility(View.VISIBLE); 91 | Intent photoPickerIntent = new Intent(Intent.ACTION_PICK); 92 | photoPickerIntent.setType("image/*"); 93 | startActivityForResult(photoPickerIntent, CHOOSE_PHOTO); 94 | } 95 | 96 | public void 
onActivityResult(final int requestCode, final int resultCode, final Intent data) { 97 | 98 | if(resultCode != RESULT_OK) 99 | return; 100 | 101 | switch (requestCode) 102 | { 103 | case CHOOSE_PHOTO: 104 | { 105 | mResultView.setImageURI(data.getData()); 106 | Bitmap result = ((BitmapDrawable)mResultView.getDrawable()).getBitmap(); 107 | if(result.getWidth() > 2000 || result.getHeight() > 2000) { 108 | float scaling = (float)Math.min(2000.0 / result.getWidth(), 2000.0 / result.getHeight()); 109 | result = Bitmap.createScaledBitmap(result, (int)(result.getWidth() * scaling), (int)(result.getHeight() * scaling), false); 110 | } 111 | mFaceDemoView.setFaceImage(result); 112 | } 113 | break; 114 | default:break; 115 | } 116 | } 117 | 118 | public void resetControllers(View view) { 119 | mFaceDemoView.resetControllers(); 120 | mFaceDemoView.postInvalidate(); 121 | } 122 | 123 | public void saveResult(View view) { 124 | Bitmap result = ((BitmapDrawable)mResultView.getDrawable()).getBitmap(); 125 | ImageUtil.saveBitmap(result); 126 | } 127 | 128 | public void useResult(View view) { 129 | Bitmap result = ((BitmapDrawable)mResultView.getDrawable()).getBitmap(); 130 | mFaceDemoView.setFaceImage(result); 131 | } 132 | } 133 | -------------------------------------------------------------------------------- /cgedemo/src/main/java/org/wysaid/cgeDemo/SimplePlayerDemoActivity.java: -------------------------------------------------------------------------------- 1 | package org.wysaid.cgeDemo; 2 | 3 | import android.content.Context; 4 | import android.content.Intent; 5 | import android.graphics.Bitmap; 6 | import android.graphics.BitmapFactory; 7 | import android.media.MediaPlayer; 8 | import android.net.Uri; 9 | import android.support.v7.app.AppCompatActivity; 10 | import android.os.Bundle; 11 | import android.util.Log; 12 | import android.view.Menu; 13 | import android.view.MenuItem; 14 | import android.view.View; 15 | import android.widget.Button; 16 | import 
android.widget.LinearLayout; 17 | import android.widget.Toast; 18 | 19 | import org.wysaid.common.Common; 20 | import org.wysaid.myUtils.FileUtil; 21 | import org.wysaid.myUtils.ImageUtil; 22 | import org.wysaid.texUtils.TextureRenderer; 23 | import org.wysaid.texUtils.TextureRendererDrawOrigin; 24 | import org.wysaid.texUtils.TextureRendererEdge; 25 | import org.wysaid.texUtils.TextureRendererEmboss; 26 | import org.wysaid.texUtils.TextureRendererLerpBlur; 27 | import org.wysaid.texUtils.TextureRendererMask; 28 | import org.wysaid.texUtils.TextureRendererWave; 29 | import org.wysaid.view.SimplePlayerGLSurfaceView; 30 | 31 | public class SimplePlayerDemoActivity extends AppCompatActivity { 32 | 33 | SimplePlayerGLSurfaceView mPlayerView; 34 | Button mShapeBtn; 35 | Button mTakeshotBtn; 36 | Button mGalleryBtn; 37 | 38 | public static final int REQUEST_CODE_PICK_VIDEO = 1; 39 | 40 | private SimplePlayerGLSurfaceView.PlayCompletionCallback playCompletionCallback = new SimplePlayerGLSurfaceView.PlayCompletionCallback() { 41 | @Override 42 | public void playComplete(MediaPlayer player) { 43 | Log.i(Common.LOG_TAG, "The video playing is over, restart..."); 44 | player.start(); 45 | } 46 | 47 | @Override 48 | public boolean playFailed(MediaPlayer player, final int what, final int extra) 49 | { 50 | Toast.makeText(SimplePlayerDemoActivity.this, String.format("Error occured! 
Stop playing, Err code: %d, %d", what, extra), Toast.LENGTH_LONG).show(); 51 | return true; 52 | } 53 | }; 54 | 55 | class MyVideoButton extends Button implements View.OnClickListener { 56 | 57 | Uri videoUri; 58 | SimplePlayerGLSurfaceView videoView; 59 | 60 | public MyVideoButton(Context context) { 61 | super(context); 62 | } 63 | 64 | @Override 65 | public void onClick(View v) { 66 | 67 | Toast.makeText(SimplePlayerDemoActivity.this, "正在准备播放视频 " + videoUri.getHost() + videoUri.getPath() + " 如果是网络视频, 可能需要一段时间的等待", Toast.LENGTH_SHORT).show(); 68 | 69 | videoView.setVideoUri(videoUri, new SimplePlayerGLSurfaceView.PlayPreparedCallback() { 70 | @Override 71 | public void playPrepared(MediaPlayer player) { 72 | mPlayerView.post(new Runnable() { 73 | @Override 74 | public void run() { 75 | String msg = "开始播放 "; 76 | if(videoUri.getHost() != null) 77 | msg += videoUri.getHost(); 78 | if(videoUri.getPath() != null) 79 | msg += videoUri.getPath(); 80 | 81 | Toast.makeText(SimplePlayerDemoActivity.this, msg, Toast.LENGTH_SHORT).show(); 82 | } 83 | }); 84 | 85 | player.start(); 86 | } 87 | }, playCompletionCallback); 88 | } 89 | } 90 | 91 | @Override 92 | protected void onCreate(Bundle savedInstanceState) { 93 | super.onCreate(savedInstanceState); 94 | setContentView(R.layout.activity_simple_player_demo); 95 | mPlayerView = (SimplePlayerGLSurfaceView)findViewById(R.id.videoGLSurfaceView); 96 | // mPlayerView.setZOrderOnTop(false); 97 | 98 | mShapeBtn = (Button)findViewById(R.id.switchShapeBtn); 99 | 100 | mShapeBtn.setOnClickListener(new View.OnClickListener() { 101 | 102 | private boolean useMask = false; 103 | Bitmap bmp; 104 | 105 | @Override 106 | public void onClick(View v) { 107 | useMask = !useMask; 108 | if(useMask) { 109 | if(bmp == null) 110 | bmp = BitmapFactory.decodeResource(getResources(), R.drawable.mask1); 111 | 112 | if(bmp != null) { 113 | mPlayerView.setMaskBitmap(bmp, false, new SimplePlayerGLSurfaceView.SetMaskBitmapCallback() { 114 | @Override 115 | 
public void setMaskOK(TextureRendererMask renderer) { 116 | if(mPlayerView.isUsingMask()) { 117 | renderer.setMaskFlipscale(1.0f, -1.0f); 118 | } 119 | } 120 | 121 | @Override 122 | public void unsetMaskOK(TextureRenderer renderer) { 123 | 124 | } 125 | }); 126 | } 127 | } else { 128 | mPlayerView.setMaskBitmap(null, false); 129 | } 130 | 131 | } 132 | }); 133 | 134 | mTakeshotBtn = (Button)findViewById(R.id.takeShotBtn); 135 | 136 | mTakeshotBtn.setOnClickListener(new View.OnClickListener() { 137 | @Override 138 | public void onClick(View v) { 139 | mPlayerView.takeShot(new SimplePlayerGLSurfaceView.TakeShotCallback() { 140 | @Override 141 | public void takeShotOK(Bitmap bmp) { 142 | if(bmp != null) 143 | ImageUtil.saveBitmap(bmp); 144 | else 145 | Log.e(Common.LOG_TAG, "take shot failed!"); 146 | } 147 | }); 148 | } 149 | }); 150 | 151 | LinearLayout menuLayout = (LinearLayout)findViewById(R.id.menuLinearLayout); 152 | 153 | String[] filePaths = { 154 | "android.resource://" + getPackageName() + "/" + R.raw.test, 155 | "http://wge.wysaid.org/res/video/1.mp4", //网络视频 156 | "http://wysaid.org/p/test.mp4", //网络视频 157 | }; 158 | 159 | { 160 | Button btn = new Button(this); 161 | menuLayout.addView(btn); 162 | btn.setAllCaps(false); 163 | btn.setText("Last Recorded Video"); 164 | btn.setOnClickListener(new View.OnClickListener() { 165 | @Override 166 | public void onClick(View v) { 167 | 168 | String lastVideoFileName = FileUtil.getTextContent(CameraDemoActivity.lastVideoPathFileName); 169 | if(lastVideoFileName == null) { 170 | Toast.makeText(SimplePlayerDemoActivity.this, "No video is recorded, please record one in the 2nd case.", Toast.LENGTH_LONG).show(); 171 | return; 172 | } 173 | 174 | Uri lastVideoUri = Uri.parse(lastVideoFileName); 175 | mPlayerView.setVideoUri(lastVideoUri, new SimplePlayerGLSurfaceView.PlayPreparedCallback() { 176 | @Override 177 | public void playPrepared(MediaPlayer player) { 178 | Log.i(Common.LOG_TAG, "The video is prepared to play"); 
179 | player.start(); 180 | } 181 | }, playCompletionCallback); 182 | } 183 | }); 184 | } 185 | 186 | for(int i = 0; i != filePaths.length; ++i) { 187 | MyVideoButton btn = new MyVideoButton(this); 188 | btn.setText("视频" + i); 189 | btn.videoUri = Uri.parse(filePaths[i]); 190 | btn.videoView = mPlayerView; 191 | btn.setOnClickListener(btn); 192 | menuLayout.addView(btn); 193 | 194 | if(i == 0) { 195 | btn.onClick(btn); 196 | } 197 | } 198 | 199 | Button filterButton = new Button(this); 200 | menuLayout.addView(filterButton); 201 | filterButton.setText("切换滤镜"); 202 | filterButton.setOnClickListener(new View.OnClickListener() { 203 | private int filterIndex; 204 | 205 | @Override 206 | public void onClick(View v) { 207 | mPlayerView.queueEvent(new Runnable() { 208 | @Override 209 | public void run() { 210 | TextureRenderer drawer = null; 211 | ++filterIndex; 212 | filterIndex %= 5; 213 | switch (filterIndex) { 214 | case 0: //LerpBlur 215 | { 216 | TextureRendererLerpBlur lerpBlurDrawer = TextureRendererLerpBlur.create(true); 217 | if(lerpBlurDrawer != null) { 218 | lerpBlurDrawer.setIntensity(16); 219 | drawer = lerpBlurDrawer; 220 | } 221 | } 222 | break; 223 | 224 | case 1: 225 | { 226 | TextureRendererEdge edgeDrawer = TextureRendererEdge.create(true); 227 | if(edgeDrawer != null) { 228 | drawer = edgeDrawer; 229 | } 230 | } 231 | break; 232 | 233 | case 2: 234 | { 235 | TextureRendererEmboss embossDrawer = TextureRendererEmboss.create(true); 236 | if(embossDrawer != null) { 237 | drawer = embossDrawer; 238 | } 239 | } 240 | break; 241 | 242 | case 3: 243 | { 244 | TextureRendererWave waveDrawer = TextureRendererWave.create(true); 245 | if(waveDrawer != null) { 246 | waveDrawer.setAutoMotion(0.4f); 247 | drawer = waveDrawer; 248 | } 249 | } 250 | break; 251 | 252 | case 4: 253 | { 254 | TextureRendererDrawOrigin originDrawer = TextureRendererDrawOrigin.create(true); 255 | if(originDrawer != null) { 256 | drawer = originDrawer; 257 | } 258 | } 259 | break; 260 | 
261 | default:break; 262 | } 263 | 264 | mPlayerView.setTextureRenderer(drawer); 265 | } 266 | }); 267 | } 268 | }); 269 | 270 | mGalleryBtn = (Button)findViewById(R.id.galleryBtn); 271 | mGalleryBtn.setOnClickListener(galleryBtnClickListener); 272 | 273 | Button fitViewBtn = (Button)findViewById(R.id.fitViewBtn); 274 | fitViewBtn.setOnClickListener(new View.OnClickListener() { 275 | boolean shouldFit = false; 276 | @Override 277 | public void onClick(View v) { 278 | shouldFit = !shouldFit; 279 | mPlayerView.setFitFullView(shouldFit); 280 | } 281 | }); 282 | 283 | mPlayerView.setPlayerInitializeCallback(new SimplePlayerGLSurfaceView.PlayerInitializeCallback() { 284 | @Override 285 | public void initPlayer(final MediaPlayer player) { 286 | //针对网络视频进行进度检查 287 | player.setOnBufferingUpdateListener(new MediaPlayer.OnBufferingUpdateListener() { 288 | @Override 289 | public void onBufferingUpdate(MediaPlayer mp, int percent) { 290 | Log.i(Common.LOG_TAG, "Buffer update: " + percent); 291 | if(percent == 100) { 292 | Log.i(Common.LOG_TAG, "缓冲完毕!"); 293 | player.setOnBufferingUpdateListener(null); 294 | } 295 | } 296 | }); 297 | } 298 | }); 299 | } 300 | 301 | android.view.View.OnClickListener galleryBtnClickListener = new android.view.View.OnClickListener(){ 302 | 303 | @Override 304 | public void onClick(final android.view.View view) { 305 | Intent videoPickerIntent = new Intent(Intent.ACTION_GET_CONTENT); 306 | videoPickerIntent.setType("video/*"); 307 | startActivityForResult(videoPickerIntent, REQUEST_CODE_PICK_VIDEO); 308 | } 309 | }; 310 | 311 | public void onActivityResult(final int requestCode, final int resultCode, final Intent data) { 312 | switch (requestCode) 313 | { 314 | case REQUEST_CODE_PICK_VIDEO: 315 | if(resultCode == RESULT_OK) 316 | { 317 | 318 | mPlayerView.setVideoUri(data.getData(), new SimplePlayerGLSurfaceView.PlayPreparedCallback() { 319 | @Override 320 | public void playPrepared(MediaPlayer player) { 321 | Log.i(Common.LOG_TAG, "The video is 
prepared to play"); 322 | player.start(); 323 | } 324 | }, playCompletionCallback); 325 | } 326 | default: break; 327 | } 328 | } 329 | 330 | @Override 331 | public void onPause() { 332 | super.onPause(); 333 | Log.i(SimplePlayerGLSurfaceView.LOG_TAG, "activity onPause..."); 334 | mPlayerView.release(); 335 | mPlayerView.onPause(); 336 | } 337 | 338 | @Override 339 | public void onResume() { 340 | super.onResume(); 341 | 342 | mPlayerView.onResume(); 343 | } 344 | 345 | @Override 346 | public boolean onCreateOptionsMenu(Menu menu) { 347 | // Inflate the menu; this adds items to the action bar if it is present. 348 | getMenuInflater().inflate(R.menu.menu_video_player, menu); 349 | return true; 350 | } 351 | 352 | @Override 353 | public boolean onOptionsItemSelected(MenuItem item) { 354 | // Handle action bar item clicks here. The action bar will 355 | // automatically handle clicks on the Home/Up button, so long 356 | // as you specify a parent activity in AndroidManifest.xml. 357 | int id = item.getItemId(); 358 | 359 | //noinspection SimplifiableIfStatement 360 | if (id == R.id.action_settings) { 361 | return true; 362 | } 363 | 364 | return super.onOptionsItemSelected(item); 365 | } 366 | } 367 | -------------------------------------------------------------------------------- /cgedemo/src/main/java/org/wysaid/cgeDemo/VideoPlayerDemoActivity.java: -------------------------------------------------------------------------------- 1 | package org.wysaid.cgeDemo; 2 | 3 | import android.content.Context; 4 | import android.content.Intent; 5 | import android.graphics.Bitmap; 6 | import android.graphics.BitmapFactory; 7 | import android.media.MediaPlayer; 8 | import android.net.Uri; 9 | import android.support.v7.app.AppCompatActivity; 10 | import android.os.Bundle; 11 | import android.util.Log; 12 | import android.view.Menu; 13 | import android.view.MenuItem; 14 | import android.view.View; 15 | import android.widget.Button; 16 | import android.widget.LinearLayout; 17 | 
import android.widget.SeekBar; 18 | import android.widget.Toast; 19 | 20 | import org.wysaid.common.Common; 21 | import org.wysaid.myUtils.FileUtil; 22 | import org.wysaid.myUtils.ImageUtil; 23 | import org.wysaid.nativePort.CGEFrameRenderer; 24 | import org.wysaid.view.VideoPlayerGLSurfaceView; 25 | 26 | public class VideoPlayerDemoActivity extends AppCompatActivity { 27 | 28 | VideoPlayerGLSurfaceView mPlayerView; 29 | Button mShapeBtn; 30 | Button mTakeshotBtn; 31 | Button mGalleryBtn; 32 | 33 | String mCurrentConfig; 34 | 35 | public static final int REQUEST_CODE_PICK_VIDEO = 1; 36 | 37 | private VideoPlayerGLSurfaceView.PlayCompletionCallback playCompletionCallback = new VideoPlayerGLSurfaceView.PlayCompletionCallback() { 38 | @Override 39 | public void playComplete(MediaPlayer player) { 40 | Log.i(Common.LOG_TAG, "Video playback is over, restarting..."); 41 | player.start(); 42 | } 43 | 44 | @Override 45 | public boolean playFailed(MediaPlayer player, final int what, final int extra) 46 | { 47 | Toast.makeText(VideoPlayerDemoActivity.this, String.format("Error occurred! 
Stop playing, Err code: %d, %d", what, extra), Toast.LENGTH_LONG).show(); 48 | return true; 49 | } 50 | }; 51 | 52 | class MyVideoButton extends Button implements View.OnClickListener { 53 | 54 | Uri videoUri; 55 | VideoPlayerGLSurfaceView videoView; 56 | 57 | public MyVideoButton(Context context) { 58 | super(context); 59 | } 60 | 61 | @Override 62 | public void onClick(View v) { 63 | 64 | Toast.makeText(VideoPlayerDemoActivity.this, "正在准备播放视频 " + videoUri.getHost() + videoUri.getPath() + " 如果是网络视频, 可能需要一段时间的等待", Toast.LENGTH_SHORT).show(); 65 | 66 | videoView.setVideoUri(videoUri, new VideoPlayerGLSurfaceView.PlayPreparedCallback() { 67 | @Override 68 | public void playPrepared(MediaPlayer player) { 69 | mPlayerView.post(new Runnable() { 70 | @Override 71 | public void run() { 72 | Toast.makeText(VideoPlayerDemoActivity.this, "开始播放 " + videoUri.getHost() + videoUri.getPath(), Toast.LENGTH_SHORT).show(); 73 | } 74 | }); 75 | 76 | player.start(); 77 | } 78 | }, playCompletionCallback); 79 | } 80 | } 81 | 82 | @Override 83 | protected void onCreate(Bundle savedInstanceState) { 84 | super.onCreate(savedInstanceState); 85 | setContentView(R.layout.activity_video_player_demo); 86 | mPlayerView = (VideoPlayerGLSurfaceView)findViewById(R.id.videoGLSurfaceView); 87 | mPlayerView.setZOrderOnTop(false); 88 | mPlayerView.setZOrderMediaOverlay(true); 89 | 90 | mShapeBtn = (Button)findViewById(R.id.switchShapeBtn); 91 | 92 | mShapeBtn.setOnClickListener(new View.OnClickListener() { 93 | 94 | private boolean useMask = false; 95 | Bitmap bmp; 96 | 97 | @Override 98 | public void onClick(View v) { 99 | useMask = !useMask; 100 | if(useMask) { 101 | if(bmp == null) 102 | bmp = BitmapFactory.decodeResource(getResources(), R.drawable.mask1); 103 | 104 | if(bmp != null) { 105 | mPlayerView.setMaskBitmap(bmp, false, new VideoPlayerGLSurfaceView.SetMaskBitmapCallback() { 106 | @Override 107 | public void setMaskOK(CGEFrameRenderer renderer) { 108 | // if(mPlayerView.isUsingMask()) { 109 
| // renderer.setMaskFlipScale(1.0f, -1.0f); 110 | // } 111 | Log.i(Common.LOG_TAG, "Mask enabled!"); 112 | } 113 | }); 114 | } 115 | } else { 116 | mPlayerView.setMaskBitmap(null, false); 117 | } 118 | 119 | } 120 | }); 121 | 122 | mTakeshotBtn = (Button)findViewById(R.id.takeShotBtn); 123 | 124 | mTakeshotBtn.setOnClickListener(new View.OnClickListener() { 125 | @Override 126 | public void onClick(View v) { 127 | mPlayerView.takeShot(new VideoPlayerGLSurfaceView.TakeShotCallback() { 128 | @Override 129 | public void takeShotOK(Bitmap bmp) { 130 | if(bmp != null) 131 | ImageUtil.saveBitmap(bmp); 132 | else 133 | Log.e(Common.LOG_TAG, "take shot failed!"); 134 | } 135 | }); 136 | } 137 | }); 138 | 139 | LinearLayout menuLayout = (LinearLayout)findViewById(R.id.menuLinearLayout); 140 | 141 | { 142 | Button btn = new Button(this); 143 | menuLayout.addView(btn); 144 | btn.setAllCaps(false); 145 | btn.setText("Last Recorded Video"); 146 | btn.setOnClickListener(new View.OnClickListener() { 147 | @Override 148 | public void onClick(View v) { 149 | 150 | String lastVideoFileName = FileUtil.getTextContent(CameraDemoActivity.lastVideoPathFileName); 151 | if(lastVideoFileName == null) { 152 | Toast.makeText(VideoPlayerDemoActivity.this, "No video is recorded, please record one in the 2nd case.", Toast.LENGTH_LONG).show(); 153 | return; 154 | } 155 | 156 | Uri lastVideoUri = Uri.parse(lastVideoFileName); 157 | mPlayerView.setVideoUri(lastVideoUri, new VideoPlayerGLSurfaceView.PlayPreparedCallback() { 158 | @Override 159 | public void playPrepared(MediaPlayer player) { 160 | Log.i(Common.LOG_TAG, "The video is prepared to play"); 161 | player.start(); 162 | } 163 | }, playCompletionCallback); 164 | } 165 | }); 166 | } 167 | 168 | String[] filePaths = { 169 | "android.resource://" + getPackageName() + "/" + R.raw.test, 170 | "http://wge.wysaid.org/res/video/1.mp4", // network video 171 | "http://wysaid.org/p/test.mp4", // network video 172 | }; 173 | 174 | for(int i = 0; i != filePaths.length; ++i) { 
175 | MyVideoButton btn = new MyVideoButton(this); 176 | btn.setText("视频" + i); 177 | btn.videoUri = Uri.parse(filePaths[i]); 178 | btn.videoView = mPlayerView; 179 | btn.setOnClickListener(btn); 180 | menuLayout.addView(btn); 181 | 182 | if(i == 0) { 183 | btn.onClick(btn); 184 | } 185 | } 186 | 187 | for(int i = 0; i != MainActivity.effectConfigs.length; ++i) { 188 | CameraDemoActivity.MyButtons button = new CameraDemoActivity.MyButtons(this, MainActivity.effectConfigs[i]); 189 | button.setText("特效" + i); 190 | button.setOnClickListener(mFilterSwitchListener); 191 | menuLayout.addView(button); 192 | } 193 | 194 | mGalleryBtn = (Button)findViewById(R.id.galleryBtn); 195 | mGalleryBtn.setOnClickListener(galleryBtnClickListener); 196 | 197 | Button fitViewBtn = (Button)findViewById(R.id.fitViewBtn); 198 | fitViewBtn.setOnClickListener(new View.OnClickListener() { 199 | boolean shouldFit = false; 200 | @Override 201 | public void onClick(View v) { 202 | shouldFit = !shouldFit; 203 | mPlayerView.setFitFullView(shouldFit); 204 | } 205 | }); 206 | 207 | mPlayerView.setPlayerInitializeCallback(new VideoPlayerGLSurfaceView.PlayerInitializeCallback() { 208 | @Override 209 | public void initPlayer(final MediaPlayer player) { 210 | // Track the buffering progress of network videos. 211 | player.setOnBufferingUpdateListener(new MediaPlayer.OnBufferingUpdateListener() { 212 | @Override 213 | public void onBufferingUpdate(MediaPlayer mp, int percent) { 214 | Log.i(Common.LOG_TAG, "Buffer update: " + percent); 215 | if(percent == 100) { 216 | Log.i(Common.LOG_TAG, "Buffering complete!"); 217 | player.setOnBufferingUpdateListener(null); 218 | } 219 | } 220 | }); 221 | } 222 | }); 223 | 224 | SeekBar seekBar = (SeekBar) findViewById(R.id.seekBar); 225 | 226 | seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() { 227 | @Override 228 | public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { 229 | float intensity = progress / 100.0f; 230 | mPlayerView.setFilterIntensity(intensity); 231 | 
} 232 | 233 | @Override 234 | public void onStartTrackingTouch(SeekBar seekBar) { 235 | 236 | } 237 | 238 | @Override 239 | public void onStopTrackingTouch(SeekBar seekBar) { 240 | 241 | } 242 | }); 243 | 244 | } 245 | 246 | private View.OnClickListener mFilterSwitchListener = new View.OnClickListener() { 247 | @Override 248 | public void onClick(View v) { 249 | CameraDemoActivity.MyButtons btn = (CameraDemoActivity.MyButtons)v; 250 | mPlayerView.setFilterWithConfig(btn.filterConfig); 251 | mCurrentConfig = btn.filterConfig; 252 | } 253 | }; 254 | 255 | android.view.View.OnClickListener galleryBtnClickListener = new android.view.View.OnClickListener(){ 256 | 257 | @Override 258 | public void onClick(final android.view.View view) { 259 | Intent videoPickerIntent = new Intent(Intent.ACTION_GET_CONTENT); 260 | videoPickerIntent.setType("video/*"); 261 | startActivityForResult(videoPickerIntent, REQUEST_CODE_PICK_VIDEO); 262 | } 263 | }; 264 | 265 | public void onActivityResult(final int requestCode, final int resultCode, final Intent data) { 266 | switch (requestCode) 267 | { 268 | case REQUEST_CODE_PICK_VIDEO: 269 | if(resultCode == RESULT_OK) 270 | { 271 | 272 | mPlayerView.setVideoUri(data.getData(), new VideoPlayerGLSurfaceView.PlayPreparedCallback() { 273 | @Override 274 | public void playPrepared(MediaPlayer player) { 275 | Log.i(Common.LOG_TAG, "The video is prepared to play"); 276 | player.start(); 277 | } 278 | }, playCompletionCallback); 279 | } 280 | default: break; 281 | } 282 | } 283 | 284 | @Override 285 | public void onPause() { 286 | super.onPause(); 287 | Log.i(VideoPlayerGLSurfaceView.LOG_TAG, "activity onPause..."); 288 | mPlayerView.release(); 289 | mPlayerView.onPause(); 290 | } 291 | 292 | @Override 293 | public void onResume() { 294 | super.onResume(); 295 | 296 | mPlayerView.onResume(); 297 | } 298 | 299 | @Override 300 | public boolean onCreateOptionsMenu(Menu menu) { 301 | // Inflate the menu; this adds items to the action bar if it is 
present. 302 | getMenuInflater().inflate(R.menu.menu_video_player, menu); 303 | return true; 304 | } 305 | 306 | @Override 307 | public boolean onOptionsItemSelected(MenuItem item) { 308 | // Handle action bar item clicks here. The action bar will 309 | // automatically handle clicks on the Home/Up button, so long 310 | // as you specify a parent activity in AndroidManifest.xml. 311 | int id = item.getItemId(); 312 | 313 | //noinspection SimplifiableIfStatement 314 | if (id == R.id.action_settings) { 315 | return true; 316 | } 317 | 318 | return super.onOptionsItemSelected(item); 319 | } 320 | } 321 | -------------------------------------------------------------------------------- /cgedemo/src/main/java/org/wysaid/demoViews/FaceDemoView.java: -------------------------------------------------------------------------------- 1 | package org.wysaid.demoViews; 2 | 3 | import android.content.Context; 4 | import android.graphics.Bitmap; 5 | import android.graphics.Canvas; 6 | import android.graphics.Color; 7 | import android.graphics.Matrix; 8 | import android.graphics.Paint; 9 | import android.graphics.PointF; 10 | import android.media.FaceDetector; 11 | import android.util.AttributeSet; 12 | import android.util.Log; 13 | import android.view.MotionEvent; 14 | import android.view.View; 15 | import android.view.View.OnTouchListener; 16 | import android.widget.Toast; 17 | 18 | import org.wysaid.common.Common; 19 | import org.wysaid.myUtils.ImageUtil; 20 | import org.wysaid.nativePort.CGEFaceFunctions; 21 | 22 | /** 23 | * Created by wysaid on 15/12/24. 
24 | * Mail: admin@wysaid.org 25 | * blog: wysaid.org 26 | */ 27 | public class FaceDemoView extends View implements OnTouchListener{ 28 | 29 | public static class FaceController { 30 | public Bitmap image; 31 | public PointF pos; 32 | public float rotation; 33 | public float halfWidth, halfHeight; 34 | 35 | FaceController(Bitmap img) { 36 | image = img; 37 | pos = new PointF(0, 0); 38 | halfWidth = img.getWidth() / 2.0f; 39 | halfHeight = img.getHeight() / 2.0f; 40 | } 41 | 42 | public void drawController(Canvas canvas) { 43 | Matrix matrix = new Matrix(); 44 | matrix.setTranslate(pos.x - halfWidth, pos.y - halfHeight); 45 | matrix.postRotate(rotation, pos.x, pos.y); 46 | canvas.drawBitmap(image, matrix, null); 47 | } 48 | 49 | public boolean onTouchPosition(float x, float y) { 50 | if(x < pos.x - halfWidth || x > pos.x + halfWidth || y < pos.y - halfHeight || y > pos.y + halfHeight) 51 | return false; 52 | return true; 53 | } 54 | } 55 | 56 | private Bitmap mFaceImage; 57 | private FaceController mLeftEyeController, mRightEyeController; 58 | private FaceController mMouthController; 59 | private FaceController mChinController; 60 | private FaceController mSelectedController; 61 | 62 | public Bitmap getFaceimage() { 63 | return mFaceImage; 64 | } 65 | 66 | PointF mOriginPos = new PointF(0.0f, 0.0f); 67 | PointF mLastFingerPos = new PointF(); 68 | float mImageScaling = 1.0f; 69 | 70 | public FaceDemoView(Context context, AttributeSet attrs) { 71 | super(context, attrs); 72 | setOnTouchListener(this); 73 | } 74 | 75 | public void loadResources(Bitmap eye, Bitmap mouth, Bitmap chin) { 76 | mLeftEyeController = new FaceController(eye); 77 | mRightEyeController = new FaceController(eye); 78 | mMouthController = new FaceController(mouth); 79 | mChinController = new FaceController(chin); 80 | } 81 | 82 | public static float getProperScaling(Bitmap bmp, View view) { 83 | float bmpWidth = bmp.getWidth(); 84 | float bmpHeight = bmp.getHeight(); 85 | float viewWidth = 
view.getWidth(); 86 | float viewHeight = view.getHeight(); 87 | 88 | return Math.min(viewWidth / bmpWidth, viewHeight / bmpHeight); 89 | } 90 | 91 | public void setFaceImage(Bitmap faceImage) { 92 | mFaceImage = faceImage; 93 | 94 | mImageScaling = getProperScaling(mFaceImage, this); 95 | 96 | mOriginPos.x = (this.getWidth() - mFaceImage.getWidth() * mImageScaling) / 2.0f; 97 | mOriginPos.y = (this.getHeight() - mFaceImage.getHeight() * mImageScaling) / 2.0f; 98 | 99 | ImageUtil.FaceRects faceRects = ImageUtil.findFaceByBitmap(faceImage); 100 | 101 | if(faceRects.numOfFaces > 0) { 102 | String content = ""; 103 | 104 | FaceDetector.Face face = faceRects.faces[0]; 105 | PointF pnt = new PointF(); 106 | face.getMidPoint(pnt); 107 | 108 | float eyeDis = face.eyesDistance(); 109 | float halfEyeDis = eyeDis / 2.0f; 110 | 111 | content += String.format("准确率: %g, 人脸中心 %g, %g, 眼间距: %g\n", face.confidence(), pnt.x, pnt.y, eyeDis); 112 | 113 | CGEFaceFunctions.FaceFeature feature = new CGEFaceFunctions.FaceFeature(); 114 | 115 | feature.leftEyePos = new PointF(pnt.x - halfEyeDis, pnt.y); 116 | feature.rightEyePos = new PointF(pnt.x + halfEyeDis, pnt.y); 117 | feature.mouthPos = new PointF(pnt.x, pnt.y + eyeDis); 118 | feature.chinPos = new PointF(pnt.x, pnt.y + eyeDis * 1.5f); 119 | 120 | // if(!mFaceImage.isMutable()) 121 | // mFaceImage = mFaceImage.copy(mFaceImage.getConfig(), true); 122 | // 123 | // Canvas canvas = new Canvas(mFaceImage); 124 | // Paint paint = new Paint(); 125 | // paint.setColor(Color.RED); 126 | // paint.setStyle(Paint.Style.STROKE); 127 | // paint.setStrokeWidth(4); 128 | // canvas.drawRect((int) (pnt.x - eyeDis * 1.5f), (int) (pnt.y - eyeDis * 1.5f), (int) (pnt.x + eyeDis * 1.5f), (int) (pnt.y + eyeDis * 1.5f), paint); 129 | // 130 | // // eye center 131 | // canvas.drawRect((int) (pnt.x - 2.0f), (int) (pnt.y - 2.0f), (int) (pnt.x + 2.0f), (int) (pnt.y + 2.0f), paint); 132 | // 133 | // // both eyes 134 | // canvas.drawRect((int) (pnt.x - halfEyeDis - 2.0f), 
(int) (pnt.y - 2.0f), (int) (pnt.x - halfEyeDis + 2.0f), (int) (pnt.y + 2.0f), paint); 135 | // canvas.drawRect((int) (pnt.x + halfEyeDis - 2.0f), (int) (pnt.y - 2.0f), (int) (pnt.x + halfEyeDis + 2.0f), (int) (pnt.y + 2.0f), paint); 136 | 137 | mLeftEyeController.pos = new PointF(feature.leftEyePos.x * mImageScaling + mOriginPos.x, feature.leftEyePos.y * mImageScaling + mOriginPos.y); 138 | mRightEyeController.pos = new PointF(feature.rightEyePos.x * mImageScaling + mOriginPos.x, feature.rightEyePos.y * mImageScaling + mOriginPos.y); 139 | mMouthController.pos = new PointF(feature.mouthPos.x * mImageScaling + mOriginPos.x, feature.mouthPos.y * mImageScaling + mOriginPos.y); 140 | mChinController.pos = new PointF(feature.chinPos.x * mImageScaling + mOriginPos.x, feature.chinPos.y * mImageScaling + mOriginPos.y); 141 | 142 | Toast.makeText(this.getContext(), content, Toast.LENGTH_SHORT).show(); 143 | } else { 144 | Toast.makeText(this.getContext(), "检测人脸失败, 请调整控件位置", Toast.LENGTH_LONG).show(); 145 | resetControllers(); 146 | } 147 | 148 | calcAngles(); 149 | postInvalidate(); 150 | } 151 | 152 | public CGEFaceFunctions.FaceFeature getFeature() { 153 | 154 | CGEFaceFunctions.FaceFeature feature = new CGEFaceFunctions.FaceFeature(); 155 | 156 | feature.leftEyePos.x = (mLeftEyeController.pos.x - mOriginPos.x) / mImageScaling; 157 | feature.leftEyePos.y = (mLeftEyeController.pos.y - mOriginPos.y) / mImageScaling; 158 | 159 | feature.rightEyePos.x = (mRightEyeController.pos.x - mOriginPos.x) / mImageScaling; 160 | feature.rightEyePos.y = (mRightEyeController.pos.y - mOriginPos.y) / mImageScaling; 161 | 162 | feature.mouthPos.x = (mMouthController.pos.x - mOriginPos.x) / mImageScaling; 163 | feature.mouthPos.y = (mMouthController.pos.y - mOriginPos.y) / mImageScaling; 164 | 165 | feature.chinPos.x = (mChinController.pos.x - mOriginPos.x) / mImageScaling; 166 | feature.chinPos.y = (mChinController.pos.y - mOriginPos.y) / mImageScaling; 167 | 168 | feature.faceImageWidth = 
mFaceImage.getWidth(); 169 | feature.faceImageHeight = mFaceImage.getHeight(); 170 | 171 | return feature; 172 | } 173 | 174 | public void setFaceImageZoom(float zoom) { 175 | mImageScaling = zoom; 176 | postInvalidate(); 177 | } 178 | 179 | public void resetControllers() { 180 | mLeftEyeController.pos.x = getWidth() / 2.0f - 100; 181 | mLeftEyeController.pos.y = getHeight() / 2.0f - 100; 182 | 183 | mRightEyeController.pos.x = getWidth() / 2.0f + 100; 184 | mRightEyeController.pos.y = getHeight() / 2.0f - 100; 185 | 186 | mMouthController.pos.x = getWidth() / 2.0f; 187 | mMouthController.pos.y = getHeight() / 2.0f + 50; 188 | 189 | mChinController.pos.x = getWidth() / 2.0f; 190 | mChinController.pos.y = getHeight() / 2.0f + 150; 191 | } 192 | 193 | private void calcAngles() { 194 | 195 | double dirX = mRightEyeController.pos.x - mLeftEyeController.pos.x; 196 | double dirY = mRightEyeController.pos.y - mLeftEyeController.pos.y; 197 | 198 | double len = Math.sqrt(dirX * dirX + dirY * dirY); 199 | if(len < 0.001) 200 | return; 201 | 202 | double angle = Math.asin(dirY / len); 203 | 204 | if(dirX >= 0.0) { 205 | if(dirY < 0) 206 | angle += 2.0 * Math.PI; 207 | 208 | } else { 209 | angle = Math.PI - angle; 210 | } 211 | 212 | float rotation = (float)(angle * 180.0 / Math.PI); 213 | 214 | mLeftEyeController.rotation = rotation; 215 | mRightEyeController.rotation = rotation; 216 | mMouthController.rotation = rotation; 217 | mChinController.rotation = rotation; 218 | } 219 | 220 | @Override 221 | public boolean onTouch(View v, MotionEvent event) { 222 | 223 | float touchX = event.getX(); 224 | float touchY = event.getY(); 225 | 226 | int action = event.getAction(); 227 | 228 | switch (action) 229 | { 230 | case MotionEvent.ACTION_DOWN: 231 | 232 | mSelectedController = null; 233 | 234 | if(mLeftEyeController.onTouchPosition(touchX, touchY)) 235 | mSelectedController = mLeftEyeController; 236 | else if(mRightEyeController.onTouchPosition(touchX, touchY)) 237 | 
mSelectedController = mRightEyeController; 238 | else if(mMouthController.onTouchPosition(touchX, touchY)) 239 | mSelectedController = mMouthController; 240 | else if(mChinController.onTouchPosition(touchX, touchY)) 241 | mSelectedController = mChinController; 242 | 243 | mLastFingerPos.x = touchX; 244 | mLastFingerPos.y = touchY; 245 | 246 | if(mSelectedController != null) { 247 | 248 | mSelectedController.pos.y -= 100; 249 | postInvalidate(); 250 | Log.i(Common.LOG_TAG, "Controller selected!"); 251 | } 252 | break; 253 | case MotionEvent.ACTION_MOVE: 254 | if(mSelectedController != null) { 255 | mSelectedController.pos.x += touchX - mLastFingerPos.x; 256 | mSelectedController.pos.y += touchY - mLastFingerPos.y; 257 | 258 | if(mSelectedController == mLeftEyeController || mSelectedController == mRightEyeController) 259 | calcAngles(); 260 | } else { 261 | mOriginPos.x += touchX - mLastFingerPos.x; 262 | mOriginPos.y += touchY - mLastFingerPos.y; 263 | } 264 | 265 | mLastFingerPos.x = touchX; 266 | mLastFingerPos.y = touchY; 267 | 268 | postInvalidate(); 269 | 270 | break; 271 | case MotionEvent.ACTION_UP: 272 | mSelectedController = null; 273 | break; 274 | default: 275 | break; 276 | } 277 | 278 | return true; 279 | } 280 | 281 | @Override 282 | public void onDraw(Canvas canvas) { 283 | 284 | if(mFaceImage == null) 285 | return; 286 | 287 | Matrix matrix = new Matrix(); 288 | matrix.setScale(mImageScaling, mImageScaling); 289 | matrix.postTranslate(mOriginPos.x, mOriginPos.y); 290 | canvas.drawBitmap(mFaceImage, matrix, null); 291 | 292 | mLeftEyeController.drawController(canvas); 293 | mRightEyeController.drawController(canvas); 294 | mMouthController.drawController(canvas); 295 | mChinController.drawController(canvas); 296 | 297 | } 298 | 299 | } 300 | -------------------------------------------------------------------------------- /cgedemo/src/main/res/drawable-nodpi/adjust_face_chin.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/adjust_face_chin.png -------------------------------------------------------------------------------- /cgedemo/src/main/res/drawable-nodpi/adjust_face_eye.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/adjust_face_eye.png -------------------------------------------------------------------------------- /cgedemo/src/main/res/drawable-nodpi/adjust_face_mouth.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/adjust_face_mouth.png -------------------------------------------------------------------------------- /cgedemo/src/main/res/drawable-nodpi/bgview.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/bgview.png -------------------------------------------------------------------------------- /cgedemo/src/main/res/drawable-nodpi/face0.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/face0.jpg -------------------------------------------------------------------------------- /cgedemo/src/main/res/drawable-nodpi/face1.jpg: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/face1.jpg -------------------------------------------------------------------------------- /cgedemo/src/main/res/drawable-nodpi/mask1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/mask1.png -------------------------------------------------------------------------------- /cgedemo/src/main/res/layout/activity_camera_demo.xml: -------------------------------------------------------------------------------- (XML element markup stripped during text extraction; layout content not recoverable)