├── .gitignore
├── README.md
├── build.gradle
├── cgedemo
├── .gitignore
├── build.gradle
├── proguard-rules.pro
├── project.properties
└── src
│ └── main
│ ├── AndroidManifest.xml
│ ├── assets
│ └── hehe.jpg
│ ├── java
│ └── org
│ │ └── wysaid
│ │ ├── cgeDemo
│ │ ├── CameraDemoActivity.java
│ │ ├── FaceDemoActivity.java
│ │ ├── ImageDemoActivity.java
│ │ ├── MainActivity.java
│ │ ├── SimplePlayerDemoActivity.java
│ │ └── VideoPlayerDemoActivity.java
│ │ └── demoViews
│ │ └── FaceDemoView.java
│ └── res
│ ├── drawable-nodpi
│ ├── adjust_face_chin.png
│ ├── adjust_face_eye.png
│ ├── adjust_face_mouth.png
│ ├── bgview.png
│ ├── face0.jpg
│ ├── face1.jpg
│ └── mask1.png
│ ├── layout
│ ├── activity_camera_demo.xml
│ ├── activity_face_demo.xml
│ ├── activity_filter_demo.xml
│ ├── activity_main.xml
│ ├── activity_simple_player_demo.xml
│ └── activity_video_player_demo.xml
│ ├── menu
│ ├── menu_camera_demo.xml
│ ├── menu_filter_demo.xml
│ ├── menu_main.xml
│ └── menu_video_player.xml
│ ├── mipmap-nodpi
│ └── ic_launcher.png
│ ├── raw
│ └── test.mp4
│ ├── values-w820dp
│ └── dimens.xml
│ └── values
│ ├── dimens.xml
│ ├── strings.xml
│ └── styles.xml
├── demoRelease
└── cgeDemo.apk
├── gradle.properties
├── gradlew
├── gradlew.bat
├── library
├── .gitignore
├── build.gradle
├── proguard-rules.pro
├── project.properties
├── so
│ └── libgpuimage-library.so
└── src
│ └── main
│ ├── AndroidManifest.xml
│ ├── genJNIHeaders
│ ├── java
│ └── org
│ │ └── wysaid
│ │ ├── algorithm
│ │ ├── Matrix2x2.java
│ │ ├── Matrix3x3.java
│ │ ├── Matrix4x4.java
│ │ ├── Vector2.java
│ │ ├── Vector3.java
│ │ └── Vector4.java
│ │ ├── camera
│ │ └── CameraInstance.java
│ │ ├── common
│ │ ├── Common.java
│ │ ├── FrameBufferObject.java
│ │ ├── ProgramObject.java
│ │ ├── SharedContext.java
│ │ └── TextureDrawer.java
│ │ ├── myUtils
│ │ ├── FileUtil.java
│ │ └── ImageUtil.java
│ │ ├── nativePort
│ │ ├── CGEFaceFunctions.java
│ │ ├── CGEFrameRecorder.java
│ │ ├── CGEFrameRenderer.java
│ │ ├── CGEImageHandler.java
│ │ ├── CGENativeLibrary.java
│ │ ├── FFmpegNativeLibrary.java
│ │ └── NativeLibraryLoader.java
│ │ ├── texUtils
│ │ ├── TextureRenderer.java
│ │ ├── TextureRendererBlur.java
│ │ ├── TextureRendererDrawOrigin.java
│ │ ├── TextureRendererEdge.java
│ │ ├── TextureRendererEmboss.java
│ │ ├── TextureRendererLerpBlur.java
│ │ ├── TextureRendererMask.java
│ │ └── TextureRendererWave.java
│ │ └── view
│ │ ├── CameraGLSurfaceView.java
│ │ ├── CameraRecordGLSurfaceView.java
│ │ ├── ImageGLSurfaceView.java
│ │ ├── SimplePlayerGLSurfaceView.java
│ │ └── VideoPlayerGLSurfaceView.java
│ ├── libs
│ ├── arm64-v8a
│ │ └── libCGE.so
│ ├── x86
│ │ └── libCGE.so
│ └── x86_64
│ │ └── libCGE.so
│ └── res
│ └── values
│ └── strings.xml
├── screenshots
├── 0.jpg
├── 1.jpg
├── 2.jpg
└── alipay.jpg
└── settings.gradle
/.gitignore:
--------------------------------------------------------------------------------
1 | .gradle
2 | /local.properties
3 | .idea
4 | *.iml
5 | .DS_Store
6 | /build
7 | /captures
8 | target/
9 | tmp
10 | objs/
11 | armeabi*/
12 | obj/
13 | jni/
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Android-GPUImage-plus
2 | GPU accelerated filters for Android based on OpenGL.
3 |
4 | ## 简介 ##
5 |
6 | 1. This repo is an "Android Studio Project", comprising two sub-modules: "cgeDemo" and "library". All Java code and "libCGE.so" (written in C++ & OpenGL with the NDK) are provided. Hundreds of built-in filters are available in the demo. 😋 If you'd like to add your own filter, please refer to the "Effect String Definition Rule" documented below.
7 |
8 |
9 | 2. The demo and library are updated as needed. Questions and PRs are welcome.
10 |
11 |
12 | 3. For study purposes only; no free tech support is offered for now.
13 |
14 | 4. iOS version: [https://github.com/wysaid/ios-gpuimage-plus](https://github.com/wysaid/ios-gpuimage-plus "http://wysaid.org")
15 |
16 | 5. Extra functions, such as 'realtime video recording with gpu filters', can be provided to donors. A precompiled apk with this function is available at: [https://github.com/wysaid/android-gpuimage-plus/tree/master/demoRelease](https://github.com/wysaid/android-gpuimage-plus/tree/master/demoRelease "http://wysaid.org")
17 |
18 | ## Screen Shots ##
19 |
20 | [image: screenshot]
21 | Single Image Filter
22 |
23 | [image: screenshot]
24 | Camera Filter (with photo & video capture)
25 |
26 | [image: screenshot]
27 | Live Video Filter
28 |
29 | ## Donate ##
30 |
31 | Donations are welcome. As thanks, donors' issues may receive extra attention.
32 |
33 | Alipay:
34 |
35 | [image: Alipay QR code]
36 |
37 |
38 | ## Documentation ##
39 |
40 | This lib is easy to use, and custom filters can be defined entirely through plain-text configuration. (No editor is provided at present.)
41 |
42 | The text parser is designed so that even people who know nothing about GLSL, or even programming, can quickly add new effects.
43 |
44 | EffectString parsing rules and common methods:
45 |
46 | 1. Each processing step starts with '@', followed by the method name; whitespace in between is optional.
47 | Example: both "@ method" and "@method" are valid. The method name may be followed by any number of space-separated arguments.
48 | Everything after the method name (up to the next "@" or the end of the string) is passed as arguments to that method's Parser.
49 |
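The splitting rule above can be sketched in a few lines. This is an illustrative sketch only, not library code; the class and method names (`EffectStringSteps`, `split`) are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper (not part of the library): splits an effect string into
// its processing steps. Per rule 1, each step starts with '@', "@ method" and
// "@method" are both accepted, and everything up to the next '@' (or the end
// of the string) belongs to the current step.
public class EffectStringSteps {
    public static List<String> split(String effectString) {
        List<String> steps = new ArrayList<>();
        for (String part : effectString.split("@")) {
            String step = part.trim();
            if (!step.isEmpty())
                steps.add(step);
        }
        return steps;
    }
}
```

For example, `split("@curve RGB(0,0) (255,255) @blend ol src.jpg 80")` yields two steps, one for each method.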
50 | 2. curve arguments: the @curve method has the form "@curve <arg1> <arg2> ... <argN>"
51 | <argN> takes one of two forms: "RGB (x1,y1) (xn, yn)", where xn and yn are numbers between 0 and 255,
52 | or "R (rx1,ry1) ... (rxn, ryn) G (gx1,gy1) ... (gxn,gyn) B (bx1,by1)...(bxn,byn)",
53 | where R, G, B denote the corresponding channels and the points that follow apply to that channel.
54 | Spaces between parentheses and arguments are optional. Inside the parentheses, x and y may be separated by any delimiter, such as a space or a comma.
55 |
56 | Examples:
57 | Curve A: RGB channel points (0, 0) (100, 200) (255, 255)
58 | Format: "@curve RGB(0,0) (100, 200) (255, 255)"
59 |
60 | Curve B: R channel (0, 0) (50, 25) (255, 255),
61 | G channel (0, 0) (100, 150) (255, 255),
62 | RGB channels (0, 0) (200, 150) (255, 255)
63 | Format: "@curve R(0,0) (50, 25) (255, 255) G(0, 0) (100,150) (255, 255) RGB(0, 0) (200, 150) (255, 255)". PS (WangYang): to simplify the work, I wrote a curve-format generation tool; see the tool directory.
64 |
65 | Note: when several curve adjustments run back to back, "@curve" only needs to be written once. Taking the example above, if no other operation sits between curves A and B, they can be merged into a single curve step, which saves overhead and speeds up execution.
66 | Example: "@curve RGB(0,0) (100, 200) (255, 255) R(0,0) (50, 25) (255, 255) G(0, 0) (100,150) (255, 255) RGB(0, 0) (200, 150) (255, 255)" (same result as applying curve A and then curve B)
67 |
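The curve-merging note above amounts to joining the argument lists of consecutive steps. The following is an illustrative sketch, not library code; `CurveMerge` and `merge` are hypothetical names.

```java
// Hypothetical helper (not part of the library): merges two consecutive
// "@curve" steps into a single step, which saves a render pass as described
// in the note above. Both inputs must be complete "@curve ..." steps.
public class CurveMerge {
    public static String merge(String first, String second) {
        if (!first.startsWith("@curve") || !second.startsWith("@curve"))
            throw new IllegalArgumentException("both steps must start with @curve");
        // Append the second step's arguments after the first step.
        return first + " " + second.substring("@curve".length()).trim();
    }
}
```

Merging curve A and curve B from the examples above reproduces the combined step shown on line 66.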
68 | 3. blend arguments: the @blend method has the form "@blend <function> <texture> <intensity>"
69 | <function> currently takes one of:
70 |
71 | Normal: mix
72 | Dissolve: dissolve[dsv]
73 |
74 | Darken: darken[dk]
75 | Multiply: multiply[mp]
76 | Color burn: colorburn[cb]
77 | Linear burn: linearburn[lb]
78 | Darker color: darkercolor[dc]
79 |
80 | Lighten: lighten[lt]
81 | Screen: screen[sr]
82 | Color dodge: colordodge[cd]
83 | Linear dodge: lineardodge[ld]
84 | Lighter color: lightercolor[lc]
85 |
86 | Overlay: overlay[ol]
87 | Soft light: softlight[sl]
88 | Hard light: hardlight[hl]
89 | Vivid light: vividlight[vvl]
90 | Linear light: linearlight[ll]
91 | Pin light: pinlight[pl]
92 | Hard mix: hardmix[hm]
93 |
94 | Difference: difference[dif]
95 | Exclusion: exclude[ec]
96 | Subtract: subtract[sub]
97 | Divide: divide[div]
98 |
99 | Hue: hue
100 | Saturation: saturation[sat]
101 | Color: color[cl]
102 | Luminosity: luminosity[lum]
103 |
104 | Add: add
105 | Add reverse: addrev
106 | Black & white: colorbw //this one replaces the src color with black & white based solely on the texture's alpha channel.
107 |
108 | Note: the value in [] is an abbreviation that can be used in place of the longer name.
109 |
110 | <texture> is the name of the resource file to use, including its extension!
111 |
112 | Note: enhanced usage (added 2016-1-5): <texture> may instead be a texture id, width, and height, wrapped in square brackets and separated by commas with no spaces (e.g. [10,1024,768] means texture id 10, 1024 pixels wide, 768 pixels high). Since square brackets cannot appear in file names, this form never conflicts with a normal file name. Note: this usage is not meant for editors; it is an enhanced feature for programmers!
113 | Warning: with the enhanced usage, a wrong texture id that collides with another texture can easily break the whole app!
114 |
115 | <intensity> is the blend strength (opacity), an integer in (0, 100].
116 |
117 | Example: blend with the resource image src.jpg at 80% strength
118 | Format: "@blend overlay src.jpg 80" or "@blend ol src.jpg 80"
119 |
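The blend format above can be generated by a small builder that also enforces the documented intensity range. This is a hypothetical convenience sketch, not part of the library; `BlendStep` and `build` are made-up names.

```java
// Hypothetical builder (not in the library): formats a "@blend" step and
// enforces the documented constraint that <intensity> is an integer in (0, 100].
public class BlendStep {
    public static String build(String function, String texture, int intensity) {
        if (intensity <= 0 || intensity > 100)
            throw new IllegalArgumentException("intensity must be in (0, 100], got " + intensity);
        return "@blend " + function + " " + texture + " " + intensity;
    }
}
```

For example, `build("ol", "src.jpg", 80)` produces the overlay example shown above.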
120 | 4. krblend arguments: the @krblend method has the same format as @blend; see the @blend method.
121 | All krblend parameters and usage are identical to blend. The difference is that when processing the texture,
122 | krblend keeps the texture's aspect ratio and scales it uniformly so that it covers the whole image.
123 |
124 | 5. pixblend arguments: the @pixblend method has the form "@pixblend <function> <color> <intensity>"
125 | <function> is the same as in the blend method; see blend.
126 | <intensity> has the same meaning as in the blend method; see blend.
127 | <color> consists of four floats denoting the color's r, g, b, a, each in [0, 1] or [0, 255].
128 | Example: blend with pure red at 90% strength
129 | Format: "@pixblend overlay 1 0 0 0 90" --> note that the color components may be written as decimals. When the alpha value is greater than 1, all color components are interpreted in [0, 255]; otherwise they are interpreted in [0, 1].
130 |
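The alpha-based range rule above can be written down directly. This is a sketch of the documented rule only, not library code; `PixblendColor` and `normalize` are hypothetical names.

```java
// Sketch of the documented color-range rule for @pixblend (not library code):
// when alpha > 1, all four components are interpreted in [0, 255] and scaled
// down; otherwise they are taken to be already in [0, 1].
public class PixblendColor {
    public static float[] normalize(float r, float g, float b, float a) {
        float div = (a > 1.0f) ? 255.0f : 1.0f;
        return new float[]{ r / div, g / div, b / div, a / div };
    }
}
```

So "1 0 0 0" stays in [0, 1], while "255 0 0 255" is scaled down to the same pure red.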
131 | 6. selfblend arguments: the @selfblend method has the form "@selfblend <function> <intensity>"
132 | Note: the parameters match the blend method, except that there is no <texture> parameter; this method blends the image with its own colors.
133 |
134 | 7. adjust arguments: the @adjust method has the form "@adjust <function> <arg1>...<argN>"
135 | <function> currently takes one of:
136 | brightness: one argument, intensity, in [-1, 1]
137 |
138 | contrast: one argument, intensity, with intensity > 0; intensity = 0 gives a gray image, intensity = 1 leaves the image unchanged, intensity > 1 increases contrast.
139 |
140 | saturation: one argument, intensity, with intensity > 0; intensity = 0 gives a black-and-white image, intensity = 1 leaves the image unchanged, intensity > 1 increases saturation.
141 |
142 | monochrome: six arguments, each in [-2, 3], consistent with Photoshop. Argument order: red, green, blue, cyan, magenta, yellow
143 |
144 | sharpen: one argument, intensity, in [0, 10]; intensity 0 has no effect
145 | blur: one argument, intensity, in [0, 1]; intensity 0 has no effect
146 |
147 | whitebalance: two arguments, temperature (range [-1, 1], 0 has no effect) and tint (range [0, 5], 1 has no effect)
148 |
149 | shadowhighlight[shl] (shadow & highlight): two arguments, shadow (range [-200, 100], 0 has no effect) and highlight (range [-100, 200], 0 has no effect)
150 |
151 | hsv: six arguments, red, green, blue, magenta, yellow, cyan, each in [-1, 1]
152 | hsl: three arguments, hue, saturation, luminance, each in [-1, 1]
153 |
154 | level: three arguments, dark, light, gamma, each in [0, 1], with dark < light
155 |
156 | exposure: one argument, intensity, in [-10, 10]
157 |
158 | colorbalance: three arguments, redShift [-1, 1], greenShift [-1, 1], blueShift [-1, 1]. (added 2015-3-30)
159 |
160 | Note: the value in [] is an abbreviation that can be used in place of the longer name.
161 | <arg*> are the arguments the method requires; see the relevant class for exact ranges. The number of <arg*> depends on the specific <function>.
162 |
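Composing an "@adjust" step from the table above is just string formatting. The following is a hypothetical convenience sketch, not part of the library; `AdjustStep` and `build` are made-up names.

```java
// Hypothetical convenience (not part of the library): formats an "@adjust"
// step from a function name and its float arguments, e.g. brightness takes a
// single argument in [-1, 1].
public class AdjustStep {
    public static String build(String function, float... args) {
        StringBuilder sb = new StringBuilder("@adjust ").append(function);
        for (float a : args)
            sb.append(' ').append(a);
        return sb.toString();
    }
}
```

For example, `build("brightness", 0.5f)` yields `"@adjust brightness 0.5"`.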
163 | 8. cvlomo arguments: the @cvlomo method embeds the curve sub-method. Format: "@cvlomo <vignetteStart> <vignetteEnd> <colorScaleLow> <colorScaleRange> <saturation> <curve>"
164 | <vignetteStart> and <vignetteEnd> are decimals greater than 0, usually less than 1.
165 | <colorScaleLow> and <colorScaleRange> are decimals greater than 0, usually less than 1, used to adjust the image.
166 | <saturation> is a decimal between 0 and 1 that sets the image saturation.
167 | <curve> is a complete curve method, written without the @curve tag.
168 | Example: "@cvlomo 0.2 0.8 0.1 0.2 1 RGB(0, 0) (255, 255)"
169 |
170 | 9. colorscale arguments: the @colorscale method has the form "@colorscale <low> <range> <saturation>"
171 | Note: colorscale requires CPU computation and noticeably affects speed.
172 |
173 | 10. vignette arguments: the @vignette method has the form "@vignette <low> <range> <centerX> <centerY>"
174 | Note: low and range are required; centerX and centerY are optional and default to 0.5 when omitted. centerX and centerY only take effect when both are present.
175 | Examples: "@vignette 0.1 0.9", "@vignette 0.1 0.9 0.5 0.5"
176 |
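The optional-center rule above can be captured in a tiny argument parser. This is a sketch of the documented defaults only, not library code; `VignetteStep` and `parseArgs` are hypothetical names.

```java
// Sketch of the documented @vignette defaults (not library code): low and
// range are required; centerX/centerY default to 0.5 and only take effect
// when both are supplied.
public class VignetteStep {
    public static float[] parseArgs(String[] args) {
        float low = Float.parseFloat(args[0]);
        float range = Float.parseFloat(args[1]);
        float cx = 0.5f, cy = 0.5f;
        if (args.length >= 4) { // both centers must be present to take effect
            cx = Float.parseFloat(args[2]);
            cy = Float.parseFloat(args[3]);
        }
        return new float[]{ low, range, cx, cy };
    }
}
```

With only "0.1 0.9" supplied, the center falls back to (0.5, 0.5), matching the first example above.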
177 | 11. colormul arguments: the @colormul method has the form "@colormul <function> <arg1> ...<argN>"
178 | <function> currently takes "flt", "vec", or "mat".
179 | With flt, a single argument <arg> follows, and every pixel is multiplied by it.
180 | With vec, three arguments <arg1> <arg2> <arg3> follow, and each pixel's r, g, b components are multiplied by them component-wise.
181 | With mat, nine arguments <arg1>...<argN> follow, and each pixel's rgb vector is multiplied by the resulting 3x3 matrix.
182 |
183 | 12. special arguments: the @special method has the form "@special <N>"
184 | where <N> is the effect's number.
185 | This class handles effects that do not fit any generic pattern; each one is implemented as its own processor.
186 |
187 | 13. lomo arguments: format "@lomo <vignetteStart> <vignetteEnd> <colorScaleLow> <colorScaleRange> <saturation> <isLinear>"
188 | <vignetteStart> and <vignetteEnd> are decimals greater than 0, usually less than 1.
189 | <colorScaleLow> and <colorScaleRange> are decimals greater than 0, usually less than 1, used to adjust the image.
190 | <saturation> is a decimal between 0 and 1 that sets the image saturation.
191 | <isLinear> is 0 or 1 and sets whether the vignette grows linearly; it defaults to 0 when omitted.
192 | Example: "@lomo 0.2 0.8 0.1 0.2 1 0"
193 |
194 | ====== Effects numbered 13 and earlier depend on effect library version 0.2.1.x =========
195 |
196 | 14. vigblend arguments: the @vigblend method has the form "@vigblend <function> <color> <intensity> <low> <range> <centerX> <centerY> [kind]"
197 | The [kind] parameter is optional and defaults to 0:
198 | 0: linear growth; the vignette itself has no alpha channel (alpha is 1)
199 | 1: linear growth; the vignette fades through its own alpha channel
200 | 2: quadratic growth; the vignette itself has no alpha channel (alpha is 1)
201 | 3: quadratic growth; the vignette fades through its own alpha channel
202 | Examples: "@vigblend ol 0 0 0 1 50 0.02 0.45 0.5 0.5 0"
203 | "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 0",
204 | "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 1",
205 | "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 2",
206 | "@vigblend mix 10 10 30 255 100 0 1.5 0.5 0.5 3",
207 | Note: for the remaining parameters, see the pixblend and vignette methods.
208 |
209 |
210 | ======↑ Effects numbered above this note depend on effect library version 0.3.2.1
211 |
212 | 15. selcolor (selective color) arguments: the @selcolor method has the form "@selcolor <color1> <colorargs1> ...<colorN> <colorargsN>"
213 | where <colorN> is the selected color; valid values are:
214 | red, green, blue, cyan, magenta, yellow, white, gray, black.
215 |
216 | <colorargsN> is the adjustment applied to the selected color, in the form
217 | (cyan, magenta, yellow, key), each component in [-1, 1].
218 | Each <colorargsN> is four parameters wrapped in parentheses; write 0 for any component left unadjusted.
219 |
220 | ======↑ Added 2014-11-12
221 |
222 | 16. style arguments: the @style method has the form "@style <function> <arg1> ... <argN>"
223 | <function> currently takes one of:
224 |
225 | crosshatch: two arguments, spacing in [0, 0.1] and lineWidth in (0, 0.01]
226 |
227 | edge (Sobel edge detection): two arguments, mix in [0, 1] and stride in (0, 5]
228 |
229 | emboss: three arguments, mix in [0, 1], stride in [1, 5], and angle in [0, 2π]
230 |
231 | halftone: one argument, dotSize, >= 1
232 |
233 | haze: three arguments, distance in [-0.5, 0.5], slope in [-0.5, 0.5], and color (three components r, g, b, each in [0, 1])
234 |
235 | polkadot: one argument, dotScaling, in (0, 1]
236 |
237 | sketch: one argument, intensity, in [0, 1]
238 |
239 | max (max value): no arguments yet
240 |
241 | min (min value): no arguments yet
242 |
243 | mid (median value): no arguments yet
244 |
245 | ======↑ Added 2015-2-5
246 |
247 | 17. beautify arguments: the @beautify method has the form "@beautify <function> <arg1>...<argN>"
248 |
249 | <function> currently takes one of:
250 |
251 | bilateral (bilateral filter): three arguments, blurScale in [-100, 100], distanceFactor (color tolerance) in [1, 20], and repeat times >= 1
252 | The repeat count is optional and defaults to 1 when omitted.
253 |
254 | ======↑ Added 2015-3-19
255 |
256 | 18. New utility method unpack, used by putting #unpack at the very start of the configuration.
257 | Effect: removes the MultipleEffects wrapper and adds every parsed effect directly to the handler.
258 |
259 | ======↑ Added 2015-8-7
260 |
261 | 19. blur arguments
262 | The @blur method provides the various blur algorithms. Format: "@blur <function> <arg1> ... <argN>"
263 |
264 | <function> currently takes one of:
265 |
266 | lerp (lerp blur): two arguments, blur level in [0, 1] and blur base in [0.6, 2.0]
267 |
268 |
269 | ======↑ Added 2015-8-8
270 |
271 | 20. dynamic arguments
272 | The @dynamic method provides the various dynamic filter effects. Format: "@dynamic <function> <arg1> ... <argN>"
273 | Note: the effect editor does not support editing the dynamic method.
274 |
275 | wave (dynamic ripple, added 2015-11-18): takes 1, 3, or 4 arguments
276 | (1) with one argument, the ripple is animated, and the argument is autoMotionSpeed, the automatic update speed (see the wave filter definition).
277 | (2) with three arguments, the ripple is static; the first argument is motion, the second angle, the third strength (see the wave filter definition).
278 | (3) with four arguments, the ripple is animated; the first three have the same meaning as in (2), and the fourth is autoMotionSpeed.
279 |
280 | In other words, with a single argument, motion, angle, and strength all use their default values.
281 |
282 |
283 | ======↑ Added 2015-11-18
--------------------------------------------------------------------------------
/build.gradle:
--------------------------------------------------------------------------------
1 | // Top-level build file where you can add configuration options common to all sub-projects/modules.
2 |
3 | buildscript {
4 | repositories {
5 | jcenter()
6 | }
7 | dependencies {
8 | classpath 'com.android.tools.build:gradle:2.1.0-alpha1'
9 |
10 | // NOTE: Do not place your application dependencies here; they belong
11 | // in the individual module build.gradle files
12 | }
13 | }
14 |
15 | allprojects {
16 | repositories {
17 | jcenter()
18 | }
19 | }
20 |
--------------------------------------------------------------------------------
/cgedemo/.gitignore:
--------------------------------------------------------------------------------
1 | /build
2 |
--------------------------------------------------------------------------------
/cgedemo/build.gradle:
--------------------------------------------------------------------------------
1 | apply plugin: 'com.android.application'
2 |
3 | android {
4 | compileSdkVersion 23
5 | buildToolsVersion "22.0.1"
6 |
7 | defaultConfig {
8 | applicationId "org.wysaid.cgedemo"
9 | minSdkVersion 14
10 | targetSdkVersion 23
11 | versionCode 1
12 | versionName "1.0"
13 | }
14 | buildTypes {
15 | release {
16 | minifyEnabled false
17 | proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
18 | }
19 | }
20 |
21 | }
22 |
23 | dependencies {
24 | compile fileTree(dir: 'libs', include: ['*.jar'])
25 | compile 'com.android.support:appcompat-v7:22.2.0'
26 | compile project(':library')
27 | }
28 |
--------------------------------------------------------------------------------
/cgedemo/proguard-rules.pro:
--------------------------------------------------------------------------------
1 | # Add project specific ProGuard rules here.
2 | # By default, the flags in this file are appended to flags specified
3 | # in /Users/wysaid/android_develop/android-sdk-macosx/tools/proguard/proguard-android.txt
4 | # You can edit the include path and order by changing the proguardFiles
5 | # directive in build.gradle.
6 | #
7 | # For more details, see
8 | # http://developer.android.com/guide/developing/tools/proguard.html
9 |
10 | # Add any project specific keep options here:
11 |
12 | # If your project uses WebView with JS, uncomment the following
13 | # and specify the fully qualified class name to the JavaScript interface
14 | # class:
15 | #-keepclassmembers class fqcn.of.javascript.interface.for.webview {
16 | # public *;
17 | #}
18 |
--------------------------------------------------------------------------------
/cgedemo/project.properties:
--------------------------------------------------------------------------------
1 | target=android-16
2 | proguard.config=${sdk.dir}/tools/proguard/proguard-android.txt:proguard-project.txt
3 | android.library.reference.1=../library
--------------------------------------------------------------------------------
/cgedemo/src/main/AndroidManifest.xml:
--------------------------------------------------------------------------------
(XML content not preserved in this dump)
--------------------------------------------------------------------------------
/cgedemo/src/main/assets/hehe.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/assets/hehe.jpg
--------------------------------------------------------------------------------
/cgedemo/src/main/java/org/wysaid/cgeDemo/FaceDemoActivity.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.cgeDemo;
2 |
3 | import android.content.Intent;
4 | import android.graphics.Bitmap;
5 | import android.graphics.BitmapFactory;
6 | import android.graphics.drawable.BitmapDrawable;
7 | import android.os.Bundle;
8 | import android.support.v7.app.AppCompatActivity;
9 | import android.util.Log;
10 | import android.view.View;
11 | import android.widget.ImageView;
12 | import android.widget.SeekBar;
13 | import android.widget.Toast;
14 |
15 | import org.wysaid.common.Common;
16 | import org.wysaid.demoViews.FaceDemoView;
17 | import org.wysaid.myUtils.ImageUtil;
18 | import org.wysaid.nativePort.CGEFaceFunctions;
19 |
20 | public class FaceDemoActivity extends AppCompatActivity {
21 |
22 | public static final int CHOOSE_PHOTO = 1;
23 |
24 | private FaceDemoView mFaceDemoView;
25 | private ImageView mResultView;
26 | private CGEFaceFunctions.FaceFeature mFirstFaceFeature;
27 | private Bitmap mFirstFaceImage;
28 | private CGEFaceFunctions.FaceFeature mSecondFaceFeature;
29 | private Bitmap mSecondFaceImage;
30 |
31 | @Override
32 | protected void onCreate(Bundle savedInstanceState) {
33 | super.onCreate(savedInstanceState);
34 | setContentView(R.layout.activity_face_demo);
35 |
36 | mFaceDemoView = (FaceDemoView)findViewById(R.id.faceDemoView);
37 | mResultView = (ImageView)findViewById(R.id.resultView);
38 | mResultView.setVisibility(View.GONE);
39 |
40 | Bitmap eye = BitmapFactory.decodeResource(getResources(), R.drawable.adjust_face_eye);
41 | Bitmap mouth = BitmapFactory.decodeResource(getResources(), R.drawable.adjust_face_mouth);
42 | Bitmap chin = BitmapFactory.decodeResource(getResources(), R.drawable.adjust_face_chin);
43 |
44 | mFaceDemoView.loadResources(eye, mouth, chin);
45 |
46 | SeekBar seekBar = (SeekBar)findViewById(R.id.seekBar);
47 | seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
48 | @Override
49 | public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
50 | mFaceDemoView.setFaceImageZoom(progress / 20.0f + 0.5f);
51 | }
52 |
53 | @Override
54 | public void onStartTrackingTouch(SeekBar seekBar) {
55 |
56 | }
57 |
58 | @Override
59 | public void onStopTrackingTouch(SeekBar seekBar) {
60 |
61 | }
62 | });
63 | }
64 |
65 | public void setAsFirstPhoto(View view) {
66 | Log.i(Common.LOG_TAG, "setAsFirstPhoto...");
67 | mFirstFaceFeature = mFaceDemoView.getFeature();
68 | mFirstFaceImage = mFaceDemoView.getFaceimage();
69 | Toast.makeText(this, "Current image set as the first photo", Toast.LENGTH_SHORT).show();
70 | }
71 |
72 | public void setAsSecondPhoto(View view) {
73 | Log.i(Common.LOG_TAG, "setAsSecondPhoto...");
74 |
75 | mSecondFaceFeature = mFaceDemoView.getFeature();
76 | mSecondFaceImage = mFaceDemoView.getFaceimage();
77 | Toast.makeText(this, "Current image set as the second photo", Toast.LENGTH_SHORT).show();
78 | }
79 |
80 | public void showResult(View view) {
81 | Log.i(Common.LOG_TAG, "showResult...");
82 | Bitmap bmp = CGEFaceFunctions.blendFaceWidthFeatures(mFirstFaceImage, mFirstFaceFeature, mSecondFaceImage, mSecondFaceFeature, CGEFaceFunctions.AutoLumAdjustMode.LumAdjust_OnlyBrightness, null);
83 | mResultView.setImageBitmap(bmp);
84 | mResultView.setVisibility(View.VISIBLE);
85 | mFaceDemoView.setVisibility(View.GONE);
86 | }
87 |
88 | public void choosePhoto(View view) {
89 | mResultView.setVisibility(View.GONE);
90 | mFaceDemoView.setVisibility(View.VISIBLE);
91 | Intent photoPickerIntent = new Intent(Intent.ACTION_PICK);
92 | photoPickerIntent.setType("image/*");
93 | startActivityForResult(photoPickerIntent, CHOOSE_PHOTO);
94 | }
95 |
96 | public void onActivityResult(final int requestCode, final int resultCode, final Intent data) {
97 |
98 | if(resultCode != RESULT_OK)
99 | return;
100 |
101 | switch (requestCode)
102 | {
103 | case CHOOSE_PHOTO:
104 | {
105 | mResultView.setImageURI(data.getData());
106 | Bitmap result = ((BitmapDrawable)mResultView.getDrawable()).getBitmap();
107 | if(result.getWidth() > 2000 || result.getHeight() > 2000) {
108 | float scaling = (float)Math.min(2000.0 / result.getWidth(), 2000.0 / result.getHeight());
109 | result = Bitmap.createScaledBitmap(result, (int)(result.getWidth() * scaling), (int)(result.getHeight() * scaling), false);
110 | }
111 | mFaceDemoView.setFaceImage(result);
112 | }
113 | break;
114 | default:break;
115 | }
116 | }
117 |
118 | public void resetControllers(View view) {
119 | mFaceDemoView.resetControllers();
120 | mFaceDemoView.postInvalidate();
121 | }
122 |
123 | public void saveResult(View view) {
124 | Bitmap result = ((BitmapDrawable)mResultView.getDrawable()).getBitmap();
125 | ImageUtil.saveBitmap(result);
126 | }
127 |
128 | public void useResult(View view) {
129 | Bitmap result = ((BitmapDrawable)mResultView.getDrawable()).getBitmap();
130 | mFaceDemoView.setFaceImage(result);
131 | }
132 | }
133 |
--------------------------------------------------------------------------------
/cgedemo/src/main/java/org/wysaid/cgeDemo/SimplePlayerDemoActivity.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.cgeDemo;
2 |
3 | import android.content.Context;
4 | import android.content.Intent;
5 | import android.graphics.Bitmap;
6 | import android.graphics.BitmapFactory;
7 | import android.media.MediaPlayer;
8 | import android.net.Uri;
9 | import android.support.v7.app.AppCompatActivity;
10 | import android.os.Bundle;
11 | import android.util.Log;
12 | import android.view.Menu;
13 | import android.view.MenuItem;
14 | import android.view.View;
15 | import android.widget.Button;
16 | import android.widget.LinearLayout;
17 | import android.widget.Toast;
18 |
19 | import org.wysaid.common.Common;
20 | import org.wysaid.myUtils.FileUtil;
21 | import org.wysaid.myUtils.ImageUtil;
22 | import org.wysaid.texUtils.TextureRenderer;
23 | import org.wysaid.texUtils.TextureRendererDrawOrigin;
24 | import org.wysaid.texUtils.TextureRendererEdge;
25 | import org.wysaid.texUtils.TextureRendererEmboss;
26 | import org.wysaid.texUtils.TextureRendererLerpBlur;
27 | import org.wysaid.texUtils.TextureRendererMask;
28 | import org.wysaid.texUtils.TextureRendererWave;
29 | import org.wysaid.view.SimplePlayerGLSurfaceView;
30 |
31 | public class SimplePlayerDemoActivity extends AppCompatActivity {
32 |
33 | SimplePlayerGLSurfaceView mPlayerView;
34 | Button mShapeBtn;
35 | Button mTakeshotBtn;
36 | Button mGalleryBtn;
37 |
38 | public static final int REQUEST_CODE_PICK_VIDEO = 1;
39 |
40 | private SimplePlayerGLSurfaceView.PlayCompletionCallback playCompletionCallback = new SimplePlayerGLSurfaceView.PlayCompletionCallback() {
41 | @Override
42 | public void playComplete(MediaPlayer player) {
43 | Log.i(Common.LOG_TAG, "Video playback finished, restarting...");
44 | player.start();
45 | }
46 |
47 | @Override
48 | public boolean playFailed(MediaPlayer player, final int what, final int extra)
49 | {
50 | Toast.makeText(SimplePlayerDemoActivity.this, String.format("Error occurred! Playback stopped. Err code: %d, %d", what, extra), Toast.LENGTH_LONG).show();
51 | return true;
52 | }
53 | };
54 |
55 | class MyVideoButton extends Button implements View.OnClickListener {
56 |
57 | Uri videoUri;
58 | SimplePlayerGLSurfaceView videoView;
59 |
60 | public MyVideoButton(Context context) {
61 | super(context);
62 | }
63 |
64 | @Override
65 | public void onClick(View v) {
66 |
67 | Toast.makeText(SimplePlayerDemoActivity.this, "Preparing to play video " + videoUri.getHost() + videoUri.getPath() + ". For online videos this may take a while.", Toast.LENGTH_SHORT).show();
68 |
69 | videoView.setVideoUri(videoUri, new SimplePlayerGLSurfaceView.PlayPreparedCallback() {
70 | @Override
71 | public void playPrepared(MediaPlayer player) {
72 | mPlayerView.post(new Runnable() {
73 | @Override
74 | public void run() {
75 | String msg = "Now playing ";
76 | if(videoUri.getHost() != null)
77 | msg += videoUri.getHost();
78 | if(videoUri.getPath() != null)
79 | msg += videoUri.getPath();
80 |
81 | Toast.makeText(SimplePlayerDemoActivity.this, msg, Toast.LENGTH_SHORT).show();
82 | }
83 | });
84 |
85 | player.start();
86 | }
87 | }, playCompletionCallback);
88 | }
89 | }
90 |
91 | @Override
92 | protected void onCreate(Bundle savedInstanceState) {
93 | super.onCreate(savedInstanceState);
94 | setContentView(R.layout.activity_simple_player_demo);
95 | mPlayerView = (SimplePlayerGLSurfaceView)findViewById(R.id.videoGLSurfaceView);
96 | // mPlayerView.setZOrderOnTop(false);
97 |
98 | mShapeBtn = (Button)findViewById(R.id.switchShapeBtn);
99 |
100 | mShapeBtn.setOnClickListener(new View.OnClickListener() {
101 |
102 | private boolean useMask = false;
103 | Bitmap bmp;
104 |
105 | @Override
106 | public void onClick(View v) {
107 | useMask = !useMask;
108 | if(useMask) {
109 | if(bmp == null)
110 | bmp = BitmapFactory.decodeResource(getResources(), R.drawable.mask1);
111 |
112 | if(bmp != null) {
113 | mPlayerView.setMaskBitmap(bmp, false, new SimplePlayerGLSurfaceView.SetMaskBitmapCallback() {
114 | @Override
115 | public void setMaskOK(TextureRendererMask renderer) {
116 | if(mPlayerView.isUsingMask()) {
117 | renderer.setMaskFlipscale(1.0f, -1.0f);
118 | }
119 | }
120 |
121 | @Override
122 | public void unsetMaskOK(TextureRenderer renderer) {
123 |
124 | }
125 | });
126 | }
127 | } else {
128 | mPlayerView.setMaskBitmap(null, false);
129 | }
130 |
131 | }
132 | });
133 |
134 | mTakeshotBtn = (Button)findViewById(R.id.takeShotBtn);
135 |
136 | mTakeshotBtn.setOnClickListener(new View.OnClickListener() {
137 | @Override
138 | public void onClick(View v) {
139 | mPlayerView.takeShot(new SimplePlayerGLSurfaceView.TakeShotCallback() {
140 | @Override
141 | public void takeShotOK(Bitmap bmp) {
142 | if(bmp != null)
143 | ImageUtil.saveBitmap(bmp);
144 | else
145 | Log.e(Common.LOG_TAG, "take shot failed!");
146 | }
147 | });
148 | }
149 | });
150 |
151 | LinearLayout menuLayout = (LinearLayout)findViewById(R.id.menuLinearLayout);
152 |
153 | String[] filePaths = {
154 | "android.resource://" + getPackageName() + "/" + R.raw.test,
155 | "http://wge.wysaid.org/res/video/1.mp4", //online video
156 | "http://wysaid.org/p/test.mp4", //online video
157 | };
158 |
159 | {
160 | Button btn = new Button(this);
161 | menuLayout.addView(btn);
162 | btn.setAllCaps(false);
163 | btn.setText("Last Recorded Video");
164 | btn.setOnClickListener(new View.OnClickListener() {
165 | @Override
166 | public void onClick(View v) {
167 |
168 | String lastVideoFileName = FileUtil.getTextContent(CameraDemoActivity.lastVideoPathFileName);
169 | if(lastVideoFileName == null) {
170 | Toast.makeText(SimplePlayerDemoActivity.this, "No video is recorded, please record one in the 2nd case.", Toast.LENGTH_LONG).show();
171 | return;
172 | }
173 |
174 | Uri lastVideoUri = Uri.parse(lastVideoFileName);
175 | mPlayerView.setVideoUri(lastVideoUri, new SimplePlayerGLSurfaceView.PlayPreparedCallback() {
176 | @Override
177 | public void playPrepared(MediaPlayer player) {
178 | Log.i(Common.LOG_TAG, "The video is prepared to play");
179 | player.start();
180 | }
181 | }, playCompletionCallback);
182 | }
183 | });
184 | }
185 |
186 | for(int i = 0; i != filePaths.length; ++i) {
187 | MyVideoButton btn = new MyVideoButton(this);
188 | btn.setText("Video " + i);
189 | btn.videoUri = Uri.parse(filePaths[i]);
190 | btn.videoView = mPlayerView;
191 | btn.setOnClickListener(btn);
192 | menuLayout.addView(btn);
193 |
194 | if(i == 0) {
195 | btn.onClick(btn);
196 | }
197 | }
198 |
199 | Button filterButton = new Button(this);
200 | menuLayout.addView(filterButton);
201 | filterButton.setText("Switch Filter");
202 | filterButton.setOnClickListener(new View.OnClickListener() {
203 | private int filterIndex;
204 |
205 | @Override
206 | public void onClick(View v) {
207 | mPlayerView.queueEvent(new Runnable() {
208 | @Override
209 | public void run() {
210 | TextureRenderer drawer = null;
211 | ++filterIndex;
212 | filterIndex %= 5;
213 | switch (filterIndex) {
214 | case 0: //LerpBlur
215 | {
216 | TextureRendererLerpBlur lerpBlurDrawer = TextureRendererLerpBlur.create(true);
217 | if(lerpBlurDrawer != null) {
218 | lerpBlurDrawer.setIntensity(16);
219 | drawer = lerpBlurDrawer;
220 | }
221 | }
222 | break;
223 |
224 | case 1:
225 | {
226 | TextureRendererEdge edgeDrawer = TextureRendererEdge.create(true);
227 | if(edgeDrawer != null) {
228 | drawer = edgeDrawer;
229 | }
230 | }
231 | break;
232 |
233 | case 2:
234 | {
235 | TextureRendererEmboss embossDrawer = TextureRendererEmboss.create(true);
236 | if(embossDrawer != null) {
237 | drawer = embossDrawer;
238 | }
239 | }
240 | break;
241 |
242 | case 3:
243 | {
244 | TextureRendererWave waveDrawer = TextureRendererWave.create(true);
245 | if(waveDrawer != null) {
246 | waveDrawer.setAutoMotion(0.4f);
247 | drawer = waveDrawer;
248 | }
249 | }
250 | break;
251 |
252 | case 4:
253 | {
254 | TextureRendererDrawOrigin originDrawer = TextureRendererDrawOrigin.create(true);
255 | if(originDrawer != null) {
256 | drawer = originDrawer;
257 | }
258 | }
259 | break;
260 |
261 | default:break;
262 | }
263 |
264 | mPlayerView.setTextureRenderer(drawer);
265 | }
266 | });
267 | }
268 | });
269 |
270 | mGalleryBtn = (Button)findViewById(R.id.galleryBtn);
271 | mGalleryBtn.setOnClickListener(galleryBtnClickListener);
272 |
273 | Button fitViewBtn = (Button)findViewById(R.id.fitViewBtn);
274 | fitViewBtn.setOnClickListener(new View.OnClickListener() {
275 | boolean shouldFit = false;
276 | @Override
277 | public void onClick(View v) {
278 | shouldFit = !shouldFit;
279 | mPlayerView.setFitFullView(shouldFit);
280 | }
281 | });
282 |
283 | mPlayerView.setPlayerInitializeCallback(new SimplePlayerGLSurfaceView.PlayerInitializeCallback() {
284 | @Override
285 | public void initPlayer(final MediaPlayer player) {
286 | //check buffering progress for online videos
287 | player.setOnBufferingUpdateListener(new MediaPlayer.OnBufferingUpdateListener() {
288 | @Override
289 | public void onBufferingUpdate(MediaPlayer mp, int percent) {
290 | Log.i(Common.LOG_TAG, "Buffer update: " + percent);
291 | if(percent == 100) {
292 |                             Log.i(Common.LOG_TAG, "Buffering complete!");
293 | player.setOnBufferingUpdateListener(null);
294 | }
295 | }
296 | });
297 | }
298 | });
299 | }
300 |
301 |     View.OnClickListener galleryBtnClickListener = new View.OnClickListener() {
302 | 
303 |         @Override
304 |         public void onClick(final View view) {
305 | Intent videoPickerIntent = new Intent(Intent.ACTION_GET_CONTENT);
306 | videoPickerIntent.setType("video/*");
307 | startActivityForResult(videoPickerIntent, REQUEST_CODE_PICK_VIDEO);
308 | }
309 | };
310 |
311 | public void onActivityResult(final int requestCode, final int resultCode, final Intent data) {
312 | switch (requestCode)
313 | {
314 |             case REQUEST_CODE_PICK_VIDEO:
315 |                 if(resultCode == RESULT_OK)
316 |                 {
317 |                     mPlayerView.setVideoUri(data.getData(), new SimplePlayerGLSurfaceView.PlayPreparedCallback() {
318 |                         @Override
319 |                         public void playPrepared(MediaPlayer player) {
320 |                             Log.i(Common.LOG_TAG, "The video is prepared to play");
321 |                             player.start();
322 |                         }
323 |                     }, playCompletionCallback);
324 |                 }
325 |                 break; // do not fall through to default
326 |             default: break;
327 | }
328 | }
329 |
330 | @Override
331 | public void onPause() {
332 | super.onPause();
333 | Log.i(SimplePlayerGLSurfaceView.LOG_TAG, "activity onPause...");
334 | mPlayerView.release();
335 | mPlayerView.onPause();
336 | }
337 |
338 | @Override
339 | public void onResume() {
340 | super.onResume();
341 |
342 | mPlayerView.onResume();
343 | }
344 |
345 | @Override
346 | public boolean onCreateOptionsMenu(Menu menu) {
347 | // Inflate the menu; this adds items to the action bar if it is present.
348 | getMenuInflater().inflate(R.menu.menu_video_player, menu);
349 | return true;
350 | }
351 |
352 | @Override
353 | public boolean onOptionsItemSelected(MenuItem item) {
354 | // Handle action bar item clicks here. The action bar will
355 | // automatically handle clicks on the Home/Up button, so long
356 | // as you specify a parent activity in AndroidManifest.xml.
357 | int id = item.getItemId();
358 |
359 | //noinspection SimplifiableIfStatement
360 | if (id == R.id.action_settings) {
361 | return true;
362 | }
363 |
364 | return super.onOptionsItemSelected(item);
365 | }
366 | }
367 |
--------------------------------------------------------------------------------
/cgedemo/src/main/java/org/wysaid/cgeDemo/VideoPlayerDemoActivity.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.cgeDemo;
2 |
3 | import android.content.Context;
4 | import android.content.Intent;
5 | import android.graphics.Bitmap;
6 | import android.graphics.BitmapFactory;
7 | import android.media.MediaPlayer;
8 | import android.net.Uri;
9 | import android.support.v7.app.AppCompatActivity;
10 | import android.os.Bundle;
11 | import android.util.Log;
12 | import android.view.Menu;
13 | import android.view.MenuItem;
14 | import android.view.View;
15 | import android.widget.Button;
16 | import android.widget.LinearLayout;
17 | import android.widget.SeekBar;
18 | import android.widget.Toast;
19 |
20 | import org.wysaid.common.Common;
21 | import org.wysaid.myUtils.FileUtil;
22 | import org.wysaid.myUtils.ImageUtil;
23 | import org.wysaid.nativePort.CGEFrameRenderer;
24 | import org.wysaid.view.VideoPlayerGLSurfaceView;
25 |
26 | public class VideoPlayerDemoActivity extends AppCompatActivity {
27 |
28 | VideoPlayerGLSurfaceView mPlayerView;
29 | Button mShapeBtn;
30 | Button mTakeshotBtn;
31 | Button mGalleryBtn;
32 |
33 | String mCurrentConfig;
34 |
35 | public static final int REQUEST_CODE_PICK_VIDEO = 1;
36 |
37 | private VideoPlayerGLSurfaceView.PlayCompletionCallback playCompletionCallback = new VideoPlayerGLSurfaceView.PlayCompletionCallback() {
38 | @Override
39 | public void playComplete(MediaPlayer player) {
40 |             Log.i(Common.LOG_TAG, "Video playback completed, restarting...");
41 | player.start();
42 | }
43 |
44 | @Override
45 | public boolean playFailed(MediaPlayer player, final int what, final int extra)
46 | {
47 |             Toast.makeText(VideoPlayerDemoActivity.this, String.format("Error occurred! Stopped playing. Error code: %d, %d", what, extra), Toast.LENGTH_LONG).show();
48 | return true;
49 | }
50 | };
51 |
52 | class MyVideoButton extends Button implements View.OnClickListener {
53 |
54 | Uri videoUri;
55 | VideoPlayerGLSurfaceView videoView;
56 |
57 | public MyVideoButton(Context context) {
58 | super(context);
59 | }
60 |
61 | @Override
62 | public void onClick(View v) {
63 |
64 |             Toast.makeText(VideoPlayerDemoActivity.this, "Preparing to play video " + videoUri.getHost() + videoUri.getPath() + ". Network videos may take a while to load.", Toast.LENGTH_SHORT).show();
65 |
66 | videoView.setVideoUri(videoUri, new VideoPlayerGLSurfaceView.PlayPreparedCallback() {
67 | @Override
68 | public void playPrepared(MediaPlayer player) {
69 | mPlayerView.post(new Runnable() {
70 | @Override
71 | public void run() {
72 |                             Toast.makeText(VideoPlayerDemoActivity.this, "Now playing " + videoUri.getHost() + videoUri.getPath(), Toast.LENGTH_SHORT).show();
73 | }
74 | });
75 |
76 | player.start();
77 | }
78 | }, playCompletionCallback);
79 | }
80 | }
81 |
82 | @Override
83 | protected void onCreate(Bundle savedInstanceState) {
84 | super.onCreate(savedInstanceState);
85 | setContentView(R.layout.activity_video_player_demo);
86 | mPlayerView = (VideoPlayerGLSurfaceView)findViewById(R.id.videoGLSurfaceView);
87 | mPlayerView.setZOrderOnTop(false);
88 | mPlayerView.setZOrderMediaOverlay(true);
89 |
90 | mShapeBtn = (Button)findViewById(R.id.switchShapeBtn);
91 |
92 | mShapeBtn.setOnClickListener(new View.OnClickListener() {
93 |
94 | private boolean useMask = false;
95 | Bitmap bmp;
96 |
97 | @Override
98 | public void onClick(View v) {
99 | useMask = !useMask;
100 | if(useMask) {
101 | if(bmp == null)
102 | bmp = BitmapFactory.decodeResource(getResources(), R.drawable.mask1);
103 |
104 | if(bmp != null) {
105 | mPlayerView.setMaskBitmap(bmp, false, new VideoPlayerGLSurfaceView.SetMaskBitmapCallback() {
106 | @Override
107 | public void setMaskOK(CGEFrameRenderer renderer) {
108 | // if(mPlayerView.isUsingMask()) {
109 | // renderer.setMaskFlipScale(1.0f, -1.0f);
110 | // }
111 |                                 Log.i(Common.LOG_TAG, "Mask enabled!");
112 | }
113 | });
114 | }
115 | } else {
116 | mPlayerView.setMaskBitmap(null, false);
117 | }
118 |
119 | }
120 | });
121 |
122 | mTakeshotBtn = (Button)findViewById(R.id.takeShotBtn);
123 |
124 | mTakeshotBtn.setOnClickListener(new View.OnClickListener() {
125 | @Override
126 | public void onClick(View v) {
127 | mPlayerView.takeShot(new VideoPlayerGLSurfaceView.TakeShotCallback() {
128 | @Override
129 | public void takeShotOK(Bitmap bmp) {
130 | if(bmp != null)
131 | ImageUtil.saveBitmap(bmp);
132 | else
133 | Log.e(Common.LOG_TAG, "take shot failed!");
134 | }
135 | });
136 | }
137 | });
138 |
139 | LinearLayout menuLayout = (LinearLayout)findViewById(R.id.menuLinearLayout);
140 |
141 | {
142 | Button btn = new Button(this);
143 | menuLayout.addView(btn);
144 | btn.setAllCaps(false);
145 | btn.setText("Last Recorded Video");
146 | btn.setOnClickListener(new View.OnClickListener() {
147 | @Override
148 | public void onClick(View v) {
149 |
150 | String lastVideoFileName = FileUtil.getTextContent(CameraDemoActivity.lastVideoPathFileName);
151 | if(lastVideoFileName == null) {
152 | Toast.makeText(VideoPlayerDemoActivity.this, "No video is recorded, please record one in the 2nd case.", Toast.LENGTH_LONG).show();
153 | return;
154 | }
155 |
156 | Uri lastVideoUri = Uri.parse(lastVideoFileName);
157 | mPlayerView.setVideoUri(lastVideoUri, new VideoPlayerGLSurfaceView.PlayPreparedCallback() {
158 | @Override
159 | public void playPrepared(MediaPlayer player) {
160 | Log.i(Common.LOG_TAG, "The video is prepared to play");
161 | player.start();
162 | }
163 | }, playCompletionCallback);
164 | }
165 | });
166 | }
167 |
168 | String[] filePaths = {
169 | "android.resource://" + getPackageName() + "/" + R.raw.test,
170 |                 "http://wge.wysaid.org/res/video/1.mp4", // network video
171 |                 "http://wysaid.org/p/test.mp4", // network video
172 | };
173 |
174 | for(int i = 0; i != filePaths.length; ++i) {
175 | MyVideoButton btn = new MyVideoButton(this);
176 |             btn.setText("Video " + i);
177 | btn.videoUri = Uri.parse(filePaths[i]);
178 | btn.videoView = mPlayerView;
179 | btn.setOnClickListener(btn);
180 | menuLayout.addView(btn);
181 |
182 | if(i == 0) {
183 | btn.onClick(btn);
184 | }
185 | }
186 |
187 | for(int i = 0; i != MainActivity.effectConfigs.length; ++i) {
188 | CameraDemoActivity.MyButtons button = new CameraDemoActivity.MyButtons(this, MainActivity.effectConfigs[i]);
189 |             button.setText("Effect " + i);
190 | button.setOnClickListener(mFilterSwitchListener);
191 | menuLayout.addView(button);
192 | }
193 |
194 | mGalleryBtn = (Button)findViewById(R.id.galleryBtn);
195 | mGalleryBtn.setOnClickListener(galleryBtnClickListener);
196 |
197 | Button fitViewBtn = (Button)findViewById(R.id.fitViewBtn);
198 | fitViewBtn.setOnClickListener(new View.OnClickListener() {
199 | boolean shouldFit = false;
200 | @Override
201 | public void onClick(View v) {
202 | shouldFit = !shouldFit;
203 | mPlayerView.setFitFullView(shouldFit);
204 | }
205 | });
206 |
207 | mPlayerView.setPlayerInitializeCallback(new VideoPlayerGLSurfaceView.PlayerInitializeCallback() {
208 | @Override
209 | public void initPlayer(final MediaPlayer player) {
210 |                 // Monitor the buffering progress of network videos
211 | player.setOnBufferingUpdateListener(new MediaPlayer.OnBufferingUpdateListener() {
212 | @Override
213 | public void onBufferingUpdate(MediaPlayer mp, int percent) {
214 | Log.i(Common.LOG_TAG, "Buffer update: " + percent);
215 | if(percent == 100) {
216 |                             Log.i(Common.LOG_TAG, "Buffering complete!");
217 | player.setOnBufferingUpdateListener(null);
218 | }
219 | }
220 | });
221 | }
222 | });
223 |
224 | SeekBar seekBar = (SeekBar) findViewById(R.id.seekBar);
225 |
226 | seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
227 | @Override
228 | public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
229 | float intensity = progress / 100.0f;
230 | mPlayerView.setFilterIntensity(intensity);
231 | }
232 |
233 | @Override
234 | public void onStartTrackingTouch(SeekBar seekBar) {
235 |
236 | }
237 |
238 | @Override
239 | public void onStopTrackingTouch(SeekBar seekBar) {
240 |
241 | }
242 | });
243 |
244 | }
245 |
246 | private View.OnClickListener mFilterSwitchListener = new View.OnClickListener() {
247 | @Override
248 | public void onClick(View v) {
249 | CameraDemoActivity.MyButtons btn = (CameraDemoActivity.MyButtons)v;
250 | mPlayerView.setFilterWithConfig(btn.filterConfig);
251 | mCurrentConfig = btn.filterConfig;
252 | }
253 | };
254 |
255 |     View.OnClickListener galleryBtnClickListener = new View.OnClickListener() {
256 | 
257 |         @Override
258 |         public void onClick(final View view) {
259 | Intent videoPickerIntent = new Intent(Intent.ACTION_GET_CONTENT);
260 | videoPickerIntent.setType("video/*");
261 | startActivityForResult(videoPickerIntent, REQUEST_CODE_PICK_VIDEO);
262 | }
263 | };
264 |
265 | public void onActivityResult(final int requestCode, final int resultCode, final Intent data) {
266 | switch (requestCode)
267 | {
268 |             case REQUEST_CODE_PICK_VIDEO:
269 |                 if(resultCode == RESULT_OK)
270 |                 {
271 |                     mPlayerView.setVideoUri(data.getData(), new VideoPlayerGLSurfaceView.PlayPreparedCallback() {
272 |                         @Override
273 |                         public void playPrepared(MediaPlayer player) {
274 |                             Log.i(Common.LOG_TAG, "The video is prepared to play");
275 |                             player.start();
276 |                         }
277 |                     }, playCompletionCallback);
278 |                 }
279 |                 break; // do not fall through to default
280 |             default: break;
281 | }
282 | }
283 |
284 | @Override
285 | public void onPause() {
286 | super.onPause();
287 | Log.i(VideoPlayerGLSurfaceView.LOG_TAG, "activity onPause...");
288 | mPlayerView.release();
289 | mPlayerView.onPause();
290 | }
291 |
292 | @Override
293 | public void onResume() {
294 | super.onResume();
295 |
296 | mPlayerView.onResume();
297 | }
298 |
299 | @Override
300 | public boolean onCreateOptionsMenu(Menu menu) {
301 | // Inflate the menu; this adds items to the action bar if it is present.
302 | getMenuInflater().inflate(R.menu.menu_video_player, menu);
303 | return true;
304 | }
305 |
306 | @Override
307 | public boolean onOptionsItemSelected(MenuItem item) {
308 | // Handle action bar item clicks here. The action bar will
309 | // automatically handle clicks on the Home/Up button, so long
310 | // as you specify a parent activity in AndroidManifest.xml.
311 | int id = item.getItemId();
312 |
313 | //noinspection SimplifiableIfStatement
314 | if (id == R.id.action_settings) {
315 | return true;
316 | }
317 |
318 | return super.onOptionsItemSelected(item);
319 | }
320 | }
321 |
--------------------------------------------------------------------------------
/cgedemo/src/main/java/org/wysaid/demoViews/FaceDemoView.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.demoViews;
2 |
3 | import android.content.Context;
4 | import android.graphics.Bitmap;
5 | import android.graphics.Canvas;
6 | import android.graphics.Color;
7 | import android.graphics.Matrix;
8 | import android.graphics.Paint;
9 | import android.graphics.PointF;
10 | import android.media.FaceDetector;
11 | import android.util.AttributeSet;
12 | import android.util.Log;
13 | import android.view.MotionEvent;
14 | import android.view.View;
15 | import android.view.View.OnTouchListener;
16 | import android.widget.Toast;
17 |
18 | import org.wysaid.common.Common;
19 | import org.wysaid.myUtils.ImageUtil;
20 | import org.wysaid.nativePort.CGEFaceFunctions;
21 |
22 | /**
23 | * Created by wysaid on 15/12/24.
24 | * Mail: admin@wysaid.org
25 | * blog: wysaid.org
26 | */
27 | public class FaceDemoView extends View implements OnTouchListener{
28 |
29 | public static class FaceController {
30 | public Bitmap image;
31 | public PointF pos;
32 | public float rotation;
33 | public float halfWidth, halfHeight;
34 |
35 | FaceController(Bitmap img) {
36 | image = img;
37 | pos = new PointF(0, 0);
38 | halfWidth = img.getWidth() / 2.0f;
39 | halfHeight = img.getHeight() / 2.0f;
40 | }
41 |
42 | public void drawController(Canvas canvas) {
43 | Matrix matrix = new Matrix();
44 | matrix.setTranslate(pos.x - halfWidth, pos.y - halfHeight);
45 | matrix.postRotate(rotation, pos.x, pos.y);
46 | canvas.drawBitmap(image, matrix, null);
47 | }
48 |
49 | public boolean onTouchPosition(float x, float y) {
50 | if(x < pos.x - halfWidth || x > pos.x + halfWidth || y < pos.y - halfHeight || y > pos.y + halfHeight)
51 | return false;
52 | return true;
53 | }
54 | }
55 |
56 | private Bitmap mFaceImage;
57 | private FaceController mLeftEyeController, mRightEyeController;
58 | private FaceController mMouthController;
59 | private FaceController mChinController;
60 | private FaceController mSelectedController;
61 |
62 | public Bitmap getFaceimage() {
63 | return mFaceImage;
64 | }
65 |
66 | PointF mOriginPos = new PointF(0.0f, 0.0f);
67 | PointF mLastFingerPos = new PointF();
68 | float mImageScaling = 1.0f;
69 |
70 | public FaceDemoView(Context context, AttributeSet attrs) {
71 | super(context, attrs);
72 | setOnTouchListener(this);
73 | }
74 |
75 | public void loadResources(Bitmap eye, Bitmap mouth, Bitmap chin) {
76 | mLeftEyeController = new FaceController(eye);
77 | mRightEyeController = new FaceController(eye);
78 | mMouthController = new FaceController(mouth);
79 | mChinController = new FaceController(chin);
80 | }
81 |
82 | public static float getProperScaling(Bitmap bmp, View view) {
83 | float bmpWidth = bmp.getWidth();
84 | float bmpHeight = bmp.getHeight();
85 | float viewWidth = view.getWidth();
86 | float viewHeight = view.getHeight();
87 |
88 | return Math.min(viewWidth / bmpWidth, viewHeight / bmpHeight);
89 | }
90 |
91 | public void setFaceImage(Bitmap faceImage) {
92 | mFaceImage = faceImage;
93 |
94 | mImageScaling = getProperScaling(mFaceImage, this);
95 |
96 | mOriginPos.x = (this.getWidth() - mFaceImage.getWidth() * mImageScaling) / 2.0f;
97 | mOriginPos.y = (this.getHeight() - mFaceImage.getHeight() * mImageScaling) / 2.0f;
98 |
99 | ImageUtil.FaceRects faceRects = ImageUtil.findFaceByBitmap(faceImage);
100 |
101 | if(faceRects.numOfFaces > 0) {
102 | String content = "";
103 |
104 | FaceDetector.Face face = faceRects.faces[0];
105 | PointF pnt = new PointF();
106 | face.getMidPoint(pnt);
107 |
108 | float eyeDis = face.eyesDistance();
109 | float halfEyeDis = eyeDis / 2.0f;
110 |
111 |             content += String.format("Confidence: %g, face center: %g, %g, eye distance: %g\n", face.confidence(), pnt.x, pnt.y, eyeDis);
112 |
113 | CGEFaceFunctions.FaceFeature feature = new CGEFaceFunctions.FaceFeature();
114 |
115 | feature.leftEyePos = new PointF(pnt.x - halfEyeDis, pnt.y);
116 | feature.rightEyePos = new PointF(pnt.x + halfEyeDis, pnt.y);
117 | feature.mouthPos = new PointF(pnt.x, pnt.y + eyeDis);
118 | feature.chinPos = new PointF(pnt.x, pnt.y + eyeDis * 1.5f);
119 |
120 | // if(!mFaceImage.isMutable())
121 | // mFaceImage = mFaceImage.copy(mFaceImage.getConfig(), true);
122 | //
123 | // Canvas canvas = new Canvas(mFaceImage);
124 | // Paint paint = new Paint();
125 | // paint.setColor(Color.RED);
126 | // paint.setStyle(Paint.Style.STROKE);
127 | // paint.setStrokeWidth(4);
128 | // canvas.drawRect((int) (pnt.x - eyeDis * 1.5f), (int) (pnt.y - eyeDis * 1.5f), (int) (pnt.x + eyeDis * 1.5f), (int) (pnt.y + eyeDis * 1.5f), paint);
129 | //
130 | //        // center point between the eyes
131 | // canvas.drawRect((int) (pnt.x - 2.0f), (int) (pnt.y - 2.0f), (int) (pnt.x + 2.0f), (int) (pnt.y + 2.0f), paint);
132 | //
133 | //        // both eyes
134 | // canvas.drawRect((int) (pnt.x - halfEyeDis - 2.0f), (int) (pnt.y - 2.0f), (int) (pnt.x - halfEyeDis + 2.0f), (int) (pnt.y + 2.0f), paint);
135 | // canvas.drawRect((int) (pnt.x + halfEyeDis - 2.0f), (int) (pnt.y - 2.0f), (int) (pnt.x + halfEyeDis + 2.0f), (int) (pnt.y + 2.0f), paint);
136 |
137 | mLeftEyeController.pos = new PointF(feature.leftEyePos.x * mImageScaling + mOriginPos.x, feature.leftEyePos.y * mImageScaling + mOriginPos.y);
138 | mRightEyeController.pos = new PointF(feature.rightEyePos.x * mImageScaling + mOriginPos.x, feature.rightEyePos.y * mImageScaling + mOriginPos.y);
139 | mMouthController.pos = new PointF(feature.mouthPos.x * mImageScaling + mOriginPos.x, feature.mouthPos.y * mImageScaling + mOriginPos.y);
140 | mChinController.pos = new PointF(feature.chinPos.x * mImageScaling + mOriginPos.x, feature.chinPos.y * mImageScaling + mOriginPos.y);
141 |
142 | Toast.makeText(this.getContext(), content, Toast.LENGTH_SHORT).show();
143 | } else {
144 |             Toast.makeText(this.getContext(), "Face detection failed, please adjust the controllers manually", Toast.LENGTH_LONG).show();
145 | resetControllers();
146 | }
147 |
148 | calcAngles();
149 | postInvalidate();
150 | }
151 |
152 | public CGEFaceFunctions.FaceFeature getFeature() {
153 |
154 | CGEFaceFunctions.FaceFeature feature = new CGEFaceFunctions.FaceFeature();
155 |
156 | feature.leftEyePos.x = (mLeftEyeController.pos.x - mOriginPos.x) / mImageScaling;
157 | feature.leftEyePos.y = (mLeftEyeController.pos.y - mOriginPos.y) / mImageScaling;
158 |
159 | feature.rightEyePos.x = (mRightEyeController.pos.x - mOriginPos.x) / mImageScaling;
160 | feature.rightEyePos.y = (mRightEyeController.pos.y - mOriginPos.y) / mImageScaling;
161 |
162 | feature.mouthPos.x = (mMouthController.pos.x - mOriginPos.x) / mImageScaling;
163 | feature.mouthPos.y = (mMouthController.pos.y - mOriginPos.y) / mImageScaling;
164 |
165 | feature.chinPos.x = (mChinController.pos.x - mOriginPos.x) / mImageScaling;
166 | feature.chinPos.y = (mChinController.pos.y - mOriginPos.y) / mImageScaling;
167 |
168 | feature.faceImageWidth = mFaceImage.getWidth();
169 | feature.faceImageHeight = mFaceImage.getHeight();
170 |
171 | return feature;
172 | }
173 |
174 | public void setFaceImageZoom(float zoom) {
175 | mImageScaling = zoom;
176 | postInvalidate();
177 | }
178 |
179 | public void resetControllers() {
180 | mLeftEyeController.pos.x = getWidth() / 2.0f - 100;
181 | mLeftEyeController.pos.y = getHeight() / 2.0f - 100;
182 |
183 | mRightEyeController.pos.x = getWidth() / 2.0f + 100;
184 | mRightEyeController.pos.y = getHeight() / 2.0f - 100;
185 |
186 | mMouthController.pos.x = getWidth() / 2.0f;
187 | mMouthController.pos.y = getHeight() / 2.0f + 50;
188 |
189 | mChinController.pos.x = getWidth() / 2.0f;
190 | mChinController.pos.y = getHeight() / 2.0f + 150;
191 | }
192 |
193 |     private void calcAngles() {
194 |         // Direction vector from the left eye controller to the right eye controller
195 |         double dirX = mRightEyeController.pos.x - mLeftEyeController.pos.x;
196 |         double dirY = mRightEyeController.pos.y - mLeftEyeController.pos.y;
197 | 
198 |         double len = Math.sqrt(dirX * dirX + dirY * dirY);
199 |         if(len < 0.001)
200 |             return; // the eye controllers overlap; keep the previous rotation
201 | 
202 |         // Math.asin returns [-PI/2, PI/2]; use the sign of dirX to cover all quadrants
203 |         double angle = Math.asin(dirY / len);
204 | 
205 |         if(dirX >= 0.0) {
206 |             if(dirY < 0)
207 |                 angle += 2.0 * Math.PI;
208 |         } else {
209 |             angle = Math.PI - angle;
210 |         }
211 | 
212 |         float rotation = (float)(angle * 180.0 / Math.PI);
213 |         // Every controller follows the rotation implied by the eye line
214 |         mLeftEyeController.rotation = rotation;
215 |         mRightEyeController.rotation = rotation;
216 |         mMouthController.rotation = rotation;
217 |         mChinController.rotation = rotation;
218 |     }
219 |
220 | @Override
221 | public boolean onTouch(View v, MotionEvent event) {
222 |
223 | float touchX = event.getX();
224 | float touchY = event.getY();
225 |
226 | int action = event.getAction();
227 |
228 | switch (action)
229 | {
230 | case MotionEvent.ACTION_DOWN:
231 |
232 | mSelectedController = null;
233 |
234 | if(mLeftEyeController.onTouchPosition(touchX, touchY))
235 | mSelectedController = mLeftEyeController;
236 | else if(mRightEyeController.onTouchPosition(touchX, touchY))
237 | mSelectedController = mRightEyeController;
238 | else if(mMouthController.onTouchPosition(touchX, touchY))
239 | mSelectedController = mMouthController;
240 | else if(mChinController.onTouchPosition(touchX, touchY))
241 | mSelectedController = mChinController;
242 |
243 | mLastFingerPos.x = touchX;
244 | mLastFingerPos.y = touchY;
245 |
246 | if(mSelectedController != null) {
247 |
248 | mSelectedController.pos.y -= 100;
249 | postInvalidate();
250 | Log.i(Common.LOG_TAG, "Controller selected!");
251 | }
252 | break;
253 | case MotionEvent.ACTION_MOVE:
254 | if(mSelectedController != null) {
255 | mSelectedController.pos.x += touchX - mLastFingerPos.x;
256 | mSelectedController.pos.y += touchY - mLastFingerPos.y;
257 |
258 | if(mSelectedController == mLeftEyeController || mSelectedController == mRightEyeController)
259 | calcAngles();
260 | } else {
261 | mOriginPos.x += touchX - mLastFingerPos.x;
262 | mOriginPos.y += touchY - mLastFingerPos.y;
263 | }
264 |
265 | mLastFingerPos.x = touchX;
266 | mLastFingerPos.y = touchY;
267 |
268 | postInvalidate();
269 |
270 | break;
271 | case MotionEvent.ACTION_UP:
272 | mSelectedController = null;
273 | break;
274 | default:
275 | break;
276 | }
277 |
278 | return true;
279 | }
280 |
281 | @Override
282 | public void onDraw(Canvas canvas) {
283 |
284 | if(mFaceImage == null)
285 | return;
286 |
287 | Matrix matrix = new Matrix();
288 | matrix.setScale(mImageScaling, mImageScaling);
289 | matrix.postTranslate(mOriginPos.x, mOriginPos.y);
290 | canvas.drawBitmap(mFaceImage, matrix, null);
291 |
292 | mLeftEyeController.drawController(canvas);
293 | mRightEyeController.drawController(canvas);
294 | mMouthController.drawController(canvas);
295 | mChinController.drawController(canvas);
296 |
297 | }
298 |
299 | }
300 |
--------------------------------------------------------------------------------
/cgedemo/src/main/res/drawable-nodpi/adjust_face_chin.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/adjust_face_chin.png
--------------------------------------------------------------------------------
/cgedemo/src/main/res/drawable-nodpi/adjust_face_eye.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/adjust_face_eye.png
--------------------------------------------------------------------------------
/cgedemo/src/main/res/drawable-nodpi/adjust_face_mouth.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/adjust_face_mouth.png
--------------------------------------------------------------------------------
/cgedemo/src/main/res/drawable-nodpi/bgview.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/bgview.png
--------------------------------------------------------------------------------
/cgedemo/src/main/res/drawable-nodpi/face0.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/face0.jpg
--------------------------------------------------------------------------------
/cgedemo/src/main/res/drawable-nodpi/face1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/face1.jpg
--------------------------------------------------------------------------------
/cgedemo/src/main/res/drawable-nodpi/mask1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/drawable-nodpi/mask1.png
--------------------------------------------------------------------------------
/cgedemo/src/main/res/layout/activity_camera_demo.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/layout/activity_face_demo.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/layout/activity_filter_demo.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/layout/activity_main.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/layout/activity_simple_player_demo.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/layout/activity_video_player_demo.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/menu/menu_camera_demo.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/menu/menu_filter_demo.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/menu/menu_main.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/menu/menu_video_player.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/cgedemo/src/main/res/mipmap-nodpi/ic_launcher.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/mipmap-nodpi/ic_launcher.png
--------------------------------------------------------------------------------
/cgedemo/src/main/res/raw/test.mp4:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/cgedemo/src/main/res/raw/test.mp4
--------------------------------------------------------------------------------
/cgedemo/src/main/res/values-w820dp/dimens.xml:
--------------------------------------------------------------------------------
1 | <resources>
2 |     <!-- Example customization of dimensions originally defined in res/values/dimens.xml
3 |          (such as screen margins) for screens with more than 820dp
4 |          of available width. -->
5 |     <dimen name="activity_horizontal_margin">64dp</dimen>
6 | </resources>
7 | 
--------------------------------------------------------------------------------
/cgedemo/src/main/res/values/dimens.xml:
--------------------------------------------------------------------------------
1 | <resources>
2 |     <!-- Default screen margins, per the Android Design guidelines. -->
3 |     <dimen name="activity_horizontal_margin">16dp</dimen>
4 |     <dimen name="activity_vertical_margin">16dp</dimen>
5 | </resources>
6 | 
--------------------------------------------------------------------------------
/cgedemo/src/main/res/values/strings.xml:
--------------------------------------------------------------------------------
1 | <resources>
2 |     <string name="app_name">CGE demo</string>
3 | 
4 |     <string name="action_settings">Settings</string>
5 |     <string name="title_activity_filter_demo">FilterDemoActivity</string>
6 |     <string name="title_activity_camera_demo">CameraDemoActivity</string>
7 |     <string name="title_activity_video_player">VideoPlayerActivity</string>
8 | </resources>
9 | 
--------------------------------------------------------------------------------
/cgedemo/src/main/res/values/styles.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/demoRelease/cgeDemo.apk:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/demoRelease/cgeDemo.apk
--------------------------------------------------------------------------------
/gradle.properties:
--------------------------------------------------------------------------------
1 | # Project-wide Gradle settings.
2 |
3 | # IDE (e.g. Android Studio) users:
4 | # Gradle settings configured through the IDE *will override*
5 | # any settings specified in this file.
6 |
7 | # For more details on how to configure your build environment visit
8 | # http://www.gradle.org/docs/current/userguide/build_environment.html
9 |
10 | # Specifies the JVM arguments used for the daemon process.
11 | # The setting is particularly useful for tweaking memory settings.
12 | # Default value: -Xmx1024m -XX:MaxPermSize=256m
13 | # org.gradle.jvmargs=-Xmx2048m -XX:MaxPermSize=512m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8
14 |
15 | # When configured, Gradle will run in incubating parallel mode.
16 | # This option should only be used with decoupled projects. More details, visit
17 | # http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects
18 | # org.gradle.parallel=true
--------------------------------------------------------------------------------
/gradlew:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | ##############################################################################
4 | ##
5 | ## Gradle start up script for UN*X
6 | ##
7 | ##############################################################################
8 |
9 | # Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
10 | DEFAULT_JVM_OPTS=""
11 |
12 | APP_NAME="Gradle"
13 | APP_BASE_NAME=`basename "$0"`
14 |
15 | # Use the maximum available, or set MAX_FD != -1 to use that value.
16 | MAX_FD="maximum"
17 |
18 | warn ( ) {
19 | echo "$*"
20 | }
21 |
22 | die ( ) {
23 | echo
24 | echo "$*"
25 | echo
26 | exit 1
27 | }
28 |
29 | # OS specific support (must be 'true' or 'false').
30 | cygwin=false
31 | msys=false
32 | darwin=false
33 | case "`uname`" in
34 | CYGWIN* )
35 | cygwin=true
36 | ;;
37 | Darwin* )
38 | darwin=true
39 | ;;
40 | MINGW* )
41 | msys=true
42 | ;;
43 | esac
44 |
45 | # Attempt to set APP_HOME
46 | # Resolve links: $0 may be a link
47 | PRG="$0"
48 | # Need this for relative symlinks.
49 | while [ -h "$PRG" ] ; do
50 | ls=`ls -ld "$PRG"`
51 | link=`expr "$ls" : '.*-> \(.*\)$'`
52 | if expr "$link" : '/.*' > /dev/null; then
53 | PRG="$link"
54 | else
55 | PRG=`dirname "$PRG"`"/$link"
56 | fi
57 | done
58 | SAVED="`pwd`"
59 | cd "`dirname \"$PRG\"`/" >/dev/null
60 | APP_HOME="`pwd -P`"
61 | cd "$SAVED" >/dev/null
62 |
63 | CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
64 |
65 | # Determine the Java command to use to start the JVM.
66 | if [ -n "$JAVA_HOME" ] ; then
67 | if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
68 | # IBM's JDK on AIX uses strange locations for the executables
69 | JAVACMD="$JAVA_HOME/jre/sh/java"
70 | else
71 | JAVACMD="$JAVA_HOME/bin/java"
72 | fi
73 | if [ ! -x "$JAVACMD" ] ; then
74 | die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
75 |
76 | Please set the JAVA_HOME variable in your environment to match the
77 | location of your Java installation."
78 | fi
79 | else
80 | JAVACMD="java"
81 | which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
82 |
83 | Please set the JAVA_HOME variable in your environment to match the
84 | location of your Java installation."
85 | fi
86 |
87 | # Increase the maximum file descriptors if we can.
88 | if [ "$cygwin" = "false" -a "$darwin" = "false" ] ; then
89 | MAX_FD_LIMIT=`ulimit -H -n`
90 | if [ $? -eq 0 ] ; then
91 | if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
92 | MAX_FD="$MAX_FD_LIMIT"
93 | fi
94 | ulimit -n $MAX_FD
95 | if [ $? -ne 0 ] ; then
96 | warn "Could not set maximum file descriptor limit: $MAX_FD"
97 | fi
98 | else
99 | warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
100 | fi
101 | fi
102 |
103 | # For Darwin, add options to specify how the application appears in the dock
104 | if $darwin; then
105 | GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
106 | fi
107 |
108 | # For Cygwin, switch paths to Windows format before running java
109 | if $cygwin ; then
110 | APP_HOME=`cygpath --path --mixed "$APP_HOME"`
111 | CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
112 | JAVACMD=`cygpath --unix "$JAVACMD"`
113 |
114 | # We build the pattern for arguments to be converted via cygpath
115 | ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
116 | SEP=""
117 | for dir in $ROOTDIRSRAW ; do
118 | ROOTDIRS="$ROOTDIRS$SEP$dir"
119 | SEP="|"
120 | done
121 | OURCYGPATTERN="(^($ROOTDIRS))"
122 | # Add a user-defined pattern to the cygpath arguments
123 | if [ "$GRADLE_CYGPATTERN" != "" ] ; then
124 | OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
125 | fi
126 | # Now convert the arguments - kludge to limit ourselves to /bin/sh
127 | i=0
128 | for arg in "$@" ; do
129 | CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
130 | CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
131 |
132 | if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
133 | eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
134 | else
135 | eval `echo args$i`="\"$arg\""
136 | fi
137 | i=$((i+1))
138 | done
139 | case $i in
140 | (0) set -- ;;
141 | (1) set -- "$args0" ;;
142 | (2) set -- "$args0" "$args1" ;;
143 | (3) set -- "$args0" "$args1" "$args2" ;;
144 | (4) set -- "$args0" "$args1" "$args2" "$args3" ;;
145 | (5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
146 | (6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
147 | (7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
148 | (8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
149 | (9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
150 | esac
151 | fi
152 |
153 | # Split up the JVM_OPTS and GRADLE_OPTS values into an array, following the shell quoting and substitution rules
154 | function splitJvmOpts() {
155 | JVM_OPTS=("$@")
156 | }
157 | eval splitJvmOpts $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS
158 | JVM_OPTS[${#JVM_OPTS[*]}]="-Dorg.gradle.appname=$APP_BASE_NAME"
159 |
160 | exec "$JAVACMD" "${JVM_OPTS[@]}" -classpath "$CLASSPATH" org.gradle.wrapper.GradleWrapperMain "$@"
161 |
--------------------------------------------------------------------------------
/gradlew.bat:
--------------------------------------------------------------------------------
1 | @if "%DEBUG%" == "" @echo off
2 | @rem ##########################################################################
3 | @rem
4 | @rem Gradle startup script for Windows
5 | @rem
6 | @rem ##########################################################################
7 |
8 | @rem Set local scope for the variables with windows NT shell
9 | if "%OS%"=="Windows_NT" setlocal
10 |
11 | @rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
12 | set DEFAULT_JVM_OPTS=
13 |
14 | set DIRNAME=%~dp0
15 | if "%DIRNAME%" == "" set DIRNAME=.
16 | set APP_BASE_NAME=%~n0
17 | set APP_HOME=%DIRNAME%
18 |
19 | @rem Find java.exe
20 | if defined JAVA_HOME goto findJavaFromJavaHome
21 |
22 | set JAVA_EXE=java.exe
23 | %JAVA_EXE% -version >NUL 2>&1
24 | if "%ERRORLEVEL%" == "0" goto init
25 |
26 | echo.
27 | echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
28 | echo.
29 | echo Please set the JAVA_HOME variable in your environment to match the
30 | echo location of your Java installation.
31 |
32 | goto fail
33 |
34 | :findJavaFromJavaHome
35 | set JAVA_HOME=%JAVA_HOME:"=%
36 | set JAVA_EXE=%JAVA_HOME%/bin/java.exe
37 |
38 | if exist "%JAVA_EXE%" goto init
39 |
40 | echo.
41 | echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
42 | echo.
43 | echo Please set the JAVA_HOME variable in your environment to match the
44 | echo location of your Java installation.
45 |
46 | goto fail
47 |
48 | :init
49 | @rem Get command-line arguments, handling Windows variants
50 |
51 | if not "%OS%" == "Windows_NT" goto win9xME_args
52 | if "%@eval[2+2]" == "4" goto 4NT_args
53 |
54 | :win9xME_args
55 | @rem Slurp the command line arguments.
56 | set CMD_LINE_ARGS=
57 | set _SKIP=2
58 |
59 | :win9xME_args_slurp
60 | if "x%~1" == "x" goto execute
61 |
62 | set CMD_LINE_ARGS=%*
63 | goto execute
64 |
65 | :4NT_args
66 | @rem Get arguments from the 4NT Shell from JP Software
67 | set CMD_LINE_ARGS=%$
68 |
69 | :execute
70 | @rem Setup the command line
71 |
72 | set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
73 |
74 | @rem Execute Gradle
75 | "%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
76 |
77 | :end
78 | @rem End local scope for the variables with windows NT shell
79 | if "%ERRORLEVEL%"=="0" goto mainEnd
80 |
81 | :fail
82 | rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
83 | rem the _cmd.exe /c_ return code!
84 | if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
85 | exit /b 1
86 |
87 | :mainEnd
88 | if "%OS%"=="Windows_NT" endlocal
89 |
90 | :omega
91 |
--------------------------------------------------------------------------------
/library/.gitignore:
--------------------------------------------------------------------------------
1 | /build
2 |
--------------------------------------------------------------------------------
/library/build.gradle:
--------------------------------------------------------------------------------
1 | apply plugin: 'com.android.library'
2 |
3 | android {
4 | compileSdkVersion 23
5 | buildToolsVersion "22.0.1"
6 |
7 | defaultConfig {
8 | minSdkVersion 14
9 | targetSdkVersion 23
10 | versionCode 1
11 | versionName "1.0"
12 |
13 | // ndk {
14 | // moduleName "CGE"
15 | //// cFlags "-std=c++11 -DANDROID_NDK -DDEBUG -D_CGE_ONLY_FILTERS_ -D_CGE_STATIC_ASSERT_ -DCGE_TEXTURE_PREMULTIPLIED=1 -DCGE_LOG_TAG=\\\"libCGE\\\" -I${project.buildDir}/../src/main/jni/include -I${project.buildDir}/../src/main/jni/include/filters"
16 | //// stl "gnustl_shared"
17 | ////// abiFilters "all"
18 | //// abiFilters "armeabi", "armeabi-v7a"
19 | //// ldLibs "log", "android", "EGL", "GLESv2", "jnigraphics"
20 | // }
21 |
22 | }
23 | buildTypes {
24 | release {
25 | minifyEnabled false
26 | proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
27 | }
28 | }
29 |
30 | ////////////////////////////////////////
31 |
32 | sourceSets.main {
33 | jniLibs.srcDir 'src/main/libs' //set libs as .so's location instead of jni
34 | jni.srcDirs = [] //disable automatic ndk-build call with auto-generated Android.mk file
35 | }
36 |
37 | }
38 |
39 | dependencies {
40 | compile fileTree(dir: 'libs', include: ['*.jar'])
41 | compile 'com.android.support:appcompat-v7:22.2.0'
42 | }
43 |
--------------------------------------------------------------------------------
/library/proguard-rules.pro:
--------------------------------------------------------------------------------
1 | # Add project specific ProGuard rules here.
2 | # By default, the flags in this file are appended to flags specified
3 | # in /Users/wysaid/android_develop/android-sdk-macosx/tools/proguard/proguard-android.txt
4 | # You can edit the include path and order by changing the proguardFiles
5 | # directive in build.gradle.
6 | #
7 | # For more details, see
8 | # http://developer.android.com/guide/developing/tools/proguard.html
9 |
10 | # Add any project specific keep options here:
11 |
12 | # If your project uses WebView with JS, uncomment the following
13 | # and specify the fully qualified class name to the JavaScript interface
14 | # class:
15 | #-keepclassmembers class fqcn.of.javascript.interface.for.webview {
16 | # public *;
17 | #}
18 |
--------------------------------------------------------------------------------
/library/project.properties:
--------------------------------------------------------------------------------
1 | target=android-21
2 | android.library=true
3 |
--------------------------------------------------------------------------------
/library/so/libgpuimage-library.so:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/library/so/libgpuimage-library.so
--------------------------------------------------------------------------------
/library/src/main/AndroidManifest.xml:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
--------------------------------------------------------------------------------
/library/src/main/genJNIHeaders:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | cd $(dirname $0)
3 | javah -d jni -classpath /Users/wysaid/android_develop/android-sdk-macosx/platforms/android-22/android.jar:../../build/intermediates/classes/debug -jni org.wysaid.nativePort.CGEFrameRecorder
4 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/algorithm/Matrix2x2.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.algorithm;
2 |
3 | /**
4 | * Created by wangyang on 15/11/27.
5 | */
6 | public class Matrix2x2 {
7 | }
8 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/algorithm/Matrix3x3.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.algorithm;
2 |
3 | /**
4 | * Created by wangyang on 15/11/27.
5 | */
6 | public class Matrix3x3 {
7 | }
8 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/algorithm/Matrix4x4.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.algorithm;
2 |
3 | /**
4 | * Created by wangyang on 15/11/27.
5 | */
6 | public class Matrix4x4 {
7 | }
8 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/algorithm/Vector2.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.algorithm;
2 |
3 | /**
4 | * Created by wangyang on 15/11/27.
5 | */
6 | public class Vector2 {
7 | }
8 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/algorithm/Vector3.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.algorithm;
2 |
3 | /**
4 | * Created by wangyang on 15/11/27.
5 | */
6 | public class Vector3 {
7 | }
8 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/algorithm/Vector4.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.algorithm;
2 |
3 | /**
4 | * Created by wangyang on 15/11/27.
5 | */
6 | public class Vector4 {
7 | }
8 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/camera/CameraInstance.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.camera;
2 |
3 | import android.graphics.PixelFormat;
4 | import android.graphics.Rect;
5 | import android.graphics.SurfaceTexture;
6 | import android.hardware.Camera;
7 | import android.os.Build;
8 | import android.util.Log;
9 |
10 | import org.wysaid.common.Common;
11 |
12 | import java.io.IOException;
13 | import java.util.ArrayList;
14 | import java.util.Collections;
15 | import java.util.Comparator;
16 | import java.util.List;
17 |
18 | /**
19 | * Created by wangyang on 15/7/27.
20 | */
21 |
22 |
23 | // Camera should only be used as a singleton
24 | public class CameraInstance {
25 | public static final String LOG_TAG = Common.LOG_TAG;
26 |
27 | private static final String ASSERT_MSG = "CameraDevice is null! Please check.";
28 |
29 | private Camera mCameraDevice;
30 | private Camera.Parameters mParams;
31 |
32 | public static final int DEFAULT_PREVIEW_RATE = 30;
33 |
34 |
35 | private boolean mIsPreviewing = false;
36 |
37 | private int mDefaultCameraID = -1;
38 |
39 | private static CameraInstance mThisInstance;
40 | private int mPreviewWidth;
41 | private int mPreviewHeight;
42 |
43 | private int mPictureWidth = 1000;
44 | private int mPictureHeight = 1000;
45 |
46 | private int mPreferPreviewWidth = 640;
47 | private int mPreferPreviewHeight = 640;
48 |
49 | private int mFacing = 0;
50 |
51 | private CameraInstance() {}
52 |
53 | public static synchronized CameraInstance getInstance() {
54 | if(mThisInstance == null) {
55 | mThisInstance = new CameraInstance();
56 | }
57 | return mThisInstance;
58 | }
59 |
60 | public boolean isPreviewing() { return mIsPreviewing; }
61 |
62 | public int previewWidth() { return mPreviewWidth; }
63 | public int previewHeight() { return mPreviewHeight; }
64 | public int pictureWidth() { return mPictureWidth; }
65 | public int pictureHeight() { return mPictureHeight; }
66 |
67 | public void setPreferPreviewSize(int w, int h) { //note: w/h deliberately swapped below (Camera preview sizes are landscape-oriented)
68 | mPreferPreviewHeight = w;
69 | mPreferPreviewWidth = h;
70 | }
71 |
72 | public interface CameraOpenCallback {
73 | void cameraReady();
74 | }
75 |
76 | public boolean tryOpenCamera(CameraOpenCallback callback) {
77 | return tryOpenCamera(callback, Camera.CameraInfo.CAMERA_FACING_BACK);
78 | }
79 |
80 | public int getFacing() {
81 | return mFacing;
82 | }
83 |
84 | public synchronized boolean tryOpenCamera(CameraOpenCallback callback, int facing) {
85 | Log.i(LOG_TAG, "try open camera...");
86 |
87 | try
88 | {
89 | if(Build.VERSION.SDK_INT > Build.VERSION_CODES.FROYO)
90 | {
91 | int numberOfCameras = Camera.getNumberOfCameras();
92 |
93 | Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
94 | for (int i = 0; i < numberOfCameras; i++) {
95 | Camera.getCameraInfo(i, cameraInfo);
96 | if (cameraInfo.facing == facing) {
97 | mDefaultCameraID = i;
98 | mFacing = facing;
99 | }
100 | }
101 | }
102 | stopPreview();
103 | if(mCameraDevice != null)
104 | mCameraDevice.release();
105 |
106 | if(mDefaultCameraID >= 0) {
107 | mCameraDevice = Camera.open(mDefaultCameraID);
108 | }
109 | else {
110 | mCameraDevice = Camera.open();
111 | mFacing = Camera.CameraInfo.CAMERA_FACING_BACK; //default: back facing
112 | }
113 | }
114 | catch(Exception e)
115 | {
116 | Log.e(LOG_TAG, "Open Camera Failed!");
117 | e.printStackTrace();
118 | mCameraDevice = null;
119 | return false;
120 | }
121 |
122 | if(mCameraDevice != null) {
123 | Log.i(LOG_TAG, "Camera opened!");
124 |
125 | try {
126 | initCamera(DEFAULT_PREVIEW_RATE);
127 | } catch (Exception e) {
128 | mCameraDevice.release();
129 | mCameraDevice = null;
130 | return false;
131 | }
132 |
133 | if (callback != null) {
134 | callback.cameraReady();
135 | }
136 |
137 | return true;
138 | }
139 |
140 | return false;
141 | }
142 |
143 | public synchronized void stopCamera() {
144 | if(mCameraDevice != null) {
145 | mIsPreviewing = false;
146 | mCameraDevice.stopPreview();
147 | mCameraDevice.setPreviewCallback(null);
148 | mCameraDevice.release();
149 | mCameraDevice = null;
150 | }
151 | }
152 |
153 | public boolean isCameraOpened() {
154 | return mCameraDevice != null;
155 | }
156 |
157 | public synchronized void startPreview(SurfaceTexture texture) {
158 | Log.i(LOG_TAG, "Camera startPreview...");
159 | if(mIsPreviewing) {
160 | Log.e(LOG_TAG, "Err: camera is previewing...");
161 | // stopPreview();
162 | return ;
163 | }
164 |
165 | if(mCameraDevice != null) {
166 | try {
167 | mCameraDevice.setPreviewTexture(texture);
168 | } catch (IOException e) {
169 | e.printStackTrace();
170 | }
171 |
172 | mCameraDevice.startPreview();
173 | mIsPreviewing = true;
174 | }
175 | }
176 |
177 | public synchronized void stopPreview() {
178 | if(mIsPreviewing && mCameraDevice != null) {
179 | Log.i(LOG_TAG, "Camera stopPreview...");
180 | mIsPreviewing = false;
181 | mCameraDevice.stopPreview();
182 | }
183 | }
184 |
185 | public synchronized Camera.Parameters getParams() {
186 | if(mCameraDevice != null)
187 | return mCameraDevice.getParameters();
188 | assert mCameraDevice != null : ASSERT_MSG;
189 | return null;
190 | }
191 |
192 | public synchronized void setParams(Camera.Parameters param) {
193 | if(mCameraDevice != null) {
194 | mParams = param;
195 | mCameraDevice.setParameters(mParams);
196 | }
197 | assert mCameraDevice != null : ASSERT_MSG;
198 | }
199 |
200 | public Camera getCameraDevice() {
201 | return mCameraDevice;
202 | }
203 |
204 | //sorts from largest to smallest
205 | private Comparator<Camera.Size> comparatorBigger = new Comparator<Camera.Size>() {
206 | @Override
207 | public int compare(Camera.Size lhs, Camera.Size rhs) {
208 | int w = rhs.width - lhs.width;
209 | if(w == 0)
210 | return rhs.height - lhs.height;
211 | return w;
212 | }
213 | };
214 |
215 | //sorts from smallest to largest
216 | private Comparator<Camera.Size> comparatorSmaller = new Comparator<Camera.Size>() {
217 | @Override
218 | public int compare(Camera.Size lhs, Camera.Size rhs) {
219 | int w = lhs.width - rhs.width;
220 | if(w == 0)
221 | return lhs.height - rhs.height;
222 | return w;
223 | }
224 | };
225 |
226 | public void initCamera(int previewRate) {
227 | if(mCameraDevice == null) {
228 | Log.e(LOG_TAG, "initCamera: Camera is not opened!");
229 | return;
230 | }
231 |
232 | mParams = mCameraDevice.getParameters();
233 | List<Integer> supportedPictureFormats = mParams.getSupportedPictureFormats();
234 |
235 | for(int fmt : supportedPictureFormats) {
236 | Log.i(LOG_TAG, String.format("Picture Format: %x", fmt));
237 | }
238 |
239 | mParams.setPictureFormat(PixelFormat.JPEG);
240 |
241 | List<Camera.Size> picSizes = mParams.getSupportedPictureSizes();
242 | Camera.Size picSz = null;
243 |
244 | Collections.sort(picSizes, comparatorBigger);
245 |
246 | for(Camera.Size sz : picSizes) {
247 | Log.i(LOG_TAG, String.format("Supported picture size: %d x %d", sz.width, sz.height));
248 | if(picSz == null || (sz.width >= mPictureWidth && sz.height >= mPictureHeight)) {
249 | picSz = sz;
250 | }
251 | }
252 |
253 | List<Camera.Size> prevSizes = mParams.getSupportedPreviewSizes();
254 | Camera.Size prevSz = null;
255 |
256 | Collections.sort(prevSizes, comparatorBigger);
257 |
258 | for(Camera.Size sz : prevSizes) {
259 | Log.i(LOG_TAG, String.format("Supported preview size: %d x %d", sz.width, sz.height));
260 | if(prevSz == null || (sz.width >= mPreferPreviewWidth && sz.height >= mPreferPreviewHeight)) {
261 | prevSz = sz;
262 | }
263 | }
264 |
265 | List<Integer> frameRates = mParams.getSupportedPreviewFrameRates();
266 |
267 | int fpsMax = 0;
268 |
269 | for(Integer n : frameRates) {
270 | Log.i(LOG_TAG, "Supported frame rate: " + n);
271 | if(fpsMax < n) {
272 | fpsMax = n;
273 | }
274 | }
275 |
276 | mParams.setPreviewSize(prevSz.width, prevSz.height);
277 | mParams.setPictureSize(picSz.width, picSz.height);
278 |
279 | List<String> focusModes = mParams.getSupportedFocusModes();
280 | if(focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)){
281 | mParams.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
282 | }
283 |
284 | previewRate = fpsMax;
285 | mParams.setPreviewFrameRate(previewRate); //set the camera preview frame rate
286 | // mParams.setPreviewFpsRange(20, 60);
287 |
288 | try {
289 | mCameraDevice.setParameters(mParams);
290 | }catch (Exception e) {
291 | e.printStackTrace();
292 | }
293 |
294 |
295 | mParams = mCameraDevice.getParameters();
296 |
297 | Camera.Size szPic = mParams.getPictureSize();
298 | Camera.Size szPrev = mParams.getPreviewSize();
299 |
300 | mPreviewWidth = szPrev.width;
301 | mPreviewHeight = szPrev.height;
302 |
303 | mPictureWidth = szPic.width;
304 | mPictureHeight = szPic.height;
305 |
306 | Log.i(LOG_TAG, String.format("Camera Picture Size: %d x %d", szPic.width, szPic.height));
307 | Log.i(LOG_TAG, String.format("Camera Preview Size: %d x %d", szPrev.width, szPrev.height));
308 | }
309 |
310 | public synchronized void setFocusMode(String focusMode) {
311 |
312 | if(mCameraDevice == null)
313 | return;
314 |
315 | mParams = mCameraDevice.getParameters();
316 | List<String> focusModes = mParams.getSupportedFocusModes();
317 | if(focusModes.contains(focusMode)){
318 | mParams.setFocusMode(focusMode);
319 | }
320 | }
321 |
322 | public synchronized void setPictureSize(int width, int height, boolean isBigger) {
323 |
324 | if(mCameraDevice == null) {
325 | mPictureWidth = width;
326 | mPictureHeight = height;
327 | return;
328 | }
329 |
330 | mParams = mCameraDevice.getParameters();
331 |
332 |
333 | List<Camera.Size> picSizes = mParams.getSupportedPictureSizes();
334 | Camera.Size picSz = null;
335 |
336 | if(isBigger) {
337 | Collections.sort(picSizes, comparatorBigger);
338 | for(Camera.Size sz : picSizes) {
339 | if(picSz == null || (sz.width >= width && sz.height >= height)) {
340 | picSz = sz;
341 | }
342 | }
343 | } else {
344 | Collections.sort(picSizes, comparatorSmaller);
345 | for(Camera.Size sz : picSizes) {
346 | if(picSz == null || (sz.width <= width && sz.height <= height)) {
347 | picSz = sz;
348 | }
349 | }
350 | }
351 |
352 | mPictureWidth = picSz.width;
353 | mPictureHeight= picSz.height;
354 |
355 | try {
356 | mParams.setPictureSize(mPictureWidth, mPictureHeight);
357 | mCameraDevice.setParameters(mParams);
358 | } catch (Exception e) {
359 | e.printStackTrace();
360 | }
361 | }
362 |
363 | public void focusAtPoint(float x, float y, final Camera.AutoFocusCallback callback) {
364 | focusAtPoint(x, y, 0.2f, callback);
365 | }
366 |
367 | public synchronized void focusAtPoint(float x, float y, float radius, final Camera.AutoFocusCallback callback) {
368 | if(mCameraDevice == null) {
369 | Log.e(LOG_TAG, "Error: focus after release.");
370 | return;
371 | }
372 |
373 | mParams = mCameraDevice.getParameters();
374 |
375 | if(mParams.getMaxNumMeteringAreas() > 0) {
376 |
377 | int focusRadius = (int) (radius * 1000.0f);
378 | int left = (int) (x * 2000.0f - 1000.0f) - focusRadius;
379 | int top = (int) (y * 2000.0f - 1000.0f) - focusRadius;
380 |
381 | Rect focusArea = new Rect();
382 | focusArea.left = Math.max(left, -1000);
383 | focusArea.top = Math.max(top, -1000);
384 | focusArea.right = Math.min(left + focusRadius, 1000);
385 | focusArea.bottom = Math.min(top + focusRadius, 1000);
386 | List<Camera.Area> meteringAreas = new ArrayList<Camera.Area>();
387 | meteringAreas.add(new Camera.Area(focusArea, 800));
388 |
389 | try {
390 | mCameraDevice.cancelAutoFocus();
391 | mParams.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
392 | mParams.setFocusAreas(meteringAreas);
393 | mCameraDevice.setParameters(mParams);
394 | mCameraDevice.autoFocus(callback);
395 | } catch (Exception e) {
396 | Log.e(LOG_TAG, "Error: focusAtPoint failed: " + e.toString());
397 | }
398 | } else {
399 | Log.i(LOG_TAG, "The device does not support metering areas...");
400 | try {
401 | mCameraDevice.autoFocus(callback);
402 | } catch (Exception e) {
403 | Log.e(LOG_TAG, "Error: focusAtPoint failed: " + e.toString());
404 | }
405 | }
406 |
407 | }
408 | }
409 |
--------------------------------------------------------------------------------
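The size-selection loops in `initCamera()` and `setPictureSize()` above share one idiom: sort the supported sizes in descending order, then keep overwriting the candidate while it still meets the preferred dimensions, so the final pick is the *smallest* size that satisfies the preference. A minimal off-device sketch of that logic follows; the `Size` class and the sample dimensions are hypothetical stand-ins, since `android.hardware.Camera.Size` cannot be constructed outside an Android device.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

// Standalone sketch of CameraInstance's preview-size selection.
// Size is a stand-in for android.hardware.Camera.Size.
public class SizeSelectionSketch {

    public static class Size {
        public final int width, height;
        public Size(int w, int h) { width = w; height = h; }
    }

    // Mirrors comparatorBigger: descending by width, then by height.
    static final Comparator<Size> BIGGER = new Comparator<Size>() {
        @Override
        public int compare(Size lhs, Size rhs) {
            int w = rhs.width - lhs.width;
            return (w == 0) ? rhs.height - lhs.height : w;
        }
    };

    // Mirrors the loop in initCamera(): after a descending sort, the last
    // candidate whose dimensions still meet the preferred size wins, i.e.
    // the smallest supported size satisfying the preference.
    public static Size pickPreviewSize(List<Size> sizes, int preferW, int preferH) {
        Collections.sort(sizes, BIGGER);
        Size chosen = null;
        for (Size sz : sizes) {
            if (chosen == null || (sz.width >= preferW && sz.height >= preferH)) {
                chosen = sz;
            }
        }
        return chosen;
    }

    public static void main(String[] args) {
        List<Size> sizes = new ArrayList<Size>();
        sizes.add(new Size(640, 480));
        sizes.add(new Size(1920, 1080));
        sizes.add(new Size(1280, 720));
        Size sz = pickPreviewSize(sizes, 640, 640);
        System.out.println(sz.width + "x" + sz.height);
    }
}
```

With a 640x640 preference (the class's default `mPreferPreviewWidth`/`mPreferPreviewHeight`), 1280x720 is chosen here because it is the smallest listed size with both dimensions at least 640.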
/library/src/main/java/org/wysaid/common/Common.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.common;
2 |
3 | import android.graphics.Bitmap;
4 | import android.opengl.GLES11Ext;
5 | import android.opengl.GLES20;
6 | import android.opengl.GLUtils;
7 | import android.util.Log;
8 |
9 | /**
10 | * Created by wangyang on 15/7/27.
11 | */
12 |
13 | public class Common {
14 |
15 | public static final boolean DEBUG = true;
16 | public static final String LOG_TAG = "wysaid";
17 |
18 | public static void checkGLError(final String tag) {
19 | int loopCnt = 0;
20 | for(int err = GLES20.glGetError(); loopCnt < 32 && err != GLES20.GL_NO_ERROR; err = GLES20.glGetError(), ++loopCnt)
21 | {
22 | String msg;
23 | switch (err)
24 | {
25 | case GLES20.GL_INVALID_ENUM:
26 | msg = "invalid enum"; break;
27 | case GLES20.GL_INVALID_FRAMEBUFFER_OPERATION:
28 | msg = "invalid framebuffer operation"; break;
29 | case GLES20.GL_INVALID_OPERATION:
30 | msg = "invalid operation";break;
31 | case GLES20.GL_INVALID_VALUE:
32 | msg = "invalid value";break;
33 | case GLES20.GL_OUT_OF_MEMORY:
34 | msg = "out of memory"; break;
35 | default: msg = "unknown error";
36 | }
37 | Log.e(LOG_TAG, String.format("After tag \"%s\" glGetError %s(0x%x) ", tag, msg, err));
38 | }
39 | }
40 |
41 | private static void _texParamHelper(int type, int filter, int wrap) {
42 | GLES20.glTexParameterf(type, GLES20.GL_TEXTURE_MIN_FILTER, filter);
43 | GLES20.glTexParameterf(type, GLES20.GL_TEXTURE_MAG_FILTER, filter);
44 | GLES20.glTexParameteri(type, GLES20.GL_TEXTURE_WRAP_S, wrap);
45 | GLES20.glTexParameteri(type, GLES20.GL_TEXTURE_WRAP_T, wrap);
46 | }
47 |
48 | public static int genBlankTextureID(int width, int height) {
49 | return genBlankTextureID(width, height, GLES20.GL_LINEAR, GLES20.GL_CLAMP_TO_EDGE);
50 | }
51 |
52 | public static int genBlankTextureID(int width, int height, int filter, int wrap) {
53 | int[] texID = new int[1];
54 | GLES20.glGenTextures(1, texID, 0);
55 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texID[0]);
56 | GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
57 | _texParamHelper(GLES20.GL_TEXTURE_2D, filter, wrap);
58 | return texID[0];
59 | }
60 |
61 | public static int genNormalTextureID(Bitmap bmp) {
62 | return genNormalTextureID(bmp, GLES20.GL_LINEAR, GLES20.GL_CLAMP_TO_EDGE);
63 | }
64 |
65 | public static int genNormalTextureID(Bitmap bmp, int filter, int wrap) {
66 | int[] texID = new int[1];
67 | GLES20.glGenTextures(1, texID, 0);
68 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texID[0]);
69 | GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bmp, 0);
70 | _texParamHelper(GLES20.GL_TEXTURE_2D, filter, wrap);
71 | return texID[0];
72 | }
73 |
74 | public static int genSurfaceTextureID() {
75 | int[] texID = new int[1];
76 | GLES20.glGenTextures(1, texID, 0);
77 | GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texID[0]);
78 | _texParamHelper(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_LINEAR, GLES20.GL_CLAMP_TO_EDGE);
79 | return texID[0];
80 | }
81 |
82 | public static void deleteTextureID(int texID) {
83 | GLES20.glDeleteTextures(1, new int[]{texID}, 0);
84 | }
85 | }
86 |
--------------------------------------------------------------------------------
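The switch in `checkGLError` above decodes `glGetError` codes into readable messages. The same decoding can be exercised off-device, since the OpenGL ES 2.0 error codes have fixed standard values (the hex constants below match the `GLES20` definitions); only the `GLErrorNames` wrapper class is a hypothetical addition for this sketch.

```java
// Standalone sketch of Common.checkGLError's error-code decoding.
// The constants are the standard OpenGL ES 2.0 error values.
public class GLErrorNames {
    static final int GL_NO_ERROR = 0; // also equals GL_FALSE, which the loop condition relies on
    static final int GL_INVALID_ENUM = 0x0500;
    static final int GL_INVALID_VALUE = 0x0501;
    static final int GL_INVALID_OPERATION = 0x0502;
    static final int GL_OUT_OF_MEMORY = 0x0505;
    static final int GL_INVALID_FRAMEBUFFER_OPERATION = 0x0506;

    // Same mapping as the switch inside checkGLError.
    public static String name(int err) {
        switch (err) {
            case GL_INVALID_ENUM: return "invalid enum";
            case GL_INVALID_FRAMEBUFFER_OPERATION: return "invalid framebuffer operation";
            case GL_INVALID_OPERATION: return "invalid operation";
            case GL_INVALID_VALUE: return "invalid value";
            case GL_OUT_OF_MEMORY: return "out of memory";
            default: return "unknown error";
        }
    }

    public static void main(String[] args) {
        // Formats the message the same way checkGLError logs it.
        System.out.println(String.format("glGetError %s(0x%x)", name(0x0502), 0x0502));
    }
}
```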
/library/src/main/java/org/wysaid/common/FrameBufferObject.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.common;
2 |
3 | import android.opengl.GLES20;
4 | import android.util.Log;
5 |
6 | /**
7 | * Created by wangyang on 15/7/27.
8 | */
9 |
10 | public class FrameBufferObject {
11 | private int mFramebufferID;
12 |
13 | public FrameBufferObject() {
14 | int[] buf = new int[1];
15 | GLES20.glGenFramebuffers(1, buf, 0);
16 | mFramebufferID = buf[0];
17 | }
18 |
19 | public void release() {
20 | GLES20.glDeleteFramebuffers(1, new int[]{mFramebufferID}, 0);
21 | }
22 |
23 | public void bind() {
24 | GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFramebufferID);
25 | }
26 |
27 | //binds the texture to this framebuffer's GL_COLOR_ATTACHMENT0
28 | public void bindTexture(int texID) {
29 | bind();
30 | GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, texID, 0);
31 | if(GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE)
32 | {
33 | Log.e(Common.LOG_TAG, "CGE::FrameBuffer::bindTexture2D - Frame buffer is not valid!");
34 | }
35 | }
36 |
37 | }
38 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/common/ProgramObject.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.common;
2 |
3 | import android.opengl.GLES20;
4 | import android.util.Log;
5 |
6 | /**
7 | * Created by wangyang on 15/7/27.
8 | */
9 |
10 |
11 | public class ProgramObject {
12 | public static final String LOG_TAG = Common.LOG_TAG;
13 |
14 | private int mProgramID;
15 | private ShaderObject mVertexShader, mFragmentShader;
16 |
17 | //After default construction, attribute location bindings may be performed;
18 | //call init() afterwards.
19 | public ProgramObject() {
20 | mProgramID = GLES20.glCreateProgram();
21 | }
22 |
23 | public ProgramObject(final String vsh, final String fsh) {
24 | init(vsh, fsh);
25 | }
26 |
27 | public int programID() {
28 | return mProgramID;
29 | }
30 |
31 | public final void release() {
32 | if(mProgramID != 0)
33 | {
34 | GLES20.glDeleteProgram(mProgramID);
35 | mProgramID = 0;
36 | }
37 | }
38 |
39 | public boolean init(final String vsh, final String fsh) {
40 | return init(vsh, fsh, 0);
41 | }
42 |
43 | public boolean init(final String vsh, final String fsh, int programID) {
44 | if(programID == 0)
45 | programID = GLES20.glCreateProgram();
46 |
47 | if(programID == 0) {
48 | Log.e(LOG_TAG, "Invalid Program ID! Check if the GL context is bound!");
49 | return false;
50 | }
50 |
51 | if(mVertexShader != null)
52 | mVertexShader.release();
53 | if(mFragmentShader != null)
54 | mFragmentShader.release();
55 |
56 | mVertexShader = new ShaderObject(vsh, GLES20.GL_VERTEX_SHADER);
57 | mFragmentShader = new ShaderObject(fsh, GLES20.GL_FRAGMENT_SHADER);
58 |
59 | GLES20.glAttachShader(programID, mVertexShader.shaderID());
60 | GLES20.glAttachShader(programID, mFragmentShader.shaderID());
61 | Common.checkGLError("AttachShaders...");
62 | GLES20.glLinkProgram(programID);
63 |
64 | int[] programStatus = {0};
65 | GLES20.glGetProgramiv(programID, GLES20.GL_LINK_STATUS, programStatus, 0);
66 |
67 | // The shader objects can be released once linking is complete.
68 | mVertexShader.release();
69 | mFragmentShader.release();
70 | mVertexShader = null;
71 | mFragmentShader = null;
72 |
73 | if(programStatus[0] != GLES20.GL_TRUE) {
74 | String msg = GLES20.glGetProgramInfoLog(programID);
75 | Log.e(LOG_TAG, msg);
76 | return false;
77 | }
78 |
79 | mProgramID = programID;
80 | return true;
81 | }
82 |
83 | public void bind() {
84 | GLES20.glUseProgram(mProgramID);
85 | }
86 |
87 | public int getUniformLoc(final String name) {
88 | int uniform = GLES20.glGetUniformLocation(mProgramID, name);
89 | if(Common.DEBUG) {
90 | if(uniform < 0)
91 | Log.e(LOG_TAG, String.format("uniform name %s does not exist", name));
92 | }
93 | return uniform;
94 | }
95 |
96 | public void sendUniformf(final String name, float x) {
97 | GLES20.glUniform1f(getUniformLoc(name), x);
98 | }
99 |
100 | public void sendUniformf(final String name, float x, float y) {
101 | GLES20.glUniform2f(getUniformLoc(name), x, y);
102 | }
103 |
104 | public void sendUniformf(final String name, float x, float y, float z) {
105 | GLES20.glUniform3f(getUniformLoc(name), x, y, z);
106 | }
107 |
108 | public void sendUniformf(final String name, float x, float y, float z, float w) {
109 | GLES20.glUniform4f(getUniformLoc(name), x, y, z, w);
110 | }
111 |
112 | public void sendUniformi(final String name, int x) {
113 | GLES20.glUniform1i(getUniformLoc(name), x);
114 | }
115 |
116 | public void sendUniformi(final String name, int x, int y) {
117 | GLES20.glUniform2i(getUniformLoc(name), x, y);
118 | }
119 |
120 | public void sendUniformi(final String name, int x, int y, int z) {
121 | GLES20.glUniform3i(getUniformLoc(name), x, y, z);
122 | }
123 |
124 | public void sendUniformi(final String name, int x, int y, int z, int w) {
125 | GLES20.glUniform4i(getUniformLoc(name), x, y, z, w);
126 | }
127 |
128 | public void sendUniformMat2(final String name, int count, boolean transpose, float[] matrix) {
129 | GLES20.glUniformMatrix2fv(getUniformLoc(name), count, transpose, matrix, 0);
130 | }
131 |
132 | public void sendUniformMat3(final String name, int count, boolean transpose, float[] matrix) {
133 | GLES20.glUniformMatrix3fv(getUniformLoc(name), count, transpose, matrix, 0);
134 | }
135 |
136 | public void sendUniformMat4(final String name, int count, boolean transpose, float[] matrix) {
137 | GLES20.glUniformMatrix4fv(getUniformLoc(name), count, transpose, matrix, 0);
138 | }
139 |
140 | public int attributeLocation(final String name) {
141 | return GLES20.glGetAttribLocation(mProgramID, name);
142 | }
143 |
144 | public void bindAttribLocation(final String name, int index) {
145 | GLES20.glBindAttribLocation(mProgramID, index, name);
146 | }
147 |
148 | /**
149 | * Created by wangyang on 15/7/18.
150 | */
151 | public static class ShaderObject {
152 |
153 | private int mShaderType;
154 | private int mShaderID;
155 |
156 |
157 | public int shaderID() {
158 | return mShaderID;
159 | }
160 |
161 | public ShaderObject() {
162 | mShaderType = 0;
163 | mShaderID = 0;
164 | }
165 |
166 | public ShaderObject(final String shaderCode, final int shaderType) {
167 | init(shaderCode, shaderType);
168 | }
169 |
170 | public boolean init(final String shaderCode, final int shaderType) {
171 | mShaderType = shaderType;
172 | mShaderID = loadShader(shaderType, shaderCode);
173 |
174 | //Debug Only
175 | assert mShaderID != 0 : "Shader Create Failed!";
176 |
177 | if(mShaderID == 0) {
178 | Log.e(LOG_TAG, "glCreateShader Failed!...");
179 | return false;
180 | }
181 |
182 | return true;
183 | }
184 |
185 | public final void release() {
186 | if(mShaderID == 0)
187 | return;
188 | GLES20.glDeleteShader(mShaderID);
189 | mShaderID = 0;
190 | }
191 |
192 | public static int loadShader(int type, final String code) {
193 | int shaderID = GLES20.glCreateShader(type);
194 |
195 | if(shaderID != 0) {
196 | GLES20.glShaderSource(shaderID, code);
197 | GLES20.glCompileShader(shaderID);
198 | int[] compiled = {0};
199 | GLES20.glGetShaderiv(shaderID, GLES20.GL_COMPILE_STATUS, compiled, 0);
200 | if(compiled[0] != GLES20.GL_TRUE)
201 | {
202 | String errMsg = GLES20.glGetShaderInfoLog(shaderID);
203 | Log.e(LOG_TAG, errMsg);
204 | GLES20.glDeleteShader(shaderID);
205 | return 0;
206 | }
207 | }
208 | return shaderID;
209 | }
210 |
211 | }
212 | }
213 |
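The two-phase setup described by the comment on the no-arg constructor (bind attribute locations first, link afterwards) only works if the existing program ID is passed back into `init()`; calling the two-argument `init()` would create a fresh program and drop the bindings. A hedged sketch, assuming a current GL context and GLSL source strings `vshSource`/`fshSource` (both assumptions):

```java
// Sketch only: requires a current GL context.
ProgramObject program = new ProgramObject();       // glCreateProgram only, no linking yet
program.bindAttribLocation("vPosition", 0);        // must happen before linking
if (!program.init(vshSource, fshSource, program.programID())) {
    program.release();                             // link failed; free the program
} else {
    program.bind();                                // glUseProgram
    program.sendUniformf("flipScale", 1.0f, 1.0f); // uniforms require the program bound
}
```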
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/common/SharedContext.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.common;
2 |
3 | import android.annotation.SuppressLint;
4 | import android.opengl.EGL14;
5 | import android.util.Log;
6 |
7 | import javax.microedition.khronos.egl.EGL10;
8 | import javax.microedition.khronos.egl.EGLConfig;
9 | import javax.microedition.khronos.egl.EGLContext;
10 | import javax.microedition.khronos.egl.EGLDisplay;
11 | import javax.microedition.khronos.egl.EGLSurface;
12 | import javax.microedition.khronos.opengles.GL10;
13 |
14 | /**
15 | * Created by wangyang on 15/12/13.
16 | */
17 |
18 | @SuppressLint("InlinedApi")
19 | public class SharedContext {
20 | public static final String LOG_TAG = Common.LOG_TAG;
21 | public static final int EGL_RECORDABLE_ANDROID = 0x3142;
22 |
23 | private EGLContext mContext;
24 | private EGLConfig mConfig;
25 | private EGLDisplay mDisplay;
26 | private EGLSurface mSurface;
27 | private EGL10 mEgl;
28 | private GL10 mGl;
29 |
30 | private static int mBitsR = 8, mBitsG = 8, mBitsB = 8, mBitsA = 8;
31 |
32 | // Note: changing this affects all contexts created afterwards.
33 | static public void setContextColorBits(int r, int g, int b, int a) {
34 | mBitsR = r;
35 | mBitsG = g;
36 | mBitsB = b;
37 | mBitsA = a;
38 | }
39 |
40 | public static SharedContext create() {
41 | return create(EGL10.EGL_NO_CONTEXT, 64, 64, EGL10.EGL_PBUFFER_BIT, null);
42 | }
43 |
44 | public static SharedContext create(int width, int height) {
45 | return create(EGL10.EGL_NO_CONTEXT, width, height, EGL10.EGL_PBUFFER_BIT, null);
46 | }
47 |
48 | public static SharedContext create(EGLContext context, int width, int height) {
49 | return create(context, width, height, EGL10.EGL_PBUFFER_BIT, null);
50 | }
51 |
52 | //contextType: EGL10.EGL_PBUFFER_BIT
53 | // EGL10.EGL_WINDOW_BIT
54 | // EGL10.EGL_PIXMAP_BIT
55 | // EGL_RECORDABLE_ANDROID ( = 0x3142 )
56 | // etc.
57 | public static SharedContext create(EGLContext context, int width, int height, int contextType, Object obj) {
58 |
59 | SharedContext sharedContext = new SharedContext();
60 | if(!sharedContext.initEGL(context, width, height, contextType, obj)) {
61 | sharedContext.release();
62 | sharedContext = null;
63 | }
64 | return sharedContext;
65 | }
66 |
67 | public EGLContext getContext() {
68 | return mContext;
69 | }
70 |
71 | public EGLDisplay getDisplay() {
72 | return mDisplay;
73 | }
74 |
75 | public EGLSurface getSurface() {
76 | return mSurface;
77 | }
78 |
79 | public EGL10 getEGL() {
80 | return mEgl;
81 | }
82 |
83 | public GL10 getGL() {
84 | return mGl;
85 | }
86 |
87 | SharedContext() {}
88 |
89 | public void release() {
90 | Log.i(LOG_TAG, "#### CGESharedGLContext Destroying context... ####");
91 | if(mDisplay != EGL10.EGL_NO_DISPLAY) {
92 | mEgl.eglMakeCurrent(mDisplay, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_CONTEXT);
93 | mEgl.eglDestroyContext(mDisplay, mContext);
94 | mEgl.eglDestroySurface(mDisplay, mSurface);
95 | mEgl.eglTerminate(mDisplay);
96 | }
97 |
98 | mDisplay = EGL10.EGL_NO_DISPLAY;
99 | mSurface = EGL10.EGL_NO_SURFACE;
100 | mContext = EGL10.EGL_NO_CONTEXT;
101 | }
102 |
103 | public void makeCurrent() {
104 | if(!mEgl.eglMakeCurrent(mDisplay, mSurface, mSurface, mContext)) {
105 | Log.e(LOG_TAG, "eglMakeCurrent failed:" + mEgl.eglGetError());
106 | }
107 | }
108 |
109 | public boolean swapBuffers() {
110 | return mEgl.eglSwapBuffers(mDisplay, mSurface);
111 | }
112 |
113 | private boolean initEGL(EGLContext context, int width, int height, int contextType, Object obj) {
114 |
115 | int[] contextAttribList = {
116 | EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
117 | EGL10.EGL_NONE
118 | };
119 |
120 | int[] configSpec = {
121 | EGL10.EGL_SURFACE_TYPE, contextType,
122 | EGL10.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
123 | EGL10.EGL_RED_SIZE, mBitsR, EGL10.EGL_GREEN_SIZE, mBitsG,
124 | EGL10.EGL_BLUE_SIZE, mBitsB, EGL10.EGL_ALPHA_SIZE, mBitsA,
125 | EGL10.EGL_NONE
126 | };
127 |
128 | EGLConfig[] configs = new EGLConfig[1];
129 | int[] numConfig = new int[1];
130 | int[] version = new int[2];
131 |
132 | int[] surfaceAttribList = {
133 | EGL10.EGL_WIDTH, width,
134 | EGL10.EGL_HEIGHT, height,
135 | EGL10.EGL_NONE
136 | };
137 |
138 | mEgl = (EGL10) EGLContext.getEGL();
139 |
140 | if((mDisplay = mEgl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY)) == EGL10.EGL_NO_DISPLAY) {
141 | Log.e(LOG_TAG, String.format("eglGetDisplay() returned error 0x%x", mEgl.eglGetError()));
142 | return false;
143 | }
144 |
145 | if(!mEgl.eglInitialize(mDisplay, version)) {
146 | Log.e(LOG_TAG, String.format("eglInitialize() returned error 0x%x", mEgl.eglGetError()));
147 | return false;
148 | }
149 |
150 | Log.i(LOG_TAG, String.format("eglInitialize - major: %d, minor: %d", version[0], version[1]));
151 |
152 | if(!mEgl.eglChooseConfig(mDisplay, configSpec, configs, 1, numConfig)) {
153 | Log.e(LOG_TAG, String.format("eglChooseConfig() returned error 0x%x", mEgl.eglGetError()));
154 | return false;
155 | }
156 |
157 | Log.i(LOG_TAG, String.format("Config num: %d, has sharedContext: %s", numConfig[0], context == EGL10.EGL_NO_CONTEXT ? "NO" : "YES"));
158 |
159 | mConfig = configs[0];
160 |
161 | mContext = mEgl.eglCreateContext(mDisplay, mConfig,
162 | context, contextAttribList);
163 | if (mContext == EGL10.EGL_NO_CONTEXT) {
164 | Log.e(LOG_TAG, "eglCreateContext Failed!");
165 | return false;
166 | }
167 |
168 | switch (contextType) {
169 | case EGL10.EGL_PIXMAP_BIT:
170 | mSurface = mEgl.eglCreatePixmapSurface(mDisplay, mConfig, obj, surfaceAttribList);
171 | break;
172 | case EGL10.EGL_WINDOW_BIT:
173 | mSurface = mEgl.eglCreateWindowSurface(mDisplay, mConfig, obj, surfaceAttribList);
174 | break;
175 | case EGL10.EGL_PBUFFER_BIT:
176 | case EGL_RECORDABLE_ANDROID:
177 | mSurface = mEgl.eglCreatePbufferSurface(mDisplay, mConfig,
178 | surfaceAttribList);
179 | }
180 |
181 | if (mSurface == EGL10.EGL_NO_SURFACE) {
182 | Log.e(LOG_TAG, "eglCreateSurface Failed! (contextType: " + contextType + ")");
183 | return false;
184 | }
185 |
186 | if (!mEgl.eglMakeCurrent(mDisplay, mSurface, mSurface, mContext)) {
187 | Log.e(LOG_TAG, "eglMakeCurrent failed:" + mEgl.eglGetError());
188 | return false;
189 | }
190 |
191 | int[] clientVersion = new int[1];
192 | mEgl.eglQueryContext(mDisplay, mContext, EGL14.EGL_CONTEXT_CLIENT_VERSION, clientVersion);
193 | Log.i(LOG_TAG, "EGLContext created, client version " + clientVersion[0]);
194 |
195 | mGl = (GL10) mContext.getGL();
196 |
197 | return true;
198 | }
199 | }
200 |
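A typical use of `SharedContext` is spinning up an offscreen EGL context on a worker thread for background texture work. A minimal sketch (device-only — the EGL calls fail on a plain JVM):

```java
// Sketch only: runs on an Android device.
Thread worker = new Thread(new Runnable() {
    @Override
    public void run() {
        SharedContext ctx = SharedContext.create(); // 64x64 pbuffer, no shared context
        if (ctx == null) return;                    // EGL initialization failed
        ctx.makeCurrent();
        // ... GL work here: offscreen filtering, readbacks, etc. ...
        ctx.release();                              // destroys surface, context and display
    }
});
worker.start();
```

To share textures with an on-screen GLSurfaceView, pass its `EGLContext` to `SharedContext.create(context, width, height)` instead of using the no-argument overload.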
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/common/TextureDrawer.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.common;
2 |
3 | import android.opengl.GLES20;
4 | import android.util.Log;
5 |
6 | import java.nio.FloatBuffer;
7 |
8 | /**
9 | * Created by wangyang on 15/8/8.
10 | * A simple direct drawer with flip, scale & rotate
11 | */
12 | public class TextureDrawer {
13 |
14 | private static final String vsh = "" +
15 | "attribute vec2 vPosition;\n"+
16 | "varying vec2 texCoord;\n"+
17 | "uniform mat2 rotation;\n"+
18 | "uniform vec2 flipScale;\n"+
19 | "void main()\n"+
20 | "{\n"+
21 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n"+
22 | " texCoord = flipScale * (vPosition / 2.0 * rotation) + 0.5;\n"+
23 | "}";
24 |
25 | private static final String fsh = "" +
26 | "precision mediump float;\n" +
27 | "varying vec2 texCoord;\n" +
28 | "uniform sampler2D inputImageTexture;\n" +
29 | "void main()\n" +
30 | "{\n" +
31 | " gl_FragColor = texture2D(inputImageTexture, texCoord);\n" +
32 | "}";
33 |
34 | public static final float[] vertices = {-1.0f, -1.0f, 1.0f, -1.0f, 1.0f, 1.0f, -1.0f, 1.0f};
35 | public static final int DRAW_FUNCTION = GLES20.GL_TRIANGLE_FAN;
36 |
37 | private ProgramObject mProgram;
38 | private int mVertBuffer;
39 | private int mRotLoc, mFlipScaleLoc;
40 |
41 | private TextureDrawer() {
42 | }
43 |
44 | protected boolean init() {
45 | mProgram = new ProgramObject();
46 | if(!mProgram.init(vsh, fsh)) {
47 | mProgram.release();
48 | mProgram = null;
49 | return false;
50 | }
51 |
52 | mProgram.bind();
53 |
54 | mRotLoc = mProgram.getUniformLoc("rotation");
55 | mFlipScaleLoc = mProgram.getUniformLoc("flipScale");
56 |
57 | int[] vertBuffer = new int[1];
58 | GLES20.glGenBuffers(1, vertBuffer, 0);
59 | mVertBuffer = vertBuffer[0];
60 |
61 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertBuffer);
62 | FloatBuffer buffer = FloatBuffer.allocate(vertices.length);
63 | buffer.put(vertices).position(0);
64 |
65 | GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, vertices.length * 4, buffer, GLES20.GL_STATIC_DRAW); // 4 bytes per float
66 | setRotation(0.0f);
67 | setFlipScale(1.0f, 1.0f);
68 | return true;
69 | }
70 |
71 |
72 | public static TextureDrawer create() {
73 | TextureDrawer drawer = new TextureDrawer();
74 | if(!drawer.init())
75 | {
76 | Log.e(Common.LOG_TAG, "TextureDrawer create failed!");
77 | drawer.release();
78 | drawer = null;
79 | }
80 | return drawer;
81 | }
82 |
83 | public void release() {
84 | mProgram.release();
85 | GLES20.glDeleteBuffers(1, new int[]{mVertBuffer}, 0);
86 | mProgram = null;
87 | mVertBuffer = 0;
88 | }
89 |
90 | public void drawTexture(int texID) {
91 | drawTexture(texID, GLES20.GL_TEXTURE_2D);
92 | }
93 |
94 | public void drawTexture(int texID, int type) {
95 | GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
96 | GLES20.glBindTexture(type, texID);
97 |
98 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertBuffer);
99 | GLES20.glEnableVertexAttribArray(0);
100 | GLES20.glVertexAttribPointer(0, 2, GLES20.GL_FLOAT, false, 0, 0);
101 | mProgram.bind();
102 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
103 | }
104 |
105 | // Special-purpose helper for external callers.
106 | public void bindVertexBuffer() {
107 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertBuffer);
108 | }
109 |
110 | public void setRotation(float rad) {
111 | _rotate(mRotLoc, rad);
112 | }
113 |
114 | public void setFlipScale(float x, float y) {
115 | mProgram.bind();
116 | GLES20.glUniform2f(mFlipScaleLoc, x, y);
117 | }
118 |
119 | private void _rotate(int location, float rad) {
120 | final float cosRad = (float)Math.cos(rad);
121 | final float sinRad = (float)Math.sin(rad);
122 |
123 | final float[] rotation = new float[] {
124 | cosRad, sinRad,
125 | -sinRad, cosRad
126 | };
127 |
128 | mProgram.bind();
129 | GLES20.glUniformMatrix2fv(location, 1, false, rotation, 0);
130 | }
131 |
132 | }
133 |
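In the vertex shader above, `texCoord = flipScale * (vPosition / 2.0 * rotation) + 0.5`, where the mat2 is uploaded column-major by `_rotate()` and GLSL's `v * mat2` treats `v` as a row vector. The mapping can be checked on the CPU with plain Java (no GL needed); the class name here is illustrative:

```java
public class TexCoordDemo {
    // Mirrors the shader: texCoord = flipScale * (vPosition / 2.0 * rotation) + 0.5,
    // with the rotation matrix column-major {cos, sin, -sin, cos} as in _rotate().
    public static float[] texCoord(float vx, float vy, float rad, float flipX, float flipY) {
        float c = (float) Math.cos(rad), s = (float) Math.sin(rad);
        float hx = vx / 2.0f, hy = vy / 2.0f;   // vPosition / 2.0
        float rx = hx * c + hy * s;             // dot(v, column 0)
        float ry = -hx * s + hy * c;            // dot(v, column 1)
        return new float[]{flipX * rx + 0.5f, flipY * ry + 0.5f};
    }

    public static void main(String[] args) {
        float[] t = texCoord(-1.0f, -1.0f, 0.0f, 1.0f, 1.0f);
        System.out.println(t[0] + ", " + t[1]); // bottom-left vertex -> texCoord (0.0, 0.0)
    }
}
```

At `rad = 0` the quad's corners map onto the unit texture square; a `flipScale` of `(1, -1)` flips the sampling vertically, which is how camera frames are commonly un-mirrored.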
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/myUtils/FileUtil.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.myUtils;
2 |
3 | import android.content.Context;
4 | import android.os.Environment;
5 | import android.util.Log;
6 |
7 | import org.wysaid.common.Common;
8 |
9 | import java.io.File;
10 | import java.io.FileInputStream;
11 | import java.io.FileOutputStream;
12 |
13 | /**
14 | * Created by wangyang on 15/11/27.
15 | */
16 | public class FileUtil {
17 |
18 | public static final String LOG_TAG = Common.LOG_TAG;
19 | public static final File externalStorageDirectory = Environment.getExternalStorageDirectory();
20 | public static String packageFilesDirectory = null;
21 | public static String storagePath = null;
22 | private static String mDefaultFolder = "libCGE";
23 |
24 | public static void setDefaultFolder(String defaultFolder) {
25 | mDefaultFolder = defaultFolder;
26 | }
27 |
28 | public static String getPath() {
29 | return getPath(null);
30 | }
31 |
32 | public static String getPath(Context context) {
33 |
34 | if(storagePath == null) {
35 | storagePath = externalStorageDirectory.getAbsolutePath() + "/" + mDefaultFolder;
36 | File file = new File(storagePath);
37 | if(!file.exists()) {
38 | if(!file.mkdirs()) {
39 | storagePath = getPathInPackage(context, true);
40 | }
41 | }
42 | }
43 |
44 | return storagePath;
45 | }
46 |
47 | public static String getPathInPackage(Context context, boolean grantPermissions) {
48 |
49 | if(context == null || packageFilesDirectory != null)
50 | return packageFilesDirectory;
51 |
52 | // No sdcard available; fall back to the data/data/name.of.package/files directory.
53 | String path = context.getFilesDir() + "/" + mDefaultFolder;
54 | File file = new File(path);
55 |
56 | if(!file.exists()) {
57 | if(!file.mkdirs()) {
58 | Log.e(LOG_TAG, "Failed to create the CGE directory under the package files dir!");
59 | return null;
60 | }
61 |
62 | if(grantPermissions) {
63 |
64 | // Grant permissions on the (hidden) directory.
65 | if (file.setExecutable(true, false)) {
66 | Log.i(LOG_TAG, "Package folder is executable");
67 | }
68 |
69 | if (file.setReadable(true, false)) {
70 | Log.i(LOG_TAG, "Package folder is readable");
71 | }
72 |
73 | if (file.setWritable(true, false)) {
74 | Log.i(LOG_TAG, "Package folder is writable");
75 | }
76 | }
77 | }
78 |
79 | packageFilesDirectory = path;
80 | return packageFilesDirectory;
81 | }
82 |
83 | public static void saveTextContent(String text, String filename) {
84 | Log.i(LOG_TAG, "Saving text : " + filename);
85 |
86 | try {
87 | FileOutputStream fileout = new FileOutputStream(filename);
88 | fileout.write(text.getBytes());
89 | fileout.flush();
90 | fileout.close();
91 | } catch (Exception e) {
92 | Log.e(LOG_TAG, "Error: " + e.getMessage());
93 | }
94 | }
95 |
96 | public static String getTextContent(String filename) {
97 | Log.i(LOG_TAG, "Reading text : " + filename);
98 |
99 | if(filename == null) {
100 | return null;
101 | }
102 |
103 | StringBuilder content = new StringBuilder();
104 | byte[] buffer = new byte[256]; //Read cache.
105 |
106 | try (FileInputStream filein = new FileInputStream(filename)) {
107 |
108 | int len;
109 |
110 | while(true) {
111 | len = filein.read(buffer);
112 |
113 | if(len <= 0)
114 | break;
115 |
116 | content.append(new String(buffer, 0, len));
117 | }
118 |
119 | } catch (Exception e) {
120 | Log.e(LOG_TAG, "Error: " + e.getMessage());
121 | return null;
122 | }
123 |
124 | return content.toString();
126 | }
127 |
128 | }
129 |
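The save/read pair above can be exercised off-device. A plain-Java sketch of the same chunked read loop, accumulating into a `StringBuilder` and using a temp file instead of Android external storage (the class name, temp-file use, and UTF-8 charset are assumptions for illustration — `FileUtil` itself depends on `android.os.Environment`):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class TextRoundTrip {
    public static void save(String text, String filename) throws IOException {
        try (FileOutputStream out = new FileOutputStream(filename)) {
            out.write(text.getBytes("UTF-8"));
        }
    }

    // Same 256-byte chunked loop as FileUtil.getTextContent.
    // (Chunking can split a multi-byte character across reads; fine for ASCII content.)
    public static String load(String filename) throws IOException {
        StringBuilder content = new StringBuilder();
        byte[] buffer = new byte[256];
        try (FileInputStream in = new FileInputStream(filename)) {
            int len;
            while ((len = in.read(buffer)) > 0) {
                content.append(new String(buffer, 0, len, "UTF-8"));
            }
        }
        return content.toString();
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("libCGE", ".txt");
        save("hello libCGE", tmp.getAbsolutePath());
        System.out.println(load(tmp.getAbsolutePath())); // prints "hello libCGE"
        tmp.delete();
    }
}
```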
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/myUtils/ImageUtil.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.myUtils;
2 |
3 | import android.graphics.Bitmap;
4 | import android.media.FaceDetector;
5 | import android.util.Log;
6 |
7 | import java.io.BufferedOutputStream;
8 | import java.io.FileOutputStream;
9 | import java.io.IOException;
10 |
11 | /**
12 | * Created by wangyang on 15/7/27.
13 | */
14 |
15 | public class ImageUtil extends FileUtil {
16 |
17 |
18 | public static void saveBitmap(Bitmap bmp) {
19 | String path = getPath();
20 | long currentTime = System.currentTimeMillis();
21 | String filename = path + "/" + currentTime + ".jpg";
22 | saveBitmap(bmp, filename);
23 | }
24 |
25 | public static void saveBitmap(Bitmap bmp, String filename) {
26 |
27 | Log.i(LOG_TAG, "saving Bitmap : " + filename);
28 |
29 | try {
30 | FileOutputStream fileout = new FileOutputStream(filename);
31 | BufferedOutputStream bufferOutStream = new BufferedOutputStream(fileout);
32 | bmp.compress(Bitmap.CompressFormat.JPEG, 100, bufferOutStream);
33 | bufferOutStream.flush();
34 | bufferOutStream.close();
35 | } catch (IOException e) {
36 | Log.e(LOG_TAG, "Err when saving bitmap...");
37 | e.printStackTrace();
38 | return;
39 | }
40 |
41 | Log.i(LOG_TAG, "Bitmap " + filename + " saved!");
42 | }
43 |
44 | public static class FaceRects {
45 | public int numOfFaces; // number of faces actually detected
46 | public FaceDetector.Face[] faces; // faces.length >= numOfFaces
47 | }
48 |
49 | public static FaceRects findFaceByBitmap(Bitmap bmp) {
50 | return findFaceByBitmap(bmp, 1);
51 | }
52 |
53 | public static FaceRects findFaceByBitmap(Bitmap bmp, int maxFaces) {
54 |
55 | if(bmp == null) {
56 | Log.e(LOG_TAG, "Invalid Bitmap for Face Detection!");
57 | return null;
58 | }
59 |
60 | Bitmap newBitmap = bmp;
61 |
62 | // The face detection API only supports RGB_565 bitmaps (for now).
63 | if(newBitmap.getConfig() != Bitmap.Config.RGB_565) {
64 | newBitmap = newBitmap.copy(Bitmap.Config.RGB_565, false);
65 | }
66 |
67 | FaceRects rects = new FaceRects();
68 | rects.faces = new FaceDetector.Face[maxFaces];
69 |
70 | try {
71 | FaceDetector detector = new FaceDetector(newBitmap.getWidth(), newBitmap.getHeight(), maxFaces);
72 | rects.numOfFaces = detector.findFaces(newBitmap, rects.faces);
73 | } catch (Exception e) {
74 | Log.e(LOG_TAG, "findFaceByBitmap error: " + e.getMessage());
75 | return null;
76 | }
77 |
78 |
79 | if(newBitmap != bmp) {
80 | newBitmap.recycle();
81 | }
82 | return rects;
83 | }
84 |
85 |
86 |
87 | }
88 |
89 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/nativePort/CGEFaceFunctions.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.nativePort;
2 |
3 | import android.graphics.Bitmap;
4 | import android.graphics.PointF;
5 |
6 | import org.wysaid.common.Common;
7 | import org.wysaid.common.SharedContext;
8 |
9 | /**
10 | * Created by wangyang on 15/12/20.
11 | */
12 | public class CGEFaceFunctions {
13 |
14 | static {
15 | NativeLibraryLoader.load();
16 | }
17 |
18 | public static class FaceFeature {
19 |
20 | public FaceFeature() {
21 | leftEyePos = new PointF();
22 | rightEyePos = new PointF();
23 | mouthPos = new PointF();
24 | chinPos = new PointF();
25 | }
26 |
27 | public FaceFeature(PointF leftEye, PointF rightEye, PointF mouth, PointF chin, float imgWidth, float imgHeight) {
28 | leftEyePos = leftEye;
29 | rightEyePos = rightEye;
30 | mouthPos = mouth;
31 | chinPos = chin;
32 | faceImageWidth = imgWidth;
33 | faceImageHeight = imgHeight;
34 | }
35 |
36 | public PointF leftEyePos, rightEyePos;
37 | public PointF mouthPos;
38 | public PointF chinPos;
39 | public float faceImageWidth, faceImageHeight;
40 | }
41 |
42 | public enum AutoLumAdjustMode
43 | {
44 | LumAdjust_NONE,
45 | LumAdjust_FollowHSl,
46 | LumAdjust_OnlyBrightness,
47 | };
48 |
49 | public static Bitmap blendFaceWidthFeatures(Bitmap srcImage, FaceFeature srcFeature, Bitmap dstImage, FaceFeature dstFeature, AutoLumAdjustMode mode, SharedContext context) {
50 | SharedContext ctx = context == null ? SharedContext.create() : context;
51 | ctx.makeCurrent();
52 |
53 | int srcTexID = Common.genNormalTextureID(srcImage);
54 | int dstTexID = Common.genNormalTextureID(dstImage);
55 |
56 | if(srcTexID == 0 || dstTexID == 0) {
57 | if(context == null) ctx.release(); // avoid leaking the temporary context
58 | return null;
59 | }
59 |
60 | float[] srcFaceFeature = {
61 | srcFeature.leftEyePos.x, srcFeature.leftEyePos.y,
62 | srcFeature.rightEyePos.x, srcFeature.rightEyePos.y,
63 | srcFeature.mouthPos.x, srcFeature.mouthPos.y,
64 | srcFeature.chinPos.x, srcFeature.chinPos.y,
65 | srcFeature.faceImageWidth, srcFeature.faceImageHeight,
66 | };
67 |
68 | float[] dstFaceFeature = {
69 | dstFeature.leftEyePos.x, dstFeature.leftEyePos.y,
70 | dstFeature.rightEyePos.x, dstFeature.rightEyePos.y,
71 | dstFeature.mouthPos.x, dstFeature.mouthPos.y,
72 | dstFeature.chinPos.x, dstFeature.chinPos.y,
73 | dstFeature.faceImageWidth, dstFeature.faceImageHeight,
74 | };
75 |
76 | Bitmap result = nativeBlendFaceWithFeatures(srcTexID, srcFaceFeature, dstTexID, dstFaceFeature, mode.ordinal());
77 |
78 | if(context == null) {
79 | ctx.release();
80 | }
81 |
82 | return result;
83 | }
84 |
85 |
86 | ////////////////////////////////////
87 |
88 | private static native Bitmap nativeBlendFaceWithFeatures(int srcTexID, float[] srcFeature, int dstTexID, float[] dstFeature, int mode);
89 |
90 | }
91 |
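Putting `ImageUtil` and `CGEFaceFunctions` together: `FaceDetector.Face` only reports an eye midpoint and eye distance, so the eye/mouth/chin points needed by `FaceFeature` have to be estimated. A hedged Android-only sketch (the offset factors below are heuristics for illustration, not values from the library; `srcBmp`, `dstBmp`, and `dstFeature` are assumed to exist):

```java
// Sketch only: runs on an Android device; offsets are heuristic assumptions.
ImageUtil.FaceRects rects = ImageUtil.findFaceByBitmap(srcBmp);
if (rects != null && rects.numOfFaces > 0) {
    android.graphics.PointF mid = new android.graphics.PointF();
    rects.faces[0].getMidPoint(mid);          // midpoint between the eyes
    float d = rects.faces[0].eyesDistance();
    CGEFaceFunctions.FaceFeature srcFeature = new CGEFaceFunctions.FaceFeature(
            new android.graphics.PointF(mid.x - d / 2, mid.y),    // left eye
            new android.graphics.PointF(mid.x + d / 2, mid.y),    // right eye
            new android.graphics.PointF(mid.x, mid.y + d),        // mouth (heuristic)
            new android.graphics.PointF(mid.x, mid.y + d * 1.5f), // chin (heuristic)
            srcBmp.getWidth(), srcBmp.getHeight());
    android.graphics.Bitmap result = CGEFaceFunctions.blendFaceWidthFeatures(
            srcBmp, srcFeature, dstBmp, dstFeature,
            CGEFaceFunctions.AutoLumAdjustMode.LumAdjust_NONE,
            null); // null -> a temporary GL context is created and released internally
}
```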
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/nativePort/CGEFrameRecorder.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.nativePort;
2 |
3 | //import java.nio.ByteBuffer;
4 | //import java.nio.ShortBuffer;
5 |
6 | /**
7 | * Created by wangyang on 15/7/29.
8 | */
9 |
10 | //A wrapper for native class FrameRecorder
11 | public class CGEFrameRecorder extends CGEFrameRenderer {
12 |
13 | // static {
14 | // NativeLibraryLoader.load();
15 | // }
16 |
17 | // public CGEFrameRecorder() {
18 | // super(0); // prevent double initialization
19 | // mNativeAddress = nativeCreateRecorder();
20 | // }
21 |
22 | // public void enableFaceDetectWithDefaultFilter(boolean shouldDetect) {
23 | // if(shouldDetect) {
24 | // nativeSetTrackedFilter(mNativeAddress, "@beautify bilateral 100 6.8 1 @adjust brightness 0.14 @adjust contrast 1.12 ");
25 | // } else {
26 | // nativeSetTrackedFilter(mNativeAddress, "");
27 | // }
28 | // }
29 | //
30 | // public void setTrackedFilterIntensity(float intensity) {
31 | // nativeSetTrackedFilterIntensity(mNativeAddress, intensity);
32 | // }
33 | //
34 | // public void setFaceArea(float x, float y, float gx, float gy) {
35 | // nativeSetFaceArea(mNativeAddress, x, y, gx, gy);
36 | // }
37 |
38 | ///////////////// Video recording ////////////////////
39 |
40 | // public boolean startRecording(int fps, String filename) {
41 | // if(mNativeAddress != null)
42 | // return nativeStartRecording(mNativeAddress, fps, filename);
43 | // return false;
44 | // }
45 | //
46 | // public boolean isRecordingStarted() {
47 | // if(mNativeAddress != null)
48 | // return nativeIsRecordingStarted(mNativeAddress);
49 | // return false;
50 | // }
51 | //
52 | // public boolean endRecording(boolean shouldSave) {
53 | // if(mNativeAddress != null)
54 | // return nativeEndRecording(mNativeAddress, shouldSave);
55 | // return false;
56 | // }
57 | //
58 | // public void pauseRecording() {
59 | // if(mNativeAddress != null)
60 | // nativePauseRecording(mNativeAddress);
61 | // }
62 | //
63 | // public boolean isRecordingPaused() {
64 | // if(mNativeAddress != null)
65 | // return nativeIsRecordingPaused(mNativeAddress);
66 | // return false;
67 | // }
68 | //
69 | // public boolean resumeRecording() {
70 | // if(mNativeAddress != null)
71 | // return nativeResumeRecording(mNativeAddress);
72 | // return false;
73 | // }
74 | //
75 | // public double getTimestamp() {
76 | // if(mNativeAddress != null)
77 | // return nativeGetTimestamp(mNativeAddress);
78 | // return 0.0;
79 | // }
80 | //
81 | // public double getVideoStreamtime() {
82 | // if(mNativeAddress != null)
83 | // return nativeGetVideoStreamtime(mNativeAddress);
84 | // return 0.0;
85 | // }
86 | //
87 | // public double getAudioStreamtime() {
88 | // if(mNativeAddress != null)
89 | // return nativeGetAudioStreamtime(mNativeAddress);
90 | // return 0.0;
91 | // }
92 | //
93 | // public void setTempDir(String dir) {
94 | // if(mNativeAddress != null)
95 | // nativeSetTempDir(mNativeAddress, dir);
96 | // }
97 | //
98 | // // must be called on the GPU render thread
99 | // public void recordImageFrame() {
100 | // if(mNativeAddress != null)
101 | // nativeRecordImageFrame(mNativeAddress);
102 | // }
103 | //
104 | // // the caller must drive its own read loop
105 | // public void recordAudioFrame(ShortBuffer audioBuffer, int bufferLen) {
106 | // if(mNativeAddress != null)
107 | // nativeRecordAudioFrame(mNativeAddress, audioBuffer, bufferLen);
108 | // }
109 |
110 | ///////////////// private ///////////////////////
111 |
112 | // private native ByteBuffer nativeCreateRecorder();
113 |
114 | // Face beautification
115 |
116 | // private native void nativeSetTrackedFilter(ByteBuffer holder, String config);
117 | // private native void nativeSetTrackedFilterIntensity(ByteBuffer holder, float intensity);
118 | // private native void nativeSetFaceArea(ByteBuffer holder, float x, float y, float gx, float gy);
119 |
120 | ///////////////// Video recording ////////////////////
121 | // private native boolean nativeStartRecording(ByteBuffer holder, int fps, String filename);
122 | // private native boolean nativeIsRecordingStarted(ByteBuffer holder);
123 | // private native boolean nativeEndRecording(ByteBuffer holder, boolean shouldSave);
124 | // private native void nativePauseRecording(ByteBuffer holder);
125 | // private native boolean nativeIsRecordingPaused(ByteBuffer holder);
126 | // private native boolean nativeResumeRecording(ByteBuffer holder);
127 | // private native double nativeGetTimestamp(ByteBuffer holder);
128 | //
129 | // private native double nativeGetVideoStreamtime(ByteBuffer holder);
130 | // private native double nativeGetAudioStreamtime(ByteBuffer holder);
131 | //
132 | // private native void nativeSetTempDir(ByteBuffer holder, String dir);
133 | //
134 | // private native void nativeRecordImageFrame(ByteBuffer holder);
135 | // private native void nativeRecordAudioFrame(ByteBuffer holder, ShortBuffer audioBuffer, int bufferLen);
136 |
137 | }
138 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/nativePort/CGEFrameRenderer.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.nativePort;
2 |
3 | import java.nio.ByteBuffer;
4 |
5 | /**
6 | * Created by wangyang on 15/11/26.
7 | */
8 |
9 | //A wrapper for the native class FrameRenderer
10 | public class CGEFrameRenderer {
11 |
12 | static {
13 | NativeLibraryLoader.load();
14 | }
15 |
16 | protected ByteBuffer mNativeAddress;
17 |
18 | public CGEFrameRenderer() {
19 | mNativeAddress = nativeCreateRenderer();
20 | }
21 |
22 | //Avoid 'nativeCreateRenderer' being called.
23 | protected CGEFrameRenderer(int dummy) {
24 |
25 | }
26 |
27 | //srcWidth & srcHeight stand for the external texture's width & height.
28 | //dstWidth & dstHeight stand for the recording resolution (default 640*480).
29 | //dstWidth/dstHeight should not be changed after "init()" is called.
30 | //srcWidth/srcHeight can be changed by calling the "srcResize" function.
31 | public boolean init(int srcWidth, int srcHeight, int dstWidth, int dstHeight) {
32 | if(mNativeAddress != null)
33 | return nativeInit(mNativeAddress, srcWidth, srcHeight, dstWidth, dstHeight);
34 | return false;
35 | }
36 |
37 | //Will affect the framebuffer.
38 | public void update(int externalTexture, float[] transformMatrix) {
39 | if(mNativeAddress != null)
40 | nativeUpdate(mNativeAddress, externalTexture, transformMatrix);
41 | }
42 |
43 | //Won't affect the framebuffer.
44 | //The arguments specify the viewport.
45 | public void render(int x, int y, int width, int height) {
46 | if(mNativeAddress != null)
47 | nativeRender(mNativeAddress, x, y, width, height);
48 | }
49 |
50 | public void drawCache() {
51 | if(mNativeAddress != null)
52 | nativeDrawCache(mNativeAddress);
53 | }
54 |
55 | //set the rotation of the camera texture
56 | public void setSrcRotation(float rad) {
57 | if(mNativeAddress != null)
58 | nativeSetSrcRotation(mNativeAddress, rad);
59 | }
60 |
61 | //set the flip/scaling for the camera texture
62 | public void setSrcFlipScale(float x, float y) {
63 | if(mNativeAddress != null)
64 | nativeSetSrcFlipScale(mNativeAddress, x, y);
65 | }
66 |
67 | //set the render result's rotation
68 | public void setRenderRotation(float rad) {
69 | if(mNativeAddress != null)
70 | nativeSetRenderRotation(mNativeAddress, rad);
71 | }
72 |
73 | //set the render result's flip/scaling
74 | public void setRenderFlipScale(float x, float y) {
75 | if(mNativeAddress != null)
76 | nativeSetRenderFlipScale(mNativeAddress, x, y);
77 | }
78 |
79 | //Set the filters with a config string ("Width" in the method name is kept for JNI compatibility).
80 | public void setFilterWidthConfig(final String config) {
81 | if(mNativeAddress != null)
82 | nativeSetFilterWidthConfig(mNativeAddress, config);
83 | }
84 |
85 | //set the mask rotation (radian)
86 | public void setMaskRotation(float rot) {
87 | if(mNativeAddress != null)
88 | nativeSetMaskRotation(mNativeAddress, rot);
89 | }
90 |
91 | //set the mask flipscale
92 | public void setMaskFlipScale(float x, float y) {
93 | if(mNativeAddress != null)
94 | nativeSetMaskFlipScale(mNativeAddress, x, y);
95 |
96 | }
97 |
98 |
99 | //set the intensity of the filter
100 | public void setFilterIntensity(float value) {
101 | if(mNativeAddress != null)
102 | nativeSetFilterIntensity(mNativeAddress, value);
103 | }
104 |
105 | public void srcResize(int width, int height) {
106 | if(mNativeAddress != null)
107 | nativeSrcResize(mNativeAddress, width, height);
108 | }
109 |
110 | public void release() {
111 | if(mNativeAddress != null) {
112 | nativeRelease(mNativeAddress);
113 | mNativeAddress = null;
114 | }
115 | }
116 |
117 |
118 | public void setMaskTexture(int texID, float aspectRatio) {
119 | if(mNativeAddress != null)
120 | nativeSetMaskTexture(mNativeAddress, texID, aspectRatio);
121 | }
122 |
123 | public void setMaskTextureRatio(float aspectRatio) {
124 | if(mNativeAddress != null)
125 | nativeSetMaskTextureRatio(mNativeAddress, aspectRatio);
126 | }
127 |
128 | public int queryBufferTexture() {
129 | if(mNativeAddress != null)
130 | return nativeQueryBufferTexture(mNativeAddress);
131 | return 0;
132 | }
133 |
134 | ///////////////// protected ///////////////////////
135 |
136 | protected native ByteBuffer nativeCreateRenderer();
137 | protected native boolean nativeInit(ByteBuffer holder, int srcWidth, int srcHeight, int dstWidth, int dstHeight);
138 | protected native void nativeUpdate(ByteBuffer holder, int externalTexture, float[] transformMatrix);
139 |
140 | protected native void nativeRender(ByteBuffer holder, int x, int y, int width, int height);
141 | protected native void nativeDrawCache(ByteBuffer holder);
142 |
143 | protected native void nativeSetSrcRotation(ByteBuffer holder, float rad);
144 | protected native void nativeSetSrcFlipScale(ByteBuffer holder, float x, float y);
145 | protected native void nativeSetRenderRotation(ByteBuffer holder, float rad);
146 | protected native void nativeSetRenderFlipScale(ByteBuffer holder, float x, float y);
147 | protected native void nativeSetFilterWidthConfig(ByteBuffer holder, String config);
148 | protected native void nativeSetFilterIntensity(ByteBuffer holder, float value);
149 | protected native void nativeSetMaskRotation(ByteBuffer holder, float value);
150 | protected native void nativeSetMaskFlipScale(ByteBuffer holder, float x, float y);
151 |
152 | protected native void nativeSrcResize(ByteBuffer holder, int width, int height);
153 |
154 | protected native void nativeSetMaskTexture(ByteBuffer holder, int texID, float aspectRatio);
155 | protected native void nativeSetMaskTextureRatio(ByteBuffer holder, float aspectRatio);
156 |
157 | protected native void nativeRelease(ByteBuffer holder);
158 |
159 | protected native int nativeQueryBufferTexture(ByteBuffer holder);
160 |
161 | }
162 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/nativePort/CGEImageHandler.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.nativePort;
2 |
3 | import android.graphics.Bitmap;
4 |
5 | import java.nio.ByteBuffer;
6 |
7 | /**
8 | * Created by wysaid on 15/12/25.
9 | * Mail: admin@wysaid.org
10 | * blog: wysaid.org
11 | */
12 |
13 | //Java wrapper for the C++ native class 'CGEImageHandlerAndroid'.
14 |
15 | public class CGEImageHandler {
16 |
17 | static {
18 | NativeLibraryLoader.load();
19 | }
20 |
21 | protected ByteBuffer mNativeAddress;
22 |
23 | public CGEImageHandler() {
24 | mNativeAddress = nativeCreateHandler();
25 | }
26 |
27 | public boolean initWidthBitmap(Bitmap bmp) {
28 | if(bmp == null)
29 | return false;
30 |
31 | if(bmp.getConfig() != Bitmap.Config.ARGB_8888) {
32 | bmp = bmp.copy(Bitmap.Config.ARGB_8888, false);
33 | }
34 |
35 | return nativeInitWithBitmap(mNativeAddress, bmp);
36 | }
37 |
38 | public Bitmap getResultBitmap() {
39 | return nativeGetResultBitmap(mNativeAddress);
40 | }
41 |
42 | public void setDrawerRotation(float rad) {
43 | nativeSetDrawerRotation(mNativeAddress, rad);
44 | }
45 |
46 | public void setDrawerFlipScale(float x, float y) {
47 | nativeSetDrawerFlipScale(mNativeAddress, x, y);
48 | }
49 |
50 | public void setFilterWithConfig(String config) {
51 | nativeSetFilterWithConfig(mNativeAddress, config);
52 | }
53 |
54 | public void setFilterIntensity(float intensity) {
55 | nativeSetFilterIntensity(mNativeAddress, intensity);
56 | }
57 |
58 | public void drawResult() {
59 | nativeDrawResult(mNativeAddress);
60 | }
61 |
62 | public void bindTargetFBO() {
63 | nativeBindTargetFBO(mNativeAddress);
64 | }
65 |
66 | public void setAsTarget() {
67 | nativeSetAsTarget(mNativeAddress);
68 | }
69 |
70 | public void swapBufferFBO() {
71 | nativeSwapBufferFBO(mNativeAddress);
72 | }
73 |
74 | public void revertImage() {
75 | nativeRevertImage(mNativeAddress);
76 | }
77 |
78 | public void processingFilters() {
79 | nativeProcessingFilters(mNativeAddress);
80 | }
81 |
82 | public void release() {
83 | if(mNativeAddress != null) {
84 | nativeRelease(mNativeAddress);
85 | mNativeAddress = null;
86 | }
87 | }
88 |
89 | ///////////////// protected ///////////////////////
90 |
91 | protected native ByteBuffer nativeCreateHandler();
92 | protected native boolean nativeInitWithBitmap(ByteBuffer holder, Bitmap bmp);
93 | protected native Bitmap nativeGetResultBitmap(ByteBuffer holder);
94 |
95 | protected native void nativeSetDrawerRotation(ByteBuffer holder, float rad);
96 | protected native void nativeSetDrawerFlipScale(ByteBuffer holder, float x, float y);
97 | protected native boolean nativeSetFilterWithConfig(ByteBuffer holder, String config);
98 | protected native void nativeSetFilterIntensity(ByteBuffer holder, float value);
99 |
100 | protected native void nativeDrawResult(ByteBuffer holder);
101 | protected native void nativeBindTargetFBO(ByteBuffer holder);
102 | protected native void nativeSetAsTarget(ByteBuffer holder);
103 | protected native void nativeSwapBufferFBO(ByteBuffer holder);
104 | protected native void nativeRevertImage(ByteBuffer holder);
105 | protected native void nativeProcessingFilters(ByteBuffer holder);
106 |
107 | protected native void nativeRelease(ByteBuffer holder);
108 | }
109 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/nativePort/CGENativeLibrary.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Created by wysaid on 15/7/8.
3 | * Mail: admin@wysaid.org
4 | * blog: wysaid.org
5 | */
6 |
7 | package org.wysaid.nativePort;
8 |
9 | import android.graphics.Bitmap;
10 | import android.opengl.GLES20;
11 | import android.opengl.GLUtils;
12 | import android.util.Log;
13 |
14 | import org.wysaid.common.Common;
15 |
16 | public class CGENativeLibrary {
17 |
18 | static {
19 | NativeLibraryLoader.load();
20 | }
21 |
22 | public interface LoadImageCallback {
23 | Bitmap loadImage(String name, Object arg);
24 | void loadImageOK(Bitmap bmp, Object arg);
25 | }
26 |
27 | static LoadImageCallback loadImageCallback;
28 | static Object callbackArg;
29 |
30 | public static void setLoadImageCallback(LoadImageCallback callback, Object arg) {
31 | loadImageCallback = callback;
32 | callbackArg = arg;
33 | }
34 |
35 | public static class TextureResult {
36 | int texID;
37 | int width, height;
38 | }
39 |
40 | //Called from JNI.
41 | public static TextureResult loadTextureByName(String sourceName) {
42 | if(loadImageCallback == null) {
43 | Log.i(Common.LOG_TAG, "The loading callback is not set!");
44 | return null;
45 | }
46 |
47 | Bitmap bmp = loadImageCallback.loadImage(sourceName, callbackArg);
48 |
49 | if(bmp == null) {
50 | return null;
51 | }
52 |
53 | TextureResult result = new TextureResult();
54 | int[] texID = new int[1];
55 | GLES20.glGenTextures(1, texID, 0);
56 | result.texID = texID[0];
57 | result.width = bmp.getWidth();
58 | result.height = bmp.getHeight();
59 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, result.texID);
60 | GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bmp, 0);
61 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
62 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
63 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
64 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
65 |
66 | loadImageCallback.loadImageOK(bmp, callbackArg);
67 | return result;
68 | }
69 |
70 | public static Bitmap filterImage_MultipleEffects(Bitmap bmp, String config, float intensity) {
71 | if(config == null || config.isEmpty()) {
72 | return bmp;
73 | }
74 | return cgeFilterImage_MultipleEffects(bmp, config, intensity);
75 | }
76 |
77 | public static void filterImage_MultipleEffectsWriteBack(Bitmap bmp, String config, float intensity) {
78 | if(config != null && !config.isEmpty()) {
79 | cgeFilterImage_MultipleEffectsWriteBack(bmp, config, intensity);
80 | }
81 | }
82 |
83 | // Applies multiple filter effects described by the config string; returns a bitmap of the same size.
84 | // intensity is the filter strength, in [0, 1].
85 | public static native Bitmap cgeFilterImage_MultipleEffects(Bitmap bmp, String config, float intensity);
86 |
87 | // Same as above, but writes the result back into the given bitmap; returns nothing.
88 | public static native void cgeFilterImage_MultipleEffectsWriteBack(Bitmap bmp, String config, float intensity);
89 |
90 | ////////////////////////////////////
91 |
92 | // private static native boolean cgeContextInit();
93 | // private static native void cgeContextRelease();
94 |
95 |
96 |
97 | }
98 |
--------------------------------------------------------------------------------
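A note on the empty-config guard in `filterImage_MultipleEffects`: in Java, `==` on Strings compares references, not content, so the check should be content-based (`isEmpty()`). A minimal standalone sketch of the difference (`ConfigCheckDemo` and its `isBlank` helper are hypothetical, not part of the library):

```java
// Demonstrates why an empty-string guard must use isEmpty()/equals():
// the == operator tests reference identity, not String content.
class ConfigCheckDemo {

    // Hypothetical helper mirroring the intent of the guard in
    // filterImage_MultipleEffects: treat null or empty configs as "no filter".
    static boolean isBlank(String config) {
        return config == null || config.isEmpty();
    }

    static boolean referenceEquals(String a, String b) {
        return a == b; // identity comparison; unreliable for content checks
    }
}
```

`referenceEquals("", new String(""))` is false even though both strings are empty, which is exactly why the content-based check is needed.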
/library/src/main/java/org/wysaid/nativePort/FFmpegNativeLibrary.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.nativePort;
2 |
3 | /**
4 | * Created by wangyang on 15/7/30.
5 | */
6 | public class FFmpegNativeLibrary {
7 | static {
8 | NativeLibraryLoader.load();
9 | }
10 |
11 |
12 | //////////////////////////////////////////
13 | // public static native void avRegisterAll();
14 |
15 | }
16 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/nativePort/NativeLibraryLoader.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.nativePort;
2 |
3 | /**
4 | * Created by wangyang on 15/7/30.
5 | */
6 | public class NativeLibraryLoader {
7 |
8 | private static boolean mLibraryLoaded = false;
9 |
10 | public static synchronized void load() {
11 | if(mLibraryLoaded)
12 | return;
13 | // System.loadLibrary("gnustl_shared");
14 | // System.loadLibrary("x264.142");
15 | // System.loadLibrary("ffmpeg");
16 | System.loadLibrary("CGE");
17 | // FFmpegNativeLibrary.avRegisterAll();
18 | mLibraryLoaded = true; //set only after a successful load, so a failed attempt can be retried
19 | }
20 |
21 | }
22 |
--------------------------------------------------------------------------------
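The load-once pattern in `NativeLibraryLoader` is safest when the method is `synchronized` and the flag is set only after the load succeeds. A runnable sketch of that pattern, with `System.loadLibrary` replaced by a counter so it can run anywhere (`OnceLoader` is hypothetical):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Load-once sketch in the spirit of NativeLibraryLoader.load():
// synchronized guards against concurrent first calls, and the flag is
// flipped only after the action succeeds so a failure can be retried.
class OnceLoader {
    private static boolean loaded = false;
    static final AtomicInteger loadCount = new AtomicInteger();

    static synchronized void load() {
        if (loaded)
            return;
        loadCount.incrementAndGet(); // stands in for System.loadLibrary("CGE")
        loaded = true;
    }
}
```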
/library/src/main/java/org/wysaid/texUtils/TextureRenderer.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | import android.opengl.GLES11Ext;
4 | import android.opengl.GLES20;
5 | import android.util.Log;
6 |
7 | import org.wysaid.common.Common;
8 | import org.wysaid.common.ProgramObject;
9 |
10 | import java.nio.FloatBuffer;
11 |
12 | /**
13 | * Created by wangyang on 15/7/23.
14 | */
15 | public abstract class TextureRenderer {
16 |
17 | public static final String LOG_TAG = Common.LOG_TAG;
18 |
19 | //Initialize the program object, etc.
20 | public abstract boolean init(boolean isExternalOES);
21 |
22 | //Must be called explicitly on the matching GL context; do not wait for finalize.
23 | public abstract void release();
24 |
25 | public abstract void renderTexture(int texID, Viewport viewport);
26 |
27 | public abstract void setTextureSize(int width, int height);
28 |
29 | public abstract String getVertexShaderString();
30 |
31 | public abstract String getFragmentShaderString();
32 |
33 | public static class Viewport {
34 | public int x, y;
35 | public int width, height;
36 | public Viewport() {}
37 | public Viewport(int _x, int _y, int _width, int _height) {
38 | x = _x;
39 | y = _y;
40 | width = _width;
41 | height = _height;
42 | }
43 | }
44 |
45 | ////////////////////////////////////////////////////////////////
46 |
47 | protected static final String REQUIRE_STRING_EXTERNAL_OES = "#extension GL_OES_EGL_image_external : require\n";
48 | protected static final String SAMPLER2D_VAR_EXTERNAL_OES = "samplerExternalOES";
49 | protected static final String SAMPLER2D_VAR = "sampler2D";
50 |
51 | protected static final String vshDrawDefault = "" +
52 | "attribute vec2 vPosition;\n" +
53 | "varying vec2 texCoord;\n" +
54 | "uniform mat4 transform;\n" +
55 | "uniform mat2 rotation;\n" +
56 | "uniform vec2 flipScale;\n" +
57 | "void main()\n" +
58 | "{\n" +
59 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n" +
60 | " vec2 coord = flipScale * (vPosition / 2.0 * rotation) + 0.5;\n" +
61 | " texCoord = (transform * vec4(coord, 0.0, 1.0)).xy;\n" +
62 | "}";
63 |
64 |
65 | protected static final String POSITION_NAME = "vPosition";
66 | protected static final String ROTATION_NAME = "rotation";
67 | protected static final String FLIPSCALE_NAME = "flipScale";
68 | protected static final String TRANSFORM_NAME = "transform";
69 |
70 | public static final float[] vertices = {-1.0f, -1.0f, 1.0f, -1.0f, 1.0f, 1.0f, -1.0f, 1.0f};
71 | public static final int DRAW_FUNCTION = GLES20.GL_TRIANGLE_FAN;
72 |
73 | protected int TEXTURE_2D_BINDABLE;
74 |
75 | protected int mVertexBuffer;
76 | protected ProgramObject mProgram;
77 |
78 | protected int mTextureWidth, mTextureHeight;
79 |
80 | protected int mRotationLoc, mFlipScaleLoc, mTransformLoc;
81 |
82 | //Set the view rotation in radians -- when recording, usually an integer multiple of PI / 2 (i.e. 90°).
83 | public void setRotation(float rad) {
84 | final float cosRad = (float)Math.cos(rad);
85 | final float sinRad = (float)Math.sin(rad);
86 |
87 | float rot[] = new float[] {
88 | cosRad, sinRad,
89 | -sinRad, cosRad
90 | };
91 |
92 | assert mProgram != null : "setRotation must not be called before init!";
93 |
94 | mProgram.bind();
95 | GLES20.glUniformMatrix2fv(mRotationLoc, 1, false, rot, 0);
96 | }
97 |
98 | public void setFlipscale(float x, float y) {
99 | mProgram.bind();
100 | GLES20.glUniform2f(mFlipScaleLoc, x, y);
101 | }
102 |
103 | public void setTransform(float[] matrix) {
104 | mProgram.bind();
105 | GLES20.glUniformMatrix4fv(mTransformLoc, 1, false, matrix, 0);
106 | }
107 |
108 | protected boolean setProgramDefualt(String vsh, String fsh, boolean isExternalOES) {
109 | TEXTURE_2D_BINDABLE = isExternalOES ? GLES11Ext.GL_TEXTURE_EXTERNAL_OES : GLES20.GL_TEXTURE_2D;
110 | mProgram = new ProgramObject();
111 | mProgram.bindAttribLocation(POSITION_NAME, 0);
112 | String fshResult = (isExternalOES ? REQUIRE_STRING_EXTERNAL_OES : "") + String.format(fsh, isExternalOES ? SAMPLER2D_VAR_EXTERNAL_OES : SAMPLER2D_VAR);
113 | if(mProgram.init(vsh, fshResult)) {
114 | mRotationLoc = mProgram.getUniformLoc(ROTATION_NAME);
115 | mFlipScaleLoc = mProgram.getUniformLoc(FLIPSCALE_NAME);
116 | mTransformLoc = mProgram.getUniformLoc(TRANSFORM_NAME);
117 | setRotation(0.0f);
118 | setFlipscale(1.0f, 1.0f);
119 | setTransform(new float[]{
120 | 1.0f, 0.0f, 0.0f, 0.0f,
121 | 0.0f, 1.0f, 0.0f, 0.0f,
122 | 0.0f, 0.0f, 1.0f, 0.0f,
123 | 0.0f, 0.0f, 0.0f, 1.0f
124 | });
125 | return true;
126 | }
127 | return false;
128 | }
129 |
130 | protected void defaultInitialize() {
131 | int[] vertexBuffer = new int[1];
132 | GLES20.glGenBuffers(1, vertexBuffer, 0);
133 | mVertexBuffer = vertexBuffer[0];
134 |
135 | if(mVertexBuffer == 0) {
136 | Log.e(LOG_TAG, "Invalid VertexBuffer!");
137 | }
138 |
139 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertexBuffer);
140 | FloatBuffer buffer = FloatBuffer.allocate(vertices.length);
141 | buffer.put(vertices).position(0);
142 | GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, vertices.length * 4, buffer, GLES20.GL_STATIC_DRAW); //8 floats * 4 bytes = 32
143 | }
144 | }
145 |
--------------------------------------------------------------------------------
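`setRotation` uploads `{cos, sin, -sin, cos}` column-major, and `vshDrawDefault` computes `vPosition / 2.0 * rotation`, i.e. GLSL treats the position as a row vector multiplied from the left. A CPU-side sketch of that exact multiplication (`RotationDemo` is hypothetical):

```java
// Replicates on the CPU what vshDrawDefault does with the 2x2 rotation
// uniform: the array {cos, sin, -sin, cos} is column-major, and GLSL's
// `v * rotation` is a row-vector times matrix product.
class RotationDemo {

    // Returns v * R, where R has columns (cos, sin) and (-sin, cos).
    static float[] rowTimesRotation(float vx, float vy, float rad) {
        float c = (float) Math.cos(rad);
        float s = (float) Math.sin(rad);
        return new float[] {
            vx * c + vy * s,   // dot with first column
            vx * -s + vy * c   // dot with second column
        };
    }
}
```

For `rad = PI / 2`, the row vector (1, 0) maps to approximately (0, -1), which is why the texture coordinates rotate opposite to the usual column-vector convention.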
/library/src/main/java/org/wysaid/texUtils/TextureRendererBlur.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | import android.opengl.GLES11Ext;
4 | import android.opengl.GLES20;
5 | import android.util.Log;
6 |
7 | import org.wysaid.common.FrameBufferObject;
8 | import org.wysaid.common.ProgramObject;
9 |
10 | /**
11 | * Created by wangyang on 15/7/23.
12 | */
13 | public class TextureRendererBlur extends TextureRendererDrawOrigin {
14 |
15 | private static final String vshBlur = vshDrawDefault;
16 |
17 | private static final String vshBlurCache = "" +
18 | "attribute vec2 vPosition;\n" +
19 | "varying vec2 texCoord;\n" +
20 | "void main()\n" +
21 | "{\n" +
22 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n" +
23 | " texCoord = vPosition / 2.0 + 0.5;\n" +
24 | "}";
25 |
26 | private static final String fshBlur = "" +
27 | "precision highp float;\n" +
28 | "varying vec2 texCoord;\n" +
29 | "uniform %s inputImageTexture;\n" +
30 | "uniform vec2 samplerSteps;\n" +
31 |
32 | "const int samplerRadius = 5;\n" +
33 | "const float samplerRadiusFloat = 5.0;\n" +
34 |
35 | "float random(vec2 seed)\n" +
36 | "{\n" +
37 | " return fract(sin(dot(seed ,vec2(12.9898,78.233))) * 43758.5453);\n" +
38 | "}\n" +
39 |
40 | "void main()\n" +
41 | "{\n" +
42 | " vec3 resultColor = vec3(0.0);\n" +
43 | " float blurPixels = 0.0;\n" +
44 | " float offset = random(texCoord) - 0.5;\n" +
45 | " \n" +
46 | " for(int i = -samplerRadius; i <= samplerRadius; ++i)\n" +
47 | " {\n" +
48 | " float percent = (float(i) + offset) / samplerRadiusFloat;\n" +
49 | " float weight = 1.0 - abs(percent);\n" +
50 | " vec2 coord = texCoord + samplerSteps * percent;\n" +
51 | " resultColor += texture2D(inputImageTexture, coord).rgb * weight;\n" +
52 | " blurPixels += weight;\n" +
53 | " }\n" +
54 |
55 | " gl_FragColor = vec4(resultColor / blurPixels, 1.0);\n" +
56 | // " gl_FragColor.r = texture2D(inputImageTexture, texCoord).r;\n" +
57 | "}";
58 |
59 | protected int mTexCache = 0;
60 |
61 | protected FrameBufferObject mFBO;
62 |
63 | protected int mCacheTexWidth, mCacheTexHeight;
64 |
65 | private static final String SAMPLER_STEPS = "samplerSteps";
66 |
67 | private int mStepsLoc = 0;
68 | private int mStepsLocCache = 0;
69 | private float mSamplerScale = 1.0f;
70 |
71 | private ProgramObject mProgramDrawCache;
72 |
73 | public static TextureRendererBlur create(boolean isExternalOES) {
74 | TextureRendererBlur renderer = new TextureRendererBlur();
75 | if(!renderer.init(isExternalOES)) {
76 | renderer.release();
77 | return null;
78 | }
79 | return renderer;
80 | }
81 |
82 | public void setSamplerRadius(float radius) {
83 | mSamplerScale = radius / 4.0f;
84 | }
85 |
86 |
87 | //TODO: optimize the non-external-OES path; the cache pass is identical to the origin draw
88 | @Override
89 | public boolean init(boolean isExternalOES) {
90 | TEXTURE_2D_BINDABLE = isExternalOES ? GLES11Ext.GL_TEXTURE_EXTERNAL_OES : GLES20.GL_TEXTURE_2D;
91 | final String fshBlurExtOES = (isExternalOES ? REQUIRE_STRING_EXTERNAL_OES : "") + String.format(fshBlur, isExternalOES ? SAMPLER2D_VAR_EXTERNAL_OES : SAMPLER2D_VAR);
92 | final String fshBlurTex2D = String.format(fshBlur, SAMPLER2D_VAR);
93 | mFBO = new FrameBufferObject();
94 |
95 | mProgramDrawCache = new ProgramObject();
96 | mProgramDrawCache.bindAttribLocation(POSITION_NAME, 0);
97 |
98 | if(!mProgramDrawCache.init(vshBlurCache, fshBlurExtOES)) {
99 | Log.e(LOG_TAG, "blur filter program init failed - 1...");
100 | return false;
101 | }
102 |
103 | mProgramDrawCache.bind();
104 | mStepsLocCache = mProgramDrawCache.getUniformLoc(SAMPLER_STEPS);
105 |
106 | mProgram = new ProgramObject();
107 | mProgram.bindAttribLocation(POSITION_NAME, 0);
108 |
109 | if(!mProgram.init(vshBlur, fshBlurTex2D)) {
110 | Log.e(LOG_TAG, "blur filter program init failed - 2...");
111 | return false;
112 | }
113 |
114 | mProgram.bind();
115 | mStepsLoc = mProgram.getUniformLoc(SAMPLER_STEPS);
116 | setRotation(0.0f);
117 |
118 | return true;
119 | }
120 |
121 | @Override
122 | public void release() {
123 | if(mProgramDrawCache != mProgram)
124 | mProgramDrawCache.release();
125 | super.release();
126 | GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
127 | mFBO.release();
128 | mFBO = null;
129 | GLES20.glDeleteTextures(1, new int[]{mTexCache}, 0);
130 | mTexCache = 0;
131 | mProgramDrawCache = null;
132 | }
133 |
134 | @Override
135 | public void renderTexture(int texID, Viewport viewport) {
136 |
137 | if(mTexCache == 0 || mCacheTexWidth != mTextureWidth || mCacheTexHeight != mTextureHeight) {
138 | resetCacheTexture();
139 | }
140 |
141 | mFBO.bind();
142 |
143 | GLES20.glViewport(0, 0, mCacheTexWidth, mCacheTexHeight);
144 |
145 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertexBuffer);
146 | GLES20.glEnableVertexAttribArray(0);
147 | GLES20.glVertexAttribPointer(0, 2, GLES20.GL_FLOAT, false, 0, 0);
148 |
149 | GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
150 | GLES20.glBindTexture(TEXTURE_2D_BINDABLE, texID);
151 |
152 | mProgramDrawCache.bind();
153 | GLES20.glUniform2f(mStepsLocCache, (1.0f / mTextureWidth) * mSamplerScale, 0.0f);
154 |
155 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
156 |
157 | if(viewport != null)
158 | GLES20.glViewport(viewport.x, viewport.y, viewport.width, viewport.height);
159 |
160 | mProgram.bind();
161 | GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
162 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTexCache);
163 | GLES20.glUniform2f(mStepsLoc, 0.0f, (1.0f / mCacheTexHeight) * mSamplerScale); //vertical pass: the y step uses the cache texture height
164 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
165 | }
166 |
167 | @Override
168 | public void setTextureSize(int w, int h) {
169 | super.setTextureSize(w, h);
170 | }
171 |
172 | @Override
173 | public String getVertexShaderString() {
174 | return vshBlur;
175 | }
176 |
177 | @Override
178 | public String getFragmentShaderString() {
179 | return fshBlur;
180 | }
181 |
182 |
183 | protected void resetCacheTexture() {
184 | Log.i(LOG_TAG, "resetCacheTexture...");
185 | mCacheTexWidth = mTextureWidth;
186 | mCacheTexHeight = mTextureHeight;
187 | if(mTexCache == 0)
188 | {
189 | int[] texCache = new int[1];
190 | GLES20.glGenTextures(1, texCache, 0);
191 | mTexCache = texCache[0];
192 | }
193 |
194 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTexCache);
195 |
196 | GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, mCacheTexWidth, mCacheTexHeight, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
197 |
198 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
199 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
200 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
201 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
202 |
203 | mFBO.bindTexture(mTexCache);
204 | }
205 |
206 | }
207 |
--------------------------------------------------------------------------------
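`fshBlur` weights each tap by `1 - |i / radius|` (a triangle kernel) and divides by the summed weights, so a constant input passes through unchanged. A CPU-side 1D sketch of the same weighting, omitting the shader's per-pixel random offset (`TriangleBlurDemo` is hypothetical):

```java
// 1D triangle-weighted blur matching the weighting scheme in fshBlur:
// weight(i) = 1 - |i / radius|, normalized by the sum of weights.
class TriangleBlurDemo {

    static float blur1D(float[] signal, int center, int radius) {
        float sum = 0f;
        float weights = 0f;
        for (int i = -radius; i <= radius; ++i) {
            float percent = (float) i / radius;
            float w = 1.0f - Math.abs(percent);
            // clamp indices, analogous to GL_CLAMP_TO_EDGE sampling
            int idx = Math.min(signal.length - 1, Math.max(0, center + i));
            sum += signal[idx] * w;
            weights += w;
        }
        return sum / weights;
    }
}
```

Because the result is divided by the accumulated weights, blurring a flat region returns the same value, which is the property that keeps the blur brightness-preserving.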
/library/src/main/java/org/wysaid/texUtils/TextureRendererDrawOrigin.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | import android.opengl.GLES20;
4 |
5 | /**
6 | * Created by wangyang on 15/7/23.
7 | */
8 | public class TextureRendererDrawOrigin extends TextureRenderer {
9 |
10 | private static final String fshDrawOrigin = "" +
11 | "precision mediump float;\n" +
12 | "varying vec2 texCoord;\n" +
13 | "uniform %s inputImageTexture;\n" +
14 | "void main()\n" +
15 | "{\n" +
16 | " gl_FragColor = texture2D(inputImageTexture, texCoord);\n" +
17 | "}";
18 |
19 | //Initializes the default vertex buffer, etc.
20 | TextureRendererDrawOrigin() {
21 | defaultInitialize();
22 | }
23 |
24 | TextureRendererDrawOrigin(boolean noDefaultInitialize) {
25 | if(!noDefaultInitialize)
26 | defaultInitialize();
27 | }
28 |
29 | public static TextureRendererDrawOrigin create(boolean isExternalOES) {
30 | TextureRendererDrawOrigin renderer = new TextureRendererDrawOrigin();
31 | if(!renderer.init(isExternalOES)) {
32 | renderer.release();
33 | return null;
34 | }
35 | return renderer;
36 | }
37 |
38 | @Override
39 | public boolean init(boolean isExternalOES) {
40 | return setProgramDefualt(getVertexShaderString(), getFragmentShaderString(), isExternalOES);
41 | }
42 |
43 | @Override
44 | public void release() {
45 | GLES20.glDeleteBuffers(1, new int[]{mVertexBuffer}, 0);
46 | mVertexBuffer = 0;
47 | mProgram.release();
48 | mProgram = null;
49 | }
50 |
51 | @Override
52 | public void renderTexture(int texID, Viewport viewport) {
53 |
54 | if(viewport != null) {
55 | GLES20.glViewport(viewport.x, viewport.y, viewport.width, viewport.height);
56 | }
57 |
58 | GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
59 | GLES20.glBindTexture(TEXTURE_2D_BINDABLE, texID);
60 |
61 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertexBuffer);
62 | GLES20.glEnableVertexAttribArray(0);
63 | GLES20.glVertexAttribPointer(0, 2, GLES20.GL_FLOAT, false, 0, 0);
64 |
65 | mProgram.bind();
66 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
67 | }
68 |
69 | @Override
70 | public void setTextureSize(int w, int h) {
71 | mTextureWidth = w;
72 | mTextureHeight = h;
73 | }
74 |
75 | @Override
76 | public String getVertexShaderString() {
77 | return vshDrawDefault;
78 | }
79 |
80 | @Override
81 | public String getFragmentShaderString() {
82 | return fshDrawOrigin;
83 | }
84 | }
85 |
--------------------------------------------------------------------------------
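With rotation, flip/scale, and transform at their identity defaults, the default vertex shader maps clip-space positions to texture coordinates via `texCoord = vPosition / 2 + 0.5`, so the four triangle-fan vertices cover the full texture. A sketch of that mapping (`TexCoordDemo` is hypothetical):

```java
// Clip-space [-1, 1] to texture-space [0, 1] mapping used by the default
// vertex shader path (identity rotation/flip/transform assumed).
class TexCoordDemo {

    static float[] toTexCoord(float x, float y) {
        return new float[] { x / 2f + 0.5f, y / 2f + 0.5f };
    }
}
```

The fan vertices (-1,-1), (1,-1), (1,1), (-1,1) map to texture corners (0,0), (1,0), (1,1), (0,1), so a single GL_TRIANGLE_FAN draw samples the whole input texture.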
/library/src/main/java/org/wysaid/texUtils/TextureRendererEdge.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | /**
4 | * Created by wangyang on 15/7/23.
5 | */
6 | public class TextureRendererEdge extends TextureRendererEmboss {
7 |
8 | private static final String vshEdge = "" +
9 | "attribute vec2 vPosition;\n" +
10 | "varying vec2 texCoord;\n" +
11 | "varying vec2 coords[8];\n" +
12 |
13 | "uniform mat4 transform;\n" +
14 | "uniform mat2 rotation;\n" +
15 | "uniform vec2 flipScale;\n" +
16 | "uniform vec2 samplerSteps;\n" +
17 |
18 | "const float stride = 2.0;\n" +
19 |
20 | "void main()\n" +
21 | "{\n" +
22 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n" +
23 | " vec2 coord = flipScale * (vPosition / 2.0 * rotation) + 0.5;\n" +
24 | " texCoord = (transform * vec4(coord, 0.0, 1.0)).xy;\n" +
25 |
26 | " coords[0] = texCoord - samplerSteps * stride;\n" +
27 | " coords[1] = texCoord + vec2(0.0, -samplerSteps.y) * stride;\n" +
28 | " coords[2] = texCoord + vec2(samplerSteps.x, -samplerSteps.y) * stride;\n" +
29 |
30 | " coords[3] = texCoord - vec2(samplerSteps.x, 0.0) * stride;\n" +
31 | " coords[4] = texCoord + vec2(samplerSteps.x, 0.0) * stride;\n" +
32 |
33 | " coords[5] = texCoord + vec2(-samplerSteps.x, samplerSteps.y) * stride;\n" +
34 | " coords[6] = texCoord + vec2(0.0, samplerSteps.y) * stride;\n" +
35 | " coords[7] = texCoord + vec2(samplerSteps.x, samplerSteps.y) * stride;\n" +
36 |
37 | "}";
38 |
39 | private static final String fshEdge = "" +
40 | "precision mediump float;\n" +
41 | "varying vec2 texCoord;\n" +
42 | "uniform %s inputImageTexture;\n" +
43 | "varying vec2 coords[8];\n" +
44 |
45 | "void main()\n" +
46 | "{\n" +
47 | " vec3 colors[8];\n" +
48 |
49 | " for(int i = 0; i < 8; ++i)\n" +
50 | " {\n" +
51 | " colors[i] = texture2D(inputImageTexture, coords[i]).rgb;\n" +
52 | " }\n" +
53 |
54 | " vec4 src = texture2D(inputImageTexture, texCoord);\n" +
55 |
56 | " vec3 h = -colors[0] - 2.0 * colors[1] - colors[2] + colors[5] + 2.0 * colors[6] + colors[7];\n" +
57 | " vec3 v = -colors[0] + colors[2] - 2.0 * colors[3] + 2.0 * colors[4] - colors[5] + colors[7];\n" +
58 |
59 | " gl_FragColor = vec4(sqrt(h * h + v * v), 1.0);\n" +
60 | "}";
61 |
62 |
63 | public static TextureRendererEdge create(boolean isExternalOES) {
64 | TextureRendererEdge renderer = new TextureRendererEdge();
65 | if(!renderer.init(isExternalOES)) {
66 | renderer.release();
67 | return null;
68 | }
69 | return renderer;
70 | }
71 |
72 | @Override
73 | public String getFragmentShaderString() {
74 | return fshEdge;
75 | }
76 |
77 | @Override
78 | public String getVertexShaderString() {
79 | return vshEdge;
80 | }
81 |
82 | }
83 |
--------------------------------------------------------------------------------
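`fshEdge` is a Sobel operator: the eight neighbor samples are combined into horizontal and vertical gradients, and the output is their magnitude. A grayscale CPU-side sketch of the same arithmetic (`SobelDemo` is hypothetical; `n` is the 3x3 neighborhood in row-major order, with `n[1][1]` the center):

```java
// Sobel gradient magnitude for one grayscale pixel, matching the
// coefficient layout in fshEdge (coords[0..7] walk the 3x3 neighborhood
// top-left to bottom-right, skipping the center).
class SobelDemo {

    static double sobelMagnitude(double[][] n) {
        // gradient across rows (fshEdge's "h")
        double h = -n[0][0] - 2 * n[0][1] - n[0][2]
                 +  n[2][0] + 2 * n[2][1] + n[2][2];
        // gradient across columns (fshEdge's "v")
        double v = -n[0][0] + n[0][2]
                 - 2 * n[1][0] + 2 * n[1][2]
                 -  n[2][0] + n[2][2];
        return Math.sqrt(h * h + v * v);
    }
}
```

A flat neighborhood yields 0, while a vertical step edge produces the maximal column-gradient response, which is what makes edges bright in the filtered image.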
/library/src/main/java/org/wysaid/texUtils/TextureRendererEmboss.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | /**
4 | * Created by wangyang on 15/7/23.
5 | */
6 | public class TextureRendererEmboss extends TextureRendererDrawOrigin {
7 | private static final String fshEmboss = "" +
8 | "precision mediump float;\n" +
9 | "uniform %s inputImageTexture;\n" +
10 | "varying vec2 texCoord;\n" +
11 | "uniform vec2 samplerSteps;\n" +
12 | "const float stride = 2.0;\n" +
13 | "const vec2 norm = vec2(0.72, 0.72);\n" +
14 | "void main() {\n" +
15 | " vec4 src = texture2D(inputImageTexture, texCoord);\n" +
16 | " vec3 tmp = texture2D(inputImageTexture, texCoord + samplerSteps * stride * norm).rgb - src.rgb + 0.5;\n" +
17 | " float f = (tmp.r + tmp.g + tmp.b) / 3.0;\n" +
18 | " gl_FragColor = vec4(f, f, f, src.a);\n" +
19 | "}";
20 |
21 | protected static final String SAMPLER_STEPS = "samplerSteps";
22 |
23 | public static TextureRendererEmboss create(boolean isExternalOES) {
24 | TextureRendererEmboss renderer = new TextureRendererEmboss();
25 | if(!renderer.init(isExternalOES)) {
26 | renderer.release();
27 | return null;
28 | }
29 | return renderer;
30 | }
31 |
32 | @Override
33 | public boolean init(boolean isExternalOES) {
34 | if(setProgramDefualt(getVertexShaderString(), getFragmentShaderString(), isExternalOES)) {
35 | mProgram.bind();
36 | mProgram.sendUniformf(SAMPLER_STEPS, 1.0f / 640.0f, 1.0f / 640.0f);
37 | return true;
38 | }
39 | return false;
40 | }
41 |
42 | @Override
43 | public void setTextureSize(int w, int h) {
44 | super.setTextureSize(w, h);
45 | mProgram.bind();
46 | mProgram.sendUniformf(SAMPLER_STEPS, 1.0f / w, 1.0f / h);
47 | }
48 |
49 |
50 | @Override
51 | public String getFragmentShaderString() {
52 | return fshEmboss;
53 | }
54 | }
55 |
--------------------------------------------------------------------------------
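`fshEmboss` takes the difference between an offset sample and the center sample, shifts it by 0.5, and averages the channels, so flat regions come out mid-gray. A CPU-side sketch for one RGB pixel (`EmbossDemo` is hypothetical):

```java
// Emboss formula from fshEmboss for one pixel: per-channel difference
// between the offset sample and the center, shifted by 0.5, averaged.
class EmbossDemo {

    static float emboss(float[] src, float[] neighbor) {
        float f = 0f;
        for (int i = 0; i < 3; ++i)
            f += neighbor[i] - src[i] + 0.5f;
        return f / 3.0f;
    }
}
```

When `neighbor` equals `src` the result is exactly 0.5, which is why embossed images are dominated by mid-gray with bright and dark ridges only where the image changes along the offset direction.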
/library/src/main/java/org/wysaid/texUtils/TextureRendererLerpBlur.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | import android.opengl.GLES20;
4 | import android.util.Log;
5 |
6 | import org.wysaid.common.FrameBufferObject;
7 | import org.wysaid.common.ProgramObject;
8 |
9 | /**
10 | * Created by wangyang on 15/7/24.
11 | */
12 | public class TextureRendererLerpBlur extends TextureRendererDrawOrigin {
13 |
14 | private static final String vshUpScale = "" +
15 | "attribute vec2 vPosition;\n" +
16 | "varying vec2 texCoord;\n" +
17 | "void main()\n" +
18 | "{\n" +
19 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n" +
20 | " texCoord = vPosition / 2.0 + 0.5;\n" +
21 | "}";
22 |
23 | private static final String fshUpScale = "" +
24 | "precision mediump float;\n" +
25 | "varying vec2 texCoord;\n" +
26 | "uniform sampler2D inputImageTexture;\n" +
27 |
28 | "void main()\n" +
29 | "{\n" +
30 | " gl_FragColor = texture2D(inputImageTexture, texCoord);\n" +
31 | "}";
32 |
33 | private static final String vshBlurUpScale = "" +
34 | "attribute vec2 vPosition;\n" +
35 | "varying vec2 texCoords[5];\n" +
36 | "uniform vec2 samplerSteps;\n" +
37 | "\n" +
38 | "void main()\n" +
39 | "{\n" +
40 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n" +
41 | " vec2 texCoord = vPosition / 2.0 + 0.5;\n" +
42 | " texCoords[0] = texCoord - 2.0 * samplerSteps;\n" +
43 | " texCoords[1] = texCoord - 1.0 * samplerSteps;\n" +
44 | " texCoords[2] = texCoord;\n" +
45 | " texCoords[3] = texCoord + 1.0 * samplerSteps;\n" +
46 | " texCoords[4] = texCoord + 2.0 * samplerSteps;\n" +
47 | "}";
48 |
49 | private static final String fshBlurUpScale = "" +
50 | "precision mediump float;\n" +
51 | "varying vec2 texCoords[5];\n" +
52 | "uniform sampler2D inputImageTexture;\n" +
53 | "\n" +
54 | "void main()\n" +
55 | "{\n" +
56 | " vec3 color = texture2D(inputImageTexture, texCoords[0]).rgb * 0.1;\n" +
57 | " color += texture2D(inputImageTexture, texCoords[1]).rgb * 0.2;\n" +
58 | " color += texture2D(inputImageTexture, texCoords[2]).rgb * 0.4;\n" +
59 | " color += texture2D(inputImageTexture, texCoords[3]).rgb * 0.2;\n" +
60 | " color += texture2D(inputImageTexture, texCoords[4]).rgb * 0.1;\n" +
61 | "\n" +
62 | " gl_FragColor = vec4(color, 1.0);\n" +
63 | "}";
64 |
65 | private static final String vshBlurCache = "" +
66 | "attribute vec2 vPosition;\n" +
67 | "varying vec2 texCoord;\n" +
68 | "void main()\n" +
69 | "{\n" +
70 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n" +
71 | " texCoord = vPosition / 2.0 + 0.5;\n" +
72 | "}";
73 |
74 | private static final String fshBlur = "" +
75 | "precision highp float;\n" +
76 | "varying vec2 texCoord;\n" +
77 | "uniform sampler2D inputImageTexture;\n" +
78 | "uniform vec2 samplerSteps;\n" +
79 |
80 | "const int samplerRadius = 5;\n" +
81 | "const float samplerRadiusFloat = 5.0;\n" +
82 |
83 | "float random(vec2 seed)\n" +
84 | "{\n" +
85 | " return fract(sin(dot(seed ,vec2(12.9898,78.233))) * 43758.5453);\n" +
86 | "}\n" +
87 |
88 | "void main()\n" +
89 | "{\n" +
90 | " vec3 resultColor = vec3(0.0);\n" +
91 | " float blurPixels = 0.0;\n" +
92 | " float offset = random(texCoord) - 0.5;\n" +
93 | " \n" +
94 | " for(int i = -samplerRadius; i <= samplerRadius; ++i)\n" +
95 | " {\n" +
96 | " float percent = (float(i) + offset) / samplerRadiusFloat;\n" +
97 | " float weight = 1.0 - abs(percent);\n" +
98 | " vec2 coord = texCoord + samplerSteps * percent;\n" +
99 | " resultColor += texture2D(inputImageTexture, coord).rgb * weight;\n" +
100 | " blurPixels += weight;\n" +
101 | " }\n" +
102 |
103 | " gl_FragColor = vec4(resultColor / blurPixels, 1.0);\n" +
104 | "}";
105 |
106 | private static final String SAMPLER_STEPS = "samplerSteps";
107 |
108 | private ProgramObject mScaleProgram;
109 | private int[] mTextureDownScale;
110 |
111 | private FrameBufferObject mFramebuffer;
112 | private Viewport mTexViewport;
113 | private int mSamplerStepLoc = 0;
114 |
115 | private int mIntensity = 0;
116 |
117 | private float mSampleScaling = 1.0f;
118 |
119 | private final int mLevel = 16;
120 | private final float mBase = 2.0f;
121 |
122 | public static TextureRendererLerpBlur create(boolean isExternalOES) {
123 | TextureRendererLerpBlur renderer = new TextureRendererLerpBlur();
124 | if(!renderer.init(isExternalOES)) {
125 | renderer.release();
126 | return null;
127 | }
128 | return renderer;
129 | }
130 |
131 | //intensity >= 0
132 | public void setIntensity(int intensity) {
133 |
134 | if(intensity == mIntensity)
135 | return;
136 |
137 | mIntensity = intensity;
138 | if(mIntensity > mLevel)
139 | mIntensity = mLevel;
140 | }
141 |
142 | @Override
143 | public boolean init(boolean isExternalOES) {
144 | return super.init(isExternalOES) && initLocal();
145 | }
146 |
147 | @Override
148 | public void renderTexture(int texID, Viewport viewport) {
149 |
150 | if(mIntensity == 0) {
151 | GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
152 | super.renderTexture(texID, viewport);
153 | return;
154 | }
155 |
156 | // if(mShouldUpdateTexture) {
157 | // updateTexture();
158 | // }
159 |
160 | GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
161 |
162 | mFramebuffer.bindTexture(mTextureDownScale[0]);
163 | //down scale
164 |
165 | mTexViewport.width = calcMips(512, 1);
166 | mTexViewport.height = calcMips(512, 1);
167 | super.renderTexture(texID, mTexViewport);
168 |
169 | mScaleProgram.bind();
170 | for(int i = 1; i < mIntensity; ++i) {
171 | mFramebuffer.bindTexture(mTextureDownScale[i]);
172 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDownScale[i - 1]);
173 | GLES20.glViewport(0, 0, calcMips(512, i + 1), calcMips(512, i + 1));
174 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
175 | }
176 |
177 | for(int i = mIntensity - 1; i > 0; --i) {
178 | mFramebuffer.bindTexture(mTextureDownScale[i - 1]);
179 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDownScale[i]);
180 | GLES20.glViewport(0, 0, calcMips(512, i), calcMips(512, i));
181 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
182 | }
183 |
184 | GLES20.glViewport(viewport.x, viewport.y, viewport.width, viewport.height);
185 |
186 | GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
187 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDownScale[0]);
188 | // GLES20.glUniform2f(mSamplerStepLoc, 0.0f, (0.5f / mTexViewport.width) * mSampleScaling);
189 |
190 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
191 | }
192 |
193 | @Override
194 | public void release() {
195 | mScaleProgram.release();
196 | mFramebuffer.release();
197 | GLES20.glDeleteTextures(mTextureDownScale.length, mTextureDownScale, 0);
198 | mScaleProgram = null;
199 | mFramebuffer = null;
200 | }
201 |
202 | private boolean initLocal() {
203 |
204 | genMipmaps(mLevel, 512, 512);
205 | mFramebuffer = new FrameBufferObject();
206 |
207 | mScaleProgram = new ProgramObject();
208 | mScaleProgram.bindAttribLocation(POSITION_NAME, 0);
209 |
210 | // if(!mScaleProgram.init(vshBlurUpScale, fshBlurUpScale)) {
211 | if(!mScaleProgram.init(vshUpScale, fshUpScale)) {
212 | Log.e(LOG_TAG, "Lerp blur initLocal failed...");
213 | return false;
214 | }
215 |
216 | // mScaleProgram.bind();
217 | // mSamplerStepLoc = mScaleProgram.getUniformLoc(SAMPLER_STEPS);
218 |
219 | return true;
220 | }
221 |
222 | private void updateTexture() {
223 | // if(mIntensity == 0)
224 | // return;
225 | //
226 | // int useIntensity = mIntensity;
227 | //
228 | // if(useIntensity > 6) {
229 | // mSampleScaling = useIntensity / 6.0f;
230 | // useIntensity = 6;
231 | // }
232 | //
233 | // int scalingWidth = mTextureHeight / useIntensity;
234 | // int scalingHeight = mTextureWidth / useIntensity;
235 | //
236 | // if(scalingWidth == 0)
237 | // scalingWidth = 1;
238 | // if(scalingHeight == 0)
239 | // scalingHeight = 1;
240 | //
241 | // mTexViewport = new Viewport(0, 0, scalingWidth, scalingHeight);
242 | //
243 | // GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDownScale[0]);
244 | // GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, scalingWidth, scalingHeight, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
245 | //
246 | // GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDownScale[1]);
247 | // GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, scalingWidth, scalingHeight, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
248 | //
249 | // mShouldUpdateTexture = false;
250 | //
251 | // Log.i(LOG_TAG, "Lerp blur - updateTexture");
252 | //
253 | // Common.checkGLError("Lerp blur - updateTexture");
254 | }
255 |
256 |
257 |
258 | @Override
259 | public void setTextureSize(int w, int h) {
260 | super.setTextureSize(w, h);
261 | }
262 |
263 | private void genMipmaps(int level, int width, int height) {
264 | mTextureDownScale = new int[level];
265 | GLES20.glGenTextures(level, mTextureDownScale, 0);
266 |
267 | for(int i = 0; i < level; ++i) {
268 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDownScale[i]);
269 | GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, calcMips(width, i + 1), calcMips(height, i + 1), 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
270 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
271 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
272 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
273 | GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
274 | }
275 |
276 | mTexViewport = new Viewport(0, 0, 512, 512);
277 | }
278 |
279 | private int calcMips(int len, int level) {
280 | // return (int)(len / Math.pow(mBase, (level + 1)));
281 | return len / (level + 1);
282 | }
283 |
284 | }
285 |
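The blur quality above hinges on the sizes of the texture chain: `calcMips` shrinks the 512-pixel base harmonically (512/2, 512/3, 512/4, …) rather than by powers of two, so adjacent levels differ gently and the up-scale pass lerps smoothly. A standalone sketch of that sizing (class name `MipChain` is ours, not part of the library):

```java
public class MipChain {

    // Mirrors TextureRendererLerpBlur.calcMips:
    // level 1 -> len/2, level 2 -> len/3, and so on.
    public static int calcMips(int len, int level) {
        return len / (level + 1);
    }

    // Sizes of the first `count` chain levels for a base length.
    public static int[] chain(int len, int count) {
        int[] sizes = new int[count];
        for (int i = 0; i < count; ++i) {
            sizes[i] = calcMips(len, i + 1);
        }
        return sizes;
    }

    public static void main(String[] args) {
        // For len = 512: 256, 170, 128, 102, 85, 73 - a gentle harmonic decay.
        for (int s : chain(512, 6)) {
            System.out.print(s + " ");
        }
    }
}
```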
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/texUtils/TextureRendererMask.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | import android.opengl.GLES20;
4 |
5 | /**
6 | * Created by wangyang on 15/8/20.
7 | */
8 | public class TextureRendererMask extends TextureRendererDrawOrigin {
9 |
10 | private static final String vshMask = "" +
11 | "attribute vec2 vPosition;\n" +
12 | "varying vec2 texCoord;\n" +
13 | "varying vec2 maskCoord;\n" +
14 |
15 | "uniform mat2 rotation;\n" +
16 | "uniform vec2 flipScale;\n" +
17 |
18 | "uniform mat2 maskRotation;\n" +
19 | "uniform vec2 maskFlipScale;\n" +
20 |
21 | "uniform mat4 transform;\n" +
22 |
23 | "void main()\n" +
24 | "{\n" +
25 | " gl_Position = vec4(vPosition, 0.0, 1.0);\n" +
26 |
27 | " vec2 coord = flipScale * (vPosition / 2.0 * rotation) + 0.5;\n" +
28 | " texCoord = (transform * vec4(coord, 0.0, 1.0)).xy;\n" +
29 |
30 | " maskCoord = maskFlipScale * (vPosition / 2.0 * maskRotation) + 0.5;\n" +
31 | "}";
32 |
33 | private static final String fshMask = "" +
34 | "precision mediump float;\n" +
35 | "varying vec2 texCoord;\n" +
36 | "varying vec2 maskCoord;\n" +
37 | "uniform %s inputImageTexture;\n" +
38 | "uniform sampler2D maskTexture;\n" +
39 | "void main()\n" +
40 | "{\n" +
41 | " gl_FragColor = texture2D(inputImageTexture, texCoord);\n" +
42 | " vec4 maskColor = texture2D(maskTexture, maskCoord);\n" +
43 | // do not premultiply alpha
44 | // " maskColor.rgb *= maskColor.a;\n" +
45 | " gl_FragColor *= maskColor;\n" +
46 | "}";
47 |
48 | protected static final String MASK_ROTATION_NAME = "maskRotation";
49 | protected static final String MASK_FLIPSCALE_NAME = "maskFlipScale";
50 | protected static final String MASK_TEXTURE_NAME = "maskTexture";
51 |
52 | protected int mMaskRotLoc, mMaskFlipscaleLoc;
53 | protected int mMaskTexture;
54 |
55 | public static TextureRendererMask create(boolean isExternalOES) {
56 | TextureRendererMask renderer = new TextureRendererMask();
57 | if(!renderer.init(isExternalOES)) {
58 | renderer.release();
59 | return null;
60 | }
61 | return renderer;
62 | }
63 |
64 | @Override
65 | public boolean init(boolean isExternalOES) {
66 | if(setProgramDefualt(getVertexShaderString(), getFragmentShaderString(), isExternalOES)) {
67 | mProgram.bind();
68 | mMaskRotLoc = mProgram.getUniformLoc(MASK_ROTATION_NAME);
69 | mMaskFlipscaleLoc = mProgram.getUniformLoc(MASK_FLIPSCALE_NAME);
70 | mProgram.sendUniformi(MASK_TEXTURE_NAME, 1);
71 | setMaskRotation(0.0f);
72 | setMaskFlipscale(1.0f, 1.0f);
73 | return true;
74 | }
75 | return false;
76 | }
77 |
78 | public void setMaskRotation(float rad) {
79 | final float cosRad = (float)Math.cos(rad);
80 | final float sinRad = (float)Math.sin(rad);
81 |
82 | float rot[] = new float[] {
83 | cosRad, sinRad,
84 | -sinRad, cosRad
85 | };
86 |
87 | assert mProgram != null : "setRotation must not be called before init!";
88 |
89 | mProgram.bind();
90 | GLES20.glUniformMatrix2fv(mMaskRotLoc, 1, false, rot, 0);
91 | }
92 |
93 | public void setMaskFlipscale(float x, float y) {
94 | mProgram.bind();
95 | GLES20.glUniform2f(mMaskFlipscaleLoc, x, y);
96 | }
97 |
98 | public void setMaskTexture(int texID) {
99 | if(texID == mMaskTexture)
100 | return;
101 |
102 | GLES20.glDeleteTextures(1, new int[]{mMaskTexture}, 0);
103 | mMaskTexture = texID;
104 | }
105 |
106 | @Override
107 | public void renderTexture(int texID, Viewport viewport) {
108 |
109 | if(viewport != null) {
110 | GLES20.glViewport(viewport.x, viewport.y, viewport.width, viewport.height);
111 | }
112 |
113 | GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
114 | GLES20.glBindTexture(TEXTURE_2D_BINDABLE, texID);
115 |
116 | GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
117 | GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mMaskTexture);
118 |
119 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertexBuffer);
120 | GLES20.glEnableVertexAttribArray(0);
121 | GLES20.glVertexAttribPointer(0, 2, GLES20.GL_FLOAT, false, 0, 0);
122 |
123 | mProgram.bind();
124 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
125 | }
126 |
127 | @Override
128 | public String getVertexShaderString() {
129 | return vshMask;
130 | }
131 |
132 | @Override
133 | public String getFragmentShaderString() {
134 | return fshMask;
135 | }
136 |
137 | @Override
138 | public void release() {
139 | super.release();
140 | GLES20.glDeleteTextures(1, new int[]{mMaskTexture}, 0);
141 | }
142 |
143 | }
144 |
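`setMaskRotation` above uploads the array `{cos, sin, -sin, cos}` via `glUniformMatrix2fv`, and the vertex shader applies it as `vPosition * maskRotation` (row vector on the left). A CPU-side sketch of that exact layout and multiply convention, useful for checking the rotation direction (class and method names are ours):

```java
public class MaskRotation {

    // Same layout as the float[] passed to glUniformMatrix2fv:
    // column-major {cos, sin, -sin, cos}.
    public static float[] rotationMatrix(float rad) {
        float c = (float) Math.cos(rad);
        float s = (float) Math.sin(rad);
        return new float[] { c, s, -s, c };
    }

    // GLSL's `v * mat2` convention: result[j] = dot(v, column j).
    public static float[] apply(float[] m, float x, float y) {
        return new float[] { x * m[0] + y * m[1], x * m[2] + y * m[3] };
    }

    public static void main(String[] args) {
        // Rotating (1, 0) by pi/2 under this convention.
        float[] m = rotationMatrix((float) (Math.PI / 2.0));
        float[] p = apply(m, 1.0f, 0.0f);
        System.out.printf("%.3f %.3f%n", p[0], p[1]);
    }
}
```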
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/texUtils/TextureRendererWave.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.texUtils;
2 |
3 | import android.opengl.GLES20;
4 |
5 |
6 | /**
7 | * Created by wangyang on 15/7/18.
8 | */
9 | public class TextureRendererWave extends TextureRendererDrawOrigin {
10 |
11 | private static final String fshWave = "" +
12 | "precision mediump float;\n" +
13 | "varying vec2 texCoord;\n" +
14 | "uniform %s inputImageTexture;\n" +
15 | "uniform float motion;\n" +
16 | "const float angle = 20.0;" +
17 | "void main()\n" +
18 | "{\n" +
19 | " vec2 coord;\n" +
20 | " coord.x = texCoord.x + 0.01 * sin(motion + texCoord.x * angle);\n" +
21 | " coord.y = texCoord.y + 0.01 * sin(motion + texCoord.y * angle);\n" +
22 | " gl_FragColor = texture2D(inputImageTexture, coord);\n" +
23 | "}";
24 |
25 | private int mMotionLoc = 0;
26 |
27 | private boolean mAutoMotion = false;
28 | private float mMotion = 0.0f;
29 | private float mMotionSpeed = 0.0f;
30 |
31 | public TextureRendererWave() {
32 | }
33 |
34 | public static TextureRendererWave create(boolean isExternalOES) {
35 | TextureRendererWave renderer = new TextureRendererWave();
36 | if(!renderer.init(isExternalOES)) {
37 | renderer.release();
38 | return null;
39 | }
40 | return renderer;
41 | }
42 |
43 | @Override
44 | public boolean init(boolean isExternalOES) {
45 | if(setProgramDefualt(vshDrawDefault, fshWave, isExternalOES)) {
46 | mProgram.bind();
47 | mMotionLoc = mProgram.getUniformLoc("motion");
48 | return true;
49 | }
50 | return false;
51 | }
52 |
53 | public void setWaveMotion(float motion) {
54 | mProgram.bind();
55 | GLES20.glUniform1f(mMotionLoc, motion);
56 | }
57 |
58 | public void setAutoMotion(float speed) {
59 | mMotionSpeed = speed;
60 | mAutoMotion = (speed != 0.0f);
61 | }
62 |
63 | @Override
64 | public void renderTexture(int texID, Viewport viewport) {
65 |
66 | if(viewport != null) {
67 | GLES20.glViewport(viewport.x, viewport.y, viewport.width, viewport.height);
68 | }
69 |
70 | GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
71 | GLES20.glBindTexture(TEXTURE_2D_BINDABLE, texID);
72 |
73 | GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertexBuffer);
74 | GLES20.glEnableVertexAttribArray(0);
75 | GLES20.glVertexAttribPointer(0, 2, GLES20.GL_FLOAT, false, 0, 0);
76 |
77 | mProgram.bind();
78 | if(mAutoMotion) {
79 | mMotion += mMotionSpeed;
80 | GLES20.glUniform1f(mMotionLoc, mMotion);
81 | if(mMotion > Math.PI * 20.0f) {
82 | mMotion -= Math.PI * 20.0f;
83 | }
84 | }
85 | GLES20.glDrawArrays(DRAW_FUNCTION, 0, 4);
86 | }
87 | }
88 |
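The wave shader above displaces each texture coordinate by a small sine term, with `motion` advancing every frame when auto-motion is enabled. A CPU-side sketch of the same displacement for one axis (constants copied from `fshWave`; the helper name is ours):

```java
public class WaveOffset {

    static final float ANGLE = 20.0f;     // matches `const float angle` in fshWave
    static final float AMPLITUDE = 0.01f; // matches the 0.01 factor in the shader

    // Displaced sampling coordinate for one axis, as in fshWave:
    // coord + 0.01 * sin(motion + coord * angle)
    public static float displace(float coord, float motion) {
        return coord + AMPLITUDE * (float) Math.sin(motion + coord * ANGLE);
    }

    public static void main(String[] args) {
        // With motion = 0 and coord = 0, sin(0) = 0: no displacement.
        System.out.println(displace(0.0f, 0.0f));
    }
}
```

Note the displacement amplitude is at most 0.01 of the texture size, so the effect ripples the image without tearing it; `mMotion` is wrapped at 20π in `renderTexture` to keep the float from growing unboundedly.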
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/view/CameraRecordGLSurfaceView.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.view;
2 |
3 | /**
4 | * Created by wangyang on 15/7/27.
5 | */
6 |
7 |
8 | import android.content.Context;
9 | import android.media.AudioFormat;
10 | import android.media.AudioRecord;
11 | import android.media.MediaRecorder;
12 | import android.util.AttributeSet;
13 | import android.util.Log;
14 |
15 | import java.nio.ByteBuffer;
16 | import java.nio.ShortBuffer;
17 |
18 | /**
19 | * Created by wangyang on 15/7/17.
20 | */
21 | public class CameraRecordGLSurfaceView extends CameraGLSurfaceView {
22 |
23 | public CameraRecordGLSurfaceView(Context context, AttributeSet attrs) {
24 | super(context, attrs);
25 | }
26 |
27 | // Not provided for now.
28 | }
29 |
--------------------------------------------------------------------------------
/library/src/main/java/org/wysaid/view/ImageGLSurfaceView.java:
--------------------------------------------------------------------------------
1 | package org.wysaid.view;
2 |
3 | import android.content.Context;
4 | import android.graphics.Bitmap;
5 | import android.graphics.PixelFormat;
6 | import android.opengl.GLES20;
7 | import android.opengl.GLSurfaceView;
8 | import android.opengl.GLSurfaceView.Renderer;
9 | import android.util.AttributeSet;
10 | import android.util.Log;
11 |
12 | import org.wysaid.common.Common;
13 | import org.wysaid.nativePort.CGEImageHandler;
14 | import org.wysaid.texUtils.TextureRenderer;
15 |
16 | import javax.microedition.khronos.egl.EGLConfig;
17 | import javax.microedition.khronos.opengles.GL10;
18 |
19 | /**
20 | * Created by wysaid on 15/12/23.
21 | * Mail: admin@wysaid.org
22 | * blog: wysaid.org
23 | */
24 | public class ImageGLSurfaceView extends GLSurfaceView implements Renderer{
25 |
26 | public static final String LOG_TAG = Common.LOG_TAG;
27 |
28 | public enum DisplayMode {
29 | DISPLAY_SCALE_TO_FILL,
30 | DISPLAY_ASPECT_FILL,
31 | DISPLAY_ASPECT_FIT,
32 | }
33 |
34 | protected CGEImageHandler mImageHandler;
35 |
36 | public CGEImageHandler getImageHandler() {
37 | return mImageHandler;
38 | }
39 |
40 | protected TextureRenderer.Viewport mRenderViewport = new TextureRenderer.Viewport();
41 | protected int mImageWidth;
42 | protected int mImageHeight;
43 | protected int mViewWidth;
44 | protected int mViewHeight;
45 |
46 | public int getImageWidth() {
47 | return mImageWidth;
48 | }
49 |
50 | public int getImageheight() {
51 | return mImageHeight;
52 | }
53 |
54 | protected DisplayMode mDisplayMode = DisplayMode.DISPLAY_SCALE_TO_FILL;
55 |
56 | public DisplayMode getDisplayMode() {
57 | return mDisplayMode;
58 | }
59 |
60 | public void setDisplayMode(DisplayMode displayMode) {
61 | mDisplayMode = displayMode;
62 | calcViewport();
63 | requestRender();
64 | }
65 |
66 | public void setFilterWithConfig(final String config) {
67 |
68 | if(mImageHandler == null)
69 | return;
70 |
71 | queueEvent(new Runnable() {
72 | @Override
73 | public void run() {
74 | mImageHandler.setFilterWithConfig(config);
75 | requestRender();
76 | }
77 | });
78 | }
79 |
80 | public void setFilterIntensity(final float intensity) {
81 | if(mImageHandler == null)
82 | return;
83 |
84 | queueEvent(new Runnable() {
85 | @Override
86 | public void run() {
87 | mImageHandler.setFilterIntensity(intensity);
88 | }
89 | });
90 | }
91 |
92 | public void setImageBitmap(final Bitmap bmp) {
93 |
94 | if(bmp == null)
95 | return;
96 |
97 | if(mImageHandler == null) {
98 | Log.e(LOG_TAG, "Handler not initialized!");
99 | return;
100 | }
101 |
102 | mImageWidth = bmp.getWidth();
103 | mImageHeight = bmp.getHeight();
104 |
105 | queueEvent(new Runnable() {
106 | @Override
107 | public void run() {
108 |
109 | if(mImageHandler.initWidthBitmap(bmp)) {
110 |
111 | calcViewport();
112 | requestRender();
113 |
114 | } else {
115 | Log.e(LOG_TAG, "setImageBitmap: failed to initialize the handler!");
116 | }
117 | }
118 | });
119 | }
120 |
121 | public interface QueryResultBitmapCallback {
122 | void get(Bitmap bmp);
123 | }
124 |
125 | public void getResultBitmap(final QueryResultBitmapCallback callback) {
126 |
127 | if(callback == null)
128 | return;
129 |
130 | queueEvent(new Runnable() {
131 | @Override
132 | public void run() {
133 |
134 | Bitmap bmp = mImageHandler.getResultBitmap();
135 | callback.get(bmp);
136 | }
137 | });
138 | }
139 |
140 |
141 | public ImageGLSurfaceView(Context context, AttributeSet attrs) {
142 | super(context, attrs);
143 |
144 | setEGLContextClientVersion(2);
145 | setEGLConfigChooser(8, 8, 8, 8, 8, 0);
146 | getHolder().setFormat(PixelFormat.RGBA_8888);
147 | setRenderer(this);
148 | setRenderMode(RENDERMODE_WHEN_DIRTY);
149 | // setZOrderMediaOverlay(true);
150 |
151 | Log.i(LOG_TAG, "ImageGLSurfaceView Construct...");
152 | }
153 |
154 | public interface OnSurfaceCreatedCallback {
155 | void surfaceCreated();
156 | }
157 |
158 | protected OnSurfaceCreatedCallback mSurfaceCreatedCallback;
159 | public void setSurfaceCreatedCallback(OnSurfaceCreatedCallback callback) {
160 | mSurfaceCreatedCallback = callback;
161 | }
162 |
163 | @Override
164 | public void onSurfaceCreated(GL10 gl, EGLConfig config) {
165 | Log.i(LOG_TAG, "ImageGLSurfaceView onSurfaceCreated...");
166 |
167 | GLES20.glDisable(GLES20.GL_DEPTH_TEST);
168 | GLES20.glDisable(GLES20.GL_STENCIL_TEST);
169 |
170 | mImageHandler = new CGEImageHandler();
171 |
172 | mImageHandler.setDrawerFlipScale(1.0f, -1.0f);
173 |
174 | if(mSurfaceCreatedCallback != null) {
175 | mSurfaceCreatedCallback.surfaceCreated();
176 | }
177 | }
178 |
179 | @Override
180 | public void onSurfaceChanged(GL10 gl, int width, int height) {
181 | GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
182 | mViewWidth = width;
183 | mViewHeight = height;
184 | calcViewport();
185 | }
186 |
187 | @Override
188 | public void onDrawFrame(GL10 gl) {
189 |
190 | GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
191 | GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
192 |
193 | if(mImageHandler == null)
194 | return;
195 |
196 | GLES20.glViewport(mRenderViewport.x, mRenderViewport.y, mRenderViewport.width, mRenderViewport.height);
197 | mImageHandler.drawResult();
198 | }
199 |
200 | public void release() {
201 |
202 | if(mImageHandler != null) {
203 | queueEvent(new Runnable() {
204 | @Override
205 | public void run() {
206 | Log.i(LOG_TAG, "ImageGLSurfaceView release...");
207 |
208 | if(mImageHandler != null) {
209 | mImageHandler.release();
210 | mImageHandler = null;
211 | }
212 | }
213 | });
214 | }
215 | }
216 |
217 | protected void calcViewport() {
218 |
219 | if(mDisplayMode == DisplayMode.DISPLAY_SCALE_TO_FILL) {
220 | mRenderViewport.x = 0;
221 | mRenderViewport.y = 0;
222 | mRenderViewport.width = mViewWidth;
223 | mRenderViewport.height = mViewHeight;
224 | return;
225 | }
226 |
227 | float scaling;
228 |
229 | scaling = mImageWidth / (float)mImageHeight;
230 |
231 | float viewRatio = mViewWidth / (float)mViewHeight;
232 | float s = scaling / viewRatio;
233 |
234 | int w, h;
235 |
236 | switch (mDisplayMode)
237 | {
238 | case DISPLAY_ASPECT_FILL:
239 | {
240 | // Keep the aspect ratio and fill the whole view (content may extend beyond the view)
241 | if(s > 1.0)
242 | {
243 | w = (int)(mViewHeight * scaling);
244 | h = mViewHeight;
245 | }
246 | else
247 | {
248 | w = mViewWidth;
249 | h = (int)(mViewWidth / scaling);
250 | }
251 | }
252 | break;
253 | case DISPLAY_ASPECT_FIT:
254 | {
255 | // Keep the aspect ratio so the content fits inside the view (content no larger than the view)
256 | if(s < 1.0)
257 | {
258 | w = (int)(mViewHeight * scaling);
259 | h = mViewHeight;
260 | }
261 | else
262 | {
263 | w = mViewWidth;
264 | h = (int)(mViewWidth / scaling);
265 | }
266 | }
267 | break;
268 |
269 | default:
270 | Log.e(LOG_TAG, "Error occurred, please check the code...");
271 | return;
272 | }
273 |
274 |
275 |
276 | mRenderViewport.width = w;
277 | mRenderViewport.height = h;
278 | mRenderViewport.x = (mViewWidth - w) / 2;
279 | mRenderViewport.y = (mViewHeight - h) / 2;
280 |
281 | Log.i(LOG_TAG, String.format("View port: %d, %d, %d, %d", mRenderViewport.x, mRenderViewport.y, mRenderViewport.width, mRenderViewport.height));
282 | }
283 | }
284 |
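The `calcViewport` logic above centers an aspect-corrected rectangle inside the view. The same arithmetic for the `DISPLAY_ASPECT_FIT` branch, extracted into a standalone sketch (class and method names ours):

```java
public class ViewportCalc {

    // Mirrors ImageGLSurfaceView.calcViewport for DISPLAY_ASPECT_FIT:
    // the largest centered rect with the image's aspect ratio that
    // still fits inside the view. Returns {x, y, width, height}.
    public static int[] aspectFit(int imageW, int imageH, int viewW, int viewH) {
        float imageRatio = imageW / (float) imageH;
        float viewRatio = viewW / (float) viewH;
        int w, h;
        if (imageRatio / viewRatio < 1.0f) {
            // Image relatively narrower than the view: height-bound.
            h = viewH;
            w = (int) (viewH * imageRatio);
        } else {
            // Image relatively wider than the view: width-bound.
            w = viewW;
            h = (int) (viewW / imageRatio);
        }
        return new int[] { (viewW - w) / 2, (viewH - h) / 2, w, h };
    }

    public static void main(String[] args) {
        // A 2:1 image in a square view is width-bound and vertically centered.
        int[] vp = aspectFit(1000, 500, 400, 400);
        System.out.println(vp[0] + "," + vp[1] + "," + vp[2] + "," + vp[3]);
    }
}
```

`DISPLAY_ASPECT_FILL` is the mirror image: the comparison flips, producing the smallest such rect that covers the view, with the overflow cropped symmetrically.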
--------------------------------------------------------------------------------
/library/src/main/libs/arm64-v8a/libCGE.so:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/library/src/main/libs/arm64-v8a/libCGE.so
--------------------------------------------------------------------------------
/library/src/main/libs/x86/libCGE.so:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/library/src/main/libs/x86/libCGE.so
--------------------------------------------------------------------------------
/library/src/main/libs/x86_64/libCGE.so:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/library/src/main/libs/x86_64/libCGE.so
--------------------------------------------------------------------------------
/library/src/main/res/values/strings.xml:
--------------------------------------------------------------------------------
1 | <resources>
2 |     <string name="app_name">Library</string>
3 | </resources>
--------------------------------------------------------------------------------
/screenshots/0.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/screenshots/0.jpg
--------------------------------------------------------------------------------
/screenshots/1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/screenshots/1.jpg
--------------------------------------------------------------------------------
/screenshots/2.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/screenshots/2.jpg
--------------------------------------------------------------------------------
/screenshots/alipay.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/angcyo/android-gpuimage-plus-master/33bc07edbbb833ed812b446572751f2f4b56a614/screenshots/alipay.jpg
--------------------------------------------------------------------------------
/settings.gradle:
--------------------------------------------------------------------------------
1 | include ':library', ':cgedemo'
2 |
--------------------------------------------------------------------------------