├── .DS_Store
├── 2021_Open_Source_Promotion_Plan-Summer_MegEngine.md
├── 2022_Open_Source_Promotion_Plan-Summer_MegEngine.md
├── Contributor_Program.md
├── LICENSE
├── MegEngine Meetup
├── 31_31 conv docs images
│ ├── 1.png
│ ├── 2.png
│ ├── 3.png
│ ├── 4.png
│ ├── 5.png
│ ├── 6.png
│ ├── 7.png
│ └── 8.png
├── DTR docs image
│ ├── dtr-1.png
│ ├── dtr-10.png
│ ├── dtr-11.png
│ ├── dtr-12.png
│ ├── dtr-13.png
│ ├── dtr-14.png
│ ├── dtr-15.png
│ ├── dtr-16.png
│ ├── dtr-17.png
│ ├── dtr-18.png
│ ├── dtr-19.png
│ ├── dtr-2.png
│ ├── dtr-20.png
│ ├── dtr-3.png
│ ├── dtr-4.png
│ ├── dtr-5.png
│ ├── dtr-6.png
│ ├── dtr-7.png
│ ├── dtr-8.png
│ └── dtr-9.png
├── MegCC 用模型编译的方式实现超轻量端上高性能推理.pdf
├── MegEngine Meetup_Pytorch与MegEngine之间不得不说的事__MegEngine Meetup No.4.pdf
├── 《2020 年代的卷积网络》_2022.3.19.pdf
├── 《JIT in MegEngine》_MegEngine Meetup No.1.pdf
├── 《MegEngine 中的动态图 Sublinear 显存优化》_MegEngine Meetup No.3.pdf
├── 《MegEngine 大 kernel 工程优化实践》_2022.3.19.pdf
├── 《MegEngine 开源一周年架构回顾》_MegEngine Meetup No.2.pdf
├── 《MegEngine“花式整活”纪实》_MegEngine Meetup No.5.pdf
├── 《利用MegEngine的分布式通信算子实现复杂的并行训练》 _MegEngine Meetup No.2..pdf
├── 《手把手带你玩转最新开源”看图神器“ MegSpot》- MegEngine Meetup No.9.pdf
├── 《超大卷积核架构设计与高效实践》_2022.3.19.pdf
├── 中国农业人工智能创新创业大赛分享_MegEngine Meetup No.6.pdf
├── 深度学习框架 MegEngine CUDA int4 推理详解.pdf
└── 聊聊底层优化_MegEngine Meetup No.7.pdf
├── Open_Source_Promotion_Plan-Summer_2020_MegEngine.md
├── README.md
├── competition activities
├── slides
│ ├── README.md
│ ├── 旷视 AI 智慧交通开源大赛-赛题讲解.pdf
│ ├── 旷视 AI 智慧交通开源赛 - MegStudio 介绍.pdf
│ ├── 视频超分辨率分享.pdf
│ └── 金枪鱼之夜:MegEngine 框架设计.pdf
├── 旷视人工智能开源大赛_视频超分辨率.md
└── 金枪鱼之夜:MegEngine 框架设计.md
├── 【2022 翻倍争夺战提交示例】mlp-mixer
├── compare.py
├── data
│ ├── 640px-Cat_close-up_2004_b.jpg
│ ├── 640px-Cat_eyes_2007-1.jpg
│ ├── 640px-Cat_kitten_side_profile.jpg
│ ├── 640px-Red_cat_Felis_silvestris_catus.jpg
│ ├── 640px-Siberian_black_tabby_blotched_cat_01.jpg
│ ├── Black_cat_IMG_1618.jpg
│ ├── Felis_catus-cat_on_snow.jpg
│ ├── Tabby_cat_with_blue_eyes-3336579.jpg
│ ├── Tony_senior_cat_6-6-22.jpg
│ └── lADPDgQ9qoGDocbNAnLNAZ0_413_626.jpg_620x10000q90g.jpg
├── imagenet-labels.json
├── model.py
└── requirements.txt
└── 零基础入门旷视天元MegEngine
├── 1.旷视天元MegEngine介绍和快速上手.md
├── 2.模型构建和训练入门.md
├── 3.模型构建和训练进阶 I:分类问题.md
├── 4.模型构建和训练进阶 II:物体检测.md
├── 5. Android 移动端模型推理部署.md
├── 6.部署进阶:推理端优化.md
├── notebooks
├── 2.模型构建和训练入门.ipynb
├── 3.模型构建和训练进阶 I:分类问题1.ipynb
├── 3.模型构建和训练进阶 I:分类问题2.ipynb
├── 3.模型构建和训练进阶 I:分类问题3.ipynb
├── 3.模型构建和训练进阶 I:分类问题4.ipynb
├── 4.模型构建和训练进阶 II:物体检测(anchor).ipynb
├── 4.模型构建和训练进阶 II:物体检测(nms).ipynb
├── 4.模型构建和训练进阶 II:物体检测(transform_pipeline).ipynb
├── 5. Android 移动端模型推理部署.zip
├── 6.部署进阶:推理端优化.zip
└── README.md
└── slides
├── 1.旷视天元MegEngine介绍和快速上手.pptx
├── 2.模型构建和训练入门课件.pdf
├── 3.模型构建和训练进阶 I:分类问题.pptx
├── 4.模型构建和训练进阶 II:物体检测.pdf
├── 5. Android 移动端模型推理部署.pdf
├── 6.部署进阶:推理端优化.pdf
└── README.md
/2021_Open_Source_Promotion_Plan-Summer_MegEngine.md:
--------------------------------------------------------------------------------
1 |
2 | MegEngine is a fast, scalable, easy-to-use open-source deep learning framework with automatic differentiation. It has three core strengths: unified training and inference, efficient support across all platforms, and training that combines dynamic and static graphs.
3 |
4 | For the Open Source Promotion Plan - Summer 2021, MegEngine offers the following project tasks, each paired with a mentor to help participants complete the project.
5 |
6 | MegEngine website: https://megengine.org.cn/
7 |
8 | MegEngine developer QQ group: 1029741705
9 |
10 |
11 |
12 | ## Project 1: Add the RNN/LSTM operator family
13 |
14 | 1. Title: Add the RNN/LSTM operator family
15 |
16 | 2. Description: MegEngine currently provides little NLP-related API coverage; the classic RNN/LSTM operators are not supported yet, and we would like MegEngine to support them.
17 |
18 | 3. Difficulty: high
19 |
20 | 4. Community mentor: Chen Zhenhuan
21 |
22 | 5. Mentor contact: chenzhenhuan@megvii.com
23 |
24 | 6. Deliverables:
25 |     - Add RNN/LSTM operators at the MegEngine Python API level
26 |     - Following the internal documentation, implement the RNN/LSTMCell operators in C++, improve the upper-level Python API, and reach performance close to PyTorch's
27 |
28 | 7. Technical requirements: Python, PyTorch, C++
29 |
30 | 8. Related open-source repositories: https://github.com/pytorch/pytorch
31 |
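As a reference for the expected semantics, the single-step LSTM cell computation (the same math PyTorch's `LSTMCell` implements, in the common i, f, g, o gate order) can be sketched in plain Python; every name here is illustrative, not MegEngine API:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(w, v):
    # w: list of rows, v: vector -> w @ v
    return [sum(wi * vi for wi, vi in zip(row, v)) for row in w]

def lstm_cell(x, h, c, w_ih, w_hh, bias):
    """One LSTM step. Gate layout follows the common i, f, g, o order.
    w_ih: (4*H, I), w_hh: (4*H, H), bias: (4*H,)."""
    hsz = len(h)
    # pre-activations for all four gates at once
    z = [a + b + t for a, b, t in zip(matvec(w_ih, x), matvec(w_hh, h), bias)]
    i = [sigmoid(v) for v in z[0:hsz]]            # input gate
    f = [sigmoid(v) for v in z[hsz:2 * hsz]]      # forget gate
    g = [math.tanh(v) for v in z[2 * hsz:3 * hsz]]  # candidate cell state
    o = [sigmoid(v) for v in z[3 * hsz:4 * hsz]]  # output gate
    c_new = [fv * cv + iv * gv for fv, cv, iv, gv in zip(f, c, i, g)]
    h_new = [ov * math.tanh(cv) for ov, cv in zip(o, c_new)]
    return h_new, c_new
```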
32 | ## Project 2: Add and optimize the local operator on x86
33 |
34 | 1. Title: Add and optimize the local operator on x86
35 | 2. Description: MegEngine currently supports the local and local share operators in the naive and CUDA backends, but not on x86, so on x86 these operators fall back to the naive implementation and run very slowly.
36 | 3. Difficulty: high
37 | 4. Community mentor: Chen Qiyou
38 | 5. Mentor contact: chenqiyou@megvii.com
39 | 6. Deliverables:
40 |     - Add the local operator to the x86 backend
41 |     - The operator's results match the naive implementation exactly in all cases
42 |     - Optimize the operator so that it approaches peak performance
43 |
44 | 7. Technical requirements: C++, performance optimization
45 | 8. Related open-source repositories: https://github.com/MegEngine/MegEngine/tree/master/dnn/src/naive/local
46 |
47 | ## Project 3: Add and optimize the local share operator on x86
48 |
49 | 1. Title: Add and optimize the local share operator on x86
50 | 2. Description: MegEngine currently supports the local and local share operators in the naive and CUDA backends, but not on x86, so on x86 these operators fall back to the naive implementation and run very slowly.
51 | 3. Difficulty: high
52 | 4. Community mentor: Chen Qiyou
53 | 5. Mentor contact: chenqiyou@megvii.com
54 | 6. Deliverables:
55 |     - Add the local share operator to the x86 backend
56 |     - The operator's results match the naive implementation exactly in all cases
57 |     - Benchmark and optimize the operator so that it approaches peak performance
58 |
59 | 7. Technical requirements: C++, performance optimization
60 |
61 | 8. Related open-source repositories: https://github.com/MegEngine/MegEngine/tree/master/dnn/src/naive/local_share
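For background, the "local" (locally connected) operator differs from ordinary convolution in that every output position owns its own filter weights. A naive single-channel 2D sketch of the semantics (illustrative only, not the MegDNN kernel) is:

```python
def local_conv2d(src, weights, fh, fw):
    """Locally connected 2D 'convolution' (cross-correlation), one channel.
    src: (IH, IW) nested lists; weights[oh][ow] is the (fh, fw) filter for
    output position (oh, ow) -- unlike ordinary conv, filters are NOT shared.
    """
    ih, iw = len(src), len(src[0])
    oh, ow = ih - fh + 1, iw - fw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for y in range(oh):
        for x in range(ow):
            w = weights[y][x]  # this position's private filter
            out[y][x] = sum(
                src[y + dy][x + dx] * w[dy][dx]
                for dy in range(fh) for dx in range(fw)
            )
    return out
```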
62 |
63 | ## Project 4: Run mge models on the web
64 |
65 | 1. Title: Run mge models on the web
66 | 2. Description: get a simple mge model running in the browser
67 | 3. Difficulty: high
68 | 4. Community mentor: Liu Junjie
69 | 5. Mentor contact: liujunjie@megvii.com
70 | 6. Deliverables:
71 |
72 | Run a linear regression model / handwritten-digit model on the web; deliver mge.js and a runnable model.
73 |
74 | Approach:
75 |
76 | - Compile the framework part to mge.js with wasm (it must load dumped mge models, preferably without model conversion)
77 | - Missing kernels can be filled in with wasm, following XNNPACK. (WebGPU is not yet efficient and WebGL is awkward for compute, so wasm with SIMD support is the choice.)
78 |
79 | The target is split into 7 levels:
80 |
81 | - L0: can load models and data (graph/memory optimizations may be omitted; the existing framework may be trimmed via macros, or even a new framework written, but model loading must stay compatible)
82 | - L1: forward-op unit tests pass for dense/matmul (required), with correct results under the NCHW layout
83 | - L2: the linear regression model's forward pass runs
84 | - L3: the linear regression model's backward pass and training run
85 | - L4: the MNIST model's forward pass runs
86 | - L5: MNIST backward pass and training run
87 | - L6: an MNIST demo (see https://storage.googleapis.com/tfjs-examples/mnist/dist/index.html)
88 |
89 | 7. Technical requirements: js/wasm/c++/python
90 |
91 | 8. Related open-source repositories:
92 |
93 | - https://github.com/tensorflow/tfjs
94 |
95 | - https://github.com/google/XNNPACK
96 |
97 | - https://www.tensorflow.org/js/demos?hl=zh-cn
98 |
99 | - https://zhuanlan.zhihu.com/p/356317676
100 |
101 | - https://github.com/alibaba/MNN/blob/master/3rd_party/flatbuffers/docs/source/JavaScriptUsage.md
102 |
103 | - https://github.com/torch-js/torch-js
104 |
105 |
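Levels L2/L3 above amount to the forward pass and gradient-descent training of y = w*x + b. A minimal pure-Python sketch of the computation the wasm runtime has to reproduce (illustrative only):

```python
def train_linreg(xs, ys, lr=0.1, steps=2000):
    """Fit y = w*x + b by full-batch gradient descent on MSE loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # forward pass (L2): predictions
        preds = [w * x + b for x in xs]
        # backward pass (L3): dL/dw and dL/db for L = mean((pred - y)^2)
        dw = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / n
        db = sum(2 * (p - y) for p, y in zip(preds, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b
```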
106 | ## Project 5: A user-friendly web profiling tool
107 |
108 | 1. Title: A user-friendly web profiling tool
109 | 2. Description: submit an mge model on the web; the backend runs it and reports detailed profiling results
110 | 3. Difficulty: high
111 | 4. Community mentor: Liu Junjie
112 | 5. Mentor contact: liujunjie@megvii.com
113 | 6. Deliverables:
114 |
115 | A user-friendly web profiling tool with a client/server structure: the frontend collects models, visualizes, and prints profiling results; the backend actually runs the models, and its coverage may include x86/cuda/arm, etc.
116 |
117 | - L0: the frontend can submit an mge model and print its execution latency. The backend supports x86, accepts the frontend's model, and returns the latency result. The backend may wrap load_and_run (it must be able to fill the model's input data and, after loading the model, print its input and output tensors).
118 |
119 | - L1: the frontend prints profiling results and the backend returns them. The backend may directly return load_and_run's profiling JSON, adding filtering, sorting, and aggregation on top. Report the model's output tensor names.
120 |
121 | - L2: the frontend supports model structure preview (available right after the model is loaded) and visualizes each layer's latency (visible after profiling)
122 |
123 | - L3: extend the backend to cuda/arm, so users without such devices can obtain performance data directly.
124 |
125 | - L4: support switching MegEngine versions and uploading np files as input
126 |
127 | 7. Technical requirements: html/js/c++/python
128 |
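The filter/sort/aggregate step in L1 can be prototyped directly over the profiling records. The field names below (`type`, `time_ms`) are hypothetical placeholders for whatever load_and_run's profile JSON actually contains:

```python
from collections import defaultdict

def aggregate_profile(records, min_ms=0.0):
    """Group per-op timing records by operator type, filter out ops below
    min_ms, and sort aggregates by total time (descending)."""
    totals = defaultdict(lambda: {"count": 0, "total_ms": 0.0})
    for rec in records:
        if rec["time_ms"] < min_ms:
            continue  # filtering
        agg = totals[rec["type"]]  # aggregation key: operator type
        agg["count"] += 1
        agg["total_ms"] += rec["time_ms"]
    # sorting: most expensive operator types first
    return sorted(totals.items(), key=lambda kv: kv[1]["total_ms"], reverse=True)
```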
129 | ## Project 6: Add Apple NPU support to the dnn backend
130 |
131 | 1. Title: Add Apple NPU support to the dnn backend
132 | 2. Description: make the Apple NPU usable for inference on macOS
133 | 3. Difficulty: high
134 | 4. Community mentor: Zhang Haolong
135 | 5. Mentor contact: zhanghaolong@megvii.com
136 | 6. Deliverables:
137 |
138 | Make the Apple NPU usable for inference on macOS, based on the CoreML API
139 |
140 | - L0: convert a MegEngine model to a CoreML-compatible format and run it through the CoreML API
141 |
142 | - L1: implement a CoreML runtime opr in MegEngine, following MegEngine/src/tensorrt/impl
143 |
144 | - L2: finally, run MegEngine inference on CoreML-capable platforms with CoreML acceleration
145 |
146 | 7. Technical requirements: c++, macos
147 |
148 | 8. Related open-source repositories:
149 |
150 | - https://developer.apple.com/documentation/coreml/core_ml_api
151 |
152 | - https://github.com/MegEngine/MegEngine
153 |
154 | ## Project 7: python whl support for apple ARM macos
155 |
156 | 1. Title: python whl support for apple ARM macos
157 | 2. Description: make megengine python usable on apple arm macos
158 | 3. Difficulty: high
159 | 4. Community mentor: Zhang Haolong
160 | 5. Mentor contact: zhanghaolong@megvii.com
161 | 6. Deliverables:
162 |
163 | MegEngine python usable on apple arm macos
164 |
165 | - L0: local build passes
166 |
167 | - L1: the package is distributable, installable on other aarch64 macbooks, and runs
168 |
169 | - L2: a model can be trained on an aarch64 macbook
170 |
171 | 7. Technical requirements: c++, build systems, macos
172 |
173 | 8. Related open-source repositories: https://github.com/MegEngine/MegEngine
174 |
175 | ## Project 8: python whl support for Android (limited to mainstream ARM)
176 |
177 | 1. Title: python whl support for Android (limited to mainstream ARM)
178 | 2. Description: make megengine python usable on Android arm
179 | 3. Difficulty: high
180 | 4. Community mentor: Wang Bowen
181 | 5. Mentor contact: wangbowen02@megvii.com
182 | 6. Deliverables:
183 |
184 | MegEngine python usable on Android aarch64 (in a termux environment)
185 |
186 | - L0: NDK cross-build from x86 linux to aarch64 Android passes
187 |
188 | - L1: the package is distributable, installable on other aarch64 Android devices, and runs
189 |
190 | - L2: a small model can be trained on aarch64 Android under termux
191 |
192 | 7. Technical requirements: c++, build systems, Android
193 |
194 | 8. Related open-source repositories: https://github.com/MegEngine/MegEngine
195 |
196 | ## Project 9: Model conversion - ONNX to MegEngine
197 |
198 | 1. Title: Model conversion - ONNX to MegEngine
199 | 2. Description:
200 |
201 |    Background: the official mgeconvert repository currently only converts MegEngine models to other frameworks, not the other way around.
202 |    Goal: convert ONNX models to MegEngine.
203 |
204 |    - Support selecting the ONNX/MegEngine version
205 |    - Convert and dump a cpp model (the graph can be built with the MegEngine op graph and then dumped)
206 |    - Inference only; training is not supported.
207 |
208 | 3. Difficulty: medium
209 | 4. Community mentor: Xiong Peng
210 | 5. Mentor contact: xiongpeng@megvii.com
211 | 6. Deliverables:
212 |
213 | Conversion must cover the models in the model hub
214 |
215 | Approach:
216 |
217 | - Parse the ONNX model
218 | - Implement conversions between the various ONNX and MGE operators
219 | - Build the complete MGE model from the operators contained in the ONNX model
220 |
221 | Expected result: the converted model can run inference.
222 |
223 | - L0: implement model conversion for a specific version
224 | - L1: implement conversion from different ONNX versions to MGE
225 |
226 | 7. Technical requirements: Python, ONNX
227 |
228 | 8. Related open-source repositories:
229 |
230 | - https://github.com/MegEngine/mgeconvert
231 |
232 | - https://github.com/MegEngine/MegEngine/blob/master/imperative/python/megengine/utils/network.py
233 |
234 |
235 |
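The approach above (parse the graph, map each operator, rebuild the model) is the usual converter skeleton. A toy dispatch-table sketch of the operator-mapping step; the op names and builder callbacks here are hypothetical, not mgeconvert's API:

```python
# Hypothetical op-mapping table: ONNX op type -> builder emitting the
# target-framework node. Real converters walk onnx.ModelProto.graph.node.
OP_BUILDERS = {
    "Relu":   lambda inputs, attrs: ("relu", inputs),
    "MatMul": lambda inputs, attrs: ("matmul", inputs),
    "Add":    lambda inputs, attrs: ("add", inputs),
}

def convert_graph(onnx_nodes):
    """onnx_nodes: list of dicts with 'op_type', 'inputs', optional 'attrs'.
    Returns the rebuilt node list, failing loudly on unsupported ops."""
    out = []
    for node in onnx_nodes:
        builder = OP_BUILDERS.get(node["op_type"])
        if builder is None:
            raise NotImplementedError(f"unsupported ONNX op: {node['op_type']}")
        out.append(builder(node["inputs"], node.get("attrs", {})))
    return out
```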
236 |
--------------------------------------------------------------------------------
/2022_Open_Source_Promotion_Plan-Summer_MegEngine.md:
--------------------------------------------------------------------------------
1 |
2 | MegEngine is a fast, scalable, easy-to-use deep learning framework with automatic differentiation. It has 3 core strengths: unified training and inference, ultra-low hardware requirements, and efficient inference on all platforms.
3 |
4 | For the Open Source Promotion Plan - Summer 2022, MegEngine offers the following project tasks, each paired with a mentor to help participants complete the project.
5 |
6 | ## Project 1: Optimize deconvolution in the MegEngine Arm backend with nc4hw4 layout support
7 |
8 |
9 | 1. Description:
10 |    * Background: As a unified training/inference framework, MegEngine's inference performance on Arm matters a great deal. Benchmarks place MegEngine's Arm inference in the industry's first tier, and the main optimization behind that is supporting the NC4HW4 layout on Arm (for a detailed explanation of the layout, see this [Zhihu answer](https://www.zhihu.com/question/337513515)). The deconvolution Forward operator does not yet support this layout, and supporting it should improve performance.
11 |    * Requirement: support NC4HW4 layout computation in ConvolutionBackwardDataImpl in ArmCommon, so that ConvolutionBackwardDataImpl can compute under the NC4HW4 layout with performance no worse than the current NCHW layout.
12 |
13 | 2. Difficulty: advanced
14 |
15 | 3. Deliverables:
16 |    * Code meets the project's coding conventions
17 |    * For the same shapes, performance exceeds the current NCHW implementation
18 |
19 | 4. Technical requirements
20 |    * C++
21 |    * Deconvolution math
22 |    * Arm NEON optimization
23 |
24 | 5. Project repository: https://github.com/MegEngine/MegEngine
25 |
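For intuition, NC4HW4 packs channels in groups of 4 so that the 4 channel values for one spatial position sit contiguously (matching a 128-bit NEON lane of floats). A pure-Python repack sketch, assuming C is a multiple of 4:

```python
def nchw_to_nc4hw4(src, n, c, h, w):
    """Repack a flat NCHW float list into NC4HW4, i.e. shape
    (N, C//4, H, W, 4): the 4 values of one channel group at one
    (h, w) position become contiguous."""
    assert c % 4 == 0, "sketch assumes channels divisible by 4"
    dst = [0.0] * (n * c * h * w)
    for ni in range(n):
        for ci in range(c):
            for hi in range(h):
                for wi in range(w):
                    s = ((ni * c + ci) * h + hi) * w + wi
                    d = ((((ni * (c // 4) + ci // 4) * h + hi) * w + wi) * 4
                         + ci % 4)
                    dst[d] = src[s]
    return dst
```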
26 |
27 | ## Project 2: Process-isolated debugging for MegEngineLite
28 |
29 | 1. Description:
30 |    * Background: MegEngine sits at the bottom of the user's stack, so many crashes surface with a stack trace inside MegEngine, even when the real cause is another program in the user's environment stomping on Lite's memory, which merely makes the crash look like MegEngine's fault.
31 |    * Requirement: Lite should support a debug mode controlled by an environment variable (the mode must be configured before the caller invokes any LITE API, so it has to be decoupled from the API itself). In this mode, model execution forks into a separate process and runs there, isolating it from the user's process and preventing memory stomping.
32 |
33 | 2. Difficulty: advanced
34 |
35 | 3. Deliverables: the MegEngine inference service and the Lite API caller run in different processes, with efficient inter-process communication
36 |
37 | 4. Technical requirements
38 |    * C++/C
39 |    * Process creation on mainstream operating systems
40 |    * Efficient inter-process communication
41 |
42 | 5. Project repository: https://github.com/MegEngine/MegEngine
43 |
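The requested mode boils down to "run the model in a child process and ship the result back over IPC". A hedged Python sketch of that control flow; the environment variable name and `run_model` are made up for illustration:

```python
import os
from multiprocessing import Pipe, Process

def run_model(inputs):
    # Stand-in for the actual Lite model execution.
    return [x * 2.0 for x in inputs]

def _worker(conn, inputs):
    # A crash or memory stomp here kills only the child process,
    # leaving the caller's process intact.
    conn.send(run_model(inputs))
    conn.close()

def execute(inputs):
    """Run the model in-process normally, or in a child process when the
    (hypothetical) LITE_DEBUG_ISOLATE environment variable is set."""
    if os.environ.get("LITE_DEBUG_ISOLATE") != "1":
        return run_model(inputs)
    parent, child = Pipe()
    p = Process(target=_worker, args=(child, inputs))
    p.start()
    result = parent.recv()  # receive outputs over the pipe
    p.join()
    return result
```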
44 | ## Project 3: Add cross-modal model implementations to MegEngine
45 |
46 | 1. Description:
47 |    * Background: MegEngine Hub implements common algorithms such as classification and detection, but still lacks implementations of some of the latest deep learning research.
48 |    * Requirement: add inference code for the CLIP, VQGAN, DALL·E, and BigSleep models in MegEngine, verify that accuracy matches other frameworks' implementations, and add them to MegEngine Hub.
49 |
50 | 2. Difficulty: advanced
51 |
52 | 3. Deliverables:
53 |    * Contribute a repo with the implementation under https://github.com/MegEngine/ along with usage documentation
54 |    * Model outputs match other frameworks' results (e.g. a CLIP implementation can be checked against https://github.com/openai/CLIP)
55 |
56 | 4. Technical requirements
57 |    * Python
58 |    * Deep learning
59 |
60 | 5. Project repositories:
61 |    * https://github.com/MegEngine/MegEngine
62 |    * https://github.com/MegEngine/Models
63 |
64 | ## Project 4: python whl support for apple ARM macos
65 |
66 | 1. Description:
67 |    * Background: MegEngine has not yet been ported to apple aarch64 macos, and no MegEngine package is published for it. As apple aarch64 macos machines become more common, this need keeps growing.
68 |    * Requirement: port MegEngine to apple aarch64 macos and publish a MegEngine installation package for it.
69 |
70 | 2. Difficulty: advanced
71 |
72 | 3. Deliverables: a MegEngine wheel package for apple aarch64 macos.
73 |    * L0: MegEngine builds locally on apple aarch64 macos.
74 |    * L1: the package is distributable, installable on other aarch64 macbooks, and runs correctly.
75 |    * L2: a model can be trained after installation.
76 |
77 | 4. Technical requirements
78 |    * C++
79 |    * Build systems
80 |    * macos
81 |
82 | 5. Project repository: https://github.com/MegEngine/MegEngine
83 |
84 |
85 | Project application: https://summer-ospp.ac.cn/#/org/orgdetail/294bd67f-5476-40d4-bd74-2a9d5296cf5a/
86 |
87 | MegEngine website: https://www.megengine.org.cn/
88 |
89 | MegEngine technical QQ group: 1029741705
90 |
--------------------------------------------------------------------------------
/Contributor_Program.md:
--------------------------------------------------------------------------------
1 | # MegEngine Contributor Program
2 | ## Foreword
3 | There is no AI without open source. The open-source spirit has powered AI's rapid growth over the past decade: open source lets knowledge be shared effectively and puts tools in everyone's hands. A thriving open-source community depends on the long-term commitment of the community and its contributors, and on a relentless pursuit of code quality.
4 | MegEngine's growth is inseparable from the strong support of developers across its ecosystem. To better reward deep learning developers, the MegEngine community launches the "MegEngine Contributor Program", recruiting more contributors to build the community's ecosystem, so that MegEngine and its contributors can witness and grow with each other.
5 |
6 | The MegEngine Contributor Program (MECP) is organized by the MegEngine Open Source Committee. Its goal is to build the MegEngine open-source community together, foster its prosperity, and put the open-source spirit into practice.
7 | ## Who it is for
8 | University students
9 | ## Community
10 | MegEngine: https://github.com/MegEngine
11 | ## Task types
12 | ### Becoming a contributor
13 | Anyone with a merged PR in any repo under https://github.com/MegEngine/ - MegEngine, Models, Hub, Documentation, etc. - automatically becomes a MegEngine Contributor.
14 |
15 | Every Contributor receives an exclusive badge and a limited-edition community gift: the "commemorative alchemy furnace".
16 |
17 | Fill in the [registration form](http://i8x1e20aau991m05.mikecrm.com/YhT8Kpw); the community assistant (WeChat: megengine_bot) will verify your GitHub ID and information, then arrange shipping.
18 |
19 | ### Awesome MegEngineer
20 |
21 | Awesome MegEngineer is an honor the MegEngine community awards to outstanding contributors, in gratitude for their exceptional contributions to MegEngine.
22 |
23 | Meeting any two of the following qualifies you to apply:
24 | + Use MegEngine in a research project or a published paper;
25 | + Use MegEngine in a public competition and place in the top 3;
26 | + Create a MegEngine-related open-source project that is accepted into the awesome-megengine repo;
27 | + Have at least 2 PRs merged in the GitHub MegEngine organization;
28 | + Publish 1 original MegEngine-related article on a media/self-media platform;
29 | + Give 1 online/offline technical talk about MegEngine;
30 | + Organize 1 local/campus offline technical event with 20+ attendees;
31 |
32 | **Details: [official website](https://www.megengine.org.cn/community-AMGE)**
33 |
34 |
35 | The MegEngine Open Source Committee reserves the right of final interpretation of the MegEngine Contributor Program.
36 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Apache License
2 | Version 2.0, January 2004
3 | http://www.apache.org/licenses/
4 |
5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6 |
7 | 1. Definitions.
8 |
9 | "License" shall mean the terms and conditions for use, reproduction,
10 | and distribution as defined by Sections 1 through 9 of this document.
11 |
12 | "Licensor" shall mean the copyright owner or entity authorized by
13 | the copyright owner that is granting the License.
14 |
15 | "Legal Entity" shall mean the union of the acting entity and all
16 | other entities that control, are controlled by, or are under common
17 | control with that entity. For the purposes of this definition,
18 | "control" means (i) the power, direct or indirect, to cause the
19 | direction or management of such entity, whether by contract or
20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the
21 | outstanding shares, or (iii) beneficial ownership of such entity.
22 |
23 | "You" (or "Your") shall mean an individual or Legal Entity
24 | exercising permissions granted by this License.
25 |
26 | "Source" form shall mean the preferred form for making modifications,
27 | including but not limited to software source code, documentation
28 | source, and configuration files.
29 |
30 | "Object" form shall mean any form resulting from mechanical
31 | transformation or translation of a Source form, including but
32 | not limited to compiled object code, generated documentation,
33 | and conversions to other media types.
34 |
35 | "Work" shall mean the work of authorship, whether in Source or
36 | Object form, made available under the License, as indicated by a
37 | copyright notice that is included in or attached to the work
38 | (an example is provided in the Appendix below).
39 |
40 | "Derivative Works" shall mean any work, whether in Source or Object
41 | form, that is based on (or derived from) the Work and for which the
42 | editorial revisions, annotations, elaborations, or other modifications
43 | represent, as a whole, an original work of authorship. For the purposes
44 | of this License, Derivative Works shall not include works that remain
45 | separable from, or merely link (or bind by name) to the interfaces of,
46 | the Work and Derivative Works thereof.
47 |
48 | "Contribution" shall mean any work of authorship, including
49 | the original version of the Work and any modifications or additions
50 | to that Work or Derivative Works thereof, that is intentionally
51 | submitted to Licensor for inclusion in the Work by the copyright owner
52 | or by an individual or Legal Entity authorized to submit on behalf of
53 | the copyright owner. For the purposes of this definition, "submitted"
54 | means any form of electronic, verbal, or written communication sent
55 | to the Licensor or its representatives, including but not limited to
56 | communication on electronic mailing lists, source code control systems,
57 | and issue tracking systems that are managed by, or on behalf of, the
58 | Licensor for the purpose of discussing and improving the Work, but
59 | excluding communication that is conspicuously marked or otherwise
60 | designated in writing by the copyright owner as "Not a Contribution."
61 |
62 | "Contributor" shall mean Licensor and any individual or Legal Entity
63 | on behalf of whom a Contribution has been received by Licensor and
64 | subsequently incorporated within the Work.
65 |
66 | 2. Grant of Copyright License. Subject to the terms and conditions of
67 | this License, each Contributor hereby grants to You a perpetual,
68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69 | copyright license to reproduce, prepare Derivative Works of,
70 | publicly display, publicly perform, sublicense, and distribute the
71 | Work and such Derivative Works in Source or Object form.
72 |
73 | 3. Grant of Patent License. Subject to the terms and conditions of
74 | this License, each Contributor hereby grants to You a perpetual,
75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76 | (except as stated in this section) patent license to make, have made,
77 | use, offer to sell, sell, import, and otherwise transfer the Work,
78 | where such license applies only to those patent claims licensable
79 | by such Contributor that are necessarily infringed by their
80 | Contribution(s) alone or by combination of their Contribution(s)
81 | with the Work to which such Contribution(s) was submitted. If You
82 | institute patent litigation against any entity (including a
83 | cross-claim or counterclaim in a lawsuit) alleging that the Work
84 | or a Contribution incorporated within the Work constitutes direct
85 | or contributory patent infringement, then any patent licenses
86 | granted to You under this License for that Work shall terminate
87 | as of the date such litigation is filed.
88 |
89 | 4. Redistribution. You may reproduce and distribute copies of the
90 | Work or Derivative Works thereof in any medium, with or without
91 | modifications, and in Source or Object form, provided that You
92 | meet the following conditions:
93 |
94 | (a) You must give any other recipients of the Work or
95 | Derivative Works a copy of this License; and
96 |
97 | (b) You must cause any modified files to carry prominent notices
98 | stating that You changed the files; and
99 |
100 | (c) You must retain, in the Source form of any Derivative Works
101 | that You distribute, all copyright, patent, trademark, and
102 | attribution notices from the Source form of the Work,
103 | excluding those notices that do not pertain to any part of
104 | the Derivative Works; and
105 |
106 | (d) If the Work includes a "NOTICE" text file as part of its
107 | distribution, then any Derivative Works that You distribute must
108 | include a readable copy of the attribution notices contained
109 | within such NOTICE file, excluding those notices that do not
110 | pertain to any part of the Derivative Works, in at least one
111 | of the following places: within a NOTICE text file distributed
112 | as part of the Derivative Works; within the Source form or
113 | documentation, if provided along with the Derivative Works; or,
114 | within a display generated by the Derivative Works, if and
115 | wherever such third-party notices normally appear. The contents
116 | of the NOTICE file are for informational purposes only and
117 | do not modify the License. You may add Your own attribution
118 | notices within Derivative Works that You distribute, alongside
119 | or as an addendum to the NOTICE text from the Work, provided
120 | that such additional attribution notices cannot be construed
121 | as modifying the License.
122 |
123 | You may add Your own copyright statement to Your modifications and
124 | may provide additional or different license terms and conditions
125 | for use, reproduction, or distribution of Your modifications, or
126 | for any such Derivative Works as a whole, provided Your use,
127 | reproduction, and distribution of the Work otherwise complies with
128 | the conditions stated in this License.
129 |
130 | 5. Submission of Contributions. Unless You explicitly state otherwise,
131 | any Contribution intentionally submitted for inclusion in the Work
132 | by You to the Licensor shall be under the terms and conditions of
133 | this License, without any additional terms or conditions.
134 | Notwithstanding the above, nothing herein shall supersede or modify
135 | the terms of any separate license agreement you may have executed
136 | with Licensor regarding such Contributions.
137 |
138 | 6. Trademarks. This License does not grant permission to use the trade
139 | names, trademarks, service marks, or product names of the Licensor,
140 | except as required for reasonable and customary use in describing the
141 | origin of the Work and reproducing the content of the NOTICE file.
142 |
143 | 7. Disclaimer of Warranty. Unless required by applicable law or
144 | agreed to in writing, Licensor provides the Work (and each
145 | Contributor provides its Contributions) on an "AS IS" BASIS,
146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147 | implied, including, without limitation, any warranties or conditions
148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149 | PARTICULAR PURPOSE. You are solely responsible for determining the
150 | appropriateness of using or redistributing the Work and assume any
151 | risks associated with Your exercise of permissions under this License.
152 |
153 | 8. Limitation of Liability. In no event and under no legal theory,
154 | whether in tort (including negligence), contract, or otherwise,
155 | unless required by applicable law (such as deliberate and grossly
156 | negligent acts) or agreed to in writing, shall any Contributor be
157 | liable to You for damages, including any direct, indirect, special,
158 | incidental, or consequential damages of any character arising as a
159 | result of this License or out of the use or inability to use the
160 | Work (including but not limited to damages for loss of goodwill,
161 | work stoppage, computer failure or malfunction, or any and all
162 | other commercial damages or losses), even if such Contributor
163 | has been advised of the possibility of such damages.
164 |
165 | 9. Accepting Warranty or Additional Liability. While redistributing
166 | the Work or Derivative Works thereof, You may choose to offer,
167 | and charge a fee for, acceptance of support, warranty, indemnity,
168 | or other liability obligations and/or rights consistent with this
169 | License. However, in accepting such obligations, You may act only
170 | on Your own behalf and on Your sole responsibility, not on behalf
171 | of any other Contributor, and only if You agree to indemnify,
172 | defend, and hold each Contributor harmless for any liability
173 | incurred by, or claims asserted against, such Contributor by reason
174 | of your accepting any such warranty or additional liability.
175 |
176 | END OF TERMS AND CONDITIONS
177 |
178 | APPENDIX: How to apply the Apache License to your work.
179 |
180 | To apply the Apache License to your work, attach the following
181 | boilerplate notice, with the fields enclosed by brackets "[]"
182 | replaced with your own identifying information. (Don't include
183 | the brackets!) The text should be enclosed in the appropriate
184 | comment syntax for the file format. We also recommend that a
185 | file or class name and description of purpose be included on the
186 | same "printed page" as the copyright notice for easier
187 | identification within third-party archives.
188 |
189 | Copyright [yyyy] [name of copyright owner]
190 |
191 | Licensed under the Apache License, Version 2.0 (the "License");
192 | you may not use this file except in compliance with the License.
193 | You may obtain a copy of the License at
194 |
195 | http://www.apache.org/licenses/LICENSE-2.0
196 |
197 | Unless required by applicable law or agreed to in writing, software
198 | distributed under the License is distributed on an "AS IS" BASIS,
199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200 | See the License for the specific language governing permissions and
201 | limitations under the License.
202 |
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/1.png
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/2.png
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/3.png
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/4.png
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/5.png
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/6.png
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/7.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/7.png
--------------------------------------------------------------------------------
/MegEngine Meetup/31_31 conv docs images/8.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/31_31 conv docs images/8.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-1.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-10.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-10.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-11.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-11.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-12.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-12.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-13.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-13.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-14.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-14.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-15.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-15.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-16.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-16.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-17.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-17.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-18.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-18.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-19.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-19.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-2.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-20.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-20.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-3.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-4.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-5.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-6.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-7.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-7.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-8.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-8.png
--------------------------------------------------------------------------------
/MegEngine Meetup/DTR docs image/dtr-9.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/DTR docs image/dtr-9.png
--------------------------------------------------------------------------------
/MegEngine Meetup/MegCC 用模型编译的方式实现超轻量端上高性能推理.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/MegCC 用模型编译的方式实现超轻量端上高性能推理.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/MegEngine Meetup_Pytorch与MegEngine之间不得不说的事__MegEngine Meetup No.4.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/MegEngine Meetup_Pytorch与MegEngine之间不得不说的事__MegEngine Meetup No.4.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《2020 年代的卷积网络》_2022.3.19.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《2020 年代的卷积网络》_2022.3.19.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《JIT in MegEngine》_MegEngine Meetup No.1.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《JIT in MegEngine》_MegEngine Meetup No.1.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《MegEngine 中的动态图 Sublinear 显存优化》_MegEngine Meetup No.3.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《MegEngine 中的动态图 Sublinear 显存优化》_MegEngine Meetup No.3.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《MegEngine 大 kernel 工程优化实践》_2022.3.19.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《MegEngine 大 kernel 工程优化实践》_2022.3.19.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《MegEngine 开源一周年架构回顾》_MegEngine Meetup No.2.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《MegEngine 开源一周年架构回顾》_MegEngine Meetup No.2.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《MegEngine“花式整活”纪实》_MegEngine Meetup No.5.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《MegEngine“花式整活”纪实》_MegEngine Meetup No.5.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《利用MegEngine的分布式通信算子实现复杂的并行训练》 _MegEngine Meetup No.2..pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《利用MegEngine的分布式通信算子实现复杂的并行训练》 _MegEngine Meetup No.2..pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《手把手带你玩转最新开源”看图神器“ MegSpot》- MegEngine Meetup No.9.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《手把手带你玩转最新开源”看图神器“ MegSpot》- MegEngine Meetup No.9.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/《超大卷积核架构设计与高效实践》_2022.3.19.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/《超大卷积核架构设计与高效实践》_2022.3.19.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/中国农业人工智能创新创业大赛分享_MegEngine Meetup No.6.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/中国农业人工智能创新创业大赛分享_MegEngine Meetup No.6.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/深度学习框架 MegEngine CUDA int4 推理详解.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/深度学习框架 MegEngine CUDA int4 推理详解.pdf
--------------------------------------------------------------------------------
/MegEngine Meetup/聊聊底层优化_MegEngine Meetup No.7.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/MegEngine Meetup/聊聊底层优化_MegEngine Meetup No.7.pdf
--------------------------------------------------------------------------------
/Open_Source_Promotion_Plan-Summer_2020_MegEngine.md:
--------------------------------------------------------------------------------
 1 | ## Open Source Promotion Plan: the MegEngine Project
 2 | 天元 (MegEngine) was born out of Megvii's long-running experience deploying large-scale AI in production. That kind of workload poses many challenges for a deep learning framework, and in the course of solving them MegEngine developed its defining traits: unified dynamic/static execution, broad compatibility, flexibility and efficiency, and integrated training and inference. It helps developers design, train, and deploy deep learning algorithms efficiently, raising AI R&D productivity.
 3 | Official community site: https://megengine.org.cn/
 4 |
 5 | MegEngine developer QQ group: 1029741705
 6 |
 7 | ### Task list
 8 | > Note: every task must be workable in the purely open-source version, so it is best to avoid very low-level C++ code (external contributors have no phone/Arm devices to test on).
 9 |
10 | Task | Description | Difficulty | Mentor | Details | Skills
11 | ---|---|---|---|---|---
12 | Add a commit check to GitHub CI | Add a commit message checker to enforce a consistent commit format for contributors | Medium | 张禄<br>zhanglu@megvii.com | 1. Use the commitlint tool to standardize commit messages<br>2. Integrate it with GitHub Actions | 1. Shell<br>2. Git
13 | Implement a MegEngine-to-Caffe converter | Write a converter that turns models trained with MegEngine into Caffe models | High | 陈远昭<br>chenyuanzhao@megvii.com | 1. Study the MegEngine-to-MACE converter<br>2. Write the Caffe converter<br>3. Test correctness and performance | 1. Python<br>2. Caffe
14 | Implement a MegEngine-to-TFLite converter | Write a converter that turns models trained with MegEngine into TFLite models | High | 陈远昭<br>chenyuanzhao@megvii.com | 1. Study the MegEngine-to-MACE converter<br>2. Write the TFLite converter<br>3. Test correctness and performance | 1. Python<br>2. TFLite
15 | Add some commonly used operators | Add operators that are commonly needed but currently missing from MegEngine; most only require a Python-level wrapper.<br>Operators include:<br>deformable_conv<br>deformable_ps_roipooing<br>mask_convolution<br>matinv<br>svd<br>cumsum<br>batchrenormalization<br>adaptive_max_pooling<br>adaptive_avg_pooling<br>maxout | Medium | 胡焜<br>hukun@megvii.com | 1. Implement the operators<br>2. Add tests that prove correctness<br>3. Add detailed docstrings | 1. Python
16 | Add common classification models | Classification models such as MobileNet/Inception. No full training/inference code and no pretrained weights are needed, only the model implementations, collected under vision/classification/models[contribution], similar to torchvision; examples can be provided at the start. | Medium | 周亦庄<br>zhouyizhuang@megvii.com | InceptionNet<br>GoogLeNet<br>EfficientNet<br>MobileNet v1/v2/v3<br>SqueezeNet<br>DenseNet<br>NASNet series | 1. Python<br>2. PyTorch
17 | Implement an assistant library | Emit warnings for error-prone usage patterns, much like the helper hints in a game.<br>For example, when momentum < 0.5, warn: "The momentum of batch normalization layer rarely uses a value less than 0.5. Please check the document for momentum's definition, which is different from PyTorch."<br>Warn at most once; a user who understands the risk can ignore the warning or declare "I know this risk" explicitly via an API. | High | 许欣然<br>xxr@megvii.com | 1. Implement a warning mechanism<br>2. Handle momentum and similar cases, warning where mistakes are likely<br>3. Write tests | 1. Python<br>2. Experience using deep learning frameworks
18 | MegEngine network visualization | Visualize networks trained with MegEngine, based on Netron | High | 许欣然<br>xxr@megvii.com | | 1. JavaScript<br>2. Python
19 | C++ documentation generation | Generate C++ usage documentation for the official website, based on Doxygen and Sphinx | Medium | 许欣然<br>xxr@megvii.com | | 1. Python
20 | Add model checks to Models-CI | Trial-run the models open-sourced in Models so that problems in Models are located promptly | High | 王枫<br>wangfeng02@megvii.com | 1. Learn about GitHub workflows<br>2. Implement the trial-run logic in CI<br>3. Write the corresponding tests | 1. Python<br>2. Shell<br>3. Git
21 |
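The assistant-library task asks for warnings that fire at most once and can be silenced by a user who explicitly accepts the risk. A minimal sketch of such a mechanism, with the BN-momentum example from the task description (all names here are hypothetical, not MegEngine API):

```python
import warnings

_seen = set()          # keys that have already warned
_acknowledged = set()  # keys the user has explicitly accepted

def acknowledge(key):
    """User-facing switch: 'I understand this risk, stop warning me.'"""
    _acknowledged.add(key)

def warn_once(key, message):
    """Emit `message` at most once per `key`, unless the user opted out."""
    if key in _acknowledged or key in _seen:
        return False
    _seen.add(key)
    warnings.warn(message)
    return True

def check_bn_momentum(momentum):
    # Example check from the task description: BN momentum below 0.5 is
    # rarely intended (and its definition differs from PyTorch's).
    if momentum < 0.5:
        return warn_once(
            "bn_momentum",
            "The momentum of batch normalization layer rarely uses a value "
            "less than 0.5; check the documentation for its definition.",
        )
    return False
```

A real implementation would likely key warnings per call site rather than globally, but the warn-once / acknowledge split is the core of the task.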
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | Companion resources for MegEngine, including technical articles, activities, and the latest news.
2 |
--------------------------------------------------------------------------------
/competition activities/slides/README.md:
--------------------------------------------------------------------------------
1 | Slide decks for competition activities.
2 |
--------------------------------------------------------------------------------
/competition activities/slides/旷视 AI 智慧交通开源大赛-赛题讲解.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/competition activities/slides/旷视 AI 智慧交通开源大赛-赛题讲解.pdf
--------------------------------------------------------------------------------
/competition activities/slides/旷视 AI 智慧交通开源赛 - MegStudio 介绍.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/competition activities/slides/旷视 AI 智慧交通开源赛 - MegStudio 介绍.pdf
--------------------------------------------------------------------------------
/competition activities/slides/视频超分辨率分享.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/competition activities/slides/视频超分辨率分享.pdf
--------------------------------------------------------------------------------
/competition activities/slides/金枪鱼之夜:MegEngine 框架设计.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/competition activities/slides/金枪鱼之夜:MegEngine 框架设计.pdf
--------------------------------------------------------------------------------
/competition activities/旷视人工智能开源大赛_视频超分辨率.md:
--------------------------------------------------------------------------------
 1 | ## Background
 2 | Restoring the picture quality of compressed video has real value: it improves the viewing experience in bandwidth-constrained network environments and on high-resolution displays. This competition centers on that theme, seeking the best solution to a practical problem.
 3 |
 4 | ## Details
 5 | **Topic**: Video super-resolution
 6 | **Dates**: 2020/08/10 - 2020/09/24
 7 | **Details**: https://studio.brainpp.com/competition
 8 |
 9 | ## Video walkthrough
10 | https://www.bilibili.com/video/BV19C4y1t7Ae
11 |
12 | ## Slides
13 | [视频超分辨率分享.pdf](./slides/视频超分辨率分享.pdf)
14 |
15 |
--------------------------------------------------------------------------------
/competition activities/金枪鱼之夜:MegEngine 框架设计.md:
--------------------------------------------------------------------------------
 1 | ## Background
 2 | Wave after wave of the AI boom makes it feel as if anyone not doing algorithms is behind the times.
 3 | A deep learning framework handles the computation details of every device, differentiation, and the optimization of computing sequences; in the two very different worlds of dynamic and static graphs, each of these steps has its own optimization opportunities and bottlenecks.
 4 | How do we strike an efficient balance between the two? How do we get past the countless pitfalls of deploying a trained model for inference (ask anyone who has tried)? And what exactly is the "integrated training and inference" approach said to crush this problem outright?
 5 |
 6 | ## Talk
 7 | MegEngine (天元) is the deep learning framework Megvii has built and used company-wide for six years, the work of a team that chose to sell shovels during the gold rush. The speaker is 许欣然, Senior Technical Director of AI Systems at Megvii Research and technical lead of MegEngine.
 8 | He walks us through how a deep learning framework progressively optimizes a network definition and finally executes it, looking at deep learning from a framework developer's point of view.
 9 |
10 | **Video:** https://www.bilibili.com/video/BV11C4y1t7xH
11 |
12 | **Slides:** [金枪鱼之夜:MegEngine 框架设计.pdf](./slides/金枪鱼之夜:MegEngine%20框架设计.pdf)
13 |
14 | ## Outline
15 | - Background
16 |   - What is a deep learning framework for?
17 |   - We get the point, so why build yet another deep learning framework?
18 |   - Why not just use PyTorch / TensorFlow?
19 |   - What is "integrated training and inference"?
20 | - How to write a deep learning framework (greatly simplified)
21 |   - Dynamic-graph training
22 |     - Call = execute
23 |     - Dependency graph: forward & backward
24 |     - megdnn kernel
25 |     - exec
26 |     - Shape deduction
27 |   - Static-graph training + inference (rough version)
28 |     - Tensor
29 |     - Graph, SymbolVar
30 |     - CompNode
31 |     - Shape inference
32 |     - Graph optimization
33 |     - Topological sorting
34 |     - Memory optimization
35 |     - Computing sequence
36 | - How does a vintage static-graph framework become a dynamic-graph framework?
37 |   - Dynamic Region
38 |   - Eager Graph
39 |   - Eager Runtime + Proxy Graph
40 | - Outlook
41 |   - Adapting to all kinds of chips and modules challenges the train-inference integration idea
42 |   - The rise of MLIR and related technologies
43 |   - How to achieve true JIT
44 |
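The static-graph steps in the outline build a dependency graph and topologically sort it into a computing sequence. A toy illustration of that step using Kahn's algorithm (this is an explanatory sketch, not MegEngine code):

```python
from collections import deque

def computing_sequence(ops):
    """Order ops so that every op runs after all of its inputs.

    `ops` maps an op name to the list of op names it depends on.
    """
    indegree = {op: len(deps) for op, deps in ops.items()}
    users = {op: [] for op in ops}  # reverse edges: op -> ops that consume it
    for op, deps in ops.items():
        for d in deps:
            users[d].append(op)
    ready = deque(op for op, n in indegree.items() if n == 0)
    order = []
    while ready:
        op = ready.popleft()
        order.append(op)
        for u in users[op]:
            indegree[u] -= 1
            if indegree[u] == 0:
                ready.append(u)
    if len(order) != len(ops):
        raise ValueError("dependency cycle in graph")
    return order

# A tiny graph: conv -> bn -> relu, plus a residual add of input and relu.
graph = {
    "input": [],
    "conv": ["input"],
    "bn": ["conv"],
    "relu": ["bn"],
    "add": ["input", "relu"],
}
```

In a real framework the same ordering is the hook for the memory optimizations the outline mentions: once the sequence is fixed, each tensor's live range is known and its storage can be reused.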
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/compare.py:
--------------------------------------------------------------------------------
 1 | # Compare the MegEngine and PyTorch (timm) MLP-Mixer implementations on the same inputs.
 2 | # The local `model` module is expected to provide `mge` and `mlp_mixer_b16_224`.
 3 | from model import *
 4 |
 5 | import json
 6 | from glob import glob
 7 | import numpy as np
 8 | import torch
 9 | import torchvision
10 | import timm
11 |
12 |
13 | def read_image(path):
14 |     """Load an image and preprocess it the way timm's MLP-Mixer expects."""
15 |     img = torchvision.io.read_image(path)
16 |     img = img.float() / 255
17 |     img = torchvision.transforms.functional.resize(img, 224)
18 |     img = torchvision.transforms.functional.center_crop(img, 224)
19 |     img = torchvision.transforms.functional.normalize(img, [0.5, 0.5, 0.5], [0.5, 0.5, 0.5])
20 |     return img.numpy()
21 |
22 |
23 | def softmax(logits):
24 |     # Subtract the row-wise max first for numerical stability.
25 |     logits = logits - logits.max(-1, keepdims=True)
26 |     exp = np.exp(logits)
27 |     return exp / exp.sum(-1, keepdims=True)
28 |
29 |
30 | model = mlp_mixer_b16_224(pretrained=True)
31 | torch_model = timm.models.mlp_mixer.mixer_b16_224(pretrained=True)
32 | # Alternatively, copy the timm weights into the MegEngine model directly:
33 | # model.load_from_torch(torch_model.state_dict())
34 |
35 | data = np.stack([read_image(path) for path in glob('data/*.jpg')])
36 | print(f'input shape {data.shape} max {data.max()} min {data.min()}')
37 | text_labels = json.load(open('imagenet-labels.json'))
38 |
39 | logits = model(mge.tensor(data)).numpy()
40 | torch_model.eval()
41 | with torch.no_grad():
42 |     torch_logits = torch_model(torch.tensor(data)).numpy()
43 |
44 | # The two implementations should agree up to floating-point error.
45 | np.testing.assert_allclose(torch_logits, logits, rtol=1e-3)
46 |
47 | # For each image, print each framework's top-1 label and its probability:
48 | # torch first, then megengine.
49 | print()
50 | print('torch')
51 | print('megengine')
52 | print()
53 | for p1, p2 in zip(softmax(torch_logits), softmax(logits)):
54 |     print(text_labels[p1.argmax()], p1.max())
55 |     print(text_labels[p2.argmax()], p2.max())
56 |     print()
57 |
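compare.py asserts elementwise closeness of the raw logits. A looser but also useful sanity check is that both frameworks agree on the top-1 class for every image; a small numpy-only sketch (the helper name is hypothetical, not part of the submission format):

```python
import numpy as np

def top1_agreement(logits_a, logits_b):
    """Fraction of rows where both logit matrices pick the same class."""
    pred_a = np.argmax(logits_a, axis=-1)
    pred_b = np.argmax(logits_b, axis=-1)
    return float(np.mean(pred_a == pred_b))

# Toy example: two nearly identical batches of logits.
a = np.array([[0.1, 2.0, 0.3], [1.5, 0.2, 0.1]])
b = a + np.random.default_rng(0).normal(scale=1e-3, size=a.shape)
```

This check is robust to small numerical drift between backends, whereas `assert_allclose` with a tight `rtol` can fail even when every prediction matches.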
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Cat_close-up_2004_b.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Cat_close-up_2004_b.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Cat_eyes_2007-1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Cat_eyes_2007-1.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Cat_kitten_side_profile.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Cat_kitten_side_profile.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Red_cat_Felis_silvestris_catus.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Red_cat_Felis_silvestris_catus.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Siberian_black_tabby_blotched_cat_01.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/640px-Siberian_black_tabby_blotched_cat_01.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/Black_cat_IMG_1618.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/Black_cat_IMG_1618.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/Felis_catus-cat_on_snow.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/Felis_catus-cat_on_snow.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/Tabby_cat_with_blue_eyes-3336579.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/Tabby_cat_with_blue_eyes-3336579.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/Tony_senior_cat_6-6-22.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/Tony_senior_cat_6-6-22.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/data/lADPDgQ9qoGDocbNAnLNAZ0_413_626.jpg_620x10000q90g.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/【2022 翻倍争夺战提交示例】mlp-mixer/data/lADPDgQ9qoGDocbNAnLNAZ0_413_626.jpg_620x10000q90g.jpg
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/imagenet-labels.json:
--------------------------------------------------------------------------------
1 | ["tench, Tinca tinca", "goldfish, Carassius auratus", "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias", "tiger shark, Galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, Struthio camelus", "brambling, Fringilla montifringilla", "goldfinch, Carduelis carduelis", "house finch, linnet, Carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, Passerina cyanea", "robin, American robin, Turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, American eagle, Haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, Strix nebulosa", "European fire salamander, Salamandra salamandra", "common newt, Triturus vulgaris", "eft", "spotted salamander, Ambystoma maculatum", "axolotl, mud puppy, Ambystoma mexicanum", "bullfrog, Rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui", "loggerhead, loggerhead turtle, Caretta caretta", "leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, Iguana iguana", "American chameleon, anole, Anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, Chlamydosaurus kingi", "alligator lizard", "Gila monster, Heloderma suspectum", "green lizard, Lacerta viridis", "African chameleon, Chamaeleo chamaeleon", "Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis", "African crocodile, Nile crocodile, Crocodylus niloticus", "American alligator, Alligator mississipiensis", "triceratops", "thunder snake, worm snake, Carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", 
"night snake, Hypsiglena torquata", "boa constrictor, Constrictor constrictor", "rock python, rock snake, Python sebae", "Indian cobra, Naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, Cerastes cornutus", "diamondback, diamondback rattlesnake, Crotalus adamanteus", "sidewinder, horned rattlesnake, Crotalus cerastes", "trilobite", "harvestman, daddy longlegs, Phalangium opilio", "scorpion", "black and gold garden spider, Argiope aurantia", "barn spider, Araneus cavaticus", "garden spider, Aranea diademata", "black widow, Latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, Bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "African grey, African gray, Psittacus erithacus", "macaw", "sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, Mergus serrator", "goose", "black swan, Cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "Dungeness crab, Cancer magister", "rock crab, Cancer irroratus", "fiddler crab", "king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica", "American lobster, Northern lobster, Maine lobster, Homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit 
crab", "isopod", "white stork, Ciconia ciconia", "black stork, Ciconia nigra", "spoonbill", "flamingo", "little blue heron, Egretta caerulea", "American egret, great white heron, Egretta albus", "bittern", "crane", "limpkin, Aramus pictus", "European gallinule, Porphyrio porphyrio", "American coot, marsh hen, mud hen, water hen, Fulica americana", "bustard", "ruddy turnstone, Arenaria interpres", "red-backed sandpiper, dunlin, Erolia alpina", "redshank, Tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, Aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, Orcinus orca", "dugong, Dugong dugon", "sea lion", "Chihuahua", "Japanese spaniel", "Maltese dog, Maltese terrier, Maltese", "Pekinese, Pekingese, Peke", "Shih-Tzu", "Blenheim spaniel", "papillon", "toy terrier", "Rhodesian ridgeback", "Afghan hound, Afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "Walker hound, Walker foxhound", "English foxhound", "redbone", "borzoi, Russian wolfhound", "Irish wolfhound", "Italian greyhound", "whippet", "Ibizan hound, Ibizan Podenco", "Norwegian elkhound, elkhound", "otterhound, otter hound", "Saluki, gazelle hound", "Scottish deerhound, deerhound", "Weimaraner", "Staffordshire bullterrier, Staffordshire bull terrier", "American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier", "Bedlington terrier", "Border terrier", "Kerry blue terrier", "Irish terrier", "Norfolk terrier", "Norwich terrier", "Yorkshire terrier", "wire-haired fox terrier", "Lakeland terrier", "Sealyham terrier, Sealyham", "Airedale, Airedale terrier", "cairn, cairn terrier", "Australian terrier", "Dandie Dinmont, Dandie Dinmont terrier", "Boston bull, Boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "Scotch terrier, 
Scottish terrier, Scottie", "Tibetan terrier, chrysanthemum dog", "silky terrier, Sydney silky", "soft-coated wheaten terrier", "West Highland white terrier", "Lhasa, Lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "Labrador retriever", "Chesapeake Bay retriever", "German short-haired pointer", "vizsla, Hungarian pointer", "English setter", "Irish setter, red setter", "Gordon setter", "Brittany spaniel", "clumber, clumber spaniel", "English springer, English springer spaniel", "Welsh springer spaniel", "cocker spaniel, English cocker spaniel, cocker", "Sussex spaniel", "Irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "Old English sheepdog, bobtail", "Shetland sheepdog, Shetland sheep dog, Shetland", "collie", "Border collie", "Bouvier des Flandres, Bouviers des Flandres", "Rottweiler", "German shepherd, German shepherd dog, German police dog, alsatian", "Doberman, Doberman pinscher", "miniature pinscher", "Greater Swiss Mountain dog", "Bernese mountain dog", "Appenzeller", "EntleBucher", "boxer", "bull mastiff", "Tibetan mastiff", "French bulldog", "Great Dane", "Saint Bernard, St Bernard", "Eskimo dog, husky", "malamute, malemute, Alaskan malamute", "Siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "Leonberg", "Newfoundland, Newfoundland dog", "Great Pyrenees", "Samoyed, Samoyede", "Pomeranian", "chow, chow chow", "keeshond", "Brabancon griffon", "Pembroke, Pembroke Welsh corgi", "Cardigan, Cardigan Welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "Mexican hairless", "timber wolf, grey wolf, gray wolf, Canis lupus", "white wolf, Arctic wolf, Canis lupus tundrarum", "red wolf, maned wolf, Canis rufus, Canis niger", "coyote, prairie wolf, brush wolf, Canis latrans", "dingo, warrigal, warragal, Canis dingo", "dhole, Cuon alpinus", "African hunting dog, hyena dog, Cape hunting dog, 
Lycaon pictus", "hyena, hyaena", "red fox, Vulpes vulpes", "kit fox, Vulpes macrotis", "Arctic fox, white fox, Alopex lagopus", "grey fox, gray fox, Urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "Persian cat", "Siamese cat, Siamese", "Egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, Felis concolor", "lynx, catamount", "leopard, Panthera pardus", "snow leopard, ounce, Panthera uncia", "jaguar, panther, Panthera onca, Felis onca", "lion, king of beasts, Panthera leo", "tiger, Panthera tigris", "cheetah, chetah, Acinonyx jubatus", "brown bear, bruin, Ursus arctos", "American black bear, black bear, Ursus americanus, Euarctos americanus", "ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus", "sloth bear, Melursus ursinus, Ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, Danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "Angora, Angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, Sciurus niger", "marmot", "beaver", "guinea pig, Cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, Sus scrofa", "wild boar, boar, Sus scrofa", "warthog", 
"hippopotamus, hippo, river horse, Hippopotamus amphibius", "ox", "water buffalo, water ox, Asiatic buffalo, Bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis", "ibex, Capra ibex", "hartebeest", "impala, Aepyceros melampus", "gazelle", "Arabian camel, dromedary, Camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, Mustela putorius", "black-footed ferret, ferret, Mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, Bradypus tridactylus", "orangutan, orang, orangutang, Pongo pygmaeus", "gorilla, Gorilla gorilla", "chimpanzee, chimp, Pan troglodytes", "gibbon, Hylobates lar", "siamang, Hylobates syndactylus, Symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, Erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, Nasalis larvatus", "marmoset", "capuchin, ringtail, Cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, Ateles geoffroyi", "squirrel monkey, Saimiri sciureus", "Madagascar cat, ring-tailed lemur, Lemur catta", "indri, indris, Indri indri, Indri brevicaudatus", "Indian elephant, Elephas maximus", "African elephant, Loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens", "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch", "rock beauty, Holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, Lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, 
amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, Biro", "Band Aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM", "cassette", "cassette player", "castle", "catamaran", "CD player", "cello, 
violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, commode", "chime, bell, gong", "china cabinet, china closet", "Christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "Crock Pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "Dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "French horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol 
pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "iPod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, T-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "Loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", "miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "Model T", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", 
"motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, CRO", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "Petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "Polaroid camera, Polaroid Land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, RV, R.V.", "reel", "reflex camera", "refrigerator, icebox", "remote control, 
remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, CRT screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, U-boat", "suit, suit of clothes", "sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem 
pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "Windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, Virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "French loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "Granny Smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", 
"alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue"]
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/model.py:
--------------------------------------------------------------------------------
1 | import megengine as mge
2 | import megengine.functional as F
3 | import megengine.module as M
4 |
5 |
6 | class PatchEmbed(M.Module):
7 | def __init__(self, img_size=(224, 224), patch_size=(16, 16), in_chans=3, embed_dim=768):
8 | super().__init__()
9 | self.embed_dim = embed_dim
10 | self.img_size = img_size
11 | self.grid_size = (img_size[0] // patch_size[0], img_size[1] // patch_size[1])
12 | self.num_patches = self.grid_size[0] * self.grid_size[1]
13 | self.proj = M.Conv2d(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size)
14 |
15 | def forward(self, x):
16 | B, C, H, W = x.shape
17 | assert (H, W) == self.img_size
18 | x = self.proj(x)
19 | x = x.reshape(B, self.embed_dim, self.num_patches).transpose(0, 2, 1) # BCHW -> BNC
20 | return x
21 |
22 |
23 | class MLP(M.Module):
24 | def __init__(self, in_features, hidden_features=None, out_features=None, bias=True):
25 | super().__init__()
26 | out_features = out_features or in_features
27 | hidden_features = hidden_features or in_features
28 |
29 | self.fc1 = M.Linear(in_features, hidden_features, bias=bias)
30 | self.act = M.GELU()
31 | self.fc2 = M.Linear(hidden_features, out_features, bias=bias)
32 |
33 | def forward(self, x):
34 | x = self.fc1(x)
35 | x = self.act(x)
36 | x = self.fc2(x)
37 | return x
38 |
39 |
40 | class MixerBlock(M.Module):
41 | def __init__(self, dim, seq_len, mlp_ratio=(0.5, 4.0)):
42 | super().__init__()
43 | tokens_dim, channels_dim = [int(x * dim) for x in mlp_ratio]
44 | self.norm1 = M.LayerNorm(dim, eps=1e-6)
45 | self.mlp_tokens = MLP(seq_len, tokens_dim)
46 | self.norm2 = M.LayerNorm(dim, eps=1e-6)
47 | self.mlp_channels = MLP(dim, channels_dim)
48 |
49 | def forward(self, x):
50 | x = x + self.mlp_tokens(self.norm1(x).transpose(0, 2, 1)).transpose(0, 2, 1)
51 | x = x + self.mlp_channels(self.norm2(x))
52 | return x
53 |
54 |
55 | class MLPMixer(M.Module):
56 | def __init__(
57 | self,
58 | num_classes=1000,
59 | img_size=(224, 224),
60 | in_chans=3,
61 | patch_size=(16, 16),
62 | num_blocks=8,
63 | embed_dim=512,
64 | mlp_ratio=(0.5, 4.0),
65 | ):
66 | super().__init__()
67 | self.stem = PatchEmbed(img_size=img_size, patch_size=patch_size, in_chans=in_chans, embed_dim=embed_dim)
68 | self.blocks = M.Sequential(*[
69 | MixerBlock(embed_dim, self.stem.num_patches, mlp_ratio)
70 | for _ in range(num_blocks)
71 | ])
72 | self.norm = M.LayerNorm(embed_dim, eps=1e-6)
73 | self.head = M.Linear(embed_dim, num_classes) if num_classes > 0 else M.Identity()
74 |
75 | def forward_features(self, x):
76 | x = self.stem(x)
77 | x = self.blocks(x)
78 | x = self.norm(x)
79 | return x
80 |
81 | def forward(self, x):
82 | x = self.forward_features(x)
83 | x = x.mean(1)
84 | x = self.head(x)
85 | return x
86 |
87 | def load_from_torch(self, torch_state_dict):
88 | sd = {k: v.numpy() for k, v in torch_state_dict.items()}
89 | sd['stem.proj.bias'] = sd['stem.proj.bias'].reshape(1, -1, 1, 1)
90 | self.load_state_dict(sd)
91 |
92 |
93 | # Replace the URL below with the one obtained after uploading the weight file
94 | @mge.hub.pretrained('http://localhost:8000/mlp_mixer_b16_224.pkl')
95 | def mlp_mixer_b16_224():
96 | return MLPMixer(num_blocks=12, embed_dim=768)
97 |
--------------------------------------------------------------------------------
/【2022 翻倍争夺战提交示例】mlp-mixer/requirements.txt:
--------------------------------------------------------------------------------
1 | megengine
2 | torchvision
3 | timm
4 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/1.旷视天元MegEngine介绍和快速上手.md:
--------------------------------------------------------------------------------
1 | ## Topic
2 | Introduction to Megvii MegEngine and Quick Start
3 | ## Lecturer
4 | Yang Tao, AI Marketing Director at Megvii Technology
5 | Yang Tao graduated from Tsinghua University and joined Megvii in 2020 as AI Marketing Director, responsible for building the MegEngine ecosystem. Before joining Megvii he worked at Microsoft for fourteen years, promoting and evangelizing the latest technologies in universities, and has broad experience with mainstream industry technologies, cloud platforms, and artificial intelligence.
6 |
7 | ## Overview
8 | * Background, features, environment setup, and installation of MegEngine;
9 | * Introduction to and demonstration of the MegStudio online deep learning platform;
10 | * Basic concepts in deep learning and MegEngine programming: tensors, operators, backpropagation, and automatic differentiation;
11 | * How to use public projects and community resources to quickly start your deep learning journey.
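
The tensor / operator / backpropagation concepts listed above can be previewed without installing anything. Below is a minimal NumPy sketch (illustrative only, not MegEngine code) that backpropagates through a tiny model by hand and checks the result against a finite-difference estimate; this hand-written chain rule is exactly what automatic differentiation automates:

```python
import numpy as np

# A tiny "network": loss = mean((w * x + b - t)^2), a scalar loss.
rng = np.random.default_rng(0)
x = rng.normal(size=4)   # input tensor
t = rng.normal(size=4)   # target tensor
w, b = 0.5, 0.1          # parameters

# Forward pass (operators: mul, add, sub, square, mean)
y = w * x + b
loss = np.mean((y - t) ** 2)

# Backward pass: apply the chain rule by hand
dy = 2.0 * (y - t) / x.size   # d(loss)/d(y)
dw = np.sum(dy * x)           # d(loss)/d(w)
db = np.sum(dy)               # d(loss)/d(b)

# Sanity check dw against a finite-difference estimate
eps = 1e-6
loss_plus = np.mean((((w + eps) * x + b) - t) ** 2)
dw_fd = (loss_plus - loss) / eps
assert abs(dw - dw_fd) < 1e-4
```

In MegEngine, the same gradients would be produced automatically from the forward expression instead of being derived by hand.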
12 |
13 | ## Slides
14 |
15 | [1.旷视天元MegEngine介绍和快速上手.pptx](./slides/1.旷视天元MegEngine介绍和快速上手.pptx)
16 |
17 | ## Video
18 |
19 | https://www.bilibili.com/video/BV1Zf4y1Q7aZ?p=1
20 |
21 | ## Homework
22 |
23 | Fork a public project on MegStudio and run it.
24 |
25 | 1. Follow: https://github.com/MegEngine/MegEngine
26 | 2. Install MegEngine
27 | 3. Sign up for and try out MegStudio
28 | 4. Fork a public project on MegStudio and run it
29 |
30 | ### How to submit
31 |
32 | Send a screenshot of your successful MegStudio run and your personal information to: mgesupport@megvii.com
33 |
34 | **Email subject:** MegEngine beginner course, assignment 1
35 |
36 | **Email body**
37 |
38 | * Screenshot
39 | * Name:
40 | * School (company):
41 | * Phone:
42 | * Mailing address:
43 |
44 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/2.模型构建和训练入门.md:
--------------------------------------------------------------------------------
1 | ## Topic
2 | Getting Started with Model Building and Training
3 | ## Lecturer
4 | Liu Qingyi
5 | Liu Qingyi graduated from the School of Mathematical Sciences at Peking University and is a deep learning framework researcher at Megvii Research, responsible for MegEngine's distributed training and inter-GPU communication. Before joining Megvii he worked at Baidu on a large-scale distributed computing platform and has solid experience with distributed systems.
6 | ## Overview
7 | * How to load and preprocess data with Dataset, Sampler, and DataLoader.
8 | * A brief introduction to basic neural network operators; building a model layer by layer.
9 | * How to optimize network parameters with gradient descent.
10 | * MegEngine's data-parallel approach: training in parallel on multiple GPUs.
11 | * How to save a trained model to a file, and how to load it back.
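
The Dataset / Sampler / DataLoader pipeline in the first bullet can be sketched in a few lines of plain Python. The names below mirror the shape of the `megengine.data` interfaces but are simplified stand-ins, not the real API:

```python
import random

class Dataset:
    """Map-style dataset: indexable samples with a known length."""
    def __init__(self, samples):
        self.samples = samples
    def __getitem__(self, idx):
        return self.samples[idx]
    def __len__(self):
        return len(self.samples)

def random_sampler(dataset, batch_size, drop_last=True, seed=0):
    """Yield batches of indices in shuffled order."""
    indices = list(range(len(dataset)))
    random.Random(seed).shuffle(indices)
    for start in range(0, len(indices), batch_size):
        batch = indices[start:start + batch_size]
        if drop_last and len(batch) < batch_size:
            break
        yield batch

def data_loader(dataset, sampler, transform=lambda x: x):
    """Fetch each index batch and apply the transform to every sample."""
    for batch in sampler:
        yield [transform(dataset[i]) for i in batch]

ds = Dataset(list(range(10)))
batches = list(data_loader(ds, random_sampler(ds, batch_size=4),
                           transform=lambda x: x * 2))
# 10 samples with batch_size=4 and drop_last=True -> 2 full batches
assert len(batches) == 2 and all(len(b) == 4 for b in batches)
```

The real DataLoader adds worker processes, transforms such as `T.Normalize`, and batch collation on top of this pattern.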
12 |
13 | ## Slides
14 |
15 | [2.模型构建和训练入门课件.pdf](./slides/2.模型构建和训练入门课件.pdf)
16 | [2.模型构建和训练入门.ipynb](./notebooks/2.模型构建和训练入门.ipynb)
17 |
18 | ## Video
19 |
20 | https://www.bilibili.com/video/BV1Zf4y1Q7aZ?p=2
21 |
22 | ## Homework
23 |
24 | 1. Build the LeNet5 network with MegEngine
25 | 2. Train it on the MNIST dataset
26 | 3. Save the trained model
27 | 4. Compute the accuracy on the test set
28 |
29 | ### How to submit
30 |
31 | Send a screenshot of your successful MegStudio run and your personal information to: mgesupport@megvii.com
32 |
33 | **Email subject:** MegEngine beginner course, assignment 2
34 |
35 | **Email body**
36 |
37 | * Screenshot
38 | * Name:
39 | * School (company):
40 | * Phone:
41 | * Mailing address:
42 |
43 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/3.模型构建和训练进阶 I:分类问题.md:
--------------------------------------------------------------------------------
1 | ## Topic
2 | Model Building and Training, Advanced I: Classification
3 | ## Lecturer
4 | Zhou Yizhuang
5 | Zhou Yizhuang graduated from Tsinghua University and is a foundation model algorithm researcher at Megvii Research, responsible for MegEngine's model zoo, NAS algorithms, and the multi-machine multi-GPU training system, with extensive experience in model training.
6 | ## Overview
7 | * Background of the image classification task and the ImageNet dataset
8 | * Training a ResNet18 network on the CIFAR-10 dataset
9 | * Model evaluation, saving and loading, and learning rate adjustment
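
The learning-rate adjustment in the last bullet can be as simple as a linear decay. A minimal sketch of the schedule used in this lesson's notebook (base LR 0.01 decayed to 0 over 90 epochs):

```python
def linear_lr(epoch, base_lr=0.01, total_epochs=90):
    """Linearly decay the learning rate from base_lr down to 0."""
    return base_lr * (1 - epoch / total_epochs)

# In a training loop, the optimizer's param_groups would be updated per epoch:
#   for g in opt.param_groups:
#       g["lr"] = linear_lr(epoch)
assert linear_lr(0) == 0.01
assert abs(linear_lr(45) - 0.005) < 1e-12
assert linear_lr(90) == 0.0
```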
10 |
11 | ## Slides
12 |
13 | [3.模型构建和训练进阶 I:分类问题.pptx](./slides/3.模型构建和训练进阶%20I:分类问题.pptx)
14 | [3.模型构建和训练进阶 I:分类问题1.ipynb](./notebooks/3.模型构建和训练进阶%20I:分类问题1.ipynb)
15 | [3.模型构建和训练进阶 I:分类问题2.ipynb](./notebooks/3.模型构建和训练进阶%20I:分类问题2.ipynb)
16 | [3.模型构建和训练进阶 I:分类问题3.ipynb](./notebooks/3.模型构建和训练进阶%20I:分类问题3.ipynb)
17 | [3.模型构建和训练进阶 I:分类问题4.ipynb](./notebooks/3.模型构建和训练进阶%20I:分类问题4.ipynb)
18 |
19 |
20 | ## Video
21 |
22 | https://www.bilibili.com/video/BV1Zf4y1Q7aZ?p=3
23 |
24 | ## Homework
25 | **Train an MNIST classification network with MegEngine**
26 |
27 | * MNIST is a handwritten-digit dataset and can be found in megengine.data
28 | * The classifier can be a plain fully connected LeNet, a convolutional LeNet, or any other network
29 |
30 |
31 | ### How to submit
32 |
33 | Send a screenshot of your successful run and your personal information to: mgesupport@megvii.com
34 |
35 | **Email subject:** MegEngine beginner course, assignment 3
36 |
37 | **Email body**
38 |
39 | * Screenshot
40 | * Name:
41 | * School (company):
42 | * Phone:
43 | * Mailing address:
44 |
45 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/4.模型构建和训练进阶 II:物体检测.md:
--------------------------------------------------------------------------------
1 | ## Topic
2 | Model Building and Training, Advanced II: Object Detection
3 | ## Lecturer
4 | Wang Feng
5 | Wang Feng graduated from the Institute of Computing Technology, Chinese Academy of Sciences, and is an algorithm researcher in Megvii's detection group. He won first place in the ICDAR ArT text detection task, and in MegEngine he is responsible for reproducing and maintaining detection models and for designing and implementing the Data module.
6 | ## Overview
7 | In this lesson we will study the methods and workflow of generic object detection, using Faster-RCNN as an example to show how MegEngine implements a generic object detection pipeline, covering data preparation, model building, training and testing, as well as sublinear memory, a powerful technique for saving GPU memory.
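
One building block of that pipeline, non-maximum suppression (NMS), can be sketched in NumPy. This is a simplified single-class greedy version for illustration, not MegEngine's built-in operator:

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy single-class NMS: keep the highest-scoring box, drop the
    remaining boxes whose IoU with it exceeds iou_thresh, and repeat."""
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]  # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Intersection of the top box with all remaining boxes
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        order = order[1:][iou <= iou_thresh]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 10, 10], [20, 20, 30, 30]], dtype=np.float32)
scores = np.array([0.9, 0.8, 0.7], dtype=np.float32)
# The second box heavily overlaps the first (IoU = 0.81) and is suppressed
assert nms(boxes, scores) == [0, 2]
```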
8 |
9 | ## Materials
10 | [4.模型构建和训练进阶 II:物体检测.pdf](./slides/4.模型构建和训练进阶%20II:物体检测.pdf)
11 | [4.模型构建和训练进阶 II:物体检测(anchor).ipynb](./notebooks/4.模型构建和训练进阶%20II:物体检测(anchor).ipynb)
12 | [4.模型构建和训练进阶 II:物体检测(nms).ipynb](./notebooks/4.模型构建和训练进阶%20II:物体检测(nms).ipynb)
13 | [4.模型构建和训练进阶 II:物体检测(transform_pipeline).ipynb](./notebooks/4.模型构建和训练进阶%20II:物体检测(transform_pipeline).ipynb)
14 |
15 | ## Video
16 |
17 | https://www.bilibili.com/video/BV1Zf4y1Q7aZ?p=4
18 | ## Homework
19 |
20 | Learn how to use MegEngine's hub: upload an image and run inference on it with Faster-RCNN or RetinaNet from the hub.
21 |
22 | ### How to submit
23 |
24 | Send a screenshot of your successful run and your personal information to: mgesupport@megvii.com
25 |
26 | **Email subject:** MegEngine beginner course, assignment 4
27 |
28 | **Email body**
29 |
30 | * Screenshot
31 | * Name:
32 | * School (company):
33 | * Phone:
34 | * Mailing address:
35 |
36 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/5. Android 移动端模型推理部署.md:
--------------------------------------------------------------------------------
1 | ## Topic
2 | Model Inference Deployment on Android Mobile Devices
3 | ## Lecturer
4 | Zhao Kai
5 | Zhao Kai graduated from Sichuan University and is a senior engineer on Megvii's mobile business team. Before joining Megvii he worked at MediaTek on low-level Android development for mobile phones. His current work focuses on implementing, applying, and optimizing deep-learning-based video effect algorithms.
6 | ## Overview
7 | Using ShuffleNet V2 as an example, this lesson shows how to deploy a pretrained model to an Android device and run real-time inference with the camera.
8 | - Obtaining and converting the pretrained model;
9 | - The network's inputs and outputs;
10 | - Cross-compiling and deploying MegEngine;
11 | - Real-time inference on the Android camera preview.
12 |
13 | ## Materials
14 | Slides: [5. Android 移动端模型推理部署.pdf](./slides/5.%20Android%20移动端模型推理部署.pdf)
15 | Code: [5. Android 移动端模型推理部署.zip](./notebooks/5.%20Android%20移动端模型推理部署.zip)
16 |
17 | ## Video
18 | https://www.bilibili.com/video/BV1Zf4y1Q7aZ?p=5
19 | ## Homework
20 |
21 | Run real-time inference based on the Android camera using a pretrained ResNet model.
22 |
23 | - 1. Learn how to use ModelHub
24 | - 2. Learn the model preprocessing steps
25 | - 3. Learn how to quickly deploy to an Android device
26 |
27 | Requirements: after completing the lesson, provide the original test image, the standalone inference result on that image, and a screenshot of real-time camera-preview inference on Android.
28 |
29 | ### How to submit
30 |
31 | Send the original test image, the standalone inference result, the Android camera real-time preview screenshot, and your personal information to: mgesupport@megvii.com
32 |
33 | **Email subject:** MegEngine beginner course, assignment 5
34 |
35 | **Email body**
36 |
37 | * Screenshot
38 | * Name:
39 | * School (company):
40 | * Phone:
41 | * Mailing address:
42 |
43 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/6.部署进阶:推理端优化.md:
--------------------------------------------------------------------------------
1 | ## Topic
2 | Advanced Deployment: Inference-Side Optimization
3 | ## Lecturer
4 | Wang Peng
5 | Wang Peng graduated from Chengdu University of Technology and is a multi-camera algorithm researcher at Megvii, working at the intersection of multi-view 3D reconstruction and deep learning. He has contributed to the development and productization of real-time preview bokeh, optical zoom, ultra-wide-angle distortion correction, and monocular depth estimation.
6 | ## Overview
7 | Understand the basic principles of model quantization, and get an introduction to MegEngine's quantization scheme, the related modules, and how to use them.
8 | Using ShuffleNet V2 as an example, walk through the model quantization workflow in MegEngine.
9 | 1. Overview
10 | 2. Introduction to quantization methods
11 | 3. Worked example
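
The core idea behind the quantization methods above can be shown without MegEngine: map float32 values to uint8 with an affine scale and zero point, then map them back and check the error. A minimal NumPy sketch (illustrative only; the real modules live in `megengine.quantization`):

```python
import numpy as np

def quantize(x, num_bits=8):
    """Affine (asymmetric) quantization of a float array to uint8."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = int(round(qmin - x.min() / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.uint8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map uint8 values back to float32."""
    return scale * (q.astype(np.float32) - zero_point)

x = np.linspace(-1.0, 1.0, 11, dtype=np.float32)
q, scale, zp = quantize(x)
x_hat = dequantize(q, scale, zp)
# Round-trip error stays well within one quantization step
assert np.max(np.abs(x - x_hat)) <= scale
```

Quantization-aware training in MegEngine inserts "fake quantize" operations like this round trip into the network so the model learns to tolerate the rounding error before being converted to a true int8 model.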
12 |
13 | ## Materials
14 | Slides: [6.部署进阶:推理端优化.pdf](./slides/6.部署进阶:推理端优化.pdf)
15 | Code: [6.部署进阶:推理端优化.zip](./notebooks/6.部署进阶:推理端优化.zip)
16 |
17 | ## Video
18 | https://www.bilibili.com/video/BV1Zf4y1Q7aZ?p=6
19 |
20 | ## Homework
21 |
22 | - Understand the quantization principles in MegEngine
23 | - Use MegEngine to modify the Shufflenet_v2 model into a quantizable model
24 | - Use train.py, finetune.py, and inference.py to verify that the quantized model works
25 | - Dataset: https://studio.brainpp.com/data-set/1433
26 |
27 | ### How to submit
28 |
29 | Send the modified model .py file, the size of the quantized model, and your personal information to: mgesupport@megvii.com
30 |
31 | **Email subject:** MegEngine beginner course, assignment 6
32 |
33 | **Email body**
34 |
35 | * Screenshot
36 | * Name:
37 | * School (company):
38 | * Phone:
39 | * Mailing address:
40 |
41 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/3.模型构建和训练进阶 I:分类问题2.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import megengine\n",
10 | "import megengine.data as data\n",
11 | "import megengine.data.transform as T\n",
12 | "import megengine.module as M\n",
13 | "import megengine.functional as F\n",
14 | "import megengine.optimizer as optimizer\n",
15 | "import megengine.jit as jit"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 2,
21 | "metadata": {},
22 | "outputs": [],
23 | "source": [
24 | "class BasicBlock(M.Module):\n",
25 | " \"\"\"Each ResNet18 BasicBlock contains two convolution layers\"\"\"\n",
26 | " def __init__(self, in_channels, out_channels, stride=1):\n",
27 | " super(BasicBlock, self).__init__()\n",
28 | " # First conv layer, followed by BN and ReLU\n",
29 | " self.conv1 = M.ConvBnRelu2d(\n",
30 | " in_channels=in_channels, out_channels=out_channels,\n",
31 | " kernel_size=3, stride=stride, padding=1)\n",
32 | " # Second conv layer, followed by BN only\n",
33 | " self.conv2 = M.ConvBn2d(\n",
34 | " in_channels=out_channels, out_channels=out_channels,\n",
35 | " kernel_size=3, stride=1, padding=1)\n",
36 | " # Residual connection; when input/output channels differ or downsampling is needed, use ConvBn for the shortcut\n",
37 | " if in_channels == out_channels and stride == 1:\n",
38 | " self.res_conn = M.Identity()\n",
39 | " else:\n",
40 | " self.res_conn = M.ConvBn2d(\n",
41 | " in_channels=in_channels, out_channels=out_channels,\n",
42 | " kernel_size=1, stride=stride)\n",
43 | " \n",
44 | " def forward(self, x):\n",
45 | " identity = x\n",
46 | " x = self.conv1(x)\n",
47 | " x = self.conv2(x)\n",
48 | " x = x + self.res_conn(identity)\n",
49 | " return F.relu(x)"
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "execution_count": 3,
55 | "metadata": {},
56 | "outputs": [],
57 | "source": [
58 | "class ResNet18(M.Module):\n",
59 | " def __init__(self):\n",
   | " super().__init__()\n",
60 | " self.conv1 = M.ConvBnRelu2d(in_channels=3, out_channels=64,\n",
61 | " kernel_size=3, padding=1)\n",
62 | " # 8 BasicBlocks with 3 downsampling stages (stride=2), 8x2=16 conv layers in total\n",
63 | " self.blocks = M.Sequential(\n",
64 | " BasicBlock(64, 64),\n",
65 | " BasicBlock(64, 64),\n",
66 | " BasicBlock(64, 128, stride=2),\n",
67 | " BasicBlock(128, 128),\n",
68 | " BasicBlock(128, 256, stride=2),\n",
69 | " BasicBlock(256, 256),\n",
70 | " BasicBlock(256, 512, stride=2),\n",
71 | " BasicBlock(512, 512),\n",
72 | " )\n",
73 | " # Fully connected classifier outputting predictions over 10 classes\n",
74 | " self.classifier = M.Sequential(\n",
75 | " M.Dropout(0.2),\n",
76 | " M.Linear(512, 10)\n",
77 | " )\n",
78 | " \n",
79 | " def forward(self, x):\n",
80 | " # 1. Feature extraction: input is an Nx3x32x32 image, output is an Nx512x4x4 tensor\n",
81 | " x = self.conv1(x)\n",
82 | " x = self.blocks(x)\n",
83 | " # 2. 4x4 average pooling\n",
84 | " x = F.avg_pool2d(x, 4)\n",
85 | " x = F.flatten(x, 1)\n",
86 | " # 3. Classification prediction\n",
87 | " x = self.classifier(x)\n",
88 | " return x"
89 | ]
90 | },
91 | {
92 | "cell_type": "code",
93 | "execution_count": 8,
94 | "metadata": {},
95 | "outputs": [],
96 | "source": [
97 | "def training():\n",
98 | " # CIFAR10 dataset built into megengine\n",
99 | " dataset = data.dataset.CIFAR10(root=\"/data\", train=True)\n",
100 | " \n",
101 | " # Build the data pipeline\n",
102 | " dataloader = data.DataLoader(\n",
103 | " dataset,\n",
104 | " sampler=data.RandomSampler(dataset, batch_size=64, drop_last=True),\n",
105 | " transform=T.Compose([\n",
106 | " T.RandomHorizontalFlip(),\n",
107 | " T.Normalize(mean=0., std=255.), # f(x) = (x - mean) / std\n",
108 | " T.ToMode(\"CHW\"),\n",
109 | " ])\n",
110 | " )\n",
111 | " \n",
112 | " # Build the network and input tensors\n",
113 | " model = ResNet18()\n",
114 | " image = megengine.tensor(dtype=\"float32\")\n",
115 | " label = megengine.tensor(dtype=\"int32\")\n",
116 | " \n",
117 | " # Build the optimizer\n",
118 | " opt = optimizer.SGD(model.parameters(), lr=0.005, momentum=0.9, weight_decay=1e-4)\n",
119 | " \n",
120 | " # Trace a static computation graph for full performance\n",
121 | " @jit.trace\n",
122 | " def train_func(image, label):\n",
123 | " # Forward pass\n",
124 | " loglikelihood = model(image)\n",
125 | " loss = F.cross_entropy_with_softmax(loglikelihood, label)\n",
126 | " accuracy = F.accuracy(loglikelihood, label)\n",
127 | "\n",
128 | " # Backward pass and parameter update\n",
129 | " opt.zero_grad()\n",
130 | " opt.backward(loss)\n",
131 | " opt.step()\n",
132 | " return loss, accuracy\n",
133 | " \n",
134 | " for epoch in range(90):\n",
135 | " # One epoch == one full pass over the training set\n",
136 | " for param_group in opt.param_groups:\n",
137 | " param_group[\"lr\"] = 0.01 * (1 - epoch / 90)\n",
138 | " for i, batch_data in enumerate(dataloader):\n",
139 | " # One training iteration\n",
140 | " image.set_value(batch_data[0])\n",
141 | " label.set_value(batch_data[1])\n",
142 | " \n",
143 | " loss, acc1 = train_func(image, label)\n",
144 | "\n",
145 | " if i % 50 == 0:\n",
146 | " print(\"epoch\", epoch, \"step\", i, \"loss\", loss, \"acc@1\", acc1)\n",
147 | " \n",
148 | " megengine.save(model.state_dict(), \"checkpoint.pkl\")"
149 | ]
150 | },
151 | {
152 | "cell_type": "code",
153 | "execution_count": 9,
154 | "metadata": {},
155 | "outputs": [
156 | {
157 | "name": "stdout",
158 | "output_type": "stream",
159 | "text": [
160 | "epoch 0 step 0 loss Tensor([2.5812]) acc@1 Tensor([0.0625])\n",
161 | "epoch 0 step 50 loss Tensor([1.9665]) acc@1 Tensor([0.25])\n",
162 | "epoch 0 step 100 loss Tensor([1.9619]) acc@1 Tensor([0.25])\n",
163 | "epoch 0 step 150 loss Tensor([1.7975]) acc@1 Tensor([0.3281])\n",
164 | "epoch 0 step 200 loss Tensor([1.7861]) acc@1 Tensor([0.4375])\n",
165 | "epoch 0 step 250 loss Tensor([1.6038]) acc@1 Tensor([0.4062])\n",
166 | "epoch 0 step 300 loss Tensor([1.8104]) acc@1 Tensor([0.375])\n",
167 | "epoch 0 step 350 loss Tensor([1.8505]) acc@1 Tensor([0.4062])\n",
168 | "epoch 0 step 400 loss Tensor([1.3248]) acc@1 Tensor([0.5469])\n",
169 | "epoch 0 step 450 loss Tensor([1.4438]) acc@1 Tensor([0.5156])\n",
170 | "epoch 0 step 500 loss Tensor([1.2161]) acc@1 Tensor([0.5938])\n",
171 | "epoch 0 step 550 loss Tensor([1.1768]) acc@1 Tensor([0.5781])\n",
172 | "epoch 0 step 600 loss Tensor([1.1995]) acc@1 Tensor([0.5938])\n",
173 | "epoch 0 step 650 loss Tensor([1.08]) acc@1 Tensor([0.6406])\n",
174 | "epoch 0 step 700 loss Tensor([0.9324]) acc@1 Tensor([0.6562])\n",
175 | "epoch 0 step 750 loss Tensor([1.2374]) acc@1 Tensor([0.5781])\n",
176 | "epoch 1 step 0 loss Tensor([1.3166]) acc@1 Tensor([0.5312])\n",
177 | "epoch 1 step 50 loss Tensor([0.8961]) acc@1 Tensor([0.625])\n",
178 | "epoch 1 step 100 loss Tensor([0.998]) acc@1 Tensor([0.625])\n",
179 | "epoch 1 step 150 loss Tensor([1.0145]) acc@1 Tensor([0.625])\n",
180 | "epoch 1 step 200 loss Tensor([0.6282]) acc@1 Tensor([0.7969])\n",
181 | "epoch 1 step 250 loss Tensor([1.0428]) acc@1 Tensor([0.6562])\n",
182 | "epoch 1 step 300 loss Tensor([0.7556]) acc@1 Tensor([0.75])\n",
183 | "epoch 1 step 350 loss Tensor([0.983]) acc@1 Tensor([0.625])\n",
184 | "epoch 1 step 400 loss Tensor([0.8718]) acc@1 Tensor([0.6875])\n",
185 | "epoch 1 step 450 loss Tensor([0.7339]) acc@1 Tensor([0.6719])\n",
186 | "epoch 1 step 500 loss Tensor([0.8939]) acc@1 Tensor([0.7188])\n"
187 | ]
188 | },
189 | {
190 | "name": "stderr",
191 | "output_type": "stream",
192 | "text": [
193 | "\u001b[33m11 20:40:34[mgb] \u001b[0m\u001b[1;4;31mERR caught exception from python callback: python exception:\n",
194 | " Traceback (most recent call last):\n",
195 | " File \"/home/zhouyizhuang/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\", line 3171, in call\n",
196 | " self._func(value[0])\n",
197 | " File \"/home/zhouyizhuang/.local/lib/python3.6/site-packages/megengine/jit/__init__.py\", line 214, in callback\n",
198 | " dest.set_value(value, share=False)\n",
199 | " File \"/home/zhouyizhuang/.local/lib/python3.6/site-packages/megengine/core/tensor.py\", line 280, in set_value\n",
200 | " self.__val.set_value(value, sync=sync, inplace=inplace, share=share)\n",
201 | " File \"/home/zhouyizhuang/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\", line 2291, in set_value\n",
202 | " self._copy_from_value_proxy(w)\n",
203 | " File \"/home/zhouyizhuang/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\", line 2185, in _copy_from_value_proxy\n",
204 | " return _mgb.SharedND__copy_from_value_proxy(self, value)\n",
205 | " KeyboardInterrupt\u001b[0m\n",
206 | "\u001b[33m11 20:40:34[mgb] \u001b[0m\u001b[1;4;31mERR error occurred in computing sequence; synchronizing all comp nodes and releasing vars now ...\u001b[0m\n"
207 | ]
208 | },
209 | {
210 | "ename": "KeyboardInterrupt",
211 | "evalue": "",
212 | "output_type": "error",
213 | "traceback": [
214 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
215 | "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)",
216 | "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[0;32m----> 1\u001b[0;31m \u001b[0mtraining\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m",
217 | "\u001b[0;32m\u001b[0m in \u001b[0;36mtraining\u001b[0;34m()\u001b[0m\n\u001b[1;32m 43\u001b[0m \u001b[0mlabel\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mset_value\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mbatch_data\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 44\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 45\u001b[0;31m \u001b[0mloss\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0macc1\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtrain_func\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mimage\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlabel\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 46\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 47\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mi\u001b[0m \u001b[0;34m%\u001b[0m \u001b[0;36m50\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
218 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/jit/__init__.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, *args, **kwargs)\u001b[0m\n\u001b[1;32m 422\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_run_wrapped\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 423\u001b[0m \u001b[0;32melif\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_status\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_FINISHED\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 424\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_compiled_func\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 425\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 426\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_status\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_UNSTARTED\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
219 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, *args, **kwargs)\u001b[0m\n\u001b[1;32m 1220\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcallbefore_func\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1221\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_setup_args\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1222\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_execute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1223\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcallback_func\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1224\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mcallable\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcallback_func\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
220 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\u001b[0m in \u001b[0;36m_execute\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 1104\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1105\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_execute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m->\u001b[0m \u001b[0;34m\"void\"\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1106\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0m_mgb\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mAsyncExec__execute\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1107\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1108\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_wait\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m->\u001b[0m \u001b[0;34m\"void\"\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
221 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\u001b[0m in \u001b[0;36mcall\u001b[0;34m(self, value)\u001b[0m\n\u001b[1;32m 3169\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mcall\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3170\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msize\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m==\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 3171\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_func\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 3172\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3173\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_func\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
222 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/jit/__init__.py\u001b[0m in \u001b[0;36mcallback\u001b[0;34m(value, dest)\u001b[0m\n\u001b[1;32m 212\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 213\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mcallback\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdest\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0md\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 214\u001b[0;31m \u001b[0mdest\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mset_value\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mshare\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mFalse\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 215\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 216\u001b[0m \u001b[0ms\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_insert_barrier\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ms\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_symvar\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
223 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/core/tensor.py\u001b[0m in \u001b[0;36mset_value\u001b[0;34m(self, value, sync, inplace, share)\u001b[0m\n\u001b[1;32m 278\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0misinstance\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mTensor\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 279\u001b[0m \u001b[0mvalue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__val\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__sym\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0meager_val\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 280\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__val\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mset_value\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msync\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0msync\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minplace\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0minplace\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mshare\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mshare\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 281\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 282\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mfill\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
224 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\u001b[0m in \u001b[0;36mset_value\u001b[0;34m(self, w, sync, inplace, share)\u001b[0m\n\u001b[1;32m 2289\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_set_copy_sync\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msync\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 2290\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0misinstance\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mCompGraphCallbackValueProxy\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 2291\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_copy_from_value_proxy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mw\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 2292\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 2293\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
225 | "\u001b[0;32m~/.local/lib/python3.6/site-packages/megengine/_internal/mgb.py\u001b[0m in \u001b[0;36m_copy_from_value_proxy\u001b[0;34m(self, value)\u001b[0m\n\u001b[1;32m 2183\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 2184\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_copy_from_value_proxy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;34m'CompGraphCallbackValueProxy'\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m->\u001b[0m \u001b[0;34m\"void\"\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 2185\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0m_mgb\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mSharedND__copy_from_value_proxy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 2186\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 2187\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_share_from_value_proxy\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mvalue\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0;34m'CompGraphCallbackValueProxy'\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m->\u001b[0m \u001b[0;34m\"void\"\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
226 | "\u001b[0;31mKeyboardInterrupt\u001b[0m: "
227 | ]
228 | }
229 | ],
230 | "source": [
231 | "training()"
232 | ]
233 | },
234 | {
235 | "cell_type": "code",
236 | "execution_count": null,
237 | "metadata": {},
238 | "outputs": [],
239 | "source": []
240 | }
241 | ],
242 | "metadata": {
243 | "kernelspec": {
244 | "display_name": "1GPU",
245 | "language": "python",
246 | "name": "1gpu"
247 | },
248 | "language_info": {
249 | "codemirror_mode": {
250 | "name": "ipython",
251 | "version": 3
252 | },
253 | "file_extension": ".py",
254 | "mimetype": "text/x-python",
255 | "name": "python",
256 | "nbconvert_exporter": "python",
257 | "pygments_lexer": "ipython3",
258 | "version": "3.6.9"
259 | }
260 | },
261 | "nbformat": 4,
262 | "nbformat_minor": 4
263 | }
264 |
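A note on the interrupted single-GPU run above: the training loop never saves its weights, while the testing notebook further down restores them from a `checkpoint.pkl` via `megengine.load` and `model.load_state_dict`. The missing counterpart is a `megengine.save(model.state_dict(), ...)` call at the end of training. The round trip can be sketched framework-free with `pickle` standing in for `megengine.save`/`megengine.load` (the state-dict contents below are illustrative only):

```python
import os
import pickle
import tempfile

def save_checkpoint(state_dict, path):
    # stand-in for megengine.save(model.state_dict(), path)
    with open(path, "wb") as f:
        pickle.dump(state_dict, f)

def load_checkpoint(path):
    # stand-in for megengine.load(path), whose result feeds model.load_state_dict(...)
    with open(path, "rb") as f:
        return pickle.load(f)

# illustrative state dict; a real one maps parameter names to tensors
state = {"conv1.weight": [0.1, 0.2, 0.3], "classifier.1.bias": [0.0] * 10}
path = os.path.join(tempfile.mkdtemp(), "checkpoint.pkl")
save_checkpoint(state, path)
assert load_checkpoint(path) == state
```

With the real framework the shape is the same: serialize `model.state_dict()` once training finishes, deserialize it before evaluation.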
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/3.模型构建和训练进阶 I:分类问题3.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 2,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import megengine\n",
10 | "import megengine.data as data\n",
11 | "import megengine.data.transform as T\n",
12 | "import megengine.module as M\n",
13 | "import megengine.functional as F\n",
14 | "import megengine.optimizer as optimizer\n",
15 | "import megengine.jit as jit\n",
16 | "import megengine.distributed as dist\n",
17 | "import multiprocessing as mp\n",
18 | "\n",
19 | "F.sync_batch_norm\n",
20 | "M.SyncBatchNorm"
21 | ]
22 | },
23 | {
24 | "cell_type": "code",
25 | "execution_count": 3,
26 | "metadata": {},
27 | "outputs": [],
28 | "source": [
29 | "class BasicBlock(M.Module):\n",
 30 |     "    \"\"\"Each block of ResNet18 contains two convolution layers\"\"\"\n",
31 | " def __init__(self, in_channels, out_channels, stride=1):\n",
32 | " super(BasicBlock, self).__init__()\n",
 33 |     "        # First conv layer, followed by BN and ReLU\n",
34 | " self.conv1 = M.ConvBnRelu2d(\n",
35 | " in_channels=in_channels, out_channels=out_channels,\n",
36 | " kernel_size=3, stride=stride, padding=1)\n",
 37 |     "        # Second conv layer, followed by BN only\n",
38 | " self.conv2 = M.ConvBn2d(\n",
39 | " in_channels=out_channels, out_channels=out_channels,\n",
40 | " kernel_size=3, stride=1, padding=1)\n",
 41 |     "        # Residual connection; when input/output channels differ or downsampling is needed, use ConvBn for the shortcut transform\n",
42 | " if in_channels == out_channels and stride == 1:\n",
43 | " self.res_conn = M.Identity()\n",
44 | " else:\n",
45 | " self.res_conn = M.ConvBn2d(\n",
46 | " in_channels=in_channels, out_channels=out_channels,\n",
47 | " kernel_size=1, stride=stride)\n",
48 | " \n",
49 | " def forward(self, x):\n",
50 | " identity = x\n",
51 | " x = self.conv1(x)\n",
52 | " x = self.conv2(x)\n",
53 | " x = x + self.res_conn(identity)\n",
54 | " return F.relu(x)"
55 | ]
56 | },
57 | {
58 | "cell_type": "code",
59 | "execution_count": 4,
60 | "metadata": {},
61 | "outputs": [],
62 | "source": [
63 | "class ResNet18(M.Module):\n",
 64 |     "    def __init__(self):\n",
 64 |     "        super(ResNet18, self).__init__()\n",
65 | " self.conv1 = M.ConvBnRelu2d(in_channels=3, out_channels=64,\n",
66 | " kernel_size=3, padding=1)\n",
 67 |     "        # 8 BasicBlocks with 3 downsamplings (stride=2): 8x2=16 conv layers in total\n",
68 | " self.blocks = M.Sequential(\n",
69 | " BasicBlock(64, 64),\n",
70 | " BasicBlock(64, 64),\n",
71 | " BasicBlock(64, 128, stride=2),\n",
72 | " BasicBlock(128, 128),\n",
73 | " BasicBlock(128, 256, stride=2),\n",
74 | " BasicBlock(256, 256),\n",
75 | " BasicBlock(256, 512, stride=2),\n",
76 | " BasicBlock(512, 512),\n",
77 | " )\n",
 78 |     "        # Fully connected classifier that outputs predictions over 10 classes\n",
79 | " self.classifier = M.Sequential(\n",
80 | " M.Dropout(0.2),\n",
81 | " M.Linear(512, 10)\n",
82 | " )\n",
83 | " \n",
84 | " def forward(self, x):\n",
 85 |     "        # 1. Feature extraction: input is an Nx3x32x32 image, output an Nx512x4x4 tensor\n",
86 | " x = self.conv1(x)\n",
87 | " x = self.blocks(x)\n",
 88 |     "        # 2. 4x4 average pooling\n",
89 | " x = F.avg_pool2d(x, 4)\n",
90 | " x = F.flatten(x, 1)\n",
 91 |     "        # 3. Classification prediction\n",
92 | " x = self.classifier(x)\n",
93 | " return x"
94 | ]
95 | },
96 | {
97 | "cell_type": "code",
98 | "execution_count": 12,
99 | "metadata": {},
100 | "outputs": [],
101 | "source": [
102 | "def training():\n",
103 |     "    # MegEngine ships CIFAR10 as a built-in dataset\n",
104 | " dataset = data.dataset.CIFAR10(root=\"/data\", train=True)\n",
105 | " \n",
106 |     "    # Build the data pipeline\n",
107 | " dataloader = data.DataLoader(\n",
108 | " dataset,\n",
109 | " sampler=data.RandomSampler(dataset, batch_size=64, drop_last=True),\n",
110 | " transform=T.Compose([\n",
111 | " T.RandomHorizontalFlip(),\n",
112 | " T.Normalize(mean=0., std=255.), # f(x) = (x - mean) / std\n",
113 | " T.ToMode(\"CHW\"),\n",
114 | " ])\n",
115 | " )\n",
116 | " \n",
117 |     "    # Build the network and its inputs\n",
118 | " model = ResNet18()\n",
119 | " image = megengine.tensor(dtype=\"float32\")\n",
120 | " label = megengine.tensor(dtype=\"int32\")\n",
121 | " \n",
122 |     "    # Build the optimizer for the network\n",
123 | " opt = optimizer.SGD(model.parameters(), lr=0.005, momentum=0.9, weight_decay=1e-4)\n",
124 | " \n",
125 |     "    # Build a static computation graph to get full performance\n",
126 | " @jit.trace\n",
127 | " def train_func(image, label):\n",
128 |     "        # Forward pass\n",
129 | " loglikelihood = model(image)\n",
130 | " loss = F.cross_entropy_with_softmax(loglikelihood, label)\n",
131 | " accuracy = F.accuracy(loglikelihood, label)\n",
132 | "\n",
133 |     "        # Backward pass and parameter update\n",
134 | " opt.zero_grad()\n",
135 | " opt.backward(loss)\n",
136 | " opt.step()\n",
137 | " return loss, accuracy\n",
138 | " \n",
139 | " for epoch in range(90):\n",
140 |     "        # Training one epoch == one full pass over the training set\n",
141 | " for i, batch_data in enumerate(dataloader):\n",
142 |     "            # One training iteration\n",
143 | " image.set_value(batch_data[0])\n",
144 | " label.set_value(batch_data[1])\n",
145 | " \n",
146 | " loss, acc1 = train_func(image, label)\n",
147 | "\n",
148 | " if i % 50 == 0:\n",
149 | " print(\"epoch\", epoch, \"step\", i, \"loss\", loss, \"acc@1\", acc1)"
150 | ]
151 | },
152 | {
153 | "cell_type": "code",
154 | "execution_count": 15,
155 | "metadata": {},
156 | "outputs": [],
157 | "source": [
158 | "def worker(rank):\n",
159 |     "    # Each subprocess must initialize the distributed process group\n",
160 | " dist.init_process_group(\n",
161 |     "        master_ip=\"localhost\",  # IP of the master node; for single-machine multi-GPU, localhost is fine\n",
162 |     "        master_port=2233,  # free port in 0-65535 used for communication; must not be taken by another process\n",
163 |     "        world_size=8,  # total number of processes in the job (which also equals the number of GPUs)\n",
164 |     "        rank=rank,  # rank of the current process (i.e. which process this is)\n",
165 |     "        dev=rank,  # process i uses GPU i\n",
166 | " )\n",
167 | " print(\"init process\", rank)\n",
168 |     "    # Start training\n",
169 | " training()\n",
170 | "\n",
171 | "def main():\n",
172 |     "    # Launch 8 subprocesses on this machine (one per GPU, 8 GPUs in total)\n",
173 |     "    # See the official Python docs for how to use multiprocessing\n",
174 | " for rank in range(8):\n",
175 | " p = mp.Process(target=worker, args=(rank,))\n",
176 | " p.start()"
177 | ]
178 | },
179 | {
180 | "cell_type": "code",
181 | "execution_count": 16,
182 | "metadata": {},
183 | "outputs": [
184 | {
185 | "name": "stdout",
186 | "output_type": "stream",
187 | "text": [
188 | "init process 0\n",
189 | "init process 1\n",
190 | "init process 2\n",
191 | "init process 3\n",
192 | "init process 4\n",
193 | "init process 5\n",
194 | "init process 6\n",
195 | "init process 7\n",
196 | "epoch 0 step 0 loss Tensor([2.5735], device=gpu6:0) acc@1 Tensor([0.0781], device=gpu6:0)\n",
197 | "epoch 0 step 0 loss Tensor([2.4922], device=gpu1:0) acc@1 Tensor([0.0625], device=gpu1:0)\n",
198 | "epoch 0 step 0 loss Tensor([2.5264], device=gpu3:0) acc@1 Tensor([0.1406], device=gpu3:0)\n",
199 | "epoch 0 step 0 loss Tensor([2.7726], device=gpu5:0) acc@1 Tensor([0.], device=gpu5:0)\n",
200 | "epoch 0 step 0 loss Tensor([2.6011], device=gpu2:0) acc@1 Tensor([0.0938], device=gpu2:0)\n",
201 | "epoch 0 step 0 loss Tensor([2.5178], device=gpu7:0) acc@1 Tensor([0.0312], device=gpu7:0)\n",
202 | "epoch 0 step 0 loss Tensor([2.554], device=gpu4:0) acc@1 Tensor([0.1094], device=gpu4:0)\n",
203 | "epoch 0 step 0 loss Tensor([2.5615], device=gpu0:0) acc@1 Tensor([0.1406], device=gpu0:0)\n",
204 | "epoch 0 step 50 loss Tensor([1.7417], device=gpu6:0) acc@1 Tensor([0.4219], device=gpu6:0)\n",
205 | "epoch 0 step 50 loss Tensor([1.521], device=gpu5:0) acc@1 Tensor([0.5312], device=gpu5:0)\n",
206 | "epoch 0 step 50 loss Tensor([1.6334], device=gpu1:0) acc@1 Tensor([0.3281], device=gpu1:0)\n",
207 | "epoch 0 step 50 loss Tensor([1.6283], device=gpu4:0) acc@1 Tensor([0.375], device=gpu4:0)\n",
208 | "epoch 0 step 50 loss Tensor([1.5762], device=gpu2:0) acc@1 Tensor([0.375], device=gpu2:0)\n",
209 | "epoch 0 step 50 loss Tensor([1.5677], device=gpu0:0) acc@1 Tensor([0.4062], device=gpu0:0)\n",
210 | "epoch 0 step 50 loss Tensor([1.6616], device=gpu3:0) acc@1 Tensor([0.3594], device=gpu3:0)\n",
211 | "epoch 0 step 50 loss Tensor([1.6297], device=gpu7:0) acc@1 Tensor([0.4531], device=gpu7:0)\n",
212 | "epoch 1 step 0 loss Tensor([1.2938], device=gpu7:0) acc@1 Tensor([0.5], device=gpu7:0)\n",
213 | "epoch 1 step 0 loss Tensor([1.5492], device=gpu1:0) acc@1 Tensor([0.4219], device=gpu1:0)\n",
214 | "epoch 1 step 0 loss Tensor([1.3215], device=gpu0:0) acc@1 Tensor([0.4531], device=gpu0:0)\n",
215 | "epoch 1 step 0 loss Tensor([1.5797], device=gpu5:0) acc@1 Tensor([0.5], device=gpu5:0)\n",
216 | "epoch 1 step 0 loss Tensor([1.5253], device=gpu6:0) acc@1 Tensor([0.4219], device=gpu6:0)\n",
217 | "epoch 1 step 0 loss Tensor([1.5287], device=gpu4:0) acc@1 Tensor([0.375], device=gpu4:0)\n",
218 | "epoch 1 step 0 loss Tensor([1.4993], device=gpu3:0) acc@1 Tensor([0.4062], device=gpu3:0)\n",
219 | "epoch 1 step 0 loss Tensor([1.5306], device=gpu2:0) acc@1 Tensor([0.4219], device=gpu2:0)\n",
220 | "epoch 1 step 50 loss Tensor([1.0871], device=gpu6:0) acc@1 Tensor([0.5938], device=gpu6:0)\n",
221 | "epoch 1 step 50 loss Tensor([1.2019], device=gpu5:0) acc@1 Tensor([0.5625], device=gpu5:0)\n",
222 | "epoch 1 step 50 loss Tensor([1.449], device=gpu1:0) acc@1 Tensor([0.5], device=gpu1:0)\n",
223 | "epoch 1 step 50 loss Tensor([1.3241], device=gpu2:0) acc@1 Tensor([0.5625], device=gpu2:0)\n",
224 | "epoch 1 step 50 loss Tensor([1.3581], device=gpu7:0) acc@1 Tensor([0.4531], device=gpu7:0)\n",
225 | "epoch 1 step 50 loss Tensor([1.3815], device=gpu3:0) acc@1 Tensor([0.5], device=gpu3:0)\n",
226 | "epoch 1 step 50 loss Tensor([1.5198], device=gpu4:0) acc@1 Tensor([0.375], device=gpu4:0)\n",
227 | "epoch 1 step 50 loss Tensor([1.2055], device=gpu0:0) acc@1 Tensor([0.5156], device=gpu0:0)\n",
228 | "epoch 2 step 0 loss Tensor([1.313], device=gpu1:0) acc@1 Tensor([0.5156], device=gpu1:0)\n",
229 | "epoch 2 step 0 loss Tensor([1.0568], device=gpu6:0) acc@1 Tensor([0.6094], device=gpu6:0)\n",
230 | "epoch 2 step 0 loss Tensor([1.4929], device=gpu5:0) acc@1 Tensor([0.5625], device=gpu5:0)\n",
231 | "epoch 2 step 0 loss Tensor([1.1126], device=gpu7:0) acc@1 Tensor([0.5938], device=gpu7:0)\n",
232 | "epoch 2 step 0 loss Tensor([1.1693], device=gpu4:0) acc@1 Tensor([0.5781], device=gpu4:0)\n",
233 | "epoch 2 step 0 loss Tensor([1.1453], device=gpu2:0) acc@1 Tensor([0.5312], device=gpu2:0)\n",
234 | "epoch 2 step 0 loss Tensor([1.235], device=gpu0:0) acc@1 Tensor([0.4844], device=gpu0:0)\n",
235 | "epoch 2 step 0 loss Tensor([1.1699], device=gpu3:0) acc@1 Tensor([0.5156], device=gpu3:0)\n",
236 | "epoch 2 step 50 loss Tensor([1.3884], device=gpu6:0) acc@1 Tensor([0.5], device=gpu6:0)\n",
237 | "epoch 2 step 50 loss Tensor([0.9665], device=gpu7:0) acc@1 Tensor([0.5938], device=gpu7:0)\n",
238 | "epoch 2 step 50 loss Tensor([1.0038], device=gpu1:0) acc@1 Tensor([0.5938], device=gpu1:0)\n",
239 | "epoch 2 step 50 loss Tensor([1.1243], device=gpu0:0) acc@1 Tensor([0.5781], device=gpu0:0)\n",
240 | "epoch 2 step 50 loss Tensor([1.014], device=gpu5:0) acc@1 Tensor([0.7188], device=gpu5:0)\n",
241 | "epoch 2 step 50 loss Tensor([1.077], device=gpu4:0) acc@1 Tensor([0.5156], device=gpu4:0)\n",
242 | "epoch 2 step 50 loss Tensor([1.1881], device=gpu2:0) acc@1 Tensor([0.6562], device=gpu2:0)\n",
243 | "epoch 2 step 50 loss Tensor([1.0908], device=gpu3:0) acc@1 Tensor([0.5312], device=gpu3:0)\n",
244 | "epoch 3 step 0 loss Tensor([1.1245], device=gpu1:0) acc@1 Tensor([0.5625], device=gpu1:0)\n",
245 | "epoch 3 step 0 loss Tensor([1.0347], device=gpu5:0) acc@1 Tensor([0.6875], device=gpu5:0)\n",
246 | "epoch 3 step 0 loss Tensor([0.9063], device=gpu7:0) acc@1 Tensor([0.6875], device=gpu7:0)\n",
247 | "epoch 3 step 0 loss Tensor([1.046], device=gpu3:0) acc@1 Tensor([0.6562], device=gpu3:0)\n",
248 | "epoch 3 step 0 loss Tensor([0.961], device=gpu6:0) acc@1 Tensor([0.7031], device=gpu6:0)\n",
249 | "epoch 3 step 0 loss Tensor([0.7993], device=gpu0:0) acc@1 Tensor([0.7188], device=gpu0:0)\n",
250 | "epoch 3 step 0 loss Tensor([0.8532], device=gpu2:0) acc@1 Tensor([0.6406], device=gpu2:0)\n",
251 | "epoch 3 step 0 loss Tensor([1.0973], device=gpu4:0) acc@1 Tensor([0.6094], device=gpu4:0)\n",
252 | "epoch 3 step 50 loss Tensor([0.8748], device=gpu7:0) acc@1 Tensor([0.7344], device=gpu7:0)\n",
253 | "epoch 3 step 50 loss Tensor([0.9524], device=gpu6:0) acc@1 Tensor([0.6875], device=gpu6:0)\n",
254 | "epoch 3 step 50 loss Tensor([1.0182], device=gpu1:0) acc@1 Tensor([0.7031], device=gpu1:0)\n",
255 | "epoch 3 step 50 loss Tensor([0.9418], device=gpu4:0) acc@1 Tensor([0.7188], device=gpu4:0)\n",
256 | "epoch 3 step 50 loss Tensor([0.9911], device=gpu2:0) acc@1 Tensor([0.6719], device=gpu2:0)\n",
257 | "epoch 3 step 50 loss Tensor([1.0507], device=gpu5:0) acc@1 Tensor([0.6562], device=gpu5:0)\n",
258 | "epoch 3 step 50 loss Tensor([0.9237], device=gpu3:0) acc@1 Tensor([0.6562], device=gpu3:0)\n",
259 | "epoch 3 step 50 loss Tensor([0.722], device=gpu0:0) acc@1 Tensor([0.8281], device=gpu0:0)\n",
260 | "epoch 4 step 0 loss Tensor([0.8674], device=gpu7:0) acc@1 Tensor([0.7188], device=gpu7:0)\n",
261 | "epoch 4 step 0 loss Tensor([1.0817], device=gpu5:0) acc@1 Tensor([0.5938], device=gpu5:0)\n",
262 | "epoch 4 step 0 loss Tensor([1.0578], device=gpu1:0) acc@1 Tensor([0.6094], device=gpu1:0)\n",
263 | "epoch 4 step 0 loss Tensor([0.8786], device=gpu6:0) acc@1 Tensor([0.7031], device=gpu6:0)\n",
264 | "epoch 4 step 0 loss Tensor([1.2348], device=gpu2:0) acc@1 Tensor([0.5156], device=gpu2:0)\n",
265 | "epoch 4 step 0 loss Tensor([0.9356], device=gpu4:0) acc@1 Tensor([0.7188], device=gpu4:0)\n",
266 | "epoch 4 step 0 loss Tensor([0.933], device=gpu0:0) acc@1 Tensor([0.6562], device=gpu0:0)\n",
267 | "epoch 4 step 0 loss Tensor([0.7652], device=gpu3:0) acc@1 Tensor([0.7188], device=gpu3:0)\n",
268 | "epoch 4 step 50 loss Tensor([0.7796], device=gpu6:0) acc@1 Tensor([0.7344], device=gpu6:0)\n",
269 | "epoch 4 step 50 loss Tensor([0.8349], device=gpu5:0) acc@1 Tensor([0.7031], device=gpu5:0)\n",
270 | "epoch 4 step 50 loss Tensor([0.9181], device=gpu4:0) acc@1 Tensor([0.7344], device=gpu4:0)\n",
271 | "epoch 4 step 50 loss Tensor([0.9559], device=gpu3:0) acc@1 Tensor([0.5781], device=gpu3:0)\n",
272 | "epoch 4 step 50 loss Tensor([0.5912], device=gpu2:0) acc@1 Tensor([0.8281], device=gpu2:0)\n",
273 | "epoch 4 step 50 loss Tensor([0.9764], device=gpu0:0) acc@1 Tensor([0.6562], device=gpu0:0)\n",
274 | "epoch 4 step 50 loss Tensor([0.9125], device=gpu7:0) acc@1 Tensor([0.6875], device=gpu7:0)\n",
275 | "epoch 4 step 50 loss Tensor([0.7488], device=gpu1:0) acc@1 Tensor([0.7969], device=gpu1:0)\n",
276 | "epoch 5 step 0 loss Tensor([0.5704], device=gpu5:0) acc@1 Tensor([0.7812], device=gpu5:0)\n",
277 | "epoch 5 step 0 loss Tensor([0.9098], device=gpu7:0) acc@1 Tensor([0.6719], device=gpu7:0)\n",
278 | "epoch 5 step 0 loss Tensor([0.9084], device=gpu2:0) acc@1 Tensor([0.7188], device=gpu2:0)\n",
279 | "epoch 5 step 0 loss Tensor([0.8209], device=gpu3:0) acc@1 Tensor([0.7188], device=gpu3:0)\n",
280 | "epoch 5 step 0 loss Tensor([0.8147], device=gpu4:0) acc@1 Tensor([0.6719], device=gpu4:0)\n",
281 | "epoch 5 step 0 loss Tensor([0.8523], device=gpu6:0) acc@1 Tensor([0.7031], device=gpu6:0)\n",
282 | "epoch 5 step 0 loss Tensor([0.9633], device=gpu0:0) acc@1 Tensor([0.7031], device=gpu0:0)\n",
283 | "epoch 5 step 0 loss Tensor([1.0272], device=gpu1:0) acc@1 Tensor([0.6406], device=gpu1:0)\n"
284 | ]
285 | }
286 | ],
287 | "source": [
288 | "main()"
289 | ]
290 | },
291 | {
292 | "cell_type": "code",
293 | "execution_count": null,
294 | "metadata": {},
295 | "outputs": [],
296 | "source": []
297 | }
298 | ],
299 | "metadata": {
300 | "kernelspec": {
301 | "display_name": "8GPU",
302 |     "language": "python",
303 | "name": "8gpu"
304 | },
305 | "language_info": {
306 | "codemirror_mode": {
307 | "name": "ipython",
308 | "version": 3
309 | },
310 | "file_extension": ".py",
311 | "mimetype": "text/x-python",
312 | "name": "python",
313 | "nbconvert_exporter": "python",
314 | "pygments_lexer": "ipython3",
315 | "version": "3.6.9"
316 | }
317 | },
318 | "nbformat": 4,
319 | "nbformat_minor": 4
320 | }
321 |
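A quick sanity check on the 8-GPU logs above: each epoch only ever prints steps 0 and 50, whereas the single-GPU notebook reaches step 500 and beyond. The logs suggest the 50000 CIFAR10 training images are sharded evenly across the 8 ranks, so with batch_size=64 and drop_last=True each rank runs only 97 steps per epoch and never hits a third multiple of 50. A sketch of that arithmetic (assuming even sharding, which is an assumption about the sampler, not stated in the notebook):

```python
def steps_per_epoch(num_samples, world_size, batch_size, drop_last=True):
    # samples handled by one rank when the dataset is sharded evenly across ranks
    per_rank = num_samples // world_size
    if drop_last:
        return per_rank // batch_size      # incomplete final batch is dropped
    return -(-per_rank // batch_size)      # ceiling division keeps the partial batch

print(steps_per_epoch(50000, 8, 64))   # 97  -> only i=0 and i=50 satisfy i % 50 == 0
print(steps_per_epoch(50000, 1, 64))   # 781 -> the single-GPU run can print up to step 750
```

This matches both runs: 50000 // 8 // 64 = 97 steps per rank in the distributed notebook, 50000 // 64 = 781 steps in the single-GPU one.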
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/3.模型构建和训练进阶 I:分类问题4.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import megengine\n",
10 | "import megengine.data as data\n",
11 | "import megengine.data.transform as T\n",
12 | "import megengine.module as M\n",
13 | "import megengine.functional as F\n",
14 | "import megengine.optimizer as optimizer\n",
15 | "import megengine.jit as jit"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 3,
21 | "metadata": {},
22 | "outputs": [],
23 | "source": [
24 | "class BasicBlock(M.Module):\n",
 25 |     "    \"\"\"Each block of ResNet18 contains two convolution layers\"\"\"\n",
26 | " def __init__(self, in_channels, out_channels, stride=1):\n",
27 | " super(BasicBlock, self).__init__()\n",
 28 |     "        # First conv layer, followed by BN and ReLU\n",
29 | " self.conv1 = M.ConvBnRelu2d(\n",
30 | " in_channels=in_channels, out_channels=out_channels,\n",
31 | " kernel_size=3, stride=stride, padding=1)\n",
 32 |     "        # Second conv layer, followed by BN only\n",
33 | " self.conv2 = M.ConvBn2d(\n",
34 | " in_channels=out_channels, out_channels=out_channels,\n",
35 | " kernel_size=3, stride=1, padding=1)\n",
 36 |     "        # Residual connection; when input/output channels differ or downsampling is needed, use ConvBn for the shortcut transform\n",
37 | " if in_channels == out_channels and stride == 1:\n",
38 | " self.res_conn = M.Identity()\n",
39 | " else:\n",
40 | " self.res_conn = M.ConvBn2d(\n",
41 | " in_channels=in_channels, out_channels=out_channels,\n",
42 | " kernel_size=1, stride=stride)\n",
43 | " \n",
44 | " def forward(self, x):\n",
45 | " identity = x\n",
46 | " x = self.conv1(x)\n",
47 | " x = self.conv2(x)\n",
48 | " x = x + self.res_conn(identity)\n",
49 | " return F.relu(x)"
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "execution_count": 4,
55 | "metadata": {},
56 | "outputs": [],
57 | "source": [
58 | "class ResNet18(M.Module):\n",
 59 |     "    def __init__(self):\n",
 59 |     "        super(ResNet18, self).__init__()\n",
60 | " self.conv1 = M.ConvBnRelu2d(in_channels=3, out_channels=64,\n",
61 | " kernel_size=3, padding=1)\n",
 62 |     "        # 8 BasicBlocks with 3 downsamplings (stride=2): 8x2=16 conv layers in total\n",
63 | " self.blocks = M.Sequential(\n",
64 | " BasicBlock(64, 64),\n",
65 | " BasicBlock(64, 64),\n",
66 | " BasicBlock(64, 128, stride=2),\n",
67 | " BasicBlock(128, 128),\n",
68 | " BasicBlock(128, 256, stride=2),\n",
69 | " BasicBlock(256, 256),\n",
70 | " BasicBlock(256, 512, stride=2),\n",
71 | " BasicBlock(512, 512),\n",
72 | " )\n",
 73 |     "        # Fully connected classifier that outputs predictions over 10 classes\n",
74 | " self.classifier = M.Sequential(\n",
75 | " M.Dropout(0.2),\n",
76 | " M.Linear(512, 10)\n",
77 | " )\n",
78 | " \n",
79 | " def forward(self, x):\n",
 80 |     "        # 1. Feature extraction: input is an Nx3x32x32 image, output an Nx512x4x4 tensor\n",
81 | " x = self.conv1(x)\n",
82 | " x = self.blocks(x)\n",
 83 |     "        # 2. 4x4 average pooling\n",
84 | " x = F.avg_pool2d(x, 4)\n",
85 | " x = F.flatten(x, 1)\n",
 86 |     "        # 3. Classification prediction\n",
87 | " x = self.classifier(x)\n",
88 | " return x"
89 | ]
90 | },
91 | {
92 | "cell_type": "code",
93 | "execution_count": 22,
94 | "metadata": {},
95 | "outputs": [],
96 | "source": [
97 | "def testing():\n",
 98 |     "    # MegEngine's built-in CIFAR10 dataset (train=False selects the test split)\n",
99 | " dataset = data.dataset.CIFAR10(root=\"/data\", train=False)\n",
100 | " \n",
101 |     "    # Build the data pipeline\n",
102 | " dataloader = data.DataLoader(\n",
103 | " dataset,\n",
104 |     "        # Sequential instead of Random; drop_last=False so no image is skipped\n",
105 | " sampler=data.SequentialSampler(dataset, batch_size=64, drop_last=False),\n",
106 | " transform=T.Compose([\n",
107 | "            # T.RandomHorizontalFlip(),  no data augmentation at test time\n",
108 | " T.Normalize(mean=0., std=255.), # f(x) = (x - mean) / std\n",
109 | " T.ToMode(\"CHW\"),\n",
110 | " ])\n",
111 | " )\n",
112 | " \n",
113 | "    # Build the network and input placeholders\n",
114 | " model = ResNet18()\n",
115 | " image = megengine.tensor(dtype=\"float32\")\n",
116 | " label = megengine.tensor(dtype=\"int32\")\n",
117 | " \n",
118 | " state_dict = megengine.load(\"checkpoint.pkl\")\n",
119 | " model.load_state_dict(state_dict)\n",
120 | " \n",
121 | "    # Trace a static computation graph to get full performance\n",
122 | " @jit.trace\n",
123 | " def test_func(image, label):\n",
124 | "        # Forward pass\n",
125 | " loglikelihood = model(image)\n",
126 | " accuracy = F.accuracy(loglikelihood, label)\n",
127 | " return accuracy\n",
128 | " \n",
129 | "    # Iterate once over the test set\n",
130 | " correct = 0\n",
131 | " for i, batch_data in enumerate(dataloader):\n",
132 | " image.set_value(batch_data[0])\n",
133 | " label.set_value(batch_data[1])\n",
134 | "\n",
135 | " acc1 = test_func(image, label)\n",
136 | " correct += acc1.item() * label.shape[0]\n",
137 | " \n",
138 | " if i % 50 == 0:\n",
139 | " print(\"step\", i, \"acc@1\", acc1)\n",
140 | " print(\"Final accuracy =\", correct / 10000 * 100, \"%\")"
141 | ]
142 | },
143 | {
144 | "cell_type": "code",
145 | "execution_count": 23,
146 | "metadata": {},
147 | "outputs": [
148 | {
149 | "name": "stdout",
150 | "output_type": "stream",
151 | "text": [
152 | "step 0 acc@1 Tensor([0.5938])\n",
153 | "step 50 acc@1 Tensor([0.6719])\n",
154 | "step 100 acc@1 Tensor([0.5938])\n",
155 | "step 150 acc@1 Tensor([0.7188])\n",
156 | "Final accuracy = 63.21 %\n"
157 | ]
158 | }
159 | ],
160 | "source": [
161 | "testing()"
162 | ]
163 | },
164 | {
165 | "cell_type": "code",
166 | "execution_count": null,
167 | "metadata": {},
168 | "outputs": [],
169 | "source": []
170 | }
171 | ],
172 | "metadata": {
173 | "kernelspec": {
174 | "display_name": "1GPU",
175 | "language": "python",
176 | "name": "1gpu"
177 | },
178 | "language_info": {
179 | "codemirror_mode": {
180 | "name": "ipython",
181 | "version": 3
182 | },
183 | "file_extension": ".py",
184 | "mimetype": "text/x-python",
185 | "name": "python",
186 | "nbconvert_exporter": "python",
187 | "pygments_lexer": "ipython3",
188 | "version": "3.6.9"
189 | }
190 | },
191 | "nbformat": 4,
192 | "nbformat_minor": 4
193 | }
194 |
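The forward-pass comment in the ResNet18 notebook above maps Nx3x32x32 images to Nx512x4x4 features. That arithmetic can be checked in a few lines; `feature_map_size` below is a hypothetical helper, not part of the notebook, assuming only that each stride-2 BasicBlock halves the spatial resolution.

```python
def feature_map_size(input_hw, num_downsamples):
    """Spatial size after `num_downsamples` stride-2 stages (each halves H and W)."""
    h, w = input_hw
    return h // 2 ** num_downsamples, w // 2 ** num_downsamples

# CIFAR-10 images are 32x32 and the ResNet18 above has three stride-2
# BasicBlocks, so the feature extractor ends at 4x4, matching the
# Nx512x4x4 comment in forward().
h, w = feature_map_size((32, 32), 3)
assert (h, w) == (4, 4)
```

After the 4x4 average pool and flatten, each image yields exactly 512 features, which is why the classifier is `M.Linear(512, 10)`.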
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/4.模型构建和训练进阶 II:物体检测(anchor).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 44,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import math\n",
10 | "import numpy as np\n",
11 | "\n",
12 | "from megengine import Tensor, tensor\n",
13 | "import megengine.functional as F\n",
14 | "\n",
15 | "\n",
16 | "def meshgrid(x, y):\n",
17 | " assert len(x.shape) == 1\n",
18 | " assert len(y.shape) == 1\n",
19 | " mesh_shape = (y.shape[0], x.shape[0])\n",
20 | " mesh_x = x.broadcast(mesh_shape)\n",
21 | " mesh_y = y.reshape(-1, 1).broadcast(mesh_shape)\n",
22 | " return mesh_x, mesh_y\n",
23 | "\n",
24 | "\n",
25 | "def create_anchor_grid(featmap_size, offsets, stride):\n",
26 | " step_x, step_y = featmap_size\n",
27 | " shift = offsets * stride\n",
28 | "\n",
29 | " grid_x = F.arange(shift, step_x * stride + shift, step=stride)\n",
30 | " grid_y = F.arange(shift, step_y * stride + shift, step=stride)\n",
31 | " grids_x, grids_y = meshgrid(grid_y, grid_x)\n",
32 | " return grids_x.reshape(-1), grids_y.reshape(-1)\n",
33 | "\n",
34 | "\n",
35 | "class AnchorGenerator:\n",
36 | "\n",
37 | " def __init__(\n",
38 | " self,\n",
39 | " anchor_scales: list = [[32], [64], [128], [256], [512]],\n",
40 | " anchor_ratios: list = [[0.5, 1, 2]],\n",
41 | " strides: list = [4, 8, 16, 32, 64],\n",
42 | " offset: float = 0,\n",
43 | " ):\n",
44 | " super().__init__()\n",
45 | " self.anchor_scales = np.array(anchor_scales, dtype=np.float32)\n",
46 | " self.anchor_ratios = np.array(anchor_ratios, dtype=np.float32)\n",
47 | " self.strides = strides\n",
48 | " self.offset = offset\n",
49 | " self.num_features = len(strides)\n",
50 | "\n",
51 | " self.base_anchors = self._different_level_anchors(anchor_scales, anchor_ratios)\n",
52 | "\n",
53 | " def _different_level_anchors(self, scales, ratios):\n",
54 | " if len(scales) == 1:\n",
55 | " scales *= self.num_features\n",
56 | " assert len(scales) == self.num_features\n",
57 | "\n",
58 | " if len(ratios) == 1:\n",
59 | " ratios *= self.num_features\n",
60 | " assert len(ratios) == self.num_features\n",
61 | " return [\n",
62 | " tensor(self.generate_base_anchors(scale, ratio))\n",
64 | " for scale, ratio in zip(scales, ratios)\n",
65 | " ]\n",
66 | "\n",
67 | " def generate_base_anchors(self, scales, ratios):\n",
68 | " base_anchors = []\n",
69 | " areas = [s ** 2.0 for s in scales]\n",
70 | " for area in areas:\n",
71 | " for ratio in ratios:\n",
72 | " w = math.sqrt(area / ratio)\n",
73 | " h = ratio * w\n",
74 | " # center-based anchor\n",
75 | " x0, y0, x1, y1 = -w / 2.0, -h / 2.0, w / 2.0, h / 2.0\n",
76 | " base_anchors.append([x0, y0, x1, y1])\n",
77 | " return base_anchors\n",
78 | "\n",
79 | " def generate_anchors_by_features(self, sizes):\n",
80 | " all_anchors = []\n",
81 | " assert len(sizes) == self.num_features, (\n",
82 | " \"input features expected {}, got {}\".format(self.num_features, len(sizes))\n",
83 | " )\n",
84 | " for size, stride, base_anchor in zip(sizes, self.strides, self.base_anchors):\n",
85 | " grid_x, grid_y = create_anchor_grid(size, self.offset, stride)\n",
86 | " # FIXME: If F.stack works, change to stack\n",
87 | " grid_x, grid_y = grid_x.reshape(-1, 1), grid_y.reshape(-1, 1)\n",
88 | " grids = F.concat([grid_x, grid_y, grid_x, grid_y], axis=1)\n",
89 | " all_anchors.append(\n",
90 | " (grids.reshape(-1, 1, 4) + base_anchor.reshape(1, -1, 4)).reshape(-1, 4)\n",
91 | " )\n",
92 | " return all_anchors\n",
93 | " \n",
94 | " def __call__(self, featmaps):\n",
95 | " feat_sizes = [fmap.shape[-2:] for fmap in featmaps]\n",
96 | " return self.generate_anchors_by_features(feat_sizes)\n",
97 | "\n",
98 | " @property\n",
99 | " def anchor_dim(self):\n",
100 | " return 4"
101 | ]
102 | },
103 | {
104 | "cell_type": "code",
105 | "execution_count": 45,
106 | "metadata": {},
107 | "outputs": [
108 | {
109 | "name": "stdout",
110 | "output_type": "stream",
111 | "text": [
112 | "shape of feature map: (1, 3, 400, 480)\n",
113 | "shape of feature map: (1, 3, 200, 240)\n",
114 | "shape of feature map: (1, 3, 100, 120)\n",
115 | "shape of feature map: (1, 3, 50, 60)\n",
116 | "shape of feature map: (1, 3, 25, 30)\n"
117 | ]
118 | }
119 | ],
120 | "source": [
121 | "from megengine.random import gaussian\n",
122 | "shape = [1, 3, 25, 30]\n",
123 | "shape_list = reversed([shape[:2] + [s * 2**i for s in shape[2:]] for i in range(5)])\n",
124 | "feature_maps = [gaussian(shape) for shape in shape_list]\n",
125 | "for featmap in feature_maps:\n",
126 | " print(\"shape of feature map: {}\".format(featmap.shape))"
127 | ]
128 | },
129 | {
130 | "cell_type": "code",
131 | "execution_count": 46,
132 | "metadata": {},
133 | "outputs": [
134 | {
135 | "name": "stdout",
136 | "output_type": "stream",
137 | "text": [
138 | "anchor shape: (576000, 4)\n",
139 | "576000 = 400 * 480 * 3\n",
140 | "anchor shape: (144000, 4)\n",
141 | "144000 = 200 * 240 * 3\n",
142 | "anchor shape: (36000, 4)\n",
143 | "36000 = 100 * 120 * 3\n",
144 | "anchor shape: (9000, 4)\n",
145 | "9000 = 50 * 60 * 3\n",
146 | "anchor shape: (2250, 4)\n",
147 | "2250 = 25 * 30 * 3\n"
148 | ]
149 | }
150 | ],
151 | "source": [
152 | "anchor_generator = AnchorGenerator()\n",
153 | "anchors_list = anchor_generator(feature_maps)\n",
154 | "for anchors, fmap in zip(anchors_list, feature_maps):\n",
155 | " print(\"anchor shape: {}\".format(anchors.shape))\n",
156 | " print(\"{} = {} * {} * 3\".format(anchors.shape[0], *fmap.shape[2:]))"
157 | ]
158 | },
159 | {
160 | "cell_type": "code",
161 | "execution_count": 48,
162 | "metadata": {},
163 | "outputs": [
164 | {
165 | "name": "stdout",
166 | "output_type": "stream",
167 | "text": [
168 | "Tensor([1264. 1048. 1296. 1080.])\n",
169 | "Tensor([ 608. 1032. 672. 1096.])\n",
170 | "Tensor([1216. 992. 1344. 1120.])\n",
171 | "Tensor([ 512. 928. 768. 1184.])\n",
172 | "Tensor([1024. 768. 1536. 1280.])\n"
173 | ]
174 | }
175 | ],
176 | "source": [
177 | "for anchors in anchors_list:\n",
178 | " print(anchors[anchors.shape[0] * 2 // 3 + 1])"
179 | ]
180 | },
181 | {
182 | "cell_type": "code",
183 | "execution_count": null,
184 | "metadata": {},
185 | "outputs": [],
186 | "source": []
187 | }
188 | ],
189 | "metadata": {
190 | "kernelspec": {
191 | "display_name": "Python 3",
192 | "language": "python",
193 | "name": "python3"
194 | },
195 | "language_info": {
196 | "codemirror_mode": {
197 | "name": "ipython",
198 | "version": 3
199 | },
200 | "file_extension": ".py",
201 | "mimetype": "text/x-python",
202 | "name": "python",
203 | "nbconvert_exporter": "python",
204 | "pygments_lexer": "ipython3",
205 | "version": "3.6.8"
206 | }
207 | },
208 | "nbformat": 4,
209 | "nbformat_minor": 4
210 | }
211 |
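The base-anchor math in `generate_base_anchors` can be sanity-checked in isolation: with area = scale², setting w = sqrt(area / ratio) and h = ratio·w gives w·h = area and h/w = ratio. A minimal standalone sketch, with `base_anchor` as a hypothetical helper mirroring the method above:

```python
import math

def base_anchor(scale, ratio):
    """Center-based anchor corners for a given area (scale**2) and aspect ratio.

    Hypothetical standalone mirror of AnchorGenerator.generate_base_anchors:
    w = sqrt(area / ratio) and h = ratio * w, so w * h == area and h / w == ratio.
    """
    area = scale ** 2.0
    w = math.sqrt(area / ratio)
    h = ratio * w
    return -w / 2.0, -h / 2.0, w / 2.0, h / 2.0

x0, y0, x1, y1 = base_anchor(32, 0.5)
w, h = x1 - x0, y1 - y0
assert abs(w * h - 32 ** 2) < 1e-6  # the anchor keeps the requested area
assert abs(h / w - 0.5) < 1e-6      # and the requested aspect ratio
```

This is why each (scale, ratio) pair contributes one box per grid location: 3 ratios per level gives the `H * W * 3` anchor counts printed above.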
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/4.模型构建和训练进阶 II:物体检测(nms).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 2,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import numpy as np\n",
10 | "\n",
11 | "\n",
12 | "def cpu_nms(dets, thresh):\n",
13 | " x1 = np.ascontiguousarray(dets[:, 0])\n",
14 | " y1 = np.ascontiguousarray(dets[:, 1])\n",
15 | " x2 = np.ascontiguousarray(dets[:, 2])\n",
16 | " y2 = np.ascontiguousarray(dets[:, 3])\n",
17 | "\n",
18 | " areas = (x2 - x1) * (y2 - y1)\n",
19 | " order = dets[:, 4].argsort()[::-1]\n",
20 | " keep = list()\n",
21 | "\n",
22 | " while order.size > 0:\n",
23 | " pick_ind = order[0]\n",
24 | " keep.append(pick_ind)\n",
25 | "\n",
26 | " xx1 = np.maximum(x1[pick_ind], x1[order[1:]])\n",
27 | " yy1 = np.maximum(y1[pick_ind], y1[order[1:]])\n",
28 | " xx2 = np.minimum(x2[pick_ind], x2[order[1:]])\n",
29 | " yy2 = np.minimum(y2[pick_ind], y2[order[1:]])\n",
30 | "\n",
31 | " inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)\n",
32 | " iou = inter / (areas[pick_ind] + areas[order[1:]] - inter)\n",
33 | "\n",
34 | " order = order[np.where(iou <= thresh)[0] + 1]\n",
35 | "\n",
36 | " return keep"
37 | ]
38 | },
39 | {
40 | "cell_type": "code",
41 | "execution_count": 8,
42 | "metadata": {},
43 | "outputs": [
44 | {
45 | "name": "stdout",
46 | "output_type": "stream",
47 | "text": [
48 | "before nms:\n",
49 | " [[ 11.5 12. 311.4 410.6 0.85]\n",
50 | " [ 0.5 1. 300.4 400.5 0.97]\n",
51 | " [ 200.5 300. 700.4 1000.6 0.65]\n",
52 | " [ 250.5 310. 700.4 1000.6 0.72]]\n",
53 | "after nms:\n",
54 | " [[ 0.5 1. 300.4 400.5 0.97]\n",
55 | " [ 250.5 310. 700.4 1000.6 0.72]]\n"
56 | ]
57 | }
58 | ],
59 | "source": [
60 | "dets = np.array([\n",
61 | " [11.5, 12.0, 311.4, 410.6, 0.85],\n",
62 | " [0.5, 1.0, 300.4, 400.5, 0.97],\n",
63 | " [200.5, 300.0, 700.4, 1000.6, 0.65],\n",
64 | " [250.5, 310.0, 700.4, 1000.6, 0.72],\n",
65 | "])\n",
66 | "np.set_printoptions(suppress=True)\n",
67 | "print(\"before nms:\\n\", dets)\n",
68 | "keep = cpu_nms(dets, 0.5)\n",
69 | "print(\"after nms:\\n\", dets[keep])"
70 | ]
71 | },
72 | {
73 | "cell_type": "code",
74 | "execution_count": null,
75 | "metadata": {},
76 | "outputs": [],
77 | "source": []
78 | }
79 | ],
80 | "metadata": {
81 | "kernelspec": {
82 | "display_name": "Python 3",
83 | "language": "python",
84 | "name": "python3"
85 | },
86 | "language_info": {
87 | "codemirror_mode": {
88 | "name": "ipython",
89 | "version": 3
90 | },
91 | "file_extension": ".py",
92 | "mimetype": "text/x-python",
93 | "name": "python",
94 | "nbconvert_exporter": "python",
95 | "pygments_lexer": "ipython3",
96 | "version": "3.6.8"
97 | }
98 | },
99 | "nbformat": 4,
100 | "nbformat_minor": 4
101 | }
102 |
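The NMS output above can be cross-checked with a standalone IoU computation; `iou` below is a hypothetical helper repeating the pairwise step inside `cpu_nms`. Boxes 0/1 and boxes 2/3 each overlap with IoU above the 0.5 threshold, so only the higher-scored box of each pair survives.

```python
import numpy as np

def iou(a, b):
    """IoU of two (x1, y1, x2, y2) boxes; a hypothetical standalone mirror
    of the pairwise computation inside cpu_nms."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

dets = np.array([
    [11.5, 12.0, 311.4, 410.6, 0.85],
    [0.5, 1.0, 300.4, 400.5, 0.97],
    [200.5, 300.0, 700.4, 1000.6, 0.65],
    [250.5, 310.0, 700.4, 1000.6, 0.72],
])
# Boxes 0/1 and boxes 2/3 overlap with IoU > 0.5, so with thresh=0.5 only
# the higher-scored box of each pair survives -- matching the "after nms"
# output above (scores 0.97 and 0.72).
assert iou(dets[0, :4], dets[1, :4]) > 0.5
assert iou(dets[2, :4], dets[3, :4]) > 0.5
assert iou(dets[1, :4], dets[3, :4]) < 0.5
```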
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/5. Android 移动端模型推理部署.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/notebooks/5. Android 移动端模型推理部署.zip
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/6.部署进阶:推理端优化.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/notebooks/6.部署进阶:推理端优化.zip
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/notebooks/README.md:
--------------------------------------------------------------------------------
1 | Contains the notebook resources for the 零基础入门旷视天元MegEngine (MegEngine for beginners) course
2 |
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/slides/1.旷视天元MegEngine介绍和快速上手.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/slides/1.旷视天元MegEngine介绍和快速上手.pptx
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/slides/2.模型构建和训练入门课件.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/slides/2.模型构建和训练入门课件.pdf
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/slides/3.模型构建和训练进阶 I:分类问题.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/slides/3.模型构建和训练进阶 I:分类问题.pptx
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/slides/4.模型构建和训练进阶 II:物体检测.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/slides/4.模型构建和训练进阶 II:物体检测.pdf
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/slides/5. Android 移动端模型推理部署.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/slides/5. Android 移动端模型推理部署.pdf
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/slides/6.部署进阶:推理端优化.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/MegEngine/Resource/f0fa0c5d73811b8c8839e7d4eea53395f2a1bb28/零基础入门旷视天元MegEngine/slides/6.部署进阶:推理端优化.pdf
--------------------------------------------------------------------------------
/零基础入门旷视天元MegEngine/slides/README.md:
--------------------------------------------------------------------------------
1 | Contains the slide (PPT) resources for the 零基础入门旷视天元MegEngine (MegEngine for beginners) course
2 |
--------------------------------------------------------------------------------