├── README.md
├── ch02_sift.py
├── ch02_sift_match.py
├── ch2_harris_matching.py
└── images
    ├── 1
    ├── 01.jpg
    ├── 02.jpg
    ├── 05.jpg
    ├── 08.jpg
    ├── 09.jpg
    ├── 10.jpg
    ├── 11.jpg
    ├── 13.jpg
    ├── 16.jpg
    ├── Harris-02-04.jpg
    ├── Harris-10-11.jpg
    ├── Harris-y02-y03.jpg
    ├── jiaodian.png
    ├── jiaodian2.png
    ├── jmu.jpg
    ├── ju1.jpg
    ├── ju2.jpg
    ├── s01.jpg
    ├── s02.jpg
    ├── sift-04.jpg
    ├── siftMatch-02-04-A.jpg
    ├── siftMatch-02-04-B.jpg
    ├── siftMatch-10-11-A.jpg
    ├── siftMatch-10-11-B.jpg
    ├── siftMatch-y02-y03-A.jpg
    ├── siftMatch-y02-y03-B.jpg
    ├── t01.jpg
    ├── t02.jpg
    ├── y01.jpg
    ├── y02.jpg
    ├── y03.jpg
    ├── 屏幕快照 2019-03-17 下午6.40.21.png
    ├── 屏幕快照 2019-03-17 下午6.42.41.png
    └── 屏幕快照 2019-03-17 下午6.45.42.png
/README.md:
--------------------------------------------------------------------------------
# Local Image Feature Descriptors
An image is made up of a fixed number of pixels, and every image contains some distinctive pixels that we call its feature points. Image feature matching in computer vision is built on the feature points of the individual images. This article covers local image descriptors: it introduces two descriptor algorithms used for image matching, explains their principles, and works through practical examples, detecting and matching features between image pairs with both SIFT and Harris, and matching geotagged images with SIFT.
## I. Image Matching
Image matching finds the feature points in one image and matches them one-to-one against the feature points of another, thereby establishing the relation between two or more images. Two commonly used methods are SIFT and Harris; since SIFT (Scale-Invariant Feature Transform) has been one of the most successful local image descriptors of the past decade, it is treated in more depth below.
## II. Harris Corners
The Harris operator detects corner features. A corner is a point where shifting a local window in any direction produces a significant change in content, i.e. a point where the local image curvature changes abruptly. Typical corner detection algorithms include:
• Harris corner detection
• CSS corner detection
The figure below shows some "corners":
![image](https://github.com/Nocami/SIFT/blob/master/images/jiaodian.png)
### 1. How are Harris corners detected?
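As a concrete preview of the criterion derived just below, the corner/edge/flat distinction can be checked numerically. The sketch that follows is not part of the repository's code; the synthetic image and the helper name `harris_response` are illustrative assumptions, evaluating the response R = det(M) − k·tr(M)^2 on a single patch:

```python
import numpy as np

def harris_response(patch, k=0.04):
    """Corner response R = det(M) - k*tr(M)^2 for one image patch."""
    # image gradients via finite differences (np.gradient returns d/d_row, d/d_col)
    Iy, Ix = np.gradient(patch.astype(float))
    # entries of the 2x2 structure tensor M, summed over the window
    A = (Ix * Ix).sum()
    B = (Iy * Iy).sum()
    C = (Ix * Iy).sum()
    return A * B - C * C - k * (A + B) ** 2  # det(M) - k*tr(M)^2

# synthetic 20x20 image: bright square occupying the lower-right quadrant
img = np.zeros((20, 20))
img[10:, 10:] = 1.0

flat   = harris_response(img[2:7, 2:7])      # uniform region
edge   = harris_response(img[8:13, 12:17])   # straddles the horizontal edge
corner = harris_response(img[8:13, 8:13])    # straddles the square's corner
# corner > 0, edge < 0, flat == 0
```

Only the corner patch, where the window content changes in every shift direction, yields a positive response; the edge gives a negative value and the flat region zero.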
The earliest idea for corner detection is to take a small window around a pixel and move it in every direction, observing how the average gray value inside the window changes: a large change in all directions indicates a corner; no change in any direction indicates a flat region; a large change across only one direction indicates an edge. As the figure below shows, an image can thus be divided roughly into three kinds of regions ('flat', 'edge', 'corner'), which respond differently to window shifts.
![image](https://github.com/Nocami/SIFT/blob/master/images/%E5%B1%8F%E5%B9%95%E5%BF%AB%E7%85%A7%202019-03-17%20%E4%B8%8B%E5%8D%886.40.21.png)
Mathematically, shifting the image window by [u,v] produces the gray-level change E(u,v):
![image](https://github.com/Nocami/SIFT/blob/master/images/%E5%B1%8F%E5%B9%95%E5%BF%AB%E7%85%A7%202019-03-17%20%E4%B8%8B%E5%8D%886.42.41.png)
At each point x of the image domain we define the symmetric positive-semidefinite matrix:
![image](https://github.com/Nocami/SIFT/blob/master/images/%E5%B1%8F%E5%B9%95%E5%BF%AB%E7%85%A7%202019-03-17%20%E4%B8%8B%E5%8D%886.45.42.png)
M is a 2×2 matrix that approximates the Hessian of the autocorrelation function E(u,v).
Let λ1, λ2 be the eigenvalues of M, and define the corner response function R as:
R = λ1·λ2 − k(λ1 + λ2)^2, i.e.
R = det(M) − k·tr(M)^2, with
det(M) = λ1·λ2
tr(M) = λ1 + λ2

### 2. An image-matching example
Harris.py is as follows:
```python
# -*- coding: utf-8 -*-
from pylab import *
from PIL import Image

from PCV.localdescriptors import harris
from PCV.tools.imtools import imresize

"""
This is the Harris point matching example in Figure 2-2.
"""

# Figure 2-2, top pair
#im1 = array(Image.open("../data/crans_1_small.jpg").convert("L"))
#im2 = array(Image.open("../data/crans_2_small.jpg").convert("L"))

# Figure 2-2, bottom pair
im1 = array(Image.open("../data/sf_view1.jpg").convert("L"))
im2 = array(Image.open("../data/sf_view2.jpg").convert("L"))

# resize to make matching faster (// keeps the sizes integral under Python 3)
im1 = imresize(im1, (im1.shape[1] // 2, im1.shape[0] // 2))
im2 = imresize(im2, (im2.shape[1] // 2, im2.shape[0] // 2))

wid = 5
harrisim = harris.compute_harris_response(im1, 5)
filtered_coords1 = harris.get_harris_points(harrisim, wid + 1)
d1 = harris.get_descriptors(im1, filtered_coords1, wid)

harrisim = harris.compute_harris_response(im2, 5)
filtered_coords2 = harris.get_harris_points(harrisim, wid + 1)
d2 = harris.get_descriptors(im2, filtered_coords2, wid)

print('starting matching')
matches = harris.match_twosided(d1, d2)

figure()
gray()
harris.plot_matches(im1, im2, filtered_coords1, filtered_coords2, matches)
show()
```
Sample results:
![image](https://github.com/Nocami/SIFT/blob/master/images/Harris-02-04.jpg)
![image](https://github.com/Nocami/SIFT/blob/master/images/Harris-10-11.jpg)
![image](https://github.com/Nocami/SIFT/blob/master/images/Harris-y02-y03.jpg)
## III. SIFT (Scale-Invariant Feature Transform)
### 1. Introduction
SIFT, proposed by David Lowe, is one of the most successful local image descriptors of the past decade, and it has stood the test of time. SIFT comprises an interest-point detector and a descriptor; the descriptor is extremely robust, which is a large part of why the method became so successful and popular.
![image](http://i0.qhmsg.com/dr/200__/t01fe92d0a98cf7c342.jpg)
Photo: David Lowe
Problems SIFT addresses:

- [x] target rotation, scaling, and translation (RST)
- [x] affine/projective transformations (viewpoint change)
- [x] weak illumination changes (illumination)
- [x] partial occlusion of the target (occlusion)
- [x] cluttered scenes (clutter)
- [x] noise
### 2. Properties of the SIFT algorithm
1). SIFT features are local image features; they are invariant to rotation, scaling, and brightness changes, and remain fairly stable under viewpoint change, affine transformation, and noise;
2). Distinctiveness: the features are highly informative, well suited to fast, accurate matching against large feature databases;
3). Abundance: even a few objects produce large numbers of SIFT feature vectors;
4). Speed: an optimized SIFT matcher can even meet real-time requirements;
5). Extensibility: SIFT can easily be combined with other kinds of feature vectors.

### 3. The principle in brief
#### Scale space
Human vision can judge the relative size of an object whether it is near or far, but this ability is hard for a computer. In an uncalibrated scene, computer vision cannot determine an object's size; one approach is to supply the machine with images of the object at many different scales, so that it builds a unified notion of the object across scales. What matters in building that notion are the feature points that persist at every scale.
Figure: a multi-resolution image pyramid:
![image](http://www.opencv.org.cn/opencvdoc/2.3.2/html/_images/Pyramids_Tutorial_Pyramid_Theory.png)
Picture the pyramid as a stack of images: the higher the level, the smaller the image; the larger the scale, the blurrier the image.
#### DoG extrema detection
Difference of Gaussians. To find the extrema of scale space, each pixel is compared with all of its neighbors in both the image domain (the same scale) and the scale domain (the adjacent scales); if it is larger (or smaller) than all of them, it is an extremum. Computationally, DoG only requires subtracting adjacent Gaussian-smoothed images, which greatly simplifies the work.
#### Keypoint descriptor
To achieve rotation invariance, the SIFT descriptor assigns a reference orientation based on the direction and magnitude of the image gradients around each point. The reference orientation is the dominant orientation, measured with an orientation histogram weighted by gradient magnitude.

### 4. Steps of the SIFT algorithm
1). Scale-space extrema detection

Search over all scales and image locations, using a difference-of-Gaussian function to identify candidate interest points that are invariant to scale and rotation.

2). Keypoint localization

At each candidate location, fit a detailed model to determine location and scale; keypoints are selected according to their stability.

3). Orientation assignment

Assign one or more orientations to each keypoint based on local image gradient directions. All subsequent operations on the image data are performed relative to the keypoint's orientation, scale, and location, providing invariance to these transformations.

4).
Keypoint description

Within a neighborhood around each keypoint, measure the local image gradients at the selected scale, and transform them into a representation that tolerates significant local shape deformation and illumination change.

### 5. Detecting interest points
To compute the SIFT features of an image we use the open-source package VLFeat. Re-implementing the entire SIFT feature extraction in Python would not be very efficient, and is beyond the scope of this book. VLFeat can be downloaded from www.vlfeat.org, and its binaries are available for the major platforms. The library is written in C, but we can use its command-line interface. Example code:
```python
# -*- coding: utf-8 -*-
from PIL import Image
from pylab import *
from PCV.localdescriptors import sift
from PCV.localdescriptors import harris

# Chinese font support for the plot titles
from matplotlib.font_manager import FontProperties
font = FontProperties(fname=r"c:\windows\fonts\SimSun.ttc", size=14)

imname = '../data/empire.jpg'
im = array(Image.open(imname).convert('L'))
sift.process_image(imname, 'empire.sift')
l1, d1 = sift.read_features_from_file('empire.sift')

figure()
gray()
subplot(131)
sift.plot_features(im, l1, circle=False)
title(u'SIFT特征', fontproperties=font)
subplot(132)
sift.plot_features(im, l1, circle=True)
title(u'用圆圈表示SIFT特征尺度', fontproperties=font)

# detect Harris corners
harrisim = harris.compute_harris_response(im)

subplot(133)
filtered_coords = harris.get_harris_points(harrisim, 6, 0.1)
imshow(im)
# points come as (row, col); plot() expects x = col, y = row
plot([p[1] for p in filtered_coords], [p[0] for p in filtered_coords], '*')
axis('off')
title(u'Harris角点', fontproperties=font)

show()
```
The result:
![image](https://github.com/Nocami/SIFT/blob/master/images/sift-04.jpg)
To compare SIFT with Harris corners, the Harris detections are shown in the rightmost panel. As you can see, the two algorithms select different locations.
### 6. Matching descriptors
#### SIFT
Code:
```python
from PIL import Image
from pylab import *
import sys
from PCV.localdescriptors import sift


if len(sys.argv) >= 3:
    im1f, im2f = sys.argv[1], sys.argv[2]
else:
    # im1f = '../data/sf_view1.jpg'
    # im2f = '../data/sf_view2.jpg'
    im1f = '../data/crans_1_small.jpg'
    im2f = '../data/crans_2_small.jpg'
    # im1f = '../data/climbing_1_small.jpg'
    # im2f = '../data/climbing_2_small.jpg'
im1 = array(Image.open(im1f))
im2 = array(Image.open(im2f))

sift.process_image(im1f, 'out_sift_1.txt')
l1, d1 = sift.read_features_from_file('out_sift_1.txt')
figure()
gray()
subplot(121)
sift.plot_features(im1, l1, circle=False)

sift.process_image(im2f, 'out_sift_2.txt')
l2, d2 = sift.read_features_from_file('out_sift_2.txt')
subplot(122)
sift.plot_features(im2, l2, circle=False)

#matches = sift.match(d1, d2)
matches = sift.match_twosided(d1, d2)
print('{} matches'.format(len(matches.nonzero()[0])))

figure()
gray()
sift.plot_matches(im1, im2, l1, l2, matches, show_below=True)
show()
```

#### Harris
Code:
```python
# -*- coding: utf-8 -*-
from pylab import *
from PIL import Image

from PCV.localdescriptors import harris
from PCV.tools.imtools import imresize

"""
This is the Harris point matching example in Figure 2-2.
"""

# Figure 2-2, top pair
#im1 = array(Image.open("../data/crans_1_small.jpg").convert("L"))
#im2 = array(Image.open("../data/crans_2_small.jpg").convert("L"))

# Figure 2-2, bottom pair
im1 = array(Image.open("../data/sf_view1.jpg").convert("L"))
im2 = array(Image.open("../data/sf_view2.jpg").convert("L"))

# resize to speed up matching
im1 = imresize(im1, (im1.shape[1] // 2, im1.shape[0] // 2))
im2 = imresize(im2, (im2.shape[1] // 2, im2.shape[0] // 2))

wid = 5
harrisim = harris.compute_harris_response(im1, 5)
filtered_coords1 = harris.get_harris_points(harrisim, wid + 1)
d1 = harris.get_descriptors(im1, filtered_coords1, wid)

harrisim = harris.compute_harris_response(im2, 5)
filtered_coords2 = harris.get_harris_points(harrisim, wid + 1)
d2 = harris.get_descriptors(im2, filtered_coords2, wid)

print('starting matching')
matches = harris.match_twosided(d1, d2)

figure()
gray()
harris.plot_matches(im1, im2, filtered_coords1, filtered_coords2, matches)
show()
```
Side-by-side examples:
SIFT:
![image](https://github.com/Nocami/SIFT/blob/master/images/siftMatch-10-11-A.jpg)
![image](https://github.com/Nocami/SIFT/blob/master/images/siftMatch-10-11-B.jpg)
Harris:
![image](https://github.com/Nocami/SIFT/blob/master/images/Harris-10-11.jpg)
SIFT:
![image](https://github.com/Nocami/SIFT/blob/master/images/siftMatch-y02-y03-A.jpg)
![image](https://github.com/Nocami/SIFT/blob/master/images/siftMatch-y02-y03-B.jpg)
Harris:
![image](https://github.com/Nocami/SIFT/blob/master/images/Harris-y02-y03.jpg)

The comparison shows that the Harris results contain a number of incorrect matches: cross-correlation over raw image patches is far less descriptive than the SIFT descriptor, and Harris matching here also has to convert the images to grayscale first. SIFT is clearly the better of the two.
### 7. Matching geotagged images
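The core of this section's pipeline is simple: count pairwise SIFT matches between all images, then connect any two images whose match count exceeds a threshold. Before walking through the full script, here is a minimal, self-contained sketch of that edge-selection step (the `scores` matrix is hypothetical and the function name is my own; the pydot drawing itself is omitted):

```python
import numpy as np

def links_from_matchscores(matchscores, threshold=2):
    """Return index pairs (i, j), i < j, that should get an edge in the graph."""
    n = matchscores.shape[0]
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if matchscores[i, j] > threshold]

# hypothetical symmetric match counts for 4 images:
# images 0 and 1 show one scene, images 2 and 3 another
scores = np.array([[100,  35,   0,   1],
                   [ 35, 120,   2,   0],
                   [  0,   2,  90,  40],
                   [  1,   0,  40,  80]])

edges = links_from_matchscores(scores, threshold=2)
# → [(0, 1), (2, 3)]: one edge per shared scene
```

In the full script below, each pair in `edges` becomes a `pydot.Edge` between thumbnail nodes, so connected components of the graph correspond to distinct landmarks.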
Finally, we use local descriptors to match images of landmarks. Because the Panoramio service has shut down, we have to prepare pictures of our own; here I use photos of my undergraduate university, Jimei University. By eye we can easily pick out the four different scenes contained in the 15 photos.
Sun Yat-sen Memorial Hall (中山纪念堂):

![image](https://github.com/Nocami/SIFT/blob/master/images/02.jpg)
Jiageng Library (嘉庚图书馆):

![image](https://github.com/Nocami/SIFT/blob/master/images/y02.jpg)
Shangda Building (尚大楼):

![image](https://github.com/Nocami/SIFT/blob/master/images/s02.jpg)
Yankui Library (延奎图书馆):

![image](https://github.com/Nocami/SIFT/blob/master/images/t01.jpg)

#### Now let the computer do it:

We first define connections between images by whether they share matching local descriptors, then visualize those connections by drawing the images as the nodes of a graph whose edges represent links. For this we need the pydot package, which provides a Python interface to the GraphViz graphing library. Installation (Windows, Python 2.7):
1. Install graphviz-2.28.0.msi
Download: http://download.csdn.net/detail/shouwangzhelv/9492517
Add C:\Program Files (x86)\Graphviz 2.28\bin to the PATH environment variable.
2. Install pyparsing-1.5.7.win32-py2.7.exe
https://pypi.python.org/pypi/pyparsing/1.5.7
3. Install pydot
Download: https://pypi.python.org/pypi/pydot2/1.0.33
1) unzip the package
2) in cmd, change into the package directory, i.e. the path that contains setup.py
3) run python setup.py install

The code follows (point the paths at your own images; the file naming does not matter):
```python
# -*- coding: utf-8 -*-
from pylab import *
from PIL import Image
from PCV.localdescriptors import sift
from PCV.tools import imtools
import pydot

""" This is the example graph illustration of matching images from Figure 2-10.
To download the images, see ch2_download_panoramio.py."""

#download_path = "panoimages"  # set this to the path where you downloaded the panoramio images
#path = "/FULLPATH/panoimages/"  # path to save thumbnails (pydot needs the full system path)

download_path = "F:/dropbox/Dropbox/translation/pcv-notebook/data/panoimages"  # set this to the path where you downloaded the panoramio images
path = "F:/dropbox/Dropbox/translation/pcv-notebook/data/panoimages/"  # path to save thumbnails (pydot needs the full system path)

# list of downloaded filenames
imlist = imtools.get_imlist(download_path)
nbr_images = len(imlist)

# extract features
featlist = [imname[:-3] + 'sift' for imname in imlist]
for i, imname in enumerate(imlist):
    sift.process_image(imname, featlist[i])

matchscores = zeros((nbr_images, nbr_images))

for i in range(nbr_images):
    for j in range(i, nbr_images):  # only compute upper triangle
        print('comparing ', imlist[i], imlist[j])
        l1, d1 = sift.read_features_from_file(featlist[i])
        l2, d2 = sift.read_features_from_file(featlist[j])
        matches = sift.match_twosided(d1, d2)
        nbr_matches = sum(matches > 0)
        print('number of matches = ', nbr_matches)
        matchscores[i, j] = nbr_matches
print("The match scores are:\n", matchscores)

# copy values to the lower triangle
for i in range(nbr_images):
    for j in range(i + 1, nbr_images):  # no need to copy the diagonal
        matchscores[j, i] = matchscores[i, j]

# visualization
threshold = 2  # min number of matches needed to create a link

g = pydot.Dot(graph_type='graph')  # don't want the default directed graph

for i in range(nbr_images):
    for j in range(i + 1, nbr_images):
        if matchscores[i, j] > threshold:
            # first image in pair
            im = Image.open(imlist[i])
            im.thumbnail((100, 100))
            filename = path + str(i) + '.png'
            im.save(filename)  # need temporary files of the right size
            g.add_node(pydot.Node(str(i), fontcolor='transparent', shape='rectangle', image=filename))

            # second image in pair
            im = Image.open(imlist[j])
            im.thumbnail((100, 100))
            filename = path + str(j) + '.png'
            im.save(filename)  # need temporary files of the right size
            g.add_node(pydot.Node(str(j), fontcolor='transparent', shape='rectangle', image=filename))

            g.add_edge(pydot.Edge(str(i), str(j)))
g.write_png('jmu.png')
```
The number of matching features between each pair of images is stored in an array:
![image](https://github.com/Nocami/SIFT/blob/master/images/ju1.jpg)
![image](https://github.com/Nocami/SIFT/blob/master/images/ju2.jpg)
![image](https://github.com/Nocami/SIFT/blob/master/images/jmu.jpg)
As you can see, the computer has grouped the photos by landmark.

--------------------------------------------------------------------------------
/ch02_sift.py:
--------------------------------------------------------------------------------
# -*- coding: utf-8 -*-
from PIL import Image
from pylab import *
from PCV.localdescriptors import sift
from PCV.localdescriptors import harris

# Chinese font support for the plot titles
from matplotlib.font_manager import FontProperties
font = FontProperties(fname=r"c:\windows\fonts\SimSun.ttc", size=14)

imname = r'e:\Study\pythonxyProject\sift\images\s01.jpg'
im = array(Image.open(imname).convert('L'))
sift.process_image(imname, 'empire.sift')
l1, d1 = sift.read_features_from_file('empire.sift')

figure()
gray()
subplot(131)
sift.plot_features(im, l1, circle=False)
title(u'SIFT特征', fontproperties=font)
subplot(132)
sift.plot_features(im, l1, circle=True)
title(u'用圆圈表示SIFT特征尺度', fontproperties=font)

# detect Harris corners
harrisim = harris.compute_harris_response(im)

subplot(133)
filtered_coords = harris.get_harris_points(harrisim, 6, 0.1)
imshow(im)
plot([p[1] for p in filtered_coords], [p[0] for p in filtered_coords], '*')
axis('off')
title(u'Harris角点', fontproperties=font)

show()
--------------------------------------------------------------------------------
/ch02_sift_match.py:
--------------------------------------------------------------------------------
from PIL import Image
from pylab import *
import sys
from PCV.localdescriptors import sift


if len(sys.argv) >= 3:
    im1f, im2f = sys.argv[1], sys.argv[2]
else:
    im1f = r'e:\Study\pythonxyProject\sift\images/02.jpg'
    im2f = r'e:\Study\pythonxyProject\sift\images/04.jpg'
    # im1f = '../data/crans_1_small.jpg'
    # im2f = '../data/crans_2_small.jpg'
    # im1f = '../data/climbing_1_small.jpg'
    # im2f = '../data/climbing_2_small.jpg'
im1 = array(Image.open(im1f))
im2 = array(Image.open(im2f))

sift.process_image(im1f, 'out_sift_1.txt')
l1, d1 = sift.read_features_from_file('out_sift_1.txt')
figure()
gray()
subplot(121)
sift.plot_features(im1, l1, circle=False)

sift.process_image(im2f, 'out_sift_2.txt')
l2, d2 = sift.read_features_from_file('out_sift_2.txt')
subplot(122)
sift.plot_features(im2, l2, circle=False)

#matches = sift.match(d1, d2)  # one-directional matching (superseded below)
matches = sift.match_twosided(d1, d2)
print('{} matches'.format(len(matches.nonzero()[0])))

figure()
gray()
sift.plot_matches(im1, im2, l1, l2, matches, show_below=True)
show()
--------------------------------------------------------------------------------
/ch2_harris_matching.py:
--------------------------------------------------------------------------------
# -*- coding: utf-8 -*-
from pylab import *
from PIL import Image

from PCV.localdescriptors import harris
from PCV.tools.imtools import imresize

"""
This is the Harris point matching example in Figure 2-2.
"""

# Figure 2-2, top pair
#im1 = array(Image.open("../data/crans_1_small.jpg").convert("L"))
#im2 = array(Image.open("../data/crans_2_small.jpg").convert("L"))

# Figure 2-2, bottom pair
im1 = array(Image.open("../data/sf_view1.jpg").convert("L"))
im2 = array(Image.open("../data/sf_view2.jpg").convert("L"))

# resize to make matching faster
im1 = imresize(im1, (im1.shape[1] // 2, im1.shape[0] // 2))
im2 = imresize(im2, (im2.shape[1] // 2, im2.shape[0] // 2))

wid = 5
harrisim = harris.compute_harris_response(im1, 5)
filtered_coords1 = harris.get_harris_points(harrisim, wid + 1)
d1 = harris.get_descriptors(im1, filtered_coords1, wid)

harrisim = harris.compute_harris_response(im2, 5)
filtered_coords2 = harris.get_harris_points(harrisim, wid + 1)
d2 = harris.get_descriptors(im2, filtered_coords2, wid)

print('starting matching')
matches = harris.match_twosided(d1, d2)

figure()
gray()
harris.plot_matches(im1, im2, filtered_coords1, filtered_coords2, matches)
show()
--------------------------------------------------------------------------------
/images/01.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/01.jpg
--------------------------------------------------------------------------------
/images/02.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/02.jpg
--------------------------------------------------------------------------------
/images/05.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/05.jpg
--------------------------------------------------------------------------------
/images/08.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/08.jpg
--------------------------------------------------------------------------------
/images/09.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/09.jpg
--------------------------------------------------------------------------------
/images/1:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------
/images/10.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/10.jpg
--------------------------------------------------------------------------------
/images/11.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/11.jpg
--------------------------------------------------------------------------------
/images/13.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/13.jpg
--------------------------------------------------------------------------------
/images/16.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/16.jpg
--------------------------------------------------------------------------------
/images/Harris-02-04.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/Harris-02-04.jpg
--------------------------------------------------------------------------------
/images/Harris-10-11.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/Harris-10-11.jpg
--------------------------------------------------------------------------------
/images/Harris-y02-y03.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/Harris-y02-y03.jpg
--------------------------------------------------------------------------------
/images/jiaodian.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/jiaodian.png
--------------------------------------------------------------------------------
/images/jiaodian2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/jiaodian2.png
--------------------------------------------------------------------------------
/images/jmu.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/jmu.jpg
--------------------------------------------------------------------------------
/images/ju1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/ju1.jpg
--------------------------------------------------------------------------------
/images/ju2.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/ju2.jpg
--------------------------------------------------------------------------------
/images/s01.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/s01.jpg
--------------------------------------------------------------------------------
/images/s02.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/s02.jpg
--------------------------------------------------------------------------------
/images/sift-04.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/sift-04.jpg
--------------------------------------------------------------------------------
/images/siftMatch-02-04-A.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/siftMatch-02-04-A.jpg
--------------------------------------------------------------------------------
/images/siftMatch-02-04-B.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/siftMatch-02-04-B.jpg
--------------------------------------------------------------------------------
/images/siftMatch-10-11-A.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/siftMatch-10-11-A.jpg
--------------------------------------------------------------------------------
/images/siftMatch-10-11-B.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/siftMatch-10-11-B.jpg
--------------------------------------------------------------------------------
/images/siftMatch-y02-y03-A.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/siftMatch-y02-y03-A.jpg
--------------------------------------------------------------------------------
/images/siftMatch-y02-y03-B.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/siftMatch-y02-y03-B.jpg
--------------------------------------------------------------------------------
/images/t01.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/t01.jpg
--------------------------------------------------------------------------------
/images/t02.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/t02.jpg
--------------------------------------------------------------------------------
/images/y01.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/y01.jpg
--------------------------------------------------------------------------------
/images/y02.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/y02.jpg
--------------------------------------------------------------------------------
/images/y03.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/y03.jpg
--------------------------------------------------------------------------------
/images/屏幕快照 2019-03-17 下午6.40.21.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/屏幕快照 2019-03-17 下午6.40.21.png
--------------------------------------------------------------------------------
/images/屏幕快照 2019-03-17 下午6.42.41.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/屏幕快照 2019-03-17 下午6.42.41.png
--------------------------------------------------------------------------------
/images/屏幕快照 2019-03-17 下午6.45.42.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Nocami/PythonComputerVision-2-SIFT/06c87d9bea8b8bf610f2bd7a4dd1a1285f3a1403/images/屏幕快照 2019-03-17 下午6.45.42.png
--------------------------------------------------------------------------------