I'm trying to update my code to use cv2.SURF()
instead of cv2.FeatureDetector_create("SURF")
and cv2.DescriptorExtractor_create("SURF").
However, after detecting the keypoints I can't obtain the descriptors. What is the correct
way to call SURF.detect?
I tried following the OpenCV documentation, but I'm a bit confused. This is what the documentation says:
Python: cv2.SURF.detect(img, mask) → keypoints
Python: cv2.SURF.detect(img, mask[, descriptors[, useProvidedKeypoints]]) → keypoints, descriptors
How do I pass the keypoints in when making the second call to SURF.detect?
Best Answer
I'm not sure I've understood your question correctly. But if you are looking for a sample that matches SURF keypoints, here is a very simple and basic one, similar to template matching:
import cv2
import numpy as np

# Load the image
img = cv2.imread('messi4.jpg')

# Convert it to grayscale
imgg = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# SURF extraction (old OpenCV 2.4 API: detect() returns both
# keypoints and descriptors when useProvidedKeypoints=False)
surf = cv2.SURF()
kp, descriptors = surf.detect(imgg, None, useProvidedKeypoints=False)

# Set up samples and responses for kNN: each descriptor's "response"
# is simply its keypoint index in the original image
samples = np.array(descriptors)
responses = np.arange(len(kp), dtype=np.float32)

# kNN training
knn = cv2.KNearest()
knn.train(samples, responses)

# Now load a template image and search for similar keypoints
template = cv2.imread('template.jpg')
templateg = cv2.cvtColor(template, cv2.COLOR_BGR2GRAY)
keys, desc = surf.detect(templateg, None, useProvidedKeypoints=False)

for h, des in enumerate(desc):
    des = np.array(des, np.float32).reshape((1, 128))
    retval, results, neigh_resp, dists = knn.find_nearest(des, 1)
    res, dist = int(results[0][0]), dists[0][0]

    if dist < 0.1:  # draw matched keypoints in red
        color = (0, 0, 255)
    else:           # draw unmatched ones in blue
        print dist
        color = (255, 0, 0)

    # Draw matched keypoints on the original image
    x, y = kp[res].pt
    center = (int(x), int(y))
    cv2.circle(img, center, 2, color, -1)

    # Draw matched keypoints on the template image
    x, y = keys[h].pt
    center = (int(x), int(y))
    cv2.circle(template, center, 2, color, -1)

cv2.imshow('img', img)
cv2.imshow('tm', template)
cv2.waitKey(0)
cv2.destroyAllWindows()
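A side note: the cv2.KNearest step above is just a 1-nearest-neighbour lookup in which the "responses" are the keypoint indices themselves, so its effect can be sketched with plain NumPy. This is a minimal illustration, not the OpenCV implementation; the array shapes and the synthetic descriptors are assumptions standing in for the real SURF descriptors.

```python
import numpy as np

# Synthetic stand-ins (assumption): in the real script, `samples` would be the
# scene-image SURF descriptors and `query` the template-image descriptors.
rng = np.random.default_rng(0)
samples = rng.random((50, 128)).astype(np.float32)  # scene descriptors
query = samples[[3, 17, 42]] + 0.001                # near-copies of three rows

def match_nearest(query, samples):
    """Return (index, squared distance) of the nearest sample for each query row."""
    # Pairwise squared Euclidean distances, shape (n_query, n_samples)
    d2 = ((query[:, None, :] - samples[None, :, :]) ** 2).sum(axis=2)
    idx = d2.argmin(axis=1)
    return idx, d2[np.arange(len(query)), idx]

idx, dist = match_nearest(query, samples)
# idx[h] plays the role of `res` in the loop above, dist[h] the role of `dist`
```

Because the responses passed to knn.train are just 0..len(kp)-1, the label returned by find_nearest is exactly the argmin index computed here, which is why it can be used directly to look up kp[res].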
Below is the result I got (the pasted template image was copied onto the original image using Paint):
As you can see, there are some small mistakes, but as a starting point, hopefully it's good enough.
Regarding python - OpenCV 2.4.1 - Computing SURF descriptors in Python, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/10984313/