With a fixed webcam, I am capturing a continuous sequence of frames of a moving object. I am trying to estimate the sub-pixel shift between frames, but the estimates are far off. Since the estimates in my first attempt were wrong, I created a mask for the object and tried estimating the shift using only the moving object with the background removed, but the shift estimates came out the same.
function delta_est = estimate_shift(s,n)
h = waitbar(0, 'Shift Estimation');
set(h, 'Name', 'Please wait...');
nr = length(s);
delta_est=zeros(nr,2);
p = [n n]; % only the central (aliasing-free) part of NxN pixels is used for shift estimation
sz = size(s{1});
S1 = fftshift(fft2(s{1})); % Fourier transform of the reference image
for i=2:nr
    waitbar(i/nr, h, 'Shift Estimation');
    S2 = fftshift(fft2(s{i})); % Fourier transform of the image to be registered
    S2(S2==0) = 1e-10;         % avoid division by zero
    Q = S1./S2;
    A = angle(Q); % phase difference between the two images
    % determine the central part of the frequency spectrum to be used
    beginy = floor(sz(1)/2)-p(1)+1;
    endy   = floor(sz(1)/2)+p(1)+1;
    beginx = floor(sz(2)/2)-p(2)+1;
    endx   = floor(sz(2)/2)+p(2)+1;
    % compute x and y coordinates of the pixels
    x = ones(endy-beginy+1,1)*[beginx:endx];
    x = x(:);
    y = [beginy:endy]'*ones(1,endx-beginx+1);
    y = y(:);
    v = A(beginy:endy,beginx:endx);
    v = v(:);
    % compute the least squares solution for the slopes of the phase difference plane
    M_A = [x y ones(length(x),1)];
    r = M_A\v;
    delta_est(i,:) = -[r(2) r(1)].*sz/2/pi;
end
close(h);
What changes do I need to make to get a correct sub-pixel estimate?
Best Answer
Try vision.PointTracker from the Computer Vision System Toolbox. It can track points across frames with sub-pixel accuracy.
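For reference, the phase-plane fit in the question can be reproduced in NumPy; the sketch below (function names are illustrative, not from the post) verifies it on a synthetic sub-pixel shift and also demonstrates a common failure mode that can produce wildly wrong estimates: angle() returns the *wrapped* phase, so for shifts much larger than a pixel the fitted plane is corrupted unless the integer part of the shift is removed first (e.g. from the cross-correlation peak).

```python
import numpy as np

def fourier_shift(img, dy, dx):
    """Shift img by a (possibly sub-pixel) amount via the Fourier shift theorem."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    F = np.fft.fft2(img)
    return np.real(np.fft.ifft2(F * np.exp(-2j * np.pi * (fy * dy + fx * dx))))

def estimate_shift_phase(ref, img, p=6):
    """Fit a plane to the phase of S1/S2 over the central (2p+1)^2 frequency
    bins, mirroring the MATLAB code above; returns (dy, dx) of img vs. ref."""
    sz = ref.shape
    S1 = np.fft.fftshift(np.fft.fft2(ref))
    S2 = np.fft.fftshift(np.fft.fft2(img))
    S2 = np.where(S2 == 0, 1e-10, S2)        # avoid division by zero
    A = np.angle(S1 / S2)                    # wrapped phase difference
    cy, cx = sz[0] // 2, sz[1] // 2
    Y, X = np.mgrid[cy - p:cy + p + 1, cx - p:cx + p + 1]
    v = A[cy - p:cy + p + 1, cx - p:cx + p + 1].ravel()
    M = np.column_stack([X.ravel(), Y.ravel(), np.ones(v.size)])
    r, *_ = np.linalg.lstsq(M, v, rcond=None)
    # slope of the phase plane is 2*pi*shift/N along each axis
    return r[1] * sz[0] / (2 * np.pi), r[0] * sz[1] / (2 * np.pi)

rng = np.random.default_rng(0)
ref = rng.standard_normal((64, 64))

# Small sub-pixel shift: the phase stays within (-pi, pi] on the central
# band, so the plane fit recovers it accurately.
small = fourier_shift(ref, 0.30, -0.20)
dy, dx = estimate_shift_phase(ref, small)    # close to (0.30, -0.20)

# Large shift: the phase wraps inside the fitted band and the estimate is
# far off. Removing the integer part first restores sub-pixel accuracy.
big = fourier_shift(ref, 8.35, 0.0)
dy_bad, _ = estimate_shift_phase(ref, big)                       # far from 8.35
dy_fix, _ = estimate_shift_phase(ref, np.roll(big, -8, axis=0))  # ~0.35 residual
print(dy, dx, dy_bad, 8 + dy_fix)
```

Note the sign convention: here the returned shift is that of img relative to ref; the MATLAB code negates the slopes because it registers each frame back to the reference.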
Regarding "matlab - Sub-pixel shift estimation of a moving object in Matlab", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41432400/