2014-08-31 · 27 views

I am using the MATLAB code from the "Find Image Rotation and Scale Using Automated Feature Matching" example (http://www.mathworks.com/help/vision/examples/find-image-rotation-and-scale-using-automated-feature-matching.html), which recovers the transformation between two images.

How can I find the angle by which the second image has been rotated?

function sid(a,b) 
original = rgb2gray(imread(a)); 
%imshow(original); 
%text(size(original,2),size(original,1)+15, ... 
% 'Image courtesy of Massachusetts Institute of Technology', ... 
% 'FontSize',7,'HorizontalAlignment','right'); 

%% Step 2: Load the Distorted Image 

distorted = rgb2gray(imread(b)); % second (possibly rotated/scaled) image 
%figure, imshow(distorted) 

%% 
% You can experiment by varying the scale and rotation of the input image. 
% However, note that there is a limit to the amount you can vary the scale 
% before the feature detector fails to find enough features. 

%% Step 3: Find Matching Features Between Images 
% Detect features in both images. 
ptsOriginal = detectSURFFeatures(original); 
ptsDistorted = detectSURFFeatures(distorted); 

%% 
% Extract feature descriptors. 
[featuresOriginal, validPtsOriginal] = extractFeatures(original, ptsOriginal); 
[featuresDistorted, validPtsDistorted] = extractFeatures(distorted, ptsDistorted); 

%% 
% Match features by using their descriptors. 
index_pairs = matchFeatures(featuresOriginal, featuresDistorted); 

%% 
% Retrieve locations of corresponding points for each image. 
matchedOriginal = validPtsOriginal(index_pairs(:,1)); 
matchedDistorted = validPtsDistorted(index_pairs(:,2)); 

%% 
% Show putative point matches. 
%figure; 
%showMatchedFeatures(original,distorted,matchedOriginal,matchedDistorted); 
%title('Putatively matched points (including outliers)'); 

%% Step 4: Estimate Transformation 
% Find a transformation corresponding to the matching point pairs using the 
% statistically robust M-estimator SAmple Consensus (MSAC) algorithm, which 
% is a variant of the RANSAC algorithm. It removes outliers while computing 
% the transformation matrix. You may see varying results of the 
% transformation computation because of the random sampling employed by the 
% MSAC algorithm. 
[tform, inlierDistorted, inlierOriginal] = estimateGeometricTransform(... 
    matchedDistorted, matchedOriginal, 'similarity'); 

%% 
% Display matching point pairs used in the computation of the 
% transformation. 
%figure; 
%showMatchedFeatures(original,distorted,inlierDistorted, inlierOriginal); 
%title('Matching points (inliers only)'); 
%legend('ptsOriginal','ptsDistorted'); 


% Compute the matrix of the inverse transformation. 
Tinv = tform.invert.T; 

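% For a similarity transformation (MATLAB's row-vector convention, 
% [x y 1] * T), the inverse matrix has the form 
% 
%   Tinv = [ sc -ss 0 
%            ss  sc 0 
%            tx  ty 1 ] 
% 
% where sc = scale*cos(theta) and ss = scale*sin(theta), so the scale and 
% rotation angle can be read back from the first column below. 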
ss = Tinv(2,1); 
sc = Tinv(1,1); 
scale_recovered = sqrt(ss*ss + sc*sc) 
theta_recovered = atan2(ss,sc)*180/pi 

%% 
% The recovered values describe the scale change and rotation of the 
% second image relative to the first. 

%% Step 5: Recover the Original Image 
% Recover the original image by transforming the distorted image. 
outputView = imref2d(size(original)); 
recovered = imwarp(distorted,tform,'OutputView',outputView); 
imwrite(recovered,b); % note: this overwrites the second input file 


%figure, imshowpair(original,recovered,'montage') 

%% 
% The |recovered| (right) image quality does not match the |original| 
% (left) image because of the distortion and recovery process. In 
% particular, the image shrinking causes loss of information. The artifacts 
% around the edges are due to the limited accuracy of the transformation. 
% If you were to detect more points in *Step 3: Find Matching Features 
% Between Images*, the transformation would be more accurate. For example, 
% we could have used a corner detector, detectFASTFeatures, to complement 
% the SURF feature detector which finds blobs. Image content and image size 
% also impact the number of detected features. 



end 

Read Step 5 on the page you linked (at least, I think that is what you are asking). – nkjt 2014-09-01 10:23:11

Answer

0

theta_recovered is the recovered rotation angle, in degrees.
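To sanity-check the formula the script uses, here is a small Python sketch (an illustration only, not part of the original MATLAB code): it builds the two matrix entries sc and ss for a chosen, hypothetical scale and angle, then recovers both values the same way the script does with sqrt and atan2.

```python
import math

# Hypothetical known values, standing in for the true image transformation.
scale, theta_deg = 0.7, 30.0
theta = math.radians(theta_deg)

# Entries Tinv(1,1) and Tinv(2,1) of the inverse similarity matrix.
sc = scale * math.cos(theta)
ss = scale * math.sin(theta)

# Recover scale and angle exactly as the MATLAB script does:
# scale_recovered = sqrt(ss^2 + sc^2), theta_recovered = atan2(ss, sc) in degrees.
scale_recovered = math.hypot(ss, sc)
theta_recovered = math.degrees(math.atan2(ss, sc))

print(round(scale_recovered, 3), round(theta_recovered, 3))  # prints: 0.7 30.0
```

The same arithmetic applied to the Tinv matrix in the script yields the rotation angle between the two input images, in degrees.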