2011-07-01

I have a server on Linux using Berkeley sockets, and I create a TCP connection with an iPod client. I have an IplImage* img; to send from the server to the iPod. I use the write(socket, /*DATA*/, 43200); call, and as the data I tried sending reinterpret_cast<char*>(img), img, and img->imageData. All of those choices do send data of some kind. How do I send an IplImage from the server to the iPod client over TCP and turn it into a UIImage?

On the iPod side I receive the data like this (I saw this done here, so don't mind the complicated parts; it just receives all the data of a single image):

bytesRead = [iStream read: (char*)[buffer mutableBytes] + totalBytesRead maxLength: 43200 - totalBytesRead]; 

After the whole image has been received, I have this:

[buffer setLength: 43200]; 
NSData *imagem = [NSData dataWithBytes:[buffer bytes] length:43200]; 
UIImage *final = [self UIImageFromIplImage:imagem]; 

Now.. I know I could get OpenCV working on the iPod itself, but I couldn't find a simple explanation of how to set that up, so I used the second code from this webpage and adapted it, since I know all the specs of my image (for example, I set all the variables for the CGImageCreate() function):

- (UIImage *)UIImageFromIplImage:(NSData *)image { 

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray(); 

    // Wrapping the raw pixel bytes for CGImage 
    NSData *data = [NSData dataWithBytes:[image bytes] length:43200]; 

    CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data); 

    // Creating a CGImage from the pixel chunk of the IplImage 
    size_t width = 240; 
    size_t height = 180; 
    size_t depth = 8;           //bitsPerComponent 
    size_t depthXnChannels = 8; //bitsPerPixel 
    size_t widthStep = 240;     //bytesPerRow 

    CGImageRef imageRef = CGImageCreate(width, height, depth, depthXnChannels, widthStep, colorSpace, kCGImageAlphaNone | kCGBitmapByteOrderDefault, provider, NULL, false, kCGRenderingIntentDefault); 

    // Getting a UIImage from the CGImage 
    UIImage *ret = [UIImage imageWithCGImage:imageRef]; 
    lolView.image = ret; 
    CGImageRelease(imageRef); 
    CGDataProviderRelease(provider); 
    CGColorSpaceRelease(colorSpace); 
    return ret; 
}

Problem: when I display the image, it comes out completely weird and 'random', even though the image I send is always the same. I really don't know what's wrong..

PS: the TCP connection works fine with other data, like numbers or words. The image is grayscale.

Thanks for any help.

Answer


I got it working like this. On the server side (Code::Blocks on Linux with openFrameworks (& ofxOpenCv)):

img.allocate(240, 180, OF_IMAGE_COLOR);     //ofImage 
img2.allocate(240, 180);         //ofxCvColorImage 
frame = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 3); //IplImage 
bw = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 1);  //IplImage 
gray.allocate(240, 180);         //ofxCvGrayscaleImage 


///ofImage 
img.loadImage("lol.jpg"); 

///ofImage -> ofxCvColor 
img2.setFromPixels(img.getPixels(), 240, 180); 

///ofxCvColor -> IplImage 
frame = img2.getCvImage(); 

///IplImage in GRAY 
cvCvtColor(frame,bw,CV_RGB2GRAY); 
cvThreshold(bw,bw,200,255,CV_THRESH_BINARY); //It is actually a binary image 
gray = bw; 
pix = gray.getPixels(); 

n=write(newsockfd,pix,43200); 

On the client side (iPod, iOS 4.3):

-(UIImage *) dataFromIplImageToUIImage:(unsigned char *) rawData 
{ 
    size_t width = 240; 
    size_t height = 180; 
    size_t depth = 8;       //bitsPerComponent 
    size_t widthStep = 240; //bytesPerRow 

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray(); 
    CGContextRef ctx = CGBitmapContextCreate(rawData, width, height, depth, widthStep, colorSpace, kCGImageAlphaNone); 

    CGImageRef imageRef = CGBitmapContextCreateImage(ctx); 
    UIImage *rawImage = [UIImage imageWithCGImage:imageRef]; 

    CGImageRelease(imageRef); 
    CGContextRelease(ctx); 
    CGColorSpaceRelease(colorSpace); 

    myImageView.image = rawImage; 
    // rawData is owned by the caller; free it there once the image has 
    // been drawn (a free() placed after return would never run). 
    return rawImage; 
} 

There is probably a simpler way to do this, but hey, it got the job done. Hope this helps someone.
