
Piping Raspivid H.264 video through netcat into OpenCV. The goal is to stream video over the network from a Raspberry Pi (raspivid/H.264) into an OpenCV application running on a laptop.

The OpenCV capture code is as follows (C++):

#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap;
    cap.open("cam_1"); // cam_1 is a FIFO

    cv::Mat frame;

    while(1){
        cap >> frame;          // read the next decoded frame from the FIFO
        cv::imshow("", frame);
        cv::waitKey(10);
    }
    return 0;
}

The FIFO it reads from is created as follows:

mkfifo cam_1 

Once the OpenCV program is running, the netcat listener is started:

ncat --recv-only --keep-open --verbose --listen 5001 > cam_1 

Once the netcat listener is running on the laptop, the stream is started from the Raspberry Pi:

raspivid --verbose --nopreview -b 2000000 --timeout 0 -o - | ncat 192.168.LAPTOP.IP 5001 

Alternatively, for debugging purposes, a local file on the laptop can be streamed into netcat:

cat video.h264 | nc 192.168.LAPTOP.IP 5001 

Both approaches produce the following error:

Unable to stop the stream: Inappropriate ioctl for device
(ERROR)icvOpenAVI_XINE(): Unable to initialize video driver.

Interestingly, if I start the netcat listener on the laptop, kill it with CTRL+C, and then start it again before starting the video stream, then with either method the video plays correctly.

I don't understand why starting the netcat listener, killing it, and then starting it again makes any difference, or what that difference is. I suspect an EOF or BOF may need to be echoed into the FIFO ahead of the video, but I am not sure what the syntax for that would be.

I have tried every flavor of netcat.

Comment: https://stackoverflow.com/a/44972255/2836621

Answers


It will work if you touch the FIFO after OpenCV tries to read from it, but before you start streaming.
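
In other words, the startup order matters: run the OpenCV program first (it will block waiting on the FIFO), touch the FIFO, and only then start the netcat listener and raspivid. A rough Python equivalent of that touch step, sketched under the assumption that the FIFO is still called cam_1 and that the OpenCV reader is already waiting on it:

import os

# Stand-in for `touch cam_1`: briefly open the FIFO for writing and close it.
# The non-blocking open is assumed to succeed only because the OpenCV process
# is already waiting on the read end of cam_1.
fd = os.open("cam_1", os.O_WRONLY | os.O_NONBLOCK)
os.close(fd)
# Now start the ncat listener and the raspivid pipe exactly as in the question.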


I just solved this using the following: https://stackoverflow.com/a/48675107/2355051

I ended up adapting this picamera Python recipe.

On the Raspberry Pi (createStream.py):

import io
import socket
import struct
import time
import picamera

# Connect a client socket to the laptop that will process the stream
# (change 10.0.0.3 to the IP address or hostname of your server)
client_socket = socket.socket()
client_socket.connect(('10.0.0.3', 777))

# Make a file-like object out of the connection
connection = client_socket.makefile('wb')
try:
    with picamera.PiCamera() as camera:
        camera.resolution = (1024, 768)
        # Start a preview and let the camera warm up for 2 seconds
        camera.start_preview()
        time.sleep(2)

        # Note the start time and construct a stream to hold image data
        # temporarily (we could write it directly to connection, but in this
        # case we want to find out the size of each capture first to keep
        # our protocol simple)
        start = time.time()
        stream = io.BytesIO()
        for foo in camera.capture_continuous(stream, 'jpeg', use_video_port=True):
            # Write the length of the capture to the connection and flush to
            # ensure it actually gets sent
            connection.write(struct.pack('<L', stream.tell()))
            connection.flush()

            # Rewind the stream and send the image data over the wire
            stream.seek(0)
            connection.write(stream.read())

            # Reset the stream for the next capture
            stream.seek(0)
            stream.truncate()
    # Write a length of zero to the stream to signal we're done
    connection.write(struct.pack('<L', 0))
finally:
    connection.close()
    client_socket.close()

On the machine that is processing the stream (processStream.py):

import io
import socket
import struct
import cv2
import numpy as np

# Start a socket listening for connections on 0.0.0.0:777 (0.0.0.0 means
# all interfaces)
server_socket = socket.socket()
server_socket.bind(('0.0.0.0', 777))
server_socket.listen(0)

# Accept a single connection and make a file-like object out of it
connection = server_socket.accept()[0].makefile('rb')
try:
    while True:
        # Read the length of the image as a 32-bit unsigned int. If the
        # length is zero, quit the loop
        image_len = struct.unpack('<L', connection.read(struct.calcsize('<L')))[0]
        if not image_len:
            break
        # Construct a stream to hold the image data and read the image
        # data from the connection
        image_stream = io.BytesIO()
        image_stream.write(connection.read(image_len))

        # Decode the JPEG bytes with OpenCV and display the frame
        data = np.frombuffer(image_stream.getvalue(), dtype=np.uint8)
        imagedisp = cv2.imdecode(data, 1)

        cv2.imshow("Frame", imagedisp)
        cv2.waitKey(1)  # imshow will not display anything without waitKey
finally:
    connection.close()
    server_socket.close()
    cv2.destroyAllWindows()  # clean up the display window once the stream ends
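
For reference, the two scripts share a very small framing protocol: each JPEG is preceded by its byte length packed as a 4-byte little-endian unsigned int ('<L'), and a zero length tells the receiver to stop. A standalone illustration of that framing (the frame contents below are dummy bytes, not real camera output):

import struct

jpeg_bytes = b'\xff\xd8' + b'\x00' * 1022      # stand-in for one JPEG capture
header = struct.pack('<L', len(jpeg_bytes))    # 4-byte little-endian length prefix
frame = header + jpeg_bytes                    # what actually travels over the socket

(length,) = struct.unpack('<L', frame[:4])     # the receiver reads the prefix first...
payload = frame[4:4 + length]                  # ...then exactly that many bytes
assert length == len(jpeg_bytes) and payload == jpeg_bytes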

This solution gives results comparable to the video referenced in the original question. Higher-resolution frames increase the latency of the feed, but that is tolerable for my application.
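
If latency matters more than image detail, one knob worth experimenting with is the capture size set in createStream.py; the values below are only illustrative, not part of the original answer:

# Hypothetical tweak in createStream.py: smaller frames mean smaller JPEGs,
# so less data has to cross the socket per frame.
camera.resolution = (640, 480)   # instead of (1024, 768)
camera.framerate = 15            # capping the framerate further reduces bandwidth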

You first need to run processStream.py, and then execute createStream.py on the Raspberry Pi. If that does not work, execute the following script: sudo