
I am trying to develop an application that records video on an Android phone so that the video can be viewed on a PC, essentially using the Android device like a webcam. The PC side is written in Java and acts as the server; the transfer is done over a socket. My problem is that I can record the video, but the application on the PC side cannot play back the video that is sent. Here is my MediaRecorder setup code:

public void prepareVideoRecorder(Camera mCamera, ParcelFileDescriptor pfd, 
     SurfaceHolder mHolder) { 
    if (mCamera == null) { 
     mCamera = safeCameraOpen(mCamera); 
    } 
    if (mMediaRecorder == null) { 
     mMediaRecorder = new MediaRecorder(); 

     mCamera.stopPreview(); 
     // Step 1: unlock the camera and hand it to the MediaRecorder 
     mCamera.unlock(); 
     mMediaRecorder.setCamera(mCamera); 
    } 

    // Step 2: set the audio and video sources 
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER); 
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA); 

    // Step 3: set a CamcorderProfile (API level 8 or higher); this also 
    // configures the output format and encoders, so setOutputFormat() is 
    // not called separately. 
    mMediaRecorder.setProfile(CamcorderProfile 
      .get(CamcorderProfile.QUALITY_HIGH)); 

    // Step 4: set the output file (here: the socket's file descriptor) 
    mMediaRecorder.setOutputFile(pfd.getFileDescriptor()); 

    // Step 5: set the preview output 
    mMediaRecorder.setPreviewDisplay(mHolder.getSurface()); 

    try { 
     mMediaRecorder.prepare(); 
    } catch (IllegalStateException e) { 
     e.printStackTrace(); 
    } catch (IOException e) { 
     e.printStackTrace(); 
    } 
} 
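
For reference, here is a minimal sketch of how the ParcelFileDescriptor passed into prepareVideoRecorder might be obtained on the phone; the question does not show this part, so the helper name, host, and port are assumptions:

 private ParcelFileDescriptor openRecorderSocket(String host, int port) 
     throws IOException { 
    // Assumed wiring (not shown in the question): open a TCP connection to 
    // the PC and wrap its file descriptor so MediaRecorder can write its 
    // output straight into the socket. Run this off the UI thread. 
    java.net.Socket socket = new java.net.Socket(host, port); 
    return ParcelFileDescriptor.fromSocket(socket); 
 } 

The recorder would then be started with something like prepareVideoRecorder(camera, openRecorderSocket(pcHost, 1234), holder) followed by mMediaRecorder.start().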

This part seems to be correct. The PC side, which should play the video, is developed with Xuggler, and the application stops at:

if (container.open(inputstream, null) < 0) { 
      throw new IllegalArgumentException("could not open input stream"); 
     } 

This is the relevant part of the Java class on the PC side:

public class imagePnl extends JPanel { 

URL medialocator = null; 
BufferedImage image; 
private Player player; 
private DataSource ds = null; 
private String mobileLocation = "socket://localhost:1234"; 
// private ByteArrayDataSource byteDs = null; 
private InputStream inputStream = null; 
IContainerFormat format; 

public imagePnl() { 
} 

public void setVideo(InputStream inputstream) { 
    // Let's make sure that we can actually convert video pixel formats. 
    if (!IVideoResampler 
      .isSupported(IVideoResampler.Feature.FEATURE_COLORSPACECONVERSION)) { 
     throw new RuntimeException("you must install the GPL version" 
       + " of Xuggler (with IVideoResampler support) for " 
       + "this demo to work"); 
    } 

    IContainer container = IContainer.make(); 

    if (container.open(inputstream, null) < 0) { 
     throw new IllegalArgumentException("could not open input stream"); 
    } 
    // query how many streams the call to open found 
    int numStreams = container.getNumStreams(); 
    // and iterate through the streams to find the first video stream 
    int videoStreamId = -1; 
    IStreamCoder videoCoder = null; 
    for (int i = 0; i < numStreams; i++) { 
     // Find the stream object 
     IStream stream = container.getStream(i); 
     // Get the pre-configured decoder that can decode this stream; 
     IStreamCoder coder = stream.getStreamCoder(); 

     if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) { 
      videoStreamId = i; 
      videoCoder = coder; 
      break; 
     } 
    } 
    if (videoStreamId == -1) { 
     throw new RuntimeException("could not find video stream"); 
    } 
    /* 
    * Now we have found the video stream in this file. Let's open up our 
    * decoder so it can do work. 
    */ 
    if (videoCoder.open() < 0) { 
     throw new RuntimeException(
       "could not open video decoder for container"); 
    } 
    IVideoResampler resampler = null; 
    if (videoCoder.getPixelType() != IPixelFormat.Type.BGR24) { 
     // if this stream is not in BGR24, we're going to need to 
     // convert it. The VideoResampler does that for us. 
     resampler = IVideoResampler.make(videoCoder.getWidth(), 
       videoCoder.getHeight(), IPixelFormat.Type.BGR24, 
       videoCoder.getWidth(), videoCoder.getHeight(), 
       videoCoder.getPixelType()); 
     if (resampler == null) { 
      throw new RuntimeException(
        "could not create color space resampler."); 
     } 
    } 
    /* 
    * Now, we start walking through the container looking at each packet. 
    */ 
    IPacket packet = IPacket.make(); 
    long firstTimestampInStream = Global.NO_PTS; 
    long systemClockStartTime = 0; 
    while (container.readNextPacket(packet) >= 0) { 
     /* 
     * Now we have a packet, let's see if it belongs to our video stream 
     */ 
     if (packet.getStreamIndex() == videoStreamId) { 
      /* 
      * We allocate a new picture to get the data out of Xuggler 
      */ 
      IVideoPicture picture = IVideoPicture.make(
        videoCoder.getPixelType(), videoCoder.getWidth(), 
        videoCoder.getHeight()); 

      try { 
       int offset = 0; 
       while (offset < packet.getSize()) { 
        System.out 
          .println("VideoManager.decode(): decode one image"); 
        /* 
        * Now, we decode the video, checking for any errors. 
        */ 
        int bytesDecoded = videoCoder.decodeVideo(picture, 
          packet, offset); 
        if (bytesDecoded < 0) { 
         throw new RuntimeException(
           "got error decoding video"); 
        } 
        offset += bytesDecoded; 

        /* 
        * Some decoders will consume data in a packet, but will 
        * not be able to construct a full video picture yet. 
        * Therefore you should always check if you got a 
        * complete picture from the decoder 
        */ 
        if (picture.isComplete()) { 
         System.out 
           .println("VideoManager.decode(): image complete"); 
         IVideoPicture newPic = picture; 
         /* 
         * If the resampler is not null, that means we 
         * didn't get the video in BGR24 format and need to 
         * convert it into BGR24 format. 
         */ 
         if (resampler != null) { 
          // we must resample 
          newPic = IVideoPicture 
            .make(resampler.getOutputPixelFormat(), 
              picture.getWidth(), 
              picture.getHeight()); 
          if (resampler.resample(newPic, picture) < 0) { 
           throw new RuntimeException(
             "could not resample video"); 
          } 
         } 
         if (newPic.getPixelType() != IPixelFormat.Type.BGR24) { 
          throw new RuntimeException(
            "could not decode video as BGR 24 bit data"); 
         } 

         /** 
         * We could just display the images as quickly as we 
         * decode them, but it turns out we can decode a lot 
         * faster than you think. 
         * 
         * So instead, the following code does a poor-man's 
         * version of trying to match up the frame-rate 
         * requested for each IVideoPicture with the system 
         * clock time on your computer. 
         * 
         * Remember that all Xuggler IAudioSamples and 
         * IVideoPicture objects always give timestamps in 
         * Microseconds, relative to the first decoded item. 
         * If instead you used the packet timestamps, they 
         * can be in different units depending on your 
         * IContainer, and IStream and things can get hairy 
         * quickly. 
         */ 
         if (firstTimestampInStream == Global.NO_PTS) { 
          // This is our first time through 
          firstTimestampInStream = picture.getTimeStamp(); 
          // get the starting clock time so we can hold up 
          // frames until the right time. 
          systemClockStartTime = System 
            .currentTimeMillis(); 
         } else { 
          long systemClockCurrentTime = System 
            .currentTimeMillis(); 
          long millisecondsClockTimeSinceStartofVideo = systemClockCurrentTime 
            - systemClockStartTime; 

          // compute how long for this frame since the 
          // first frame in the stream. 
          // remember that IVideoPicture and IAudioSamples 
          // timestamps are always in MICROSECONDS, 
          // so we divide by 1000 to get milliseconds. 
          long millisecondsStreamTimeSinceStartOfVideo = (picture 
            .getTimeStamp() - firstTimestampInStream)/1000; 
          // give ourselves 50 ms of tolerance 
          final long millisecondsTolerance = 50; 
          final long millisecondsToSleep = millisecondsStreamTimeSinceStartOfVideo 
            - (millisecondsClockTimeSinceStartofVideo + millisecondsTolerance); 
          if (millisecondsToSleep > 0) { 
           try { 
            Thread.sleep(millisecondsToSleep); 
           } catch (InterruptedException e) { 
            // we might get this when the user 
            // closes the dialog box, so just return 
            // from the method. 
            return; 
           } 
          } 
         } 

         // And finally, convert the BGR24 to an Java 
         // buffered image 
         BufferedImage javaImage = Utils 
           .videoPictureToImage(newPic); 

         // and display it on the Java Swing window 
         setImage(javaImage); 
         // if (listener != null) { 
         // listener.imageUpdated(javaImage); 
         // } 
        } 
       } // end of while 
      } catch (Exception exc) { 
       exc.printStackTrace(); 
      } 
     } else { 
      // This packet is not part of our video stream, so we silently drop it. 
     } 

    } 
    /* 
    * Technically since we're exiting anyway, these will be cleaned up by 
    * the garbage collector... but because we're nice people and want to be 
    * invited places for Christmas, we're going to show how to clean up. 
    */ 
    if (videoCoder != null) { 
     videoCoder.close(); 
     videoCoder = null; 
    } 
    if (container != null) { 
     container.close(); 
     container = null; 
    } 

    // (A JMF-based playback path using Manager/MediaLocator on 
    // "socket://localhost:1234" was also tried, but it is not used here.) 

    // Trigger a repaint on the Swing side instead of calling paint() with a 
    // Graphics object that may be null. 
    repaint(); 
} 

public void setImage(BufferedImage image) { 
    this.image = image; 

    // Schedule a repaint instead of calling paint() directly; getGraphics() 
    // can return null and painting should go through Swing's repaint cycle. 
    repaint(); 
} 

@Override 
public void paintComponent(Graphics g) { 
    super.paintComponent(g); 
    // Scale the most recently decoded frame to fill the panel. 
    if (image != null) { 
     g.drawImage(image, 0, 0, getWidth(), getHeight(), this); 
    } 
} 

} 

When the application stops, that line does not show any error, but no video is displayed on the PC side either.
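
For completeness, here is a rough sketch of how the PC side is presumably wired up (this part is not shown in the question, so the port and the frame setup are assumptions):

 public static void main(String[] args) throws java.io.IOException { 
    // Assumed server-side wiring: accept the phone's TCP connection and hand 
    // the raw socket stream to the panel, which feeds it to Xuggler. 
    java.net.ServerSocket server = new java.net.ServerSocket(1234); 
    java.net.Socket client = server.accept(); 

    javax.swing.JFrame frame = new javax.swing.JFrame("Phone camera"); 
    imagePnl panel = new imagePnl(); 
    frame.add(panel); 
    frame.setSize(640, 480); 
    frame.setVisible(true); 

    // setVideo() blocks while it decodes, so in real code it should not run 
    // on the Swing event thread. 
    panel.setVideo(client.getInputStream()); 
 } 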

I hope you can help me. This project is for learning purposes. Thanks in advance, Fran

Answer


If you want to stream video from Android, you should use a streaming protocol such as RTSP/RTP. Using a TCP socket will not work, because the header information is not available in all of the packets received over the socket. Take a look at Spydroid.
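
For context: streamers such as Spydroid work around this on the Android side by letting MediaRecorder write into a local socket pair and then packetizing the H.264 data into RTP themselves. A very rough sketch of that idea (the method name and comments are illustrative, not Spydroid's actual API):

 private InputStream startLocalSocketRecording(MediaRecorder recorder) 
     throws IOException { 
    // MediaRecorder writes into one end of a LocalSocket pair; a separate 
    // thread reads the other end and wraps the H.264 stream in RTP packets 
    // (the packetizer itself is not shown here). 
    LocalServerSocket lss = new LocalServerSocket("camera-stream"); 
    LocalSocket sender = new LocalSocket(); 
    sender.connect(new LocalSocketAddress("camera-stream")); 
    LocalSocket receiver = lss.accept(); 

    // setAudioSource/setVideoSource/setProfile would be called here, as in 
    // the question, before setting the output. 
    recorder.setOutputFile(sender.getFileDescriptor()); 
    // ... then prepare() and start() ... 

    // Whatever is read from here must still be parsed and sent as RTP before 
    // a standard client can play it. 
    return receiver.getInputStream(); 
 } 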


Hello Skanda. Thanks for your answer. So should I replace my socket with the RTSP or RTP protocol? I will take a look at Spydroid and let you know. – user2160960 2013-03-12 15:49:18


@user2160960 Yes, you can use Spydroid's source code. If I remember correctly, it uses RTSP. – Skanda 2013-03-12 15:52:07