2013-12-13

I am trying to use MediaCodec and MediaMuxer and have run into some trouble: android SurfaceTexture, "camera frame wait timed out".

Here is the error from logcat:

12-13 11:59:58.238: E/AndroidRuntime(23218): FATAL EXCEPTION: main 
12-13 11:59:58.238: E/AndroidRuntime(23218): java.lang.RuntimeException: Unable to resume activity {com.brendon.cameratompeg/com.brendon.cameratompeg.CameraToMpeg}: java.lang.IllegalStateException: Can't stop due to wrong state. 
12-13 11:59:58.238: E/AndroidRuntime(23218): at android.app.ActivityThread.performResumeActivity(ActivityThread.java:2918) 

The error occurs at the call to `mStManager.awaitNewImage();`, which is in the onResume function. Logcat says "Camera frame wait timed out". mStManager is an instance of the class SurfaceTextureManager, and awaitNewImage() is where the "Camera frame wait timed out" exception is thrown. I have added that class to my post.

Part of my code looks like this (the onCreate and onResume functions):

@Override 
    protected void onCreate(Bundle savedInstanceState) { 
     // arbitrary but popular values 
     int encWidth = 640; 
     int encHeight = 480; 
     int encBitRate = 6000000;  // bps (6 Mbps) 
     Log.d(TAG, MIME_TYPE + " output " + encWidth + "x" + encHeight + " @" + encBitRate); 

     super.onCreate(savedInstanceState); 
     setContentView(R.layout.activity_camera_to_mpeg); 

      prepareCamera(encWidth, encHeight); 
      prepareEncoder(encWidth, encHeight, encBitRate); 
      mInputSurface.makeCurrent(); 
      prepareSurfaceTexture(); 

      mCamera.startPreview();   
} 


@Override 
public void onResume(){ 

    try { 

     long startWhen = System.nanoTime(); 
      long desiredEnd = startWhen + DURATION_SEC * 1000000000L; 
      SurfaceTexture st = mStManager.getSurfaceTexture(); 
      int frameCount = 0; 

     while (System.nanoTime() < desiredEnd) { 
      // Feed any pending encoder output into the muxer. 
      drainEncoder(false); 

      // Switch up the colors every 15 frames. Besides demonstrating the use of 
      // fragment shaders for video editing, this provides a visual indication of 
      // the frame rate: if the camera is capturing at 15fps, the colors will change 
      // once per second. 
      if ((frameCount % 15) == 0) { 
       String fragmentShader = null; 
       if ((frameCount & 0x01) != 0) { 
        fragmentShader = SWAPPED_FRAGMENT_SHADER; 
       } 
       mStManager.changeFragmentShader(fragmentShader); 
      } 
      frameCount++; 

      // Acquire a new frame of input, and render it to the Surface. If we had a 
      // GLSurfaceView we could switch EGL contexts and call drawImage() a second 
      // time to render it on screen. The texture can be shared between contexts by 
      // passing the GLSurfaceView's EGLContext as eglCreateContext()'s share_context 
      // argument. 
      mStManager.awaitNewImage(); 
      mStManager.drawImage(); 

      // Set the presentation time stamp from the SurfaceTexture's time stamp. This 
      // will be used by MediaMuxer to set the PTS in the video. 
      if (VERBOSE) { 
       Log.d(TAG, "present: " + 
         ((st.getTimestamp() - startWhen)/1000000.0) + "ms"); 
      } 
      mInputSurface.setPresentationTime(st.getTimestamp()); 

      // Submit it to the encoder. The eglSwapBuffers call will block if the input 
      // is full, which would be bad if it stayed full until we dequeued an output 
      // buffer (which we can't do, since we're stuck here). So long as we fully drain 
      // the encoder before supplying additional input, the system guarantees that we 
      // can supply another frame without blocking. 
      if (VERBOSE) Log.d(TAG, "sending frame to encoder"); 
      mInputSurface.swapBuffers(); 
     } 

     // send end-of-stream to encoder, and drain remaining output 
     drainEncoder(true); 
    } catch(Exception e) { 
     Log.d(TAG, e.getMessage()); 
     // release everything we grabbed 
     releaseCamera(); 
     releaseEncoder(); 
     releaseSurfaceTexture(); 
    } 
} 

And here is the class in my code that the error relates to:

private static class SurfaceTextureManager 
      implements SurfaceTexture.OnFrameAvailableListener { 
     private SurfaceTexture mSurfaceTexture; 
     private CameraToMpeg.STextureRender mTextureRender; 

     private Object mFrameSyncObject = new Object();  // guards mFrameAvailable 
     private boolean mFrameAvailable; 

     /** 
     * Creates instances of TextureRender and SurfaceTexture. 
     */ 
     public SurfaceTextureManager() { 
      mTextureRender = new CameraToMpeg.STextureRender(); 
      mTextureRender.surfaceCreated(); 

      if (VERBOSE) Log.d(TAG, "textureID=" + mTextureRender.getTextureId()); 
      mSurfaceTexture = new SurfaceTexture(mTextureRender.getTextureId()); 

      // This doesn't work if this object is created on the thread that CTS started for 
      // these test cases. 
      // 
      // The CTS-created thread has a Looper, and the SurfaceTexture constructor will 
      // create a Handler that uses it. The "frame available" message is delivered 
      // there, but since we're not a Looper-based thread we'll never see it. For 
      // this to do anything useful, OutputSurface must be created on a thread without 
      // a Looper, so that SurfaceTexture uses the main application Looper instead. 
      // 
      // Java language note: passing "this" out of a constructor is generally unwise, 
      // but we should be able to get away with it here. 
      mSurfaceTexture.setOnFrameAvailableListener(this); 
     } 

     public void release() { 
      // this causes a bunch of warnings that appear harmless but might confuse someone: 
      // W BufferQueue: [unnamed-3997-2] cancelBuffer: BufferQueue has been abandoned! 
      //mSurfaceTexture.release(); 

      mTextureRender = null; 
      mSurfaceTexture = null; 
     } 

     /** 
     * Returns the SurfaceTexture. 
     */ 
     public SurfaceTexture getSurfaceTexture() { 
      return mSurfaceTexture; 
     } 

     /** 
     * Replaces the fragment shader. 
     */ 
     public void changeFragmentShader(String fragmentShader) { 
      mTextureRender.changeFragmentShader(fragmentShader); 
     } 

     /** 
     * Latches the next buffer into the texture. Must be called from the thread that created 
     * the OutputSurface object. 
     */ 
     public void awaitNewImage() { 
      final int TIMEOUT_MS = 2500; 

      synchronized (mFrameSyncObject) { 
       while (!mFrameAvailable) { 
        try { 
         // Wait for onFrameAvailable() to signal us. Use a timeout to avoid 
         // stalling the test if it doesn't arrive. 
         mFrameSyncObject.wait(TIMEOUT_MS); 
         if (!mFrameAvailable) { 
          // TODO: if "spurious wakeup", continue while loop 
          throw new RuntimeException("Camera frame wait timed out"); 
         } 
        } catch (InterruptedException ie) { 
         // shouldn't happen 
         throw new RuntimeException(ie); 
        } 
       } 
       mFrameAvailable = false; 
      } 

      // Latch the data. 
      mTextureRender.checkGlError("before updateTexImage"); 
      mSurfaceTexture.updateTexImage(); 
     } 

     /** 
     * Draws the data from SurfaceTexture onto the current EGL surface. 
     */ 
     public void drawImage() { 
      mTextureRender.drawFrame(mSurfaceTexture); 
     } 

     @Override 
     public void onFrameAvailable(SurfaceTexture st) { 
      if (VERBOSE) Log.d(TAG, "new frame available"); 
      synchronized (mFrameSyncObject) { 
       if (mFrameAvailable) { 
        throw new RuntimeException("mFrameAvailable already set, frame could be dropped"); 
       } 
       mFrameAvailable = true; 
       mFrameSyncObject.notifyAll(); 
      } 
     } 
    } 
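As an aside, the TODO comment inside awaitNewImage() notes that a spurious wakeup would currently be misreported as a timeout. A deadline-based wait distinguishes the two cases. The sketch below is a standalone illustration of that pattern; `FrameWaiter`, `awaitFrame`, and `signalFrame` are hypothetical names, not part of the code above:

```java
// Sketch: timed wait that tolerates spurious wakeups by re-checking the
// condition against a fixed deadline instead of a fixed wait interval.
// (Hypothetical class; the real code would fold this into awaitNewImage().)
public class FrameWaiter {
    private final Object mLock = new Object();   // guards mFrameAvailable
    private boolean mFrameAvailable;

    public void awaitFrame(long timeoutMs) throws InterruptedException {
        synchronized (mLock) {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (!mFrameAvailable) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    // Only a real deadline miss reaches here; a spurious
                    // wakeup just loops and waits out the remaining time.
                    throw new RuntimeException("Camera frame wait timed out");
                }
                mLock.wait(remaining);
            }
            mFrameAvailable = false;   // consume the frame flag
        }
    }

    public void signalFrame() {
        synchronized (mLock) {
            mFrameAvailable = true;
            mLock.notifyAll();
        }
    }
}
```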

Does anyone have any ideas? Thanks!

At which line do you get this error? This error comes from the camera state. Use try..catch to find out what is wrong, and let me know. –

Thank you! I used try..catch and have updated my post. The error is at `mStManager.awaitNewImage();`; logcat says "Camera frame wait timed out". –

mStManager is an instance of the class SurfaceTextureManager, and awaitNewImage() is where the "Camera frame wait timed out" comes from. I have added that class to my post. –

Answer

5

I have run into this problem too. The cause is that your code is running on a thread that has a Looper; you must make sure it runs on a thread without one. If the thread that creates the SurfaceTexture has a Looper, the SurfaceTexture.OnFrameAvailableListener "frame available" message is delivered to a Handler on that same Looper — but the thread is blocked inside awaitNewImage() and never processes the message, so it gets stuck and times out.

The example on Bigflake gives you a detailed description:

/** 
* Wraps testEditVideo, running it in a new thread. Required because of the way 
* SurfaceTexture.OnFrameAvailableListener works when the current thread has a Looper 
* configured. 
*/ 
private static class VideoEditWrapper implements Runnable { 
    private Throwable mThrowable; 
    private DecodeEditEncodeTest mTest; 
    private VideoEditWrapper(DecodeEditEncodeTest test) { 
     mTest = test; 
    } 
    @Override 
    public void run() { 
     try { 
      mTest.videoEditTest(); 
     } catch (Throwable th) { 
      mThrowable = th; 
     } 
    } 
    /** Entry point. */ 
    public static void runTest(DecodeEditEncodeTest obj) throws Throwable { 
     VideoEditWrapper wrapper = new VideoEditWrapper(obj); 
     Thread th = new Thread(wrapper, "codec test"); 
     th.start(); 
     th.join(); 
     if (wrapper.mThrowable != null) { 
      throw wrapper.mThrowable; 
     } 
    } 
}
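The same pattern can be sketched without the Android test harness: run the work on a freshly created thread (which has no Looper unless you give it one), join it, and re-throw anything it caught. The class `NoLooperWrapper` and method `runOffLooper` below are hypothetical names for illustration, not part of Bigflake's code:

```java
// Sketch: run arbitrary work on a plain (non-Looper) thread and propagate
// any failure back to the caller, mirroring VideoEditWrapper above.
public class NoLooperWrapper implements Runnable {
    private final Runnable mWork;   // e.g. the prepare/encode loop
    private Throwable mThrowable;   // error captured on the worker thread

    private NoLooperWrapper(Runnable work) {
        mWork = work;
    }

    @Override
    public void run() {
        try {
            mWork.run();
        } catch (Throwable th) {
            // Stash it; uncaught exceptions on worker threads would
            // otherwise be lost to the caller.
            mThrowable = th;
        }
    }

    /** Runs work on a fresh thread (no Looper) and blocks until done. */
    public static void runOffLooper(Runnable work) throws Throwable {
        NoLooperWrapper wrapper = new NoLooperWrapper(work);
        Thread th = new Thread(wrapper, "codec work");
        th.start();
        th.join();
        if (wrapper.mThrowable != null) {
            throw wrapper.mThrowable;
        }
    }
}
```

In the asker's case, the body passed to `runOffLooper` would be the SurfaceTexture setup plus the capture loop, so the "frame available" callback arrives on the main Looper while the worker thread waits.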