
Downscale issue in an NV21 -> ARGB -> NV21 conversion

I have to provide a YUV (NV21) byte array to a recognition solution, and I would like to downscale the preview frames to cut the processing time.

With solutions gathered from here and there, I managed a 1:1 conversion and got recognition hits. But as soon as I downscale the intermediate bitmap, I get no results at all, even when shrinking it to just 95%.

Any help would be appreciated.

So, every 400 ms I grab a preview frame and convert it asynchronously: I use RenderScript to convert it to ARGB, scale it down, and convert it back.

// Camera callback
@Override
public void onPreviewFrame(byte[] frame, Camera camera) {
    if (camera != null) {
        // Debounce
        if ((System.currentTimeMillis() - mStart) > 400) {
            mStart = System.currentTimeMillis();

            Camera.Size size = camera.getParameters().getPreviewSize();
            new FrameScaleAsyncTask(frame, size.width, size.height).execute();
        }
    }

    if (mCamera != null) {
        mCamera.addCallbackBuffer(mBuffer);
    }
}
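
For context, the buffered callback above assumes mBuffer and the preview callback were registered beforehand; a minimal setup sketch (not shown in the snippet above) would be:

// Presumed camera setup: a reusable callback buffer sized for one NV21
// preview frame, registered through the buffered preview callback API.
Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
int bufferSize = previewSize.width * previewSize.height
        * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
mBuffer = new byte[bufferSize];
mCamera.addCallbackBuffer(mBuffer);
mCamera.setPreviewCallbackWithBuffer(this);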

// In FrameScaleAsyncTask
@Override
protected Void doInBackground(Void... params) {
    // Create YUV type for in-allocation
    Type yuvType = new Type.Builder(mRenderScript, Element.U8(mRenderScript))
            .setX(mFrame.length)
            .create();
    mAllocationIn = Allocation.createTyped(mRenderScript, yuvType, Allocation.USAGE_SCRIPT);

    // Create ARGB-8888 type for out-allocation
    Type rgbType = new Type.Builder(mRenderScript, Element.RGBA_8888(mRenderScript))
            .setX(mWidth)
            .setY(mHeight)
            .create();
    mAllocationOut = Allocation.createTyped(mRenderScript, rgbType, Allocation.USAGE_SCRIPT);

    // Copy frame data into in-allocation
    mAllocationIn.copyFrom(mFrame);

    // Set script input and fire!
    mScript.setInput(mAllocationIn);
    mScript.forEach(mAllocationOut);

    // Create a bitmap of camera preview size (see camera setup) and copy the out-allocation into it
    Bitmap bitmap = Bitmap.createBitmap(mWidth, mHeight, Bitmap.Config.ARGB_8888);
    mAllocationOut.copyTo(bitmap);

    // Scale bitmap down
    double scaleRatio = 1;
    Bitmap scaledBitmap = Bitmap.createScaledBitmap(
            bitmap,
            (int) (bitmap.getWidth() * scaleRatio),
            (int) (bitmap.getHeight() * scaleRatio),
            false
    );
    bitmap.recycle();

    int size = scaledBitmap.getRowBytes() * scaledBitmap.getHeight();
    int scaledWidth = scaledBitmap.getWidth();
    int scaledHeight = scaledBitmap.getHeight();
    int[] pixels = new int[scaledWidth * scaledHeight];
    // Put bitmap pixels into an int array
    scaledBitmap.getPixels(pixels, 0, scaledWidth, 0, 0, scaledWidth, scaledHeight);

    mFrame = new byte[pixels.length * 3 / 2];
    ImageHelper.encodeYUV420SPAlt(mFrame, pixels, scaledWidth, scaledHeight);

    return null;
}
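
The mRenderScript and mScript fields are not shown above; judging from the setInput()/forEach() calls, this is the stock YUV-to-RGB intrinsic, so a minimal setup sketch would be:

// Presumed RenderScript setup (assumption, not part of the original snippet):
// the built-in ScriptIntrinsicYuvToRGB handles the NV21 -> ARGB conversion.
mRenderScript = RenderScript.create(context);
mScript = ScriptIntrinsicYuvToRGB.create(mRenderScript, Element.U8_4(mRenderScript));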

The RGB-to-YUV routine (see: this answer):

public static void encodeYUV420SPAlt(byte[] yuv420sp, int[] argb, int width, int height) {
    final int frameSize = width * height;

    int yIndex = 0;
    int uvIndex = frameSize;

    int a, R, G, B, Y, U, V;
    int index = 0;
    for (int j = 0; j < height; j++) {
        for (int i = 0; i < width; i++) {

            a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
            R = (argb[index] & 0xff0000) >> 16;
            G = (argb[index] & 0xff00) >> 8;
            B = (argb[index] & 0xff) >> 0;

            // well known RGB to YUV algorithm
            Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
            U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
            V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

            // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2
            // meaning for every 4 Y pixels there are 1 V and 1 U. Note the sampling is every other
            // pixel AND every other scanline.
            yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
            if (j % 2 == 0 && index % 2 == 0) {
                yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
                yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
            }

            index++;
        }
    }
}
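
One detail worth noting: the layout above assumes even dimensions. The interleaved VU plane takes 2 * ceil(width / 2) * ceil(height / 2) bytes, so with an odd scaled width or height the full NV21 frame needs more than the pixels.length * 3 / 2 bytes allocated in doInBackground(). Since an arbitrary scale ratio can produce odd sizes, a small guard before scaling keeps the buffer math exact (a hypothetical addition, not in the original code):

// Hypothetical guard: force the scaled size to even values so that
// pixels.length * 3 / 2 is exactly the size of the resulting NV21 buffer.
int evenWidth = ((int) (bitmap.getWidth() * scaleRatio)) & ~1;
int evenHeight = ((int) (bitmap.getHeight() * scaleRatio)) & ~1;
Bitmap scaledBitmap = Bitmap.createScaledBitmap(bitmap, evenWidth, evenHeight, false);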

Answer


I ended up resizing my image (as an OpenCV Mat) directly in C++. It is more convenient and faster that way.

Size size(correctedWidth, correctedHeight); 
Mat dst; 
resize(image, dst, size);
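
On the Java side this means handing the NV21 buffer to native code over JNI; a hypothetical bridge (all names made up for illustration) could look like:

// Hypothetical JNI declaration: the NV21 frame is passed to native code,
// resized there with OpenCV, and the resized NV21 bytes are returned.
static { System.loadLibrary("frame_resize"); }
public static native byte[] resizeNv21(byte[] nv21, int width, int height,
                                       int targetWidth, int targetHeight);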