
I'm building an online web application that renders video onto a canvas and then records that canvas using canvas.captureStream() and MediaRecorder. The problem is that the canvas freezes when the user switches tabs or minimizes the window. I'm using a WebWorker-based setInterval (HackTimer.js) to keep my animation running. According to Chromium, no fix is available yet: https://bugs.chromium.org/p/chromium/issues/detail?id=639105 (the canvas is not repainted while the tab is inactive (WebGL)).

Can anyone suggest a workaround? I tried opening a new window that can't be minimized, but without success. The recording only stalls when the tab is switched or the window is minimized.
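For context, my setup is roughly the following (a minimal sketch, not my exact code; the element lookup and the 30 fps capture rate are illustrative):

const canvas = document.querySelector('canvas');  // the canvas the video is rendered onto
const stream = canvas.captureStream(30);          // capture rate is illustrative
const recorder = new MediaRecorder(stream);
const chunks = [];

recorder.ondataavailable = e => chunks.push(e.data);
recorder.onstop = () => {
  const blob = new Blob(chunks, { type: 'video/webm' });
  const video = document.createElement('video');
  video.controls = true;
  video.src = URL.createObjectURL(blob);
  document.body.appendChild(video);
};

recorder.start();
// While the tab is hidden or the window is minimized, the (WebGL) canvas stops
// repainting, so the captured stream freezes for that span.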


Possible duplicate of [CanvasCaptureMediaStream/MediaRecorder frame synchronization](https://stackoverflow.com/questions/40687010/canvascapturemediastream-mediarecorder-frame-synchronization) – Kaiido


P.S. to close-voters: I know the questions look different, but the core issue is almost the same (rAF throttling over there too), and the solution will probably work for you as well. – Kaiido


The rAF throttling has already been eliminated by using a WebWorker. But I'm trying your solution now, so please don't close it. Also, this isn't about a hidden canvas but about an inactive tab. –
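What I mean by the WebWorker timer, roughly (a sketch of the HackTimer.js-style idea, not its actual code; renderFrame stands in for my animation step):

// timers running inside a Worker are not throttled when the tab is hidden,
// so the worker posts a "tick" back to the page on every interval
const workerSrc = 'setInterval(function () { postMessage("tick"); }, 1000 / 60);';
const worker = new Worker(URL.createObjectURL(new Blob([workerSrc], { type: 'text/javascript' })));

worker.onmessage = function () {
  renderFrame(); // placeholder for the animation step that rAF used to drive
};

This keeps the loop ticking, but Chrome still skips the actual WebGL repaint while the tab is inactive, which is the bug above. –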

Answer


NB (the freeze only happens when the tab is switched or minimized):
Now that the question has been edited to specifically treat the case of a webgl context, it may not be an exact duplicate of this previous answer, which indeed doesn't work for webgl contexts; but that is only because of a chrome bug...

So this answer will show you how to work around that bug while waiting for a fix from chrome.

The linked answer makes use of the WebAudio API's timing methods to create a timed loop that depends neither on the screen refresh rate nor on the window/tab visibility.

But as said above, this currently doesn't work with webgl contexts in chrome.

The simple workaround is to use an offscreen 2D context as the stream's source and to draw our webgl canvas onto that 2D context:

function startRecording(webgl_renderer, render_func) {
  // create a clone of the webgl canvas
  var canvas = webgl_renderer.domElement.cloneNode();
  // init a 2D context on it
  var ctx = canvas.getContext('2d');

  function anim() {
    // render the webgl animation
    render_func();
    // draw the webgl canvas onto our 2D one
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.drawImage(webgl_renderer.domElement, 0, 0);
  }

  var fps = 60;
  // start our loop at 60fps
  var stopAnim = audioTimerLoop(anim, 1000 / fps);
  // maximum stream rate set to 60 fps
  var cStream = canvas.captureStream(fps);

  let chunks = [];
  var recorder = new MediaRecorder(cStream);
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.onstop = e => {
    // we can stop our loop
    stopAnim();
    var url = URL.createObjectURL(new Blob(chunks));
    var v = document.createElement('video');
    v.src = url;
    v.controls = true;
    document.body.appendChild(v);
  };
  recorder.start();

  // stop the recorder after 20s; try to change tab during this time
  setTimeout(function() {
    recorder.stop();
  }, 20000);

  btn.parentNode.removeChild(btn);
}
 

 

 
/*
  An alternative timing loop, based on AudioContext's clock

  @arg callback : a callback function,
      with the audioContext's currentTime passed as unique argument
  @arg frequency : float in ms
  @returns : a stop function
*/
function audioTimerLoop(callback, frequency) {

  var freq = frequency / 1000; // AudioContext time parameters are in seconds
  var aCtx = new AudioContext();
  // Chrome needs our oscillator node to be attached to the destination,
  // so we create a silent Gain Node
  var silence = aCtx.createGain();
  silence.gain.value = 0;
  silence.connect(aCtx.destination);

  var stopped = false; // a flag to know when we'll stop the loop

  function onOSCend() {
    var osc = aCtx.createOscillator();
    osc.onended = onOSCend; // so we can loop
    osc.connect(silence);
    osc.start(0); // start it now
    osc.stop(aCtx.currentTime + freq); // stop it next frame
    callback(aCtx.currentTime); // one frame is done
    if (stopped) { // user broke the loop
      osc.onended = function() {
        aCtx.close(); // clear the audioContext
      };
    }
  }

  onOSCend();

  // return a function to stop our loop
  return function() {
    stopped = true;
  };
}
 

 
/* global THREE */
/* Note that all rAF loops have been removed
   since they're now handled by our 'audioTimerLoop' */

(function() {
  'use strict';

  var WIDTH = 500, HEIGHT = 500;
  var scene = new THREE.Scene();
  var camera = new THREE.PerspectiveCamera(75, WIDTH / HEIGHT, 0.1, 1000);

  var renderer = new THREE.WebGLRenderer();
  renderer.setSize(WIDTH, HEIGHT);
  document.body.appendChild(renderer.domElement);

  var geometry = new THREE.CubeGeometry(5, 5, 5);
  var material = new THREE.MeshLambertMaterial({
    color: 0x00fff0
  });
  var cube = new THREE.Mesh(geometry, material);
  scene.add(cube);

  camera.position.z = 12;

  var pointLight = new THREE.PointLight(0xFFFFFF);
  pointLight.position.x = 10;
  pointLight.position.y = 50;
  pointLight.position.z = 130;
  scene.add(pointLight);

  var render = function() {
    var delta = Math.random() * (0.06 - 0.02) + 0.02;
    cube.rotation.x += delta;
    cube.rotation.y += delta;
    cube.rotation.z -= delta;
    renderer.render(scene, camera);
  };

  render();
  console.clear();

  btn.onclick = function() { startRecording(renderer, render); };
}());
body {
  margin: 0;
  background: #000;
}
button {
  position: absolute;
  top: 0;
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/85/three.min.js"></script>
<!-- Mobile devices need a user interaction to start the WebAudio API -->
<button id="btn">Start</button>


Awesome hack. But drawImage is a bottleneck for me, since it takes about 80-90ms. I'd like to achieve at least 8 fps for the recording, but with this approach a lot of frames get skipped even at a very slow frame rate. Can you suggest an optimization? Can we bind the 2D canvas and the GL canvas, or use context.readPixels to speed up the process? –


@AmriteshAnand, 80ms oO!? What's the size of your canvas? Using the same animation as in the snippet, but with a 2500*2500 canvas, I get a maximum of 10ms and an average of 1.22ms for `clearRect + drawImage`. I'm not at 60 fps either, but the culprit is the webgl part, not drawImage. And AFAIK, drawImage is the fastest way to draw a webgl canvas onto a 2D one. – Kaiido
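For reference, one way to time that part (a hypothetical helper, not taken from the snippet above):

function timedDraw(ctx, glCanvas) {
  const t0 = performance.now();
  ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
  ctx.drawImage(glCanvas, 0, 0);
  console.log('clearRect + drawImage: ' + (performance.now() - t0).toFixed(2) + ' ms');
}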


Thanks a lot, I tried it on a Mac and it gives me very good performance. –
