2013-03-28

Passing a sound (wav) file from Objective-C to JavaScript

I am recording a sound file (wav format) in Objective-C. I want to pass it back to JavaScript via Objective-C's stringByEvaluatingJavaScriptFromString. I am thinking that I will have to convert the wav file to a base64 string to pass it to this function, and then convert the base64 string back into (wav/blob) format in JavaScript to feed it to an audio tag to play it. I don't know how to do that, and I am also not sure whether this is the best way to pass the wave file back to JavaScript. Any ideas would be appreciated.
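For reference, the base64 route described above would look roughly like this on the JavaScript side. This is a minimal sketch, assuming base64ToWavBlob (a name of my own, not an existing API) is called with whatever string the native layer hands to stringByEvaluatingJavaScriptFromString:

```javascript
// Decode a base64 string handed over by the native layer into a Blob
// that an <audio> element can play through an object URL.
function base64ToWavBlob(base64Wav) {
  const binary = atob(base64Wav);              // base64 -> binary string
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);           // copy raw byte values
  }
  return new Blob([bytes], { type: 'audio/wav' });
}

// In the page you would then do something like:
//   audio.src = URL.createObjectURL(base64ToWavBlob(base64Wav));
```

As the accepted answer shows, though, pushing the whole file through one JavaScript string is not the only option.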

Answers


Well, this was not as straightforward as I expected. So here is how I achieved it.

Step 1: I record the audio in caf format using AVAudioRecorder.

NSArray *dirPaths; 
NSString *docsDir; 

dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); 

docsDir = [dirPaths objectAtIndex:0]; 

soundFilePath = [docsDir stringByAppendingPathComponent:@"sound.caf"]; 

NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath]; 

NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
    [NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey, 
    [NSNumber numberWithInt:16], AVEncoderBitRateKey, 
    [NSNumber numberWithInt:2], AVNumberOfChannelsKey, 
    [NSNumber numberWithFloat:44100], AVSampleRateKey, 
    nil]; 

NSError *error = nil; 

audioRecorder = [[AVAudioRecorder alloc] 
       initWithURL:soundFileURL 
       settings:recordSettings error:&error]; 

if(error) 
{ 
    NSLog(@"error: %@", [error localizedDescription]); 
} else { 
    [audioRecorder prepareToRecord]; 
} 

After that, you just need to call audioRecorder.record to record the audio. It will be recorded in caf format. If you want to see my recordAudio function, here it is.

-(void) recordAudio 
{ 
    if(!audioRecorder.recording) 
    { 
     _playButton.enabled = NO; 
     _recordButton.title = @"Stop"; 
     [audioRecorder record]; 
     [self animate1:nil finished:nil context:nil]; 
    } 
    else 
    { 
     [_recordingImage stopAnimating]; 
     [audioRecorder stop]; 
     _playButton.enabled = YES; 
     _recordButton.title = @"Record"; 
    } 
} 

Step 2: Convert the caf format to wav format. I was able to do this using the following function.

-(BOOL)exportAssetAsWaveFormat:(NSString*)filePath 
{ 
    NSError *error = nil; 

NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys: 
           [NSNumber numberWithFloat:44100.0], AVSampleRateKey, 
           [NSNumber numberWithInt:2], AVNumberOfChannelsKey, 
           [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey, 
           [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey, 
           [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey, 
           [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey, 
           [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved, 
           [NSData data], AVChannelLayoutKey, nil]; 

NSString *audioFilePath = filePath; 
AVURLAsset * URLAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:audioFilePath] options:nil]; 

if (!URLAsset) return NO; 

AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error]; 
if (error) return NO; 

NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio]; 
if (![tracks count]) return NO; 

AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput 
               assetReaderAudioMixOutputWithAudioTracks:tracks 
               audioSettings:audioSetting]; 

if (![assetReader canAddOutput:audioMixOutput]) return NO; 

[assetReader addOutput:audioMixOutput]; 

if (![assetReader startReading]) return NO; 



NSString *title = @"WavConverted"; 
NSArray *docDirs = NSSearchPathForDirectoriesInDomains (NSDocumentDirectory, NSUserDomainMask, YES); 
NSString *docDir = [docDirs objectAtIndex: 0]; 
NSString *outPath = [[docDir stringByAppendingPathComponent:title] 
        stringByAppendingPathExtension:@"wav"]; 

// only fail if an existing file could not be removed (a missing file is fine) 
if([[NSFileManager defaultManager] fileExistsAtPath:outPath] && 
   ![[NSFileManager defaultManager] removeItemAtPath:outPath error:NULL]) 
{ 
    return NO; 
} 

soundFilePath = outPath; 

NSURL *outURL = [NSURL fileURLWithPath:outPath]; 
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL 
                 fileType:AVFileTypeWAVE 
                 error:&error]; 
if (error) return NO; 

AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio 
                      outputSettings:audioSetting]; 
assetWriterInput.expectsMediaDataInRealTime = NO; 

if (![assetWriter canAddInput:assetWriterInput]) return NO; 

[assetWriter addInput:assetWriterInput]; 

if (![assetWriter startWriting]) return NO; 


//[assetReader retain]; 
//[assetWriter retain]; 

[assetWriter startSessionAtSourceTime:kCMTimeZero]; 

dispatch_queue_t queue = dispatch_queue_create("assetWriterQueue", NULL); 

[assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{ 

    NSLog(@"start"); 

    while (1) 
    { 
     if ([assetWriterInput isReadyForMoreMediaData] && (assetReader.status == AVAssetReaderStatusReading)) { 

      CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer]; 

      if (sampleBuffer) { 
       [assetWriterInput appendSampleBuffer:sampleBuffer]; 
       CFRelease(sampleBuffer); 
      } else { 
       [assetWriterInput markAsFinished]; 
       break; 
      } 
     } 
    } 

    [assetWriter finishWriting]; 

    //[self playWavFile]; 
    NSError *err; 
    NSData *audioData = [NSData dataWithContentsOfFile:soundFilePath options: 0 error:&err]; 
    [self.audioDelegate doneRecording:audioData]; 
    //[assetReader release ]; 
    //[assetWriter release ]; 
    NSLog(@"soundFilePath=%@",soundFilePath); 
    NSDictionary *dict = [[NSFileManager defaultManager] attributesOfItemAtPath:soundFilePath error:&err]; 
    NSLog(@"size of wav file = %@",[dict objectForKey:NSFileSize]); 
    //NSLog(@"finish"); 
}]; 

In this function, I call the audioDelegate function doneRecording with audioData, which is in wav format. Here is the code for doneRecording.

-(void) doneRecording:(NSData *)contents 
{ 
myContents = [[NSData dataWithData:contents] retain]; 
[self returnResult:alertCallbackId args:@"Recording Done.",nil]; 
} 

// Call this function when you have results to send back to javascript callbacks 
// callbackId : int comes from handleCall function 

// args: list of objects to send to the javascript callback 
- (void)returnResult:(int)callbackId args:(id)arg, ... 
{ 
    if (callbackId==0) return; 

    va_list argsList; 
    NSMutableArray *resultArray = [[NSMutableArray alloc] init]; 

    if(arg != nil){ 
    [resultArray addObject:arg]; 
    va_start(argsList, arg); 
    while((arg = va_arg(argsList, id)) != nil) 
     [resultArray addObject:arg]; 
    va_end(argsList); 
    } 

    NSString *resultArrayString = [json stringWithObject:resultArray allowScalar:YES error:nil]; 
    [self performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:) withObject:[NSString stringWithFormat:@"NativeBridge.resultForCallback(%d,%@);",callbackId,resultArrayString] waitUntilDone:NO]; 
    [resultArray release];  
} 

Step 3: Now it is time to communicate back to the JavaScript inside the UIWebView that we are done recording the audio, so that it can start accepting chunks of data from us. I am using websockets to transfer the data back to JavaScript. The data will be transferred in chunks, because the server I am using (https://github.com/benlodotcom/BLWebSocketsServer) is built using libwebsockets (http://git.warmcat.com/cgi-bin/cgit/libwebsockets/).

This is how you start the server in the delegate class.

- (id)initWithFrame:(CGRect)frame 
{ 
    if (self = [super initWithFrame:frame]) { 

     [self _createServer]; 
     [self.server start]; 
     myContents = [NSData data]; 

    // Set delegate in order to "shouldStartLoadWithRequest" to be called 
    self.delegate = self; 

    // Set non-opaque in order to make "body{background-color:transparent}" working! 
    self.opaque = NO; 

    // Instantiate JSON parser library 
    json = [SBJSON new]; 

    // load our html file 
    NSString *path = [[NSBundle mainBundle] pathForResource:@"webview-document" ofType:@"html"]; 
    [self loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:path]]]; 



    } 
    return self; 
} 
-(void) _createServer 
{ 
    /*Create a simple echo server*/ 
    self.server = [[BLWebSocketsServer alloc] initWithPort:9000 andProtocolName:echoProtocol]; 
    [self.server setHandleRequestBlock:^NSData *(NSData *data) { 

     NSString *convertedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding]; 
     NSLog(@"Received Request...%@",convertedString); 

     if([convertedString isEqualToString:@"start"]) 
     { 
      NSLog(@"myContents size: %d",[myContents length]); 

      int contentSize = [myContents length]; 
      int chunkSize = 64*1023; 
      chunksCount = (contentSize/chunkSize)+1; 

      NSLog(@"ChunkSize=%d",chunkSize); 
      NSLog(@"chunksCount=%d",chunksCount); 

      chunksArray = [[NSMutableArray array] retain]; 

      int index = 0; 
      //NSRange chunkRange; 

      for(int i=1;i<=chunksCount;i++) 
      { 

       if(i==chunksCount) 
       { 
        NSRange chunkRange = {index,contentSize-index}; 
        NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,contentSize-index); 
        NSData *dataChunk = [myContents subdataWithRange:chunkRange]; 
        [chunksArray addObject:dataChunk]; 
        break; 
       } 
       else 
       { 
        NSRange chunkRange = {index, chunkSize}; 
        NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,chunkSize); 
        NSData *dataChunk = [myContents subdataWithRange:chunkRange]; 
        index += chunkSize; 
        [chunksArray addObject:dataChunk]; 
       } 
      } 

      return [chunksArray objectAtIndex:0]; 

     } 
     else 
     { 
      int chunkNumber = [convertedString intValue]; 

      if(chunkNumber>0 && (chunkNumber+1)<=chunksCount) 
      { 
       return [chunksArray objectAtIndex:(chunkNumber)]; 
      } 


     } 

     NSLog(@"Releasing Array"); 
     [chunksArray release]; 
     chunksCount = 0; 
     return [NSData dataWithBase64EncodedString:@"Stop"]; 
    }]; 
} 
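For clarity, the chunking that handleRequestBlock performs above is easier to see stripped of the Objective-C plumbing. The JavaScript below mirrors the same idea (fixed-size chunks of 64*1023 bytes, with the final chunk carrying the remainder); splitIntoChunks is an illustrative name of mine, not part of the code above:

```javascript
// Mirror of the server-side chunking: split a byte buffer into
// fixed-size pieces; the final piece carries whatever is left over.
function splitIntoChunks(bytes, chunkSize) {
  const chunks = [];
  for (let index = 0; index < bytes.length; index += chunkSize) {
    // slice() clamps the end automatically, so the last chunk is shorter.
    chunks.push(bytes.slice(index, index + chunkSize));
  }
  return chunks;
}
```

With a 200000-byte recording and chunkSize = 64*1023 = 65472, this yields four chunks of 65472, 65472, 65472 and 3584 bytes.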

The code on the JavaScript side is:

var socket; 
var chunkCount = 0; 
var soundBlob, soundUrl; 
var smallBlobs = new Array(); 

function captureMovieCallback(response) 
{ 
    if(socket) 
    { 
     try{ 
      socket.send('start'); 
     } 
     catch(e) 
     { 
      log('Socket is not valid object'); 
     } 

    } 
    else 
    { 
     log('socket is null'); 
    } 
} 

function closeSocket(response) 
{ 
    socket.close(); 
} 


function connect(){ 
    try{ 
     window.WebSocket = window.WebSocket || window.MozWebSocket; 

     socket = new WebSocket('ws://127.0.0.1:9000', 
             'echo-protocol'); 

     socket.onopen = function(){ 
     } 

     socket.onmessage = function(e){ 
      var data = e.data; 
      if(e.data instanceof ArrayBuffer) 
      { 
       log('its arrayBuffer'); 
      } 
      else if(e.data instanceof Blob) 
      { 
       if(soundBlob) 
        log('its Blob of size = '+ e.data.size + ' final blob size:'+ soundBlob.size); 

       if(e.data.size != 3) 
       { 
        //log('its Blob of size = '+ e.data.size); 
        smallBlobs[chunkCount]= e.data; 
        chunkCount = chunkCount +1; 
        socket.send(''+chunkCount); 
       } 
       else 
       { 
        //alert('End Received'); 
        try{ 
        soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" }); 
        var myURL = window.URL || window.webkitURL; 
        soundUrl = myURL.createObjectURL(soundBlob); 
        log('soundURL='+soundUrl); 
        } 
        catch(e) 
        { 
         log('Problem creating blob and url.'); 
        } 

        try{ 
         var serverUrl = 'http://10.44.45.74:8080/MyTestProject/WebRecording?record'; 
         var xhr = new XMLHttpRequest(); 
         xhr.open('POST',serverUrl,true); 
         xhr.setRequestHeader("content-type","multipart/form-data"); 
         xhr.send(soundBlob); 
        } 
        catch(e) 
        { 
         log('error uploading blob file'); 
        } 

        socket.close(); 
       } 

       //alert(JSON.stringify(msg, null, 4)); 
      } 
      else 
      { 
       log('dont know'); 
      } 
     } 

     socket.onclose = function(){ 
      //message('<p class="event">Socket Status: '+socket.readyState+' (Closed)'); 
      log('final blob size:'+soundBlob.size); 
     } 

    } catch(exception){ 
     log('<p>Error: '+exception); 
    } 
} 

function log(msg) { 
    NativeBridge.log(msg); 
} 
function stopCapture() { 
    NativeBridge.call("stopMovie", null,null); 
} 

function startCapture() { 
    NativeBridge.call("captureMovie",null,captureMovieCallback); 
} 

NativeBridge.js

var NativeBridge = { 
    callbacksCount : 1, 
    callbacks : {}, 

    // Automatically called by native layer when a result is available 
    resultForCallback : function resultForCallback(callbackId, resultArray) { 
    try { 


    var callback = NativeBridge.callbacks[callbackId]; 
    if (!callback) return; 
    console.log("calling callback for "+callbackId); 
    callback.apply(null,resultArray); 
    } catch(e) {alert(e)} 
    }, 

    // Use this in javascript to request native objective-c code 
    // functionName : string (I think the name is explicit :p) 
    // args : array of arguments 
    // callback : function with n-arguments that is going to be called when the native code returned 
    call : function call(functionName, args, callback) { 

    //alert("call"); 
    //alert('callback='+callback); 
    var hasCallback = callback && typeof callback == "function"; 
    var callbackId = hasCallback ? NativeBridge.callbacksCount++ : 0; 

    if (hasCallback) 
     NativeBridge.callbacks[callbackId] = callback; 

    var iframe = document.createElement("IFRAME"); 
    iframe.setAttribute("src", "js-frame:" + functionName + ":" + callbackId+ ":" + encodeURIComponent(JSON.stringify(args))); 
    document.documentElement.appendChild(iframe); 
    iframe.parentNode.removeChild(iframe); 
    iframe = null; 

    }, 

    log : function log(message) { 

     var iframe = document.createElement("IFRAME"); 
     iframe.setAttribute("src", "ios-log:"+encodeURIComponent(JSON.stringify("#iOS#" + message))); 
     document.documentElement.appendChild(iframe); 
     iframe.parentNode.removeChild(iframe); 
     iframe = null; 

    } 

}; 
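Stripped of the iframe plumbing, the callback-id handshake NativeBridge relies on is just a counter-keyed registry. Here is a minimal, DOM-free sketch of that pattern (registerCallback and deliverResult are illustrative names; the real file uses call and resultForCallback):

```javascript
// Minimal, DOM-free version of the callback-id pattern NativeBridge uses:
// each call registers a callback under a fresh id, and the native layer
// later invokes the id with a result array to complete it.
const bridge = {
  callbacksCount: 1,
  callbacks: {},
  registerCallback(fn) {
    const id = this.callbacksCount++;   // fresh id per call
    this.callbacks[id] = fn;
    return id;                          // this id travels to the native side
  },
  deliverResult(id, resultArray) {      // what resultForCallback does
    const cb = this.callbacks[id];
    if (cb) cb.apply(null, resultArray);
  }
};
```

The native side only ever needs the integer id and a JSON array, which is exactly what returnResult above serializes into the NativeBridge.resultForCallback(...) string.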
  1. On the JavaScript side, we ask the HTML side to call connect() on body load.

  2. Once we receive the callback (captureMovieCallback) from the startCapture function, we send a start message indicating that we are ready to accept the data.

  3. The server on the Objective-C side splits the wav audio data into small chunks of chunksize = 64*1023 and stores them in an array.

  4. It sends the first chunk back to the JavaScript side.

  5. JavaScript accepts this chunk and sends back the number of the next chunk it needs from the server.

  6. The server sends the chunk indicated by this number. This process repeats until we send the last chunk to JavaScript.

  7. At the end we send a stop message back to the JavaScript side, indicating that we are done. It is 3 bytes in size (which is used as the criterion to break this loop).

  8. Every chunk is stored as a small blob in an array. Now we create a bigger blob from these small blobs using the following line:

    soundBlob = new Blob(smallBlobs, { "type" : "audio/wav" });

    This blob is uploaded to the server, which writes it into a wav file. We can pass the url of this wav file as the src of an audio tag to replay it on the JavaScript side.

  9. We close the websocket connection after sending the blob to the server.

    Hope this is clear enough to understand.
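The request/response loop in steps 4-7 can be simulated without a socket at all. In this sketch, serveRequest plays the role of the Objective-C handler and fetchAll plays the JavaScript client; the 3-byte sentinel mirrors the e.data.size != 3 check above. All names here are illustrative, and a real chunk that happened to be exactly 3 bytes would trip the loop early, the same fragility as the original protocol:

```javascript
// Simulate the chunk ping-pong: the client sends 'start', then the index
// of the next chunk it wants, until the server replies with a 3-byte stop.
const STOP = new Uint8Array(3);               // stands in for the 3-byte "Stop"

function makeServer(chunks) {
  return function serveRequest(message) {
    if (message === 'start') return chunks[0];
    const n = parseInt(message, 10);          // chunk index being asked for
    return n > 0 && n < chunks.length ? chunks[n] : STOP;
  };
}

function fetchAll(serveRequest) {
  const received = [];
  let count = 0;
  let reply = serveRequest('start');
  while (reply.length !== 3) {                // 3 bytes == stop sentinel
    received.push(reply);
    count += 1;
    reply = serveRequest(String(count));      // ask for the next chunk
  }
  return received;
}
```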

Can you help me with this question? http://stackoverflow.com/q/43867801/3378413 –

If you just want to play the sound, you would be better off using one of iOS's native audio playback systems rather than the HTML audio tag.

I am part of a development project where we are building a system that will be used on different platforms such as Android, PC, Mac and iOS. I want to return the audio ('caf') file from Objective-C (on the iPad) to JavaScript so that the code that uploads the audio to the server can be shared across the different platforms. I want to minimize the involvement of the native platform. I know platforms like Apache Cordova do the same kind of thing, but I don't know how they do it. –

Cordova does not pass audio *data* between JavaScript and native the way you are asking; it just uses the native platform's audio playback libraries to play an audio file at a file path requested by JavaScript. –

Ben, I have posted my solution above. Its performance is not bad at all. –