JavaScript and Java - string encoding

I am trying to port the following from Java to JavaScript:

String key1 = "whatever"; 
String otherKey = "blah"; 
String key2;  
byte keyBytes[] = key1.getBytes(); 
for (int i = 0; i < keyBytes.length; i++) { 
    keyBytes[i] ^= otherKey.charAt(i % otherKey.length()); 
} 
key2 = new String(keyBytes); 

Here is what I have written:

var key1 = "whatever"; 
var other_key = "blah"; 
var key2 = ""; 
for (var i = 0; i < key1.length; ++i) 
{ 
    var ch = key1.charCodeAt(i); 
    ch ^= other_key.charAt(i % other_key.length); 
    key2 += String.fromCharCode(ch); 
} 

However, they give different answers. ...

What's the catch? Is JavaScript's string encoding different, and how do I reconcile the two?

Java bytes are 8 bits; a JavaScript char is a 16-bit Unicode code unit. – fred02138

The first part of that code is broken in multiple ways and shouldn't be used in Java, let alone ported to JavaScript. – Jon Skeet

@JonSkeet Could you please explain why it's broken? :) –
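
For context on fred02138's comment: with ASCII input the two representations happen to coincide, which is why "whatever" appears to port cleanly; anything outside ASCII diverges. (That divergence, plus the fact that Java's no-argument getBytes() uses the platform-default charset and new String(bytes) cannot round-trip arbitrary XORed bytes, is presumably what Jon Skeet's comment is getting at.) A quick JavaScript sketch of the divergence, assuming a TextEncoder-capable environment (modern browsers, Node.js):

// For ASCII, the char code and the UTF-8 byte agree:
console.log("w".charCodeAt(0)); // 119

// Outside ASCII they diverge: one 16-bit code unit vs. two UTF-8 bytes.
console.log("é".charCodeAt(0));             // 233
console.log(new TextEncoder().encode("é")); // Uint8Array [195, 169]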

Answer

You forgot a charCodeAt() in your code; it should be as follows:

var key1 = "whatever"; 
var other_key = "blah"; 
var key2 = ""; 
for (var i = 0; i < key1.length; ++i) 
{ 
    var ch = key1.charCodeAt(i); 
    ch ^= other_key.charAt(i % other_key.length).charCodeAt(0); 
    key2 += String.fromCharCode(ch); 
} 

In Java, the compound assignment ^= performs this conversion implicitly: the char operand is widened to int for the XOR, and the result is narrowed back to byte when stored into keyBytes[i]. JavaScript has no such implicit step, so the char must be converted to a number explicitly.
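
To make that narrowing explicit in the JavaScript port, a minimal variant of the loop body (my own sketch, not part of the original answer) could mask the XOR result to 8 bits; for ASCII keys the mask is a no-op, but it mirrors Java's byte semantics:

// Sketch: & 0xFF mimics Java's implicit int-to-byte narrowing after the XOR.
var ch = (key1.charCodeAt(i) ^ other_key.charCodeAt(i % other_key.length)) & 0xFF;
key2 += String.fromCharCode(ch);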

I modified the code in both Java and JavaScript to print the byte array. After running both, the results are the same:

JavaScript:

function convert() {
    var key1 = "whatever";
    var other_key = "blah";

    var key2 = "";
    var byteArray = new Array();

    for (var i = 0; i < key1.length; ++i) {
        var ch = key1.charCodeAt(i);
        ch ^= other_key.charAt(i % other_key.length).charCodeAt(0);

        byteArray.push(ch);
        key2 += String.fromCharCode(ch);
    }

    alert(byteArray);
}

Result: 21,4,0,28,7,26,4,26


Java:

// Requires: import java.util.Arrays;
static void convert() {
    String key1 = "whatever";
    String otherKey = "blah";
    String key2;
    byte keyBytes[] = key1.getBytes();
    for (int i = 0; i < keyBytes.length; i++) {
        keyBytes[i] ^= otherKey.charAt(i % otherKey.length);
    }
    System.out.println(Arrays.toString(keyBytes));
    key2 = new String(keyBytes);
}

Result: [21, 4, 0, 28, 7, 26, 4, 26]
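
A closing caveat beyond the fix above: the two versions agree only because every character in "whatever" and "blah" is ASCII. For arbitrary input, a more faithful port would XOR real UTF-8 bytes. A minimal sketch of that approach, assuming TextEncoder is available (modern browsers, Node.js):

function xorKey(key1, otherKey) {
    // Encode to UTF-8 bytes, like Java's key1.getBytes("UTF-8").
    var keyBytes = new TextEncoder().encode(key1);
    for (var i = 0; i < keyBytes.length; i++) {
        // Uint8Array stores wrap to 0..255, matching Java's byte narrowing.
        // (Assumes otherKey itself is ASCII, as in the example.)
        keyBytes[i] ^= otherKey.charCodeAt(i % otherKey.length);
    }
    return keyBytes;
}

console.log(xorKey("whatever", "blah")); // Uint8Array [21, 4, 0, 28, 7, 26, 4, 26]

Turning the XORed bytes back into a string is lossy in general (not every byte sequence is valid UTF-8), which is one more reason this scheme is fragile.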