C# - Noisy audio clip after decoding from base64


I encoded a WAV file in Base64 (audioclipname.txt in Resources/Sounds).

Here is the source wave file.

Then I tried to decode it, create an AudioClip from it, and play it like this:

public static void CreateAudioClip()
{
    string s = Resources.Load<TextAsset>("sounds/audioclipname").text;

    byte[] bytes = System.Convert.FromBase64String(s);
    float[] f = ConvertByteToFloat(bytes);

    AudioClip audioClip = AudioClip.Create("testSound", f.Length, 2, 44100, false, false);
    audioClip.SetData(f, 0);

    AudioSource audioSource = GameObject.FindObjectOfType<AudioSource>();
    audioSource.PlayOneShot(audioClip);
}

private static float[] ConvertByteToFloat(byte[] array)
{
    float[] floatArr = new float[array.Length / 4];

    for (int i = 0; i < floatArr.Length; i++)
    {
        if (BitConverter.IsLittleEndian)
            Array.Reverse(array, i * 4, 4);

        floatArr[i] = BitConverter.ToSingle(array, i * 4);
    }

    return floatArr;
}

Everything works fine, except the sound is just noise.

I found this answer here on Stack Overflow, but it doesn't solve the problem.

Here are the details of the WAV file in Unity3D:

(screenshot: WAV file import details in Unity3D)

Does anyone know what the problem is here?

Edit:

I wrote out two binary files, one after decoding from Base64 and a second after the final conversion, and compared them with the original binary WAV file:

(screenshot: hex comparison of the decoded and converted files)

As you can see, the file was encoded correctly, because decoding it and writing the file out like this:

string sCat = Resources.Load<TextAsset>("sounds/test").text;

byte[] bCat = System.Convert.FromBase64String(sCat);
System.IO.File.WriteAllBytes("Assets/just_decoded.wav", bCat);

gave identical files. All the files have the same length.

But the final one is wrong, so the problem is somewhere in the conversion to the float array. I don't understand what is wrong.

Edit:

Here is the code for writing out final.wav:

string sCat = Resources.Load<TextAsset>("sounds/test").text;

byte[] bCat = System.Convert.FromBase64String(sCat);
float[] f = ConvertByteToFloat(bCat);

byte[] byteArray = new byte[f.Length * 4];
Buffer.BlockCopy(f, 0, byteArray, 0, byteArray.Length);

System.IO.File.WriteAllBytes("Assets/final.wav", byteArray);

The wave file you are trying to play (meow.wav) has the following properties:

  • PCM
  • 2 channels
  • 44100 Hz
  • signed 16-bit little-endian

Your main mistake is interpreting the binary data as if it already represented floats. That is what BitConverter.ToSingle() does.

But what you need to do is build a signed 16-bit little-endian value (as specified in the wave file header) from each pair of bytes, cast it to float, and normalize it. Each two bytes make one sample in the case of your file (16-bit!), not four bytes. The data is little-endian (s16le), so you would only have to swap bytes if the host machine weren't little-endian too.

This is the corrected conversion function:

private static float[] ConvertByteToFloat(byte[] array)
{
    // Each sample is 2 bytes (16-bit), so the float array is half as long.
    float[] floatArr = new float[array.Length / 2];

    for (int i = 0; i < floatArr.Length; i++)
    {
        // Read a signed 16-bit sample and normalize it to [-1.0, 1.0).
        floatArr[i] = BitConverter.ToInt16(array, i * 2) / 32768f;
    }

    return floatArr;
}

And you should skip over the header of the wave file (the real audio data starts at offset 44).
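As a sketch of that adjustment (class and method names are my own, and it assumes the canonical 44-byte PCM header with no extra chunks before the data):

```csharp
using System;

static class WavConversion
{
    // Sketch: convert only the audio payload of a 16-bit PCM WAV file,
    // skipping the canonical 44-byte header. Assumes no extra chunks
    // sit between the header and the sample data (a simplification).
    public static float[] ConvertWavToFloat(byte[] wav)
    {
        const int headerSize = 44;
        float[] samples = new float[(wav.Length - headerSize) / 2];

        for (int i = 0; i < samples.Length; i++)
        {
            // Signed 16-bit little-endian sample, normalized to [-1.0, 1.0).
            samples[i] = BitConverter.ToInt16(wav, headerSize + i * 2) / 32768f;
        }

        return samples;
    }
}
```

The resulting array can then be fed to AudioClip.SetData() as in the question.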

For a clean solution, you would have to interpret the wave header correctly and adapt your operations according to what is specified there (or bail out if it contains unsupported parameters). For example, the sample format (bits per sample and endianness), the sample rate, and the number of channels must be taken care of.
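As a sketch of reading those parameters (class and method names are my own; the offsets follow the standard 44-byte RIFF/WAVE layout, while real-world files may carry extra chunks, so a robust parser should walk the chunk list instead of using fixed offsets):

```csharp
using System;

static class WavHeader
{
    // Sketch: read the format parameters from a canonical RIFF/WAVE
    // PCM header and reject anything this converter cannot handle.
    public static void Parse(byte[] wav,
        out short channels, out int sampleRate, out short bitsPerSample)
    {
        short audioFormat = BitConverter.ToInt16(wav, 20); // 1 = PCM
        channels      = BitConverter.ToInt16(wav, 22);
        sampleRate    = BitConverter.ToInt32(wav, 24);
        bitsPerSample = BitConverter.ToInt16(wav, 34);

        if (audioFormat != 1 || bitsPerSample != 16)
            throw new NotSupportedException("only 16-bit PCM is handled here");
    }
}
```

The channels and sampleRate values read here are what should be passed to AudioClip.Create() instead of hard-coding 2 and 44100.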

