C# - Converting UInt64 to a binary array
I am having a problem with a method I wrote to convert a UInt64 to a binary array. Some numbers are getting an incorrect binary representation.
Results:

correct:   999 = 1111100111
correct:   18446744073709551615 = 1111111111111111111111111111111111111111111111111111111111111111
incorrect? 18446744073709551614 = 0111111111111111111111111111111111111111111111111111111111111110

According to an online converter, the binary value of 18446744073709551614 should be 1111111111111111111111111111111111111111111111111111111111111110.
```csharp
public static int[] GetBinaryArray(UInt64 n)
{
    if (n == 0)
    {
        return new int[2] { 0, 0 };
    }
    var val = (int)(Math.Log(n) / Math.Log(2));
    if (val == 0)
        val++;
    var arr = new int[val + 1];
    for (int i = val, j = 0; i >= 0 && j <= val; i--, j++)
    {
        if ((n & ((UInt64)1 << i)) != 0)
            arr[j] = 1;
        else
            arr[j] = 0;
    }
    return arr;
}
```
FYI: This is not a homework assignment. I require a method to convert an integer to a binary array for encryption purposes, hence the need for an array of bits. Many of the solutions I have found on this site convert an integer to a string representation of a binary number, which is useless to me, so I came up with a mashup of various other methods.
An explanation of why the method works for some numbers and not others would be helpful. Yes, I used Math.Log and it is slow, but performance can be fixed later.
EDIT: And yes, I need the line that uses Math.Log, because the array is not always 64 bits long; for example, if the number is 4 (100 in binary) the array length is 3. That is a requirement of the application.
It's not the returned array for the input UInt64.MaxValue - 1 that's wrong; it seems the one for UInt64.MaxValue is wrong. That array is 65 elements long, which is intuitively wrong, because UInt64.MaxValue must fit in 64 bits.
Firstly, instead of taking the natural log and dividing it by the natural log of 2, you can take the log base 2 directly.
Secondly, you need Math.Ceiling on the returned value, because you need the value to fit inside that number of bits. Discarding the remainder with the cast to int meant you had to arbitrarily use val + 1 when declaring the result array. That is correct for most scenarios, but one where it is not correct is UInt64.MaxValue: adding 1 to the number of bits necessary gives a 65-element array.
Thirdly, and finally, you cannot usefully left-shift by 64 bits (C# masks the shift count of a 64-bit operand, so a shift by 64 behaves like a shift by 0), hence the i = val - 1 in the loop initialization.
I haven't tested this exhaustively...
```csharp
public static int[] GetBinaryArray(UInt64 n)
{
    if (n == 0)
    {
        return new int[2] { 0, 0 };
    }
    var val = (int)Math.Ceiling(Math.Log(n, 2));
    if (val == 0)
        val++;
    var arr = new int[val];
    for (int i = val - 1, j = 0; i >= 0 && j <= val; i--, j++)
    {
        if ((n & ((UInt64)1 << i)) != 0)
            arr[j] = 1;
        else
            arr[j] = 0;
    }
    return arr;
}
```
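As a side note, the floating-point log can be avoided entirely by computing the bit length with integer shifts. This also sidesteps exact powers of two, where Math.Ceiling(Math.Log(n, 2)) gives one bit too few (e.g. n = 4 needs 3 bits, but ceil(log2(4)) = 2). A hedged alternative sketch, not the code from the answer above:

```csharp
public static int[] GetBinaryArrayNoLog(UInt64 n)
{
    // Count the bits needed by shifting until n is exhausted.
    int bits = 0;
    for (UInt64 t = n; t != 0; t >>= 1)
        bits++;
    if (bits == 0)
        bits = 2; // keep the original's two-element array for n == 0

    // Most significant bit first, matching the original layout.
    var arr = new int[bits];
    for (int i = bits - 1, j = 0; i >= 0; i--, j++)
        arr[j] = (int)((n >> i) & 1);
    return arr;
}
```

For example, GetBinaryArrayNoLog(4) yields { 1, 0, 0 }, and GetBinaryArrayNoLog(UInt64.MaxValue) yields 64 ones.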