arrays - C# Buffer explanation
This might be a beginner's question. I've been reading about the Buffer class, and I'm finding it hard to understand.
This is the sample from the MSDN page on the subject (just a little smaller):
using System;

class SetByteDemo
{
    // Display the array contents in hexadecimal.
    public static void DisplayArray(Array arr, string name)
    {
        // Get the array element width; use it to build the format string.
        int elemWidth = Buffer.ByteLength(arr) / arr.Length;
        string format = String.Format(" {{0:X{0}}}", 2 * elemWidth);

        // Display the array elements from right to left.
        Console.Write("{0,7}:", name);
        for (int loopX = arr.Length - 1; loopX >= 0; loopX--)
            Console.Write(format, arr.GetValue(loopX));
        Console.WriteLine();
    }

    public static void Main()
    {
        // This array will be modified by SetByte.
        short[] shorts = new short[2];

        Console.WriteLine("Initial values of arrays:\n");

        // Display the initial values of the array.
        DisplayArray(shorts, "shorts");

        Console.WriteLine("\n" +
            "Array values after setting byte 1 = 1 and byte 3 = 10\n");
        Buffer.SetByte(shorts, 1, 1);
        Buffer.SetByte(shorts, 3, 10);

        // Display the array again.
        DisplayArray(shorts, "shorts");
        Console.ReadKey();
    }
}
SetByte should be easy to understand. If I print the shorts array before doing the SetByte operation, the array looks like this:
{short[2]}
  [0]: 0
  [1]: 0
After doing the first Buffer.SetByte(shorts, 1, 1); the array becomes:
{short[2]}
  [0]: 256
  [1]: 0
And after setting Buffer.SetByte(shorts, 3, 10); the array becomes:
{short[2]}
  [0]: 256
  [1]: 2560
At the end, the example prints the array from right to left:
0A00 0100
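Those hex digits do line up with the decimal values shown by the debugger; here is a quick check of just the formatting (this snippet is mine, not part of the MSDN sample):

using System;

class HexCheck
{
    static void Main()
    {
        // 2560 == 0x0A00 and 256 == 0x0100; DisplayArray prints index 1 first, then index 0.
        Console.Write(" {0:X4}", (short)2560);
        Console.Write(" {0:X4}", (short)256);
        Console.WriteLine();   // output so far: " 0A00 0100"
    }
}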
I don't understand how this works. Can anyone give me some information about this?
The .NET primitive types use little-endianness. That means the first byte (the 0th, actually) of a short, int, etc. contains the least significant bits.
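A quick way to see that byte order for yourself (a minimal sketch, not part of the original sample; BitConverter is only used here for illustration):

using System;

class EndianCheck
{
    static void Main()
    {
        // BitConverter exposes the raw bytes of a value in the machine's byte order.
        byte[] bytes = BitConverter.GetBytes((short)256);   // 256 == 0x0100

        // On a little-endian machine this prints "0 1": the low-order byte comes first.
        Console.WriteLine(string.Join(" ", bytes));
        Console.WriteLine(BitConverter.IsLittleEndian);     // True on x86/x64
    }
}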
After the two SetByte calls, the array seen as a byte[] is:
0, 1, 0, 10
Interpreted as a short[], that is:
0 + 1*256 = 256, 0 + 10*256 = 2560
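The same interpretation can be verified in code (a minimal sketch, assuming the shorts array and the two SetByte calls from the sample above):

using System;

class Check
{
    static void Main()
    {
        short[] shorts = new short[2];
        Buffer.SetByte(shorts, 1, 1);
        Buffer.SetByte(shorts, 3, 10);

        // Dump the underlying bytes: prints "0 1 0 10".
        for (int i = 0; i < Buffer.ByteLength(shorts); i++)
            Console.Write("{0} ", Buffer.GetByte(shorts, i));
        Console.WriteLine();

        // Little-endian: the second byte of each short is the high-order byte.
        Console.WriteLine(0 + 1 * 256);    // 256  == shorts[0]
        Console.WriteLine(0 + 10 * 256);   // 2560 == shorts[1]
    }
}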