Homework Statement
This is the call in my function:
GET_UINT32( X, input, 0 );
and this is the macro it expands to:
#define GET_UINT32(n,b,i)                         \
{                                                 \
    (n) = ( (uint32) (b)[(i)    ] << 24 )         \
        | ( (uint32) (b)[(i) + 1] << 16 )         \
        | ( (uint32) (b)[(i) + 2] <<  8 )         \
        | ( (uint32) (b)[(i) + 3]       );        \
}
3435973836 = 0xcccccccc

When this is fed into the macro, it outputs 0x4E6F7720 = 1315927840. That is, the bytes

0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc

turn into

0x4E, 0x6F, 0x77, 0x20, 0x69, 0x73, 0x20, 0x74
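
For reference, here is a minimal standalone test I put together (my own sketch, not part of the assignment; it assumes uint32 is a typedef for a 32-bit unsigned type, as in the original source, and feeds in the byte values listed above):

#include <stdio.h>

typedef unsigned int uint32;   /* assumption: unsigned int is 32 bits here */

/* Same macro as above: packs 4 consecutive bytes of b, starting at index i, */
/* into one 32-bit word n, with the first byte ending up most significant.   */
#define GET_UINT32(n,b,i)                         \
{                                                 \
    (n) = ( (uint32) (b)[(i)    ] << 24 )         \
        | ( (uint32) (b)[(i) + 1] << 16 )         \
        | ( (uint32) (b)[(i) + 2] <<  8 )         \
        | ( (uint32) (b)[(i) + 3]       );        \
}

int main(void)
{
    /* The byte values listed above ("Now is t" in ASCII). */
    unsigned char input[8] = { 0x4E, 0x6F, 0x77, 0x20, 0x69, 0x73, 0x20, 0x74 };
    uint32 X;

    GET_UINT32( X, input, 0 );
    printf("0x%08X = %u\n", X, X);   /* prints 0x4E6F7720 = 1315927840 */

    return 0;
}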
The Attempt at a Solution
I am not sure how to begin. I am taking a cryptography class for fun as a statistics major. I tried to understand the macro, but it just confused me. I want to see how ( (uint32) (b)[(i) + 1] << 16 ) changes the input.
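
To make the part I am stuck on concrete, here is that single term in isolation (my own sketch, using the second byte from the example above); if I understand correctly, the cast widens the byte to 32 bits and the shift moves it into bits 16-23, but I am not sure:

#include <stdio.h>

typedef unsigned int uint32;   /* assumption: unsigned int is 32 bits here */

int main(void)
{
    unsigned char b[4] = { 0x4E, 0x6F, 0x77, 0x20 };   /* bytes from the example */
    int i = 0;

    /* The term in question: widen b[i + 1] to 32 bits, then shift it */
    /* left by 16 so it occupies the third byte of the result.        */
    uint32 term = (uint32) (b)[(i) + 1] << 16;

    printf("b[i + 1]             = 0x%02X\n", (unsigned) b[i + 1]);   /* 0x6F       */
    printf("(uint32)b[i+1] << 16 = 0x%08X\n", term);                  /* 0x006F0000 */

    return 0;
}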