[Courses] [C] Beginner's Lesson 4B: Arrays, Qualifiers, and Reading Numbers
Lorne Gutz
lgutz at vistar.ca
Tue Oct 15 08:09:54 EST 2002
On Friday 11 October 2002 13:10, KWMelvin wrote:
Be careful here... it's always the simple things that get you.
First, the size of 'int' is compiler dependent, which usually
means it is 16 or 32 bits long. It is usually the size of a word
on the CPU you are compiling the code for. GNU C will
generate 16-bit 'ints' if you compile code for a small AVR
chip, but 32-bit 'ints' if compiling code for the Pentium chip in
most PCs.
Second, the 'signed' type specifier never existed until the advent
of ANSI C, and on the old traditional compilers whether a plain
'char' is signed or unsigned is up to the compiler. So be careful!
I use the following typedefs to solve this nasty little problem.
They are not my invention; they come from one of the GNU
C header files (the C99 standard puts these names in <stdint.h>).
typedef unsigned char  uint8_t;   /* assumes char  is 8 bits  */
typedef unsigned short uint16_t;  /* assumes short is 16 bits */
typedef unsigned long  uint32_t;  /* assumes long  is 32 bits */
typedef signed char    int8_t;
typedef signed short   int16_t;
typedef signed long    int32_t;
Note: the standard only guarantees minimums: a short is at
least 16 bits and a long at least 32. In practice a short is 16
and a long is 32 on the targets above, but an int can be either
16 or 32... maybe even 64 on some of the larger computers.
cheers
Lorne
PS: Just trying to keep you on your toes. :)
> The range of numbers that can be used with these depends on whether
> the integer is signed, or unsigned. For example, on some machines,
> a `signed int' has the range -32768 (-2^15) to 32767 (2^15 - 1). On
> the same machine, an `unsigned int' has the range of 0 to 65535 (2^16 - 1).
>
> All `int' declarations default to 'signed', so the declaration:
>
> signed long int var_name;
>
> is the same as:
>
> long int var_name;
So this is only true about 99% of the time -- remember, a
pre-ANSI compiler does not even have the 'signed' keyword.