Short integer
From Wikipedia, the free encyclopedia
In computer programming, a short integer is a data type that can represent a positive or negative whole number whose range is less than or equal to that of a standard integer on the same machine.
Although there is no global standard, it is common for a short integer to be either exactly half the size of, or the same size as, a standard integer in the same context. In the latter case, the word 'short' is technically redundant, but may be used to indicate that the type is not a long integer.
A variable defined as a short integer in one programming language may differ in size from a similarly defined variable in another. In some languages this size is fixed across platforms, while in others it is machine-dependent. In some languages this data type does not exist at all.
Signedness
In some programming languages, a short integer can be declared signed or unsigned. In a signed short, one bit serves as a sign bit indicating whether the number is positive or negative, which halves the range of non-negative values the type can represent. There are several representations of negative numbers, the most common in modern computing being two's complement.
Common sizes
| Programming language | Platforms | Data type name | Signedness | Storage in bytes | Minimum value | Maximum value |
|---|---|---|---|---|---|---|
| C and C++ | all platforms | short | signed | 2 | -32,768 or -2¹⁵ | 32,767 or 2¹⁵-1 |
| C and C++ | all platforms | unsigned short | unsigned | 2 | 0 | 65,535 or 2¹⁶-1 |
| C# | all platforms | short | signed | 2 | -32,768 or -2¹⁵ | 32,767 or 2¹⁵-1 |
| C# | all platforms | ushort | unsigned | 2 | 0 | 65,535 or 2¹⁶-1 |
| Java | Java platform | short | signed | 2 | -32,768 or -2¹⁵ | 32,767 or 2¹⁵-1 |
In the Windows API, the data type SHORT is defined as a 16-bit signed integer on all machines.