The most important use for a byte is holding a character code. The bits in a byte are numbered from bit zero (b0) through bit seven (b7), as follows:
Bit 0 (b0) is the low order bit, or least significant bit, and bit 7 (b7) is the high order bit, or most significant bit, of the byte.
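As a minimal sketch of this numbering (assuming C and the standard `uint8_t` type; the variable names are illustrative only), an individual bit can be read by shifting it into the b0 position and masking off the rest:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t b = 0xA5;  /* bit pattern 1010 0101 */

    /* Shift the desired bit down to position 0, then mask with 1. */
    printf("b0 = %d\n", (b >> 0) & 1);  /* least significant bit: 1 */
    printf("b7 = %d\n", (b >> 7) & 1);  /* most significant bit:  1 */
    return 0;
}
```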
A byte contains exactly two nibbles: bits b0 through b3 form the low order nibble, and bits b4 through b7 form the high order nibble.
Since a byte contains exactly two nibbles, byte values require two hexadecimal digits.
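Because each hexadecimal digit corresponds to one nibble, the two nibbles of a byte can be extracted with a shift and a mask. A short sketch follows (again assuming C and `uint8_t`):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t b    = 0x3C;            /* bit pattern 0011 1100 */
    uint8_t low  = b & 0x0F;        /* low order nibble:  0xC */
    uint8_t high = (b >> 4) & 0x0F; /* high order nibble: 0x3 */

    /* One hexadecimal digit per nibble: prints "3C". */
    printf("%X%X\n", high, low);
    return 0;
}
```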
Since the typical modern computer is a byte-addressable machine, it is more efficient to manipulate a whole byte than an individual bit or nibble. This is why most programmers use a whole byte to represent data types that require no more than 256 items.
Since a byte contains eight bits, it can represent 2^8, or 256, different values. The maximum 8-bit binary number is 1111 1111, which is 255 in decimal, so a byte can hold the unsigned values 0 through 255. A byte is therefore generally used to represent the following:
- unsigned numeric values in the range 0 to 255
- signed numbers in the range -128 to +127
- ASCII character codes
- Other special data types requiring no more than 256 different values; many data types have fewer than 256 items, so eight bits is usually sufficient (the sketch after this list illustrates the first three interpretations).
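A brief sketch of these interpretations in C (the two's complement reading of 0xFF as -1 holds on essentially all current machines, though the conversion itself is implementation-defined):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t u = 0xFF;          /* unsigned: 1111 1111 = 255 */
    int8_t  s = (int8_t)0xFF;  /* signed: -1 on two's complement machines */
    char    c = 0x41;          /* ASCII code 65 is 'A' */

    printf("unsigned: %d\n", u);  /* 255 */
    printf("signed:   %d\n", s);  /* -1  */
    printf("ascii:    %c\n", c);  /* A   */
    return 0;
}
```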