# The Foundation of Computing

At the most fundamental level, computers speak in binary: ones and zeros. Understanding how binary translates into the hexadecimal notation used throughout programming is essential for any developer.
## Binary (Base 2)
Each binary digit (bit) represents a power of 2:
| Bit Position | 7 | 6 | 5 | 4 | 3 | 2 | 1 | 0 |
|---|---|---|---|---|---|---|---|---|
| Value | 128 | 64 | 32 | 16 | 8 | 4 | 2 | 1 |
The number 11010110 in binary equals 214 in decimal.
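A quick sketch in Python of the conversion described above (the variable names are illustrative):

```python
# Convert the binary string from the table above to decimal.
# int() with base 2 interprets each digit as a power of 2.
value = int("11010110", 2)
print(value)  # 214

# The same result, summing the bit-position values by hand:
bits = "11010110"
total = sum(2 ** (len(bits) - 1 - i)
            for i, bit in enumerate(bits) if bit == "1")
print(total)  # 214
```

The manual sum mirrors the table: bit 7 contributes 128, bit 6 contributes 64, and so on down to bit 0.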
## Hexadecimal (Base 16)

Hex uses the digits 0-9 and the letters A-F to represent the values 0 through 15. Each hex digit maps to exactly 4 bits:
- 0x00 = 0
- 0xFF = 255
- 0x1E40AF = the blue in our brand palette
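The hex-to-bits mapping can be verified directly in Python (a minimal sketch; the literals are the ones used above):

```python
# One hex digit is exactly one 4-bit group (a "nibble").
print(int("FF", 16))       # 255 -- two hex digits, one byte
print(format(0xF, "04b"))  # 1111 -- the largest single hex digit is four 1-bits

# A 6-digit hex value therefore packs 24 bits, i.e. three bytes.
print(0x1E40AF.bit_length() <= 24)  # True
```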
## Common Data Units
| Unit | Bits | Bytes |
|---|---|---|
| 1 Byte | 8 | 1 |
| 1 Kilobyte | 8,192 | 1,024 |
| 1 Megabyte | 8,388,608 | 1,048,576 |
| 1 Gigabyte | 8,589,934,592 | 1,073,741,824 |
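The table's values follow from two multipliers: 8 bits per byte, and 1,024 of each unit in the next. A small sketch (the helper name is illustrative):

```python
# Each unit is 1,024 of the previous one; every byte is 8 bits.
KB = 1024
MB = KB * 1024
GB = MB * 1024

def bytes_to_bits(n_bytes):
    """Convert a byte count to a bit count."""
    return n_bytes * 8

print(bytes_to_bits(KB))  # 8192
print(bytes_to_bits(MB))  # 8388608
print(bytes_to_bits(GB))  # 8589934592
```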
## Why It Matters
Understanding these conversions helps you:
- Estimate memory usage in your applications
- Debug color values in CSS
- Optimize network payloads
- Work with low-level protocols and APIs
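As one concrete example of the CSS debugging point above, a 24-bit hex color splits into three 8-bit channels with shifts and masks (a sketch using the brand color mentioned earlier):

```python
# Extract 8-bit RGB channels from a 24-bit hex color.
color = 0x1E40AF  # the blue from the brand palette above

red   = (color >> 16) & 0xFF  # 0x1E = 30
green = (color >> 8) & 0xFF   # 0x40 = 64
blue  = color & 0xFF          # 0xAF = 175

print(red, green, blue)  # 30 64 175
```

Because each channel is exactly one byte, the mask `0xFF` isolates it and the shift positions it, the same nibble arithmetic covered in the hexadecimal section.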