MB to GB Conversion — 1000 or 1024?

It's one of the most frustrating quirks of digital technology: exactly how many megabytes are in a gigabyte? The answer changes depending on who you ask. To calculate both results automatically, jump to our main converter page. Otherwise, read the full history below.

Base-10 (Decimal Standard)

1 GB = 1000 MB

Follows the International System of Units (SI). Used by storage manufacturers (Hard drive and SSD packaging) and Apple's macOS.

Base-2 (Binary Standard)

1 GB = 1024 MB (formally, 1 GiB = 1024 MiB)

Formalized by the International Electrotechnical Commission (IEC). Used by Windows, RAM module sizing, and many developer tools.
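The two conventions above can be sketched as a pair of simple conversion functions (function names are illustrative, not from any standard library):

```python
# Illustrative sketch of the two conversion conventions.

def gb_to_mb_decimal(gb):
    """SI / base-10 convention: 1 GB = 1000 MB."""
    return gb * 1000

def gib_to_mib_binary(gib):
    """IEC / base-2 convention: 1 GiB = 1024 MiB."""
    return gib * 1024

print(gb_to_mb_decimal(1))   # 1000
print(gib_to_mib_binary(1))  # 1024
```

The same input yields two different answers, which is the whole source of the confusion.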

Why the inconsistency?

In the early days of computing, engineers noticed that `2^10` equals `1024`, which is very close to `1000`. Since the scientific community already used the SI prefix "kilo" for multiples of 1000, engineers borrowed it out of convenience, adopting "kilobyte" to mean 1024 bytes.

This worked fine at the kilobyte scale, where the difference is only 2.4%. However, as data scales to gigabytes and terabytes, the gap between the base-10 and base-2 math widens drastically. This causes the famous "missing storage space" problem.
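A quick sketch makes the widening gap concrete, computing the percentage difference between the base-10 and base-2 value at each prefix level:

```python
# How far apart the base-10 and base-2 interpretations drift
# at each prefix level (KB, MB, GB, TB).
for power, prefix in enumerate(["KB", "MB", "GB", "TB"], start=1):
    decimal = 1000 ** power   # base-10 value of the prefix
    binary = 1024 ** power    # base-2 value of the prefix
    gap_pct = (binary - decimal) / decimal * 100
    print(f"{prefix}: {gap_pct:.1f}% gap")
# KB: 2.4% gap
# MB: 4.9% gap
# GB: 7.4% gap
# TB: 10.0% gap
```

By the terabyte level, the two conventions disagree by a full 10%.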

The "Missing Storage" Phenomenon

If you buy a 1 terabyte (TB) hard drive, it is marketed as containing 1,000,000,000,000 bytes (base-10). But when you plug it into a Windows machine, Windows calculates capacity using base-2: it divides those bytes by 1024 three times, displaying your brand-new drive as having only about 931 GB.
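That calculation is easy to reproduce, a minimal sketch assuming the marketed base-10 byte count:

```python
# A 1 TB drive as marketed (base-10 bytes).
marketed_bytes = 1_000_000_000_000

# What Windows reports: bytes divided by 1024 three times (base-2 "GB").
shown_gb = marketed_bytes / (1024 ** 3)
print(f"{shown_gb:.0f} GB")  # 931 GB
```

No bytes are actually missing; the drive simply contains fewer 1024-based gigabytes than 1000-based ones.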

The IEC Solution

In 1998, the IEC created new prefixes strictly for base-2 calculations, inserting an 'i' into the abbreviation (KiB, MiB, GiB). Under this scheme, exactly 1024 mebibytes (MiB) make 1 gibibyte (GiB). While this technically resolves the ambiguity, the new prefixes have never been widely adopted in consumer language, so `GB` is still often used where `GiB` is technically meant.

Keep Reading