That limitation was the infamous 640K barrier on PCs.

What was up with that?

Why 640K?


That’s a paltry sum by today’s standards, where our phones flaunt several gigabytes without a second thought.

But travel back to the early 1980s, and you’ll be in a different technological landscape altogether.

But why the odd 640K limit for system RAM?

[Image: an old computer of the eighties in a history museum. FERNANDO MACIAS ROMO / Shutterstock.com]

IBM reserved the remaining memory address space for other uses like ROM (Read-Only Memory) and hardware peripherals.

This was actually a reasonable division, considering the limited applications and hardware capabilities at the time.
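To see where the odd 640K figure comes from: the 8088’s 20 address lines give 2^20 = 1,048,576 bytes (1MB) of total address space, built from a 16-bit segment and a 16-bit offset, and IBM drew the line for user RAM at segment 0xA000. A minimal Python sketch of that arithmetic (the function name is ours, for illustration):

```python
# The 8086/8088 forms a 20-bit physical address from a 16-bit
# segment and a 16-bit offset: physical = segment * 16 + offset.
def physical_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # 20 address lines wrap at 1MB

ONE_MB = 2 ** 20  # 1,048,576 bytes: everything the 8088 can address
RAM_LIMIT = physical_address(0xA000, 0x0000)  # start of the reserved area

print(f"Total address space: {ONE_MB // 1024}K")                    # 1024K
print(f"Conventional memory: {RAM_LIMIT // 1024}K")                 # 640K
print(f"Reserved for ROM and hardware: {(ONE_MB - RAM_LIMIT) // 1024}K")  # 384K
```

Everything from 0xA0000 up was set aside for video memory, ROM, and peripherals, which is exactly the split described above: 640K for programs, 384K reserved.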

Bill Gates has been famously misquoted as saying, “640K ought to be enough for anybody.”

[Image: the Wolfenstein 3D hardware test screen showing available EMS and XMS memory. id Software]

The Twist in the Plot: It Wasn’t Enough!

As you may not be shocked to hear, software began to grow in complexity and sophistication.

This bank-switching scheme allowed programs to dynamically map different 16K pages of expanded memory into view as needed.
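The trick behind EMS was that the CPU never saw the extra memory directly: the spec defined a 64K "page frame" in the reserved upper memory area, divided into four 16K slots, and programs asked the EMS driver (via INT 67h calls) to swap any 16K logical page of expanded memory into one of those slots. A toy Python model of the idea (the class and method names are illustrative, not a real driver API):

```python
PAGE_SIZE = 16 * 1024  # EMS logical pages are 16K

class ExpandedMemory:
    """Toy model of LIM EMS bank switching: a 64K page frame with
    four 16K slots, backed by a much larger pool of logical pages."""
    def __init__(self, total_pages: int):
        self.pages = [bytearray(PAGE_SIZE) for _ in range(total_pages)]
        self.frame = [None] * 4  # which logical page is mapped into each slot

    def map_page(self, slot: int, logical_page: int) -> None:
        # Real programs did this with an EMS driver call, not by hand.
        self.frame[slot] = logical_page

    def read(self, slot: int, offset: int) -> int:
        # The CPU only ever addresses the 64K frame; the driver decides
        # which 16K of expanded memory sits behind each slot.
        return self.pages[self.frame[slot]][offset]

ems = ExpandedMemory(total_pages=512)   # 512 * 16K = 8MB of expanded memory
ems.pages[300][0] = 42                  # data far beyond the 1MB limit
ems.map_page(slot=0, logical_page=300)  # bank-switch it into view
print(ems.read(slot=0, offset=0))       # 42
```

The cost, of course, was that software had to juggle those mappings itself, which is why EMS was eventually displaced by flat access to extended memory.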

[Image: Doom cover from the Nintendo Store. id Software]

XMS (the Extended Memory Specification) was likewise developed by Lotus, Intel, and Microsoft, joined by AST Research.

These chips allowed access to vast amounts of memory that would have been unthinkable just years before.

In Protected Mode, the 80386 could address up to 4GB of memory, using a technique called paging to map that address space onto physical RAM.
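Concretely, the 386’s paging unit splits each 32-bit linear address into a 10-bit page-directory index, a 10-bit page-table index, and a 12-bit offset within a 4K page. A quick Python sketch of that split (the function name is ours, for illustration):

```python
def split_linear_address(addr: int):
    """Split a 32-bit 80386 linear address the way the paging unit does:
    10 bits of page-directory index, 10 bits of page-table index,
    and a 12-bit offset within a 4K page."""
    dir_index = (addr >> 22) & 0x3FF
    table_index = (addr >> 12) & 0x3FF
    offset = addr & 0xFFF
    return dir_index, table_index, offset

# 1024 directory entries * 1024 table entries * 4K pages = 4GB
print(1024 * 1024 * 4096 == 2 ** 32)     # True

print(split_linear_address(0x00402ABC))  # (1, 2, 2748): dir 1, table 2, offset 0xABC
```

Since 1024 x 1024 x 4096 bytes is exactly 2^32, those two levels of tables cover the full 4GB, a rather different world from 640K.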

However, accessing extended memory from Real Mode required special software such as DOS extenders or memory managers.
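One quirk those memory managers exploited: in Real Mode, segment 0xFFFF plus a non-zero offset reaches just past the 1MB mark. On the 8088 the address simply wrapped around to zero, but on a 286 or 386 with the A20 address line enabled, it reaches real memory, the roughly 64K "High Memory Area" that HIMEM.SYS handed out under the XMS standard. A small Python sketch of the wraparound (the function name is ours, for illustration):

```python
def real_mode_address(segment: int, offset: int, a20_enabled: bool) -> int:
    """Compute the physical address a Real Mode segment:offset reaches.
    With the A20 address line disabled (8088 behavior), addresses wrap
    at 1MB; with A20 enabled, the top ~64K (the HMA) is reachable."""
    addr = (segment << 4) + offset
    return addr if a20_enabled else addr & 0xFFFFF

hma_start = real_mode_address(0xFFFF, 0x0010, a20_enabled=True)
print(hex(hma_start))  # 0x100000: the first byte above 1MB
print(hex(real_mode_address(0xFFFF, 0x0010, a20_enabled=False)))  # 0x0: wraps
```

Loading DOS itself into that sliver was a popular way to claw precious conventional memory back for programs.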

This evolution demanded more memory.

The gaming industry became a catalyst for technological progress.

Game developers worked closely with hardware manufacturers, driving innovation and pushing the boundaries of personal computing.

With the 80386, multitasking became a reality, letting the system make full use of available memory.

Perhaps most importantly, don’t forget that technology can still surprise us.