A Beginner’s Guide to Understanding How PCs Operate

Photo by Dan Cristian Pădureț

As a self-proclaimed nerd, I was quietly ashamed yesterday when a senior engineer at my job argued that too few engineers today truly understand how computers work. Of course, my initial reaction was to get defensive: “I was taught that at Uni, it’s the basics!” I had a quiet chuckle to myself shortly afterwards. Since when had the education system ever truly prepared me for the industry?

Sure, I knew bits and pieces of how a computer worked from my stint as a software developer, plus a broad (if basic) view from my time at Uni. But ultimately, I realized I couldn’t hold a meaningful conversation about the very tool I used day in and day out: my PC. How had I let that happen? How could I call myself a cyber professional without knowing even that much?

This article aims to remedy that in an easy-to-consume format. I am not complacent about my lack of knowledge, and if you’ve read this far, I suspect you aren’t either.

What is a Computer?

Computers were invented because humans are inefficient. Lazy, too. It’s much better to design a wonderful, magical machine that does our thinking for us than to be forced to think ourselves. Your PC is a computer, but so is the Antikythera mechanism, built c. 100 BC to predict the positions of celestial bodies. Computers are not a recent phenomenon, but this article focuses on modern, digital computers.

Computers take input data, do cool things with it, and then spit out some output data. Alan Turing’s theoretical model (the Turing Machine) describes a computer as a machine capable of performing any computation that can be described as a set of logical rules. To translate that into English: any task that you can theoretically do with a pen and paper (and enough time), a ‘computer’ ought to be able to do too. It can probably just do it more quickly than you can.

Oversimplified Representation of How a Computer Works. Image by the author.
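
To make that “set of logical rules” idea concrete, here’s a minimal sketch of a Turing machine in Python. The machine, its states, and its rule table are my own toy example (not something from a real PC): it walks along a tape of bits, flipping each one, and halts when it reaches a blank cell.

```python
# A toy Turing machine (illustrative sketch only, not a real PC component).
# Rule table: (current_state, symbol_read) -> (symbol_to_write, head_move, next_state)
RULES = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", " "): (" ", 0, "halt"),  # blank cell: nothing left to do
}

def run_turing_machine(tape_input):
    tape = list(tape_input) + [" "]  # a blank marks the end of the tape
    head, state = 0, "flip"
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write   # write a symbol...
        head += move         # ...then move the read/write head
    return "".join(tape).strip()

print(run_turing_machine("100110"))  # prints: 011001
```

That little rule table is the whole “program”: input goes in on the tape, the rules do the cool things, and the output comes back off the tape, exactly the loop the diagram above describes.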

What is a Computer Made Of?

Due to a whole lot of physics-related stuff, a computer can only comprehend two…