Finish your degree!

02 Jul 2025

[ general  memories  ]

Everybody knows that Bill Gates dropped out of Harvard to found Microsoft. Mark Zuckerberg dropped out of Harvard too. Both of them changed the world — and both went on to become billionaires, all without a university education. And now we know that AI is going to automate most professions, especially programming. Why hang around and get a degree when there is so much money to be made right now?

Some history: the emergence of computers

But first, let’s recall how we got here in the first place. There are many candidates for the first computer. Here are a few of them:

As we see, they all emerged in a university context. But what about Charles Babbage’s Analytical Engine, the Bletchley Park Colossus or Konrad Zuse’s Z3? All had the support of national governments. Computers originated in the public sector, as did countless further innovations in all aspects of computer technology and applications. A computer science student at a good research university is immersed in this dynamic environment. With a few notable exceptions, this cannot be said of the private sector, which quite reasonably is fixated on the short-term bottom line.

The emergence of computer science degrees

Like many aspiring computer scientists of my generation, I took a degree in mathematics. I went to Caltech for a variety of reasons, but their strength in computer science wasn’t one of them. Computer science degrees were slow to take off, perhaps because they seemed unnecessary: in the 1970s, no leading computer scientist had one. Most had studied mathematics, physics, or, in the case of Tony Hoare, classics and philosophy (that is, “Greats” at Oxford). However, Edsger Dijkstra did take a programming course at Cambridge — in 1951! He had trained in theoretical physics, but the experience prompted him to devote his career to the new technology.

Cambridge was a pioneer in computer science teaching:

People from overseas may need reminding that a degree course in England typically runs for three years only, focused on a single subject. It does not include any of the literary or cultural canon that forms the basis of a “liberal education”, as in the USA. A degree course in History, English or Philosophy is equally specialised and focused. At Cambridge, some students can encounter multiple subjects by switching from one Tripos to another at the end of an academic year.

People said, back then, that there was no need to study computer science. You should get a degree in a real subject, like physics, and “easily pick up programming in a couple of weeks”. Physics is great if you want to learn about electromagnetism, thermodynamics, quantum mechanics, relativity and so on, but it will be no more useful to your computing career than studying Spinoza and Schopenhauer. Most of what is taught in an undergraduate computer science course today was already known four decades ago: computer architecture, operating systems, compiler design, user interface design, the key data structures and algorithms; theory, including operational and denotational semantics, automata theory, formal languages, computation theory and concurrency theory; programming paradigms, including OOP, functional programming and logic programming; process management and synchronisation. Plus my favourite obscure topic: queueing theory. We at Cambridge were already teaching much of this, and it’s essential knowledge for any computing professional.
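
As a small taste of that last topic (my illustration, not part of the syllabus list above): for the textbook single-server M/M/1 queue, with Poisson arrivals at rate λ, exponential service at rate μ and utilisation ρ = λ/μ < 1, queueing theory gives closed-form answers to “how congested will the system be?”:

```latex
% Standard M/M/1 results (illustrative; the notation is mine, not the post's).
% \lambda = arrival rate, \mu = service rate, \rho = \lambda/\mu < 1.
\[
  L = \frac{\rho}{1-\rho} \quad \text{(mean number of jobs in the system)},
  \qquad
  W = \frac{1}{\mu - \lambda} \quad \text{(mean time a job spends in the system)}.
\]
```

The two formulas are consistent via Little’s law, L = λW.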

By the way, the course Gates dropped out of was Mathematics. Let’s be honest: Mathematics and Physics are both tougher than Computer Science. Gates wasn’t sure he was good enough, and what are Lie algebras good for anyway?

Do you want to change the world?

It’s clear that Bill Gates took the right decision. He caught the moment when the PC revolution was taking off, powered by the VLSI revolution and Moore’s Law. Acquiring QDOS and developing key office software like MS Word and Excel, he quickly made Microsoft the dominant force in the computer industry. Mark Zuckerberg and a number of other tech billionaires followed similar trajectories. Whether the world changed for the better is not always clear: Facebook, especially, has proved to be a malign force, while Microsoft set the lowest possible bar for software quality and went out of its way to weaken computing standards, succeeding in the case of email.

At least Gates and Zuckerberg had a good grasp of the technology they would need when they dropped out of their degree courses. Elizabeth Holmes had nothing but a teenager’s dream. She wanted to change the world by revolutionising blood testing. Dropping out of Stanford at 19, she came up with a brand – Theranos – and a Star Wars-themed slogan. But she didn’t have a new technology for blood testing, or any scientific basis for such a technology; she did not even know what the scientific challenges were. Remarkably, her fervour and commitment were enough to persuade a number of scientifically illiterate investors, including George Shultz, Henry Kissinger, Betsy DeVos and Rupert Murdoch, to part with large sums. They lost their money and Holmes went to jail for fraud.

In such a case, where you have a dream that no one knows how to realise, you have to change the world the slow way. That means finishing your degree, continuing to a PhD, all the while researching the chemistry of blood testing or whatever your dream is about. Eventually you may achieve a breakthrough that brings your dream a little closer, over decades, and without making you a billionaire. (Kind of like my case.) Or launch your startup in your 30s, equipped with knowledge, experience and the emotional maturity you did not have at 19.

Anyway: as a result of the EDSAC, four people at Cambridge got Nobel Prizes. They were all users of the EDSAC, not its developers. (Project lead Maurice Wilkes did receive quite a few other honours.) None of the original team got rich. I never heard any of them complain about that.

In any case – why do people desire great wealth? I can’t see the attraction of needing a bodyguard 24/7. The wealthy crave bigger and bigger thrills just to stave off boredom: they get blasted into space or taken up Mount Everest or down to visit the Titanic. If they survive such dangerous adventures, they might yet die in mysterious circumstances in prison or in their grandiose palace or their yacht. No, thank you.

AI is taking your job

A lot of people are afraid that AI will take their jobs, and a lot of CEOs are hoping for exactly that. That leads immediately to a joke (sorry):

Alice: They are going to replace our CEO by AI! Soon, a soulless entity will be making heartless decisions!

Bob: But how is that different from now?

AI is capable of astonishing feats, but they look like the sort that might let a professional work faster and better without replacing their job altogether. That is to say, AI resembles previous technologies such as word processors and spreadsheets, which somehow did not eliminate the jobs of secretaries and accountants. Regarding what AI is capable of right now, do a Google search for “AI ruined Duolingo”. Practice dialogues for learning a foreign language were never thrilling, and are surely the perfect task to automate using AI: you can control the subject matter and vocabulary through your prompt. Duolingo customers seem to think otherwise.
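
To make that concrete, here is a minimal sketch (mine, not anything Duolingo does) of generating such a drill, where the prompt pins down the setting and the permitted vocabulary. It assumes the OpenAI Python client with an API key in the environment; the model name is purely illustrative.

```python
# Sketch: generate a vocabulary-controlled language drill via an LLM prompt.
# Assumes `pip install openai` and OPENAI_API_KEY set; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Write a four-line beginner Russian practice dialogue, with English "
    "translations, set in a university corridor. Use only this vocabulary "
    "plus basic grammar: привет, профессор, где, видеть, Наташа, спасибо."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat model would do
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```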

Another obvious application for AI is the law. Lengthy and detailed affidavits, legal arguments buttressed by references to case law, are tedious and time-consuming to write. The temptation to generate one instantly through an LLM has proved to be irresistible, but hallucinations make it essential to scrutinise every word and every reference. The question is whether AI is still saving time once you count the time spent verifying its output. Certainly a lot of lawyers are not bothering to do that. Perhaps AI could be used to search databases of real cases for genuinely relevant legal arguments. This would require creating specialist tools, as opposed to expecting an LLM to solve the problem out of the box.
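
What such a specialist tool might look like, in the simplest possible terms: retrieve candidate arguments only from a corpus of real cases, so that every citation is guaranteed to exist. The sketch below is my own illustration (the case names and texts are made up); it ranks case summaries against a query using plain TF-IDF similarity, whereas a production tool would use a proper legal database and far better retrieval.

```python
# Sketch: retrieval over a (hypothetical) corpus of real case summaries,
# so that only existing cases can ever be cited. Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical corpus: in practice, a curated legal database.
cases = {
    "Smith v. Jones (1999)": "Negligence claim concerning workplace safety duties.",
    "Doe v. Acme Corp (2011)": "Product liability and the duty to warn consumers.",
    "R v. Example (2020)": "Admissibility of expert evidence at trial.",
}

def relevant_cases(query: str, top_k: int = 2):
    """Return the top_k cases most similar to the query, with similarity scores."""
    names = list(cases)
    texts = [cases[name] for name in names]
    matrix = TfidfVectorizer(stop_words="english").fit_transform(texts + [query])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return sorted(zip(names, scores), key=lambda pair: pair[1], reverse=True)[:top_k]

print(relevant_cases("admissibility of expert evidence"))
```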

I have lived long enough to know that you never say never about new technology. Nevertheless, if AI is unsuitable not just for legal affidavits but even for “Hi Natasha, have you seen Professor Pavlov?” language drills, we don’t need to worry about it replacing professionals who can use their knowledge and skills to find real, practical solutions to problems.

Getting paid to drop out

However, billionaire Peter Thiel seems to have decided that avoiding university is such a good thing that he offers selected candidates $200,000 each to drop out. When he and his colleagues founded PayPal in the late 1990s, they got for free the fruits of research from across the academic world. As we saw, the computer was created by universities and public funding, as was the Internet. The World Wide Web concept itself came from CERN, a publicly funded research institution. Tim Berners-Lee gave us the Web for nothing. Thiel turned all that freely given research into billions of dollars and unquantifiable political influence, but turned his back on the source of his wealth and power. We are now witnessing the defunding of American science, and I would not be surprised to learn that Thiel had a hand in that. Bill Gates learned a different lesson from his life trajectory: he has given lavish gifts to computer science departments in the USA and also here in Cambridge, and I write these very words in a William Gates Building.

Lesson

My message to young people is simple: learn when you have the opportunity. It is still possible to learn when you get older, but you seldom get the time. For most of my career, I’ve relied on material I learned when I was under 30. Choose wisely: trends come and go quickly, but foundational material stays valid and pertinent forever. Finish that PhD, and if your music career ever falters, you can always fall back on your thesis on Overconvergent Siegel Modular Symbols.

The alternative is to become the sort of ignorant fool who misuses Excel because he has never heard of databases. You’re welcome.