Writer's desk

Thoughts and stationery

Why Learning as an Adult Is Easier Than You Think (And How to Do It) — June 4, 2025

Why Learning as an Adult Is Easier Than You Think (And How to Do It)

The joy of mastery: Why adults actually have the edge


Evolution is a slow teacher. Humans have been able to hasten the stolid build-up of genetic memory through teaching and culture. But we still come armed with a mental model of the world when we first emerge into it, given to us through countless generations of natural selection. Scientists have been able to suss out that babies have an in-built sense of object permanence, number and probability. They watched newborns as a screen obscured a red ball teetering on the edge of a table and noted the surprise on their faces when a green ball dropped, a look that was absent when it was the expected red ball that appeared.

This surprise caused the babies to pay closer attention and potentially update their mental model. It is surprise, and the delight we take in it, that sparks our desire to find out more about the world. It is at the heart of learning.

And learning can be learnt. Babies can seem like learning machines – they soak up their mother tongue without thought and develop the skills in their first few years that will help them survive and thrive.

But just because we have left the hothouse learning environment of our early years doesn’t mean we are cut off from learning. We can master skills as quickly as an adolescent – if given the same time, space and resources.

It starts like this:

There is this imagined me. He can do all these things I can’t. He’s good at chess, knows about photography, can read Latin and knows more about our world. I know how I can become him. I can learn all these skills. But learning, especially as an adult self-learner, is difficult. So, before I could work towards my goals, I had to learn more about how we learn.

Learners are motivated by five things which make learning rewarding:

  • Survival
  • Material advancement
  • Self-fulfilment
  • Social position
  • Mental sharpness

First, daily survival. You need the skills of cooking, cleaning and more to at least come across as a semi-competent person.

Gaining new skills helps with career progression. Knowing Excel can make work easier and more productive, and boost progress towards a promotion.

Beyond material wealth, persistent learning can lead to a more fulfilling life. We use the word hobby as a stand-in for what is really self-directed learning. Whether it is photography, baking, sailing or any of the countless other skills people enjoy, all are different arenas where we are continuous students.

We can learn to impress. Being able to identify a Cézanne from a Seurat gives someone cultural heft. It’s impressive if your conversation partner can outline clearly and entertainingly how the printing press revolutionised the medieval world and what parallels that has with the internet.

Photo by Marco Djallo on Unsplash

Finally, learning keeps minds sharp. At the beginning of the 21st century, life expectancy climbed to the highest it’s ever been in human history. As we exercise to halt our bodies from falling into frailty, so we stretch our minds to safeguard them from slumping into senility.

Learning becomes more difficult as we get older, though not because our minds decline. There are no teachers or lecturers to guide us. We have great demands on our time. We have to work, socialise, take care of ourselves and those around us. We may lack the incentive to allot some of that limited free time and money to the effort needed.

Growing up, tests and exams gave us extrinsic motivations for studying and learning new material. There is no one now, though, to punish you if you put off till tomorrow reading up on a potential new hobby that you know will make your life more fun today. You don’t get that promotion only because you finished an online course on business communication.

Picking up new skills as an adult must be its own reward. These intrinsic prizes are more compelling than extrinsic ones anyway. The sense of satisfaction we gain from finally completing our first knitted sweater or winning that first club game of chess can’t be beaten.

We lead many lives. They can be professional, spiritual, social, philosophical, physical and more. Only when we embrace and try to better each of these separate strands, which braid together into our holistic existence, can we say we are living to the fullest. We must pay attention to our mental lives. And committing ourselves to becoming dedicated students, of whatever field sparks our interest, is one of the best ways of leading a more rewarding mental life.

Becoming such a student isn’t hard or mysterious. We only need a few gentle nudges and suggestions before we can flourish. There are approaches and models we can all easily use that make learning easier and more fulfilling. Anyone can do it. Learning can be learned.

1. Metacognition: Know how your brain works and learns

The nematode worm is one of the world’s simplest creatures. It has a three-day life cycle and only 959 cells. Yet, it can learn. It learns via habituation and association. Through habituation, it learns to respond and adapt to the repeated presence of a stimulus. Association lets it remember aspects of its environment that help it predict other things.

Photo by Milad Fakurian on Unsplash

Learning is within the reach of everyone. But there is one tool that brings the ability to learn even more easily within grasp. The UK’s Education Endowment Foundation identified metacognition, knowing about the powers, limitations and workings of the brain or how we learn, as one of the most effective ways of boosting how well students learn.

Stanislas Dehaene, a neuroscientist, identified four metacognitive pillars that help the brain absorb knowledge and information to create learning.

Four pillars of learning

Focused attention

We can become saturated with information. Attention is a way of filtering out what is unimportant and attending to what is. It tells us when to pay attention, what to pay attention to and how to pay attention. Paying conscious, focused attention to something changes the brain. Neurons fire and continue firing far longer than they would otherwise. This strengthens synapses and makes the information easier to recall.

Meditating, made accessible through apps such as Balance or Headspace, is one way to build attention.

By focusing intently on study material, free from distractions, it becomes much easier to store and retrieve information, making learning much more effective.

Active engagement

Merely reading, watching or listening to lessons cannot lead to a deep understanding of a topic, though, no matter how much focus you give them. Passivity is the enemy of learning. Curiosity and surprise sit at the centre of active engagement. When learning, we come up with ideas about how the world works, such as how to conjugate a verb in a foreign language so that we can get our meaning across, and then test those ideas out.

Rephrasing learned concepts in our own words is one of the easiest ways to engage actively while learning. This helps immerse the brain in the concepts themselves and embeds them more securely in the mind.

Error feedback

But we have to test those hypotheses against the real world. If our understanding is wrong, we have to correct it. When we get something wrong, when our predictions about the world fail, we get surprised. As with newborns, surprise is one of the key reactions we need in order to learn. It is not enough to have only a theoretical knowledge of a subject. We have to put that knowledge to the test and see if it works in the manner we expect. As a self-learner though, it can be hard to get quality error feedback. There is no teacher there to help guide us.

Photo by Brett Jordan on Unsplash

Determining what type of feedback we need is crucial to success. We don’t need formal exams or an instructor to watch over us. Put knowledge to practical use and see if the outcome meets what is expected. Chat to someone in a target foreign language, play a game of chess and talk about it with the opponent afterwards, take a photograph in challenging conditions and see how it comes out. All these are great ways to get feedback and help adjust our mental models, leading to better learning.

Consolidation

Even if we deploy the three earlier pillars, it won’t be enough to gain real expertise in a subject. It takes time and effort over the long term to go from struggling to mastery. Neuroplasticity is what allows us to keep learning throughout our lives. The brain can change. It can strengthen certain areas and move things from its conscious, effortful realm to the automatic.

Think of learning to read. At first, all the letters had to be sounded out and then put together. After years of consolidating how letters merged to become words, it is hardly any effort at all. We don’t even read the whole word. We see its shape and suddenly (almost instantly), it springs forward as if spoken in a clear voice.

It is the same with other things we learn. We have to build and strengthen connections in our brains. Revisiting previous lessons helps build those connections. Remembering lessons without reference to notes and talking to people about what was learned both help consolidate memories. Continually build and strengthen connections and things become more automatic.

Sleep also plays an important role. While we sleep, our brains are running through what happened that day. They are incorporating things we did, saw and learned into our mental models of the world. We forget things as time goes on. But with sleep we can hold on to things for longer.


2. Have a strategy

Being aware of the four pillars is a good start; the next step is to produce a plan and strategy.

First, decide what to focus on. Having a goal is important. Passion for a subject helps motivation and discipline. Anyone reading this probably has something in mind already. Start with that.

Identify existing skills and knowledge. This helps settle on what aspects to improve. As I did, creating an imagined future self with new skills and abilities is a useful tool. What is needed to make that a reality? That will help identify what to focus on.

Photo by Marissa Grootes on Unsplash

Then see what resources are out there to help foster those skills. Luckily, we live at a time when learning material is abundant and easy to get. There are online courses, books, video series, audio books, lecture recordings and more that cover any imaginable subject. Spend time researching what will boost progress towards the goals you have spelled out. Don’t go overboard. A beginner only needs one or two texts. These might include a course from Udemy (I recently bought one on photography) paired with a book. Librarians can help a lot and what they provide is often free. Learning doesn’t have to be expensive.

Create a timeline of when to hit specific milestones. These should be measurable as well as achievable. The desire to become a chess master isn’t as helpful as wanting to hit a 1400 Elo rating on chess.com or to learn several openings for black. Use deadlines to create a sense of urgency but be generous. Learning should be a delight, not a burden.

Finally, set aside time to reflect on progress. Were the goals too hard or too easy? Were the resources used good enough? Are goals being met? Answering these types of questions can help see where to change and adapt the plan, leading to better outcomes.

3. Tips for studying

The first two pillars of learning are focused attention and active engagement. There are approaches to studying that help fire these and make learning easier. Both the mental and physical environments play big roles.

Focused attention may seem in short supply. Bombarded by notifications, distracted by short, viral videos alongside the frequent dopamine hits of an always-online culture, our attention spans have shrivelled.

It’s easy to build them back up though. I’ve found that you can’t beat the pomodoro technique. This splits sessions into 25-minute periods of work, each followed by a five-minute rest. After four of these half-hour cycles there is a longer break. Once I became comfortable with that, I doubled the length of each cycle to an hour, with 50-minute study sessions followed by ten-minute breaks. It has helped in other areas of my life too. There are plenty of apps that help track this.
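If you would rather script a timer than download yet another app, the cycle is simple enough to sketch in a few lines of Python. This is only a minimal illustration of the rhythm described above; the 20-minute long break is my own placeholder, since the technique just says the break after the fourth cycle should be longer.

```python
import time

def pomodoro(work_minutes=25, rest_minutes=5, cycles=4, long_break_minutes=20):
    """Run a set of work/rest cycles, then suggest a longer break."""
    for cycle in range(1, cycles + 1):
        print(f"Cycle {cycle}: focus for {work_minutes} minutes.")
        time.sleep(work_minutes * 60)   # the study block
        print(f"Cycle {cycle}: rest for {rest_minutes} minutes.")
        time.sleep(rest_minutes * 60)   # the short break
    print(f"All {cycles} cycles done. Take a longer break of about {long_break_minutes} minutes.")

# The doubled version mentioned above would be pomodoro(work_minutes=50, rest_minutes=10).
```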

Creating the right physical environment is important. I study at my desk with headphones on, usually playing some background, ambient music that doesn’t intrude too much on my mind. I put my phone on do not disturb and leave it on the other side of the room (wireless headphones are useful here) or have it propped up showing how far I am through my pomodoro session. I might have a cup of tea and a few biscuits at my side so I’m not tempted to give in to hunger or thirst and cut short my session.

It helps to create a habit. I try to study at the same time every day, usually putting aside an hour each evening to do so. Time may be in short supply, but consistency and frequency are better than duration. If it’s only 15 minutes you can carve out, then carve them out.


After environment comes method. Passivity does not help learning. We must become actively engaged with our material. Peter Hollins, who has published several books on self-learning, has outlined some techniques that I have found beneficial.

The first is Cornell note taking, which I have slightly modified for myself. Here you split the page up into two unequal columns and a footer. The bigger column should take up two-thirds of the page and this is where initial notes are recorded. These notes are then reduced into the second column, called cues, which focuses on key issues. Finally, the footer is used to summarise the whole page in a few sentences.

From Writer’s Bloc Blog

My method involves taking the initial notes on day one, reducing them to the cues at the beginning of the following session and then writing the summary on day three. All these records, cues and summaries should be in your own words, engaging with the material rather than taking passive, verbatim notes. Illustrations and diagrams can also be used. This way, I come back to material often and early in the process, reengage with it and see if my further work has highlighted any confusion.
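To make that three-day rhythm concrete, here is a small sketch of how one page could be represented in code. It is purely illustrative and not part of the method itself; the class and field names are my own.

```python
from dataclasses import dataclass, field

@dataclass
class CornellPage:
    """One page of notes, filled in over three sessions."""
    topic: str
    notes: str = ""                                  # day one: initial notes, in your own words
    cues: list[str] = field(default_factory=list)    # day two: key questions and prompts
    summary: str = ""                                # day three: a few sentences for the footer

page = CornellPage(topic="Four pillars of learning")
page.notes = "Attention filters what matters; engagement means testing ideas against the world..."
page.cues = ["What do the four pillars do?", "Why is passivity the enemy of learning?"]
page.summary = "Learning needs focused attention, active engagement, error feedback and consolidation."
```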

When taking notes, ask questions of the text that seek to clarify information, challenge assumptions, probe evidence, examine viewpoints, seek implications as well as understand why something is important.

Flashcards are useful for revisiting material. But they shouldn’t be relegated to merely reciting facts or definitions. When I was studying for my degree, I used flashcards (on the fall of the Roman Empire in this case). The front of the flashcard had the title (the Emperor Diocletian for instance) as well as tasks such as ‘analogues’, ‘context’ or ‘evaluate’. These asked me to actively engage with the topic on the flashcard, combine it with other information and think about it more deeply. On the back, bare facts can still appear alongside page references to where other aspects of the title are discussed.
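As a sketch of what such a card might look like (the tasks come from the method above, the facts are standard history, and the page references are invented placeholders for your own notes), a simple dictionary captures the front-and-back split:

```python
card = {
    "front": {
        "title": "The Emperor Diocletian",
        "tasks": ["analogues", "context", "evaluate"],   # prompts for active engagement
    },
    "back": {
        "facts": ["Reigned 284-305 AD", "Split rule of the empire under the Tetrarchy"],
        "see_also": ["Notes p. 12 (third-century crisis)", "Notes p. 27 (Edict on Maximum Prices)"],
    },
}

def drill(card):
    """Show the front, let yourself work through the tasks aloud, then reveal the back."""
    print(card["front"]["title"])
    for task in card["front"]["tasks"]:
        print(f" - {task}")
    input("Talk through each task, then press Enter to check yourself... ")
    for line in card["back"]["facts"] + card["back"]["see_also"]:
        print(line)
```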

4. Overcoming the obstacles

Procrastination is a massive obstacle, at least for me. Starting is the hardest part. Once I am focused and invested, I am very happy studying. But moving myself from in front of the television or away from a videogame is hard. That’s why making study a habit is important. It’s like exercise. Doing it once or twice won’t help. Consistency is vital.

One thing that helped me get over my inclination to dwell on mindless fun was understanding the procrastination cycle.

The cycle starts with unhelpful assumptions, such as ‘everything has to be perfect to begin’. That may include only wanting to start studying on the hour mark. If it hits three minutes past, then there is nothing else but to wait another 57 minutes.

Photo by Tungsten Rising on Unsplash

Once we make these unhelpful assumptions, we move to the next step: increasing discomfort. We know what we should be doing, but now feel bad or anxious, which gets in the way of starting. We then arrive at the third step, looking for things to decrease discomfort. We all know that this would actually be doing the thing we said we’d do. Instead, we might clean the dishes, organise the desk, reply to messages or anything else that gives us the illusion we are making progress without actually doing so.

This finally leads to the consequences of our actions. We may have a beautifully tidy room with all our emails read, but we haven’t learned anything new or made progress towards our goals. This can lead back to unhelpful assumptions, such as that it must always be this way before we can study, or to even more discomfort, which we look to reduce with even more avoidance activities. Understanding this loop can help us break out of it.

Lifelong learning

Learning is fun and anyone can do it. I wanted to learn a whole host of skills but before I did, I wanted to make sure I was doing it the right way. I want to improve my Latin, take better photographs, learn how to play chess well and understand different aspects of the world to a much greater degree. I set out to learn about learning. And this is part of it. It’s a recall exercise that hopefully has some use for you.

The most empowering thing was metacognition. By the end of my exploration, I had discovered how malleable the brain is, how it is set up to learn and wants to learn. Through the methods I found, putting together a plan and having a structured approach was easy, making me more willing and enthusiastic to just sit down and learn. Below, I’ve included links to some of the sources I used in this, which go much deeper on certain aspects covered here, as well as a lot that weren’t.

Learning is for life. It is never too late to start. It is fun, empowering, enlightening, entertaining and fulfilling. As Aeschylus says in Agamemnon, ‘It is always in season for old men to learn.’

Resources and reading list

The Science of Self-Learning, Peter Hollins

How We Learn, Stanislas Dehaene

Don’t Go Back to School, Kio Stark

A Mind for Numbers, Barbara Oakley

The Learning Strategies Center

The Learning Center: Studying 101

Cornell notes

This is a slightly revised version of an article I published first on Medium.

How a Planet That Didn’t Exist Proved Newton Wrong and Einstein Right — May 28, 2025

How a Planet That Didn’t Exist Proved Newton Wrong and Einstein Right

How the mystery of Mercury and Vulcan laid the path for a scientific revolution

On May 8th, 1845, Mercury was sixteen seconds late. Like its divine namesake, the solar system’s innermost planet was a problem child. Ever since Isaac Newton had formulated his laws of universal gravitation and motion, astronomers had been building tables of the exact orbits of all the planets. Using ever more subtle and nuanced understandings of the laws, they had succeeded — for the most part. They still couldn’t pin down Mercury, despite all the other planets having become subject to human understanding. Not that there hadn’t been challenges before. French astronomer and mathematician Pierre-Simon Laplace had explained the observed changing orbits of Jupiter and Saturn, out of line with Newtonian expectations, by more exact calculation.

But Mercury was proving a more difficult problem. It was particularly vexing for Urbain Jean-Joseph Le Verrier, Laplace’s successor as France’s preeminent astronomer-mathematician. His tables predicting a transit had foretold that Mercury would arrive before it did. This was a direct challenge to Newton. His laws were supposed to be universal — they were the same everywhere on Earth and throughout the heavens. To say otherwise was blasphemy.

Le Verrier, a proud Newtonian, would continue returning to the problem of Mercury for the next decade-and-a-half. He couldn’t devote his whole time to the planet though. Another problem was plaguing astronomers. On the outer edges of the known solar system, Uranus, the only planet up to then discovered since the ancients, was also misbehaving. Le Verrier’s answer to that puzzle would lead him to suspect a similar solution to the Mercury problem. By calculation, pen and paper, he was about to discover Neptune.

If there was an unseen planet outside the orbit of Uranus, maybe there was one inside that of Mercury. Le Verrier’s solution was Vulcan — a boiling planet, first in line to be hit by the Sun’s immense heat and power. There was only one problem. Vulcan didn’t exist. And yet, Mercury was late.


Before 1781, William Herschel would have thought he’d be remembered for his music, if anything. But he would go down as the man who discovered the first planet that the ancient Babylonians, Egyptians, Greeks and Romans had missed. On March 13th, 1781, he rushed away from his job as director of the Bath Orchestra to his great passion, astronomy. He was on the hunt for true binaries, stars that orbited one another rather than merely appearing paired by lying on the same sightline from Earth. He pointed his telescope at one such potential binary, between Taurus and Gemini, and saw something unexpected.

Image by 0fjd125gk87 from Pixabay

At first, Herschel thought it was a comet. Many had been spotted before. He watched it over the course of the next month, but it did not behave like a comet. There was no tail. It didn’t alter in size. It appeared as a fuzzy disc and he noted its gentle arc across the night sky. From his observations, others worked out a rough path. It orbited the Sun. It was a planet. It was Uranus.

Here was a test of Newton. Would his laws be able to precisely describe the behaviour of such a novelty? To get a better fix on Uranus, astronomers explored their archives. They discovered earlier, misidentified sightings of it dating back to 1690. With a rough plot and observations from both before and after Herschel’s discovery, by 1821 they were able to describe its track around the Sun. But what they had come up with didn’t line up with Newtonian predictions. Something was wrong.


By 1845, the problem of Uranus was an embarrassment. This was the task given to Le Verrier. He began by recalculating the supposed orbit, refining it to a point where he was sure something else was to blame. Could an asteroid hit have whacked Uranus out of Newtonian perfection? Did Uranus have a moon pulling it out of its predicted orbit?

Le Verrier rejected these possibilities. Something else was out there. Eliminating some variables and making some assumptions, he was able to reckon where a possible culprit lurked. It was 1846, a mere year after being set this mathematical mystery. By August 31st, he was able to report where in the sky someone should look to spot it. No one did. His colleagues in Paris were uninterested. They didn’t have the right equipment or up to date star charts. They wouldn’t know what they were seeing even if they had trained their telescopes to the point east of Capricorn where Le Verrier thought the planet was hiding.

Image by MiraCosic from Pixabay

Over two weeks later, Le Verrier’s frustration got the better of him. If Paris wouldn’t look for his planet, some other observatory would. On September 18th, he wrote to Johann Gottfried Galle at the Berlin Observatory. Five days later, Galle received the letter. That night he pointed his telescope at the small square of sky. An assistant, Heinrich Ludwig d’Arrest, ticked off known objects on a star map. Just after midnight, Galle called out one speck of light. It was not on the map. Neptune had been found, in the words of the head of the Paris Observatory, ‘on the tip of [Le Verrier’s] pen.’

Le Verrier became famous. All Europe celebrated his achievement. He toured the continent, taking in awards and honours. That victory lap, a power struggle at the Paris Observatory and the 1848 revolutions kept him from returning to the problem of Mercury.

In 1852, Le Verrier came back to his larger work, determining the precise orbits of the inner planets and showing that they were stable. This produced results. He was able to find that the previous estimates of the Earth’s distance from the Sun were wrong. His corrected result was impressively close to modern measurements. He re-reckoned the mass of the Earth and Mars. With these, he was able to fix the orbits of Venus, Earth and Mars. All were in line with Newton’s laws. Mercury, however, was still misbehaving.


The problem was small. Mercury’s perihelion, the point at which it is closest to the Sun, was inching forward. The planets out to Jupiter accounted for a good part of that creep. But there was a tiny sliver left — an accounting error that shouldn’t exist. It was minuscule. Over three million years, Mercury would make one more orbit than anticipated; over a year, that was merely 0.38 arcseconds of difference, or around 1/10,000th of a degree.

Urbain Le Verrier | Public domain

With his rigorous calculations in hand, Le Verrier knew that either Newton’s laws were wrong or there was something else propelling Mercury onwards. And Newton wasn’t wrong. On September 12th, 1859, he published his results. He initially thought that it was an intra-Mercurial asteroid belt, believing that a planet with enough heft to create such an acceleration would have been spotted before. On December 22nd that year, a letter was sent that would change his mind.


Edmond Modeste Lescarbault was a French country doctor but, like Herschel, his passion was astronomy. Unlike Herschel though, he was on the search for asteroids. He had been watching in 1845 when Mercury was late. One of the best ways to spot an asteroid is when it makes a transit, crossing the Sun’s face when viewed from Earth. Le Verrier’s work had yielded the best predictions yet for when Mercury would transit the Sun — an appointment it was 16 seconds late for.

Lescarbault continued watching for transits. On March 26th, 1859, he finally spotted one. Between seeing patients, he would take snatches of time at his telescope. That day, he saw something roughly a quarter the size of Mercury cross the Sun. He was not present for the whole transit but worked out it lasted one hour and seventeen minutes. He didn’t do anything with this observation though until he saw a report of Le Verrier’s work later that year. Finally, on December 22nd, he wrote to Europe’s most famous astronomer, convinced that what he saw was a planet.

Le Verrier raced down to Lescarbault’s little village outside Paris. He was convinced after questioning the diligent, if amateur, Sun watcher. In January 1860, with Lescarbault’s observation backing up his calculations, he announced that he had found Vulcan.

The transit of Mercury on May 9, 2016. Mercury is visible to the lower left of center. A sun spot is visible above center. | Elijah Mathews | CC BY-SA 4.0

This announcement was treated with the same fanfare as his discovery of Neptune. He was feted across the world. He was elevated to the status of a scientific god, an unrivalled genius. His name is little known today, despite his mathematical heroics to uncover the shadowy Neptune.

Vulcan didn’t exist. Attempts to back up Lescarbault’s observation and Le Verrier’s calculations did not succeed. As with Uranus, astronomers once more opened up the archives looking for hints of previous, unrecognised sightings of Vulcan. Earlier observers may have logged these as sunspots rather than as a new planet. With several candidates identified alongside Lescarbault’s observation, work was done to predict when the shy planet would transit the Sun and could be seen again. With so few data points to work with, this was imprecise. But despite diligent work, with continuous stakeouts looking at the Sun over the course of days and weeks, Vulcan did not make an appearance.


The best chance to spy it would be during a solar eclipse. One would come in 1869 in America. Armed with the latest technology, photography, many US observers were ready to lay claim to ironclad proof that Vulcan existed. One early pioneer of astrophotography, Benjamin Gould, would take his own photos as well as comb through around 400 others. None showed Vulcan. But it was not yet dead. If Newton was right, as he had been up to then, then Vulcan, or something like it, should exist. The math said so. In 1878 there would be another eclipse and another chance to see Vulcan.

Even with better and better telescopes, cameras and methods, it did not show itself. Less than two decades earlier, Vulcan had been placed among the planets in textbooks. Now, Mercury’s unexplained creep and Vulcan’s inability to materialise were becoming embarrassments.

So, it was ignored. Le Verrier died in 1877, still believing he had discovered two planets. The eclipse of 1878 put that assumption under extreme strain. By the turn of the century, people believed in Newton’s laws but not in the planet that they said should exist. It became an oddity, an aberration, something better left unspoken.

It would take a mind equal to Newton’s to work it out and revolutionise physics. Albert Einstein would do so.


For Newton, gravity just was. He didn’t provide any hint why it should be so. He measured it, distilled how it worked and produced his laws. Einstein saw that time and space are linked. Mass warps them. Where mass is concentrated, near the Sun for instance, it creates deep gravity wells. Outside of such wells, Newton’s laws describe how gravity works nearly perfectly. Inside them, his laws buckle and break.

For Einstein to prove his new conception of gravity correct, he would need to prove Newton wrong. There were a few false starts. In 1914, he announced a preliminary account of his theory, but that still couldn’t explain the excessive wandering of Mercury. The first world war did not interrupt Einstein’s work but allowed him to refine his thoughts. It was hard work, the most intense of his career as he would later recall. But he got there. He showed that his theory would replicate Newton’s laws for the most part; it was near the centre of the solar system that the two parted ways, and he would soon show why.

Photo by Maks Key on Unsplash

On November 18th, 1915, near the end of a series of lectures he was giving to the Prussian Academy on the progress of his work, Einstein outlined how his new ideas about gravity could explain why Mercury’s perihelion was inching forward. Vulcan was not needed. Only a completely new understanding of how space, time, mass and energy interact was. Newton, it turned out, was wrong.


Science is supposed to progress through observation, hypothesis, prediction, theory and experimentation. If theory disagrees with observation, then the theory is wrong. For half a century, physicists ignored this when it came to Mercury. Newton was always right, they believed.

Things are never as perfect as they are described. The problem of Mercury should have alerted scientists to issues with Newton’s laws — at least in extreme cases. But to do so, they would have had to throw out their whole conception of how the universe worked. It was a problem not many could face and fewer overcome.

Thomas Kuhn, an American philosopher, distinguished between normal science and a paradigm shift. The move from Newtonian gravity to Einstein’s general relativity was one such shift. Normal scientific work can show when an established framework is wrong, but it takes something else to create a new paradigm. For half a century, Vulcan was ignored. Now it is mostly forgotten. But it highlights how far from certain all our scientific knowledge about the world is. It is not as solid as it is made out to be. It moves and shifts; we must be ready and able to shift with it.


Read more

The Planet That Wasn’t, Isaac Asimov

In Search of Vulcan, Robert Fontenrose

The Hunt for Vulcan: How Albert Einstein Destroyed a Planet and Deciphered the Universe, Thomas Levenson

The Hunt for Vulcan, the Planet That Wasn’t There, Simon Worrall

Why Everyone Went on a Wild Goose Chase Looking for the Planet Vulcan, Kat Eschner

Vulcan | The Planet That Didn’t Exist, Zepherus

The Future History of Meat — December 2, 2020

The Future History of Meat


How our attitude to meat will change in the 21st century

Knowing how much Americans love kings and the French, Herbert Hoover looked to Henri IV for inspiration in his 1928 campaign to become president. A chicken for every pot was his promise. And though the electorate would soon revile him as one of the worst presidents in American history, he did go on to be president.

When Henri wanted to signal a change from the religious conflicts that had plagued his predecessors, he promised that every peasant would have meat for their Sunday dinner. His relatively peaceful two-decade reign and policies that benefited the everyday person made him one of France’s most loved kings. Bridging the 16th and 17th centuries, Henri bribed nobles instead of battling them, bringing peace to his war-wracked kingdom. The early revolutionaries of 1789 saw Henri as a model king for their desired constitutional monarchy. At least before it got to the head-chopping off phase.

Si Dieu me prête vie, je ferai qu’il n’y aura point de laboureur en mon royaume qui n’ait les moyens d’avoir le dimanche une poule dans son pot! (‘If God grants me life, I will make sure there is no labourer in my kingdom who lacks the means to have a chicken in his pot on Sunday!’) — Henri IV

Despite a gap of centuries, and contrasting popular support, both Henri and Hoover recognised something that united common people across their ages. Meat meant wealth, prosperity and satisfaction.

Economics boasts very few universal predictions. But one fact held fast for centuries. As people got richer, they ate more meat. Hunter-gatherers valued the delicious, calorie-dense meat of the game they killed. There are relatively few such societies left. One is the Bushmen of the Kalahari. Here, hunters bringing back a freshly killed antelope or other prey are insulted and mocked. This is not because they are fiercely vegetarian. Rather, they recognise that the ability to provide meat could raise a person to power and undermine the egalitarianism they prize. Meat is status.

The Roast Beef of Old England

In the stratified world of Britain during the height of its empire, the meat on a family’s table showed what class they were in. Venison, available only to the landed nobility, sat at the top of this hierarchy. Next came beef, belonging to the middling class. Poultry, lamb and pork followed. As expansive deer parks gave way to pastureland, beef became more of a staple. Rearing cattle was an important way to make money. Beef, especially roast beef, quickly became a national symbol of Britain and one of its few culinary treasures.

Photo by Zoltan Tasi on Unsplash

In the 18th century, the agricultural revolution boosted yields of livestock and cereals. Through selective breeding animals became larger and fatter. New farming techniques grew enough extra feed to keep whole herds alive during the lean months of winter. In earlier times, pigs and fowl would have been slaughtered on St. Martin’s Day, 11 November, so as not to compete with humans for the winter’s stores.

The agricultural revolution changed all that. It was just in time too, as workers began to leave the land and head to the great industrialising cities of northern England. Not that their diet was great. Meat was still scarce. But it wasn’t as rare as it had been. When there was a joint of beef on the table it was the man of the house who had most of it.


This association between meat and masculinity still exists. The barbecue is often the domain of a man, while quotidian meals are cooked in the female kitchen. Society is moving beyond those traditional roles though. This attitude linking meat to masculinity is being eroded. Just like the one between meat and status.

Beating the Meat

Industrial-scale agriculture began to take hold near the beginning of the 20th century. The mechanisation of tools, the arrival of synthetic fertilisers and the discovery of antibiotics meant that crops and animals could be raised in numbers that would have been impossible for earlier farmers. Battery farms caged chickens in the millions. Cattle herds did not have to graze and instead had all fodder delivered to them. This created economies of scale that brought cheap, processed meat to billions. It sparked the emergence of new industries, like fast food.

But the widespread availability of cheap beef, pork and chicken eroded its status. Healthy lifestyles, open only to those with the time and money to afford them, replaced a topside of beef on the dinner table as the marker of the middle class. Education levels are a consistent way to measure class. People who left school before 17 eat 25% more meat than those with university degrees. Obesity, once a mark of wealth and respect, is now an epidemic which disproportionately harms working-class and poorer people. For households driven by the stress of poverty and short on time, fast food and ready-made meals are an attractive option. Meat has other issues to contend with too.

Photo by Erik Mclean on Unsplash

Factory farms, with their massive cattle herds, produce a large part of the greenhouse gasses heating up the world. The conditions these creatures are kept in provoke much-deserved hostility from animal rights campaigners. Chickens in battery farms can live their entire lives in cages smaller than an A4 piece of paper. These squalid and cramped circumstances are a breeding ground for disease. The industrial use of antibiotics is reducing their efficacy, leading to a potential health crisis. Runoff from farms, including manure, pesticides and fertilisers, pollutes many vital water resources.


Meat eaters still defend their position. Humanity was propelled down its evolutionary path by being able to catch, cook and eat meat, which freed up calories to be used for thinking and socialising. And roast beef, with thick, rich gravy perched on a Yorkshire pudding accompanied by roast potatoes, is delicious.

That roast dinner remains a key part of British culture. One I can wholly buy into. Especially during winter and the second lockdown, I found myself looking forward to a high-welfare piece of beef on a Sunday. It becomes a ritual, an event. A focal point for days that can get too flabby without anything to do outside the kitchen or dining room. It was these roasts, with the full trimmings of course, which got me thinking about how we think and feel about meat. It is a stubborn tradition in a changing world. Plus, a good excuse to drink wine. It is already open for the gravy anyway. The cook’s reward for slaving over the oven.

Roast beef, medium, is not only a food. It is a philosophy. — Edna Ferber

The rise of vegetarianism and veganism is a good thing. We cannot afford a planet that feeds its over seven billion people on piles of beef, pork and poultry. Our bodies can’t either.

There is hope for meat-eaters. Lab-grown meat is an innovation nearly ready for the supermarket. Startups are racing to get their cultured meat on people’s tables and in their stomachs. Whoever wins will make a fortune and save the suffering of both animals and the planet. Companies like Mosa Meat, which has already debuted its cultured hamburger, or Israel’s SuperMeat, developing lab-grown chicken meat, might be the future. Today’s celebrated firms, like Impossible Foods or Beyond Meat, might be supplanted very soon, their faux meat replaced by actual, though lab-grown, meat.

Meat’s back on the menu

These changes will shift our attitudes toward meat. Heritage and high-welfare livestock breeding will return. Land will become cheaper as the commercial herds of cattle disappear, replaced by labs in big cities. Some of it will be used for rewilding efforts, attempts to bring back the biodiversity destroyed by the expansion of agriculture. But there will be space for a slower, older kind of farming. Wagyu and Kobe beef enjoy a distinguished reputation today. Cattle are raised humanely and with high welfare standards. The meat they produce is celebrated. Its luxury status is confirmed by its price. That might be the case for most traditional farms of the future.

Raising old-fashioned breeds of chicken, not designed for the battery farms that supply the pots or KFCs, is a popular pastime for some of Silicon Valley’s winners. Intensive farming may give way to more sustainable methods, increasing the price of meat. Meat may become a luxury, a status symbol, once again.

Photo by Tom Robinson on Unsplash

The feasts of history, resplendent with suckling pigs gagged with an apple or the little whole sparrows of Rome, may reappear. Trotters and other currently discarded parts of a butchered creature, evidence of a real, living animal, could become the height of luxury.

As how we grow our food changes, our attitude toward it does too. Lobsters, found in abundance, were once the food of the lowest of the low. Now they are a delicacy, prized for their rarity and price, if not their taste. Turkeys didn’t always rule the roost at Christmas. Goose and beef used to be the central, vital part of any Christmas feast.

Food and meat are the roots of culture. Our attitudes toward them may seem permanent, fixed by venerated elders in some distant past. But they are not. As meat moves from the field to the pharma lab, how we feel about what is on our forks and what it says about us will too.

Dangerous metaphors — April 13, 2020

Dangerous metaphors

Why talking about the war on coronavirus is causing harm

Metaphor is not merely a tool reserved for poets. It soaks our language from our everyday conversation to the most high-flying rhetoric. We understand the world through metaphors. It helps lend physicality and understanding to abstract concepts. George Lakoff, in Metaphors We Live By, even says that our entire conceptual model of reality is metaphorical. Metaphors are powerful. We need to use them to come to a better understanding of our world.

The coronavirus pandemic has prompted many commentators, journalists and politicians to liken it to a war. It is a battle against an invisible enemy. It requires sacrifice and bravery to overcome. It is an easy and, at first sight, apt metaphor to employ. The response to coronavirus requires a collective effort and will, a coming together of the whole community in a single-minded focus, and a suspension of normality that does resemble nations during wartime. It creates heroes and famous battles that inspire stiff upper lips and helps us shoulder the burden of onerous measures. But the sheer quantity of such analogies does create harm.


Metaphors are powerful because they transform the general into the specific. Big ideas are shrunk down to graspable things. Metaphors allow us to budget time and shine a light on an idea. Situations can be looking up or going downhill.

Arguments become battles, which allows you to assault an opponent’s logical position. Professional outrage merchants on YouTube post videos where they ‘destroy’ someone and their thinking through argument. These verbal battles between opponents leave one side victorious and the other defeated. War is a powerful metaphor in this context because arguments do mimic the contours of armed conflict. It places two sides against one another.


With coronavirus that is not the case. There are no good guys or bad guys. There is humanity and then there is this little packet of proteins and genetic code. Coronavirus does not spy or strategise, it does not invade or invent terrible new weapons. There are no Allies or Axis, no Central Powers and no Entente. But constant talk of war can leave us creating an enemy where there is none. It leads to President Trump tweeting and ranting about the ‘China virus’. It allows us to forget the very real human suffering and hardship faced by the citizens of Wuhan. It is there on Twitter when people, escaping small and squalid flats, walk in a park and get called traitors or collaborators conspiring with a foreign enemy. Collective action on a global scale is needed to overcome coronavirus. But overusing the war metaphor splits people and nations apart.

Photo by Stijn Swinnen on Unsplash

It goes beyond that too. The war metaphor makes it easy to ignore the death and destitution the virus causes. When battles are won with courage and bravery, those who succumb to the virus must have lacked fight. People do not die because they are without a certain quality. They die because a virus attacks their lungs and makes it impossible to breathe. Delivery workers, shop staff and healthcare professionals are called heroes. This verbal admiration, echoed praise, is not backed up by material action though. Those stocking shelves and delivering food packages are not given hazard pay or sick leave. Those working in hospitals and care homes are not given the protection they need. But by recognising them as heroes we can be blinded to what they actually require. We can pat ourselves on the back after a weekly round of applause for the health service or talk about minting medals. This does not help and is only possible because we are employing the language of war when we talk about this healthcare crisis.


We need metaphors to help us make sense of the world. In First You Write a Sentence, Joe Moran says that ‘metaphor is how we nail the jelly of reality to the wall.’ War is not a reality for many of us in the west. We know it through movies and videogames. That creates a false impression of what must be done. It creates sides and divides people. It underplays the seriousness of things. The coronavirus pandemic is not a war but a healthcare crisis. It is an unneeded metaphor. Plague is as much a rider of the apocalypse as war is. When a metaphor gets repeated too often, when it becomes a runaway metaphor, it risks obfuscating the real nature of things more than it clarifies them. We have reached that point by now. We must see this pandemic as it really is and face it together with the understanding that gives us.

Writing in pencil —

Writing in pencil

The pencil can be mightier than the pen

Writing tools exist in a hierarchy of prestige. Their place in executive offices, at high-profile public signings and as gifts that mark the stages of life makes the fountain pen king. Below it come ballpoints and rollerballs, gel pens and biros. The pencil, humble and basic, looks down only on crayons – if those can even be called writing tools. The pencil is the first writing tool we use and the first abandoned. We graduate from tracing specimen letters and rubbing out our first wobbly attempts in trails of graphite early. Our scrawls, early efforts at cursive, are replaced by more masterful strokes drawn in ink. We go from the grey of our first pencils to the blue ink of the schoolroom before we are urged to match the seriousness of the adult world in sombre black. This progression has us leave behind an evocative and tactile tool. Since taking up pencils again, since starting to write for pleasure, I have fallen in love.


My pencil case and desk drawers are full of three kinds of pencil. Yes, there are the strays, those branded pencils whose origins you’ve forgotten, but I write with ones I have sought out and bought. The Ticonderoga is my workman. Its name recalls the woods of a newly settled America, a wild Thoreau-like landscape. It is familiar. Its yellow and green livery is remembered more from schoolrooms on television than from any real-life experience though. It is for teasing out an idea, a thought, capturing it quickly before it disappears. It is a composer of first drafts. Robust and quotidian, it leaves behind a line more grey than black. The pressure needed to make your mark is not inconsequential; it feels like there is some work going into your writing. Essays and jottings are constructed rather than flowing, which matches my experience of writing out an idea more fully. It admits mistakes. Not full rewrites or final edits, but its crowning eraser, the only lasting innovation in the four-century history of the modern pencil, does let you catch those better words or phrases that pop into your head the instant you see an inadequate cousin on the page. The Ticonderoga implies that things are not yet done.

Things progress. Drafts are rewritten. Your text begins to come together. To help marshal ideas and expression I turn to my Tombow Mono 100s. It is a smoother writer than the Ticonderoga. It keeps its point longer and writes in a more authoritative black. Its purpose is picked out in gold on black. It is for ‘hi-precision DRAFTING’. It is serious. It tidies and straightens, evens out and tunes up.

Humans create personalities. Our tools and appliances each have their own quirks. We imbue them with character. How we found them, where we found them, their branding, context and our own experience inform this character. There is some shamanesque magic in tools. Maybe it is how it focuses the mind on different things, maybe it is just an overactive imagination, but I can’t help but feel it. It is why I enjoy using pencils, using different ones.

The final one I keep close at hand is the world’s most famous pencil. For people who see a pencil or pen just as what they are, who have the good sense not to create personalities or mythologies for what is merely graphite encased in wood, that may sound strange. But the Blackwing 602 is celebrated around the world. Authors and writers praise it. For good reason too. The 602 sings across the page. The gruff grit and grumble a pencil makes as you drag it along the page is tuned and smoothed by the 602. When you’re ready, after the thinking and rethinking, after the edits and drafts, the 602 is there. It flows. It adds flourish. It is eccentric. Its cuboid eraser is weird. Its boast of ‘half the pressure, twice the speed’ is arrogant. Yet, with its wax and graphite mix, it can carry it off.


Pencils are humble but powerful. They are your path into the whole written world. Writing gives our thoughts an existence outside of ourselves. The connection between pencil and paper is closer than that with pens. It has a friction, a noise, a rough and bumpy physicality that is not matched by silky ink. Pencils mark our progress. They get dull; our minds need sharpening. Roald Dahl used to start each writing day with six freshly sharpened Ticonderogas. When they were worn and their points were dull and thick, the work must have been done. I still like to write with pens. But there is a permanence to ink that I am not ready for when I am journaling or setting out on a piece. A pencil, its smells and sound, its reflection of the work put in, its allowance for mistakes, is the perfect tool. A pencil gives its life for your writing. It gives it life.

Why we write — April 8, 2020

Why we write

And why it should be slow

Photo by Mike Tinnion on Unsplash


Simple questions sometimes have simple answers. But it is rare. A child’s questioning can reveal complex gears at work beneath a seemingly simple surface. Simple questions often have many answers too. Answers vary not just based on who you ask, but when you ask someone and where.

The earliest writers lived in Sumer. They spoke a long-dead language. We can still hear their voices though. What they were saying, at first, was not particularly interesting. If asked why they wrote, they would probably have said that writing is permanent. Across gaps of space and time, hundreds of miles, thousands of years, we can still read their land deeds and inventories. Theirs are stories built of lists and legal contracts. They wrote to remember and be remembered.

Stories have always been part of us. We can’t help telling them. We can’t help listening to them. They filled the air around the dusk-lit campfires of our hunter-gatherer ancestors and they are with us still on the Netflix-bathed sofas of 21st-century apartment blocks. We search for and fill our lives with stories.

Photo by Kevin Erdvig on Unsplash

It took a long time for storytellers to use the linguistic innovation of writing; to fasten breath in clay and ink. The oldest stories we can still read have all the hallmarks of a previous oral tradition. But we can only hear them now because of writing.

The answer the Sumerians might have given, the reason we can still hear the story of Gilgamesh, is the reason we write. We write because it is crystallised conversation, a lasting thought. But writing has also given us a mastery over our words and sentences. The rote phrases of Homer gave blind rhapsodes time to fill poems with their imagination, fitting their flourishes into metre and scheme. Writing allows us to edit. We can educate and train our sentences before we send them off into the world.


I write because it is easier to express myself in writing. The ‘likes’ and ‘umms’ of conversation are erased. There is no l’esprit d’escalier. I can snatch a fleeting thought and fix it still. I can poke and prod and mould it into a better version of itself. I take my time with my thoughts. I improve them. I turn them over, examine them, polish them and know my mind a little more fully.

Writing gives me time with my thoughts. I used to think this slow way of writing, with not a lot of published work, was a failure. I bought books, read articles, and watched tutorials that promised a better way. A faster way. But I could never write as quickly as they urged me to.


I have been able to build my life around words and sentences. I wanted to write more quickly because it would allow me to do more of what I loved. I have worked as an editor improving other people’s sentences. I have freelanced, writing copy and articles for blogs and businesses. I have written mass emails and marketing copy. In my spare time, I keep a journal, I have written private poetry and published my own articles on my blog and Medium. In all of these forms, I have rewritten, edited, polished and improved my initial thoughts. They have come out the better for it.

Photo by Steve Johnson on Unsplash

Online writing advice tells you to write every day, to publish every day. Build an audience and an income. I write every day. That is good advice. But I publish infrequently. Writing slowly allows you to know your thoughts better. It improves your thinking and your writing. Writing out my first few drafts in longhand with a pencil gives my sentences physicality. I can work them with my hands, slow and methodical, like a craftsman, an artisan, savouring the satisfaction of a well-made artefact. Writing becomes an act of defiance, against the world’s demands of always more, always faster and against my own stupidity.


There are many reasons to write. To inform, to educate, to entertain, to convince, to seduce. And we are all writers. Offices, universities, studies produce millions of sentences every day. Each email that pings into your inbox, each word on packaging and every advert tempting you to take your money out of your wallet has been placed there by a human hand. But we are not machines for pumping out word after dull word. There is a craft to writing that can’t be reproduced in the suffocating air of a sweatshop. Writing should revel in the telling, not just in what is told. It lets us build more beautiful sentences.

There are many reasons to write. Every person will have a different one. I write because it allows me to sound smarter than I am. It allows me to explore the lines of thought that crackle across the synapses of my brain, that fizz for an instant before they are gone. I can only do that by slowing down. I won’t ever be prolific. I won’t ever amass thousands of readers hanging on my every word. But I will develop my craft. That is something, however small, to prize.

The battle to deliver Britain’s food — January 14, 2020

The battle to deliver Britain’s food

Food delivery services are everywhere across the UK – but can one dominate?

The list of food wars in the UK is long and slightly mind-boggling. You had the Ice Cream Wars, where Glaswegian drug-slinging ice cream van drivers fought a turf war across the Scottish city. There were the Cod Wars fought between Icelandic and UK fishermen. One person was killed, another injured and countless fish lost their lives in this conflict. There was the Bean War, when a shop ended up selling a tin of beans for minus 2 pence in a race to cut prices and attract shoppers.

Another food war is heating up in the United Kingdom. This time it isn’t centred on one type of meal. Instead, it is fought over who gets to deliver Britons their food. It is a three-way fight, with a native British start-up, an old dotcom-style business, and a massive foreign invader. The stakes? Cornering a market that is already worth over £10bn.

The British takeaway and food delivery market has a long history. It probably goes further back than the traditional fish and chips, but that dish really kicked off the whole thing. From punters lining up, ordering their food and bringing it home with them, it evolved into the branded and fully integrated delivery system most obvious in institutions, however culinarily respectable, like Domino’s and Pizza Hut.

Until only a decade or so ago, this was the way things were done. A customer would get a menu through their letterbox. They’d ring up the restaurant and order. The food would be delivered by someone employed by the restaurant. Straightforward and simple. This system kept people fed with countless curries, pizzas and chow meins.

The internet changed that.

Just Eat emerged in 2001. It allowed small, independent restaurants to be found online. Just Eat wouldn’t deliver the food themselves but would provide the platform for online ordering. Coming just after the dotcom bubble burst, its ambitions were limited and it never attracted the astronomical valuations that have characterised the current tech boom.

Next on the scene were the delivery firms. They would provide the same platform as a Just Eat or Hungry House but would also process and deliver orders. Their fleets of underemployed cyclists and scooter drivers can be seen all over London. Deliveroo, the brainchild of an American transplant in England, was created in 2013 and is one of the few European unicorns that can rival its US counterparts.

Those US counterparts eventually arrived in the UK too. Uber Eats is the third of our cast to launch. In 2016, the ride-hailing app launched its related food delivery service. These competitors have cornered the market in Britain. They have expanded without hurting each other too much. Now, though, the scene is set for a showdown between them all.

Who will win?

The tactics are the same as in other business wars fought out in Silicon Valley. Outraise your opponents, grow rapidly while burning through cash and hope you’re the last one standing. Uber Eats has experience in this type of knock-down, drag-out fight. Its parent company is skilled at losing tons of money in the hope of future dominance. Uber’s core business has yet to turn a profit, but it still enjoys a market capitalisation of $45bn.

It is skilled at raising money. Before it went public in early 2019, it had wrung out nearly $25bn in private funding. Those deep pockets will need to be employed against Deliveroo, which has also raised barely believable amounts of money. After rebuffing a takeover attempt by none other than Uber, Deliveroo managed to raise $575m from Amazon.

It needs it too. In 2018 Deliveroo lost £232m. Uber Eats’ own losses aren’t separated out from its parent company’s. It is safe to assume that it isn’t profitable, though, even if it makes more of a return than Uber’s ride-hailing service.

The outlier here is Just Eat. Already an old horse, at least by the small timescales of tech companies (it is three years older than Facebook), it was born in a time when investors were actually looking for a business to make money rather than just burn it.

Just Eat had a net income of £101.7m in 2018. But that is down from 2016’s £115m. That squeeze on profits is also being felt by its European competitor takeaway.com. Those two are in talks to merge, creating a company that can go toe-to-toe with Uber Eats and Deliveroo.

That points to Just Eat as the eventual winner. It already is a winner. It makes money, even after it moved closer to its competitors by offering food delivery fulfilment in 2017. That’s a notoriously hard place to make money, as attested to by Deliveroo’s results as well as by Grubhub, a US service similar to Just Eat:

[We] don’t believe now, that a company can generate significant profits on just the logistics component of the business.

Things are, alas, more complicated than that.

Uber Eats and Deliveroo can outspend and undercut Just Eat. They can strike exclusive deals with major brands, keeping diners locked into one platform for, say, McDonald’s. Uber already fulfils 10% of UK McDonald’s orders.

Just Eat has enjoyed a long relationship with small, independent restaurants. That could be its competitive edge, but Deliveroo is also crowding into that market. All these firms are becoming increasingly similar to one another. Deliveroo is acting as a platform, similar to Just Eat’s early days. Pick-up options, where diners go and collect their own food, are now offered on all services. That means the platforms still make commission without having to pay those pesky delivery people.

What’s next?

Virtual brands and dark kitchens are the current trends taking over the industry. Virtual brands are ones which live entirely online. They don’t have physical locations. They can be set up easily, cheaply and quickly in markets that might be missing a certain cuisine.

The real power of data is shown here. If there is a location with a lot of people searching for sushi dishes but there isn’t a sushi restaurant able to serve them, then a virtual brand could be created.

They would be placed in dark kitchens. These are kitchens that cater solely for the delivery market, housing different styles of food and brands in a place not open to the general public. They save on rent by not needing to be in high-footfall areas and make it more efficient for couriers to collect and deliver food. Travis Kalanick set up a business providing dark kitchens after he was ousted as Uber’s CEO.

Dark kitchens and virtual brands are just two efficiencies these firms can pursue. They each have a stable of couriers working for them, but their busy periods are all clustered around the same times. Dinner is the most active period, as you can imagine, with a smaller spike of orders around lunch. Outside those windows, couriers aren’t earning money for either themselves or the company.

So why not deliver other things? Uber Eats and Deliveroo are expanding into grocery deliveries. Maybe they could become the new milkmen as well. Or deliver purchases from shops to customers within a short period of time.

That’s because, at the heart of it, Uber Eats, Deliveroo and Just Eat aren’t food delivery companies. They are marketplaces with point-to-point logistics divisions. The winner will be the one who can expand quickest, get the most customers and harvest their data most effectively. That data is the key.

It can be used to provide a new virtual brand to an area where it will be successful. It can be used to predict flurries of activity or forecast future demand. Sell that data to the restaurants they work with and they can build up relationships which act as moats against their competitors.

Right now, Deliveroo is in last place. It is a UK-based company, which makes it harder to reach the willing, rich investors of US venture capital. It is operating in the same way as Uber Eats without the support its competitor enjoys. If Amazon continues to back it, though, and builds out its food delivery service into a broader delivery business, it may just come out on top. But then it wouldn’t be Deliveroo; it would be Amazon Delivery.

Just Eat is the most interesting of the three. An older company, already profitable, but willing to try new things, it may just be my favourite to come out on top. The merger with takeaway.com is a smart move. They can share expertise, explore new markets and raise money better as a larger company.

Just Eat also has a much better reputation than either Deliveroo or Uber Eats. They aren’t in the press often. They don’t have news stories about underpaid workers or data privacy concerns. They don’t have to run adverts about how safe their service is (as Uber has had to do).

So, who will win the food delivery war? My bet is on Just Eat.

What happens now? —

What happens now?

What’s in store for British politics following the election result

Originally written: 19/12/19

After a five-week campaign with politics and politicians taking up every nook and cranny they can squeeze themselves into on your television, radio or smartphone, polling day gave us all a respite. Broadcasters in the UK are barred from reporting on campaigning details during polling day. From just after midnight until the polls closed at 10 PM, we had a welcome break from politics.

But that reprieve was only very brief.

The first real indication of the result came with the exit poll. On the stroke of 10 PM, as polling stations closed around the country, it revealed a surprising Tory majority.

It wasn’t close either.

The predictions had the Tories winning around 360 seats, with Labour slumping to below 200.

Boris Johnson has been returned to 10 Downing Street with a strong majority

So, what happens now?

Brexit

The issue which dominated and defined this election was Brexit. The Tories won with their simple message of ‘Get Brexit done’. Labour floundered with their renegotiate, referendum, remain position, while voters saw the Liberal Democrats’ position of revoking Article 50 as undemocratic.

Mr Johnson now has the majority he needs to deliver his version of Brexit. A version of his deal was put to the last Parliament; now the withdrawal agreement, similar to Theresa May’s apart from the customs border it creates between Great Britain and Northern Ireland, will be fast-tracked through the House of Commons.

Legislation that will take the UK out of the EU will be put before the House this week for its second reading. The process will be a lot smoother than previous attempts. John Bercow, who was willing to let backbenchers and the opposition control the business of the House, is no longer Speaker, and even if he were, the government now has enough MPs to control the House’s timetable.

Mr Johnson will want to deliver on his promise of Brexit by January 31. There will then be a year’s worth of negotiating during the transition period, before the UK formally leaves the trading bloc and Brexit is done.

Scotland

As the UK comes out of one union, another could be under threat. The Act of Union was signed in 1707, uniting the kingdoms of England and Scotland. After over three centuries it could be close to its end.

When Hadrian set the northern limits of the Roman Empire, he chose a boundary close to what would become the border between Scotland and England. England was then very much part of a wider Europe; Scotland was locked out. Into the Middle Ages, though, Scotland would appeal to the continent, through the Auld Alliance with France, to secure its independence from its more powerful and prosperous southern neighbour.

Scotland being locked out of the EU was one of the major arguments for preserving the union during the 2014 independence referendum. But now, with the UK set to finally leave the EU, Scottish nationalists are once again using the ability to keep close ties with Europe as a major argument for independence.

The Scottish National Party won 48 of 59 seats in Scotland. With strong support for the European Union in the northern kingdom (62% voted to remain in the EU at the 2016 referendum), there will be another push for an independence referendum. Nicola Sturgeon, the SNP’s leader, has also been focusing on the Conservatives’ policies and the damage they could do to Scotland.

Another referendum has been rejected by Mr Johnson, but it is a battle that will continue and intensify over the coming weeks and months. How the government deals with the push for another vote will be a major theme of this parliament. One thing is certain, though: Mr Johnson does not want the union to dissolve under his leadership. With opinion polls in Scotland balanced on a knife edge between independence and keeping the union, however, it is something that may be beyond his control.

Cabinet reshuffle

A new majority will allow Mr Johnson to shape his cabinet in his image more than he was able to in the last parliament. There are already a few positions he has to fill. Nicky Morgan stepped down as an MP, vacating her role as culture secretary. Alun Cairns will also have to be replaced as Welsh secretary, and Zac Goldsmith, who lost his seat in Richmond Park, has left his environment minister role empty. These positions will be filled in the coming days before a larger reshuffle takes place in February.

So far, only Sajid Javid, the Chancellor, has been assured of his position. Five members of the cabinet have already been identified as at risk of losing their positions: Thérèse Coffey, the work and pensions secretary; Andrea Leadsom, the business secretary; Liz Truss, who runs the international trade department; Jacob Rees-Mogg, the leader of the House of Commons; and Julian Smith, the Northern Ireland secretary.

Several departments could also be folded into other ones. The Department for Exiting the EU could become part of the Department for International Trade, while the Department for International Development could be subsumed by the Foreign and Commonwealth Office.

One person being touted for a return to the Cabinet is Penny Mordaunt. She was fired from her position as defence secretary when Boris Johnson became prime minister. But now her loyalty from the backbenches could be rewarded with a return to the big table. Michael Gove is also set to get a bigger role.

Budget

Finally, a budget is expected in February to help deliver on the non-Brexit related campaign promises of the Conservatives. The NHS is set to receive increased funding, to the tune of £34bn, while cuts in the number of police officers are set to be reversed. Austerity meant an axing of 20,000 officers but the new budget may start a recruitment drive to bring those 20,000 back. A pledge to retain and recruit 50,000 nurses will also have to be funded.

A tax cut, by raising the threshold at which people start paying National Insurance, was also a key pledge of Mr Johnson on the campaign trail, and the February budget is predicted to include it. Social care will receive an additional £1bn, and education investment is set to increase, with Mr Johnson having promised to raise the minimum funding per pupil.

MPs return to Westminster on Monday. They will enter a very different parliament than they left. A Conservative majority, the largest since the 80s, will mean that Boris Johnson has a lot more room to manoeuvre. Now, it is up to him to deliver on his election promises.

Where are all the European unicorns? — September 1, 2019

Where are all the European unicorns?

And does it matter there aren’t that many?

“The problem with the French is that they don’t have a word for entrepreneur.” — President George W. Bush

Smartphones make our lives easier. The whole world is shown to us, and mediated, through the shiny screens that nearly all of us carry around. Ordering a taxi, discovering new music, staying in touch with friends or colleagues, and booking your next holiday are all done quickly with a few taps. And a lot of money has been made making life easier.

The companies behind these technologies have become massive. They have grown from small start-up teams with only a good idea to private firms worth over $1bn. These are the unicorns. Some have become public and are now worth tens, if not hundreds, of billions of dollars.

Uber, Facebook, and Amazon all emerged in the age of the internet and are now worth over $1.5trn together. The smallest of that trio, Uber, alone is worth $55bn.

One thing, though, stands out. In the unicorn stable, only a few were bred in Europe.


Companies founded in the US since 2000 are now worth $1.37trn; in China, they’re worth $675bn. In Europe, that figure is only $240bn.

Europe is as large a market as both the US and China. There are more people in Europe than in the US and nearly as much money. It is not as simple as size. The true causes are historic and cultural as well as prosaic and regulatory.

One theory for the lack of European unicorns is that Europeans are just not as entrepreneurial as Americans or Chinese. They don’t have the ambition to create massive companies. That viewpoint is summed up by that famous (though probably fake) quote of President Bush. And it is a viewpoint that is wrong.


There are plenty of ambitious and talented people in Europe. It has traditionally been the birthplace of many big companies. On the Fortune Global 500, the list of the largest companies in the world, 160 were founded in Europe; only 132 were American-born. Traditionally, then, Europe hasn’t had a problem producing ambitious individuals.

University campuses across the continent are full of students dreaming of founding great companies, shaping the world, making a difference. From Israel to Ireland there are entrepreneurs-in-waiting, fantasising of making it big, becoming the next Bill Gates or Elon Musk.

Musk himself wasn’t even born in America. Many unicorn founders weren’t. Adam Neumann, the founder of WeWork, was born in Tel Aviv, Israel. Patrick and John Collison of Stripe came from Tipperary in Ireland. That’s only three. There is Russian-born Sergey Brin of Google, Ukrainian Jan Koum, who set up WhatsApp, and France’s Renaud Laplanche, who founded Lending Club. Mikkel Svane moved Zendesk from Denmark to the US after early funding gave it the money to do so. Zendesk is now an American unicorn, despite its start in Europe.


Silicon Valley attracts startups, whether they’re European or American. It has the conditions that founders are looking for. These founders don’t leave their countries to make it big somewhere else just because. They are coming to Silicon Valley for something.

Part of that is that many startups have found success there. It is where people with the skills that founders need gravitate to. It is a black hole, sucking in all those dreaming of building a unicorn. The talent pool is massive and it makes sense to start there. Success breeds success.

There are other reasons Silicon Valley provides a good environment for start-ups beyond the plentiful talent. There is a support structure for when things get tough, with lots of mentorship and advice from those that have gone through it before. That type of knowledge, concentrated in one area, is powerful.


London is the most successful startup hub in Europe. Of the roughly 70 European unicorns, 17 can be found in London, more than anywhere else. Many of the big ones, such as Monzo, Revolut, and TransferWise, specialise in financial technology.

They are disrupting how money is stored, used, and moved around the globe. The talent they need to do this could only be found in London, one of the biggest financial centres in the world. Talent attracts talent.

Talent also attracts money. An investor looking at two different companies, everything else being equal, would lean towards the one better placed to use its money. Unicorns chase growth: they burn money to grab a large share of the market and depend on scale for profitability. With the talent pool already in place, Silicon Valley startups are better placed to burn that cash and reach a profit-making scale.

Investors want to be close to their investments. It makes more sense to base yourself in Silicon Valley, or at least in America, than in Europe. The constant gossip of the Valley helps them find new startups to back. Flying to Europe all the time to hunt for new opportunities, or to keep an eye on a startup and offer it advice, isn’t feasible. The money stays in Silicon Valley and is worked hard.

There is more money looking to invest in American startups as well. The success of previous unicorns has made some investors bolder, wanting to get on the next big thing before it takes off. Compared to a more conservative investing market in Europe, the US has around 14x the capital looking for a tech startup to invest in, according to Siraj Khaliq of Atomico, a European venture capital fund set up by Niklas Zennstrom, one of the co-founders of Skype.


With a much tighter financing market, European companies must focus on earning revenue quicker than their American counterparts. There isn’t as much money going around, and they have to make the most of what they have. Growth and potential are behind much of the valuation of US companies, but it can’t be such a focus for European ones.

This allows US competitors to snap up European companies before they can make it to the truly big time. Apple acquired Shazam in 2017 for $400m and has incorporated it as a key part of Apple Music.

Alphabet has gobbled up Belarussian AIMatter, which built a product that allowed users to transform images and videos in real-time. Booking.com, a Dutch startup, might have been worth around $50bn today, but it was acquired by Priceline, another American company, for $113m before it had the chance.

Supercell, the Finnish games company behind Clash of Clans, is owned by Chinese giant Tencent. Supercell was first acquired by SoftBank, the Japanese venture fund, for $1.5bn. After only three years, in 2016, it was sold to Tencent for $10.2bn.


European startups can make it big; they just rarely make it big throughout Europe. While the European Union has done a lot to bring the continent together, such as harmonising banking regulations, which has made it easier for fintech firms like Revolut to access a large market, there are still national divides.

Zalando is a German online shopping platform worth more than $14bn. It is mainly active in the German-speaking part of Europe: Germany (of course), Switzerland and Austria. It has hardly made a mark outside its home territory.

A US company is just American, a Chinese firm is only Chinese, but a European one can be British, French, German or dozens of other nationalities. All countries that have historically not got along. National pride still acts as a barrier after regulations have been torn down.


There is one more factor to consider. What if American unicorns are being valued far higher than they should be? That would mean that the sober European investors are better at pricing a company and not getting overhyped about a good idea that might not pan out.

There is some evidence for this. The recent IPOs of Uber and Lyft show that some valuations are overly optimistic. Uber went public in May with a share price of $45. It has slid to $32. It is a similar story with Lyft. It debuted at $72 and has declined to $50. The public markets are valuing these companies below what private investors were.

Even Slack, which had a well-received IPO, has taken a hit. Shares are trading at around $30, instead of the $38.50 they started at. Investors want to back the next Facebook or Google. They can fool themselves into thinking that the hyped private tech firm could be it. Each of these unicorns is unique: they are often the only big company pursuing their particular idea (Lyft and Uber being the exception). If an investor thinks the idea is a good one, can work at scale and earn massive amounts of money, they are more likely to get involved. A look at the numbers behind the brand may dim that initial enthusiasm, though. That could be what’s happening in the more risk-averse European market.
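For a rough sense of how far below their debut prices those shares were sitting at the time of writing, here is a quick back-of-the-envelope check in Python, using only the prices quoted above:

```python
# Fall from debut price to the prices quoted above.
debut_vs_current = {"Uber": (45.00, 32.00), "Lyft": (72.00, 50.00), "Slack": (38.50, 30.00)}

for name, (debut, current) in debut_vs_current.items():
    fall = (debut - current) / debut
    print(f"{name}: down about {fall:.0%} from its debut price")
# Uber: ~29%, Lyft: ~31%, Slack: ~22%
```

Falls of a fifth to a third within months of listing suggest the private valuations were on the optimistic side, which is exactly the point.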


Europe has plenty of problems, though building unicorns isn’t one of them. There are plenty of exciting tech companies in the Old World, but the unbridled enthusiasm of the States is not present. There may be fewer eye-catching headlines, but it makes for a more stable market. There are as many European unicorns as there should be.

Should Amazon spin off AWS? — August 26, 2019

Should Amazon spin off AWS?

A breakdown of why some people think Amazon should spin off AWS — and why some say it shouldn’t.

Amazon is two companies. The first is the one we all know about. It is the marketplace, the place to go to buy and sell on the internet. Buying a Prime membership gives you access to a lot: free shipping, exclusive savings, deals, and streaming video and music services.

The last of these, the streaming services, gives us a glimpse of the other business that makes up Amazon. It takes advantage of Amazon’s logistics expertise, one of its main competitive advantages, as noted even by its leadership.

Amazon uses a lot of bandwidth. But it doesn’t use it all the time. It was better and cheaper for Amazon to build its own data centres than to rent that capacity from someone else. So, what could it do with that infrastructure when it was not being used?

The answer was simple: make money by renting it out. This was the beginning of Amazon Web Services (AWS). It began in 2006, and since then it has exploded. It is now the largest cloud computing provider in the world.

AWS provides the backbone for some of the biggest online firms in the world. It lets you binge-watch terrible movies and engrossing series on Netflix. It enables you to watch as streamers play games on Twitch. AWS helps you book your next holiday on Airbnb. That focus on selling to enterprises, rather than consumers, is one of the significant differences between AWS and Amazon.

Another difference came in 2015, when AWS began reporting its financial results separately from the rest of Amazon. That gave us a glimpse of how much Amazon relies on it.

During the second quarter of 2019, AWS took in $8.38bn. That represents only around an eighth of Amazon’s revenue of $62.4bn for the same period. But its operating income of $2.1bn was over two-thirds of Amazon’s $3.1bn.
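As a quick back-of-the-envelope check on how lopsided that split is, here is the same arithmetic in Python, using only the figures quoted above:

```python
# AWS's share of Amazon's Q2 2019 results, using the figures quoted above (in $bn).
aws_revenue, amazon_revenue = 8.38, 62.4
aws_operating_income, amazon_operating_income = 2.1, 3.1

print(f"Share of revenue:          {aws_revenue / amazon_revenue:.0%}")                    # ~13%
print(f"Share of operating income: {aws_operating_income / amazon_operating_income:.0%}")  # ~68%
```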

Amazon without AWS wouldn’t be as big or valuable as it is today. That has led many to ask: shouldn’t AWS be its own company?

Some influential people in the industry think that Amazon should spin off AWS into its own firm. Number one among them is Scott Galloway, a professor at NYU and a tech industry analyst. Another is Mark May at Citi Research.

There are plenty of reasons why it might be a good idea.


Regulators are looking at Amazon. Right now, there are three significant investigations in the US looking at the tech industry giants. One was launched by the Federal Trade Commission, another by the Department of Justice and a final one by attorneys-general of around twenty US states.

Spinning off AWS might take the heat off Amazon. It wouldn’t have as much power in the market. It wouldn’t be the dominant player in two massive sectors, e-commerce and cloud computing. It could save Amazon billions of dollars in fines or restructuring costs. It might be forced to do it anyway, so why not do it now?

AWS dominates cloud computing. The other two big players in the industry are Microsoft’s Azure platform and Google Cloud. None are separate companies. Investors have nowhere to put their money if they want to bet on only cloud computing.

If you don’t think digital advertising, e-commerce or operating systems will grow as much as cloud computing, tough luck: you’re saddled with an investment you only half believe in.

If AWS goes its own way, it will hoover up all those investors who believe in cloud computing above everything else. This would boost the value of AWS beyond the hit Amazon would take by losing it. An AWS unshackled from Amazon could focus on cloud computing and dominate the market to an even greater extent.

AWS, right now, subsidises the rest of Amazon. It could use that money instead to invest in itself or pay dividends to shareholders. AWS’s operating margin, the share of each sale left over after the cost of goods sold and operating expenses, has been 23% since 2013. For the rest of Amazon, it’s 1.5%.
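To make that comparison concrete, here is a minimal sketch of the calculation with hypothetical round numbers; only the 23% and 1.5% figures above are real:

```python
# Illustrative operating-margin calculation with hypothetical figures,
# not Amazon's actual accounts.
def operating_margin(revenue, cost_of_goods_sold, operating_expenses):
    """Share of revenue left after direct costs and operating expenses."""
    return (revenue - cost_of_goods_sold - operating_expenses) / revenue

# $100 of sales, $50 of direct costs and $27 of operating expenses leave a
# 23% margin, in line with the AWS figure above; push total costs up to
# $98.50 and you get retail Amazon's 1.5%.
print(f"{operating_margin(100, 50, 27):.0%}")    # 23%
print(f"{operating_margin(100, 50, 48.5):.1%}")  # 1.5%
```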

Amazon uses that reliable money sitting in the bank to fund research. Between 2001 and 2009, covering a period when AWS wasn’t even a thing, Amazon’s research budget grew by around 19% each year. From 2010 to 2018, that figure was 42%.
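Compounded over several years, the gap between those two growth rates is dramatic. A quick illustration, using only the 19% and 42% rates quoted above (the eight-year window is my own choice, purely for illustration):

```python
# How the two annual growth rates quoted above compound over eight years.
low_rate, high_rate, years = 0.19, 0.42, 8

print(f"19% a year for {years} years multiplies the budget by {(1 + low_rate) ** years:.1f}x")   # ~4.0x
print(f"42% a year for {years} years multiplies the budget by {(1 + high_rate) ** years:.1f}x")  # ~16.5x
```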

New products, some of them failures, are continually emerging. The Kindle, Fire phone, Alexa and others are, in effect, paid for by AWS. If they fail, it is no big deal. Unlock that money and AWS would be one of the ten biggest companies in the world.

That’s what Scott Galloway believes. He has form when it comes to predicting what Amazon will do. He called Amazon’s purchase of Whole Foods, and he predicted where its new headquarters would be. Scott knows Amazon.

But others aren’t as convinced.


All those subsidies leaving Amazon would hit it hard. Those small margins from its retail business would stop Amazon doing what it wants to do: investing cash flow into growth rather than giving money back to investors as dividends.

A recession will hit Amazon, as a consumer-facing business, hard. If people have less money, they have less to spend on the non-essentials they shop for on Amazon. Enterprise customers, which rely on AWS as the backbone of their products, can’t simply stop buying from AWS: their costs are for essentials. Keeping AWS will limit Amazon’s exposure to any volatility in the market.

AWS also enjoys the cheap money Amazon can generate. Setting up a modern server farm is expensive. In the second quarter of 2019, AWS had a capital expenditure of $3.31bn. Amazon issued a 3-year bond in 2017, which yields 2.53%. China’s 3-year government bond has a yield of 2.8%.

When you can borrow money cheaper than China, you’re a business in a good place. AWS wouldn’t be able to raise capital at such a cheap rate. That would make its large capital expenditures a lot more expensive. This handbrake on growth would limit its chances at dominating the sector.


There are good arguments on both sides. But what will happen? What is the hot take?

The answer lies in Jeff Bezos’s philosophy.

He has set up Amazon to chase long-term growth rather than short-term profitability. If he wanted a quick buck, then AWS might well be spun off. But that isn’t the case. For the foreseeable future, AWS will remain part of Amazon.

When Bezos wants to cash out and rake in the piles of dollars that are waiting for him, AWS will become its own firm. It won’t happen before then. When it does, buying things on Amazon will become a lot more expensive.