Goodbye, World.

On the necessity of death, through the eyes of a software maker

One of the first computer programs a fledgling software maker will write is a “Hello, World!” program. It’s a simple piece of code that tells the computer to say the words “Hello, World!” But the creation of this program (and others like it) is less about the program itself and more about the “Hello, World!” moment that a new software maker experiences when they write their first computer program. The software maker has just announced their arrival to the world, and in doing so has peeked behind the curtain to understand the magic of our marvelous modern machines: software.
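
In Python, for instance, the whole rite of passage fits on a single line (the details vary by language, but the spirit is the same):

    # The canonical first program: greet the world.
    print("Hello, World!")

Run it, and the machine obediently greets you back.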

But whereas learning how a parlor trick works often results in literal disillusionment, learning how computers work can feel like learning real magic. As journalist Clive Thompson describes it, “coding confers an astonishingly powerful sense of control and mastery. The machine does precisely and obediently what you tell it to.”1 And it is the “Hello, World!” moment that imbues into a software maker this sense of unbridled potential and power. It’s an experience, Thompson muses, that “taps directly into the thrills of programming, which are deeply Promethean.”

Software is undoubtedly powerful.2 And it makes software makers feel powerful as well. After all, we’re gods of the little worlds we create. Want to create Adam? useradd -m adam. Tired of his kind and need to start over? rm -rf /home. Ready to plunge the world into darkness? shutdown -h now. (Or to be really sure: unplug the power.)

It would be presumptuous to claim that software is the only field that inspires such hubris among its practitioners. Every novelist, painter, and songwriter has experienced their own “Hello, World!” moment: the realization that they are limited only by their imagination, the feeling that they can create worlds. Every Minecraft builder who has spent countless hours erecting palaces and cathedrals is clearly no stranger to such a feeling.3

Most creative works, however, aren’t good. A parent may put their child’s drawing up on the fridge, but good luck convincing them to do the same for another child’s doodle. Similarly, a “Hello, World!” program will likely be run only a handful of times by the software maker herself before being forgotten or deleted. Of course, there’s nothing inherently wrong with bad creations, especially if they’re used as pedagogical tools. But even among professionals, there are works that their creators would rather quickly forget.4

Over time, works accumulate: some are forgotten, while others persist in the public’s mind. The works that persist form a body of work that reflects certain trends in the tastes and values of society. What we choose to forget and what we choose to remember define the zeitgeist of the period.

With the rise of the Internet in the mid-1990s, for example, new programming languages like JavaScript and PHP were born. Languages like C and Lisp, despite being much more established, weren’t designed for the style of rapid web development needed at the time. Similarly, one of the most popular programming languages today is Swift, which Apple introduced only in 2014. Clearly, there are many software makers who want to take advantage of the massive popularity of Apple’s ecosystem. Time will tell how long a language like Swift will be around.5 The same applies to which software development methodologies are in fashion (agile vs kanban vs waterfall), which operating systems are in vogue (Android vs iOS), and which text editors are all the rage.6

Much of the time, these decisions aren’t terribly consequential.7 But every now and then, an objectively bad idea enters the public consciousness. And as James Clear puts it, “The best thing that can happen to a bad idea is that it is forgotten.”8 Unfortunately, people can be frustratingly stubborn in refusing to abandon their beliefs. At these times, the only recourse for putting a bad idea to rest may be to wait for enough people who hold that idea to die out. Planck’s principle captures this succinctly for the system of scientific knowledge: “Science progresses one funeral at a time.”9

Software makers have certainly indulged in their fair share of mistakes and bad ideas: online advertising10, the current state of content moderation11, and mandatory password changes12, to name just a few. Given how much technology permeates modern life, these bad ideas have far-reaching consequences. If we have the courage, we will confront them head-on, replace them with better systems, and allow them to die. If we fail to actively rid ourselves of these mistakes, our only recourse will be to await the death of older generations. More will suffer while we wait.

Humans prefer beginnings over endings. Beginnings arouse hope and optimism, while endings remind us of sorrow and regret. After someone passes, we may continue to celebrate their birthdays while conveniently forgetting about their deathdays.13 But we cannot forget that death is crucial to the evolution of a system. It is a necessary aspect of every field that utilizes human creativity and ingenuity. Hellos may be exciting, but we shouldn’t forget to also say goodbye.14

Thanks to Ken Chew, Jian Liu, and William Zhu for their thoughts and comments on this post.


  1. https://slate.com/technology/2019/10/hello-world-history-programming.html ↩︎

  2. https://a16z.com/2011/08/20/why-software-is-eating-the-world/ ↩︎

  3. https://www.wired.com/story/best-minecraft-builds/ ↩︎

  4. Everyone involved in the 2019 film adaptation of Cats, for example. ↩︎

  5. With any luck, PHP is now middle-aged and will die in a few decades along with the remaining PHP programmers. ↩︎

  6. https://xkcd.com/1823/ ↩︎

  7. There may be a correlation between how inconsequential a decision is and how likely it is to spawn a holy war. Disagreements on camelCase vs PascalCase vs snake_case, spaces vs tabs, and indentation style, for example, can lead to some rather testy arguments. ↩︎

  8. https://jamesclear.com/3-2-1/october-28-2021 ↩︎

  9. https://en.wikipedia.org/wiki/Planck%27s_principle ↩︎

  10. https://www.theatlantic.com/technology/archive/2014/08/advertising-is-the-internets-original-sin/376041/ ↩︎

  11. https://www.eff.org/deeplinks/2019/04/content-moderation-broken-let-us-count-ways ↩︎

  12. https://www.ftc.gov/news-events/blogs/techftc/2016/03/time-rethink-mandatory-password-changes ↩︎

  13. There certainly are celebrations of death, such as Día de los Muertos (Day of the Dead) in Mexico or 清明节 (Qingming Festival) in China, but these are the exceptions rather than the rule. Furthermore, celebrations of birth tend to be specific to an individual, while celebrations of death tend to be more generalized. The U.S. has three federal holidays that celebrate the births of Martin Luther King, Jr., George Washington, and Jesus of Nazareth (Christmas). Only one of the federal holidays is an observance for the dead: Memorial Day. ↩︎

  14. https://www.youtube.com/watch?v=rblYSKz_VnI ↩︎