AN OP-ED FROM THE FUTURE

How Can You Appreciate 23rd-Century English? Look Back 200 Years

by Gretchen McCulloch

Editors’ note: This is part of the Op-Eds From the Future series, in which science fiction authors, futurists, philosophers and scientists write Op-Eds that they imagine we might read 10, 50 or even 200 years from now. The challenges they predict are imaginary — for now — but their arguments illuminate the urgent questions of today and prepare us for tomorrow. The opinion piece below is a work of fiction.

When you think of communication back in the early 21st century, you probably think of it as the beginnings of the modern phone. But you may not realize that it’s also the origin point for many words and linguistic constructions that we’re still using now, 200 years later. I’ve been using the records at the Internet Archive to research the English of this fascinating historical era, and my research has led me to believe that we should take a more relaxed and curious attitude toward our own language changes in the 23rd century.

For example, did you know that there was a period between the 17th and the 20th centuries when English didn’t make a distinction between formal and informal ways of addressing someone? Shakespeare distinguished between formal “you” and informal “thou,” but our present-day distinction between formal “you” and informal “u” dates back only to the beginning of the internet age. How could people of this unfortunate era have had a true understanding of the Bard when they had no way to fully grasp the intimacy of the sonnets (“shall i compare u to a summer’s day / u are more lovely and more temperate”)?

I was also surprised to learn in my archival research that “lol” was once an acronym! Today, we’re all used to it being a simple indicator of double meaning, but it actually once stood for the phrase “laughing out loud.” I’ve even seen historic phrases like “it’s lol funny,” which only make sense if we interpret them as “it’s laugh-out-loud funny.” The shift from genuine to aspirational laughter seems to have been well underway by 2001, and the further shift to irony that we’re all familiar with today happened more gradually in the decades after. Strangely enough, there are even some records of people spelling it “LOL” or pronouncing it “ell oh ell” before it settled on the obvious “loll” pronunciation.

Several of our common words were once acronyms from this era, such as “omg” (“oh my god”) and “wtf” (“what the” followed by an expletive then as now unprintable in these pages — some things have not changed). (Another word that started as an acronym was “ok,” but it’s even older, from the 19th century.) The acronymic history of these words has been obscured by the fact that they quickly shifted into being written in all-lowercase so as to avoid confusion with all-caps for shouting, but there were once hundreds of words in this category, including the long-forgotten “rotflol” (“rolling on the floor laughing out loud”) and “hhoj” (“haha only joking”), as well as the deliciously archaic “afk” (“away from keyboard”).

The early 21st century was also a golden era for linguistic innovation related to using indirect constructed dialogue to convey actions and mental states. In speech, this era saw the rise of “be like” and in writing, the “me:” and *does something* conventions. (And I’m like, how did people even communicate their internal monologues without these?? also me: *shakes head* yeah I have no idea.) We now take these linguistic resources for granted, but at the time they represented a significant advancement in modeling complex emotions and other internal conditions on behalf of oneself and other people. Imagine being limited to the previous generation of dialogue tags, which attempted to slice everything into sharp distinctions between “said,” “felt” and “thought.”

Most intriguingly of all, I unearthed a psycholinguistic study from 2004 that examined how people conveyed sarcasm in writing when discussing fashion fails, but concluded that while participants did attempt to communicate written irony, the primary typographic resource at their disposal was the simple — and then still-ambiguous — dot dot dot. (“Oh wow, that dress is … ravishing.”) Similarly, a 2013 book by Keith Houston surveyed five centuries of philosophical proposals for indicating written irony, but in the end reported, “the irony mark (and, for that matter, the sarcasm mark) remains an elusive beast.”

Little did they know that barely a few years later, writers who had grown up with a rich inventory of typographic signals of importance or enthusiasm would develop the maturity to repurpose them into the detailed inventory of ironic double meaning that the world had been craving since a 1575 printer first proposed a mirrored question mark (؟) to distinguish rhetorical questions. A book from 2019 details the beginnings of such a list, including “scare quotes,” Pseudo-Important Caps, the ~ironic ~tilde, ✨faux-enthusiastic sparkles✨, s p a c e s t r e t c h e d d e a d p a n, and the. passive. aggressive. period.

So you’d imagine that early-21st-century people would have been really excited about this fascinating era that they were living in, right?

In my research, I came across so many doom-mongering quotes about how texting was ruining the English language, when we obviously now know it as a cultural renaissance in writing that ushered in the new genre of the textolary novel and other kinds of microfiction, not to mention creating now-classic nonfiction formats like the thread. (I drafted this op-ed as a thread myself, as any sensible writer would do, because how else would I stress-test each of my sentences to make sure they were all pithy and vital?) Not to mention, of course, that the internet as a near-free means of distribution opened up writing to a wider range of writers who no longer needed a publisher or distributor to share their writing with the public.

As ridiculous as the fears of the past seem, when I read them, I found myself seeing with new light the fears of the present. We’ve all heard the complaints about how the youths are communicating these days — many of us have even complained about it ourselves. But what will the people of the 25th century think, looking back at our 23rd-century rants about kids refusing to say “no worries” in response to “thank you”? Won’t they be totally accustomed to hearing “it’s nothing” or its even more reviled short form “snothin” by then? Perhaps these phrases will even seem old-fashioned, the way “no problem” seems stuffy and ironic to us, although people in the 21st century were still sometimes complaining about its novelty.

Or take emoshes, those little generated drawings that adorn our messages and are reviled by technological fearmongers the world over. Won’t our objections to them seem quaint eventually, as quaint as the archival diatribes I read about emojis? I’m sure you’ve seen complaints that kids these days will never develop artistic ability because the AI just completes their drawings for them! Or that custom-generated emoshes in response to your sketches make them too unsubtle and obvious compared to when you actually had to go find the correct emoji yourself! Maybe the kids will stop using words altogether because sending pictures has gotten so easy! It’s hard to believe, but in their day, 21st-century emojis were just as controversial, and the same criticisms reemerged in successive generations when emojis became animated and 3D.

It now seems obvious that emojis didn’t lead to any sort of decline in writing or language, even though people worried about this at the time, and so too will more frequent informal drawing lead to more kids developing their artistic abilities. After all, the invention of the camera didn’t kill other kinds of visual art, and A.I.-augmented drawing is just another tool at our disposal. It may even lead to a great expansion in drawing ability, as sketching becomes something we do every day as a conversation, rather than the exclusive domain of the artist, just like writing did in the early 21st century.

How arrogant of us to think that, amid all of the possible eras of the English language, it somehow peaked exactly one generation ago, in the 22nd century. How foolish the critics of those bygone years look in their disdain for their own century and reverence for the 20th or the 21st. How clear it is, from the perspective of history, that when we mythologize the English of a previous age, all we’re doing is creating a moving target that we can never quite hit.

We can break this cycle. We don’t have to wait until the 23rd century passes into history before we start appreciating its linguistic innovations. We don’t have to use language as a tool for demonstrating intellectual superiority when we could be using it as a way of connecting with each other.
