Why are Humans Obsessed with Predicting the Future?
How did we get here? And what does this mean for, well, our future?
Human beings are clearly different from other creatures on this planet. But why, exactly? And how did we get this way?
In the latest episode of This Anthro Life, host Adam Gamwell explores these questions with serial entrepreneur, futurist, and author Byron Reese as they discuss his latest book “Stories, Dice, and Rocks That Think: How Humans Learned to See the Future--and Shape It.”
Check out the episode here: How Humans Learned to See the Future - and Shape it with Byron Reese.
I’ll admit it. I’m pretty obsessed with the future.
I can’t help but wonder what’s coming down the road, whether it’s planning for the upcoming week or guessing what social media controversy will erupt during the next midterm election cycle. Collectively, we’re all so obsessed with the future that there’s been an explosion of mindfulness apps, tools, and wellness discounts from insurers to help people deal with the anxiety that comes from constantly living in the future (or the past, for that matter). Businesses are becoming obsessed with foresight: how to predict which consumer behaviors or cultural trends will take off and shape the future of a category. To some, traditional research methods like surveys and interviews can feel too slow in the race to get predictive data faster than the competition. In short, as a cultural species, we can’t seem to wait to get to the future - and to know how we’re gonna get there.
Talking with Byron Reese helped me see storytelling in a way I hadn’t before: one of its functions is a kind of dress rehearsal for what might be. Stories take us out of the present so easily. Even the simple and well-known opening lines “once upon a time,” or my favorite, “a long time ago, in a galaxy far, far away…” are designed to transport our minds to another time and place. Have you noticed that stories are almost always told in the past tense? As if the future has already happened.
What’s really interesting, as Reese illuminates, is the way storytelling operates as a cultural tool for predicting, seeing, and shaping the future. It becomes the bedrock through which we develop new languages for calculating and extending our ability to see and predict the future - namely, the language of mathematics and probability theory, and then the technologies we’re grappling with today: Artificial Intelligence, Natural Language Processing, Big Data, and the like. These technologies not only predict the future faster than any one human ever could, but do so in languages we don’t entirely understand (even though we created them).
How do we grapple with a hyper-calculator that tells us the future but operates in a language we can’t understand (at least not yet)?
This question is why I believe the second part of the book, on the rise of probability theory, is so crucial. Given our love of technology, we tend to focus on the latest and shiniest objects in AI and Big Data. But that’s because we’re used to, and take for granted, our ability to calculate probabilities. Math was never my strong suit in school (the curse of many social scientists?), so one of my blind spots in humanity’s cultural history is the role mathematics has played. (Or maybe I just needed a better algebra curriculum in middle school, who knows?)
Tracing this history opened my eyes to the fact that humanity grappled with the problem of prediction for centuries, and then, once it was unlocked, used this framework to shape society, for better and for worse: everything from government annuities, mortality calculations, and insurance to (for worse) eugenics.
This cautionary tale brings us back to the question of emerging technologies. Storytelling gave rise to what Reese calls the Agora, the metaphorical (or literal, depending on how you feel about it) societal superorganism made up of our shared knowledge, which enabled us to mentally time travel into the past and future. (Note: agora is the Greek word for marketplace.)
Might we think about culture as a superorganism?
Today, technological advancements have enabled the Agora to overcome the limits of its (our?) brain. The big challenge of AI today is: what makes good data? What are we teaching AI to learn from? Or, what stories is the Agora feeding AI, and what will the implications be? As Byron puts it, “eternal vigilance is the price of current and future technological advancements.”
It’s not that things will inevitably go off the rails, as so many of our late 20th and early 21st century sci-fi stories like to depict. For one, I highly recommend Becky Chambers’s hopepunk fiction: depictions of sci-fi futures where robots simply walked away from factory life and went to study nature. I think in our rush to know the future - and to beat the competition there - we forget our capacity to tell optimistic stories. An anxiety about getting the future right often overshadows imagining what we consider to be the right future, or the future we want.
In Reese’s rendering, language and storytelling broke humans out of the eternal present, opened up new languages (mathematics) for predicting the future, and unlocked advancements in how we capture storytelling material (data) and how we use it to predict. And I think it matters that, as he weaves this story, Reese shares that he is a humanist and an optimist. One thing I’m taking away from this conversation and the book is that storytelling is as much about our need and ability to imagine the future (or past) as it is to predict it. And in this in-between space, there is room to inquire about and shape the kinds of stories we want to tell.
And if you haven’t yet, check out the episode! How Humans Learned to See the Future - and Shape It with Byron Reese.
What are your thoughts on our ability, obsession, and desire to see and predict the future? What are your takeaways from this post or the episode? Let’s get a conversation going below.