Now Here Is a Curious Thing
My book group is deep into a series on new tech — smart machines, AI (artificial intelligence), big data — and the implications for our lives and society.
Our most recent reading has been The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future, by Kevin Kelly. Kelly was the founding editor of Wired magazine.
It is easy to think of the new tech as something “out there,” but of course it isn’t. It is part of all our lives in a host of ways, something that Kelly does a good job of showing.
Take this blog, for instance. One of Kelly’s 12 forces is “Flowing.” In it he talks about the way books (apart from revised editions) are a done deal. My books are, once published, finished. The canon — so to speak — is closed. This blog and each post can, however, be updated, revised, altered at any time. Much more fluid.
That said, a fair amount of the literature on the new tech has a kind of breathless, isn’t-this-amazing, nothing-will-ever-be-the-same-ever-again quality to it. It is a bit, often more than a bit, triumphalist.
As an example of this breathless quality take this paragraph from Kelly’s chapter on “Filtering.”
“The brilliance behind Google, Facebook, and other internet platforms’ immense prosperity is a massive infrastructure that filters this commodity attention.” Getting people’s attention, that is, is what internet platform filtering is good at. Kelly continues:
“Platforms use serious computational power to match the expanding universe of advertisers to the expanding universe of consumers. Their AIs seek the optimal ad at the optimal time in the optimal place and the optimal frequency with the optimal way to respond.” So advertising on the internet is, well, pretty optimal.
About the time I was reading this, Robert Mueller announced his indictments of a gaggle of Russians for trying to influence our 2016 election. Their project included reaching some 126 million Facebook users with fake news stories.
Curiously, Facebook executives immediately maintained that these ads and stories had nothing to do with the outcome of the election. Mark Zuckerberg said, “Voters make decisions based on lived experience” (not ads on FB). Facebook’s vice president of advertising, Rob Goldman, said of the ads, “Swaying the election was not their main goal.” In other words, don’t blame us.
So which is it? Do internet platforms “use serious computational power to match the expanding universe of advertisers to the expanding universe of consumers,” or do “voters make decisions based on lived experience,” so that the fake news so optimally filtered really didn’t make any difference? You probably can’t have it both ways.
My own take-away is that amidst all the very real technological change and innovation, which has both upsides and downsides, human nature remains much the same — that is, brilliant and flawed, capable of great nobility and highly capable of self-deception. When push came to shove, FB execs pulled an Adam: “The woman, she gave me the fruit of the tree.” Gee whiz, don’t blame us and our optimal filters.
The new technology changes a great deal, but not human nature. We still have to answer the most basic questions — who we humans are, what we are for, and how we are to live and live well. Which is a way of saying that the new technology, for all its triumphalist claims, does not override the importance of our best religious and moral traditions, healthy human cultures and families, and the rituals and relationships that give life meaning and purpose. It is these elements that are really at risk in our society today. For more on that, follow the link to my previous post and the Andrew Sullivan article cited there.