If you live in modern times you've probably experienced Algorithm Anxiety—the feeling that your hard work to tailor a personalization engine will be easily undone. Engines that govern content recommendations are tailored by our interactions, even micro-interactions, whose signals are so plenteous they cannot be numbered.
I faced this feeling recently. I was accidentally logged into Spotify on the family iPad, and my beautiful nine-year-old kiddo decided to marathon CG5, BTS, and a bevy of Minecraft and Among Us-themed music.
It took me four years to perfect my Spotify.
And it took my daughter a single afternoon to ruin it. My recommendations are now irreparably damaged.
Algorithm Anxiety often leads us to depend more on the algorithm than on human recommendations, which leads to isolation. We've become conscious of the omniscient algo, and that consciousness, combined with our dependency on the recommendation engine, produces a latent, low-humming, unceasing anxiety.
The result? We're less experimental, reluctant to take suggestions, and inevitably become trapped in a genre prison.
Seeding and Serendipity
The average Spotify subscriber is acutely aware that their past listening habits shape future recommendations, but they may not realize the depth of the data that Spotify has on them. Spotify doesn't just track what you "heart", or like, but also how much of a song you listen to before skipping, as well as genre affinity.
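How might those signals combine? The sketch below is purely illustrative, not Spotify's actual model: the event fields and weights (`hearted`, `fraction_played`, the score values) are all hypothetical, but they show how explicit likes and implicit skip behavior could fold into a single affinity score.

```python
from dataclasses import dataclass

@dataclass
class ListenEvent:
    track_id: str
    hearted: bool           # explicit "like" signal
    fraction_played: float  # 0.0-1.0 of the track heard before skipping

def affinity_score(event: ListenEvent) -> float:
    """Combine explicit and implicit signals into one affinity score.

    Illustrative weights only: an early skip counts as a mild negative,
    a near-complete listen as a positive, and a heart as a strong positive.
    """
    score = 0.0
    if event.fraction_played < 0.1:    # skipped almost immediately
        score -= 1.0
    elif event.fraction_played > 0.9:  # listened nearly to the end
        score += 1.0
    if event.hearted:                  # explicit like
        score += 2.0
    return score
```

The point is that even a toy model like this never needs you to press a button: the mere act of skipping early is a signal.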
While Spotify is merely good at it, Instagram has perfected the art of behavioral analysis in its recommendation engine.
Your micro-interactions give off signals to Instagram’s algo about your nuanced tastes. Then the algo takes that data and begins to feed you suggestions based on other people's behavior—often people who have similar tastes to your own. If the algorithm knows about your social graph (e.g. you logged in with Insta or FB), it may take the behavior of your closest contacts into account.
This process, called seeding, is tested against a control: your normal, everyday behavior. Each interaction yields new information, not just about your tastes and interests, but about how susceptible those tastes and interests are to manipulation.
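The "other people with similar tastes" mechanism described above is, at its core, collaborative filtering. Here is a minimal, assumption-laden sketch (the data shapes and the Jaccard similarity measure are my choices, not Instagram's disclosed method): score candidate items by how strongly users who overlap with you liked them.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two users' liked-item sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user: str, likes: dict[str, set[str]], k: int = 3) -> list[str]:
    """Toy user-based collaborative filter: suggest items that similar
    users liked but this user hasn't seen, ranked by similarity-weighted votes."""
    scores: dict[str, float] = {}
    for other, items in likes.items():
        if other == user:
            continue
        sim = jaccard(likes[user], items)
        for item in items - likes[user]:  # only items new to this user
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Seed a few likes and the engine immediately starts pulling you toward what your nearest behavioral neighbors already liked, which is exactly why a single afternoon of someone else's listening can bend the model.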
In Episode 105 of Future Commerce, we touched on how Instagram’s recommendations shape taste over time:
Phillip: [00:09:07] If that's something that caught your eye [Instagram] knows contextually [that you scrolled back to it, and paused...] so they can feed more information to you and they can reinforce that over and over and over.
It's the reason why I believe I am into sneakers. It's not because... I've loved sneakers my whole life. That happened when I was 35. It's because Instagram kept shoving it down my throat.
Maybe I'm happy being a sneakerhead. Maybe. Or maybe it's the reason that I buy anything nowadays.
This awareness that the algorithm is "watching" triggers a well-documented form of reactivity known as the Hawthorne effect.
The Hawthorne Effect: a type of reactivity in which individuals modify an aspect of their behavior in response to their awareness of being observed
I often think about a tweet from 2018, about a Lyft driver giving a form of praise to an unknown force, the algorithm, for its beneficence:
What we gain in experience, we trade off in a background static of worry and paranoia about ruining some perceived gain or progress. A squandered time investment made in "perfecting" the engine. When the engine works in our favor, it feels like a divine blessing. When it works against us, it feels like punishment.
It seems to me then, that Algorithm Anxiety has similar symptoms and effects as psycho-spiritual stressors. We've created new invisible gods whose blessings are fleeting, and whose curses are everlasting. Instead of life eternal, we discover more bands like Anderson.Paak.
Creating new gods to please. How very human of us.
This week I realized the grip my chronophobia has, not just over my Spotify algorithm, but over all the algorithms that power my experiences. These include, but are not limited to:
- My TikTok FYP
- My ad recommendations on various websites
- My Google News push notifications
- My Instagram Explore page
- My Poshmark style
The more "artificially intelligent" the world becomes, the more cognitive weight it places on the consumer to continuously maintain a garden that a single pest could destroy.
The Roman poet Ovid wrote of Pygmalion, a sculptor who became so enamored with his own creation that he begged the gods to bring her to life, and they did.
Have you ever found it curious that science fiction writers of the early 1900s were eerily accurate in their depictions of the future? It's no coincidence. The Pygmalion Effect would say that we built that future on purpose, informed by our foreknowledge, making science fiction into science fact. The technologies we have today are the fulfillment of the work of creative imagination that took place in prior decades.
High-speed rail, global telecommunications, and pocket-sized, touchscreen devices—these are all examples of The Pygmalion Effect.
The Pygmalion Effect is an example of an other-imposed, self-fulfilling prophecy: the way you treat someone has a direct impact on how that person acts. If another person thinks something will happen, they may consciously or unconsciously make it happen through their actions or inaction.
We're living out the Pygmalion Effect today with the current talk of the metaverse. While Instagram has the power to shape consumer desire, the chief executive of its parent company, Mark Zuckerberg, has the power to shape where its competitors make investments. Meta, the renamed parent company of Facebook and Instagram, is making investments in VR and ancillary technologies, virtually willing the metaverse into existence.
When thinking about these past ten months of 2021—the collective excitement around crypto, the bull market in NFTs, the hordes of people using services like Discord to organize, and the rise of Decentralized Autonomous Organizations and the new breeds of corporations they enable—the pace of adoption and disruption that developers and community members are welcoming is astonishing. Their efforts have pushed Zuckerberg into a full-on change of narrative, and he’s betting the future of his company on it.
The metaverse becoming self-fulfilling, based on external stimulus, is analogous to a 1968 study by psychologists Rosenthal and Jacobson. They conducted an experiment to see whether student achievement could be self-fulfilling, based on the expectations of their teachers. From a Simply Psychology article on the self-fulfilling prophecy:
Rosenthal and Jacobson gave elementary school children an IQ test and then informed their teachers which children were going to be average and which were going to be "Bloomers," the twenty percent of students who showed "unusual potential for intellectual growth."
However, unknown to the teachers, these students were selected at random and may or may not have fit those criteria. After eight months, the researchers returned and retested the children's intelligence.
The results showed that the Bloomers (the experimental group) had gained significantly more IQ points than the average students (the control group), even though the Bloomers were chosen at random: an average of two points in verbal ability, seven in reasoning, and four in overall IQ.
The Twitter algorithm may be to blame for Zuck's Meta-pivot.
For years, the Twitter algo has been uniquely tuned to provide contextual suggestions for tweets from people outside of a user's immediate network. But during Covid, lockdowns and stimulus contributed to an unusual acceleration of interest in speculative assets like crypto and the aggregation of subsidiary communities. Crypto suggestions began to fill timelines, and by the summer of 2021, DTC Twitter became NFT-bull-market Twitter. eCommerce Twitter avatars changed to PFP (profile pic) images of cats and monkeys. VC Twitter turned to pumping various PFP projects. The general tulip mania around "internet money" achieved escape velocity—at least, until the crypto crash.
But back in the days of peak crypto, I witnessed a general anxiety about being left behind as a technological revolution took hold. If you spent any time on Twitter, you no doubt witnessed the NFT bull market in real time; and you may have begun some research on Google, joined a Discord, or participated in a Twitter Spaces discussion to learn more about web3 and NFTs. If so, you helped train a host of algorithms to proliferate a sense of rapid adoption, the need to become educated, the FOMO, and eventually, "aping in."
The metaverse as a concept is so broad that it's effectively all-consuming: it can mean community, economy, currency, pseudo-anonymity, corporate and governance structures, and even play-to-earn video games that double as a form of income and work.
In our version of this reality, Meta (the company) will bring the metaverse (the thing) to consumers whether they’re ready for it or not, based on the external stimuli of an incredibly passionate and vocal community of crypto and defi enthusiasts.
Why? All because of algorithmic predestination.
The Metaverse and Algorithm Anxiety
To be fair, the bet seems like a sound strategy. In the shift to mobile devices, Facebook ceded its fate to carrier networks and device manufacturers; namely, Apple, Google, and Samsung. Now, through its ownership of Oculus, it has a chance to own the hardware gateway to the metaverse, positioning AR and VR as a meta-layer on top of the real world.
Meta’s attempt to own the devices and the platform runs counter to the crypto community’s passion for anonymity. VC firms are investing in pseudonymous founders of crypto projects. This is fundamentally at odds with Facebook’s view of the metaverse: real identities, and real people, in a virtual world.
Facebook has survived at least three fundamental shifts in technology in less than twenty years: it won social media, it survived the smartphone wave, and it changed culture forever (for better or worse). Now, for it to survive the next cultural shift, it must fulfill our predetermined future.
This poses a problem for people suffering from Algorithm Anxiety.
The awareness of being watched, and of our actions tattling on us, is shaping how we spend money and who we interact with online, and it is eroding our tolerance for risk. It's changing how we interact socially. Without controls that give us behavior "modes", or the ability to take on pseudonymous personalities while in the metaverse, we run the risk of our virtual-world experiences shaping our real-world behavior. Algorithm Anxiety could intensify and spill over IRL.
Awareness that these apps are passively altering our behavior gives brand, eCommerce, and business leaders an opportunity to create more equitable experiences for customers.
Platforms like Spotify lack simple user controls to prevent Algorithm Anxiety—for instance, the ability to clear browsing data from a particular window of time. Heck, let's put Incognito Mode into everything. Isn't that what it's for? So that we're not tracked when we're behaving outside of the norm?
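Mechanically, such a control could be as simple as a flag that gates writes to the long-term taste profile. This is a hypothetical sketch of the idea, not any platform's real implementation; every name in it is invented for illustration.

```python
class ListeningProfile:
    """Toy taste profile with a 'private session' switch: plays recorded
    while the switch is on never reach the long-term model, so a borrowed
    iPad afternoon can't rewrite four years of curation."""

    def __init__(self) -> None:
        self.play_counts: dict[str, int] = {}
        self.private_session = False

    def record_play(self, track_id: str) -> None:
        if self.private_session:
            return  # private plays are discarded, not learned from
        self.play_counts[track_id] = self.play_counts.get(track_id, 0) + 1
```

The same gate could just as easily scope to a time window ("forget everything from this afternoon") rather than an on/off switch.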
Sure, the stakes are low when our children ruin our musical suggestions on Spotify, a service that we elect to use of our own free will.
What happens when the impacts have higher stakes? We can’t realistically opt out of algorithms that power the world around us. Yet. We envision a future where algorithms are open, and where corporations are transparent about the parameters and inputs that affect an algorithm.
Maybe one day algorithms will become portable, and consumers will bring their own profiles and preferences from digital experience to digital experience. The best way to avoid algorithmic anxiety is to remove the anxiety altogether, by bringing transparency and portability to online platforms.