Is it possible that our capacity to empathize with each other is weakening as the digital world becomes more central to our lives and we share less and less common experience in that sphere?
One person who speaks on this topic is Jaron Lanier, whom I first discovered while watching The Social Dilemma. I took some notes on his points:
…think about Wikipedia. When you go to a page, you’re seeing the same thing as other people. So it’s one of the few things online that we at least hold in common.
Now just imagine for a second that Wikipedia said, “We’re gonna give each person a different customized definition, and we’re gonna be paid by people for that.” So, Wikipedia would be spying on you. Wikipedia would calculate, “What’s the thing I can do to get this person to change a little bit on behalf of some commercial interest?” Right? And then it would change the entry.
Can you imagine that? Well, you should be able to, because that’s exactly what’s happening on Facebook. It’s exactly what’s happening in your YouTube feed.
Later, Jaron says:
And then you look over at the other side [of an argument], and you start to think, “How can those people be so stupid? Look at all of this information that I’m constantly seeing. How are they not seeing that same information?” And the answer is, “They’re not seeing that same information.”
His point, echoed by others, is a good one: we have no way of seeing each other’s social feeds — or any tailored online experience for that matter — so we have lessened powers to empathize with what other people may be thinking or feeling.
After listening to Jaron in the movie, I looked him up and found he has a book titled Ten Arguments for Deleting Your Social Media Accounts Right Now, with a chapter titled “Social Media Is Destroying Your Capacity for Empathy” where he further emphasizes this point:
When we're all seeing different, private worlds, then our cues to one another become meaningless. Our perception of actual reality [outside of the digital] suffers. (55)
You know the admonition: “Don’t judge someone until you walk a mile in their shoes”? The equivalent for our online age is: “Don’t judge someone until you spend a day in their feeds.”
But it’s not exclusively social media that presents customized experiences and content to its users. It’s damn near a best practice in any domain of online experience: social, financial, e-commerce, journalism, you name it.
For example, I was listening to a talk from 2013 by Jason Cohen titled “Why Data, Statistics and Numbers Can Make You do the Wrong Thing”.
He talks about a practice that has become standard in digital product design: formulate a theory for influencing individual user behavior, design accordingly, then test and measure whether it works. In other words: “How can I take cues from what I know about each user to customize their experience and move them along to accomplishing a task?” The task might be what they, the user, want. But it might also be what we, the site makers, want.
when the customer gets to the pricing page, [maybe I think] they are ready for a hard sell [so] I should change the headline, for example, to ‘Buy Now’…Or [maybe] I want to funnel them in to do a certain action or a certain button, so I’d remove a certain sidebar because I want to focus them…Maybe I think people who come from this traffic source may be interested in seeing ‘x’ next. People who have just searched for security might want to see a landing page that has a testimonial from a customer about security or a big list of stuff that we do about security. Or [maybe] I think that people from the UK would like a landing page where we use too many vowels in the word color…[maybe] I think that at this point the person would like to chat with a human, and again that’s a tricky one right? On a homepage is a chatbox that pops right up…is that good or bad? It might piss people off – but it might engage more people.
As you can see, it’s not only social media feeds that are tailored to individuals. Increasingly, businesses of all kinds are using technology to customize and cater to every facet of an individual’s circumstances.
- This person came from source ___? [algorithm runs] They’re likely interested in ___, show them that.
- This person searched for ___? [algorithm runs] They’re probably interested in ___, show them that.
- Show this person ___ and show that person ___ and let’s look at the data and see how people react to those different experiences.
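The rules above can be sketched as a small function. This is a minimal, hypothetical illustration — the rule names, traffic sources, and content labels are all made up, not taken from any real product:

```python
# Hypothetical rule-based personalization: map what we know about a
# visitor (traffic source, last search) to what we show them.
# Every rule and content name here is invented for illustration.

def pick_landing_content(visitor: dict) -> str:
    if visitor.get("search_term") == "security":
        # Searched for security? Show the security testimonial page.
        return "security-testimonial"
    if visitor.get("source") == "pricing-newsletter":
        # Came in from a pricing email? Push the hard sell.
        return "buy-now-headline"
    # No signal: everyone else gets the same default page.
    return "default-homepage"

# Two visitors arriving at the "same" site see different things:
print(pick_landing_content({"source": "pricing-newsletter"}))
print(pick_landing_content({"search_term": "security"}))
```

Multiply this by hundreds of rules and you get the divergent experiences described above.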
Add up how all these data points shape what ultimately gets delivered digitally to someone, and you can see how we’re each living in a world that looks different from everyone else’s.
From social media posts to search engine results to media recommendations to online marketplaces. Even in “boring” industries like insurance or healthcare, I’m sure there are plenty of algorithms and A/B tests running to show different things to different people.
I know many modern applications are so customized for end users that there’s no way anybody can jump into production and get a representative view of what any one person will see at any moment in time. What you see is subject to a million different factors, data points, and feature flags. There is no one Facebook; there are a billion Facebooks.
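One common mechanism behind those “billion Facebooks” is deterministic A/B bucketing: hash a user ID into a stable bucket so each person consistently sees one variant of each flag. Here is a minimal sketch of the technique — the flag names and variants are hypothetical, not any real product’s configuration:

```python
import hashlib

def variant(user_id: str, flag: str, variants: list) -> str:
    """Deterministically assign a user to one variant of a feature flag.

    Hashing the (flag, user) pair means the same user always lands in the
    same bucket, but different users can land in different ones.
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# With many flags, each user's combination of variants is effectively
# unique -- no two people are guaranteed the same overall experience.
flags = {
    "homepage_chat": ["popup", "hidden"],
    "pricing_headline": ["Buy Now", "Start Free Trial"],
}
for user in ["alice", "bob"]:
    experience = {f: variant(user, f, v) for f, v in flags.items()}
    print(user, experience)
```

With even a few dozen independent flags, the number of possible experience combinations runs into the millions, which is why no single “representative” view of the product exists.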
I’m not arguing that customization is inherently bad. I’m merely noting there are side effects to practices we often assume to be solely beneficial — “this could only ever be good for everyone” is a common refrain in tech.
As our world moves further into virtual reality and the metaverse, it’s possible we will end up standing in the same virtual place together yet seeing two very different realities.
How hard will it be to empathize with each other then?