Then came 2017. The concerns over social-media-born misinformation and propaganda during last year’s presidential race were one flavor of this worry. Another is what Facebook might be doing to our psychology and social relationships — whether it has addicted us to “short-term, dopamine-driven feedback loops” that “are destroying how society works,” to quote Chamath Palihapitiya, one of several former Facebook executives who have expressed some version of this concern over the last few months.
Mr. Palihapitiya, who is now a venture capitalist, made those comments during a talk at Stanford University last month; after the comments were widely reported this week, he walked them back. But his fears have been echoed across Silicon Valley and lately have become something like a meme: What if Facebook is rotting our brains?
This gets to why an otherwise in-the-weeds blog post from Facebook’s research team is so interesting. Though it is quite abstruse, the post, by David Ginsberg and Moira Burke, two company researchers, takes readers on a tour of the nuanced research on whether Facebook can be bad for you.
It’s possible to read the post either cynically or optimistically. The cynical take is that Facebook is conceding the most obvious downsides of its product in order to convince us it really does care.
Yes, the company noted, people who spend a lot of time “passively consuming” social feeds do tend to feel worse. What’s passive consumption? That’s when you just scroll, click on lots of links and “likes,” and post your own updates without really interacting with others in a deep way. The company pointed to a study published this year in the American Journal of Epidemiology — by researchers who weren’t affiliated with Facebook — that showed that people who clicked on more “likes” and links than the typical Facebook user reported worse physical and mental health.
But hold on, said Facebook. Another study — this one conducted in partnership with Facebook by Robert E. Kraut, a professor at Carnegie Mellon University who has long studied how computers affect users’ psychology — had a more upbeat finding. It showed that using Facebook more deeply and meaningfully, for instance by posting comments and engaging in back-and-forth chats on the service, improved people’s scores on well-being.
“Simply broadcasting status updates wasn’t enough; people had to interact one-on-one with others in their network” to gain greater personal benefits from the service, the post stated.
You can see the issue here: Facebook is saying that if you feel bad about Facebook, it’s because you’re holding it wrong, to quote Steve Jobs. And the cure for your malaise may be to just use Facebook more.
The post pointed out several recent and forthcoming changes to Facebook that the company said encourage active interactions on the service. That’s the real message: Once you discover how much more you can get out of Facebook with this new stuff, you’ll feel super.
O.K., sure, the post can be read this way. But I’m more optimistic about it, because it’s in line with an evolving corporate posture from the company.
After initially dismissing Facebook’s role in the 2016 election, Mark Zuckerberg, Facebook’s co-founder and chief executive, has spent much of the last year publicly grappling with Facebook’s role in the world. He published a lengthy letter to Facebook’s community attempting to establish new social goals for the company. He apologized for glibly dismissing the idea that Facebook could have altered the outcome of the election. And in the company’s last earnings report to investors, he said he was willing to risk the company’s profitability to improve its community.
To be sure, Facebook is putting its own favorable spin on these studies. Yet its willingness to shine a light on critical research, and its pledge to take the findings into account when it designs its products, has to be welcomed as something new.
If you think Facebook is ruining the world, you should be a little glad that even Facebook agrees that we need a better Facebook — and that it is pledging to build one.