Two items on the metastasizing, Borg-like entity known as Facebook recently caught my eye.
Facebook just announced sweeping changes to fix significant problems with its newsfeed, the main conduit for news and information for over 2 billion people. However, the problems with Facebook’s newsfeed won’t be fixed with these tweaks. In fact, they are likely to get much worse as Facebook attempts to fix them. […]
To see why failure was (and will continue to be) inevitable, let me recast the situation:
- Facebook is actively micromanaging the information flow and social interactions of over 2 billion people, an insanely complex and highly uncertain task.
- Facebook is making the sweeping decisions on how to micromanage the newsfeed centrally (with a small team of young executives empowered to relentlessly tweak the system by the dictatorial fiat of the company’s CEO).
- Facebook’s goals are a selfish utopianism (in its version of utopia, the world revolves around Facebook).
The Current Year is very weird, when you think about it. The idea of a “small team of engineers in Menlo Park,” led by this guy –
The right thing for Facebook to do here would be to drop all the micromanagement and simply let each user control his/her own News Feed experience by default, with a full set of tools and filters. No shady algorithm controlling what you see. No censorship except of spam and illegal content.
This would probably require some adjustments to Facebook’s business model, as the News Feed accounts for 85% of the company’s revenue. I suspect, though, that the core reason Facebook insists on controlling that spigot has nothing to do with money.
In everyday life, we tend to have different sides of ourselves that come out in different contexts. For example, the way you are at work is probably different from the way you might be at a bar or at a church or temple. […] But on Facebook, all these stages or contexts were mashed together. The result was what internet researchers called context collapse. […]
In 2008, I found myself speaking with the big boss himself, Facebook CEO Mark Zuckerberg. I was in the second year of my PhD research on Facebook at Curtin University. And I had questions.
Why did Facebook make everyone be the same for all of their contacts? Was Facebook going to add features that would make managing this easier?
To my surprise, Zuckerberg told me that he had designed the site to be that way on purpose. And, he added, it was “lying” to behave differently in different social situations.
Up until this point, I had assumed Facebook’s socially awkward design was unintentional. It was simply the result of computer nerds designing for the rest of humanity, without realising it was not how people actually want to interact.
The realisation that Facebook’s context collapse was intentional not only changed the whole direction of my research but also provides the key to understanding why Facebook may not be so great for your mental health.
To me, the experience of using Facebook is akin to being in a room filled with everyone I know, yammering away at high volume. It’s unpleasant, and I avoid it as much as possible.
I remember when Zuckerberg infamously said that “Having two identities for yourself is an example of a lack of integrity.” I recall being very creeped out by that sentiment. It’s deeply totalitarian, similar to the argument that “If you’ve got nothing to hide, you’ve got nothing to fear”; i.e. that only criminals or bad people desire privacy. It also flies in the face of some basic observations about human behavior.
The question is, will users put up with forced “context collapse” and micromanagement of the News Feed over the long run, or will they revolt against this form of paternalistic social engineering? I’m betting on the latter.