I used to criticize Facebook for creating an echo chamber. My argument was, like almost everyone else's, that Facebook should show users what is right, not what they agree with.
I overestimated the level of transparency the internet brings. When the political events were unfolding in Egypt and the local media was hiding the truth, I thought that as soon as everyone joined Facebook/Twitter, the truth would be revealed. I was wrong. It happened to some extent, but the echo chambers were much stronger.
One of the main drivers of hostility on the internet is people seeing their core beliefs attacked. Regardless of which side we're on, seeing something we disagree with triggers our survival response, and so we become hostile to the adversary.
I've recently been giving this a lot of thought. I've started to think Facebook shouldn't try to avoid echo chambers; it should strive to create the perfect one.
If you had the perfect echo chamber and only saw things you agree with, you wouldn't feel the internet is as unsafe as it feels now. It sounds counterintuitive, I know, but this desire for safety is why people are moving more towards private conversations.
That said, it makes me wonder why we don't have this already. Here is what I could come up with:
- Nobody thought of it. I highly doubt that.
- It is not technically feasible. I also doubt that, to some extent.
- It drives engagement down. If we only see things we agree with, we are less likely to engage with the content. Less engagement means less time spent on the site, fewer ads served, and less money made.
That’s my theory.