July 31, 2023
I’ve been a fan of the author of this book, Max Fisher, for many years. I first came across his work through Vox’s explainer on the Syrian conflict. Fisher is an international reporter who currently works at The New York Times and has also worked at The Atlantic and The Washington Post. I’ve followed him since 2017 to help me make sense of current world affairs and societal issues, so I was excited to read his book The Chaos Machine.
The Chaos Machine examines what has been going on inside these social media companies and the negative impact their algorithms are having on society. It pulls back the veil and reveals much of what has been happening behind the scenes. These companies publicly feign ignorance, but behind closed doors they are very much aware of what is going on and keep things running in the name of shareholder profits.
I learned a lot of incredible things that I’m surprised I had never heard about. The most significant was Facebook’s role in the genocide carried out by the Myanmar military against the Rohingya people in 2017. If you’re not aware, Myanmar was a country with very little Internet access and very little mobile phone penetration. In 2011, the country started to liberalize and allowed two telecommunications providers in. Facebook, meanwhile, had essentially saturated its user base in the Western world and decided to expand into parts of the world that were not yet on the platform. It worked with the local telecommunications providers to facilitate Internet access and had its app preinstalled on mobile phones to speed up adoption. Facebook initially allowed the app to be used without incurring data charges, so it became very popular.
Facebook normally relies on community moderators to monitor hateful speech and content for a particular country. At the time, there were only a handful of moderators for Myanmar, and most Facebook employees did not speak the language. In a short time, misinformation spread on the platform, which led to murders and the genocide of the minority Rohingya Muslims. Yet Facebook has refused to accept responsibility and accountability for its role in this genocide.
With this book I was surprised to learn how disturbing some of the harms social media has directly caused in society are. We saw the full effect of it in the United States with our most recent election, when the transition of power, a sign of a normal, healthy democracy, was threatened on January 6. And again, social media platforms are not taking accountability for the role their algorithms play in real people’s lives. Their leaders choose to say that they do not understand the algorithms or their effects on citizens and consumers.
So you may be asking yourself what the book recommends as a solution. I believe the author does a good job of making clear that the book’s intent is not to solve this problem, and I don’t believe it’s his responsibility to solve. One thing he does suggest at the very end is that if there is no good understanding of how the algorithm works, and social media and tech companies completely refuse to accept accountability for its effects, then it is best to simply turn it off.
I agree with this sentiment. There is a selfish part of me as an entrepreneur that knows those who can use the algorithm to their benefit can make quite a bit of money. But with those financial gains come serious societal drawbacks. Teen depression rates have risen. People are lonelier than they have ever been. I think we have to decide whether the good outweighs the bad. More importantly, I believe the tech companies do have to accept some level of accountability in order to ensure the well-being of citizens and consumers.