A follow-up to “Sugar-Coating Facebook”

In the wake of my post about Facebook a little over a month ago, I received a ton of feedback, both positive and negative, about the contents of the piece. To those who responded: thank you. It means so much that you are interested enough not only to read what I wrote, but to be critical of it as well.
First and foremost, a friend pointed out that the problem with Facebook is less a “Facebook” problem and more a “people” problem. I agree, to a point; I only meant my piece to point out that Facebook’s model is currently more likely to be used to create echo chambers than to promote honest and open communication between people with opposing views. This is a people problem (people are choosing not to expose themselves to opposing views), but it is only a people problem because Facebook’s business model is making it one. Given this, I altered the second half of the piece to make it more hopeful: to make it clear that I believe there is a solution to the grand, capitalistic issue I had outlined without offering any tactic for combating it.
My dad’s critique was similar. He was not convinced by my argument, and pointed out again that Facebook simply acts as a medium through which the people on it can engage with different ideas. He added that if we want to create a world in which people remove themselves from echo chambers, we needn’t shut down Facebook, or even stop using it entirely; rather, we simply need to change the way people use it (simple, right?). His critique was an elaboration on the first, but in essence the two were the same.
“[I]f we see the best possible use case of Facebook as the ‘solution’, we are ignoring the reality that people tend to use tools in the way they are designed; hence the famous quote: ‘If all you have is a hammer, everything looks like a nail.’”
The critique that most puzzled me, however, came after I had edited and republished the piece, from one of my closest friends, and someone who is an ardent supporter of technology and its power as a force for good. It surprised me, then, when he told me that he disagreed that Facebook was a people problem, and believed that as a tool and resource used by vast numbers of people to ingest information about the world, it is Facebook’s responsibility — and not only its responsibility, but its duty — to design itself so that it offers users not only that which they want to see, but also that which might challenge them, or provide a different perspective on a belief they hold tightly.
He argued that by seeing the best possible use case of Facebook as the “solution”, we are ignoring the reality that people tend to use tools in the way they are designed; hence the famous quote: “If all you have is a hammer, everything looks like a nail.” It is unrealistic, he claimed, to think that just because Facebook can be used as a force for good, and just because people can seek out contrarian opinions and build online realities for themselves that are arguably more diverse than the ones surrounding them in the real world, they actually will. Thus, it falls to Facebook, or to another company (one that is younger, nimbler, and more adaptable), to find an entirely new way of distributing media, one that does not bow to the constraints Facebook currently does. Facebook could also offer a “contrarian” section in its news feed, where accounts and publications that its algorithm determined to be contrary to the user’s own views would be available for perusal. In a more extreme version, these articles would be placed directly in the user’s news feed, in proportion to the amount of content the user consumed from the publications, accounts, and people they follow: content from their echo chamber, so to speak.
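To make that proportional version concrete, here is a minimal sketch of how such a feed-mixing rule might work. To be clear, this is purely illustrative: the function name inject_contrarian, the ratio parameter, and the assumption that posts arrive already labeled as contrarian are all my own inventions, not a description of anything Facebook actually does.

```python
import random

def inject_contrarian(feed, contrarian_pool, ratio=0.2):
    """Mix contrarian posts into a feed in proportion to how much
    echo-chamber content the user already consumes.

    feed            -- posts from sources the user follows
    contrarian_pool -- posts some ranking system has judged to oppose
                       the user's typical reading habits
    ratio           -- contrarian posts inserted per followed post
    """
    # One contrarian post for every 1/ratio followed posts,
    # capped by how many contrarian candidates are available.
    n_contrarian = min(len(contrarian_pool), int(len(feed) * ratio))
    mixed = feed + random.sample(contrarian_pool, n_contrarian)
    random.shuffle(mixed)  # spread the contrarian items through the feed
    return mixed

# Hypothetical usage: a user who reads fifty posts from their own
# bubble would see roughly ten opposing ones mixed in.
bubble = [f"followed-post-{i}" for i in range(50)]
opposing = [f"contrarian-post-{i}" for i in range(30)]
print(len(inject_contrarian(bubble, opposing)))  # 60
```

Even at this toy scale, the design question is visible: someone still has to decide what counts as “contrarian”, which is exactly the worry the next paragraphs take up.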
And yet, in many ways, finding a solution to the problem using the service that created the problem would appear to be counterproductive. If a solution, in the general sense, is the step that follows the identification of a problem, then solutions are by their nature imperfect, because finding one typically depends on reframing the use case of the thing that created the problem (in this case, Facebook). In essence, solutions as we know them are nothing more than transient fixes to structural problems, whereas a true solution is something that eliminates the institutional framework that gave rise to the original issue.
Overconsumption (in the broadest possible sense) offers an interesting metaphor. For decades, the issue of overconsumption has been combated by giving people better and more efficient things to consume. The iPhone, for instance, eliminated the need to carry around an MP3 player and a phone (and about a million other things) at once. Similarly, car companies, aware of the environmental degradation for which their cars were responsible, began making them more fuel-efficient. The iPhone did not stop us from listening to music or communicating with one another via mobile devices; rather, it gave us a fun and more efficient way to do things we already did on a daily basis. Likewise, electric cars did not stop us from driving; rather, they absolved us of our guilt by catering to the collective societal understanding that every problem can and should be solved through the purchase and subsequent use of something new. These solutions to overconsumption, then, though they have caught on and have undoubtedly made the world better, are still rooted in the problem itself. It may appear that our behavior with regard to consumption has changed, but it hasn’t, and unless our society collectively realizes just how close we appear to be to the environmental tipping point, it won’t. Thus, consumption-based solutions have been, and will continue to be, the method by which we combat overconsumption and, more importantly, misconsumption: the idea that what we are consuming is not what we should be. That one sentence encapsulates many of the problems with Facebook, and with markets.
Finding a true solution to any problem requires rendering irrelevant the basic structural institutions that first gave rise to that problem. Because of this, true solutions are extraordinarily rare. This does not mean that we should not pursue them; rather, it means that the issues presented by Facebook will be almost impossible for Facebook itself to solve. With Facebook, it is lazy (condescending, even) to consider the problems it presents solved just because there would exist, somewhere within Facebook’s algorithm, a way for us to tweak the typical use case and get out of the echo chambers in which we often find ourselves stuck. If a million people on Facebook do this, that still leaves well over 99.9% of Facebook’s users out of the loop. It is akin to publishing data that blatantly explains why sugar will ruin your body and mind and considering that a solution to obesity. Of those in the world who consume sugar (which is everyone), a small minority may read the study, and of those who do, an even smaller minority might change their behavior. Most, however, will not. Is this natural selection? Perhaps. But our society has moved past the point where it is acceptable to simply let those who have neither the time nor the resources to understand what is best for them languish.
And yet, a world in which Facebook controls media distribution is almost more worrisome than a world in which it acts as a medium that shows us what we want to see. It is akin to taking power from the people and placing it in the hands of a dictator; benevolent or not, this would be a direct threat to the (admittedly imperfect) institution of democracy. Democracy and markets, given that they are both rooted in the desires of the majority, can, from time to time, produce objectively undesirable or unhealthy outcomes for some living under them (think Trump, who technically did not win the popular vote; fat-free food saddled with added sugar; echo chambers; and so on). By their nature, however, democratic and market-based societies tend to self-correct, even though in the short run some may suffer. Not only do they self-correct, but they do so in a faster, more seamless, and less violent manner than societies living under systems of socialism or communism, or societies reliant on a dictatorship (think the U.S.S.R., North Korea, Venezuela, and so on). Democracy and markets are far from perfect systems, but in the long run, the outcomes they produce are marked by less violence and more prosperity than those produced through other political and economic mechanisms.
History tells us that no economy has ever successfully incorporated market socialism, and that is what giving Facebook power over media curation and distribution would be: the centralization of the means of media curation and distribution. If we give Facebook the power to choose what to show us, algorithmically or otherwise, instead of allowing users to choose for themselves, we are taking one problem and replacing it with another, arguably larger one. By giving one organization the power to influence public opinion by picking and choosing what is worth reading, what is considered contrarian, and what is considered mainstream, we are trading the tyranny of the majority for a sort of media-based oligarchy. And of course, these contrarian “publications” would look different for every person on Facebook. In a sense, then, even our exposure to dissent would be algorithmically tailored to us, leading us deeper into our own echo chambers and farther away from a “solution” to this issue.
Between these two extremes, between a world in which Facebook does nothing and a world in which it takes over the entire process of media distribution, there exists a middle ground. This middle ground is difficult to describe, because it does not yet exist, but it can, and it must, given the increasingly influential role that Facebook is playing in everything from general elections to ideological conflicts. And while the idea of media centralization is worrisome (think North Korea), it is nowhere near as worrying as market socialism on the whole (think the U.S.S.R.) or a true autocracy (think North Korea, again). 62% of Americans count Facebook as one of the mediums they use to find and digest news. In an autocracy like North Korea, 100% of the population is forced to count the government as the only medium it can rely on for news. My worries about Facebook stem partially from paying attention to countries like North Korea and noticing just how problematic things have become there, but it is elitist and wrong to claim that the two are even remotely similar.
Facebook should not come to dominate the media distribution process in its entirety, but given how many people use it, and given its influence on the 2016 presidential election, it must assume some semblance of responsibility, both for curating the content it manages and for distributing it in such a way that users do not lose themselves completely. If it assumes this responsibility, and it should, Facebook will be forced to ask questions of itself like no company before it. Given its power and ubiquity, we can only hope that the answers it arrives at are the right ones.