December 22, 2021

Mysteries of Facebook

It is a rare day when I don’t spend time on Facebook.  I use it to connect with friends and relatives and as a source of news. (My news also comes from NPR, ABC, MSNBC, The New York Times, The Washington Post, Time, and other sources. I certainly don’t rely on Facebook!) The majority of my Facebook friends are connected to The Episcopal Church.

Most of my interactions on Facebook proceed smoothly and predictably. The Meta Mavens who run Facebook always seem to be tweaking how the site operates, however. This is invariably disconcerting and seldom results in an improved interface for users. No one really understands the Facebook algorithms used to populate one’s news feed, and other features are inscrutable, odd, or irritating.

I don’t want to undertake a thorough critique of Facebook here, but I do want to take note of mysteries and peculiarities of the software used by billions of people.

The News Feed. Why do I see what I see in my news feed? Why don’t I see what I don’t see in my news feed? Posts from some of my Facebook friends never show up there or even in my notifications. It would be useful to understand why I see what I see and to have a straightforward means of adjusting what is displayed.

Notifications. I check out all the posts and comments flagged in my notifications. As with the news feed, however, I don’t understand how the list is populated. To me, the most mysterious feature of my notifications is that some name a Facebook user (e.g., “So-and-so posted …”), whereas others indicate that a Facebook friend has done something (e.g., “Your friend So-and-so …”). Why are the actions of different friends announced differently?

Posting. My posts often link to other sites on the Web, perhaps to my own blog, to a media site, etc. I don’t understand why this process sometimes works differently on my own page than it does on a group page. For example, if I post the exact same text containing a link to one of my blog posts on my home page and on a group page, Facebook may display a graphic from the linked-to page in one instance but not in the other. Why does this happen? Is there a workaround?

Facebook used to give the poster some—though not much—control over which graphic is displayed from a linked-to page. Having full power to select a graphic here would be helpful. For example, when I link to a post on my blog that contains no internal graphics, Facebook sometimes displays a graphic from the blog’s sidebar. Sometimes, it displays the tiny printer icon found at the bottom of every post, blown up to a size at which it is unrecognizable (as well as irrelevant). Sometimes, it displays no graphic at all, whether or not the post contains one. When Facebook does display a relevant image, it must often reduce it to fit the allotted space. At times, it both reduces the graphic and truncates it. The latter action can sometimes cut off text or, as in my last post, even decapitate a person in a photo. Why doesn’t Facebook give users more control over graphics? Likewise, I would like more freedom to position my cover photo.

Censorship. I am sympathetic to the position of Facebook with respect to the policing of posts. There are calls for Facebook to be more “responsible” and to remove posts that are false or somehow dangerous. Although, as a private company, Facebook is not bound by the free-speech guarantees of the First Amendment, it leans toward the very American value of allowing free speech. Do we really want Facebook to be the final arbiter of what is true or what is dangerous? Arguably, posts proposing quack cures for COVID-19 are both false and dangerous, but is their removal the best way to deal with them?

Facebook seems to attack the low-hanging fruit of questionable posts. A friend of mine, for example, is always finding herself in Facebook jail, though I cannot understand why. She is neither a fabricator nor a revolutionary. Moreover, whatever algorithms are looking for questionable posts or comments seem pretty dumb. I once had a comment removed as “hate speech.” In response to a post about some stupid behavior—I don’t remember the post at all—I had commented, “Americans are idiots.” No one reading that comment would have taken it as hate speech aimed at all Americans (a class to which even I belong). Facebook’s algorithm did, however, and my protest of the comment’s removal failed to get it reinstated. In general, Facebook’s algorithms seem not to appreciate irony.

It is surely the case that Facebook posts have been damaging to the Republic. Posts claiming that the 2020 election was stolen, for example, must bear at least some of the blame for what happened on January 6, 2021. I honestly do not know how to prevent such damaging speech. Part of the problem, I think, is that people on Facebook, like people in the country at large, have tended to segregate themselves by political views. Facebook actually encourages this. Perhaps it should force us to deal more frequently with people whose worldview is wildly different from our own. We might not like that, but it might be good for us and for our country.
