Great article. The author concludes with the following points on what is happening to information as all information is increasingly "flattened" to fit the new, single content stream of the social media feeds on our smartphones:
1. Everything is trivialized: major public policy changes arrive in the same package and are placed next to trivial pop culture.
2. We respond to information using the same low-bandwidth tools (like, heart, retweet, etc.) that limit the expressiveness of response.
3. All information is in direct competition, playing the algorithms to gain attention.
4. Power over information is consolidated among a very small number of gatekeepers, mainly Facebook.
This is an interesting way to look at things, and it resonates with me. I think the author described the problems with what I think of as the "hypermedia" era very well. But the author doesn't offer suggestions for what, if anything, we should do about it.
As for myself, a couple years ago I read Cal Newport's Deep Work and then Digital Minimalism, and found their ideas compelling enough that I deleted Facebook and disabled web browsing and email on my phone (although I later put web and email back).
However, #deletefacebook seems to have about a snowball's chance of happening at large.
What countering forces could or should break the death grip that Facebook and Google in particular hold over communication and information publication today?
He missed an important piece: it all functions because the reward circuitry in people's brains feeds off of like/click/view/upvote counts, etc.
Take the dumb counts away. Hide them. Delay them. All of a sudden, Pavlov's dog or Skinner's rat starts behaving differently.
People act as if this dumb, half-baked reward system, existing solely to increase "engagement" and underlying everything, is now unchangeable and baked into the fabric of everything forever and always.
There are quite possibly better ways to run reactions than the way current social media does it.
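One way to sketch the "hide them, delay them" idea, with the delay and threshold values invented purely for illustration (no real platform is claimed to work this way):

```python
from dataclasses import dataclass

# Assumed policy knobs, purely illustrative
DISPLAY_DELAY_S = 3600     # hide counts for the post's first hour
DISPLAY_THRESHOLD = 10     # and until at least 10 reactions exist

@dataclass
class Post:
    created_at: float      # seconds since some epoch
    reactions: int = 0

def displayed_count(post: Post, now: float):
    """Return the reaction count to display, or None to keep it hidden.

    The author still gets feedback eventually, but the immediate
    Skinner-box loop (post, then watch the counter tick up) is broken.
    """
    if now - post.created_at < DISPLAY_DELAY_S:
        return None                      # too fresh: delay the count
    if post.reactions < DISPLAY_THRESHOLD:
        return None                      # too few: hide the exact count
    return post.reactions
```

Variants are easy to imagine on top of this: show only coarse buckets ("10+", "100+"), or reveal exact counts to the author once a day.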
However, I think it's easy to miss the important role these devices play. The article "The Silence Is Deafening" just the other day noted the role of non-verbal cues in conversation, and I'd add that social media reactions and similar devices allow people to be "loosely engaged" with each other, the way people who attend the same event without exchanging words are loosely connected.
And these kinds of perfunctory attachments are important (as COVID also shows). These kinds of attachments were also manipulated even before the Internet, with store clerks instructed to say "have a nice day" and such even in the 1970s.
Which is to say, likes, memes and emoticons may seem kinda silly and annoying from one perspective but they're a way to emulate "the stuff of life" and we need to think about how to do that better.
There are market forces behind the engagement cult, so - unfortunately - they're probably not wrong while those forces continue to operate.
The root of that problem is the use of the web as a device for capturing time and attention and selling that attention to third parties. As long as that continues to be the prime directive, the race to the bottom will continue.
I derive some hope from the fact that there seems to be an accelerating trend of people doing their online socializing in places other than Facebook, even if they still technically have a FB account. I don't have stats to back this up (FB itself has an obvious motive to obscure the fact that this is happening, so I don't trust their usage statistics), but I'm meeting more and more people who state that they only visit FB rarely, and express a preference for WhatsApp, Discord, or some other service for actually getting into contact.
A related anecdote: I finally took the step of deactivating my FB account some months ago, and when I told people about this I mostly heard sympathy rather than disappointment or disbelief.
For #3 at least, an environment that separated within-topic from between-topic competition for attention might be a major improvement. A large search engine cooperating with a government might be able to pull this off by making any content that is also findable on a major all-vs.-all feed ineligible for display: a preselected topic is embedded in each search, so search results can in principle be isolated from between-topic attention competition. This would almost have to start outside the US, but if it actually works it may then see wider adoption.
(I'd expect some radicalization/"evaporative cooling" memetic hazard to remain, but it should at least decrease in intensity, since one driver of increased self-radicalization is low viability of moderate, non-attention-grabbing content relative to overstated clickbait.)
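A minimal sketch of that eligibility rule, with all the data and names hypothetical (real search ranking is far more involved): a result is shown only if it matches the query's preselected topic and is not also circulating on an all-vs.-all feed.

```python
# Hypothetical document index: (url, topic, text)
INDEX = [
    ("a.example/budget", "policy", "city budget analysis"),
    ("b.example/memes",  "pop",    "celebrity meme roundup"),
    ("c.example/tax",    "policy", "tax reform explainer"),
]

# URLs known to circulate on major all-vs.-all feeds (assumed input)
FEED_DISTRIBUTED = {"c.example/tax"}

def topic_scoped_search(query: str, topic: str):
    """Return matching URLs, restricted to the preselected topic and
    excluding anything that also competes in a cross-topic feed."""
    return [
        url
        for url, doc_topic, text in INDEX
        if doc_topic == topic
        and query in text
        and url not in FEED_DISTRIBUTED
    ]
```

Under this rule, feed-distributed content simply never appears in search, so publishers would have to choose which attention market to compete in.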
> Power over information is consolidated among a very small number of gatekeepers, mainly Facebook.
I run a modestly large Facebook group. I haven't noticed any way that Facebook prevents me from posting whatever sort of information I want. So I can't see Facebook as an absolute filter.
But certainly, at scale, Facebook does wind up being a filter, though through the default, predictable laziness of the average person.
Remember: hypothetically, if you had a network of thousands of moderators and millions of people in civil, well-moderated groups, you could have a debate in almost any social media forum you chose; FB, Reddit, some BBS, email, whatever. The exception would be Twitter or any medium that doesn't allow moderation. But at a certain point, if we had actual community, it could trump the medium it was on.
This is not to say that this is at all likely. Rather, the point is that when you have a passive mass of seething people, then yes, sure, whoever is turning the spigots on the hot-button-issue content is going to be an "influencer," but this status is by no means curable by eliminating a given media outlet (well, some portion of "#deletefacebook" seem to be weird introverts with a sort of daft "the problem is people who don't know each other talking at all" approach, who hope that eliminating platforms can stop this).
> What countering forces could or should break the death grip that Facebook and Google in particular hold over communication and information publication today?
I think the hostility these entities face today shows that they don't have a tighter grip (at least not a much tighter grip) on the public imagination than The New York Times, CNN, right-wing radio, corporate public relations, etc. There are many forces struggling here. In many ways I shudder to think what a Facebook under the control of whatever corporate consensus might look like (and I'm further to the left than the right).
> you could have a debate in almost social media forum you chose ... if we had actual community, it could trump the media it was on.
Perhaps, but not easily. The medium is still the message, and, to pick on Facebook, its design militates against true discourse. Engaging in a debate is an exercise in fighting the tool. For example, FB is intent on making you accidentally miss part of what your opponent has said. Everything from hiding comments, to forcing you to click "read more" over and over again is designed to facilitate misunderstanding and frustration. If an actual debate in good faith is the goal, it is hard to overcome design problems of this sort.
Regarding your rejection of FB as an absolute filter, it remains true that FB decides which posts in your FB group it will show to people, and which ones it won't.
> The medium is still the message, and, to pick on Facebook, its design militates against true discourse. Engaging in a debate is an exercise in fighting the tool. For example, FB is intent on making you accidentally miss part of what your opponent has said.
Describing this as Facebook's intent requires references. Facebook's algorithm aims for engagement and a certain TL;DR quality. But there are no tools that don't require a person to fight the medium. No medium is perfect for discussion, and any discourse requires an awareness of the medium where it is taking place.
> Regarding your rejection of FB as an absolute filter, it remains true that FB decides which posts in your FB group it will show to people, and which ones it won't.
Only if members stay on their regular Facebook feed. If you have a tight group, people go directly to the group page.
If they go directly to the page, FB will still apply its filter to the posts in that group.
> there are no tools that don't require a person to fight the medium.
Perhaps, but there are tools that don’t make it practically impossible to find out what your opponent actually tried to say to you. I have been in the situation multiple times of doing everything I can on FB to make sure I read everything that was written in a thread prior to responding but still failed. This is what I mean when I say FB is intent on it. The design makes it impossible, whether that was the aim or not.
Edited to add: It’s no surprise that this leads to greater “engagement,” but it is obviously counterproductive to actual debate.
> If they go directly to the page, FB will still apply its filter to the posts in that group.
FB doesn't filter the posts; it shows them all. I'd be surprised if you could show otherwise.
FB does do its comment-filtering, but that can be gotten around by clicking "show more."
I've never had the experience of FB hiding comments in a fashion where I fundamentally couldn't find them (and I do a lot of discussion on FB). It can be hard to see everything.
Perhaps if you scroll far enough it will show you every post in a group. Most people don’t. FB chooses what to show them from the group. That’s the filter.
As for not showing every comment, I am positive it has happened to me a few times that an important comment just doesn’t show up in the thread until after several others have been left. I have assumed it is because they are using a database that doesn’t guarantee a SELECT gives every record, but who knows.
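That guess about reads missing records can be illustrated with a toy eventually-consistent store (purely hypothetical; no claim about Facebook's actual storage): a read replica that applies writes only after a replication delay will miss a comment written just before the read.

```python
class LaggyReplica:
    """A read replica that makes writes visible only after a fixed
    replication delay, so a read issued too soon misses fresh comments."""

    def __init__(self, lag_s: float):
        self.lag_s = lag_s
        self._log = []  # (written_at, comment)

    def write(self, comment: str, now: float):
        self._log.append((now, comment))

    def select_all(self, now: float):
        # Only writes that have had time to replicate are visible.
        return [c for t, c in self._log if now - t >= self.lag_s]

replica = LaggyReplica(lag_s=5.0)
replica.write("important comment", now=0.0)
early = replica.select_all(now=1.0)   # read too soon: comment missing
late = replica.select_all(now=10.0)   # after replication: comment visible
```

In a system like this, the "missing" comment eventually shows up on its own, which matches the described experience of a comment appearing in the thread only after several later ones.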
As far as I can tell, the FB group view, what you see when you go "to" the group page, shows posts in order of who last commented on the post. It's a much more reliable view than the feed, which indeed can show posts from anywhere in any order.
I think you're right that Facebook sometimes does just fail to show some comments at some points. The thing here is that any worldwide distributed system on the scale of Facebook will sometimes do that. I don't believe there's any algorithmic manipulative intention involved here (as opposed to the feed, which indeed is something like Facebook's spin on things).
And the same goes for Facebook's feed, even. Yes, it's a filter of your friends' comments and links, and a filter with a spin on it. But what is the content of the average minor metropolitan newspaper? 90% of it is a filter of the wire-service feed with a particular spin to it.
> what you see when go "to" the group page, shows posts in the order of who last commented on the post.
But that is itself a decision to elevate conflict and hide things that don't generate conflict, unless there is an extraordinarily small number of posts or the group is just small. Because every time you visit the page you start at the top, and you get sucked into the "high-engagement" content before you get to anything else.
> I think the hostility these entities face today expresses a situation of them not having a tighter grip (at least a much tighter grip) on the public imagination...
The issue isn't a grip on our imaginations; it's the grip on our data (including off-site traffic, thanks to ubiquitous trackers) that seems objectionable. And when it's the only way to talk to your grandma, for example, it's maddeningly difficult for a lot of people to quit.
You have 3-4 different groups talking about "what the problem is" with FB and Google. Some talk about the kinds of conversations they facilitate, some talk about how they undermine traditional experts, some talk about how they take data.
And for some the answer is monk-like isolation and simplicity, for some the answer is imposing state control over these entities, and for some the answer is limiting the data these entities can keep.
The thing that's frustrating is that the only fix with legs is the worst one: states directly regulating social media and the views therein. It's unfortunate that others with other complaints attach themselves to a process heading in this direction.