Yesterday, the algorithm prompted me to read a somewhat controversial story by Isaiah McCall.

By now, I've seen it all over the place. The algorithm dropped it into my email; it's been sitting in the "For you" feed and currently ranks #1 in the "trending" section of its main tag page.

Isaiah is a well-established author, and I would love to read more of his work. Skimming his blog, I found plenty of stories I'd have enjoyed, but the algorithm chose to introduce me to his work through a story that is rather disparaging toward the hosting platform.

Why is the AI now blatantly promoting stories about potential "policy infringement" versus "censorship" issues?

Curiously, the last installment of "The Edition" (Friday, November 4) also addresses the policy subject and praises the Trust and Safety team and its rules.

Should we be scared? Is the AI out to get us? Is this sudden censorship call to attention in the best interests of the platform and its community? Is there a hidden agenda?

How do you feel about all of this?


The ouroboros effect

Don't get me wrong: I've chosen Isaiah's story from among many others to illustrate something I've been noticing for a long time and dubbed "the ouroboros effect."

If you've visited the topic page for any specific tag, you've probably noticed that you are always shown the trending section (not the "latest," not the "best,"… the "trending").

"Trending" can be a loaded word on platforms that mix human and AI-based distribution.

We can only guess what that means. Maybe it's a function of several engagement metrics. Some suggest it's a formula factoring in reading time, claps, responses, and so on.

Sorry to break it to you: we're all on a wild-goose chase. Nobody knows. I can only guess that all our theories are wrong.

There are more heuristics and variables at play than we can fathom. We make the algorithm in our own image.

How can a trend be supported by the same algorithm that promoted it in the first place?

Social media trends supported by AI always remind me of an ouroboros, a serpent eating its tail.
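The self-feeding dynamic is easy to simulate. In the toy model below (all numbers invented), each story's chance of being shown is proportional to its current score, and every exposure adds to that same score, so an early lead compounds into a runaway "trend":

```python
import random

def simulate_ouroboros(n_stories=5, rounds=1000, seed=42):
    """Toy rich-get-richer loop: exposure is proportional to score,
    and the act of being shown boosts that same score."""
    rng = random.Random(seed)
    scores = [1.0] * n_stories  # every story starts equal
    for _ in range(rounds):
        # Pick a story with probability proportional to its score...
        pick = rng.choices(range(n_stories), weights=scores)[0]
        # ...and showing it increases the very score that got it picked.
        scores[pick] += 1.0
    return scores

scores = simulate_ouroboros()
# The final distribution is skewed: a handful of stories absorb most
# of the exposure purely through the feedback loop, not merit.
```

The serpent eats its tail: the algorithm's output becomes its own input, and "trending" ends up meaning "whatever the algorithm already decided to show."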


The endless cycle of negativity in social media

If you're still reading this, you're probably an avid reader; thus, you're no stranger to the "ouroboros effect."

And if you use social media, I'm guessing it's because you want to connect with friends, favorite authors, and other people who share similar interests.

But unfortunately, the algorithms behind these platforms have developed into something more sinister than a helpful tool for staying connected with friends and loved ones: they're creating a toxic environment where hate speech and misinformation thrive.

To make it even worse, I'm noticing that sometimes the algorithm seems to work against the platform itself, heavily promoting stories that undermine the platform's own growth and hinder everyone's progress in the long run.

To escape the echo chamber, I suggest you take a breather and read this story by Carolyn Hastings, regardless of any algorithm bias.


Trending negativity echo chambers

You might be thinking, "Come on! Are you going to tell me that we're training the algorithm to fail?"

But I am telling you exactly that.

The algorithm is just a computer program executing a set of rules we've given it. Those rules are designed to serve us content we like, and they don't care whether the results offend us or make us uncomfortable.

The algorithms have no empathy for their users; they don't realize that some people may find specific stories offensive or upsetting.

They're just following the instructions provided by human developers, who may sometimes have made errors of judgment when deciding what goes into an article's metadata and bylines.

In short: You might think your article about global warming isn't controversial because it's scientifically accurate, but if someone else thinks otherwise, then who's right? Who knows? Maybe neither of you knows anything about science, or perhaps both do! Maybe everyone does.


Is the fault in the algorithm or our stars?

This is a fascinating question. I don't think this is a conspiracy theory, but it is worth discussing whether our actions might cause the algorithm to create more negative content in the future.

The algorithm is built by engineers and data scientists who want to maximize engagement, not necessarily serve the public interest.

The company makes money only if users are on the platform, so it has every incentive to keep us engaged and addicted for as long as possible.

However, I wonder whether the "ghost in the machine" is short-sighted: it seems to miss the bigger picture, which probably explains why publishing platforms are seeing a downward spiral in views.

We make the algorithm in our own image!

So if we assume developers are trying to create "intelligent algorithms," it seems reasonable to consider that they would seek to promote a positive experience as much as we do.

Algorithms aren't burdened by the same physical limitations and biological needs that come with being human.

Still, in theory, an "intelligent algorithm" is not unlike us — it just wants what we want.

It's essential to understand this because this is where things start to get tricky.

If you think about how people interact with social media networks like Facebook or Twitter, there's often a lot of negativity involved, especially when talking about politics.


The 'ghost in the machine' heuristic conundrum

The very definition of an algorithm should allow for an adjustment in its behavior based on real-life interactions.

It's like how you don't learn to drive by reading a book or watching someone else drive on YouTube. You must get behind the wheel and take the car for a spin.

The same should be true of our algorithms, but it's not right now. Why?

Because we're still too fixated on building systems optimized for momentum and performance alone, not for their ability to learn from their mistakes and adjust their behavior accordingly.

The "intelligent algorithm" is and will remain a mirage if we keep it trapped in the echo chamber it creates. The AI will only bring about its own demise, and ours.


How to escape the algorithm's echo chamber

So what does that mean for how we use social media? Are we heading toward an even more polarized landscape?

It means that if you're scrolling through your feed or reading about politics online to get information about candidates or policy positions or other things of consequence, chances are pretty good that you're going to get information from people who share similar viewpoints with you.

And since most of us only follow people who agree with us politically (as opposed to those who disagree), there's no reason why this trend would change anytime soon — especially as social networks become increasingly polarized.

And so, the question becomes: How can we change social media for the better?

We need to improve social media by changing how we interact with it.

  • As a society, we need to be more responsible on social media.
  • We must become more aware of the impact of our actions on others.
  • We need to be more empathetic toward those who might get hurt by them.

We can do this by taking a step back from our devices and rethinking how we use social media.

In a free society, we can't make social networks less available to force users out into the real world instead of remaining glued to their devices 24/7.

We can take matters into our own hands and escape the echo chamber by stepping out of our AI-based feeds and searching for content ourselves within communities of like-minded people.

Another solution could involve changing some basic features within social media services themselves; perhaps making it harder for people to post hate speech or incite violence would reduce abuse online.

It's hard enough to keep up with current events as they happen, let alone to understand not only why but also how much digital technology has changed over time (think about what happened during WW2 compared with what is happening today).


Concluding remarks

It may seem like we're doomed to an endless cycle of social media negativity, but we can do plenty of things right now to change that.

For one thing, we could stop sharing so much bad news and harmful content on social media — and instead engage with and share more positive stories and uplifting content from our friends.

Every platform has positivity clusters like Engage, my newest publication.

Other articles from this series:

⭐️ Sign up through this link. Support your favorite platform and its talented authors. You'll boost our community's success and support my work with a small commission, all while gaining exclusive perks and benefits as a member.

Don't click it unless you mean it!

Thanks for taking the time to read this story on Engage. Let me start by inviting you to follow the publication and consider becoming a writer. Please take a minute to read our mission statement and subscribe to our newsletter so you never miss a thing.