Social media manipulation

A big part of the problem, as I understand it, is the way that Facebook and Twitter set up their feeds to stoke controversy and polarization. They would prefer it stayed at a controlled boil, since they can sell more ads that way. But it does boil over, and then they think they have to start censoring some of the bile they in fact encourage as a business model.

I want user control. I do not want them acting as censors when it suits them and then in other cases claiming it isn’t their responsibility. I personally abhor hate speech, but I don’t see how it can be fairly defined or enforced. In porn, there are “community standards” that put a lid on it in some places, but we’re talking about planetary services. And maybe I want to see the unvarnished full view of what people are saying. I am an adult; why would I not be allowed to see it? Or maybe I don’t want to see it. Then let me decide. Not Facebook or Twitter. They don’t exactly have a trustworthy record of ethics for such powers.


You are pointing in a very valid direction, where a large part of the problem lies.

Just listened to this podcast about the Carlos Maza saga and how Steven Crowder violated YouTube’s hate speech rules (but still got away with it, basically because his following is too big). But the main takeaway is the second part of the podcast, in which they explain how it is possible that fringe voices became so big, and how they’re only now realizing how they’ve helped radicalize people.

(and this article from The Verge tells how they’re now even going to get rid of recommendation algorithms on educational playlists, “So you won’t fall asleep during chemistry lessons and wake up to conspiracy theories.”)

That article about YouTube playing nice for kids tends to make me more cynical about them rather than less. We’ll do the right thing for your kids but we’ll keep on manipulating you. Still, it shows that they respond to some pressure. Which has been quite heavy lately given how shameless they have been in exploiting kids.

I need to find the reference, but I read recently that controversy and polarization raise ad revenue more than happy kitty photos and cuddly babies do. Since they are publicly traded, they can’t let their stock price drop as if they were part of some socially responsible portfolio. Can’t have that. Sometimes I think of it metaphorically not so much as a road but as a bobsled or luge run: once you get going, there is no getting off.


This article in Wired lays it out pretty well.

" Earlier this year, researchers at Google’s Deep Mind examined the impact of recommender systems, such as those used by YouTube and other platforms. They concluded that “feedback loops in recommendation systems can give rise to ‘echo chambers’ and ‘filter bubbles,’ which can narrow a user’s content exposure and ultimately shift their worldview.”

The model didn’t take into account how the recommendation system influences the kind of content that’s created. In the real world, AI, content creators, and users heavily influence one another. Because AI aims to maximize engagement, hyper-engaged users are seen as “models to be reproduced.” AI algorithms will then favor the content of such users.

The feedback loop works like this: (1) People who spend more time on the platforms have a greater impact on recommendation systems. (2) The content they engage with will get more views/likes. (3) Content creators will notice and create more of it. (4) People will spend even more time on that content. That’s why it’s important to know who a platform’s hyper-engaged users are: They’re the ones we can examine in order to predict which direction the AI is tilting the world."

So, in other words, what we see is largely a product of the people with nothing better to do with their time…
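The four-step feedback loop quoted above can be sketched as a toy simulation. All the numbers, the two topics, and the "creators chase views" rule are my own assumptions, purely for illustration; the article only describes the loop qualitatively:

```python
# Toy model of the engagement feedback loop described above.
# Assumptions (mine, not the article's): two content topics, one small
# group of hyper-engaged users who strongly prefer "outrage" content,
# and creators who shift their output toward whatever got the most views.

def simulate(rounds=10):
    supply = {"kittens": 0.5, "outrage": 0.5}   # share of content created
    # Watch-time weight each user group gives to each topic (assumed):
    casual = {"kittens": 1.0, "outrage": 1.0}   # 90% of users, indifferent
    hyper  = {"kittens": 0.5, "outrage": 5.0}   # 10% of users, outrage-hungry
    for _ in range(rounds):
        # (1)+(2): engagement signal, dominated by hyper-engaged users
        views = {t: supply[t] * (0.9 * casual[t] + 0.1 * hyper[t])
                 for t in supply}
        total = sum(views.values())
        # (3): creators chase the engagement signal
        supply = {t: views[t] / total for t in views}
        # (4): the loop repeats with the new content mix
    return supply

print(simulate())
```

Even though the hyper-engaged group is only 10% of users here, after ten rounds the "outrage" share of created content has climbed well past its 50% starting point: the small group's preferences get amplified on every pass through the loop.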

That’s exactly what the podcast touched upon as well: YouTube’s drive to get to 1 billion views, with every decision, from design to algorithms, made to get people to watch more and more. They apparently joked about the previous algorithm being called the Gangnam Style algorithm: every video would eventually lead to Gangnam Style, the most popular video at that time. They started realizing that just showing the same content, only more popular, did lead to more views per video, but not more minutes watched (if you get fed exactly the same video but watched by more people, you may like it, but you won’t continue watching it).
So the culprit is YouTube’s aim to make itself profitable (which it wasn’t when Google bought it): its need to have people watch more and more led it to create something that allows fringe voices into your feed when you’re just looking for a video on how to mow your lawn, and now you’re a flat earther.

Right. They always need you to think you’re getting it all for free. And of course you are, monetarily. But they need to be pretty non-transparent about it, because none of them make that much unless they follow you around and compile where you go and what you do.

User control is not just a policy issue. And not just a privacy or bubble issue. Or a UI issue. It is all of them. Facebook would say, “but you do have user control. You control who your friends are, many privacy settings, what groups you want to join and so much else.”

But what about following me around? What if I control that for myself? What if I say: don’t follow me here (yes, there’s private browsing, but Matt Coleman, security expert extraordinaire, says it isn’t all that private), but you can follow me to these sites while I shop for something, and when I find it you leave me alone again, as one example.

Anyway, I don’t see how at this point the giant companies can be controlled other than by enforcing ways that allow authentic competition. Fining them does nothing as far as I can tell. Right now they just crush or buy whatever looks interesting or makes them more profitable or dominant. And again, this is the bobsled course: you go fast and you can’t get off.


I was watching tv just now and an ad came on for Hewlett-Packard laptops. The theme read, “Be you. Nobody’s watching. Now with webcam kill switch.”

Oh wow, big feature. Now you don’t need tape. It does reflect, though, a change in the blanket acceptance of blessed tech.


In Edgeryders, that would be me! :smiley: Do we even have recommendation algos?

Oh boom, this just showed up in my feed

Also interesting to EarthOS crew and @ilaria


Thanks @inge for the ping.
WTF is the first thing that comes to mind. Then, well, there’s a lot to say about that. I’ll take some time to think about it and come back to you.


From the end user’s point of view, especially on a platform like YouTube, where you essentially do the digital equivalent of channel zapping, a recommendation algorithm is key.

The problem is that, for YouTube, this algorithm is ultimately optimized for the wrong metric (“engagement”, presumably, which drives advertising consumption and hence revenue).

The solution though isn’t simply to abolish the algorithm - we still want a system that shows us new things. And these new things should ideally blip us out of our bubble, but in a way that doesn’t make us uncomfortable or, worse, bored.

We’re toying with the idea of making the algorithm transparent and user-controlled, but it’s hard to do right. The fetched content should be relevant, new, interesting, and still somehow in tune with the user’s preferences. And it needs to be clear why each and every bit of new content is presented to the user, so they understand what is going on.

Getting an algorithmic feed needs to be a pull operation, not a push one. That means the user needs to actively ask for the algorithmic feed, each and every time. The default needs to be a sorted timeline view. And if that gets too full, the user should be asked to curate which content is shown there, and which sources get pushed to the back burner.
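A minimal sketch of what that pull model could look like. Everything here is hypothetical, the names, the tag-based scoring rule, the data model; it is not an existing Edgeryders or Discourse API, just an illustration of "chronological by default, explainable recommendations only on request":

```python
# Sketch of a pull-based, explainable feed. All names and the scoring
# rule are assumptions for illustration, not a real platform API.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Post:
    title: str
    created: datetime
    tags: set = field(default_factory=set)

def timeline(posts):
    """Default view: plain reverse-chronological order, no algorithm."""
    return sorted(posts, key=lambda p: p.created, reverse=True)

def recommendations(posts, interests):
    """Only runs when the user explicitly asks (pull, never push).
    Returns (post, reason) pairs so every suggestion is explainable."""
    scored = []
    for p in posts:
        overlap = p.tags & interests
        if overlap:
            reason = "recommended because you follow: " + ", ".join(sorted(overlap))
            scored.append((p, reason, len(overlap)))
    scored.sort(key=lambda item: item[2], reverse=True)
    return [(p, reason) for p, reason, _ in scored]
```

The design choice this tries to capture: `timeline()` is what renders unless the user calls `recommendations()` themselves, and a recommendation without a human-readable reason simply cannot exist in this model, because the reason is part of the return value.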