Are Social Media Companies like Facebook and Twitter Platforms or Publishers?

Or can you be one or the other depending on which is to your advantage at the moment? That seems to be what Facebook especially wants to do.

A platform is content neutral, like the phone company: it does not interfere in any way with the content of what gets said on its network (wiretapping in criminal cases is a separate matter). A publisher makes editorial choices and takes responsibility for the consequences of that content. It used to be a pretty clear distinction, until online conversation networks came along.

In the early days, those of us who managed those online conversation networks, including big companies like America Online, argued that we shouldn’t be held liable for everything the users say to each other. For one thing, we would have had to pre-approve everything that got posted, which most users would find intolerable. Or we would have had to actively censor whatever we found objectionable. So for the most part people said whatever they wanted and dealt with the liability or libel issues themselves. In 1996 in the USA, Congress passed the Communications Decency Act, which did a number of things, mainly related to obscenity and minors, but included Section 230, which made online services immune from liability for the content posted on those companies’ sites. In 1997 the Supreme Court struck down most of the CDA, but kept Section 230. This was all before social media as we know it was even dreamed up.

So into this legal framework came social media operations that gained audiences far beyond anything any other online service had ever seen or even dreamed of. And with the advent of the Facebook ‘news feed’ began the process we find ourselves in today, where the companies actively manipulate what we see on our screens to maximize our desire to stay on the site, or return to it often, and to feed us a lot of what we want to see and hear so we might buy the things the advertisers show us. Whether they intended it or not, this actively contributes to a more polarized society, according to numerous studies and essays.

And now, Facebook has banned extremist individuals outright.

So are they a platform or a publisher? They say they are a platform when they want that shielding from the law and they admit to being a publisher when they want to get rid of what they consider to be hate speech or people who espouse violence.

But when video of the New Zealand mass shooting spread all around Facebook, Facebook used the shield of a platform to avoid liability for spreading it, then wanted to be a publisher when it banned the video. And let us remember that while that video was viewed on their platform millions of times, they made money off the ads that ran alongside it.

This article on Vox does a good job of explaining the situation…

Meanwhile Facebook is tussling with the EU on a number of regulatory fronts, mainly to keep the EU from regulating them at all, although they seem to be willing to accept a little of it.

For now FB can play the neutral platform card. According to the above Politico article, “The EU law governing responsibility for content on social media platforms is the 2000 e-commerce directive, which does not hold companies like Google and Facebook liable for illegal content posted by their users. Companies must take down illegal content once it has been flagged as such, but they are not required to actively prevent it from being uploaded.”

But this is not at all settled. What do you think should happen?

The debate on Internet regulation teems with these questions. Is Uber a tech company or a taxi company? Is Airbnb a platform or a hotel business? I am fresh from a long conversation with Amelia Andersdotter (@teirdes). She explained that the legal consequences of sticking a label on things are very far-reaching. If I understand her reasoning, she advocates:

  1. Taking a long, hard look at the emergent consequences of the way these companies do business before taking action.
  2. Mapping these consequences against the values that we would like society to uphold. This can be messy. In the case of Uber, some people dislike the relatively low compensation that many drivers appear to get for their time. But others point to the fact that Uber has made taxi travel safe for Indian women. Women do not like to take taxis in India, because drivers have been known to demand more money than agreed once the car is in motion and the client has left the safety of her house. There have been cases of harassment, too. Uber does not have these problems, because (1) the price of a ride is established in advance and (2) the identity of the driver is known, and the company is credible when it threatens consequences against misbehaving drivers.
  3. Only after this move in to regulate.

I am probably representing this quite clumsily, Amelia herself could do it much better.

Often regulation is one or more steps behind the innovative entity that acts in ways that call out for some regulation. In Uber’s case, it does seem like the taxi cab industry, at least in the US, was so complacent that it was ripe for serious disruption. With Airbnb it’s more complicated, because an unintended consequence is how much it has changed the rental picture for local residents, which has a large social impact. In Facebook’s case, they seem to just want to run out ahead of any regulation because, I think, they want to constantly adjust their models to maximize their ROI from all the data gathering.

I can sympathize with FB’s view in a sense. Back when I started and managed our online news operation, it was owned by a company that had thousands of employees, both union and non-union. The unionized workers were represented by no fewer than 14 separate unions. Naturally they wanted to represent my employees too. I, as a lifelong union sympathizer and former union member myself (Machinists International), thought that someday it would be appropriate for them to represent the workers on my crew. But not at first, because they would enforce rules written long ago for a different industry with clear divisions of labor, and that would stifle the quick movements we required to be innovative. In that sense, unions function kind of like regulators. But then, we were essentially giving news away to the public with a small number of ads thrown in, and we were not doing any deep data mining on our users…

Author and columnist Cory Doctorow just published this column in Locus discussing how anti-trust laws have been so weakened that they no longer get applied to breaking up really big monopolistic companies when the real problem is the sheer size and hegemony of those companies.

"Concentration leads to all kinds of bad things. Take Facebook, a company whose day is long past. People fucking hate Facebook, which is why 15,000,000 people left the service in 2018, primarily 15-to-34 year-olds, way up from 2016. And where did most of those people go? Instagram, acquired by Facebook in 2012.

Facebook has a privacy problem and we should regulate to fix it, but the reality is that for as long as people who care about Facebook’s privacy dumpster fire end up on another Facebook-owned site, Facebook will slowwalk any kind of privacy protections."

He then goes on a long excursion about Article 13 and copyright, where he makes the point that forcing these companies to monitor all their comments and function as a censor will be so expensive that it will mean breaking them up - the real need - won’t happen because smaller entities won’t be able to deal with that kind of workload. “If we appoint tech giants with the unimaginably expensive civic duty of policing all online speech for copyright infringement (or “extremism” or what have you), that will make it impossible to unbiggen Big Tech: we won’t be able to shrink them into pieces small enough to manage, because those pieces won’t be able to manage their public duties.”

This New York Times editorial by Facebook co-founder Chris Hughes is well worth your time.

Relevant to this topic he says, "Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished. Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values. The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection."

Yes, I read it; makes sense to me. It will be interesting to see which politicians/regulators pursue this line… FB has the means to blackmail/disinform a lot of people into falling in line with FB’s agenda.

Hughes recommends that the government set standards for what can and can’t be said. I think that barks up the wrong tree. A lot of FB’s problems could be solved by giving everyone much more control over their own experience, for instance by letting people do the filtering they want. But that runs counter to Facebook’s business model of building ever more pinpoint profiles against which it sells ads at ever better prices.
