Good point. Journalists often look for ‘blood in the water’ and if they can’t find much, then maybe ratchet things up a bit to increase the controversy.
And FB has to operate at a scale beyond what many of us can probably even imagine. So naturally whatever they do, good or bad, is going to look big to anyone who looks. And Amazon does the same thing with Alexa.
There is a bit of a story here, though, for the reason you point out: they don’t disclose.
The “break things” ethos unfortunately allows a lot of room to justify unethical behavior. And this is exactly why they get so much scrutiny. The company’s leadership misrepresents what they do, even while under oath, as when Zuckerberg testified before Congress. Those sharks will never leave his tail after that.
Facebook got fined 5 billion dollars by the US Govt last month. The EU is about to announce its findings on FB’s alleged violations of the GDPR regarding sharing WhatsApp data with FB without user consent…
These are huge fines, but they seem not to make a real difference. Europe needs a bigger strategy than leveling big fines. It seems like only a migration of people somewhere else will cause them to change. I imagine that most FB users are not actually that concerned about it.
I dream of antitrust policy based on access to market: “if you don’t break your company up, we will no longer allow you to operate on our 500 million rich people market”. The EU is almost there, with pretty strong decisions imposed on non-European companies by commissioners like Monti and Vestager.
I do think denying access to market is the only effective way to curb monopolistic behavior. But does breaking up big tech companies solve the problem? So, for example, you would split Instagram and WhatsApp away from Facebook?
Hi, it’s been several months since I last came here. I’m really enthusiastic about your project, great effort. However, as was probably mentioned before, why not join an existing solution like the Fediverse/Mastodon? When you reinvent the wheel, you do it on your own, and you add to the hundreds and thousands of little projects that exist but lack users.
Olivier
We actually plan to build on ActivityPub, to join the Fediverse. And Darcy will be open source, to give back to everyone else too.
We will however expand things a bit, to include better moderation interfaces, combining the moderation of several instances, sharing of blocklists between instances, and so on.
I don’t think so. The problem isn’t that this is all owned by Facebook these days, but that the underlying business model is “grow big, get sold, earn money through ads”.
The issue is not only that Facebook, Instagram, WhatsApp et al. are responsible for violating consent; it is often the wider ad-tech ecosystem, made up of companies most people have never heard of, such as AppNexus and Index Exchange, which hold loads of data obtained from users and are almost impossible to reach.
In the United States, for the past several decades, a practice has been considered monopolistic only if it raises prices for consumers. That is not the problem we’re looking at here, and at least in this country there is no policy at this point for dealing with the data issue.
The public itself, and thus the legislators they elect, so far do not view a person’s personal data as a commodity with assignable value. So it is all regarded as a free service. After seeing what a lousy job the senators did in questioning Zuckerberg at his hearings not long ago, I’m not very confident that some of those old guys are going to understand it well enough to write workable law. Add to that the truly massive lobbying campaigns by big tech, and I really don’t expect any movement on this side of the Atlantic.
Yeah, I was just connecting the issue of data handling by Facebook and big tech with the companies that store that data (companies most people have never heard of, but which hold loads of data on people).
As for the Zuckerberg hearing: yeah, if they create a law based on their questions, it will be quite bad. As for lobbying, in the EU Parliament that was (and still is) a big issue with the GDPR, which was watered down by influence from big tech (and if you ask me, big tech wins big with GDPR in place). I’m not sure how we can build an advertising ecosystem that gives people what they want but treats data confidentially and uses it only for the purposes the consumer who provided that data has allowed.
I re-discovered a great technologist (I listened to her a long time ago, forgot about her, then randomly found her on YouTube again): Aleks Krotoski, who has a podcast on the BBC about these topics and an interesting video on web privacy. I’ll link it here: https://www.youtube.com/watch?v=KGX-c5BJNFk&t=1362s
Right now I’m reading “Chaos Monkeys” by Antonio García Martínez. It’s a pretty fascinating Silicon Valley tale with a lot of focus on his time at Facebook a few years ago. He points out that FB didn’t start raking in the truly huge money until they started following people around the web to figure out their real interests. Just confining it to likes, friends, groups, and all the “inside Facebook” activity didn’t produce the kind of targeting that brought the higher ad rates.
This came about around 2011-12, a few years after Google blazed the surveillance trail that is now the norm.
At one time, complaining about advertising on the web seemed somewhat silly, since the ad-supported model was what the market (people) desired. However, as with most things, this went way too far: some “news” pages now actually carry hundreds of ads (CNN, for example). Hundreds! The targeting worked better when it was done in a subtle or general fashion. Now it’s so blatantly obvious, and hides the good stuff, that it seems to lose some of its effect (perhaps I am looking a bit into the future here).
I can almost say that the web was more useful in 1998 than it is today, when Google results are page after page of “content farm” or SEO stuff designed to have the sizzle (results that look good in a list) but no steak once you get to the site.
One would think a smart company like Google would see this - and perhaps they do, but they have pressure to keep the bucks flowing.
I should mention that I attended in-person Google seminars when I was a publisher, and they showed us “heat maps” of pages indicating where we should put ads. But even then, I think Google’s TOS allowed no more than two or three ads per page.
Just thinking out loud here - but billboards are allowed on highways…however, if advertisers put one every 50 meters and they were all moving and colorful to grab your attention, laws and regulation would be passed or the industry asked to regulate itself so that the purpose of the highway remained sound. This may be part of the solution in some places - make the internet “the highway” and tell publishers over a certain size that they have to choose…X amounts of ads per page or % of page with ads, etc.
But ads are not the only problem - it’s the fight for eyeballs which has resulted in so many entire web sites being created for nothing except drawing people in. That part is up to the search engines. It may just happen that duck duck or another SE gets there first - and there is nothing to say that people will not gravitate to the ad-free or limited ad (or limited “content farm” results) search engine.
Anyway, that part is not exactly Social Media related but it is thoughts on the fact that the Internet IS broken and fixes are needed.
It would be truly amazing to see how Google reacts if, for example, 10% of search queries started going to DuckDuckGo or a similar service.
To me it is pretty sad that the Internet is so massively consumed by the dictates of the marketplace, but I suppose it was inevitable. It’s a real trade-off between the Internet of old and the Internet of today. Back in those days I hungered for the better graphics, audio, and throughput that I knew was coming. But back then there was a kind of tacit understanding that the commercialization of everything should not be such a runaway train. Maybe that was just left over from the days, not too many years before, when you couldn’t do anything commercial on the Internet and had to sign pledges on paper certifying that you would remain non-commercial. But I also remember that when I worked for a big media company, the notion of that kind of restraint seemed kind of absurd to the advertising-heavy establishment that employed me: why would you restrain yourself from going after the most money you can make?
I wonder if we even anticipated the current net - when 95% plus of users are “takers” and only a very few are makers. This really came into focus with devices and social media because obviously people were not going to write massive amounts of content on Phones or even early tablets. So that sorta broke it into two parts - whereas at first I think we envisioned more of all being creators and consumers.
The “here today gone tomorrow” fad is also interesting as it speaks against most of the real excitement we felt in terms of forever knowledge. Now it’s a quick jolt of attention and that goes away quickly also.
I still get excited on forums where I see my long term rep going up due to actual usefulness of my posts or opinions.
What we need is a new Moses coming down from the mountain with the short and sweet of the Internet guidelines!
The metaphor you choose with Moses might illustrate rather well how disputed such a strict top-down approach would be and how it could lead to conflicts around the nature, origin and believability of such guidelines.
Well, that’s understandable - but it also reflects on how having proper food labeling and caloric content in eateries and other such information can help people know what is best for their own health and the health of society in general.
I have run into many real world situations where people don’t know the basics (even the Golden Rule) - but when I consider telling them the score I realize that if they don’t know by now I’m certainly not going to be able to clue them in!
My experience as admin of a large forum for 18 years was that Leadership (admin and mods and top helpers and posters) set those rules and 98% or more of the users followed along once they realized it was all good (no negatives involved).
Attention is a funny thing. Many people cannot discern negative attention from positive and from most everything in-between. When billions are online reacting to this…well, it causes the problems we see today. IMHO.
A “moses” in this case is a metaphor for a new take on the Golden Rule.
I have no faith that social media companies that make most of their money tracking you and subtly funneling you into where they want you to go in order to maximize revenue are ever going to allow enough user control to “un break” it.
Social networks are a lot more socially random than the societies that form around specialty sites like yours or this site here. So I think rules of civility can be applied in the very broadest sense in a social network but otherwise I think one size just does not fit all and administrators are going to be playing defense endlessly. This is why I think it is necessary to have good user controls in social media for it to work in ways that we here think it should work. (Assuming we are more or less aligned.)
This comprehensive study lays out the state of things today:
Samantha Bradshaw & Philip N. Howard, “The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation.” Working Paper 2019.3. Oxford, UK: Project on Computational Propaganda. comprop.oii.ox.ac.uk. 23 pp.
From the intro: “‘Cyber troops’ are defined as government or political party actors tasked with manipulating public opinion online (Bradshaw and Howard 2017a). We comparatively examine the formal organization of cyber troops around the world, and how these actors use computational propaganda for political purposes. This involves building an inventory of the evolving strategies, tools, and techniques of computational propaganda, including the use of ‘political bots’ to amplify hate speech or other forms of manipulated content, the illegal harvesting of data or micro-targeting, or deploying an army of ‘trolls’ to bully or harass political dissidents or journalists online. We also track the capacity and resources invested into developing these techniques to build a picture of cyber troop capabilities around the world.”
The summary:
Over the past three years, we have monitored the global organization of social media manipulation by governments and political parties. Our 2019 report analyses the trends of computational propaganda and the evolving tools, capacities, strategies, and resources.
- Evidence of organized social media manipulation campaigns in 70 countries, up from 48 countries in 2018 and 28 countries in 2017. In each country, there is at least one political party or government agency using social media to shape public attitudes domestically.
- Social media has become co-opted by many authoritarian regimes. In 26 countries, computational propaganda is being used as a tool of information control in three distinct ways: to suppress fundamental human rights, discredit political opponents, and drown out dissenting opinions.
- A handful of sophisticated state actors use computational propaganda for foreign influence operations. Facebook and Twitter attributed foreign influence operations to seven countries (China, India, Iran, Pakistan, Russia, Saudi Arabia, and Venezuela) who have used these platforms to influence global audiences.
- China has become a major player in the global disinformation order. Until the 2019 protests in Hong Kong, most evidence of Chinese computational propaganda occurred on domestic platforms such as Weibo, WeChat, and QQ. But China’s new-found interest in aggressively using Facebook, Twitter, and YouTube should raise concerns for democracies.
- Despite there being more social networking platforms than ever, Facebook remains the platform of choice for social media manipulation. In 56 countries, we found evidence of formally organized computational propaganda campaigns on Facebook.