Social Media is broken, let's do better!

I absolutely CAN share something. This, for example, is our current approach for the “create new user account” flow: [mockup screenshot: darcy-login]

(this is a non-functional mockup - you can click some buttons, but none of the information entered will be stored anywhere)

Another demo page is Darcy Demo. It shows the general layout of the main page. Again, non-functional and not quite finished yet :).


Interestingly enough, this is the complete opposite of one of the first subject-moderation setups that was fully monetized - i.e. that of the AOL Chat Rooms in the early 1990s.

We had a friend who was 14 years old and a mod of the “teen forum” - he made 30K per year doing so! AOL actually paid for him to come down to the VA headquarters for a sit-down with other mods and admins.
Of course, that ended soon enough when AOL couldn’t charge 10 cents a minute for access to their digital world.

In our case (our forums), the mods were all volunteers, but since we made some $$ from ads and sponsorships, I paid them in later years with stipends and bonuses.

I think one main point is that a model has to be built on something that is already proven somewhere (human interaction has not really changed). What people will pay for is likely the same as always, and so is what they will not. How they will act if they pay for something (ownership) also comes into play, as do regional and national differences.

I also have to agree with most of what trythis says about the different goals of users vs. mods - or, in fact, the differing goals of individual users. Although most mods start out as users, they usually rise above the fray based on merit, not on pay. One can only imagine that if FB allowed people to pay $10 a month to start a “more special group”, many of those people would feel they were kings and queens of their little domain. As it stands, many seem to feel this way just because they started a “page”. This is, in fact, one of the largest problems of Social Media: thinking you are special because you have a page or the like. Ah, Human Nature.

I wish I had all the answers - but perhaps Social Media is more like Movies and Documentaries, where the “moderators” can, in a sense, be the creators (curators) and therefore be rewarded (paid), rather than the other way around. It’s not so different…the users pay for their streaming services (internet service, Netflix, etc.) and the money funnels up to the content creators and curators.

It would be near impossible, but a tiny tax on internet service that could be used for certain types of social media (akin to community TV and Radio, which are already funded this way) would be ideal. But in today’s world…that is a non-starter, IMHO, for many reasons.

Definitely brainstorming here, but maybe $$ from the good ole billionaires who truly want to see a more just and civil world is part of the answer? They fund Public Radio and TV here in the USA and this could be, in a sense, sold as something similar.

I don’t see a lot of mention of Reddit here - and, yes, they have their problems, but there are some things they do right in terms of balance.

I think @JollyOrc’s idea here is that the users pay for moderation (or not), depending on whether and how much it is wanted.

actually - it’s the server admins who pay for the moderation, not the users. And the fee is calculated by how many active users are on the instance, not how much work they generate.

So: Bob runs a server. They invite 20 of their friends to it, and everything is peachy. When those 20 get into an argument, or Alice posts an inappropriate picture, Bob is on it. With 20 users, it’s easy to do in one’s free time.

Now, those friends are so excited about this that they in turn each invite 20 more users. So suddenly Bob has 400 users on their server, and a month later, it’s 2000.

Handling the moderation needs of 2000 users is too much work for Bob. So Bob turns to Darcy and pays a monthly fee to have their instance moderated. Bob can either pay this fee on their own (maybe they are feeling generous, or have the extra income), or they can collect donations from the users (Darcy will offer easy interfaces to Patreon or PayPal for this). Maybe they hit up the local hardware store to get a static “this instance is supported by DIY Haven!” banner somewhere.

(Bob might be tempted to add in automated banner advertising or worse. But that would get them into trouble with the Darcy moderation team’s terms of service, and the instance won’t be eligible for moderation anymore.)
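To make the per-active-user fee model above a bit more concrete, here is a minimal sketch of how such a fee could be computed. The free-tier threshold and the per-user rate are purely illustrative assumptions, not actual Darcy pricing:

```typescript
// Purely illustrative: threshold and rate below are assumptions, not Darcy pricing.
interface InstanceStats {
  monthlyActiveUsers: number;
}

const FREE_TIER_LIMIT = 50;        // small friend-group instances moderate themselves
const RATE_PER_ACTIVE_USER = 0.10; // assumed monthly rate per active user

function monthlyModerationFee(stats: InstanceStats): number {
  // Only users above the free tier are billable; fee scales with active users,
  // not with how much moderation work they generate.
  const billableUsers = Math.max(0, stats.monthlyActiveUsers - FREE_TIER_LIMIT);
  return billableUsers * RATE_PER_ACTIVE_USER;
}

// Bob's instance at 2000 active users:
console.log(monthlyModerationFee({ monthlyActiveUsers: 2000 })); // 195
```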

We at Darcy strongly believe in paying the moderators, and paying them well. This is serious work that needs serious training and serious support. We’ve all read the reports out of Cognizant and similar outfits. Don’t be like that!

Thanks for clarifying.

And yes, moderators are chronically underpaid, though if they are doing moderation for profitable businesses they tend to do better. A lot of moderating is part-time work with not a great hourly rate, so it attracts stay-at-home mothers, people still living with their parents, and others who don’t or can’t work full time. Many of these folks are very good at the work. Many are not.

Hey @JollyOrc,
nice to read you.
This is an interesting project you’re pushing here.
I was reading about the federation abilities of Darcy, and the comparison with other federated platforms such as Mastodon.
Will Darcy be a part of the Fediverse and able to federate with Pixelfed, Mastodon, etc., based on ActivityPub?

Second question: did you reach out to SwitchingSocial?
They’re doing a good job of referencing all the social network options :slight_smile:

right now, we are aiming to use ActivityPub, so Darcy instances will be able to federate with all instances that use the same protocol.

There is one caveat though: In order to keep the safety promise, federation might be cut with instances that don’t manage to self-moderate to the same (transparent and not too onerous) standards. This is especially important for instances that are aimed at minors, for example.
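As a rough illustration of that caveat, the federation decision could boil down to a check like the one below. The field names and the stricter bar for instances aimed at minors are assumptions made for the sketch, not Darcy’s actual data model:

```typescript
// Illustrative only: field names and policy are assumptions, not Darcy's data model.
interface RemoteInstance {
  domain: string;
  meetsBaseline: boolean;   // self-moderates to the shared, transparent standards
  vettedForMinors: boolean; // explicitly checked against the stricter rules
}

function shouldFederateWith(remote: RemoteInstance, localInstanceForMinors: boolean): boolean {
  if (!remote.meetsBaseline) return false;                             // cut federation entirely
  if (localInstanceForMinors && !remote.vettedForMinors) return false; // stricter bar for minors
  return true;
}
```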

As soon as we are actually live, we will for sure reach out to SwitchingSocial, but it’s too early for that right now.


Well, I guess it’s only the choice of moderators, isn’t it?
I mean, are you creating a tool, or a space?

both - we provide the tool, but we will also use the tool ourselves to create a space with it.


I just signed up to the platform. Honestly, I am surprised by all the potential that is concentrated here. And I have to say this platform is well hidden. Anyway, @JollyOrc, I love your idea. I didn’t follow most of the thread, but I am very, very impressed by your initiative. I hope it will go far into the future. Is it already in some testing phase?


we’re still developing, and are also looking for proper financing - doing things right requires a bit more than just gumption and good intentions :slight_smile:


@SemelAri welcome!

Thanks… and yes, we are not very good at communicating what we do. The good thing is, Edgeryders is not based on selling eyeballs, so being bad at self-promotion does not kill us – though it’s still bad.


Hello @SemelAri! Great to have you here!

You already make some very good points about the communication. Maybe you could help by adding your thoughts on how you found us, how you perceive the platform, and what motivates you here:

If you are interested in @JollyOrc’s projects, check out his glossary

and/or come to the community call he is featured in on Tuesday the 6th:

Looking forward to seeing you there!

Otherwise, we are very curious to learn more about you and what you are up to in general. Maybe you could write a little bit about the projects and questions you are involved with in our “My Story” section :slight_smile:


What he is building is nice and noble, but there are too many flaws in his system, which I pointed out. For example, how can you make the distinction between good and bad if every human being has a different perception of these values?


Thoughts @JollyOrc? :slight_smile:

The baseline that is enforced on the top-down level is actually simple: The law.

On the top-down level, that is the standard that is enforced. On the bottom-up level, every community and local group can define their own standards, as long as they don’t violate the law. And end users will be empowered to curate their own experience, tailored to their needs and values.

Yes, I acknowledge that laws vary around the globe - what is allowed in Germany might not be allowed in China and vice versa. But there are baselines that are universally acknowledged. Right now, we focus on what is the accepted baseline in Europe and will worry about China later…
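A minimal sketch of how those three layers could stack, assuming a simple tag-based rule model (the rule shapes and tag names here are illustrative, not how Darcy actually represents policies):

```typescript
// Illustrative rule model: tags stand in for whatever classification moderation produces.
interface Post {
  authorId: string;
  text: string;
  tags: string[];
}

type Rule = (post: Post) => boolean; // true = the post may be shown

const legalBaseline: Rule = (post) => !post.tags.includes("illegal-content"); // top-down: the law
const communityRules: Rule[] = [
  (post) => !post.tags.includes("off-topic-ads"), // bottom-up: a local group's own standard
];
const userFilters: Rule[] = [
  (post) => !post.tags.includes("spoilers"),      // personal curation by the end user
];

function visibleToUser(post: Post): boolean {
  // The legal baseline applies everywhere; community and user rules narrow things further.
  return [legalBaseline, ...communityRules, ...userFilters].every((rule) => rule(post));
}
```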


Thanks for sharing this!

Just for my understanding, how do you approach how visible conversations are?

Tom Coates once summed up some mechanics I found interesting like this:

“(…) it’s about the fundamental mechanics of the protocol and then a lot is just based on putting the filtering and abuse prevention in the clients. (…) For example, if you can’t see people more than two steps from you, then the incentive for them to be unpleasant or harassing diminishes a lot, and if there’s an asymmetry between the two sides too, that can help. It depends on the physics basically, how far can your voice carry”

I found this metaphor pretty helpful to think about these things. What’s your approach?
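For what it’s worth, here is a rough sketch of what that “two steps” idea could look like as a visibility check over a follow graph. The breadth-first search and the cutoff of two hops are just my reading of the quote, not a concrete proposal:

```typescript
// Illustrative sketch: BFS over a follow graph, stopping after maxSteps hops.
type FollowGraph = Map<string, string[]>; // account id -> accounts it follows

function withinSteps(graph: FollowGraph, from: string, to: string, maxSteps = 2): boolean {
  if (from === to) return true;
  let frontier = new Set([from]);
  const seen = new Set(frontier);
  for (let step = 0; step < maxSteps; step++) {
    const next = new Set<string>();
    for (const node of frontier) {
      for (const neighbour of graph.get(node) ?? []) {
        if (neighbour === to) return true; // reachable within the allowed number of steps
        if (!seen.has(neighbour)) {
          seen.add(neighbour);
          next.add(neighbour);
        }
      }
    }
    frontier = next;
  }
  return false;
}

// A post by "alice" would only be shown to "carol" if withinSteps(graph, "carol", "alice") holds.
```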


yes, that works - up to a point. Because sometimes people actively want to get into a conversation with people who are completely unknown to them, connected only through a shared interest in something. Think striking up a conversation with a stranger in a bar, or while standing in a queue somewhere.

This board, for example: the only connection I have to basically everyone here is that they are here too. Thankfully, all’y’all are nice people, but what would my options be if you weren’t? I’d probably have to count on the hosts here to make things friendly.

Generally, we need to recognize that there are different scopes of communication, different spaces with different needs and constraints. As soon as an internet space is open in the sense of “everyone can register and participate”, you are essentially dealing with the potential global public.

All it needs is a well-worded link in a high-traffic place directing new people to any given space, and it fills up with a new audience - which might be friendly or hostile, might come with its own agenda, might have an interest in and understanding of the place, or might just want to complain… Any new social platform should keep this possibility in mind.


Further evidence of brokenness:

https://www.bloomberg.com/news/articles/2019-08-13/facebook-paid-hundreds-of-contractors-to-transcribe-users-audio


I am actually a bit angry over how these things are reported. If there is any online, server-based voice recognition service at the scale of millions of end users, of course there are going to be humans who check on samples to see if the product works as intended. And of course the bit that says so is hidden in the ToS somewhere, because if you put it front and center, no one is going to use it.

Also: Yes, it feels creepy, but in the end, the actual consequences of this privacy invasion are close to zero - the people who hear the recordings usually aren’t in a position to do anything with them.

That said: Yes, the big tech companies could have handled this better. Don’t allow remote work for this stuff for example, better communication and so on.

But what the reporting should always do is tell people that checking these recordings is more or less mandatory to make the service better, or even possible in the first place. Whenever something translates voice to text on an online device, there should be an expectation that someone will eventually listen in to more or less anonymized tidbits of your voice.