Fri, 27 Sep 2024 14:27:10 -0500
last edited: Sat, 28 Sep 2024 14:27:36 -0500
Evan Prodromou
evan@cosocial.ca
Fediverse admins and mods: do you have a plan for dealing with US Presidential election disinformation in 2024?
#EvanPoll
#poll
Strong yes
11 Votes | 18%
Qualified yes
20 Votes | 33%
Qualified no
17 Votes | 28%
Strong no
12 Votes | 20%
60 Votes in total
Poll has ended
9 comments
Fri, 27 Sep 2024 14:27:45 -0500
Evan Prodromou
evan@cosocial.ca
I asked this in January; I'm interested to see how responses have changed.
https://cosocial.ca/@evan/111819694401940574
Fri, 27 Sep 2024 14:37:49 -0500
John Francis
johnefrancis@mastodon.social
@evan
registration staying closed on cats.mastodon.musk.sucks
Fri, 27 Sep 2024 18:51:00 -0500
James M.
jamesmarshall@sfba.social
@evan
not an admin or mod, but super interested to hear how they're planning to deal with this.
Sat, 28 Sep 2024 08:20:19 -0500
Scott M. Stolz
scott@loves.tech
The biggest problem is that the people who are lying will use their resources to tell people that the truth is a lie. So you can't always count on the consensus or specific parties or interest groups or authorities to be accurate. In fact, there have been many cases where new facts and evidence came out months or years later that completely changed what was proven to be "true."
So instead of telling people what the truth is, I would do two things:
1. Encourage discussion of the issues (and prohibit fighting). A good honest discussion usually chips away at blatantly dishonest propaganda because one side will have evidence and the other side won't. Or, more typically, the other party simply has not thought their position through and it is easy to poke holes in it.
2. I would like to develop an addon (plugin / module) that detects certain keywords and allows moderators to add a "context notice" when certain posts are displayed. This "context notice" would not tell people what to believe, but would direct people to a website that includes links to all different points of view (like ground news), plus a list of independent fact checker articles on the topic. People can see the different points of view and decide for themselves. This would be displayed in the user inbox and next to publicly displayed posts on the website or app, and not distributed over ActivityPub.
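A minimal sketch of the keyword-triggered addon described in point 2 might look like the following. Everything here is hypothetical — the `TOPIC_KEYWORDS` map, the `ContextNotice` shape, and the URLs are illustrative, not an existing plugin API — and, per the proposal, the notice is rendered locally next to the post rather than distributed over ActivityPub:

```python
from dataclasses import dataclass
from typing import Optional

# Moderator-maintained map from keyword to a neutral resource page that
# links to several points of view plus independent fact-checker articles.
# (Hypothetical data; a real deployment would curate its own list.)
TOPIC_KEYWORDS = {
    "election fraud": "https://example.org/context/2024-election",
    "ballot": "https://example.org/context/2024-election",
}

@dataclass
class ContextNotice:
    topic: str
    more_info_url: str

def match_context_notice(post_text: str) -> Optional[ContextNotice]:
    """Return a context notice if the post mentions a tracked topic.

    The notice is shown next to the post in the local UI only; it is
    never merged into the post body and never federated.
    """
    lowered = post_text.lower()
    for keyword, url in TOPIC_KEYWORDS.items():
        if keyword in lowered:
            return ContextNotice(topic=keyword, more_info_url=url)
    return None
```

The notice directs readers to context rather than declaring what is true, matching the "decide for themselves" framing above.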
I've found that prolonged exposure to evidence and different interpretations of that evidence (i.e., different points of view) will eventually convert even a "true believer" into someone who is more aligned with the truth. It doesn't happen overnight, but getting people out of their information bubble helps give people a more accurate view of the world.
Sat, 28 Sep 2024 08:55:53 -0500
Evan Prodromou
evan@cosocial.ca
@scott
why not distributed over ActivityPub?
Sat, 28 Sep 2024 09:22:17 -0500
Scott M. Stolz
scott@loves.tech
@Evan Prodromou
why not distributed over ActivityPub?
A few reasons:
1. The only way to distribute it via ActivityPub currently is to concatenate the context notice onto the user's post. On most platforms, this would make it look like it came from the author. This would make it a form of compelled speech, which is not fair to the author of the post.
2. Since the only way to distribute it is to concatenate it onto someone's post, you could wind up with duplicate notices attached to a post as it passes through different systems (shares/quote-posts/repeats/boosts).
3. It is usually better to have one context notice for the whole thread instead of a context notice on every post. This becomes a problem on non-threaded platforms like Mastodon because every post is a top level post on Mastodon (as far as threaded apps are concerned).
4. Different administrators have their own moderation style, and some would like to implement their own systems instead of what we have.
5. Some users would not want to be on a platform that places context notices on their outgoing posts. (They can't control what a receiving platform does, but they can choose a platform that does not send outgoing context warnings attached to their posts.)
So, it is a combination of free speech and technical issues.
If ActivityPub has a built in moderation protocol that distributes context notices, labels, and blocks separately from the post itself, that would be ideal. You could have multiple moderation providers to choose from, and platform developers can integrate moderation and context notices into their platform.
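As a thought experiment, a standalone moderation object of the kind proposed here might look roughly like this. The "Label" type and every field shown are hypothetical — ActivityPub defines no such vocabulary today. The essential property is that the notice references the post by id rather than being concatenated onto it, so it carries its own attribution and cannot be mistaken for the author's speech:

```python
import json

# Hypothetical standalone moderation object. A receiving server that
# understands the (invented) "Label" type could render the notice next
# to the referenced post; one that does not would simply ignore it.
context_label = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Label",  # hypothetical type, not part of any spec
    "actor": "https://moderation.example/actors/trust-panel",
    "object": "https://social.example/users/alice/posts/123",  # the post
    "summary": "Disputed claim about the 2024 US election",
    "url": "https://example.org/context/2024-election",
}

print(json.dumps(context_label, indent=2))
```

Because the label is a separate object with its own `actor`, readers can see who attached it, and admins could choose which moderation providers to subscribe to.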
Sat, 28 Sep 2024 10:59:40 -0500
Evan Prodromou
evan@cosocial.ca
@scott
yes, I think we might want to build that.
Sat, 28 Sep 2024 11:35:45 -0500
Devin Murray :verifiedpurple:
Solarinas@posthat.ca
@evan
If you call banning and blocking on sight a plan, then yes
Sat, 28 Sep 2024 13:57:25 -0500
last edited: Sat, 28 Sep 2024 13:59:43 -0500
Scott M. Stolz
scott@loves.tech
@Devin Murray :verifiedpurple: @Evan Prodromou
If you call banning and blocking on sight a plan, then yes
Well, in my mind, you would have at least five levels of moderation available, depending on the configuration and features of the site or app.
1. Administrators and moderators of the sending instance can ban the user and delete content.
2. Administrators and moderators of the receiving instance can block the user, block content based on filters, block individual posts, and/or block a domain.
3. Users can block other users, unfollow users, unfollow threaded conversations, set up filters, and/or set permissions (prevent commenting on a thread, require approval before a comment appears, etc.).
4. Third-party moderation tools, installed by the admin, can automatically act on incoming posts: blocking, marking as sensitive content, adding context notices, adding content warnings, etc. Ideally, individual users can customize this for their own feeds/inbox.
5. Web-host-level blocks, which block entire domains, bots, AI tools, scrapers, etc.
The features you have would depend on what platform you are using. (For example, Mastodon doesn't understand forum threads, so you can't unfollow a specific thread.) And some specialized fediverse web hosts offer web-host blocks; otherwise the server admin would have to configure that at the server level.
But on most platforms, you can block other people yourself, even if they are not blocked on the other four levels of moderation.
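The layering described above can be sketched as an ordered cascade of checks, where the first level that objects decides the outcome. The level names, post fields, and actions below are illustrative only, not any platform's actual API:

```python
# Minimal sketch of layered moderation: a post passes through each
# level in order; the first check that returns an action wins, and a
# post no level objects to is delivered normally.

def moderate(post, checks):
    """Run a post through an ordered list of (level, check) pairs.

    Each check returns an action string ("drop", "label", "hide", ...)
    or None to pass the post on to the next level.
    """
    for level, check in checks:
        action = check(post)
        if action is not None:
            return level, action
    return None, "deliver"

# Example layers, from the sending instance down to the individual user.
checks = [
    ("sending-instance", lambda p: "drop" if p.get("author_banned") else None),
    ("receiving-instance", lambda p: "drop" if p.get("domain_blocked") else None),
    ("third-party-tool", lambda p: "label" if "spam" in p["text"] else None),
    ("user", lambda p: "hide" if p["author"] in p.get("user_blocks", ()) else None),
]
```

Note that the user-level check runs last but independently, which mirrors the point above: a user can still block someone even when no higher level has.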