Update cross-platform_pt_2.mdx (#543)
Updating with correct conventions
kaustubhavarma authored Apr 17, 2024
1 parent bf494f4 commit d7b2e9d
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions uli-website/src/pages/blog/cross-platform_pt_2.mdx
@@ -11,11 +11,11 @@ import ContentPageShell from "../../components/molecules/ContentPageShell.jsx"
<ContentPageShell>

We're continuing by taking a look at how cross-platform abuse operates on federated and centralised platforms, and the kinds of solutions they must test and implement to tackle it.
- The most popular decentralised/federated social media platform is Mastadon; Threads and Bluesky have also followed suit into adopting federation protocols for social media platforms.
+ The most popular decentralised/federated social media platform is Mastodon; Threads and Bluesky have also followed suit into adopting federation protocols for social media platforms.
These platforms, built on the ActivityPub protocol and the AT Protocol, do not have a centralised authority that oversees activity.
Instead, users join different 'instances', and each instance has its own set of rules, block lists and moderators. The instances interact with one another through 'federation'.
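
To make that architecture concrete, here is a minimal sketch in TypeScript. It is purely illustrative: the `Instance`, `Post`, and `acceptsFederatedPost` names are hypothetical and do not come from ActivityPub, the AT Protocol, or any platform's codebase; the point is only that each instance applies its own block list and rules to incoming content.

```ts
// Hypothetical model of instance-level moderation in a federated network.
// None of these types come from ActivityPub or the AT Protocol; they only
// illustrate that each instance applies its *own* rules and block list.

type Post = { author: string; instance: string; body: string };

interface Instance {
  domain: string;
  blockedInstances: Set<string>;              // defederated domains
  localRules: Array<(post: Post) => boolean>; // true = allowed
}

// A post from another instance is only shown if its home instance is not
// blocked and it passes every local rule of the receiving instance.
function acceptsFederatedPost(receiver: Instance, post: Post): boolean {
  if (receiver.blockedInstances.has(post.instance)) return false;
  return receiver.localRules.every((rule) => rule(post));
}

// Example: two instances with different policies treat the same post differently.
const strict: Instance = {
  domain: "strict.example",
  blockedInstances: new Set(["spam.example"]),
  localRules: [(p) => !p.body.includes("slur")],
};
const lax: Instance = {
  domain: "lax.example",
  blockedInstances: new Set(),
  localRules: [],
};

const post: Post = { author: "@troll", instance: "spam.example", body: "..." };
console.log(acceptsFederatedPost(strict, post)); // false (instance is blocked)
console.log(acceptsFederatedPost(lax, post));    // true  (no local restrictions)
```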

- ##Federated Moderation
+ ## Federated Moderation

On the federated platforms mentioned above, it has been noted that hateful material can rapidly disseminate from one instance to [another](https://arxiv.org/pdf/2302.05915.pdf).
Federation policies help administrators of instances create rules that ban or modify content from other [instances](https://arxiv.org/pdf/2302.05915.pdf).
@@ -28,7 +28,7 @@ Pleroma mentions the option to report a user's post to the administrator if it i
Centralised social media platforms, on the other hand, have more extensive documentation on the process for [redressal](https://docs-develop.pleroma.social/frontend/user_guide/posting_reading_basic_functions/).
On both federated and centralised platforms, the user goes through different reporting mechanisms for recourse.
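
As a rough illustration of that fragmentation, the sketch below uses hypothetical report shapes (none of these fields belong to any real platform's reporting API) to show how the same incident has to be re-expressed for each platform's own mechanism:

```ts
// Hypothetical report shapes; real platforms each define their own fields,
// which is why a target of cross-platform abuse must file separate reports.

interface AbuseIncident {
  abuserHandle: string;
  evidenceUrls: string[];
  description: string;
}

// One platform may expect a category from its own fixed list...
function toCentralisedReport(incident: AbuseIncident) {
  return {
    reported_account: incident.abuserHandle,
    category: "harassment",
    links: incident.evidenceUrls,
    comment: incident.description,
  };
}

// ...while a federated instance forwards the complaint to its own administrator.
function toFederatedReport(incident: AbuseIncident, instanceDomain: string) {
  return {
    forward_to_admin_of: instanceDomain,
    account: incident.abuserHandle,
    statuses: incident.evidenceUrls,
    note: incident.description,
  };
}

// The same incident, re-expressed twice by the person reporting it.
const incident: AbuseIncident = {
  abuserHandle: "@abuser",
  evidenceUrls: ["https://platform.example/post/1"],
  description: "Coordinated harassment across platforms",
};
console.log(toCentralisedReport(incident));
console.log(toFederatedReport(incident, "instance.example"));
```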

- ##Responses
+ ## Responses
As we discussed earlier, responses by centralised platforms to cross-platform abuse focus on prima facie illegal content such as CSAM and terrorist content.
Amongst research on the decentralised web, there have been suggestions for tools that could be used to tackle the issues that come with moderation on federated platforms:
(i) WatchGen, a tool that proposes instances moderators should focus on, thereby reducing the burden of moderation on [administrators](https://arxiv.org/pdf/2302.05915.pdf);
@@ -40,7 +40,7 @@ By automating and attempting to improve the detection of toxic content and flag
Both categories of platforms must test tools that engage in collaborative moderation for more effective and thorough action on content. Given the gravity of harms such as online abuse, platforms must extend signal-sharing protocols and similar technical responses to these offences as well, beyond straightforward offences such as CSAM and terrorism.
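
To give a sense of what extending such signal sharing could look like, here is a minimal sketch, loosely inspired by the hash-sharing databases already used for CSAM and terrorist content. The `SharedSignalList` design and its fields are hypothetical rather than an existing protocol: platforms exchange hashes of actioned content instead of the content itself.

```ts
import { createHash } from "node:crypto";

// Hypothetical signal-sharing sketch: platforms contribute hashes of content
// they have actioned for abuse, and other platforms query the shared list
// before the same content spreads further. Only hashes travel between
// platforms, never the underlying content.

type AbuseSignal = {
  contentHash: string; // SHA-256 of the offending content
  reportedBy: string;  // platform that contributed the signal
  reason: "harassment" | "csam" | "terrorism";
};

class SharedSignalList {
  private signals = new Map<string, AbuseSignal>();

  // A platform that has actioned a piece of content contributes its hash.
  contribute(platform: string, content: string, reason: AbuseSignal["reason"]) {
    const contentHash = createHash("sha256").update(content).digest("hex");
    this.signals.set(contentHash, { contentHash, reportedBy: platform, reason });
  }

  // Another platform checks newly posted content against the shared signals.
  match(content: string): AbuseSignal | undefined {
    const hash = createHash("sha256").update(content).digest("hex");
    return this.signals.get(hash);
  }
}

// Usage: platform A actions an abusive post; platform B can recognise the
// identical content if the abuser re-posts it there.
const shared = new SharedSignalList();
shared.contribute("platform-a.example", "targeted abusive message", "harassment");
console.log(shared.match("targeted abusive message")?.reportedBy); // "platform-a.example"
```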

- ##In sum
+ ## In sum

Corporate accountability may limit the extent of responsibility a platform has towards a user (i.e. a platform entity is only responsible for what goes on within that platform), and an issue is considered resolved when flagged content is acted on by moderators or administrators, as the case may be.
Within federated platforms, an administrator's responsibility is limited to acting upon content in their instance, and the issue is considered 'resolved', just as on centralised platforms.