---
name: "Responses to OGBV"
excerpt: " "
author: "CIS and Tattle"
project: " "
date: 2021
---
**This is one of the earliest articulations of the project vision and approach, written in July/August 2021.**

The graphic narrative titled ‘Personal (Cyber) Space’, published in 2016 by Parthasarthy and Malhotra, narrates the experience of a young internet user. The animated short comic, hosted by Kadak, a South Asian women's collective, asks: ‘If one says something, there’s the fear of hateful response. But if one doesn’t say something, isn’t that silence counterproductive?’ only to end with the question, ‘so what does one say?’
Through extensive qualitative data collection and participatory analysis, the ultimate aim of the project is to envision creative and collective responses to the structural problem of violence experienced online, and to help build solidarity and shared understanding while empowering users to take back control of their digital experience.

### Situating machine learning
Machine learning is commonly used to automate decision-making when the volume of data is large. Briefly, machine learning works by finding patterns in existing data and using them to assign values to future inputs. Instead of being told what to do, the algorithm infers what to do from the data it is fed. Both the data used to train a machine learning system and the algorithm used to classify that data can encode social beliefs and values, which are then perpetuated in the system's behaviour.

The moderation decisions of social media platforms often make international news. Some decisions can be attributed to error: machine learning systems, like every prediction system, make errors. But some decisions reflect the social values embedded in the data and algorithms behind the model. As a result, what many communities find harmful may not count as harmful under the guidelines set by social media platforms.

Machine learning tools can also be designed to reflect the values of those at the forefront of tackling violence, and to protect those at the receiving end of it. This is precisely the goal of our project.
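The point that a model learns its rules, and its values, from labeled data can be shown with a deliberately tiny sketch. This is a hypothetical toy, not the project's actual model: "training" derives per-word weights from labeled examples instead of hand-written rules, so whatever judgments the labels encode are carried straight into the predictions.

```python
# Toy illustration: a classifier that learns word weights from labeled
# examples (label 1 = harmful, 0 = not) rather than from hand-coded rules.
from collections import Counter

def train(examples):
    """Build word weights: words from harmful texts score +1, others -1."""
    weights = Counter()
    for text, label in examples:
        for word in text.lower().split():
            weights[word] += 1 if label == 1 else -1
    return weights

def predict(weights, text):
    """Score unseen text by summing learned word weights."""
    score = sum(weights[w] for w in text.lower().split())
    return 1 if score > 0 else 0

# Hypothetical training data; the labels embody the annotators' values.
data = [
    ("go back to the kitchen", 1),
    ("women should stay silent", 1),
    ("great article thank you", 0),
    ("i loved this thread", 0),
]
model = train(data)
print(predict(model, "stay silent"))  # -> 1 (a learned pattern, not a rule)
```

Note that even the innocuous word "the" picks up a positive weight here simply because it appeared in a harmful example: a miniature version of how beliefs and biases in training data get encoded into, and perpetuated by, a model.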

## Stakeholders

In 2021, we ran workshops with gender rights activists and researchers and identified the following stakeholders:

<Box width={"100%"} height={"fit-content"}>
  <Image
    alignSelf={"start"}
    fill={true}
    fit="contain"
    src={
      "https://github.com/tattle-made/website/blob/master/static/products/OGBV_stakeholders_2.png"
    }
  />
</Box>

The first three stakeholders are those at the receiving end of online gender-based violence (OGBV). The fourth stakeholder might not be directly subject to OGBV but may encounter and be affected by such content on social media. The final stakeholders are those who can help respond to OGBV.

In 2021, we conducted focus group discussions and interviews. We cataloged the feature requests we heard at the time [here](https://docs.google.com/document/d/e/2PACX-1vQ29pzrZtFOxgb5yPgA1a-y_0LZbGRureUx0E0LZvRx2VgqANOYrAvDNuqYHQMpVcQPgH3ql-52YSu9/pub).

We prioritized features based on their perceived importance and the level of effort required in developing them.
