Contributor Privacy: call to converse & reading notes #1218
Interesting topic, which seems to me important to explore further and translate into actions. Great initiative to kick this off! While going through the quotes, which are quite interesting, I sense that the discussion goes beyond privacy alone. There is undoubtedly an aspect related to democracy (community empowerment, shifts in power dynamics, etc.) and to ethics (how we should responsibly use available data). @halcyondude, do you share my impression? In this context, I recalled the Linux Foundation Inclusive Speaker Orientation, a training that Kubernetes community leaders are required to undergo at some point. Perhaps a similar course for community leaders could be expanded to incorporate some of the ideas shared here. If our goal is to formalize actions for enhancing the CNCF community specifically, we might need to narrow the focus (extracting the ideas related to privacy and other relevant elements). We could frame a guiding question like, "How can we safeguard contributors' privacy in a transparent, open, and publicly accessible community?", evaluate the community in that context, and then suggest potential changes.
Contributor Privacy: Call for conversation(s)
Open source projects are created, managed, and sustained by communities of contributors, maintainers, users, and vendors. As we seek to better understand the size, composition, topology, and shape of open source communities, we must exercise care and caution. The evolution and prevalence of open source software has made it increasingly easy to inadvertently violate the privacy of contributors, causing harm and potentially putting people at risk, and in some cases in danger.
What follows are reading notes from a book recently published by MIT Press. I think the book is a good backgrounder to inform a forward-looking discussion around privacy. We have guidelines and rubrics for understanding how to build secure systems, and defined controls to ensure that once built, our systems remain secure. It's critical that, as we embrace the evolution of our domain(s), we have analogous guidelines and controls for [Contributor] privacy.
"Data Action: Using Data for Public Good" carries the subtitle "How to use data as a tool for empowerment rather than oppression". I've captured some of the ideas presented in the text and offer them as a potential starting point for a discussion around Contributor Privacy. I found it at the MIT Bookstore in Cambridge, MA, which is one of my favorite places :).
As we seek to understand the communities we serve and their health, we must take care! Protecting the privacy of others also ensures our own. Here, as with Accessibility, the CNCF has the opportunity to lead.
There are a few parts I haven't finished yet; here's the working document. It will be moved to a CNCF-owned drive.
I had the opportunity to attend Sophia Vargas's talk, "Design Metric Programs to Respect Contributor Expectations and Promote Safety" (video, sched), and found it to be insightful.
⚡ Idea ⚡: Let's create a "TAG co-chair's bootstrap kit" which should include an extended and expanded (...ex{pa,te}nded...?) learning module (1 of n) covering this topic.
(original slack thread)
Values: How we use data
Excerpts and quotes taken from “Data Action: Using Data For Public Good” unless otherwise noted.
https://mitpress.mit.edu/9780262545310/data-action
Sharing Data Creates Transparency, Public Participation, and Collaboration
"Sharing Data does so much more than provide access to information. It creates trusting relationships, changes power dynamics, teaches us about policies, fosters debate, and helps to generate collaborative knowledge sharing, all of which are essential to building strong, deliberative communities." S. Williams, Data Action: Using data for public good, p. 137
"Data visualizations help create a narrative around an idea, and it's the narrative that ultimately has the ability to change people's hearts and minds. When using data for action, we must focus on the story we want to tell with the data." S. Williams, Data Action: Using data for public good, p. 141
It's how we work with data that really matters
…big data in its raw form cannot perform on its own; rather, how data is transformed and operationalized can change the way we see the world. More specifically, data can be used for civic action and policy change by communicating with the data clearly and responsibly to expose the hidden patterns and ideologies to audiences inside and outside the policy arena. Communicating with data in this way requires the ability to **ask the right questions**, find or collect the appropriate data, analyze and interpret that data, and visualize the results in a way that can be understood by broad audiences.
Combining these methods transforms data from a simple point on a map to a narrative that has meaning. Data is not often processed in this way because data analysts are often not familiar with the techniques that can be used to tell stories with the data ethically and responsibly.
How to responsibly use data
We must interrogate the reasons we want to use data and determine the potential for our work to do more harm than good.
Building teams to create narratives around data for action is essential for communicating the results effectively, but team collaboration also helps to make sure no harm is done to the people represented in the data itself.
Building data helps change the power dynamics inherent in controlling and using data, while also having numerous side benefits, such as teaching data literacy.
Coming up with unique ways to acquire, quantify, and model data can expose messages previously hidden from the public eye; however, we must expose ideas ethically, going back to the first principle above.
We must validate the work we do with data by literally observing the phenomenon on the ground and asking those it [affects] to interpret the results.
Sharing data is essential for communicating the need for policy change and generating a debate essential for that work. Data visualizations are effective at doing that.
We must remember that data are people, and we must do them no harm. Regulations help provide standards of practice for the use of data, but they often are not developed in line with technological change; therefore, we must seek to develop our own standards and call upon others to do the same.
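To make the "data are people" principle concrete for our own context: one way a community-health analysis could avoid handling contributor identities is to pseudonymize identifiers before any aggregation. Below is a minimal, hypothetical sketch (the sample commit data and function names are invented for illustration, not taken from any CNCF tooling); it salts and hashes emails so that aggregate metrics can be computed and shared without the raw identities ever entering the analysis.

```python
import hashlib
import secrets
from collections import Counter

# Per-analysis random salt; discarding it after the run prevents anyone
# (including us) from re-linking pseudonyms back to real contributors.
SALT = secrets.token_bytes(16)

def pseudonymize(email: str) -> str:
    """Replace an email address with a salted one-way hash."""
    return hashlib.sha256(SALT + email.lower().encode()).hexdigest()[:12]

# Fabricated sample commit log: (author email, files changed).
commits = [
    ("alice@example.com", 3),
    ("bob@example.com", 1),
    ("alice@example.com", 2),
]

# Aggregate activity per pseudonym; raw emails never leave this loop.
activity = Counter()
for email, files in commits:
    activity[pseudonymize(email)] += files

print(len(activity))  # 2 distinct contributors, identities protected
```

This only sketches one layer of protection: salted hashing guards the stored metrics, but small communities can still be re-identified from activity patterns alone, which is exactly why the validation and "do no harm" principles above still apply to the aggregates.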
Axioms
The Purpose for Using Data Analytics Must Be Interrogated
…analysts must begin by asking policy questions of people with on-the-ground expertise, those who know the issue the best - and by believing ultimately that this collaboration will create smarter models. p.215
Building Expert Teams is Essential to Making Data Work for Policy Change
…working collaboratively with policy experts, communities, and designers is essential to reduce the potential for analytics to guide us toward misleading, unethical, or inaccurate conclusions. But more importantly, building expert teams helps communicate the work.
Building Data Changes Power Dynamics and Shapes Communities
…building data has other benefits: it teaches data literacy, builds communities around shared ideas, and creates media buzz around topics by placing them on the policy agenda.
Quantify Ingeniously, but Remember Data Is Biased by Its Creator
Data Brings Insights to the Public in Dynamic Ways
Data Are People, and We Must Do Them No Harm
TODO (a few more quotes from last 3 chapters)