<!DOCTYPE HTML>
<!--
Digital Identities: Design and Uses
The Centre for Internet and Society, India
-->
<html>
<head>
<title>Digital Identities: Design and Uses</title>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no" />
<meta name="description" content="Digital Identities: Design and Uses | A project of the Centre for Internet and Society, India, supported by Omidyar Network" />
<meta name="keywords" content="" />
<link rel="stylesheet" href="assets/css/post.css" />
<link rel="shortcut icon" type="image/x-icon" href="images/favicon.ico" />
</head>
<body class="is-preload">
<!-- Wrapper -->
<div id="wrapper">
<!-- Intro -->
<section class="intro">
<header>
<h1><a href="index.html">Digital Identities:<br />Design and Uses</a></h1>
<p>A project of <a href="https://cis-india.org/" target="_blank">the Centre for Internet and Society, India</a><br /> supported by <a href="https://www.omidyar.com/" target="_blank">Omidyar Network</a></p>
</header>
</section>
<!-- Section -->
<section id="first">
<header></header>
<div class="content">
<h2><em>Towards a framework for evaluation of Digital ID</em></h2>
</div>
</section>
<!-- Section -->
<section>
<header>
<p id="blurb">Draft for discussion</p>
<p id="blurb"><a href="https://www.rightscon.org/" target="_blank">RightsCon</a>, June 2019</p>
<p id="blurb"><a href="" target="_blank">Download the draft</a> (PDF)</p>
</header>
<div class="content">
<p>As governments across the globe implement new, foundational, digital identification systems (<strong>“Digital ID”</strong>), or modernize existing ID programs, there is a dire need for greater research and discussion about appropriate uses of Digital ID systems. The significant momentum for creating Digital ID in several parts of the world has been accompanied by concerns about the privacy and exclusion harms of a state-issued Digital ID system, resulting in campaigns and litigation in countries such as the UK, India, Kenya, and Jamaica. Given the very large range of considerations required to evaluate Digital ID projects, it is necessary to think of evaluation frameworks that can be used for this purpose.</p>
<p>What follows is an attempt to build draft principles against which Digital ID may be evaluated. We hope that these draft principles can evolve into a set of best practices that policymakers can use when they create and implement Digital ID systems, that can guide civil society examinations of Digital ID, and that can highlight questions for further research on the subject. We have drawn from approaches used in documents such as the <em>Necessary and Proportionate Principles</em>, the OECD privacy guidelines, and scholarship on harms-based approaches.</p>
<p>When we refer to the ‘use’ of ID below, we mean the use of digital ID to identify or authenticate ID holders, or make authorisations of any kind on their behalf.</p>
</div>
</section>
<!-- Divider -->
<div id="divider"><hr /></div>
<!-- Section -->
<section>
<header>
<h2>Rule of Law Tests</h2>
</header>
<div class="content">
<p>The rise of Digital ID systems, and the opportunities they present for both public and private actors, has often resulted in hasty implementation and adoption, without sufficient deliberation to produce adequate governance mechanisms. Below are the most basic tests to ensure that a rule of law framework exists to govern the use of ID —</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>1.1</h2>
</header>
<div class="content">
<h5>Legislative Mandate</h5>
<h3>Is the project backed by a validly enacted law?</h3>
<p>Digital ID, by its nature, will entail greater collection of personally identifiable information, as well as greater privacy risks. Any resulting restrictions of fundamental rights must be prescribed by law in the form of a publicly available legislative act. Other forms of regulation, such as executive ordinance, only meet this requirement in limited ways.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>1.2</h2>
</header>
<div class="content">
<h5>Defining Actors and Purposes</h5>
<h3>Does the law clearly specify the actors and the purposes?</h3>
<p>The law must clearly specify the actors, or categories of actors, who may use the Digital ID. Actors include entities who may use the Digital ID, as well as agencies and databases to which it may be connected in any way. Similarly, the purposes for which the Digital ID is used, while they may not all be expressly defined, must always be clearly backed by law.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>1.3</h2>
</header>
<div class="content">
<h5>Legitimate Aim</h5>
<h3>Are all purposes flowing from a ‘legitimate aim’ identified in the valid law?</h3>
<p>All the purposes for use of Digital ID must correspond to a legitimate aim identified in the valid law. This legitimate aim must be “necessary in a democratic society,” and not based merely on political expediency.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>1.4</h2>
</header>
<div class="content">
<h5>Redressal Mechanism</h5>
<h3>Does the law provide for adequate redressal mechanisms against actors who use the Digital ID and govern its use?</h3>
<p>Adequate redressal mechanisms would necessarily include the following three requirements: a) <em>User Notification:</em> individuals must be notified when their Digital ID is used in any way; b) <em>Access and Correction:</em> individuals must have access to personally identifiable information collected through the use of Digital ID, and the ability to seek corrections, amendments, or deletion of such information where it is inaccurate; c) <em>Due Process:</em> individuals must be entitled to a fair and public hearing within a reasonable time by an independent, competent and impartial judicial authority, established by law, in cases where provisions of the law governing the Digital ID are violated.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>1.5</h2>
</header>
<div class="content">
<h5>Purposes</h5>
<h3>If legitimate aims for Digital ID correspond to its specific purposes, does the project restrict itself to the uses which directly relate to such purposes?</h3>
<p>Once the law or its supporting documents specify the legitimate aims of the Digital ID, all purposes must flow from this ‘aim’, and all uses of the Digital ID must have a rational nexus to these purposes.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>1.6</h2>
</header>
<div class="content">
<h5>Mission Creep</h5>
<h3>Is there a legislative and judicial oversight mechanism to deal with cases of mission creep in use of Digital ID?</h3>
<p>In cases where there is an attempt to use the Digital ID for newer purposes, the executive authority must not be able to allow for such uses in the absence of a legislative process for deliberating the additional uses, or their judicial examination against the legitimate aims.</p>
</div>
</section>
<!-- Divider -->
<div id="divider"><hr /></div>
<!-- Section -->
<section>
<header>
<h2>Rights-based Tests</h2>
</header>
<div class="content">
<p>The clearest and most direct critiques of Digital ID systems have come in light of their violations of the right to privacy. Across jurisdictions, critics have discussed different forms of violations of privacy, including mandatory collection of sensitive personal data such as biometrics, lack of robust access-control mechanisms, inadequate protection against private sector collection of data, and increased behavioral profiling through the use of one identifier for all services. Alongside these, serious questions have also been raised about exclusion, where the absence of an ID, or failures in its functioning, can lead to denial of basic entitlements and benefits. Key rights-based principles are highlighted below —</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>2.1</h2>
</header>
<div class="content">
<h5>Necessary and Proportionate Privacy Violations</h5>
<h3>Are the privacy violations arising from the use of Digital ID necessary and proportionate to achieve the legitimate aim?</h3>
<p>The use of Digital ID may pose inherent risks to the right to privacy by leading to the generation of more data, facilitating the connection of varied sets of behavioral data to unique identities, and involving a larger set of actors. Privacy violations arising from the use of Digital ID must satisfy the requirement of being necessary for achieving the legitimate aim. This means it must be the only means of achieving the legitimate aim, or, where there are multiple means, the one least likely to infringe upon privacy rights. Additionally, the privacy violations caused by the use of Digital ID must be proportionate to the legitimate aim being pursued.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>2.2</h2>
</header>
<div class="content">
<h5>Access Control</h5>
<h3>Are there protections in place to limit access to the digital trail of personally identifiable information created through use of Digital ID by both state and private actors?</h3>
<p>Privacy risks to individuals from the use of Digital ID arise both from the generation of data and from access to the generated data. Adequate access control mechanisms would therefore ensure that access to information generated through the use of Digital ID is limited to actors who need that information to achieve specified purposes.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>2.3</h2>
</header>
<div class="content">
<h5>Exclusions</h5>
<h3>Are there adequate mechanisms to ensure that the adoption of Digital ID does not lead to exclusion or restriction of access to entitlements or services?</h3>
<p>If the intended use of the ID could lead to denial or restriction of services or benefits to individuals, or categories of individuals, then there must be mechanisms to ensure that such individuals are not disadvantaged. In these cases, individuals must be able to use other forms of identification to access services or benefits.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>2.4</h2>
</header>
<div class="content">
<h5>Mandatory Use</h5>
<h3>In case enrolment and use of Digital ID are made mandatory, are there any valid legal grounds for doing so?</h3>
<p>Whether enrolment in, and specific uses of, the ID should be mandatory remains one of the most important questions surrounding Digital ID. As mandating the ID limits the agency of individuals, it should be subject to strict legal tests, such as the need to obtain information that is strictly necessary to provide a service to an individual, the prevention of harm to others, and eligibility to undertake specialised tasks.</p>
</div>
</section>
<!-- Divider -->
<div id="divider"><hr /></div>
<!-- Section -->
<section>
<header>
<h2>Risk-based Tests</h2>
</header>
<div class="content">
<p>The debate and discussion around Digital ID has centered primarily on the perceived or existing risks related to privacy, welfare, equality and inclusion. As a range of use cases of Digital ID emerge, laws and institutions governing Digital ID must be vigilant about the risks and harms emerging from them. This needs to be done with some urgency regarding the existing use cases of Digital ID as well. A rights-based approach is, by itself, not sufficient to address these challenges, and there is a need for more paternalistic regulatory measures that strictly govern the nature of uses of Digital ID. Below we attempt to articulate some draft principles. These principles do not exist in most jurisdictions dealing with Digital ID, though there is now an increasing focus on harms assessment in prominent frameworks such as the GDPR.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>3.1</h2>
</header>
<div class="content">
<h5>Risk Assessment</h5>
<h3>Are decisions regarding the legitimacy of uses, benefits of using Digital ID and their impact on individual rights informed by risk assessment?</h3>
<p>Drawing from consumer protection laws, laws governing Digital ID need to take into account tangible harms to individuals, and have clear provisions for preventing those harms and for appropriate redress if they occur.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>3.2</h2>
</header>
<div class="content">
<h5>Proportionality</h5>
<h3>Do the laws on Digital ID envisage governance that is proportional to the likelihood and severity of the possible risks of its use?</h3>
<p>Regulation of Digital ID needs to be sensitive to the actual harms caused by its uses, and be informed by the severity and likelihood of harm. For instance, a design involving the centralised storage of biometric data with robust security safeguards may have a remote likelihood of a security breach, but a very high severity of harm in the event of one.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>3.3</h2>
</header>
<div class="content">
<h5>Response to Risks</h5>
<h3>In cases of demonstrable high risk from uses of Digital ID, are there mechanisms in place to prohibit or restrict the use?</h3>
<p>If the risks from uses of Digital ID are demonstrably high, those uses need to be restricted until adequate mitigating measures can be introduced. This may require a responsive Digital ID regulator with the mandate and resources to intervene promptly.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<h2>3.4</h2>
</header>
<div class="content">
<h5>Differentiated Approaches to Risks</h5>
<h3>Do the laws and regulations envisage a differentiated approach to governing uses of Digital ID, based on the likelihood and severity of risk?</h3>
<p>Drawing from Fred Cate’s model of harms in data protection, a differentiated approach may involve categorising uses as (a) <em>Per Se Harmful</em>: where a use is always harmful (e.g., the use of ID to collect and use alternative data proven to be predatory for credit scoring and lending), the regulator could prohibit the use outright; (b) <em>Per Se Not Harmful</em>: the regulator may consider not regulating uses that present no reasonable likelihood of harm; and (c) <em>Sensitive Uses</em>: where use of personal data is neither per se harmful nor per se not harmful, the regulator may condition the use on several factors, such as aligning with a rights based approach.</p>
</div>
</section>
<!-- Divider -->
<div id="divider"><hr /></div>
<!-- Section -->
<section>
<header>
<span class="image logo"><img src="images/CIS_Logo.jpg" alt="The Centre for Internet and Society (CIS), India" /></span>
</header>
<div class="content">
<p>This website presents research undertaken by <a href="https://cis-india.org/" target="_blank">the Centre for Internet and Society, India</a> on appropriate design choices for digital identity frameworks, and their implications for the sustainable development agenda as well as for civil, social and economic rights. This research is supported by a <a href="https://www.omidyar.com/investees/centre-internet-and-society" target="_blank">grant</a> from <a href="https://www.omidyarnetwork.in/" target="_blank">Omidyar Network India</a>.<br /></p>
<p>CIS is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and regulatory practices around internet, technology, and society in India, and elsewhere.</p>
</div>
</section>
<!-- Section -->
<section>
<header>
<footer>
<p>Copyright: <a href="https://cis-india.org" target="_blank">The Centre for Internet and Society, India</a>, 2019<br />
License: <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank">Creative Commons Attribution 4.0 International</a><br />
Design: <a href="https://html5up.net/paradigm-shift" target="_blank">Paradigm Shift</a> by <a href="https://html5up.net" target="_blank">HTML5 UP</a><br />
Fonts: <a href="https://fonts.google.com/specimen/Fira+Sans" target="_blank">Fira Sans</a> and <a href="https://fonts.google.com/specimen/IBM+Plex+Serif" target="_blank">IBM Plex Serif</a> by <a href="https://fonts.google.com/" target="_blank">Google Fonts</a><br />
Hosted on <a href="https://github.com/cis-india/digitalid.design" target="_blank">GitHub</a></p>
</footer>
</header>
<div class="content">
</div>
</section>
</div>
<!-- Scripts -->
<script src="assets/js/jquery.min.js"></script>
<script src="assets/js/jquery.scrolly.min.js"></script>
<script src="assets/js/browser.min.js"></script>
<script src="assets/js/breakpoints.min.js"></script>
<script src="assets/js/util.js"></script>
<script src="assets/js/main.js"></script>
<!-- Hypothesis -->
<script type="application/json" class="js-hypothesis-config">
{"showHighlights": false}
</script>
<script src="https://hypothes.is/embed.js" async></script>
</body>
</html>