

The GDPR forces us to look at our data and categorise it as personal, personally identifiable, or everything else (keeping in mind that what was once impersonal can become personally identifiable in association), but we rarely question why we collect and store this information in the first place. Existing Data Protection legislation already requires that only necessary data be collected, and that it be kept only for as long as it is necessary. Yet we seldom consider whether a piece of data or metadata is useful in itself once we add it to our model and data stores. Often we start collecting it for some future use that is neither clear, decided, nor planned; and once we have it we keep it, because it's data and so must be valuable.

I'm suggesting that we should not collect common personal categorisation data unless there is an overriding need, and that in the overwhelming majority of cases there is no such need. This thought was provoked most recently by this Tweet.




Ada Rose Cannon ada@mastodon.social Retweeted ProPublica

As a developer if you are ever asked to do something like this. Pause and look at yourself and what you are enabling. "Someone else would do it so I might as well be paid for it" is not an excuse. Don't build evil. Don't enable evil systems. We need a tech Hippocratic oath.


My initial response was:



Simon_Lucy Retweeted Ada Rose Cannon ada@mastodon.social

This sounds easy to avoid, but there are simpler, more basic enablers. Collecting and indexing classifiers such as ethnicity, gender, and gender preference allows populations to be targeted for whatever purpose.

It is straightforward to refuse to write systems and applications that can be used to aid prejudice and foster division, but it's very hard to avoid modelling and designing into data stores the categorisations that can later be used in ways prejudicial to the owners of that personal data.

But what about needing to know who is affected by this or that prejudice and persecution, so we can protect them and improve their lot? Surely we need to collect information to identify those parts of the population that need help? But do you need to count in order to know the right way to treat everyone? Is the counting and identifying itself the wrong?

For specific needs, does someone really need to identify themselves as disabled, or are they really an individual with a requirement?
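The distinction above — modelling the concrete requirement rather than the category — can be sketched as a data model. This is a hypothetical illustration, not anything from the original post; all names here (`AccessRequirement`, `ServiceUser`) are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class AccessRequirement:
    # The concrete thing the service must provide,
    # e.g. "step-free access" or "large-print documents".
    description: str

@dataclass
class ServiceUser:
    user_id: str
    # Requirements are stored directly; there is no "disabled: yes/no"
    # flag, so the store holds nothing that classifies the person,
    # only what the service must do for them.
    requirements: list[AccessRequirement] = field(default_factory=list)

user = ServiceUser("u123", [AccessRequirement("step-free access")])
print([r.description for r in user.requirements])
```

The design choice is that the system can act on each requirement without ever holding a categorisation that could later be indexed or used to target a population.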

Personally, I don't place myself in any ethnic or religious category on any form, and I would avoid gender and age if I could. As a modeller and architect, would I argue against collecting this data? I would now.

I would employ all the arguments about not collecting personal data unnecessarily. Does your application or system really require gender to be relevant? Really? Age? Should the provision of public services need ethnic data? And so on.

Really ask whether each of these metadata categories is necessary, bearing in mind that each category will likely come from a controlled list, perhaps plus 'other'. What purpose will be served? If it's for some broad population statistic, ask how the category gives meaningful information that actually matters in that statistic. Take Male/Female (I won't fall back on self-described gender issues; to begin with, the traditional simple case should suffice): how does it help to know which box someone ticked? Will they buy or be interested in a different product or service? Will they want different information? Will the content be filtered?

If the answer is yes, I'd ask: so you wouldn't sell to someone of the wrong gender? Would you only show pink bikes to girls, and carbon-fibre drop handlebars to boys? I'd hope not, so how does your case differ?
