The K-PLEX project on the European Big Data Value Forum 2017

Mike Priddy (DANS, 2nd from right in the image) represented the K-PLEX project at the European Big Data Value Forum 2017 Conference in a panel on privacy-preserving technologies. Read here about the statements he made in response to the three questions posed.


Question 1: There is an apparent trade-off between big data exploitation and privacy – do you agree or not?

  • Privacy is only one part of Identity. There needs to be respect for the individual’s right to build their identity upon a rich and informed basis.
  • The right not to know should also be considered. People have a right to their own levels of knowledge.
  • Privacy is broader than the individual. Confidential data exists in and can affect: family, community, & company/organisations. The self is relational, it is not individual, it produces social facts and consequences.
  • Trust in data use & third party use – where should the accountability be?
  • There is the challenge of transparency versus accountability; just making all data available may obfuscate the accountability.
  • Accountability versus responsibility? Where does the ethical responsibility lie with human & non-human actors?
  • Anonymisation is still an evolving ‘science’ – the effectiveness of anonymising processes is not always well and broadly understood. Aggregation may not give the results that users want or can use, and may protect the individual but not necessarily a community or family.
  • Anonymity may be an illusion; we don’t understand how minimal the data may need to be in order to expose identity. Date of birth, gender & region are enough to be disclosive for the majority of a population.
  • Individuals, in particular young or vulnerable individuals, may not be in a position to defend themselves.
  • This means that big data may need to exclude communities & people with niche problems.
  • The black boxes of machine learning & neural networks don’t allow people to understand or track use, misuse, or misinformation – wrong assertions may be made: you cannot give informed consent under these conditions.
  • IoT and other technologies (e.g. facial recognition) mean that there is possibly no point at which informed consent can be given.
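The point about date of birth, gender & region being disclosive can be illustrated with a small sketch (a hypothetical toy dataset, not from the panel): even without names, records that are unique on this quasi-identifier combination are potentially re-identifiable.

```python
from collections import Counter

# Hypothetical toy records of (date_of_birth, gender, region).
# In real populations, this combination alone singles out a large
# share of individuals, which is why it is considered disclosive.
records = [
    ("1980-03-14", "F", "Utrecht"),
    ("1980-03-14", "M", "Utrecht"),
    ("1992-07-01", "F", "Den Haag"),
    ("1992-07-01", "F", "Den Haag"),  # shares its combination with one other
    ("1975-11-30", "M", "Groningen"),
]

# Count how often each (DoB, gender, region) combination occurs;
# a record whose combination occurs exactly once is unique, and
# therefore potentially re-identifiable.
counts = Counter(records)
unique = [r for r in records if counts[r] == 1]
fraction_unique = len(unique) / len(records)
print(f"{len(unique)} of {len(records)} records are unique "
      f"on (DoB, gender, region): {fraction_unique:.0%}")
```

In this toy set, three of the five records are unique on the combination, so removing names alone would not anonymise them.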

Strategies for meeting these issues:

  • There are well-established strategies to deal with disclosure of confidential data in the Social Sciences and Official Statistics, such as output checking, off-the-grid access, remote execution (with testable data), secure rooms, etc. Checks and balances are needed (a pause) before data goes out – this is a part of oversight and governance.
  • Individuals should be able to see when these processes are triggered, and decide if it is disclosive and whether that is appropriate.
  • More information about how data is used, shared, and processed must be made available to the data creator (in a way they can use it).
  • Meeting the ISO 27001 standard in your data handling and procedures within your organisation is a good start.

Question 2: Regarding the level of development, privacy preserving big data technologies still have a long way to go – do you agree or not?

  • Biases are baked in. There isn’t enough differentiation between kinds of data: mine, yours, raw, cleaned, input, output – data is seen as just data and processed without narrative or context. Rather than privacy by design, we need humanity at the centre of design and respect for human agency.
  • Too often we are only concerned about privacy when it becomes a problem: privacy/confidentiality is NOT an obsolete concept.

Question 3: Individual application areas differ considerably with regard to the difficulty of meeting the privacy requirements – do you agree or not?

  • The problem is the way the question is formulated. By looking at application areas we are basically saying the problem is superficial. It is not. It is fundamental.
  • It has become very hard to opt out of everything. We cannot cut all of our social ties because of network effects.
  • Technology is moving faster than society can cope with and understand how data is being used. This is not a new phenomenon; we can see similar challenges in the historical record.
  • Privacy needs to be understood as a public good; there must be the right to be forgotten, but also the right not to be recorded.
  • Data citizenship is needed: citizens need to be involved enough, and able, to make better decisions about providing confidential/personal data & about what happens to their data – what it means and what happens when you fill in that form.
