A worry that is not yet on the scientific or cultural agenda is neural data privacy rights. Not even biometric data privacy rights (beyond genomics) are on the agenda yet, which is surprising given the personal data streams amassing from wearable computing, Internet-of-Things biosensors, and quantified self-tracking activities. Neural data privacy rights concern the privacy and security issues raised by the personalized data flows that arise from the brain.
There are several reasons why neural data privacy rights could become an important concern. First, personalized health data is already a contentious personal data issue, and anything regarding the mind, mental performance, and potential pathology carries even more sensitivity and taboo.
Second, neural data privacy rights could become an issue because it is not difficult to measure some level of the brain's electrical and other activity, and ever-ratcheting price-performance improvements could make it possible to capture and process the neural activity of vast numbers of people simultaneously and in real time. Many consumer-available devices already measure neural and related physiological activity, including EEG (electroencephalography), PPG (photoplethysmography), and TMS (transcranial magnetic stimulation) systems; augmented headsets like Google Glass, Oculus Rift, and foc.us; and emotion and cognitive-state analysis applications using eye-tracking, mental state identification, and affect analysis. Does Google Glass come with a Faraday cage?
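To make the low barrier concrete, here is a minimal sketch (in Python, with a synthetic signal standing in for a headset's output) of the kind of processing any consumer EEG pipeline performs: estimating band power from a raw trace. The sampling rate, amplitudes, and band edges are illustrative assumptions, not the specification of any particular device.

```python
# A minimal sketch of how little it takes to quantify "some level" of brain
# activity: compute alpha-band (8-12 Hz) power from a single EEG channel.
# The signal here is synthetic; a real consumer headset would supply the
# samples, but the processing is the same. All values are illustrative.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz, typical of consumer EEG devices

# Synthetic 10-second trace: a 10 Hz alpha rhythm buried in noise.
t = np.arange(10 * FS) / FS
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * np.random.randn(t.size)

# Estimate the power spectral density with Welch's method.
freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)

# Integrate the PSD over the alpha band to get band power.
band = (freqs >= 8) & (freqs <= 12)
alpha_power = np.trapz(psd[band], freqs[band])
print(f"alpha-band power: {alpha_power:.3e} V^2")
```

A few dozen lines like these, running continuously against a streaming device, are all that separates a raw sensor from a persistent neural data feed.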
Third, at some point, big data machine learning algorithms may be able to establish the validity and utility of neural data by correlating it with a variety of human health states and measures of physical and mental performance.
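This third point is the step where raw signals become inferences. Below is a hedged sketch of what such a correlation pipeline might look like: band-power features per recording session, a self-reported binary state label (say, fatigued vs. rested), and a standard classifier. Everything here, features, labels, and the separability of the synthetic data, is assumed for illustration.

```python
# Hypothetical sketch: learn a mapping from EEG band-power features to a
# self-reported state label. The data is synthetic and nearly separable
# by construction; real neural data would be far noisier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Features: [alpha_power, beta_power] per session (arbitrary units).
X = rng.normal(size=(n, 2))
# Assumed ground truth: higher alpha relative to beta means "fatigued" (1).
y = (X[:, 0] - X[:, 1] + 0.5 * rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

If models of this shape ever reach clinical-grade accuracy on real recordings, the privacy stakes of the underlying streams rise accordingly.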
Fourth, despite the sensitivity of neural data streams, they are like any other form of personal data in that as few as two data elements can suffice to identify an individual, so privacy, security, and anonymity may be practically impossible to guarantee. At worst, there could be malicious hacking, viruses, and spam targeting neural data streams.
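That claim, that a couple of data elements can constitute an identification, is easy to demonstrate. Below is a toy sketch, with invented records, that links a nameless neural-data log to a named customer list using only birth year and ZIP code; the point is how small the joining key can be.

```python
# Toy re-identification: join a "de-identified" neural data log to a named
# dataset on just two quasi-identifiers. All records are invented.
neural_log = [
    {"birth_year": 1985, "zip": "94107", "alpha_power": 3.2e-11},
    {"birth_year": 1990, "zip": "10001", "alpha_power": 1.8e-11},
]
customer_list = [
    {"name": "A. Smith", "birth_year": 1985, "zip": "94107"},
    {"name": "B. Jones", "birth_year": 1990, "zip": "10001"},
]

# Index the named records by the (birth_year, zip) pair.
by_key = {(c["birth_year"], c["zip"]): c["name"] for c in customer_list}

for rec in neural_log:
    name = by_key.get((rec["birth_year"], rec["zip"]))
    if name:  # two fields were enough to restore an identity
        print(f"{name}: alpha_power={rec['alpha_power']:.1e}")
```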
Detailed Essay: "Neural Data Privacy Rights: An Invitation For Progress In The Guise Of An Approaching Worry"