"Apparently, your brain creates a very specific electrical brain response, known as P300, when one is presented with information that is already contained in one’s mind. If you recognize the information (i.e., it is familiar to you), you will have a P300 response. There is no way to avoid this; it is a biological/electrical stimulus response event. Sort of like a lie detector, only (reportedly) always accurate.
Think of this as Mind-Reading 1.0.
Question: "Did you murder John Doe?"
Answer: "No."
Question: "Have you ever been inside this house?" [While presenting a picture of the front of a house]
Answer: "No."
Question: "Did you commit this murder?" [While presenting a picture of the murder scene, which took place in the bedroom]
Answer "No."
"Sir, you are under arrest – you had a P300 response to the murder scene."
Maybe you take some comfort in the belief that this is not being accepted in the courtroom. Think again. P300 evidence is already being used in court, admitted by both defense and prosecuting attorneys.
Now what happens when science improves and Mind-Reading 2.0 is available?...
Now juxtapose this emerging technology with how our courts have interpreted our rights against unreasonable searches and seizures guaranteed by the Fourth Amendment to our Constitution. For example, the Supreme Court long ago ruled (Smith v. Maryland) that we have no "reasonable expectation of privacy" in telephone call header data (i.e., whom you called, on what day, and for how long).
Now that the technology exists to "read" P300 responses, I have to wonder where we are going to draw the line...
Is it possible the Supreme Court could someday rule that certain brain activity is not private? If this seems far-fetched, let me share one plausible journey that might just make this true. The Court has held (Katz v. United States) that an expectation of privacy is not "reasonable" unless both: (1) a person can claim "a legitimate expectation of privacy" over a particular type of information; and (2) this expectation is one that society is prepared to recognize as "reasonable." And, of course, what society sees as "reasonable" changes over time..."
Scary, but a nice illustration of how a series of apparently small (and some moderate-impact) technology and legal-system decisions, spread out over time and taken without a systemic appreciation of their cumulative effect, can lead to a dangerous place.
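For readers curious what "reading" a P300 involves at the signal level, here is a minimal, hypothetical sketch: average many EEG epochs time-locked to the moment a stimulus (say, the crime-scene photo) is shown, and compare the mean amplitude in the roughly 300-600 ms window against epochs for stimuli the subject has never seen. The sampling rate, time window, threshold logic, and synthetic data below are illustrative assumptions only, not the method used by any actual forensic or courtroom system.

import numpy as np

def p300_score(epochs, sampling_rate_hz=250, window_s=(0.3, 0.6)):
    """Average stimulus-locked EEG epochs and return the mean amplitude
    in a typical P300 window (~300-600 ms after stimulus onset).

    epochs: array of shape (n_trials, n_samples) for one channel,
            each trial time-locked to stimulus onset at sample 0.
    """
    erp = epochs.mean(axis=0)                    # grand-average ERP waveform
    start = int(window_s[0] * sampling_rate_hz)  # window start, in samples
    stop = int(window_s[1] * sampling_rate_hz)   # window end, in samples
    return erp[start:stop].mean()                # mean amplitude in the window

# Illustrative comparison: "probe" stimuli (e.g., the crime-scene photo)
# versus "irrelevant" stimuli the subject has never seen. A markedly larger
# probe score is what a P300-based recognition test looks for.
rng = np.random.default_rng(0)
n_trials, n_samples, fs = 40, 250, 250            # 1-second epochs at 250 Hz
t = np.arange(n_samples) / fs
p300_bump = 5.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))  # synthetic peak near 400 ms

probe_epochs = rng.normal(0, 2.0, (n_trials, n_samples)) + p300_bump  # familiar: P300 present
irrelevant_epochs = rng.normal(0, 2.0, (n_trials, n_samples))         # unfamiliar: no P300

print("probe score:     ", round(p300_score(probe_epochs, fs), 2))
print("irrelevant score:", round(p300_score(irrelevant_epochs, fs), 2))

Even this toy version makes the policy point sharper: the "reading" is just statistics over involuntary electrical activity, which is exactly why the question of whether such activity counts as private matters so much.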