Americans are willing to "get naked" for their government if they feel it will make them more secure. That's the conclusion Jeffrey Rosen reaches in his new book, The Naked Crowd, which explores Americans' willingness to abandon privacy for perceived security.

I'd urge you to read the whole interview, because here I'm just going to mention two places where I disagree with Rosen.
The book takes its title from the name Rosen gives a high-tech X-ray machine tested in airports after 9/11. The machine exposes anything concealed beneath clothing, including plastic explosives stored in body cavities. But, as it was originally designed, the machine functions as an electronic strip-search, producing an anatomically correct image of the scanned person's naked body.
A simple programming tweak can make the machine produce non-gendered blobs instead, while still identifying contraband. But Rosen, a George Washington University law professor, found that many people, including his law students, preferred the machine in "naked" mode because they thought it would be more effective, even though they were told that the tweak made no difference in the machine's ability to expose concealed weapons.
One is that I think he's too sanguine about CAPPS II. Admittedly, its latest proposed incarnation, which proposes only to compare an ID against a database to confirm identity, is a great improvement over earlier versions, which would have delved far more deeply and widely into a person's background. But as Rosen himself notes,
[y]ou might say that it's not a tremendously effective thing to say that people are who they say they are, because most of the 9/11 attackers had valid IDs. Regardless, I'm not as concerned about the privacy implications of a system that's engaging in authentication rather than identification.

But if checking the validity of an ID does not provide any additional security, what's the point of doing it at all? And the second part is just silly: the system was always about authentication, since the identification is simply what you present for the system to authenticate. What's more, he gives no consideration to the accuracy of the databases and the associated watch lists, which are notoriously unreliable.
My other disagreement arises from this exchange:
WN: Should companies be held accountable for not building privacy safeguards into their products? The naked machine, for example, could simply have been built so that it could only operate in "blob" mode.

Rosen: I was told again and again by companies in Silicon Valley, "We only build the machines; it's up to other people to tell us how to design them." I think these technologists felt in good faith that they're not policy makers. Even the decision to refine a naked machine (to become) a blob machine requires some executive to say that privacy is an important value to make that tweak. It's asking a lot of technologists who are instinctively uncomfortable with policy choices.

That's at best disingenuous on their part and naive on his. This attempt to slough off responsibility onto some vague "them" who "told us" how to design the machines, this "we're just helpless victims of others' demands" routine, is laughable. And it's ridiculous to say that they just don't want to "make policy." Of course they're making policy! Yes, the decision to produce "blob machines" involves a policy decision, one that values privacy. But so does the decision to make "naked machines" - it's just that in this case, the policy decision is to value marketability, to say that profit is more important than privacy.
In fact, Rosen later contradicts his own assertion.
I'm very concerned about this military-technological complex. A lot of the post-9/11 policy choices have been driven by the effort on the part of Silicon Valley to market technologies to this burgeoning Homeland Security marketplace. The values of the market are not necessarily the same as the values of the Constitution, and there is indeed a danger that unregulated technologies may threaten constitutional values. [emphasis added]

That is, technologies such as "naked machines" are not the result of policies determined by others, but quite the opposite: the technologies are driving policy, and invasions of privacy become policy not because they are determined to be necessary but because they are technologically feasible. By excusing the companies' lack of privacy protection in pursuit of profit and instead putting all the burden on Congress and the courts (which he does elsewhere in the interview), Rosen ignores half the problem.
However, I will give him the last word, as it may be the best comment I've seen on an overriding issue:
It's often not until the misuse (of data) takes place that [people] realize the dangers of surrendering data, and at that point it's too late.

Amen.