Security makes me hungry
From the Associated Press: School Mistakes Huge Burrito for a Weapon
The drama ended two hours later when the suspicious item was identified as a 30-inch burrito filled with steak, guacamole, lettuce, salsa and jalapenos and wrapped inside tin foil and a white T-shirt.
You keep using that word...
I've received much good feedback on my last post about the pudding-headed report criticizing the new DHS smartcard program. Many people are justifiably mystified by the report's references to Bluetooth. The strange thing isn't that the new smartcard doesn't use Bluetooth, but that smart cards and Bluetooth have absolutely nothing to do with each other. It's like asking, "Doesn't the new Honda Accord suffer from all the well documented problems of Esperanto?" The short answer is "no", the real answer is, "what the hell are you talking about?"
The problem, of course, is buzzword creep. With all the industry terminology floating around these days, it's hard for people to remember whether combining two particular concepts produces an argument that's coherent (like biometrics and privacy) or less so (like pancakes and the doctrine of original intent). That such confusion does not typically hinder such people from writing technology assessments or legal opinions is beyond the scope of this blog post.
Bluetooth, a fine technology with many years of buzzwordiness behind it, is particularly susceptible to such content-free punditry. In service to all the technology companies who make perfectly good products that have nothing to do with Bluetooth, but feel market pressure to be 100% buzzword compliant, I offer the following decal:
You wouldn't put it on a cell phone (whether it had Bluetooth or not), but you could stick it onto a toaster, tax software, or a government smart card. I'd start sticking it on our software boxes, but I bet our attorneys wouldn't be too happy.
EPIC report is not so good
A couple of days ago, the Electronic Privacy Information Center (EPIC) issued a scathing analysis of the Department of Homeland Security's upcoming smart card program. Our country (indeed, much of the world) is currently struggling with the concepts of secure identity documents, and watchdog organizations such as the EFF, the ACLU and EPIC play a vital role in shaping the debate. I am completely in favor of holding every government security program to unyielding standards of efficiency, effectiveness and privacy (see here and here, especially in the comments). Unfortunately, this particular report is muddled in many places and simply wrong in others.
Full disclosure: although I am not directly involved in the DHS card program, DHS is a customer of ours and we are working on several products that will make use of the card. In other words, I may be biased but I kind of know what I'm talking about.
Even the first sentence of the report is inauspicious for a security document:
President Bush's proposed $2.57 trillion federal budget for Fiscal Year 2006 greatly increases the amount of money spent on surveillance technology and programs while cutting about 150 programs—most of them from the Department of Education.
Why is the source of the funding relevant to the security analysis of the program? Would the technology be better if it were funded by, say, increased taxes on oil company profits?
EPIC quickly launched into the heart of their grievances:
The Department of Homeland Security Access Card (DAC) has vulnerabilities associated with its use of radio frequency identification (RFID) and Bluetooth technologies, biometric identifiers and PIN backup system. But there are also risks that come from the DAC's "mission creep"; the Department also wants the card to be used as a payment device for everyday items.
This is a good executive summary - five specific identified problems. Unfortunately the analysis of each one is pretty weak. I'm going to leave the "mission creep" stuff aside because there are legitimate policy and design questions there that have nothing to do with technology. The other four claims are fair game. Let's look at them in order:
Here's an easy defense against the RFID claim: The DAC does not use RFID. The DAC uses a standard called ISO/14443 for contactless (wireless) communication between the card and a reader. RFID is designed for tracking physical items. It has a long read range (about four feet) and is not encrypted. ISO/14443 is designed to identify people. It has a much shorter read range (about 5 inches) and weak encryption. The two standards are very different but they're frequently confused even by allegedly authoritative speakers. I don't get too worked up about this mistake because even though it's much harder to snoop ISO/14443 than RFID, the vulnerabilities are of the same type. Still, it doesn't help EPIC's credibility to conflate the two standards, especially since exactly this mistake was the center of much teeth-gnashing last month. The real answer is to eventually move to contactless cards with strong cryptography. Such cards are currently available but are not yet in common use.
The vulnerabilities of Bluetooth technology have also been well documented. Bluetooth technology enables wireless communication among electronic devices in close proximity. For example, a Bluetooth-enabled computer could work with a wireless keyboard or mouse. In August, security flaws in Bluetooth-enabled mobile phones allowed criminals to access the information in the phones including contact information and text messages.
This would be damning stuff, if it weren't crazytalk. The DHS card has nothing to do with Bluetooth. Unlike the "RFID" claim in the paragraph above, there isn't even anything close to Bluetooth that the DAC uses. Nothing. No Bluetooth. Nuh-uh. Bluetooth has nothing to do with identity cards. I don't even think you could put Bluetooth onto a card if you tried; I believe (though I could be wrong) that Bluetooth requires an active power source and contactless cards are all passive. I have no idea what EPIC is talking about, other than maybe DHS said that they would test Bluetooth as a way to hook up computers to phones or something. Also, all the "Bluetooth flaws" that are so breathlessly reported in the EPIC report aren't really flaws with Bluetooth at all, but with specific phones and devices that happen to use Bluetooth. This is an important distinction but I don't want to dwell on it here because THE DHS CARDS DO NOT USE BLUETOOTH.
The DAC identifies the cardholder and her level of access through the use of a biometric identifier—a fingerprint. A recent report by National Institute of Standards and Technology (NIST) showed that one-fingerprint identification systems had an accuracy rate of 98.6 percent, while the accuracy rate rose to 99.6 when two fingerprints were used and 99.9 when four, eight and ten fingerprints were used.
This makes it sound like unauthorized individuals will be getting in all the time while legitimate users will often be locked out of their doors and computers! Fortunately, it doesn't work like that. The accuracy of most biometric systems can be tuned by balancing two competing types of errors: false positives and false negatives. A false positive error occurs when a bad guy's fingerprint gets mistakenly matched to a good guy's fingerprint. A false negative error occurs when a good guy's fingerprint doesn't get recognized at all. Since fingerprint scanning produces slightly different results each time, the system must be configured with a certain tolerance level. If the tolerance level is very loose, you can virtually eliminate false negatives at the cost of greatly increasing false positives. The system basically says, "Meh, it looks kinda like a fingerprint - go on in." If the tolerance level is very strict, you get the opposite effect: "Your fingerprint is off by 0.00001 millimeters - no access for you!"
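To make the tradeoff concrete, here's a toy simulation (my own made-up matcher and numbers, nothing to do with the actual DHS system) showing how a single tolerance threshold moves errors between the two buckets:

```python
# Toy sketch: genuine scans produce high similarity scores, impostor scans
# produce low ones, and one threshold trades false positives for false negatives.
import random

random.seed(42)

def match_score(genuine):
    """Simulated similarity score: genuine scans cluster high, impostors low."""
    return random.gauss(0.8, 0.1) if genuine else random.gauss(0.3, 0.1)

def error_rates(threshold, trials=10_000):
    false_neg = sum(match_score(True) < threshold for _ in range(trials)) / trials
    false_pos = sum(match_score(False) >= threshold for _ in range(trials)) / trials
    return false_pos, false_neg

loose_fp, loose_fn = error_rates(threshold=0.4)    # "looks kinda like a fingerprint"
strict_fp, strict_fn = error_rates(threshold=0.7)  # "off by 0.00001mm - no access!"

# Tightening the threshold drives false positives down and false negatives up.
assert strict_fp < loose_fp
assert strict_fn > loose_fn
```

The exact numbers are fake, but the shape of the tradeoff is real: you pick where on the curve you want to sit.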
The accuracy rate is also heavily influenced by how many possible fingerprint matches the system has to consider. If the system has to match your scan against a large database of enrolled fingerprints (called a "one-to-many" match), it's far more likely to come up with a false positive ("hmmm, it kinda looks like user #7654231") and somewhat more likely to come up with a false negative ("it could be this guy or that guy, I better just punt"). The DHS card avoids this problem by matching your fingerprint against only one possible user - the user stored in the card - so the chances of a false positive are very low because someone trying to trick the system can't just match *anyone's* fingerprint, they have to match *your* fingerprint. Also, the match tolerance can be set very strict, thereby further reducing the chances of a false positive but increasing the chances of a false negative.
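A little back-of-the-envelope arithmetic (the per-comparison false-positive rate here is my assumption, not a published figure) shows why matching against one stored template is so much safer than searching a big enrolled database:

```python
# If each comparison has a small false-positive rate p, a one-to-many search
# multiplies the chances of falsely matching *somebody*.
p = 0.0001  # assumed per-comparison false-positive rate (illustrative only)

def false_positive_chance(num_templates):
    """Probability that at least one of num_templates falsely matches."""
    return 1 - (1 - p) ** num_templates

one_to_one = false_positive_chance(1)         # match-on-card: one template
one_to_many = false_positive_chance(100_000)  # large enrolled database

assert abs(one_to_one - p) < 1e-9  # one comparison, one small chance
assert one_to_many > 0.99          # nearly certain to hit *someone* by accident
```

With 100,000 enrolled templates, even a one-in-ten-thousand per-comparison error rate means the search will almost always find a spurious "match" - which is exactly why the match-on-card design matters.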
So you can virtually eliminate the false positives (and therefore security risks associated with biometric access), but doesn't the relatively high false negative rate still mean that legitimate users will be locked out? Not really. If you get a false negative, you just have to scan your finger a second time. Let's say it takes you 2 seconds to scan your finger and the false negative error rate is 5%. Most of the time (95%) you'll get access in two seconds. Most of the rest of the time (4.75%) you'll get in with two swipes and four seconds. Every 400 tries or so, you'll have to wait six seconds. If you stay at your job for 20 years, you might have a chance of waiting eight seconds for access once. I use a biometric reader to log onto my laptop and (once I figured out how to hold my finger) it takes me about two seconds to get a good match.
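For the curious, here's the arithmetic behind those numbers (assuming, as above, a 2-second scan and a 5% false negative rate):

```python
# The chance of needing k scans falls off geometrically with the
# false-negative rate, so long waits are vanishingly rare.
fnr = 0.05        # false-negative rate per scan (the 5% assumed above)
scan_seconds = 2  # time per scan

def chance_of_exactly(k_scans):
    """Probability a legitimate user needs exactly k scans to get in."""
    return (fnr ** (k_scans - 1)) * (1 - fnr)

assert abs(chance_of_exactly(1) - 0.95) < 1e-12    # one swipe, two seconds
assert abs(chance_of_exactly(2) - 0.0475) < 1e-12  # two swipes, four seconds
# Three swipes (six seconds) happens about once in 420 tries:
assert abs(1 / chance_of_exactly(3) - 421) < 1

# Average wait is barely above a single scan.
expected_seconds = sum(k * scan_seconds * chance_of_exactly(k) for k in range(1, 50))
assert expected_seconds < 2.2
```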
EPIC then proceeds to quote out-of-context one of their own (earlier, better) reports:
Once a biometric identifier has been compromised, there can be severe consequences for the individual whose identity has been affected. It is possible to replace a credit card or Social Security numbers, but how does one replace a fingerprint, voiceprint, or retina scan?
Err. That's exactly why you need to link the biometric identifier to a card - just like DHS is doing. You can't revoke a fingerprint, but you can revoke a card. The fingerprint itself doesn't do you any good and, if you lose your card, you can always re-scan your finger and associate it with the replacement card. The criticism quoted above is perfectly legitimate when levied against ill-conceived attempts to use biometrics as identifiers by themselves, but is ironically inappropriate in discussing the DHS program.
The Department has a backup system built into the card—if the fingerprint identification fails, then the employee can gain access by using a 6- to 8- digit PIN. By allowing alternate access through the PIN, Homeland Security creates all of the vulnerabilities associated with allowing complete access to secure areas and information through one password.
The PIN is not inherently a way to bypass the biometrics; it's just another factor of authentication. The DHS card provides applications with three factors to choose from: physical possession of the card (which is always required), fingerprint biometrics and a PIN. Each door lock or computer program that uses the card can choose to use one, two or all three of these factors depending on the level of authentication security required. For example, getting into the front door of a busy, low-security area may require only the physical possession of the card. Logging into a computer may require the card and either the biometric or the PIN. Accessing a very high-security file may require all three. Giving application designers more options does not reduce security. Of course, some designers may make dumb choices about authentication, but that's not the fault of the card program. Also, keep in mind that the lambasted "card and second factor" system is much better in almost every security and convenience regard than the "password only" systems it's designed to replace.
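Here's a sketch of that idea (the resource names and policy table are entirely my own invention, not anything from the DHS spec): each application declares which factors it demands, with the card itself always on the list.

```python
# Hypothetical per-application factor policies: the card is always required,
# and each resource layers on fingerprint and/or PIN as its risk warrants.
REQUIREMENTS = {
    "lobby_door": {"card"},                               # one factor
    "workstation_login": {"card", "fingerprint_or_pin"},  # two factors
    "classified_files": {"card", "fingerprint", "pin"},   # all three
}

def access_granted(resource, presented):
    """presented: the set of factors the user supplied, e.g. {"card", "pin"}."""
    for factor in REQUIREMENTS[resource]:
        if factor == "fingerprint_or_pin":
            if not ({"fingerprint", "pin"} & presented):
                return False
        elif factor not in presented:
            return False
    return True

assert access_granted("lobby_door", {"card"})
assert access_granted("workstation_login", {"card", "pin"})
assert not access_granted("workstation_login", {"card"})
assert not access_granted("classified_files", {"card", "fingerprint"})
assert access_granted("classified_files", {"card", "fingerprint", "pin"})
```

The point of the sketch: the PIN only "bypasses" the biometric where a designer explicitly allows either factor; nothing forces that choice on high-security resources.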
Wrapping it up
In the fall, hundreds of thousands of personnel will have access cards equipped with personal information, biometric and wireless technologies, and the security risks associated with their use.
Exactly. That's why we need coherent debate to distill some clarity about the risks and rewards. This EPIC report - by combining one part gross technology misidentification (RFID), one part random gibberish (Bluetooth), two parts common misunderstanding (biometric accuracy and PINs) and stewing in politics thinly-disguised as security analysis - just makes mud.