EPIC report is not so good
A couple of days ago, the Electronic Privacy Information Center (EPIC) issued a scathing analysis of the Department of Homeland Security's upcoming smart card program. Our country (indeed, much of the world) is currently struggling with the concept of secure identity documents, and watchdog organizations such as the EFF, the ACLU and EPIC play a vital role in shaping the debate. I am completely in favor of holding every government security program to unyielding standards of efficiency, effectiveness and privacy (see here and here, especially in the comments). Unfortunately, this particular report is muddled in many places and simply wrong in others.
Full disclosure: although I am not directly involved in the DHS card program, DHS is a customer of ours and we are working on several products that will make use of the card. In other words, I may be biased but I kind of know what I'm talking about.
Even the first sentence of the report is inauspicious for a security document:
President Bush's proposed $2.57 trillion federal budget for Fiscal Year 2006 greatly increases the amount of money spent on surveillance technology and programs while cutting about 150 programs—most of them from the Department of Education.
Why is the source of the funding relevant to the security analysis of the program? Would the technology be better if it were funded by, say, increased taxes on oil company profits?
EPIC quickly launched into the heart of their grievances:
The Department of Homeland Security Access Card (DAC) has vulnerabilities associated with its use of radio frequency identification (RFID) and Bluetooth technologies, biometric identifiers and PIN backup system. But there are also risks that come from the DAC's "mission creep"; the Department also wants the card to be used as a payment device for everyday items.
This is a good executive summary - five specific identified problems. Unfortunately the analysis of each one is pretty weak. I'm going to leave the "mission creep" stuff aside because there are legitimate policy and design questions there that have nothing to do with technology. The other four claims are fair game. Let's look at them in order:
Here's an easy defense against the RFID claim: The DAC does not use RFID. The DAC uses a standard called ISO/14443 for contactless (wireless) communication between the card and a reader. RFID is designed for tracking physical items. It has a long read range (about four feet) and is not encrypted. ISO/14443 is designed to identify people. It has a much shorter read range (about 5 inches) and weak encryption. The two standards are very different but they're frequently confused even by allegedly authoritative speakers. I don't get too worked up about this mistake because even though it's much harder to snoop ISO/14443 than RFID, the vulnerabilities are of the same type. Still, it doesn't help EPIC's credibility to conflate the two standards, especially since exactly this mistake was the center of much teeth-gnashing last month. The real answer is to eventually move to contactless cards with strong cryptography. Such cards are currently available but are not yet in common use.
The vulnerabilities of Bluetooth technology have also been well documented. Bluetooth technology enables wireless communication among electronic devices in close proximity. For example, a Bluetooth-enabled computer could work with a wireless keyboard or mouse. In August, security flaws in Bluetooth-enabled mobile phones allowed criminals to access the information in the phones including contact information and text messages.
This would be damning stuff, if it weren't crazytalk. The DHS card has nothing to do with Bluetooth. Unlike the "RFID" claim in the paragraph above, there isn't even anything close to Bluetooth that the DAC uses. Nothing. No Bluetooth. Nuh-uh. Bluetooth has nothing to do with identity cards. I don't even think you could put Bluetooth onto a card if you tried; I believe (though I could be wrong) that Bluetooth requires an active power source and contactless cards are all passive. I have no idea what EPIC is talking about, other than maybe DHS said that they would test Bluetooth as a way to hook up computers to phones or something. Also, all the "Bluetooth flaws" that are so breathlessly reported in the EPIC report aren't really flaws with Bluetooth at all, but with specific phones and devices that happen to use Bluetooth. This is an important distinction but I don't want to dwell on it here because THE DHS CARDS DO NOT USE BLUETOOTH.
The DAC identifies the cardholder and her level of access through the use of a biometric identifier—a fingerprint. A recent report by National Institute of Standards and Technology (NIST) showed that one-fingerprint identification systems had an accuracy rate of 98.6 percent, while the accuracy rate rose to 99.6 when two fingerprints were used and 99.9 when four, eight and ten fingerprints were used.
This makes it sound like unauthorized individuals will be getting in all the time while legitimate users will often be locked out of their doors and computers! Fortunately, it doesn't work like that. The accuracy of most biometric systems can be tuned by balancing two competing types of errors: false positives and false negatives. A false positive error occurs when a bad guy's fingerprint gets mistakenly matched to a good guy's fingerprint. A false negative error occurs when a good guy's fingerprint doesn't get recognized at all. Since fingerprint scanning produces slightly different results each time, the system must be configured with a certain tolerance level. If the tolerance level is very loose, you can virtually eliminate false negatives at the cost of greatly increasing false positives. The system basically says, "Meh, it looks kinda like a fingerprint - go on in." If the tolerance level is very strict, you get the opposite effect: "Your fingerprint is off by 0.00001 millimeters - no access for you!"
The accuracy rate is also heavily influenced by how many possible fingerprint matches the system has to consider. If the system has to match your scan against a large database of enrolled fingerprints (called a "one-to-many" match), it's far more likely to come up with a false positive ("hmmm, it kinda looks like user #7654231") and somewhat more likely to come up with a false negative ("it could be this guy or that guy, I better just punt"). The DHS card avoids this problem by matching your fingerprint against only one possible user - the user stored in the card - so the chances of a false positive are very low because someone trying to trick the system can't just match *anyone's* fingerprint, they have to match *your* fingerprint. Also, the match tolerance can be set very strict, thereby further reducing the chances of a false positive while increasing the chances of a false negative.
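The loose-versus-strict trade-off above can be made concrete with a small simulation. The score distributions below are entirely made up for illustration (real scanners have their own similarity metrics); the point is only that sliding one threshold moves the two error rates in opposite directions.

```python
import random

random.seed(0)

# Hypothetical match scores: genuine attempts tend to score high,
# impostor attempts tend to score low. These distributions are invented
# purely to illustrate the trade-off; real scanners differ.
genuine = [random.gauss(0.80, 0.08) for _ in range(10_000)]
impostor = [random.gauss(0.40, 0.08) for _ in range(10_000)]

def rates(threshold):
    """Return (false-positive rate, false-negative rate) at a given threshold."""
    far = sum(s >= threshold for s in impostor) / len(impostor)  # bad guy accepted
    frr = sum(s < threshold for s in genuine) / len(genuine)     # good guy rejected
    return far, frr

for t in (0.50, 0.60, 0.70):
    far, frr = rates(t)
    print(f"threshold {t:.2f}: false positives {far:.4f}, false negatives {frr:.4f}")
```

Raising the threshold ("stricter tolerance") drives false positives toward zero while the false-negative rate climbs, which is exactly the tuning choice described above.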
So you can virtually eliminate the false positives (and therefore security risks associated with biometric access), but doesn't the relatively high false negative rate still mean that legitimate users will be locked out? Not really. If you get a false negative, you just have to scan your finger a second time. Let's say it takes you 2 seconds to scan your finger and the false negative error rate is 5%. Most of the time (95%) you'll get access in two seconds. Most of the rest of the time (4.75%) you'll get in with two swipes and four seconds. Every 400 tries or so, you'll have to wait six seconds. If you stay at your job for 20 years, you might have a chance of waiting eight seconds for access once. I use a biometric reader to log onto my laptop and (once I figured out how to hold my finger) it takes me about two seconds to get a good match.
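The retry arithmetic above is just a geometric distribution: each swipe independently fails with the assumed 5% probability, so needing k swipes means failing k-1 times and then succeeding. A quick sketch using the post's numbers (2 seconds per swipe, 5% false-negative rate, both illustrative assumptions):

```python
# Illustrative numbers from the text: 2 seconds per swipe,
# 5% chance that any single swipe is a false negative.
P_FAIL = 0.05
SWIPE_SECONDS = 2

def p_exactly(k, p=P_FAIL):
    """Probability of needing exactly k swipes: k-1 failures, then a success."""
    return (p ** (k - 1)) * (1 - p)

print(f"1 swipe  (2s): {p_exactly(1):.4%}")  # 95.0000%
print(f"2 swipes (4s): {p_exactly(2):.4%}")  # 4.7500%
print(f"3 swipes (6s): {p_exactly(3):.4%}")  # 0.2375%, roughly 1 in 420

# Mean of a geometric distribution: expected swipes = 1 / (1 - p).
expected_seconds = SWIPE_SECONDS / (1 - P_FAIL)
print(f"expected wait: {expected_seconds:.2f} seconds")  # about 2.11s
```

Even with a deliberately strict tolerance, the expected wait barely rises above the cost of a single clean swipe.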
EPIC then proceeds to quote, out of context, one of their own (earlier, better) reports:
Once a biometric identifier has been compromised, there can be severe consequences for the individual whose identity has been affected. It is possible to replace a credit card or Social Security numbers, but how does one replace a fingerprint, voiceprint, or retina scan?
Err. That's exactly why you need to link the biometric identifier to a card - just like DHS is doing. You can't revoke a fingerprint, but you can revoke a card. The fingerprint itself doesn't do you any good and, if you lose your card, you can always re-scan your finger and associate it with the replacement card. The criticism quoted above is perfectly legitimate when levied against ill-conceived attempts to use biometrics as identifiers by themselves, but is ironically inappropriate in discussing the DHS program.
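The revocation logic above is simple enough to sketch. Everything here is a toy stand-in (real systems use signed credentials and match-on-card chips, not plain dictionaries, and a real biometric match is fuzzy rather than string equality), but it shows why losing the card is recoverable even though the fingerprint never changes:

```python
# Toy model: the card binds a revocable identifier to a biometric template.
# All names and data structures are invented for illustration.
revoked: set[str] = set()
enrolled: dict[str, str] = {}  # card_id -> fingerprint template (stand-in string)

def enroll(card_id: str, template: str) -> None:
    enrolled[card_id] = template

def verify(card_id: str, live_scan: str) -> bool:
    if card_id in revoked or card_id not in enrolled:
        return False  # a lost or stolen card is simply switched off
    # Stand-in for a fuzzy biometric comparison against the stored template.
    return enrolled[card_id] == live_scan

enroll("card-001", "alice-fingerprint")
print(verify("card-001", "alice-fingerprint"))  # True

revoked.add("card-001")  # card lost: revoke the card, not the finger
print(verify("card-001", "alice-fingerprint"))  # False

enroll("card-002", "alice-fingerprint")  # re-scan the same finger onto a new card
print(verify("card-002", "alice-fingerprint"))  # True
```

The fingerprint alone opens nothing; only the live (card, finger) pair does, and the card half of the pair can be killed and reissued at will.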
The Department has a backup system built into the card—if the fingerprint identification fails, then the employee can gain access by using a 6- to 8- digit PIN. By allowing alternate access through the PIN, Homeland Security creates all of the vulnerabilities associated with allowing complete access to secure areas and information through one password.
The PIN is not inherently a way to bypass the biometrics, it's just another factor of authentication. The DHS card provides applications with three factors to choose from: physical possession of the card (which is always required), fingerprint biometrics and a PIN. Each door lock or computer program that uses the card can require one, two or all three of these factors depending on the level of authentication security required. For example, getting into the front door of a busy, low-security area may require only physical possession of the card. Logging into a computer may require the card and either the biometric or the PIN. Accessing a very high-security file may require all three. Giving application designers more options does not reduce security. Of course, some designers may make dumb choices about authentication, but that's not the fault of the card program. Also, keep in mind that the lambasted "card and second factor" system is much better in almost every security and convenience regard than the "password only" systems it's designed to replace.
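The per-application factor policies described above might look something like this sketch. The policy names and required combinations are invented examples, not anything from the actual DHS program:

```python
from dataclasses import dataclass

@dataclass
class Presented:
    """Which of the three factors the user has presented."""
    card: bool = False
    fingerprint: bool = False
    pin: bool = False

# Hypothetical policies: each application picks its own factor combination.
# The card itself is always required; the second and third factors vary.
POLICIES = {
    "lobby_door":   lambda p: p.card,                               # card only
    "workstation":  lambda p: p.card and (p.fingerprint or p.pin),  # card + one more
    "secure_files": lambda p: p.card and p.fingerprint and p.pin,   # all three
}

def authorize(app: str, presented: Presented) -> bool:
    return POLICIES[app](presented)

print(authorize("lobby_door", Presented(card=True)))                     # True
print(authorize("workstation", Presented(card=True, pin=True)))          # True
print(authorize("secure_files", Presented(card=True, fingerprint=True))) # False
```

Note that in this scheme the PIN never substitutes for the card itself, which is the point the report misses: the PIN is an additional hurdle layered on top of physical possession, not a back door around it.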
Wrapping it up
In the fall, hundreds of thousands of personnel will have access cards equipped with personal information, biometric and wireless technologies, and the security risks associated with their use.
Exactly. That's why we need coherent debate to distill some clarity about the risks and rewards. This EPIC report - by combining one part gross technology misidentification (RFID), one part random gibberish (Bluetooth), two parts common misunderstanding (biometric accuracy and PINs) and stewing in politics thinly-disguised as security analysis - just makes mud.
April 11, 2005 | Permalink
I guess that this week's received wisdom is that the government is a prying, malevolent, big brother rather than the incompetent, bungling, and lying CYA types of last week. It is all so hard to keep current on what is the latest.
A few observations:
1) This "report" implies that DoE programs were targeted for cuts to pay for the DAC program without any facts to back up the claim. One would think that Congress eliminated Head Start or Pell Grants to pay for smart cards. The fact is that government programs come and go all the time. For example, some programs might be demonstration programs. Most of us wouldn't think of their expiration as a cut.
Many other programs are one-time items designed to make a Member of Congress look good to the folks back home. These programs are commonly called "pork." Every appropriations bill is loaded up with items like this that don't carry over from one budget year to the next.
2) Correct me if I am wrong, but aren't the people who are getting these cards all employees of, or somehow being paid at least in part by, the United States Government? What everyday items are these DHS employees going to be purchasing using their DHS-issued smart cards, and why exactly would it be bad to know what these employees are purchasing using said card?
Posted by: Charlie McLain | Apr 11, 2005 2:22:49 PM
Nice to see somebody who is able to provide some basic response to what unfortunately appears to be an all too common problem with so-called 'technology advocates'.
And to think, they want us to take them seriously...
Posted by: RO | Apr 24, 2005 5:21:59 AM
We're hearing a lot of the familiar misunderstandings about smart ID cards now that the Congress has passed and the President has signed the Real ID program. The media are filled with references to RFID (Bluetooth hasn't gotten there yet), and the dangers to our civil liberties posed by having those dreaded biometrics mandated. :(
Posted by: Charlie McLain | May 12, 2005 3:03:07 PM