Biometric Face Recognition Exploit
Security
Technology/IT
Posted by michael on Friday June 27, @05:56PM
from the faceoff dept.
clscott writes "A researcher at the U. of Ottawa has developed an exploit to which most biometric systems are probably vulnerable. He developed an algorithm which allows a fairly high quality image of a person to be regenerated from a face recognition template. Three commercial face rec. algorithms were tested and in all cases the image could masquerade to the algorithm as the target person. Here are links to a talk and a paper. Unfortunately, biometric templates are currently considered to be non-identifiable, much like a password hash. This means that legislation gets passed to require hundreds of millions of people to have their biometrics encoded onto their passports. This kind of vulnerability could mean that anyone who reads these documents has access to the holder's fingerprint, iris images, etc."

 

 
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
This problem is solved by redundancy (Score:5, Funny)
by NumberField (670182) * on Friday June 27, @05:57PM (#6315242)
This isn't a problem because most people have extras of the body parts used for most biometric schemes. For example, you probably have a large supply of fingers (about ten), so it doesn't matter if a few get compromised. Similarly, if you have two eyes, it's not a big deal if your retinal print becomes known to bad guys.

(P.S. Please no replies from humor-impaired folks.)

[ Reply to This ]
    Re:This problem is solved by redundancy (Score:5, Funny)
    by gerf (532474) on Friday June 27, @05:59PM (#6315259)
    (Last Journal: Monday July 29, @09:50AM)

    This isn't a problem because most people have extras of the body parts used for most biometric schemes. For example, you probably have a large supply of fingers (about ten), so it doesn't matter if a few get compromised. Similarly, if you have two eyes, it's not a big deal if your retinal print becomes known to bad guys. (P.S. Please no replies from humor-impaired folks.)

    I don't get it. The way you're talking isn't in a standard joking format at all. Maybe you Canadians have a different sense of humor?

    [ Reply to This | Parent ]
    Re:This problem is solved by redundancy (Score:3, Insightful)
    by randyest (589159) <(ranorano) (at) (hotmail.com)> on Friday June 27, @06:33PM (#6315508)
    This isn't a problem because most people have extras of the body parts used for most biometric schemes.
    It's not a problem at all. On the contrary, it is a really good discovery IMHO. The most important conclusion from this is (from the talk slides):

    Biometric software systems should provide yes/no only, with no match score values.

    My question is: why would the software systems ever need to give a match score value, instead of a yes/no answer in the first place? It's not like the algorithm developer is there operating the machine and can thus use the score result to help decide what to do with "near" matches. Most of the people using these machines, I would surmise, are pretty clueless about how they work (except in a very general sense, of course), so providing a score result would only be confusing and a potential source of misidentification:

    "Hmm, that John Doe matched with a score of 95, and it turned out not to be the guy, so this 94 score can't possibly really be Osama Bin Laden -- go ahead and let him on the plane with his antique ceremonial religious knives."

    Either the system thinks it knows the person's face, or it doesn't. That's all it needs to say. Saying just that and nothing more will protect privacy (in that you can't reconstruct the face without the template and quantitative match score results), and it will prevent operator confusion and some types of misapplication.
    [ Reply to This | Parent ]
      Re:This problem is solved by redundancy (Score:3, Insightful)
      by PaulBu (473180) on Friday June 27, @07:47PM (#6315930)
      (http://slashdot.org/)
      Maybe because in different situations different thresholds would have to be applied. E.g., if it is a terrorist-monitoring camera on a random street corner, it might not be feasible to unleash FBI agents after every guy who matched at 80%, but if that random street corner happens to be in Washington, DC across the street from the White House, 80% confidence might be a reason to trigger further actions.

      And if it is a camera in a cash machine and you claim that you are Joe and want to get your $500, you'd better match Joe's face at, say, 99% (it can also ask you to turn a bit and face the lens if your score is lower than some threshold).

      Another example: if an airport screener can realistically check 10 people out of a hundred, she chooses the ones with the highest scores. Yes, it might mean that John Doe in your example will be checked and Osama will not, IF there are 9 other people in line with scores >= 95.

      The algorithms used might be the same, but the exact policy is implemented by taking scores into account.

      There is more than binary yes/no in this world...

      Paul B.

      P.S. Not that I know anything about the actual numbers or policies, but I can see the value of having the scores available to people who program the machine, but not necessarily to the screeners (if any) who operate them.
      [ Reply to This | Parent ]
      Re:This problem is solved by redundancy (Score:1)
      by Axel Eble (685299) on Saturday June 28, @12:59PM (#6319882)
      Did you think about what will happen when your face is incorrectly recognized? Knowing people, I believe you will be singled out at $airport and treated as a terrorist even if you can prove that you are not the culprit.

      The current political climate in the US (and several other terrorist-crazed parts of the world) will happily embrace anything that might work, without listening to the people who see the problems with the latest and coolest technology.

      The end users will believe the system because they were trained to.

      The innocent people that happened to suffer from a false positive will lose the most.

      [ Reply to This | Parent ]
    Think of what might happen to body parts (Score:3, Insightful)
    by gotr00t (563828) on Friday June 27, @08:25PM (#6316129)
    (http://mod.homelinux.org/ | Last Journal: Saturday December 07, @01:34AM)
    When will people get concerned that their body parts are now vulnerable? Desperate criminals who want to infiltrate, or governments, for that matter, would find it rather suitable to simply kill a person and remove their face, eyes, fingers, etc., to use in a biometrics device.

    This is even easier to compromise than having a keycard or something, as the individual could at least hide it somewhere. They CAN'T hide their face without

    [ Reply to This | Parent ]
      Regarding eyes (Score:2)
      by Merk (25521) on Saturday June 28, @01:29AM (#6317810)
      (http://infofiend.com/)

      I remember reading a paper about biometric identification using the iris. The bit I remember is that it is really easy to tell if the eye you're scanning is alive or not. For example, as part of the scanning process the machine just needs to go from dark to bright in a short time. If it does that and the pupil doesn't narrow then the eye isn't attached to a living body. I can't speak for other body parts, but it's unlikely anybody will pluck out your eyes and scan them.
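      A minimal sketch of that dark-to-bright liveness check (purely illustrative -- set_brightness and read_pupil_mm are hypothetical sensor hooks, stubbed here so the snippet runs standalone):

          import time

          def pupil_constricts(set_brightness, read_pupil_mm, delta_threshold_mm=0.8):
              # Toy liveness check: flash from dark to bright and require the pupil
              # to narrow. The two arguments are hypothetical sensor hooks.
              set_brightness("dark")
              time.sleep(0.2)
              before = read_pupil_mm()
              set_brightness("bright")
              time.sleep(0.2)
              after = read_pupil_mm()
              return (before - after) >= delta_threshold_mm  # live eyes constrict

          # Stubbed usage: a live eye might go from ~6.0 mm to ~3.5 mm.
          readings = iter([6.0, 3.5])
          print(pupil_constricts(lambda level: None, lambda: next(readings)))  # True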

      [ Reply to This | Parent ]
      Biometrics: Read Bio - "Living" (Score:1)
      by MadCow-ard (330423) on Saturday June 28, @02:59PM (#6320482)
      It is not a pure fingerprint reader anymore; it is a "living specimen" and fingerprint reader. Sure the old ones are still sold, but they are considered low end. If any real security is needed, the latest generation will detect living versus dead. The previous post mentions the same with iris readers. The same is true of retinal, hand geometry, and voice print (theoretically by multiple passes). The only one I'm not sure of is facial geometry, but I would assume that the digital video from which it's taken would clear up the issue of dead or alive ;-).

      Let's see how long it takes to hack the new systems. It seems to me the real vulnerability is not in the recognition, but in the fact that the system is computerized and therefore hackable. And as we all bemoan with biometrics, once compromised, forever lost.
      [ Reply to This | Parent ]
Other systems too? (Score:5, Interesting)
by mgcsinc (681597) on Friday June 27, @05:59PM (#6315256)
Personally I use BioPassword for authenticating my workstation using keystroke recognition, so I seem to be safe from the exploit as yet; holding an image up to a computer seems like it would require considerably less effort than attaching a PS/2 device that typed at exactly the correct rate. Nonetheless, I wonder if this discovery will prompt the redesigning of the way user data is stored across the biometric spectrum, going as far as the oft-considered-foolproof keystroke systems…
[ Reply to This ]
    Re:Other systems too? (Score:3, Interesting)
    by spydir31 (312329) * <hastur@h a s t u r k un.com> on Friday June 27, @06:27PM (#6315458)
    (http://www.hasturkun.com/)
    Keystroke and timing capture/playback is trivial, I wouldn't go trusting that as secure.
    [ Reply to This | Parent ]
    Re:Other systems too? (Score:5, Informative)
    by NixterAg (198468) on Friday June 27, @06:29PM (#6315483)
    BioPassword unfortunately suffers from a habit of producing false rejections. It really diminishes its usability. BioPassword's best trait is that it doesn't require an additional hardware purchase to work. Several high profile banks inspected BioPassword to determine whether they could use it for identity authentication within the context of online purchases. They came to the conclusion that it wasn't usable enough.

    I think many people miss the boat when it comes to biometric identity authentication. The fact is, any security protocol can be exploited. The idea is to make the protocol difficult enough to exploit that it isn't in an attacker's best interest to go after whatever is being secured. It's like cryptography. There is no unbreakable code or cipher, but there are codes that are difficult enough to break that it isn't worth the time or effort required to break them.
    [ Reply to This | Parent ]
      Re:Other systems too? (Score:2)
      by Kashif Shaikh (575991) <k2shaikhNO@SPAMyahoo.com> on Saturday June 28, @01:27AM (#6317797)
      So why don't we just create a long 50000-bit key and slap it onto a magnetic-swipe card?

      That way, the system is only compromised when:

      a) You lose the card
      b) Someone threatens you at knife-point to hand the card over.

      In such cases, you simply call the card authority to invalidate the card's key and get a new one.

      [ Reply to This | Parent ]
        Re:Other systems too? (Score:2)
        by Dun Malg (230075) on Saturday June 28, @12:13PM (#6319656)
        (Last Journal: Thursday May 01, @09:15PM)
        So why don't we just create a long 50000-bit key and slap it onto a magnetic-swipe card?

        A huge key is unnecessary. If they have the card, they have the key. The key exists solely to keep someone from whipping up a card with your user ID and getting instant access. No one is going to guess your key even if it's only 128 bits.

        That way, the system is only compromised when:

        a) You lose the card
        b) Someone threatens you at knife-point to hand the card over.

        Seeing that we already have the system you describe above (sans ridiculously large key) in use for ATMs, one has to look at the purpose of biometric authentication. Yeah, that's right: with biometrics, a) you can't easily lose the "key", and b) no one can easily take it from you at knifepoint.

        [ Reply to This | Parent ]
    Re:Something i've always wondered (Score:2)
    by mgcsinc (681597) on Friday June 27, @06:44PM (#6315580)
    Two things: one, I'm not so worried about strangers as about those who cohabit a dwelling with me! Secondly, there is a traditional multi-password bypass feature for BioPassword...
    [ Reply to This | Parent ]
paranoia (Score:5, Funny)
by klokwise (610755) on Friday June 27, @06:01PM (#6315280)
(http://www.nekrodomos.net/)
maybe i should extend my tin-foil hat to a tin-foil facemask and a pair of shiny gloves... that way they'll never recognise me!
[ Reply to This ]
Facial recognition (Score:1, Insightful)
by Anonymous Coward on Friday June 27, @06:02PM (#6315286)
...doesn't work worth a damn anyway. Other forms of biometric authentication are much more reliable.
[ Reply to This ]
At least a good guy discovered this (Score:1, Funny)
by Anonymous Coward on Friday June 27, @06:03PM (#6315296)
I'm glad to know that someone legit found this out before it got into the hands of those evil terrorists. Seriously, it's great that these kinds of things are being discovered now. It just goes to show that no matter what, things can be hacked/bypassed/etc somehow.
[ Reply to This ]
One thing that is missing from "the spoof" (Score:5, Interesting)
by adzoox (615327) * on Friday June 27, @06:04PM (#6315300)
(http://www.adzoox.com/ | Last Journal: Wednesday July 02, @11:23AM)
A company local to me has a biometric scan plus retina and thumbprint scan, but it also takes your average body temperature signature.... the combination of the three is pretty hard, if not impossible, to spoof. And anyone who can was going to break into your system anyway (with the VERY expensive equipment and extensive knowledge it would take to reproduce all three).

Sometimes we give criminals too much credit. Again, if it's someone who can get through all three of those, they were going to get past the toughest of Indiana Jones hurdles.

[ Reply to This ]
Old News (Score:5, Funny)
by fobbman (131816) on Friday June 27, @06:05PM (#6315314)
(http://slashdot.org/)
The fallibility of biometric systems has been widely known since a scientific expose [imdb.com] was released on the topic no less than five years ago.

[ Reply to This ]
RTFA (Score:1, Interesting)
by Uhh_Duh (125375) on Friday June 27, @06:06PM (#6315318)
(http://www.glug.com/)
You'll notice that the data is only as secure as the database the biometric information is stored in.

All they're saying is that if they have access to that information, they can generate something that can authenticate against it. (DUH!)

The moral of the story is that if you don't want someone to be able to fake Bob's face, don't give anyone access to the database that has the information on what Bob's face looks like to the biometric scanners. /. has sure been good at wasting my time with useless news lately.
[ Reply to This ]
    RTFA yourself (Score:5, Insightful)
    by MarcoAtWork (28889) on Friday June 27, @06:28PM (#6315473)
    You don't understand what the article is talking about. When you enroll in a biometric system, the system itself -doesn't- match based on your picture, but on a 'template' which is created by taking your standard data and performing certain destructive operations to arrive at a much smaller 'template' which can still be used to identify you.

    This is very similar to the one-way hashing that happens with unix passwords, only that in this case the hashing is 'lossier' so you have 'confidence scores' instead of a black/white answer.

    The article shows that given this 'hashed' value you can recreate an image that has a good chance of not only being authenticated by the same system/algorithm (which already should be very hard, given the one-way nature of the templatization) =BUT= also by different systems!

    It is also really interesting how, if you have access to the 'confidence score' output by the recognizer, you can take arbitrary images and, by blending/averaging them, come up with an image that works.

    This is definitely not useless news and will have quite some implications.
    [ Reply to This | Parent ]
      Re:RTFA yourself (Score:2)
      by randyest (589159) <(ranorano) (at) (hotmail.com)> on Friday June 27, @06:45PM (#6315584)
      I guess this is a meta-meta-RTFA, but you seem to have missed a key point in the article (though you clearly read at least some of it -- kudos for that).

      The exploit requires both the template and (repeated) access to score results (i.e., the evaluation / matching algorithm). The template itself is insufficient, as the exploit depends on iterative image manipulations and "hotter, warmer, cooler" feedback from the evaluation algorithm to work.

      You seem to get this in your final paragraphs (though your "also" seems to imply that this is an additional, separate thing, while I read the FA to say the scores are an inherent requirement), but your earlier statements don't seem to take it into account -- such as your UNIX password analogy, which would only be applicable if failed password entries gave you some quantitative feedback like "you're about 50% on that one" or "nope, worse this time -- only 45% of a real password", etc. instead of a simple "sorry." or "invalid login".
      [ Reply to This | Parent ]
        Re:RTFA yourself (Score:3, Insightful)
        by dbrutus (71639) on Friday June 27, @07:09PM (#6315686)
        (http://www.roitgroup.com/)
        Did you notice that nobody's using biometric systems that aren't also sold to companies? All you really need is a front company that says it needs a secure biometric company ID system. The same people that sold the US their system will happily sell you an exact copy scaled down to one site. Once you own the system, you can run it to your heart's content. You can get data off of passports and create proper fakes at your leisure.

        Total cost for piercing the false security of the system? Way too little to be a barrier to ObL.
        [ Reply to This | Parent ]
        Re:RTFA yourself (Score:3, Insightful)
        by MarcoAtWork (28889) on Friday June 27, @08:30PM (#6316169)
        I originally thought the same, but have a look at slide 15, the researcher says:

        'Access to templates OR match scores implies access to biometric sample image' (emphasis mine)

        I originally thought that you needed both, but after re-reading the presentation a few times it seems the researcher has -TWO- different exploits, one which regenerates things from the biometric data (samples not shown) and the other which takes arbitrary pics and by using the match percentage iterates a few times until it finds something that passes.

        If I misunderstood and you need both things, please correct me.
        [ Reply to This | Parent ]
    Re:RTFA (Score:2)
    by deadsaijinx* (637410) <animemeken@hotmail.com> on Friday June 27, @06:50PM (#6315601)
    (http://kenbonilla.ath.cx/)
    what do you mean lately?
    [ Reply to This | Parent ]
Yikes! (Score:2, Informative)
by ackthpt (218170) * on Friday June 27, @06:08PM (#6315332)
(http://www.dragonswest.com/ | Last Journal: Friday May 23, @10:22PM)
This means that legislation gets passed to require hundreds of millions of people to have their biometrics encoded onto their passports.

So this means that spotty, streaky photo of me (or is it a dog .. a wombat maybe?) on the back of my CostCo membership card isn't safe! Just about anyone could march in the door, past their rigorously trained staff, and buy Boca Burgers for half off!

Someone showed me a fake driver's license made by a "novelty" company. The only distinguishable difference was a missing apostrophe in the text on the reverse. It had holograms and everything. Thoughtfully, the company stated, "This is only for amusement value, illegal to use as ID", etc. Yeah, that should cover it.

[ Reply to This ]
The solution: store biometric data on a Java Card (Score:1, Insightful)
by ikewillis (586793) on Friday June 27, @06:09PM (#6315344)
I think this only further proves the need for something like a Java Card [sun.com]

(btw, I don't work for Sun)

A Java Card would allow you to store information (in this case biometric data) in a way that the data could be used in some sort of transformation but the original data is protected.

Were biometric data to be included on Passports, I see no better way to store it than in a Java Card. Portions of the biometric data analysis could be offloaded onto the Java Card itself, until an acceptable and mutual balance of trust and distrust can be achieved between the biometric processing algorithms and the data on the Java Card. In this way the biometric data is never exposed directly to the outside world, so one need not worry about it getting leaked to the "bad guys" even if your passport were stolen.

[ Reply to This ]
    Re:The solution: store biometric data on a Java Ca (Score:2)
    by jetmarc (592741) on Friday June 27, @08:10PM (#6316042)
    > In this way the biometric data is never exposed directly to the outside world, so one
    > need not worry about it getting leaked to the "bad guys" even if your passport were stolen.

    ..except of course, when the JavaCard can be used as an oracle by the attacker.
    Note that in the article they did not use any reference to the original image
    or to the dataset that the face recognition software creates from it. They rather
    chose 30 different (visually unrelated) images and then evolutionarily selected
    the best fit.

    As soon as your JavaCard is going to be universal (and serve multiple purposes
    with varying degree of security) it has to return a "score" (rather than a yes/no
    decision). And nothing more than that very score is used by the attack, go figure.

    To put this into a real world example: imagine you use an ATM JavaCard with face
    recognition. Insert card, present your face into the cam lens, and enter how
    much money you need. Now a computer nerd "finds" your card. He emulates an ATM
    terminal to the card and presents a random face to the card. Recursively, he
    optimizes it according to the article until he achieves a "good enough" score.
    He prints that out on paper, and travels to Mexico - slowly, by car, doing a stop
    at every damn biometrics-enabled ATM he can find. Heck, even the security cam
    recordings provide no more evidence than a fake (still image) phantom photo of
    YOU!

    Marc
    [ Reply to This | Parent ]
      Re:The solution: store biometric data on a Java Ca (Score:2)
      by Cthefuture (665326) on Friday June 27, @10:18PM (#6316870)
      As soon as your JavaCard is going to be universal (and serve multiple purposes with varying degree of security) it has to return a "score" (rather than a yes/no decision).

      Eh? I understand the part about being able to use a score to slowly converge on a working template, but that's not the way any smartcard I've seen works.

      I've never worked with a card that returned a score. The biometric template is instead used like a PIN: it either unlocks the card or not, and the card determines that. When the card is unlocked it then authenticates in a traditional manner (usually a standard public key, RSA or whatever). In other words, the biometric template unlocks the private key. Note that no private data is ever read off the card; everything is done on-card.

      When you're talking smartcards, it's not the client application that determines the security level. Normally it's the card that determines if you've passed all the security criteria. Hence smart card.
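      A rough sketch of that match-on-card flow (illustrative only: a toy Python object stands in for the card, an HMAC stands in for the on-card RSA key, and the similarity function and threshold are made up):

          import hmac, hashlib, os

          def similarity(a, b):
              # Toy cosine similarity between two equal-length feature vectors.
              dot = sum(x * y for x, y in zip(a, b))
              na = sum(x * x for x in a) ** 0.5
              nb = sum(x * x for x in b) ** 0.5
              return dot / (na * nb)

          class ToyCard:
              # Stand-in for a match-on-card smartcard: the enrolled template and
              # the secret never leave this object; callers get yes/no plus a MAC.
              def __init__(self, enrolled_template, threshold=0.90):
                  self._template = enrolled_template   # stays "on-card"
                  self._secret = os.urandom(32)        # stand-in for the private key
                  self._threshold = threshold
                  self._unlocked = False

              def present_biometric(self, live_template):
                  # The comparison happens on-card; no score is ever exposed.
                  score = similarity(self._template, live_template)
                  self._unlocked = score >= self._threshold
                  return self._unlocked

              def sign_challenge(self, challenge):
                  if not self._unlocked:
                      raise PermissionError("card is locked")
                  # A real card would produce an RSA signature here.
                  return hmac.new(self._secret, challenge, hashlib.sha256).digest()

          # Usage: the terminal only ever sees a boolean and a signed challenge.
          card = ToyCard([0.2, 0.9, 0.4, 0.7])
          if card.present_biometric([0.21, 0.88, 0.41, 0.69]):
              tag = card.sign_challenge(b"terminal-nonce-1234")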
      [ Reply to This | Parent ]
Does the database depend on obscurity? (Score:3, Insightful)
by astrashe (7452) * on Friday June 27, @06:10PM (#6315350)
(http://slashdot.org/...id=31489&cid=3388020 | Last Journal: Friday June 20, @05:46PM)
I've been curious about these databases and how they work. They have to take the images and process them, presumably into some sort of n-tuple, and then store that in a database.

But how will they handle changes? I mean, people will probably figure out how the recognition works, and learn how to trick it. If you know the scheme, it probably wouldn't be too hard.

If they have a giant database of these n-tuples, generated from photos, will they have to recrunch every photo in the db when they want to improve the system, or respond to holes that emerge? I guess they'll have a lot of computer power, so it's probably not too bad.

The thing that worries me about this stuff is the possibility that the crooks and terrorists will be able to defeat it trivially, but the average citizen will be tracked everywhere he or she goes.
[ Reply to This ]
x10 Get your Biometric Face Master Template (Score:3, Funny)
by bugsmalli (638337) on Friday June 27, @06:10PM (#6315351)
**Guy snooping on a girl sunbathing**

Want to snoop on your neighbor?? Want to trespass?? Want to know if there are Aliens at Area 51???

GET YOUR OWN BIOMETRIC FACE MASTER TEMPLATE. Guaranteed to *FOOL* all Biometric Scanners. Get the *NEW* and *IMPROVED* BIOMETRIC FACE MASTER TEMPLATE from X10. It will even fool our OWN SECURITY CAMERA!!! Our NEW special offer, buy one BFMT and get PRE-APPROVED Bail for FREE (good for 5000 dollars) ORDER NOW!!!
[ Reply to This ]
Am I reading the description incorrectly? (Score:1)
by EdgeShadow (665410) on Friday June 27, @06:12PM (#6315365)
Unfortunately, biometric templates are currently considered to be non-identifiable, much like a password hash. This means that legislation gets passed to require hundreds of millions of people to have their biometrics encoded onto their passports.

Those two statements seem to be contradictory. If biometric templates are considered to be "non-identifiable" (much like lie-detector tests are inadmissible in court due to unreliability), why would legislation be passed to require them to be used in passports? A United States passport is often considered the most reliable form of identification for a U.S. citizen. I don't see why the government would risk compromising the passport's reliability by incorporating into it a supposed "unreliable" technology.
[ Reply to This ]
Sounds easy to fix... (Score:2)
by hpa (7948) on Friday June 27, @06:17PM (#6315396)
(http://www.zytor.com/~hpa/)
Unlike all the *other* problems with biometrics, like false positives/false negatives/gelatin sheet spoofing, showing the camera a photograph, etc., this one seems like it should be easy to solve: don't store the biometric data, instead, treat it like a password and store a cryptographic hash of it instead.
[ Reply to This ]
    Re:Sounds easy to fix... (Score:2, Insightful)
    by jonatha (204526) on Friday June 27, @06:57PM (#6315634)
    Unlike all the *other* problems with biometrics, like false positives/false negatives/gelatin sheet spoofing, showing the camera a photograph, etc., this one seems like it should be easy to solve: don't store the biometric data, instead, treat it like a password and store a cryptographic hash of it instead.

    The paper explicitly covers encryption, etc., of the data.

    Any system that uses the data to decide whether or not the presented (fake) pattern matches the template is subject to this attack, i.e., hashing the data won't help.

    [ Reply to This | Parent ]
      Re:Sounds easy to fix... (Score:1)
      by eric256 (625188) on Friday June 27, @07:51PM (#6315954)
      Actually it's saying that if you have access to the template (unencrypted) then you can reconstruct the image from that. It's also saying that if the system gives a % of how close you are, then that can be used to average out an initial picture until it provides a close enough match. Both of these could be handled by one-way hashing the stored template and then only giving a yes/no answer. Though then the hash of a template generated from a person's image must be consistently perfect, so that every time it IDs you it recreates the template, hashes it, and searches for that in the database.
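      A tiny illustration of why that exact-reproduction requirement is the sticking point (made-up template numbers, Python's hashlib): a one-way hash gives a clean yes/no, but even a minuscule measurement difference produces a completely different digest, so "close enough" can no longer be detected.

          import hashlib

          def template_digest(template):
              # Serialize the feature vector and hash it, password-style.
              raw = ",".join(f"{v:.6f}" for v in template).encode()
              return hashlib.sha256(raw).hexdigest()

          enrolled = [0.301224, 0.552910, 0.118743]  # stored at enrollment (hypothetical)
          livescan = [0.301219, 0.552934, 0.118750]  # same person, slightly different reading

          print(template_digest(enrolled) == template_digest(livescan))  # False
          # An exact-hash scheme therefore needs the template quantized to identical
          # bits on every scan -- which is exactly the hard part.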

      I think some sort of video/infrared system would work better. Of course it would be much harder. Then you could weight their movement/facial expressions/temperature match. Much harder to fool than a simple image. After all, you can't hold a picture up to an infrared camera and expect it to work too well.
      Of course it should also be noted that any system can be broken, including ALL current identification systems. So it's really a matter of whether this is better or worse than our current methods.

      Just my thoughts.
      [ Reply to This | Parent ]
    Re:Sounds easy to fix... (Score:2, Informative)
    by robindmorris (682328) on Friday June 27, @07:03PM (#6315656)
    If you bothered to RTFA (I know, this is /.), you would find that this exploit does not need access to the biometric data; instead, it only needs access to the scoring function.

    Put simply:
    1. start with some random face
    2. ask the system to compute the recognition score for this face
    3. make changes to the face
    4. compute the new score
    5. if the score is higher, keep the change to the face, if the score is lower, reject the change
    6. goto 3

    You'll notice that nowhere do you have to look at the biometric data itself. You only have to ask the system to compute the recognition score (for which it comes with a handy api).
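    A toy version of that loop (illustrative only; score() here stands in for the recognizer's matching API, which the attacker calls as an oracle):

        import random

        def regenerate_face(score, width=8, height=8, iterations=2000):
            # Hill-climb toward a high-scoring image using nothing but the match score.
            # 1. start with some random face-sized image
            face = [[random.random() for _ in range(width)] for _ in range(height)]
            best = score(face)                     # 2. ask for the recognition score
            for _ in range(iterations):
                # 3. make a small change to the face
                y, x = random.randrange(height), random.randrange(width)
                old = face[y][x]
                face[y][x] = min(1.0, max(0.0, old + random.uniform(-0.1, 0.1)))
                new = score(face)                  # 4. compute the new score
                if new >= best:                    # 5. keep the change if it scored better...
                    best = new
                else:                              # ...otherwise reject it
                    face[y][x] = old
            return face, best                      # 6. the loop is the "goto 3"

        # Stubbed usage: in a real attack, score() would be the recognizer's API call.
        target = [[0.5] * 8 for _ in range(8)]
        toy_score = lambda img: -sum(abs(img[y][x] - target[y][x])
                                     for y in range(8) for x in range(8))
        image, final_score = regenerate_face(toy_score)

    Note that the stored template is never read; only the score ever comes back over the API.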

    Actually, this idea is so brilliantly simple that I'm annoyed I didn't think of it myself (it relates closely to a bunch of work I've done on image reconstruction).
    [ Reply to This | Parent ]
      Re:Thank god! (Score:1)
      by robindmorris (682328) on Friday June 27, @08:04PM (#6316006)
      Ah, mention the DMCA and get modded up... You don't need to break the law to exploit this. You only need to make api calls to the public api of the recognition system. It's all spelled out in the article.
      [ Reply to This | Parent ]
    Read the technical paper (Score:2)
    by alizard (107678) <alizard@ecis.com> on Saturday June 28, @04:09AM (#6318350)
    (http://www.ecis.com/~alizard)
    Decryption isn't necessary, all the cracker needs to get is the "confidence level" that the image submitted to the sensors matches the image hash in the database.

    I don't think this can be worked around in any way that winds up with a usable product.

    [ Reply to This | Parent ]
Hash the data (Score:2)
by booch (4157) <[moc.kehcubgiarc] [ta] [todhsals]> on Friday June 27, @06:18PM (#6315404)
(http://craigbuchek.com/)
I'm not sure if it's possible, since the face-recognition data probably has to be "fuzzy". But if there's any data that is exact, you could just hash that.
[ Reply to This ]
    Re:Hash the data (Score:1)
    by robindmorris (682328) on Friday June 27, @07:05PM (#6315670)
    As I said in a reply to an earlier comment:
    (cut and pasted...)

    If you bothered to RTFA (I know, this is /.), you would find that this exploit does not need access to the biometric data; instead, it only needs access to the scoring function.

    Put simply:
    1. start with some random face
    2. ask the system to compute the recognition score for this face
    3. make changes to the face
    4. compute the new score
    5. if the score is higher, keep the change to the face, if the score is lower, reject the change
    6. goto 3

    You'll notice that nowhere do you have to look at the biometric data itself. You only have to ask the system to compute the recognition score (for which it comes with a handy api).
    [ Reply to This | Parent ]
    Re:Hash the data (Score:1)
    by p3d0 (42270) on Friday June 27, @08:30PM (#6316165)
    (http://www.eecg.toronto.edu/~doylep)
    I don't understand. What part of the problem would hashing solve? (Did you read the paper?)
    [ Reply to This | Parent ]
Joe Average User... (Score:5, Interesting)
by Greyfox (87712) on Friday June 27, @06:19PM (#6315411)
(http://www.flying-rhenquest.net/)
Is going to be awfully put out when the authorities hold him because someone with his biometric pattern did something highly illegal.

He will be in the position of being assumed guilty, because everyone knows that biometrics don't lie and are completely infallible. Thanks to legislation like the DMCA, no one will testify that the systems are, indeed, very easy to compromise. It'll be illegal to talk about those aspects of security. Not that the law has ever stopped the black hats...

[ Reply to This ]
    Re:Joe Average User... (Score:2)
    by Akardam (186995) on Friday June 27, @06:53PM (#6315615)
    Testifying about the system's ease of compromise is entirely different from trying to bust some guy with a cast-iron alibi, and trust me, it will happen. All the sooner if it's someone high profile, like a congressmonkey or star athlete or actor. At that point, the system's fallibility will have to be questioned, and once it is, every case after will have the defense scrambling to cite Senator Bob vs. BioID Ltd. This is also another reason why people will always remain in the identification equation for the foreseeable future, at least until a computer can say "Gee, Mr. Bob, you're a few inches shorter than when I saw you last".
    [ Reply to This | Parent ]
      Re:Joe Average User... (Score:3, Interesting)
      by Poeir (637508) on Friday June 27, @09:39PM (#6316665)
      Alphonse Bertillon advanced a system which would provide "unique" identification by taking measurements of various bones throughout the body. In 1903, two prisoners at the same facility were found to have almost identical Bertillon measurements, and the system was more or less scrapped. Modern facial recognition systems work in a manner similar to the Bertillon one, by comparing the ratios/measurements between various components of the face, like eyes, ears, nose, et cetera.

      Sir Francis Galton's work regarding fingerprints superseded the Bertillon system, and even that has shown some weaknesses [itworld.com]. Overall, biometrics do not appear to me to be as secure as one would expect.
      [ Reply to This | Parent ]
why passports at all? (Score:1, Interesting)
by civilengineer (669209) on Friday June 27, @06:21PM (#6315422)
With this kind of technology (biometrics), the need for passports should be eliminated, right?
A machine should look into your eye and make sure you are genuine, eliminating the need for a passport.
[ Reply to This ]
Not a surprise (Score:4, Insightful)
by Henry V .009 (518000) <marstrail&hotmail,com> on Friday June 27, @06:27PM (#6315460)
(http://thrasymachus.blogspot.com/ | Last Journal: Monday January 06, @02:06PM)
Anyone who has done work on computer vision would have guessed this to be so. What would interest me is how it would be possible to exploit the algorithms, i.e., how bad a picture can you get away with? Certain images that might not look anything like a face to you or me will quite possibly be able to fool the system.

The passport angle is probably a red herring though. The unreliability of photo identification is already known. Identity theft is simple and easy. Hell, here in New Mexico, we've already been the first state to accept 'Matricula Consular' cards as valid ID for driver's licenses. Matricula Consular cards, of course, are given out by Mexican embassies to undocumented Mexicans living in the US. By 'undocumented,' I mean illegal, of course. Check out the immigration reform site www.vdare.com for some more information on the subject.
[ Reply to This ]
Biometrics 101 (Score:2, Interesting)
by stupendou (466135) on Friday June 27, @06:28PM (#6315465)
While this is an interesting exploit, the sky isn't falling. Any and all biometric systems can be exploited, and in similar ways.

However, for this particular exploit to affect passport security and the like, the entire system would have to be automated, so that there would be no one to notice the perpetrator was holding a photo of someone else in front of his face as he walked by.

To guard against exploits like these in totally automated systems, the data that is fed into the matching system should be digitally signed, so that it is clear where the data is coming from
(e.g. a real fingerprint sensor, etc.).
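One minimal way to do that (a sketch, assuming a per-sensor secret shared with the matcher and Python's standard hmac module; real deployments would more likely use per-device asymmetric signatures):

    import hmac, hashlib, os, time

    SENSOR_KEY = os.urandom(32)  # provisioned into both the sensor and the matcher

    def sensor_capture(sample):
        # The sensor tags its output so the matcher can verify provenance.
        msg = str(time.time()).encode() + b"|" + sample
        tag = hmac.new(SENSOR_KEY, msg, hashlib.sha256).digest()
        return msg, tag

    def matcher_accepts(msg, tag):
        # Reject anything not produced by a trusted sensor (e.g. an injected photo).
        expected = hmac.new(SENSOR_KEY, msg, hashlib.sha256).digest()
        return hmac.compare_digest(tag, expected)

    msg, tag = sensor_capture(b"raw fingerprint image bytes")
    assert matcher_accepts(msg, tag)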

Even so, a fake face or a fake finger can indeed spoof many biometric systems. Luckily, border crossings and airport security have humans in the loop to prevent these kinds of exploits (or to accept bribes to allow them!).
[ Reply to This ]
    Re:Biometrics 101 (Score:1)
    by michaeljthieme (685202) on Friday June 27, @09:46PM (#6316706)
    This study is interesting, and there are probably some severe implications for vendors. However biometric templates are NOT going to be stored on passports. Biometric *images* are - definitely face, potentially also fingerprint and iris. ICAO (International Civil Aviation Organization) establishes these standards; they also determine the layout, data formats, etc of physical passports. A country can decide not to go along with ICAO recommendations but that is unlikely. So if one can read the data on the new "biometric" passports, one would be reading images, not templates.
    [ Reply to This | Parent ]
Ident-i-Eeze (Score:2, Funny)
by Anonymous Coward on Friday June 27, @06:28PM (#6315468)

    There were so many different ways in which you were required to provide absolute proof of your identity these days that life could easily become extremely tiresome just from that factor alone, never mind the deeper existential problems of trying to function as a coherent consciousness in an epistemologically ambiguous physical universe. Just look at cash point machines, for instance. Queues of people standing around waiting to have their fingerprints read, their retinas scanned, bits of skin scraped from the nape of the neck and undergoing instant (or nearly instant - a good six or seven seconds in tedious reality) genetic analysis, then having to answer trick questions about members of their family they didn't even remember they had, and about their recorded preferences for tablecloth colours. And that was just to get a bit of spare cash for the weekend. If you were trying to raise a loan for a jetcar, sign a missile treaty or pay an entire restaurant bill things could get really trying.

    Hence the Ident-i-Eeze. This encoded every single piece of information about you, your body and your life into one all-purpose machine-readable card that you could then carry around in your wallet, and therefore represented technology's greatest triumph to date over both itself and plain common sense.

        -- Douglas Adams, Mostly Harmless
[ Reply to This ]
How to fix the problem (Score:4, Interesting)
by Atario (673917) on Friday June 27, @06:30PM (#6315486)

Make the cameras use x-ray backscattering (as in the earlier story today) of your face. Then in order to spoof the system, a printout of your picture (generated from the hash or not) would not work -- you'd have to build something that recreates your x-ray backscatter and show that to the camera. (I'm assuming that would be much more difficult, like making a sculpture out of meat or something -- anyone in the know wish to shoot down my theory?)

Of course, then there's the issue of getting x-rayed in the face every time you walk in the door...

[ Reply to This ]
    Re:How to fix the problem (Score:2, Interesting)
    by agrippa_cash (590103) on Friday June 27, @06:44PM (#6315576)
    Or some face topography scheme (IR distance sensors, etc.), or make people turn their head so that the computer has to validate x number of positions between a frontal and quarter profile. Thermal is too easy to fool. No doubt these methods also could be fooled and likely successfully reversed as well. But the more complicated the verification, the more complicated circumvention will have to be. It appears that the current scheme is easier to circumvent than implement.
    [ Reply to This | Parent ]
    Sorry, not comparatively hard (Score:3, Interesting)
    by mattr (78516) <<moc.ydobelet> <ta> <rttam>> on Saturday June 28, @12:29AM (#6317504)
    (http://telebody.com | Last Journal: Tuesday July 30, @08:28AM)
    Nope, check out this [3dsystems.com].

    An associate of mine runs a small factory in Japan where they make 3d printers; much of the technology is from Texas-based DTM. I can't find their homepage -- I think they might be, or once were, owned by BFGoodrich. Many companies use their Sinterstation, which uses a laser to fuse nylon or metal powder deposited in thin layers inside the production bay.

    The machines are, I believe, in the hundreds of thousands of dollars each, but they are used to make prototypes like mobile phone shells, or molds for experimental automotive parts.

    Anyway, nylon is easy, but they also have a rapidsteel process, and the holy grail, I understand, is titanium, which would allow you to create surgical implants like joint replacements. As you can see in the link above, you can already pretty easily produce a 3d model of your skull from CAT-scan tomography. I've only seen plastic versions, though they might be more appropriate for trying to mimic x-ray backscatter from bone, and much cheaper than going through the trouble of making a mold, pouring metal, and finishing it. Hospitals are probably a lot easier to penetrate than these biometric systems. Come to think of it, you could skip the biometric penetration and just use anthropological techniques to build a face over the skull based on known data about skin depth at different parts of the skull. Painting surface features based on pictures taken with a telephoto lens would also be cheap compared to the price tag mentioned in this thread for biometric analysis equipment.

    [ Reply to This | Parent ]
So let me guess (Score:1)
by curtlewis (662976) on Friday June 27, @06:31PM (#6315498)
A photo of the person held up to the facial recognition camera passes the test?

[ Reply to This ]
Not as significant as you might think (Score:5, Insightful)
by swillden (191260) * on Friday June 27, @06:33PM (#6315513)

This isn't such a big deal for face recognition systems, because face recognition systems suck at identifying people anyway. Why? First, a little terminology:

With any biometric matcher you have to define a match "tolerance", which defines how close a pair of templates (usually one from a database and one from a livescan) have to be before they're considered to be a match. Set this tolerance too "loose" and you get lots of false positives (matches that shouldn't match), set it too "tight" and you get the opposite, false negatives. The tolerance setting where you get roughly the same number of errors each way is called the equal error point, and the error rate is called the equal error rate (abbreviated ERR for some unfathomable reason).

Well, all current face recognition systems have an ERR that is too high to be useful in nearly any situation, even when used for identity verification, as opposed to the much-harder problem of identification (verification: I say I'm Bill Gates, and the system agrees; identification: The system says I'm Bill Gates, not RMS or anyone else). It's possible that in the future this will change, of course.
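For concreteness, here is a small sketch of how an equal error point is found by sweeping the tolerance over two score distributions (the scores below are made up; real curves come from vendor test data):

    # Sweep the tolerance and find where false accepts roughly equal false rejects.
    genuine  = [0.92, 0.88, 0.95, 0.81, 0.90, 0.86, 0.93, 0.79]  # same-person scores (hypothetical)
    impostor = [0.41, 0.55, 0.62, 0.48, 0.71, 0.58, 0.66, 0.52]  # different-person scores (hypothetical)

    best = None
    for i in range(101):
        t = i / 100
        far = sum(s >= t for s in impostor) / len(impostor)  # false accept rate
        frr = sum(s < t for s in genuine) / len(genuine)     # false reject rate
        if best is None or abs(far - frr) < best[0]:
            best = (abs(far - frr), t, far, frr)

    _, threshold, far, frr = best
    print(f"equal error point: threshold={threshold:.2f}, FAR={far:.2f}, FRR={frr:.2f}")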

However, this doesn't really matter because we already have ready access to an excellent and very widely available face recognition system: the Mark I eyeball. Millions of years of evolution have made people extremely good at identifying and matching human faces. What people aren't so good at (with notable exceptions) is matching a face against a database of thousands of faces they've seen only once, and *that* is something that face recognition systems can do extremely well. They may not be able to decide which faces are a "match", but they can do an excellent job of finding the *closest* faces, which can then be reviewed by the super-duper face-matching algorithm contained in the average person's head.

When automated face recognition is used in that sort of context, spoofs like this one are unlikely to be very useful; if you want to impersonate someone you'd better get a face that's good enough to fool another human. It's doable, certainly, but much harder. And holding a laptop screen in front of your face is likely to raise some suspicions.

[ Reply to This ]
    Re:Not as significant as you might think (Score:2)
    by Jasin Natael (14968) on Saturday June 28, @06:28PM (#6321590)
    Yeah, yeah. That's what they said about handwriting. Oh, wait. They were right. Maybe I'm agreeing with you.

    Just like my beloved Apple Newton -- It got the handwriting right 98% of the time, but for the other 2%, you'd find yourself double-tapping the word to see what else it thought you might have written. I'd be surprised to learn that this isn't the way most firms are implementing the technology. After all, "Blocks more than 98% of intruders" isn't a great advertising slogan unless you plan to use another system to back up that 2%.

    To sum up: You'll probably see it used in non-critical places (like advertisements), as supplementary ID (like at an ATM, but you'll still need your PIN), and as an entertainment enhancement (your TV recognizing who's in the room and recommending shows everyone is likely to enjoy). Just don't use it to lock your car, and certainly don't deploy it at work unless there's a real brain behind it.

    --Jasin Natael
    [ Reply to This | Parent ]
Better Than (Score:3, Funny)
by somethinghollow (530478) on Friday June 27, @06:39PM (#6315551)
(http://www.somethinghollow.com/ | Last Journal: Monday June 30, @01:49AM)
At least I don't have to cut someone's fingers off/eyes out/head off/etc. to get past these types of security measures any more.

Whew! What a relief.
[ Reply to This ]
Yo' Mama's So Ugly... (Score:1, Offtopic)
by sbillard (568017) on Friday June 27, @06:46PM (#6315587)
Yo mama's so ugly, she made the face recog system halt

Yo mama's so ugly, they use her face to stamp out gorilla cookies (Thanks, Redd Foxx)

Yo mama's so ugly, you could hear the face recog cameras scream.

Yo mama's so ugly, when you brought her on the plane, they made you check your bag.

Yo mama's so ugly, she made Medusa's snakes turn to stone.

*_ducks and covers_*
[ Reply to This ]
One to one relationship / pigeonhole principle (Score:2)
by zerofoo (262795) on Friday June 27, @06:47PM (#6315595)
Devices like this can NEVER be used for personal identification unless a one to one relationship between a face recognition template and the person can be mathematically proven.

Much like a hashing algorithm (and the pigeonhole principle) if two items can hash to the same spot, then the algorithm is broken; or in this instance two people look alike and the computer can't tell them apart.
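A toy illustration of the pigeonhole point (a deliberately tiny 16-bit "template" space, plain Python; real templates are larger, but the principle is the same): squeeze enough distinct inputs into a fixed-size template and two of them must eventually collide.

    import hashlib, os

    def tiny_template(face_bytes):
        return hashlib.sha256(face_bytes).digest()[:2]  # 16-bit "template" (toy)

    seen = {}
    for i in range(100000):
        face = os.urandom(64)  # stand-in for a captured face image
        t = tiny_template(face)
        if t in seen:
            print(f"faces {seen[t]} and {i} share template {t.hex()}")
            break
        seen[t] = i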

This will keep algorithms guys busy for a while.

-ted
[ Reply to This ]
    Re:One to one relationship / pigeonhole principle (Score:2)
    by awol (98751) on Friday June 27, @07:44PM (#6315913)

    Much like a hashing algorithm (and the pigeonhole principle) if two items can hash to the same spot, then the algorithm is broken; or in this instance two people look alike and the computer can't tell them apart.

    Er, actually no. Hashing two templates to the same key is not evidence of a broken algorithm, as long as some of a whole range of other factors can be used to "work" the collision. In particular, you want the algorithm to return an even distribution across the key space, and even more particularly you do not want similar faces to hash to similar keys -- that is a sign of a potentially broken algorithm. The trade-off is keyspace size versus computational complexity (and the range of factors like non-stochastic key mapping and keyspace coverage). So for example, if you hash the face independent of skin colour and eye colour (not the best, but what the hey), then when you got the collision you could store those metrics with the template to allow for a more detailed comparison to be performed. Thus if dissimilar faces hash to the same key, you could easily have a factor of ten in the keyspace and still easily identify different faces.

    Having said that, the discovery of the hashing process is where the difficulty lies and that ain't my area.

    [ Reply to This | Parent ]
      Re:One to one relationship / pigeonhole principle (Score:2)
      by zerofoo (262795) on Sunday June 29, @02:51PM (#6325948)
      I agree the algorithm isn't necessarily broken. Every hash I've ever coded needs some way to handle collisions.

      The difficulty is getting that elusive differentiating data from a face. Obviously if a human can tell apart two faces, then there must be a way to create two different facial templates.

      The article shows that the reverse is true. The authors could reconstruct the face from an image template. I didn't go deeply into the math, but it does actually look like a one to one process (i.e. one template did not result in two possible images).

      -ted
      [ Reply to This | Parent ]
    Re:One to one relationship / pigeonhole principle (Score:1)
    by p3d0 (42270) on Friday June 27, @08:35PM (#6316205)
    (http://www.eecg.toronto.edu/~doylep)
    Wow, that statement is so ridiculous, I don't even know where to begin.

    You armchair computer scientists need to give the researchers just a little credit, and benefit of the doubt, especially if you haven't read the paper.

    [ Reply to This | Parent ]
Biometrics are the visual equivalent of soundex (Score:2)
by mikeophile (647318) on Friday June 27, @06:49PM (#6315598)
I'm guessing that these biometric templates could misidentify people that look quite different to a human observer.

For instance, if she had a little less facial hair, my aunt's bouffant hairdo under a scarf might give her the same biometric as Osama bin Laden.

[ Reply to This ]
Frequent changes... (Score:2)
by Pettifogger (651170) on Friday June 27, @06:49PM (#6315600)
For some reason, I don't think biometric face scans would hold up in Hollywood (well, Los Angeles for that matter) very well. Having lived there, people's faces just seem to keep changing. And so do hair and eye colors. It's almost like a hobby for some people.
[ Reply to This ]
Reset (Score:1)
by jabbadabbadoo (599681) on Friday June 27, @06:56PM (#6315627)
How to reset a biometric system? Show it a picture of CowboyNeal.
[ Reply to This ]
has the professor been arrested... (Score:1)
by u19925 (613350) on Friday June 27, @06:59PM (#6315643)
under DMCA (what else)?
[ Reply to This ]
Pattern recognition software for the military? (Score:1)
by ratfynk (456467) on Friday June 27, @07:18PM (#6315745)
(Last Journal: Friday June 27, @02:15PM)
It is interesting that the US military just purchased $800 million plus worth of software licences from Redmond. I hope they are not planning on using MS spaghetti code for mission-critical security apps that use pattern recognition code.
Bin Laden might win the war. Especially if they install Windows media player anywhere in their networks!
[ Reply to This ]
Gee (Score:1)
by The Metahacker (3507) on Friday June 27, @07:20PM (#6315766)
This is a big problem. not. Just take the data and push it through a one-way hash (like the aforementioned password transformation) before encoding it on the card.

TMH
[ Reply to This ]
    Re:Gee (Score:2)
    by swordgeek (112599) on Friday June 27, @07:36PM (#6315864)
    (Last Journal: Monday May 05, @07:46PM)
    Great. And then we'll go back to the OLD method, of recreating faces from pictures of people.
    Visual, or at least optical, biometrics are a disaster. Anyone (including government agencies) who thinks otherwise will end up getting in trouble for it.
    [ Reply to This | Parent ]
      Re:Gee (Score:1)
      by eric256 (625188) on Friday June 27, @07:57PM (#6315978)
      Here's an idea that just occurred to me. Use two cameras in tandem, spaced a couple of inches apart, to give you a depth analysis as well. Harder to fool, and you can't just use a picture. It would have to be a full-blown model.

      Or even better: dual cameras with optical and infrared capabilities (and audio), capturing the person saying a pass phrase, "I am Joe." One-way hash all the data into the system and use that.

      You could even hash the audio and video separately so that a head injury or a cold wouldn't lock you out, or just have a guard to let those people pass.

      Well you get the idea. Any thoughts?
      [ Reply to This | Parent ]
How to beat a fingerprint scanner (Score:1, Troll)
by biostatman (105993) on Friday June 27, @07:22PM (#6315776)
Breathe on the glass sensor to get the outline of the last person's print. This will fool many systems if the previous print was authorized. (Read this in The Economist a couple of weeks ago...)

A bit OT, but thought others might find this interesting. Please don't let the DMCA dogs loose on me.

[ Reply to This ]
Links lost... (Score:2)
by joeytsai (49613) on Friday June 27, @07:24PM (#6315793)
I remember reading an article (possibly from here) about the challenges facial recognition systems face, in particular compared to the facilities of the human brain. It had very interesting examples, for instance showing only a mouth and chin, but even with just that information, most people recognized it as Julia Roberts. They also showed an altered picture of Clinton and Gore with their mouths switched, something again that everyone notices but that a computer would have a very hard time picking up on. Finally, they also had a grid of pictures, shrunk to 12x12 pixels, and even with that little data, your brain can easily discern who the pictures belong to. I'd like to look at that article again; would anybody know the link?
[ Reply to This ]
Everyone has missed the point (Score:5, Informative)
by SiliconEntity (448450) on Friday June 27, @07:29PM (#6315818)
Every comment I have read has missed the point!

This is not an exploit designed to show that biometric systems can be fooled or that you could create some kind of fake image that would match an existing one.

The whole point is that this shows that biometric templates are privacy-sensitive. Previously it was thought that they could be stored and promulgated without interfering with anyone's privacy, because it was thought to be infeasible to start from the template and reconstruct personally identifiable information about the subject.

The new paper shows that this is not true; from the templates, you can reconstruct an identifiable picture of the individual. That means that, for example, if you had a bunch of templates of people who went in for an AIDS test, you could re-create pictures of the people who went in, adequate to recognize individuals.

This would therefore interfere with the privacy of those individuals. And that implies that templates need to be subject to the same kind of privacy restrictions as other forms of personally identifying information, a standard to which they have not traditionally been held.

And that's the point of the paper.
[ Reply to This ]
Simple algorithm. It works. (Score:4, Insightful)
by jetmarc (592741) on Friday June 27, @07:37PM (#6315873)
The algorithm they used is simple. They use the face recognition
system as "oracle" and present different images until the match
is achieved. The different images are not chosen at random, but
rather evolutionary. That is, a selection of images is presented,
and the best (highest score) is chosen. Recursively, new selections
are derived from the best image, and again presented to the oracle.

According to the article 24,000 images are necessary to achieve
convergence, when the initial images were specifically chosen to
NOT be visually similar to the "target" image.

Some oracles can't be questioned 24,000 times - e.g. at an airport
or an ATM machine. You might get arrested long before you finish.

However, often press releases indicate which company designed the
software for a particular implementation of face recognition. You
can easily purchase other software of the same company (or find
an OEM product) and thus have the same (or very similar) oracle
on your desk at home. There you can do the 24,000 iterations to
get ahold of the "good" image and then proceed to remodel your
face or whatever way you intend to "present" the image to the
real face recognition system.
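
For the curious, here is a minimal C++ sketch of that score-guided (hill-climbing) loop. The match_score() function and the Image type are stand-ins I made up for this illustration - in the real attack they would be the vendor SDK's matcher and actual pixel data - but the control flow is the same:

#include <cstdio>
#include <cstdlib>
#include <vector>

using Image = std::vector<double>;   // stand-in for a face image

// Hidden "enrolled" template; the attacker never sees it directly.
static const Image target = {0.3, 0.7, 0.1, 0.9};

// Placeholder for the face-rec oracle: higher score = closer match.
double match_score(const Image& img) {
    double s = 0.0;
    for (size_t i = 0; i < img.size(); i++)
        s -= (img[i] - target[i]) * (img[i] - target[i]);
    return s;
}

int main() {
    srand(1234);
    Image best(target.size(), 0.5);          // start from a generic "average face"
    double best_score = match_score(best);

    for (int iter = 0; iter < 24000; iter++) {
        Image cand = best;                           // derive a candidate...
        size_t i = rand() % cand.size();
        cand[i] += (rand() % 2 ? 0.01 : -0.01);      // ...by one small random change

        double s = match_score(cand);                // ask the oracle
        if (s > best_score) {                        // keep it only if the score improved
            best = cand;
            best_score = s;
        }
    }
    std::printf("best score after 24000 queries: %f\n", best_score);
    return 0;
}

The point is simply that any exposed score turns the matcher into a guidance signal; a strict yes/no answer would force the attacker back to blind guessing.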

In my opinion, biometrics just doesn't work for security, because the datasets are in the open for everyone to see.

Just look at those stupid press releases from Siemens/Infineon, whose highly paid security engineers invent ATM cards with fingerprint sensors. Owner's fingerprint => money from the ATM. And where does the owner leave his fingerprint when handling the card? Couldn't possibly be on the very same ATM card, could it?

Acceptable security requires

a) something you have, and

b) something you know.

When the item you have is stolen, the thief still lacks the information you know. And vice versa: when the secret is learned (e.g. by shoulder surfing at an ATM), the item you have is still missing, so the electronic robbery can't be completed.

Biometrics is something you have, not something you know. That is the key thing to learn here!

It can be copied without your noticing, but that doesn't make it category b). It is still something you have, because everybody who gets physically near you has access to it. You can't just keep quiet to keep it secret.

Therefore, biometrics won't (ever) work as long as it's only combined with other category a) stuff. A biometric dataset can possibly replace a physical token, but it can NOT replace a PIN code.

I'm happy that this is once again demonstrated, with press coverage.

Marc
[ Reply to This ]
    Re:Simple algorithm. It works. (Score:1)
    by stupendou (466135) on Friday June 27, @09:33PM (#6316625)
    > Biometrics is something you have, not something you know. That is the key thing to learn here!

    No. Biometrics is something you *are*. A card or other token is something you have. A password is something you know. They are all distinct and can all be used together.
    [ Reply to This | Parent ]
    Re:Simple algorithm. It works. (Score:1)
    by ksni (684287) on Saturday June 28, @02:17AM (#6318010)
    (Last Journal: Tuesday June 24, @04:42PM)
    I beg to differ on the point: "A biometric dataset can possibly replace a physical token, but it can NOT replace a PIN code."
    A UK-based ATM implementation using iris technology used cards with no PIN and worked very well. The token is the 'involved party' identifier that provides context and limitation for the iris presentation.
    [ Reply to This | Parent ]
Easily fixed (Score:1)
by afidel (530433) on Friday June 27, @07:38PM (#6315875)
Just check for thermal patterns. Most CCDs used for image recognition can see near infrared, so just check that the image is of a person with a pulse. A piece of paper isn't going to give off heat like a person does =)
[ Reply to This ]
Not Surprising In Ottawa (Score:2, Funny)
by Synesthesiatic (679680) * on Friday June 27, @07:46PM (#6315923)
A couple of decades ago Ottawa was the world's coldest capital city (I forget what it is now). The saying goes that come winter it's impossible to tell people apart, because everyone's wearing parkas. Now there's a challenge for facial recognition!
[ Reply to This ]
    Re:Not Surprising In Ottawa (Score:2)
    by JohnnyCannuk (19863) on Friday June 27, @08:49PM (#6316334)
    Flamebait?

    As a resident of Ottawa, I can say this is true...really! It's actually rather insightful. From January to March here, you're only likely to see the tip of someone's nose, as the rest of the face is (and should be) covered by parkas, tuques, balaclavas, or scarves.

    Facial recognition biometrics would never be used here for that very reason.

    [ Reply to This | Parent ]
yahoo biometrics listserv (Score:1, Informative)
by Anonymous Coward on Friday June 27, @08:49PM (#6316338)
I think all these comments are very interesting, and would like to invite those of you with a continuing interest in the subject to join the yahoo biometrics group.

Go to http://groups.yahoo.com/groups/biometrics

and follow the links to join. The listserv is open, you can select various email delivery options, and you can hide your email address if you choose.

Cheers

The yahoo biometrics group moderator
[ Reply to This ]
Not anything like a password hash (Score:5, Informative)
by lkaos (187507) <anthony@NOSpam.codemonkey.ws> on Friday June 27, @09:03PM (#6316452)
(http://lbpp.sourceforge.net/ | Last Journal: Tuesday October 23, @08:14PM)
A useful password hash (at least one that isn't considered plain-text equivalent) is a cryptographic hash: one for which recovering an input from the hash value is thought to be computationally infeasible.

For instance, take this simple (non-cryptographic) hash:

// Toy additive hash: just sums the character codes of the string str.
uint32_t hash = 0;

for (size_t i = 0; i < str.length(); i++) {
      hash += str[i];
}

Given an input of, say, foobar, one would get a hash of 633. Now, if I start with an arbitrary password of, say, google, I get a hash of 637.

Since I know that slight adjustments to the word produce slight differences in the hash, I can just start moving letters one space down the alphabet until I find a matching value.

Let's say I choose:

google -> 637
foogle -> 636
fnogle -> 635
fnngle -> 634
fnnfle -> 633 *bingo*

So now I've successfully "exploited" this password protection mechanism. This is why such a hash is referred to as plain-text equivalent.
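
To make that concrete, here is a small self-contained C++ sketch of the same idea (my own illustration, not from the parent post): given only the target hash value, walk the letters of an arbitrary guess up or down the alphabet until the additive hash matches.

#include <cstdio>
#include <cstdint>
#include <string>

// Same toy additive hash as above, wrapped as a function.
uint32_t toy_hash(const std::string& str) {
    uint32_t hash = 0;
    for (size_t i = 0; i < str.length(); i++)
        hash += str[i];
    return hash;
}

int main() {
    uint32_t target = toy_hash("foobar");   // 633 - all the attacker needs to know
    std::string guess = "google";           // arbitrary starting point (hash 637)

    size_t pos = 0;
    while (toy_hash(guess) != target && pos < guess.size()) {
        if (toy_hash(guess) > target && guess[pos] > 'a')
            guess[pos]--;                   // step this letter down the alphabet
        else if (toy_hash(guess) < target && guess[pos] < 'z')
            guess[pos]++;                   // or up, if we undershot
        else
            pos++;                          // this letter is exhausted; move on
    }
    std::printf("%s -> %u (target %u)\n", guess.c_str(), toy_hash(guess), target);
    return 0;
}

From this starting point it converges on a collision ("coogle", also summing to 633) in a handful of steps, which is exactly why such a hash is plain-text equivalent.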

A cryptographic hash, though, has the interesting property that a small change in the input produces an unpredictably different output. For instance, with the same inputs you might get:

google -> 3453
foogle -> 234543
fnogle -> 234
fnngle -> 23425434
fnnfle -> 53424 ...

There's no reason biometrics can't be cryptographically strong in the same way. It's just that the algorithms currently being used aren't. That's no big news for anyone with even half a clue stick.
[ Reply to This ]
    Re:Not anything like a password hash (Score:2)
    by WNight (23683) on Saturday June 28, @07:08PM (#6321787)
    (http://slashdot.org/)
    The problem is that passwords are all-or-nothing: "google" works, "goofle" does not. There's no hint.

    Biometric systems, however, supply a score. If password systems did this, you could crack them like this: suppose the password is "aaaa" and you first try "mmmm", and it (let's say) gives a score of 50. So you try "mmma" and "mmmz" and see which gives the higher score. The first would give 62.5% and the second 37.5%, so you'd keep the first and make another change.

    With biometrics this is like showing it a standard face and getting a score. Then raising the cheekbones and trying again, then widening the nose, and so on. See how to change things to get closer to a match.

    You know how they crack ATM codes in the movies? Where all the numbers change randomly, but then they "get" a digit, and then another, etc... Passwords don't work this way because there's no way to tell if a given character is correct without getting the whole thing right. Biometrics let you solve a piece at a time.

    What this is equivalent to is the master-key problem from a month or two ago.
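
    A toy C++ demonstration of that, assuming (purely for illustration) a checker that leaks the fraction of correct characters as its score - the parent's numbers imply a finer-grained distance score, but the attack is the same:

    #include <cstdio>
    #include <string>

    static const std::string secret = "aaaa";   // what the attacker wants to recover

    // Leaky checker: returns the fraction of positions that already match.
    double score(const std::string& guess) {
        int hits = 0;
        for (size_t i = 0; i < secret.size(); i++)
            if (i < guess.size() && guess[i] == secret[i]) hits++;
        return (double)hits / secret.size();
    }

    int main() {
        std::string guess(secret.size(), 'm');      // e.g. start from "mmmm"
        for (size_t i = 0; i < guess.size(); i++) {
            // Try every letter in this position; keep whichever the checker rates highest.
            char best = guess[i];
            double best_s = score(guess);
            for (char c = 'a'; c <= 'z'; c++) {
                guess[i] = c;
                double s = score(guess);
                if (s > best_s) { best_s = s; best = c; }
            }
            guess[i] = best;
        }
        std::printf("recovered: %s\n", guess.c_str());
        return 0;
    }

    With a leaked score, the search drops from 26^4 guesses to roughly 26*4 - the same collapse a face-rec match score causes.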
    [ Reply to This | Parent ]
Oh, really? Didn't Roger Wilco already do this? (Score:4, Funny)
by willith (218835) on Friday June 27, @09:10PM (#6316487)
(http://chroniclesofgeorge.nanc.com/)
"He developed an algorithm which allows a fairly high quality image of a person to be regenerated from a face recognition template..."

This kinda reminds me of the part in Space Quest III, where you gain access to the restricted area inside ScumSoft by holding up a xeroxed picture of the CEO's face to the facial recognition scanner.
[ Reply to This ]
Re:I don't have one, do you? (Score:1, Flamebait)
by BetterThanCaesar (625636) on Friday June 27, @06:38PM (#6315543)
(http://caesar.mine.nu/)

And, as luck would have it, the rest of the world is very satisfied with you not having any intention of travelling outside of the US.

[ Reply to This | Parent ]
Re:I don't have one, do you? (Score:2, Insightful)
by warloch71 (535769) on Friday June 27, @06:47PM (#6315590)
Last time I checked, you didn't need a passport to fly within the US, to buy a car, or to rent a movie... big deal, I say. You DO know that Planet Earth doesn't stop at the US border, don't you?
[ Reply to This | Parent ]
    Re:I don't have one, do you? (Score:1)
    by NotQuiteReal (608241) on Friday June 27, @08:10PM (#6316038)
    I have a passport, but it is long expired. Without the passport, I have been to Canada, Mexico and the Bahamas. I suppose there are places I might like to go that require a passport, but there are far more that require a passport that I have no desire to visit.

    Meanwhile, plenty of people visit and permanently inhabit my state (California) without passports from their (non-USA) countries of origin just fine.

    [ Reply to This | Parent ]
  • 1 reply beneath your current threshold.
Re:I don't have one, do you? (Score:2)
by dbrutus (71639) on Friday June 27, @07:15PM (#6315728)
(http://www.roitgroup.com/)
Some of us have relatives, girlfriends, and business partners who are going to get caught up in this mess even if we ourselves don't travel past any borders.

[ Reply to This | Parent ]
Re:i disagree (Score:1)
by Borg_5x8 (547287) <slashdot&fracturedreality,co,uk> on Saturday June 28, @01:41PM (#6320044)
(http://slashdot.org/)
Yes they are [reference.com], it's the name of the digit that's opposable.
[ Reply to This | Parent ]
  • 9 replies beneath your current threshold.