Surely this is just a question of technology. A sufficiently advanced pattern-matching algorithm could allow computers to do all the matching required and to rigorously determine the statistical likelihood of error based on the quality of any given fingerprint.
If the premise of fingerprint matching is sound, and human involvement in matching is the failing here, then that failing can be technologically overcome.
I wonder about the actual training received by the fingerprint analysts, and whether they themselves are actually trained scientists.
But I think the real problem is how fingerprints are presented to juries. As with DNA, a typical jury member is going to just say "well, his prints were there, so he did it," without understanding the relative probabilities involved in fingerprint analysis. The print card may have a nice set of prints, but the prints you leave on an object through normal contact aren't nearly so clear. When the jury hears "it was a match," I'm not sure that they are actually told how much of a match the prints are.
A positive match on 25% of a thumbprint is useful information, but it's not as significant as a match to multiple different full prints (matches covering >80% of the latent prints for both thumb and forefinger, for example). I don't think the average jury member thinks along those lines unless the defense has an expert who specifically directs them that way.
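To make the intuition concrete, here is a minimal sketch of why several independent full-print matches are far stronger evidence than one partial match. All probabilities below are made-up illustrative assumptions, not real forensic statistics, and real prints from the same hand are not fully independent.

```python
# Sketch: combining random-match probabilities across multiple prints.
# The per-print probabilities here are invented for illustration only.

def combined_random_match_prob(per_print_probs):
    """Probability that an unrelated person coincidentally matches
    every print, assuming the matches are statistically independent."""
    p = 1.0
    for prob in per_print_probs:
        p *= prob
    return p

# Hypothetical numbers: suppose a 25% partial thumbprint would match
# 1 in 1,000 random people, while a clear >80% latent would match
# only 1 in 1,000,000.
partial_thumb_only = combined_random_match_prob([1e-3])
two_full_prints = combined_random_match_prob([1e-6, 1e-6])

print(partial_thumb_only)  # 0.001
print(two_full_prints)     # roughly 1e-12
```

Under these assumed numbers, two clear latents are about a billion times harder to match by chance than one partial thumbprint; that is the kind of gap a jury never hears quantified.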
The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it.
- Francis Bacon