End the Lie

Nowhere to run: drones, facial recognition, soft biometrics and threat assessments


By End the Lie

A 3D facial model composed by Progeny's software (Credit: Progeny Systems Corporation)

It is the stuff of science fiction: a drone flies hundreds of feet overhead, rapidly snapping images and collating them into a 3D model of your face, verifying your identity, recording your social interactions and even creating threat assessments of yourself and those you associate with.

Unfortunately, this is not science fiction. It is real technology being developed as you read this under several military contracts, all paid for by the American taxpayer and adding to a black hole of debt that continues to grow unabated thanks to unnecessary spending like this.

What is worse is that like most military technology, we can expect this new paradigm of war to bleed into domestic police activities and so-called homeland security operations.

The major issue here, aside from the glaring privacy concerns, is that this technology can and will be used to treat suspects as guilty until proven innocent.

If you are not a fan of the government, or simply prefer not to have your privacy violated, you can bet that technology like the Department of Homeland Security’s Future Attribute Screening Technology (FAST) will set off some alarms.

Then, based on your alleged malicious intent, you will be treated as if there were actual evidence that you pose a threat. How exactly authorities will handle this “evidence” is anyone’s guess, but if the past is any indicator, it won’t be pretty.

This is precisely the idea behind contracts granted to the likes of Charles River Analytics which is developing a technology called Adversary Behavior Acquisition, Collection, Understanding, and Summarization (ABACUS).

ABACUS applies “a human behavior modeling and simulation engine” to data already collected by drones, intercepted phone calls and informants. It then produces “intent-based threat assessments of individuals and groups” as highlighted by Wired‘s Danger Room.
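The internals of ABACUS are not public, but the description above, fusing signals from several collection sources into a per-person “threat” score, can be sketched in broad strokes. Everything below (the source names, the weights, the scoring formula) is an illustrative assumption, not the actual ABACUS model:

```python
# Hypothetical sketch of an "intent-based threat assessment" pipeline:
# weighted signals from several collection sources are fused into one
# score per individual. All names, weights, and formulas here are
# illustrative assumptions, not ABACUS internals (which are not public).

from dataclasses import dataclass, field

@dataclass
class Subject:
    name: str
    # collection source -> signal strength in [0, 1]
    signals: dict = field(default_factory=dict)

# Assumed source weights; a real system would presumably derive these
# from its "human behavior modeling and simulation engine".
SOURCE_WEIGHTS = {
    "drone_sightings": 0.3,
    "intercepted_calls": 0.5,
    "informant_reports": 0.2,
}

def threat_score(subject: Subject) -> float:
    """Weighted average over whatever signals exist for this subject."""
    total = sum(
        SOURCE_WEIGHTS.get(src, 0.0) * strength
        for src, strength in subject.signals.items()
    )
    weight = sum(SOURCE_WEIGHTS.get(src, 0.0) for src in subject.signals) or 1.0
    return total / weight

s = Subject("subject-42", {"drone_sightings": 0.9, "informant_reports": 0.2})
print(round(threat_score(s), 2))  # prints 0.62
```

Note what the sketch makes plain: the output is a number manufactured from weighted guesses, yet it is exactly this kind of number that would be presented as “objective” evidence of intent.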

Producing a so-called threat assessment would be ineffective without knowing exactly who the subject is, and that’s where Progeny Systems Corporation comes in.

According to their official website, Progeny has fulfilled contracts since 1995 for the United States Army, United States Navy, United States Air Force, National Institutes of Health, Defense Advanced Research Projects Agency (DARPA) and Fortune 500 customers.

Among these is a contract recently issued for “Long Range, Non-cooperative, Biometric Tagging, Tracking and Location”.

This technology is “Non-cooperative” in that it does not require the subject’s knowledge or consent like traditional biometrics which are considered “cooperative” and thus “generally not applicable to the more difficult problem of ‘real-world’ recognition.”

Progeny’s technology is designed to be deployed on an Unmanned Aerial Vehicle (UAV) platform using existing UAVs and sensors. They have developed algorithms that can perform tagging, tracking, and locating (TTL) tasks on unwitting individuals using pre-existing video sources, just like another contract awardee, Intelligent Automation, Inc. (IAI).

Both Progeny and IAI have developed technologies that use existing UAV payloads to identify and track non-cooperative targets in real time in both urban and rural environments.

Now UAVs will be able to reacquire lost targets almost instantly, detect potential targets, create 3D images of their faces for future tracking, and generate threat assessments.

To remove the processing delays that previously made much of facial recognition too cumbersome to use in the field, Progeny also developed technology that employs “soft biometrics” to filter through crowds and “port and border monitoring scenarios to categorize individuals at a great distance.”

Without a doubt, “soft biometrics” is a nice way of saying computerized racial profiling. The algorithm cuts down on the time taken to identify individuals by sorting them “based on gender, skin color, height, weight, anatomical proportions, geometrical facial features”.
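The point of such a filter is speed: coarse attributes prune the crowd before any expensive face matching runs. A minimal sketch of that pre-filtering step follows; the attribute names, categories, and tolerances are assumptions for illustration, not Progeny’s actual feature set:

```python
# Minimal sketch of "soft biometric" pre-filtering: coarse attributes
# (gender estimate, height, skin tone class) prune the candidate pool
# before any per-face matching is attempted. Attribute names and
# tolerances are illustrative assumptions only.

people = [
    {"id": "A", "gender": "M", "height_cm": 182, "skin_class": 3},
    {"id": "B", "gender": "F", "height_cm": 165, "skin_class": 2},
    {"id": "C", "gender": "M", "height_cm": 178, "skin_class": 3},
]

def soft_filter(candidates, gender, height_cm, skin_class, height_tol=5):
    """Keep only candidates whose coarse attributes match the query."""
    return [
        c for c in candidates
        if c["gender"] == gender
        and abs(c["height_cm"] - height_cm) <= height_tol
        and c["skin_class"] == skin_class
    ]

matches = soft_filter(people, gender="M", height_cm=180, skin_class=3)
print([c["id"] for c in matches])  # prints ['A', 'C']
```

Notice that the filter keys directly on attributes like skin tone; the profiling concern raised above is not a side effect of such a design, it is the design.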

It is not farfetched to speculate that when they speak of “port and border monitoring scenarios” in which they would “categorize individuals at a great distance,” that categorization would be based on ethnicity. The fact that they are promoting this as a potential law enforcement tool only amplifies concerns about systematic racial stereotyping.

After the Progeny system captures an image with a mere 50 pixels between the subject’s eyes, the process of building a 3D model begins. Further images are then used to refine the facial model until only about 20 pixels between the eyes are required to identify the individual.

Crowds? No problem. Only a sliver of the suspect’s face in the picture? No problem. No evidence to detain them? No problem; just run a threat assessment and then claim the suspect is harboring malicious intent and that the algorithm proves it.

Does anyone else see how this could create some serious problems if it were implemented at home?

Even saying “if” is a bit naïve seeing as shockingly similar threat assessment technology was being tested months ago by the Department of Homeland Security.

Seeing as police departments and DHS are increasingly utilizing drones domestically (even so-called “micro drones”), all it would take is the application of the new software (remember, it all uses existing sensors and UAVs) and we have a full-blown pre-crime police state in America.

While all of this technology has the real potential of cutting down on civilians slaughtered by drones in attacks on “suspected militants,” it also has the far grimmer potential of being used to wrongly accuse and/or detain innocent Americans while violating the privacy rights of each and every citizen.

This is not one of those situations where the logic of “if you have nothing to hide, then why worry?” will work, if indeed such logic ever does. If we allow this type of science fiction absurdity to be advanced on our own dime, we can expect what shred of liberty we have left under the PATRIOT Act to be eradicated.

It is a sad reality that the FBI already does not need any probable cause whatsoever to conduct surveillance on you, rifle through your trash, tap your phones, and so on. We can expect this to get much worse if they can produce an “objective” threat assessment that says you’re malicious. What little need for evidence still existed in the so-called justice system will be gone in the blink of an eye.

I know these threats seem abstract or unreal, but they are at our doorstep, and if we sit by in silence we can expect more of our non-existent public funds to be squandered on projects that are everything that America is not supposed to be.

Responses to Nowhere to run: drones, facial recognition, soft biometrics and threat assessments

  1. Rwolf June 9, 2012 at 12:03 PM

    Next: Police Drones—Recording Conversations In Your Home & Business To Forfeit Property?

    Police are salivating at the prospect of having drones to spy on lawful citizens. Congress has approved 30,000 drones in U.S. skies; that amounts to 600 drones for every state.

    It is problematic that local police will want to use drones to record, without warrants, personal conversations inside Americans’ homes and businesses. Consider that the House just passed CISPA, the Cyber Intelligence Sharing and Protection Act. If passed by the Senate, CISPA will allow the military and the NSA to conduct warrantless spying on Americans’ private Internet communications, using government-certified “self-protected cyber entities” and “elements” that may share with the NSA your private Internet activity (emails, faxes, phone calls and confidential transmitted files) that they believe might relate to a cyber threat or crime, circumventing the Fourth Amendment, with full immunity from lawsuits if done in good faith. CISPA does not clearly define what an “element” or a “self-protected cyber entity” is; the terms could broadly mean almost anything: a private computer, a local or national network, a website, an online service.

    Despite some U.S. cities and counties banning or restricting police use of drones to invade citizens’ privacy, local police have a strong financial incentive to call in federal drones because of the civil asset forfeiture sharing that can result from drone surveillance. Should warrantless drone surveillance evidence be allowed in courts, circumventing the Fourth Amendment (for example, drones recording conversations in private homes and businesses), expect federal and local police civil asset property forfeitures to escalate. Civil asset forfeiture requires only a preponderance of civil evidence, little more than hearsay, for the federal government to forfeit property: any conversation picked up by a drone inside a home or business can be taken out of context by police to initiate arrests, or to confiscate a home, business and other assets through civil forfeiture. Local police already circumvent state laws that require a conviction before property can be civilly forfeited by turning their investigation over to a federal agency, which can rebate 80% of the forfeited assets to the referring local police department. The federal government is not required to charge anyone with a crime to forfeit property, and there are more than 350 laws and violations that can subject property to government asset forfeiture that have nothing to do with illegal drugs.


