
Under the Gaze
June 2011 — Five men got into a car. Akram Shah, a pharmacist, picked up his cousin Sherzada, a student; Atiq-ur-Rehman, another pharmacist; Ishrad Khan, a teenager; and Umar Khan, an auto mechanic. As they drove to a nearby village, the hiss of U.S. drone missiles filled the air. Explosions rocked the road. One hit the car and charred the men alive.
May 2011 — NYPD officers ordered Nicholas Peart, a 23-year-old Black college student, up against a wall. It was the fifth time in his life he’d been stopped and frisked. They took his cellphone, keys and wallet, then handcuffed him and put him in the backseat. Later they released him.
Summer 2010 — Alfred Carpenter was laid off in the Great Recession; then he tore his knee and was out of work for a year. When he looked for a job again, even with six years of experience in his field, no one hired him. When he told a friend about the mess his finances were in, the friend responded, “Oh, you got bad credit? They’ll never hire you.”
Corpses in a burning car, a man roughed up by cops, another denied work: these are scenes of Precrime — state and corporate strategies meant to deter crime or profit loss before it even happens. It’s enough to be part of a group — a Muslim man in Pakistan’s tribal areas, an urban Black male or an applicant with a low credit score — to be identified as the source of trouble and targeted.
Precrime is a term coined by Philip K. Dick in his 1956 short story “The Minority Report,” adapted into the 2002 film Minority Report. In it, a trio of “precogs,” mutant humans floating in a water tank, see visions of future violence. Their dreams are recorded, and the Precrime Division’s paramilitary cops arrest people before they can commit the crime.
Today the role of the “precogs” is being played by computer software that uses data mining to map social networks, purchasing behavior and movement patterns to predict who will commit acts of terrorism, local crime or job negligence. And the film’s Precrime cops are in reality the military, police and employers who use physical force and legal discrimination to secure the future of the state and corporate profits.
Online, Everyone’s Guilty
If you have any online life, you leave a digital trail. Send an e-mail, make a call, Google porn, swipe a credit card or open a Facebook account, and it will be recorded. Once that information is mined, behavioral patterns can be extracted and character profiles compiled. And it is being done every second by corporations, political campaigns and government agencies like the National Security Agency.
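What that mining looks like is no mystery. Below is a toy sketch, in Python, of how a crude character profile could be compiled from a digital trail; the events, the late-night “risk” proxy and the profile fields are all invented, a minimal illustration of the pattern extraction described above, not any agency’s actual system.

```python
from collections import Counter
from datetime import datetime

# A hypothetical digital trail: (timestamp, channel, detail).
events = [
    ("2013-07-01T09:14", "search", "knee surgery cost"),
    ("2013-07-01T22:03", "purchase", "payday loan fee"),
    ("2013-07-02T01:30", "search", "jobs no credit check"),
    ("2013-07-02T01:41", "social", "posted about unemployment"),
]

def compile_profile(events):
    """Reduce raw events to a crude behavioral profile."""
    channels = Counter(channel for _, channel, _ in events)
    # Count activity between midnight and 5 a.m. -- an invented
    # stand-in for the behavioral "risk" proxies profilers use.
    late_night = sum(1 for ts, _, _ in events
                     if datetime.fromisoformat(ts).hour < 5)
    return {
        "total_events": len(events),
        "activity_by_channel": dict(channels),
        "late_night_events": late_night,
    }

print(compile_profile(events))
```

None of these fields is evidence of anything; stitched together, they become a character.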
When Edward Snowden, a former contractor for the NSA, leaked documents to The Guardian, he exposed the scale of government data mining. In much the same way that universal gun registration terrified the Right, digital surveillance sent shudders through the Left. We saw in it the FBI tactics used against the anti-Vietnam War movement. We know elders who were hounded by COINTELPRO. We remember our own uneasy sleep under the police watchtowers at Occupy Wall Street.
The difference between past and present state surveillance is that today’s federal and local law enforcement can use data mining of social network sites to disrupt protests before they form. Or if another terrorist attack shocks the nation or we go through a digital McCarthy era, the totality of our online lives can be used as evidence against us. Most importantly, innocence is no longer a state of being one can lose through an action but is instead a transition point on a life trajectory predicted by computer programs.
Beyond political organizations are the nine million people who lost their jobs after 2008’s Wall Street crash and are now part of the 22 million underemployed and the 50 million in poverty. Employers scan Facebook pages, criminal records and credit scores to sift applicants. The longer someone is unemployed and the poorer they are, the more likely they are to run up bad debt or get into legal trouble, all of which shows up in their digital trail. When they apply for jobs, that past is used against them. They leave the job interview condemned to generational poverty because of their online profile.
The logic of empire, the logic of Jim Crow and the logic of class war structure the data mining strategies of governments and corporations. Predictive policing by law enforcement, and the drug tests, background checks and credit checks run by corporations, recreate the status quo of a hierarchical America teetering on the edge.
Signature Strikes
March 17, 2011 — Some forty men gathered at a bus depot in Datta Khel, Pakistan, for a tribal meeting called a jirga. Drones circled the sky, but the men had notified the local Pakistani military about the meeting and so weren’t afraid. They were there only to settle a dispute over a mine. After they settled in, a hissing sound filled their ears. An explosion blew them apart. The U.S. drone fired another missile, then another.
Afterward, the shocked survivors scooped up the remains. The tally of the dead was forty-two. Most were government employees or tribal leaders, according to “Living Under Drones,” a report by Stanford and NYU law school clinics. Only four were Taliban. Witness Idriss Farid said, “They were pieces — body pieces — lying around.”
In the rough tribal areas, the sky is a source of terror. According to the report, drawing on Bureau of Investigative Journalism data, from June 2004 to September 2012 “drone strikes killed 2,562–3,325 people in Pakistan, of whom 474–881 were civilians, including 176 children.”
Under President Bush, most were “personality strikes” on named, high-value targets linked to non-state terrorist organizations. Under President Obama, the spectrum of targets expanded to include “signature strikes”: attacks based on “pattern of life” analysis that target “groups of men who bear certain signatures, or defining characteristics associated with terrorist activity, but whose identities aren’t known.”
If you are an adult male in Pakistan’s tribal areas, you are a suspect. If you carry a gun, or if you attend a wedding or a meeting where members of the Taliban are present, you become by association a viable target. Your name, social networks and movements will enter the Disposition Matrix, a database run by the National Counterterrorism Center and described by Greg Miller in The Washington Post as “a single, continually evolving database” that includes “biographies, locations, known associates and affiliated organizations” and “strategies for taking targets down.” Yet, according to NBC News, from 2010 to 2011 the CIA could not confirm the identity of a quarter of those killed by drone strikes. Here is the foreign face of Precrime.
Predictive Policing
“What’s the difference whether the drone is up in the air or on the building?” Mayor Michael Bloomberg said on the John Gambling radio show. “You’re gonna have face-recognition software … You can’t keep the tides from coming in. We’re going to have more visibility and less privacy. I don’t see how you stop that.”
Drones the size of small planes, like the Houston Police Department’s ScanEagle, or drones the size of hummingbirds could fly through the city. Hovering near windows, they could carry night vision, zoom lenses, see-through imaging and face recognition software. Domestic drones have already joined wiretapping, street cameras and web surveillance in the arsenal of law enforcement. But what will be done with all this information? As police departments are flooded with data, private corporations such as SAS Institute are pitching them programs to analyze it. In its white paper “Twitter and Facebook Analysis: It’s Not Just for Marketing Anymore,” SAS offers police the ability to gain entry into accounts, discover relationships, map social networks and collect individuals’ data.
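As a rough illustration of what “mapping social networks” means in practice (a generic sketch, not SAS’s actual product; the names and message counts are invented), a few interaction records become a graph, and a centrality score flags the best-connected people:

```python
import networkx as nx

# Invented interaction records: (person_a, person_b, messages).
interactions = [
    ("organizer_1", "organizer_2", 40),
    ("organizer_1", "attendee_3", 5),
    ("organizer_2", "attendee_4", 7),
    ("attendee_3", "attendee_4", 2),
]

# Build the social graph from the records.
G = nx.Graph()
for a, b, n in interactions:
    G.add_edge(a, b, weight=n)

# Degree centrality ranks the best-connected nodes -- the people
# an analyst would flag as likely organizers.
for person, score in sorted(nx.degree_centrality(G).items(),
                            key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
```

The same handful of lines, pointed at millions of accounts, is how a protest can be flagged before it forms.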
Some programs aren’t meant just to find current crime but to predict future crime waves, much like weather forecasts. Current models like CompStat in New York rely on compiling data and patrolling where crime has already happened. With PredPol, a program initially created to predict earthquake aftershocks, police are sent to where crime is about to happen.
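PredPol’s published basis is a self-exciting point process model borrowed from seismology, in which each crime, like an earthquake, raises the short-term odds of another one nearby. The sketch below is a stripped-down, hypothetical version of that idea, not the company’s fitted model; the incidents and decay constants are invented.

```python
import math

# Invented recent burglaries: (x_block, y_block, days_ago).
incidents = [(2, 3, 1), (2, 4, 2), (3, 3, 5), (8, 1, 9)]

def risk(x, y, time_decay=0.3, spread=1.5):
    """Each past crime raises risk nearby, fading with time and
    distance -- the aftershock idea in miniature."""
    total = 0.0
    for ix, iy, days_ago in incidents:
        dist_sq = (x - ix) ** 2 + (y - iy) ** 2
        total += (math.exp(-time_decay * days_ago)
                  * math.exp(-dist_sq / (2 * spread ** 2)))
    return total

# Score a 10x10 grid of blocks and pick tonight's patrol targets.
cells = [(x, y, risk(x, y)) for x in range(10) for y in range(10)]
hotspots = sorted(cells, key=lambda c: -c[2])[:3]
print("patrol these blocks:", [(x, y) for x, y, _ in hotspots])
```

Note that the model only knows what it is fed: if arrests cluster in certain neighborhoods, so will the forecasts, which is the feedback loop described below.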
And it works. San Diego, Seattle and Columbia, South Carolina are using PredPol and have seen burglary rates drop. But the website PrivacySOS argues that predictive policing keeps the status quo of inequality in place. One example: whites, Latinos and Blacks smoke marijuana at roughly the same rate, but minorities are arrested at nearly three times the rate of whites. Fed this data, predictive policing will come down harder on neighborhoods of color, justifying the cops’ presence with the language of math.
According to PrivacySOS, predictive policing just recreates “the feedback loop of injustice,” in which urban men of color are stopped and frisked incessantly. It’s a form of public shaming and punishment to be thrown against a wall and groped by cops. It’s the face of domestic Precrime that Mayor Bloomberg endorses.
“I think we disproportionately stop whites too much and minorities too little,” he said, as reported by the New York Daily News. “I don’t know where they went to school, but they certainly didn’t take a math course, or a logic course.”
Class Warfare
“He asked for my Facebook password,” my mom said. My eyebrows lifted. “Did you give it to him?” I asked.
“No,” she said. She had told the employer that she had deactivated her account. It seemed odd, but asking around, I heard friends say that background checks, drug tests and credit checks were normal practice. And sometimes they cost people a chance at making a living. On March 4, Blake Ellis wrote in CNNMoney that one in four Americans go through a credit check for a job and one in ten are denied a job because of it.
Blacks and Latinos are held back more by this because of deeply entrenched unemployment that pummels their credit scores. In June 2013, the unemployment rate was 13.7 percent for African Americans, 9.1 percent for Latinos and 6.6 percent for whites. So the communities hit hardest by unemployment have the hardest time getting a job. According to Ellis, citing a survey by the Society for Human Resource Management, companies screen credit scores to prevent theft, embezzlement or “negligent hiring.”
Applicants with low credit scores are not being punished for wrongs they committed but for ones they are expected to commit. And this is layered on older forms of institutional prejudice. In 2009, The New York Times reported on Black job applicants “whitening” their résumés by changing their names, removing historically Black colleges and selecting white references. Before doing so, they weren’t getting called back for interviews.
Here is the face of corporate Precrime. And while the poor stay poor so long that they can’t find a way into the workforce, Microsoft recently came up with a way to avoid them and their neighborhoods altogether. In 2012, it was granted a patent for an application that gives walking directions. One of its features computes the crime statistics of an area and directs the user around the “bad” neighborhoods. So, if they are ignored long enough, they just disappear.
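The patent’s internals aren’t public in detail, but the general technique is easy to sketch: a shortest-path search over the street grid in which each block’s crime statistic is added to its walking cost, so the cheapest route quietly detours around the “bad” neighborhoods. The graph, crime scores and penalty weight below are all invented.

```python
import heapq

# An invented four-block street grid: node -> [(neighbor, distance)].
streets = {
    "A": [("B", 1.0), ("C", 1.0)],
    "B": [("A", 1.0), ("D", 1.0)],
    "C": [("A", 1.0), ("D", 1.0)],
    "D": [("B", 1.0), ("C", 1.0)],
}
# Hypothetical per-block crime scores from "area statistics."
crime = {"A": 0.1, "B": 0.9, "C": 0.2, "D": 0.1}

def route(start, goal, penalty=5.0):
    """Dijkstra where entering a block costs its distance plus a
    penalty scaled by the block's crime score."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, cost
        if node in visited:
            continue
        visited.add(node)
        for nxt, dist in streets[node]:
            if nxt not in visited:
                step = dist + penalty * crime[nxt]
                heapq.heappush(frontier, (cost + step, nxt, path + [nxt]))
    return None, float("inf")

# A to D: the router avoids high-crime block B and goes through C.
print(route("A", "D"))
```

How hard the router steers around a neighborhood comes down to one arbitrary penalty number, the kind of design choice that decides who gets ignored.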
Reversing the Dynamic
What happens when all the networks are connected? When drug tests, Facebook and credit scores, drone footage, e-mail and Google history, and buying and movement patterns are integrated into a single profile for each citizen that is monitored by corporations and government? Will those with progressive politics or anti-authoritarian temperaments be tracked?
The more important question is how being watched and threatened with Precrime punishment changes our behavior. Even as cameras track us and each keystroke is recorded, it’s the heavy feeling of judgment that kills us internally. And we’ve been through this before. In the racial paranoia of Jim Crow, the political fever of McCarthyism and the shame of poverty in today’s Great Recession, it’s the social ideal, the light by which we measure ourselves, that casts a long shadow into our lives.
How do we cure ourselves of fear? How do we roll back the practice of Precrime? A recurring lesson of social movements is that we must reverse the surveillance dynamic. Whether it’s sitting at segregated lunch counters, confronting pharmaceutical bureaucrats with H.I.V. victims or hauling sofas in front of Bank of America to make visible the foreclosure crisis, the goal of activists is to pierce the social distance between the privileged and the invisible victims of their power.
We on the Left create scenes of moral crisis by making pain visible. In doing so, those who suffered silently can create community and take meaningful action. And the action becomes a social mirror where the ruling class itself is seen as a criminal enterprise.
A Leftist Supercomputer
In the film Minority Report, the plot turns on the possibility of the Precrime program going national. But the program has one flaw: if someone knows the future, they can change it. At the climax, agent John Anderton confronts Lamar Burgess, the director of the Precrime Division, who murdered to keep this truth from going public. Burgess stands, gun in hand, predicted to kill Anderton. If he does, the program is sound. If he doesn’t, the program is illegitimate and it ends. Burgess turns the gun on himself and shoots.
The film pivots on the idea that the future — the final outcome of our actions — is not determined if we see the consequences of our present. It is a choice denied to the millions caught in poverty, in war zones and in the social constructs of ethnic, gender or class categories. There, life feels like being caught in a flow of events one cannot control. There, life ends where the increasingly sophisticated programs predict it will, in death, jail or misery.
But as in Minority Report, we can change our future, not just by following predictions of the next crime but by understanding that its source is often a larger, invisible, systemic crime. This is the basic divide between the Left and the Right. The Right sees social institutions as innately valuable for continuity and accepts hierarchy and inequality. We, the Left, see social hierarchy as innately destructive.
So what if we made a film called Majority Report, about a Leftist supercomputer named Lenin? In it, technicians in white suits would say that it is programmed to save the earth, end poverty and create the Good Life. Unlike predictive policing, it wouldn’t target future crime but would tally the world’s data and root out the very source of crime.
Tall as a cathedral, it would hum with immense electricity as people around the world waited to hear what it would say. In the film, drones would be reprogrammed to buzz the offices of Goldman Sachs. Police would receive orders on their phones to arrest Wall Street traders. Planned Parenthood clinics would find their checking accounts flush with cash. The wealthy would receive e-mails announcing new, higher taxes. Schools would be told to hire more teachers.
And desperate to stop it, politicians and business leaders would try to yank the plug. Audiences would cheer as the film’s heroes defend the supercomputer while it churns out new laws and new policies for a new world. The climax would come when Lenin is blown up by a bomb planted by a conservative, and in that moment the people of the world would hang suspended. Then they would realize the bomb came too late. Having seen the source of crime and its solution, the people wouldn’t need Lenin anymore; they would take over the cities because, finally, they’d be free to choose their future.