Army ‘Pre-Crime’ Division wants to monitor computer activity

armytimes.com | May 5, 2012

By Joe Gould

In the wake of the biggest dump of classified information in the history of the Army, the brass is searching for ways to watch what every soldier is doing on his or her Army computer.

The Army wants to look at keystrokes, downloads and Web searches on computers that soldiers use.

Maj. Gen. Steven Smith, chief of the Army Cyber Directorate, said the software was one of his chief priorities, joking that it would take the place of a lower-tech solution: “A guy with a large bat behind every user as they go to search the Internet.”

“Now we’ve been in the news — I don’t know if you’ve seen it — with a little insider threat issue,” Smith continued.

Smith did not mention Pfc. Bradley Manning by name. However, the effort comes in the wake of the former intelligence analyst’s alleged leak of hundreds of thousands of pages of classified documents to the anti-secrecy organization WikiLeaks in 2009 and 2010. Manning faces a military trial on 22 counts, including aiding the enemy.

According to Smith, the Army will soon shop for software pre-programmed to detect a user’s abnormal behavior and record it, catching malicious insiders in the act. Though it is unclear how broadly the Army plans to adopt the program, the Army has more than 900,000 users on its computers.

Smith explained how it might work.

“So I’m on the South American desk, doing intelligence work and all of a sudden I start going around to China, let’s say,” Smith said. “That might be an anomaly, it might be justified, but I would sure like to know that and let someone make a decision, almost at the speed of thought.”

The scenario echoes the allegations against Manning: As an intelligence analyst charged with researching the Shiite threat to Iraqi elections, Manning raided classified networks for State Department cables, Afghanistan and Iraq war logs and video from a helicopter attack, according to courtroom testimony.

Software of the type Smith describes is at various stages of development in the public and private sectors. Such software could spy on virtually any activity on a desktop depending on its programming, to detect when a soldier searches outside of his or her job description, downloads massive amounts of data from a shared hard drive or moves the data onto a removable drive.

The program could respond by recording the activity, alerting an administrator, shutting down the user’s access or feeding the person “dummy data” to watch what they do next, said Charles Beard, a cybersecurity executive with the defense firm SAIC’s intelligence, surveillance and reconnaissance group.
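
The escalating responses Beard describes — record, alert, cut off access, or serve dummy data — can be pictured as a simple policy table. This is a minimal illustrative sketch, not SAIC's actual system; the function name, score scale and thresholds are all invented.

```python
# Hypothetical sketch of a graduated insider-threat response policy.
# The risk score (0.0-1.0) and cutoffs are invented for illustration.

def respond(risk_score):
    """Map an anomaly risk score to an escalating response."""
    if risk_score < 0.3:
        return "record"          # silently log the activity
    elif risk_score < 0.6:
        return "alert_admin"     # notify an administrator
    elif risk_score < 0.9:
        return "serve_dummy"     # feed dummy data, watch next moves
    else:
        return "revoke_access"   # shut down the user's access

print(respond(0.75))  # serve_dummy
```

The point of a tiered table like this is that most anomalies are benign, so the cheapest response (quiet recording) is also the most common one.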

“It’s a giant game of cat and mouse with some of these actors,” Beard said.

What’s exciting, Smith said, is the possibility of detecting problems as they happen, on what cybersecurity experts call “zero day,” as opposed to after the fact.

“We don’t want to be forensics experts. We want to catch it at the perimeter,” Smith said. “We want to catch this before it has a chance to be exploited.”

A governmentwide effort

The Army’s efforts dovetail with a broader federal government initiative. President Obama signed an executive order last October that established an Insider Threat Task Force to develop a governmentwide program to deter, detect and mitigate insider threats.

Among other responsibilities, it would create policies for safeguarding classified information and networks, and for auditing and monitoring users.

In January, the White House’s Office of Management and Budget issued a memo directing government agencies that deal with classified information to ensure they adhere to security rules enacted after the WikiLeaks debacle.

Beyond technical solutions, the document asks agencies to create their own “insider threat program” to monitor employees for “behavioral changes” suggesting they might leak sensitive information.

The interagency Insider Threat Task Force is aiming to complete work on the new standards by October. The standards may address training and employee awareness protocols, said John Swift III, a senior policy adviser to the task force, which is drafting the policy.

Deanna Caputo, lead behavioral psychologist for Mitre Corp., said both technical solutions and monitoring of human behaviors are needed for a successful detection and prevention program.

“To think that we can tackle the problem simply by technical solutions is a mistake,” Caputo said.

A “culture of reporting” is essential, she said. “We need to up the ante and expect a little bit more from our people” to report abnormal behaviors among their co-workers. However, “there is a fine line with that [reporting]. People need to trust they are in a safe environment to do their job.”

Carnegie Mellon’s Software Engineering Institute has compiled 700 insider threat case studies and identified two broad profiles of insiders who steal intellectual property in business settings.

One is an “entitled independent” disgruntled with his job who typically exfiltrates his work a month before leaving. The other is an “ambitious leader” who steals information on entire systems and product lines, sometimes to take to a foreign country, such as China.

According to Patrick Reidy, who leads the FBI’s insider threat program, such users may be conducting authorized activities for malicious ends, and their actions would not register on intrusion detection or anti-virus systems.

“People look at computers and networks but not people and data,” he said. “The insider threat is all about people.”

Reidy, Swift and Caputo discussed the effort at a defense industry convention in Washington, D.C., on April 4.

The ‘Pre-Crime’ division

Private industry and the Defense Advanced Research Projects Agency are among the entities that have technological solutions in various stages of progress.

Raytheon’s SureView software captures any security breach or policy violation it’s programmed to find and can “replay the event like a DVR” for a local administrator or others to view, according to the company’s website. The software’s triggers are programmable and can be set to flag any behavior an organization deems suspicious.

Working with Raytheon, a group of cadets from the U.S. Military Academy at West Point last year conducted a simulation of an insider attack at a forward operating base. Cadets looked at how to fine-tune the way SureView detects potential threats and eliminate false positives for innocuous behavior, said West Point computer science professor Col. Greg Conti.

“It was very powerful, very flexible and allowed you to monitor with very fine resolution activities on the desktop, and the real trick becomes how you detect anomalous behavior,” Conti said. “Predictive models are kind of the holy grail. When you see that no one else has done something but bad guys, you can start being predictive.”

At SAIC, which is testing a behavior analytics system, Beard likened behavioral modeling to the Pre-Crime unit from the science fiction movie “Minority Report.” Instead of using psychics to stop crimes before they occur, the software would be programmed to detect behavior that has preceded malicious acts in the past.

In real life, researchers are examining the behavior of malicious insiders to see what actions they took before they acted out. That in turn would be used to teach the software what behavior to flag.

“We may want to administer policies that say, ‘Gee, gosh, why do you really want to download 300 [megabytes] of stuff or a gig of data in a single session?’ ” Beard said. “We look for the antecedents of behavior that would suggest based on past history that bad things are going to take place.”

That could be visiting restricted websites, requesting access to information outside of one’s job description or asking for large amounts of storage media — or likely some combination of the above. Individually, the actions may not seem problematic, but combined and in the context of human intelligence, they could raise alarms.

“We start taking those things and recombining them to say, ‘What is going on in the environment?’ ” Beard said. “Any one of those things independently can be totally innocuous and innocent, but when you put them together — plus their job, plus their access, plus the things they are working on — you may be looking at it as a counterintel kind of thing.”
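
Beard's point — individually innocent events combining into a counterintelligence flag — resembles simple additive risk scoring. A toy sketch, assuming invented signal names and weights (a real system would derive these from past insider cases rather than hard-code them):

```python
# Invented signals and weights, for illustration only.
SIGNAL_WEIGHTS = {
    "restricted_site_visit": 1.0,
    "out_of_scope_access_request": 2.0,
    "large_storage_media_request": 1.5,
    "bulk_download": 2.5,
}

def risk_score(events, threshold=4.0):
    """Sum the weights of observed signals; flag when the combined
    score crosses a threshold no single signal reaches alone."""
    score = sum(SIGNAL_WEIGHTS.get(e, 0.0) for e in events)
    return score, score >= threshold

# One event alone stays below the threshold...
print(risk_score(["bulk_download"]))  # (2.5, False)
# ...but the combination raises an alarm.
print(risk_score(["bulk_download", "out_of_scope_access_request"]))  # (4.5, True)
```

The threshold is what encodes Beard's "plus their job, plus their access" context: no single weight triggers it, only a combination does.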

Drawbacks and challenges

Cybersecurity expert Michael Tanji, an Army veteran who has spent nearly 20 years in the U.S. intelligence community, said he sees potential drawbacks and unanswered policy questions. He asked how the Army would implement such technology without unintentionally stifling cross-disciplinary collaboration among soldiers.

Knowing they are being monitored, personnel might avoid enterprising or creative behavior for fear it would be flagged by monitoring software, he said.

Tanji also predicted the technology would come at a considerable financial cost, both to warehouse the data collected by the software and to pay the added staff needed to monitor the reports it generates.

“A brigade-sized element that uses computers on a regular basis would probably need a company-sized element just to keep up with the data that comes in,” he said.

Reidy, the FBI official, said such concerns were valid. Because software may report benign behavior as malicious and vice versa, he cautioned against using technical solutions alone to solve insider threats.

“After a major incident, and no offense to any vendors, but the charlatanism always goes up,” he said. “It’s absolutely amazing how many phone calls I get from people who say they have solved the WikiLeaks problem or solved this or that problem. Everybody’s got to eat, but it’s simply not true.”

Finding bad behavior amid the vast sea of keystrokes, downloads and Web browsing on military computers is no easy task, DARPA acknowledges.

A DARPA solicitation for Suspected Malicious Insider Threat Elimination, or SMITE, announces it is attempting to recognize “moving targets” — telltale patterns of behavior amid “enormous amounts of noise (observational data of no immediate relevance).”

The program, based in behavioral science, would have to distinguish anomalous behavior from normal behavior, and deceptive and malicious behavior from anomalous behavior, the solicitation reads.
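
The first step the solicitation describes — separating anomalous behavior from normal — is commonly done by comparing a user's activity to his or her own statistical baseline. A toy sketch using a z-score on daily download volume; the numbers and cutoff are invented, and DARPA's actual approach is not public:

```python
import statistics

def is_anomalous(history_mb, today_mb, z_cutoff=3.0):
    """Flag today's download volume if it sits more than z_cutoff
    standard deviations above the user's own historical mean."""
    mean = statistics.mean(history_mb)
    stdev = statistics.stdev(history_mb)
    z = (today_mb - mean) / stdev
    return z > z_cutoff

# Typical days of roughly 50 MB; a 300 MB day stands out sharply.
baseline = [45, 52, 48, 50, 55, 47, 51]
print(is_anomalous(baseline, 300))  # True
print(is_anomalous(baseline, 55))   # False
```

Note this only answers the easier half of SMITE's problem: a statistical outlier is anomalous, but deciding whether it is also malicious or deceptive still requires the behavioral-science layer the solicitation calls for.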

A solicitation for another program — Anomaly Detection at Multiple Scales, or ADAMS — uses accused Fort Hood shooter Maj. Nidal Hasan to frame the problem. It asks how to sift for anomalies through millions of data points — the emails and text messages on Fort Hood, for instance — using a unique algorithm, to rank threats and learn based on user feedback.

The program is trying to look beyond computers to spot the point when a good soldier turns, whether toward homicide, suicide or dumping stolen data.

“When we look through the evidence after the fact, we often find a trail — sometimes even an ‘obvious’ one,” the solicitation states. “The question is, can we pick up the trail before the fact, giving us time to intervene and prevent an incident? Why is that so hard?”
