Daily Archives: November 21, 2012

TSA issues holiday warning to ‘Opt Out and Film’ privacy rights activists

washingtonexaminer.com | Nov 19, 2012

The Transportation Security Administration sent out a message to privacy activists planning to opt out of the TSA’s “nudie scanner” screenings and film the alternative pat-down process.

“We will no longer allow the TSA to stick their hands down our pants and touch our private parts!” reads a note on a related Facebook page.

According to “Opt Out and Film week” organizers, the protest starts November 26 to send a message to the TSA about the violation of the right to privacy.

The TSA signaled today that it is aware of the protest, and noted that participants may be prohibited from filming at some locations.

“We’re also aware of the Opt Out and Film week, where some are planning on opting out of the body scanner and then filming their experience,” said TSA spokesman Bob Burns in a statement today.

Burns added that while the TSA “does not prohibit photographs at screening locations, local laws, state statutes, or local ordinances may.”

“TSA takes its mission to protect the safety of the traveling public seriously and our officers will continue to uphold our high standards of professionalism during the busy holiday season,” Burns added.

Women, children main victims of attack, says Gaza hospital


Palestinian medic carries a wounded child into al-Shifa hospital, 18 November (Ashraf Amra / APA images)

Electronic Intifada | Nov 20, 2012

GAZA (IRIN) – The nurses at al-Shifa hospital in the Gaza Strip have seen bombing casualties before, but never on this scale.

“I was here when the [23-day] 2008-09 war took place, and I think this one is more difficult in [terms of] injuries and the type of demanding work we do,” said Talaat al-Ejla, a 30-year-old nurse.

Nurses work 12-hour shifts, but it is the night-time shifts that have been the hardest these last few days.

“It’s very hard now, with many injured people coming every hour. Women and children outnumbered men, especially with the new wave [of attacks] targeting houses and civilian buildings,” said Ibrahim Jirjawi, a nurse on the orthopedic ward, who has worked here for seven years.

“It’s more dangerous now than before, and we expect that things will be worse if ground operations start,” he said.

So far, more than 139 Palestinians have been killed in the Israeli bombardment, according to the Ma’an News Agency (“Israeli strikes on Gaza kill 2 children, 12 others”).

“The health ministry [in Gaza] was facing severe shortages of medicines before this recent crisis,” said Mahmud Daher, the current head of the World Health Organization’s (WHO) office in Gaza.

He said the number of injured persons arriving at Gaza hospitals had “dramatically increased in the last 24 hours,” with more than 700 visiting hospitals, 252 of them children.

Many of the drugs that have run out are life-saving, according to the WHO (“World Health Organization concerned over the emergency situation in Gaza,” 17 November).
Catastrophic

Meanwhile, an Israeli military spokesperson said on Twitter on 19 November: “We continue to transfer goods & gas to #Gaza,” adding that the previous day 16 trucks carrying medical supplies entered Gaza, while 26 patients in Gaza were evacuated to Israel.

But the head of al-Shifa hospital, Medhat Abbas, said it was still lacking about 40 percent of the drugs needed.

“The shortage, of course, affects the quality of our work. However, our staff are working to the maximum to fulfill needs in this catastrophic situation,” he said.

Outside the hospital, ambulances line up to ferry patients over the Rafah border crossing to Egypt.

The crossing has been open throughout the bombardment, and government officials in Gaza say the Egyptian authorities have said the border will remain open.
Aid in Egypt

The Egyptian interior ministry has deployed ten ambulances at the crossing to receive Palestinian victims of the Israeli air raids. When casualties arrive, they are taken by ambulance to a hospital in al-Arish, the largest town in northern Sinai, close to the Israeli border.

“Palestinian victims have been arriving here since Friday,” said Tarek Khatir, a senior Egyptian health ministry official in northern Sinai.

“When they come to us, we take them to al-Arish Hospital for first aid, then we decide whether they need more treatment at other hospitals, either in Cairo or in other governorates.”

He said two doctors specializing in such emergencies had been sent to hospitals in the Egyptian border region.

Medical aid and food have also been sent into Gaza from the Egyptian side. The Egyptian Red Crescent sent in medical materials and medicines on 17 November.

The Arab Medical Union also sent in medical supplies. Several union members had visited the Gaza Strip in recent days to get first-hand experience of the needs there.

“The teams include orthopedic surgeons and neurologists,” said Ahmed Abdel Razik, medical coordinator with the Arab Medical Union.

TSA airport confiscation of personal property creates new surplus store resale market


Tom Zekos of Newbury, N.H., searched tubs of confiscated pocket knives for sale at the surplus property store. CHERYL SENTER FOR THE BOSTON GLOBE

Surplus store offers up stuff that just won’t fly

TSA does not like to say they confiscate items.

bostonglobe.com | Nov 21, 2012

By Billy Baker

CONCORD, N.H. — As the busy holiday travel season arrives, so too does the infrequent flyer. That means a very particular secondhand market is about to start booming, one that depends on people who, more than a decade after the Sept. 11 attacks, still do not know that you cannot carry a machete onto an airplane. Or a baseball bat. Or scissors. Or hammers. Or . . . you wouldn’t believe it all.

But if you’d like, you can see it all.

And you can buy it real cheap.

The New Hampshire State Surplus Store in Concord has become a hub of this secondhand market, the spot where the bulk of the items people surrender to security at New England’s major airports is resold to the public for pennies on the dollar. (They also sell the things people forget while going through security, in case you’re in the market for a belt, a watch or sunglasses.)

“It’s amazing what people travel with,” said Rocky Bostrom, an employee at the store who spends a good part of his day going through boxes of items and shaking his head.

“I like to say we get soup to nuts, heavy on the nuts. I’ve got a bullwhip under my desk right now. I can’t put it out in the store because it’ll take out someone’s eye.”

And with the holidays, the inventory in the store will swell, as the quantity and savviness of airline passengers change.

“With Thanksgiving and the holidays, you’re going to have more infrequent flyers, people who are less familiar with travel than your business travelers, which leads to more issues with the carry-on rules,” said Ann Davis, a spokeswoman for the Transportation Security Administration, which oversees airport security.

TSA does not like to say they confiscate items. “They’re surrendered,” Davis said. “Passengers have options.”

For all but the most serious incidents, such as a loaded gun, a passenger can leave the security line and bring the item to their car, give it to the person who dropped them off, or, at many airports, mail the item to themselves. They can also, of course, choose not to fly.

But short of that, the options require time, something many travelers do not have, so the prohibited items are simply left with security, an accidental gift to the government.

A look around the surplus store, an oddball series of rooms in an old dairy farm surrounded by cornfields, reveals a menagerie of items that fall into categories.

First, there are the accidental things, the sort that travelers might understandably forget they had in their possession. The core of this cache is pocketknives and tools, such as screwdrivers and corkscrews. They get them by the thousands, so many that there is an entire subculture of resellers who start waiting in line two hours before the surplus store opens so they can pounce on the newest inventory and then turn it around on eBay.

“Once in a while, there’s some pushing and shoving,” Bostrom said. “They just charge through the door, reach over each other, and then complain that everything is priced too high.”

Many visit the store — which is open Monday, Wednesday, and Friday from 8 a.m. to 3 p.m. — three or four times each day. The store also, as the name implies, sells the state government’s surplus items, everything from used snowplows to old office furniture and fax machines.

TSA collects so many miniature multitools and Swiss Army knives that each model has its own bin at the surplus store, where they sell for $2.

The next major category is the laughable, which has two subcategories: those items people can’t possibly think they can bring on an airplane and those items that can’t possibly be prohibited on an airplane.

Saws, pick axes, a prison shank. Come on, people.

Snow globes. Come on, TSA.

The ban on snow globes, which were outlawed along with many liquids and gels in 2007 after an apparent terrorist plot in London to use liquid explosives on US-bound planes, has long been the subject of ridicule and a source of bewilderment for souvenir-toting passengers who are not aware of the edict.

Though the agency relaxed its standards this summer to allow snow globes that contain less than 3.4 ounces of liquid, the surplus store still gets enough of them that you can buy 10 for a dollar.

Another laughable item they see a lot of is bowling pins, usually covered with autographs, from bowlers returning from tournaments. These are usually bought by a sheriff’s department, which uses them for target practice.

The final category is those items that are, you might say, unforgivable. “We get tons and tons of boxcutters,” said John Supry, the store manager. “That’s really how it all started” — boxcutters were a key weapon for the Sept. 11 hijackers — “and yet people still carry them.”

TSA says it makes every effort to reunite passengers with items accidentally left at security; they keep them for at least 30 days. But the employees of the New Hampshire State Surplus Store say there is simply too much to be in the matchmaking game.

Occasionally, they can be persuaded by someone who surrendered something of sentimental value. They have helped couples find engraved wedding cake serving knives and recently received a nice thank you note from a woman who was reunited with her grandmother’s heirloom silverware.

But for the most part, it’s people who call and say, “I lost my Swiss Army knife.”

Sorry, there’s no way they’re looking for it. But if you want one, come on up. They have boxes full of ones just like it.

And take some of these snow globes while you’re here.

Human Rights Watch: Ban “Terminator” Robots Before We Lose Control

dailytech.com | Nov 20, 2012

by Jason Mick

Humanitarian group predicts war crimes and worse if robot AIs are trained to target and kill humans

I. Human Rights Watch Warns of Robotic War Crimes

Thus far, no nation has produced a fully autonomous robotic soldier. However, many observers fear we are creeping towards an era in which automated killing machines are a staple of the battlefield. The U.S. and other nations have been actively developing land, air, and sea unmanned vehicles. Most of these machines are imbued with some degree of artificial intelligence and operate in a semi-autonomous fashion. However, they currently have a human operator in the loop, (mostly) in control.

But experts fear that within 20 to 30 years artificial intelligence and military automation will have advanced to the point where nations consider deploying fully automated war robots to kill their enemies.

International humanitarian group and war-crimes watchdog Human Rights Watch has published a 50-page report entitled “Losing Humanity: The Case Against Killer Robots,” which calls on world governments to enact a global ban on autonomous killing robots, similar to current prohibitions on the use of chemical warfare agents.

Comments Steve Goose, Arms Division director at Human Rights Watch, “Giving machines the power to decide who lives and dies on the battlefield would take technology too far. Human control of robotic warfare is essential to minimizing civilian deaths and injuries. It is essential to stop the development of killer robots before they show up in national arsenals. As countries become more invested in this technology, it will become harder to persuade them to give it up.”

II. Ban the ‘Bots

The proposal, co-endorsed by the Harvard Law School International Human Rights Clinic, also calls for a prohibition on the development, production, and testing of fully autonomous war robots.

The groups address the counter-argument — that robotic warfare saves the lives of soldiers — by arguing that it makes war too convenient. They argue that an “autocrat” could turn cold, compassionless robots on his own civilian population; it would be much harder to convince human soldiers to do that.

. . .

Pull the Plug on Killer Robots

. . .

Countries could also claim their cyber-soldiers “malfunctioned” to try to get themselves off the hook for war crimes against other nations’ civilians.

And of course science fiction fans will recognize the final concern — that there could be legitimate bugs in the AI which cause the robots either to miscalculate a proportional response to violence, to fail to distinguish between civilian and soldier, or, worst of all, to “go Terminator” and turn on their fleshy masters.

Comments Mr. Goose, “Action is needed now, before killer robots cross the line from science fiction to feasibility.”

Ban ‘killer robots’, rights group urges


A screen shot from Terminator Salvation. File picture. Image by: Industrial Light & Magic.

Hollywood-style robots able to shoot people without permission from their human handlers are a real possibility and must be banned, campaigners warn.

Sapa-AFP | Nov 20, 2012

The report “Losing Humanity” – issued by Human Rights Watch and Harvard Law School’s International Human Rights Clinic – raised the alarm over the ethics of the looming technology.

Calling them “killer robots,” the report urged “an international treaty that would absolutely prohibit the development, production, and use of fully autonomous weapons.”

The US military already leads the way in military robots, notably the unmanned aircraft or drones used for surveillance or attacks over Pakistan, Afghanistan, Yemen and elsewhere. But these are controlled by human operators in ground bases and are not able to kill without authorisation.

Fully autonomous robots that decide for themselves when to fire could be developed within 20 to 30 years, or “even sooner,” the 50-page report said, adding that weapon systems that require little human intervention already exist.

Raytheon’s Phalanx gun system, deployed on US Navy ships, can search for enemy fire and destroy incoming projectiles all by itself. The X47B is a plane-sized drone able to take off and land on aircraft carriers without a pilot and even refuel in the air.

Perhaps closest to the Terminator-type killing machine portrayed in Arnold Schwarzenegger’s action films is a Samsung sentry robot already being used in South Korea, with the ability to spot unusual activity, talk to intruders and, when authorised by a human controller, shoot them.

Fully autonomous fighting machines would spare human troops from dangerous situations. The downside, though, is that robots would then be left to make nuanced decisions on their own, the most fraught being the need to distinguish between civilians and combatants in a war zone.

“A number of governments, including the United States, are very excited about moving in this direction, very excited about taking the soldier off the battlefield and putting machines on the battlefield and thereby lowering casualties,” said Steve Goose, arms division director at Human Rights Watch.

While Goose said “killer robots” do not exist as yet, he warned of precursors and added that the best way to forestall an ethical nightmare is a “preemptive, comprehensive prohibition on the development or production of these systems.”

Jody Williams, the 1997 Nobel Peace Prize laureate, said in Washington that the prospect of killer robots “totally freaked me out.”

“I had visions of the Terminator,” she said.

“The thought that this development was proceeding without any public discussion I found more reprehensible than most military R&D because I really believe that this would… totally transform the face of warfare.”

The problem with handing over decision-making power to even the most sophisticated robots is that there would be no clear way of making anyone answer for the inevitable mistakes, said Noel Sharkey, professor of robotics at the University of Sheffield.

“If a robot goes wrong, who’s accountable? It certainly won’t be the robot,” he said.

“The robot could take a bullet in its computer and go berserk, so there’s no way of really determining who’s accountable and that’s very important for the laws of war.”

Rise of the Machines: Autonomous killer robots ‘could be developed in 20 years’


Rise Of The Machines: The third instalment of the Terminator film franchise imagines the chain of events that leads to death-dealing computers taking over the planet using robot soldiers

‘If a robot goes wrong, who’s accountable? It certainly won’t be the robot’

– Noel Sharkey, University of Sheffield

Militaries around the world ‘very excited’ about replacing soldiers with robots that can act independently

U.S. leads the way with automated weapons systems, but drones still need remote control operator authorisation to open fire

Human Rights Watch calls for worldwide ban on autonomous killing machines before governments start using them

DailyMail | Nov 21, 2012

By Damien Gayle

Fully autonomous robots that decide for themselves when to kill could be developed within 20 to 30 years, or ‘even sooner’, a report has warned.

Militaries across the world are said to be ‘very excited’ about machines that could be deployed alone in battle, sparing human troops from dangerous situations.

The U.S. is leading development in such ‘killer robots’, notably unmanned drones often used to attack suspected militants in Pakistan, Yemen and elsewhere.

Drones are remotely controlled by human operators and unable to kill without authorisation, but weapons systems that require little human intervention already exist.

Raytheon’s Phalanx gun system, deployed on U.S. Navy ships, can search for enemy fire and destroy incoming projectiles by itself.

The Northrop Grumman X47B is a plane-sized drone able to take off and land on aircraft carriers, carry out air combat without a pilot and even refuel in the air.

But perhaps closest to the Terminator-type killing machine portrayed in Arnold Schwarzenegger’s action films is a Samsung sentry robot already being used in South Korea.

The machine is able to spot unusual activity, challenge intruders and, when authorised by a human controller, open fire.

ROBOCOP GETS REAL
US researchers are working on a real-life Robocop who would patrol the streets to combat crime – just like in the film. Injured policemen or soldiers will be wired up to the ‘PatrolBot’, which will effectively give them mechanical replacements for limbs they have lost whilst in service. The plan is to make a basic version of Alex Murphy, the fictional policeman in the 1987 hit Robocop, who is turned into a cyber cop after being nearly killed in the line of duty. The new technology is based on advances in US military telerobotics, in which users are wired up remotely to a robot and given physical feedback to simulate the feeling of being there.

The warnings come from a new report by Human Rights Watch, which insists that such Terminator-style robots be banned before governments start deploying them.

The report, dubbed Losing Humanity and co-written by Harvard Law School’s International Human Rights Clinic, raises the alarm over the ethics of the looming technology.

Calling them ‘killer robots,’ it urges ‘an international treaty that would absolutely prohibit the development, production, and use of fully autonomous weapons.’

Such machines would mean that human soldiers could be spared from dangerous situations, but the downside is that robots would then be left to make highly nuanced decisions on their own, the most fraught being the need to distinguish between civilians and combatants in a war zone.

‘A number of governments, including the United States, are very excited about moving in this direction, very excited about taking the soldier off the battlefield and putting machines on the battlefield and thereby lowering casualties,’ said Steve Goose, arms division director at Human Rights Watch.

While Goose said ‘killer robots’ do not exist as yet, he warned of precursors and added that the best way to forestall an ethical nightmare is a ‘preemptive, comprehensive prohibition on the development or production of these systems.’

The problem with handing over decision-making power to even the most sophisticated robots is that there would be no clear way of making anyone answer for the inevitable mistakes, said Noel Sharkey, professor of robotics at the University of Sheffield.

‘If a robot goes wrong, who’s accountable? It certainly won’t be the robot,’ he said.

‘The robot could take a bullet in its computer and go berserk. So there’s no way of really determining who’s accountable and that’s very important for the laws of war.’

Now Big Brother is REALLY watching you

Frightening system to predict what people will do

WND | Nov 21, 2012

by Steve Elwart

In a government-sponsored research project eerily reminiscent of the 2002 film “Minority Report,” the Pentagon’s Defense Advanced Research Projects Agency (DARPA) has partnered with Carnegie Mellon University to create “an artificial intelligence (AI) system that can watch and predict what a person will likely do in the future.”

In “Minority Report,” a specialized “PreCrime” unit, part of the Washington, D.C. police department, arrests criminals based on the precognition of three psychics. In the near future, DARPA hopes that rather than using psychics, computers will be able to identify individuals and order them detained based on their “anomalous behavior.”

Tapping into live surveillance video feeds and using specially programmed software, a new computer system dubbed “Mind’s Eye” will filter surveillance footage to support human operators, and automatically alert them whenever suspicious behavior is recognized.

According to the research coming from Carnegie Mellon, the security camera system can monitor a scene in real time and sound an alarm if the program determines that illicit activity is indicated. The program would be sophisticated enough to determine whether, for example, a person who sets down a bag in an airport is simply sitting next to it or has left the bag altogether.
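
To make the abandoned-bag example concrete, here is a minimal toy sketch of such a detector, written against the open-source OpenCV library. The thresholds, the grid-cell bookkeeping, and the function names are all illustrative assumptions; nothing here reflects the actual, unpublished Mind’s Eye design.

```python
# Illustrative sketch only: a toy "left object" detector in the spirit of the
# Mind's Eye description. All thresholds and names are assumptions for
# demonstration, not details of the actual DARPA/Carnegie Mellon system.
import cv2

STILL_FRAMES = 150   # ~5 seconds at 30 fps before an object counts as "left"
MIN_AREA = 2000      # ignore small foreground specks
CELL = 50            # coarse location bucket, in pixels

def monitor(video_path):
    cap = cv2.VideoCapture(video_path)
    # Background subtraction separates new/moving objects from the static
    # scene; a very low learning rate keeps a dropped bag "foreground".
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    still_counts = {}  # grid cell -> consecutive frames that cell was occupied

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame, learningRate=1e-4)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        occupied = set()
        for c in contours:
            if cv2.contourArea(c) < MIN_AREA:
                continue
            x, y, w, h = cv2.boundingRect(c)
            occupied.add((x // CELL, y // CELL))
        for cell in occupied:
            still_counts[cell] = still_counts.get(cell, 0) + 1
            if still_counts[cell] == STILL_FRAMES:
                print(f"ALERT: stationary object near cell {cell}")
        # A cell that empties out (the object moved on) resets its counter.
        for cell in list(still_counts):
            if cell not in occupied:
                del still_counts[cell]
    cap.release()
```

A real system would also have to tell a bag from a seated person and check whether the owner is still nearby; the sketch only shows that "how long has this object been stationary" is a mechanically answerable question.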

The researchers noted that humans are extremely skilled at picking important pieces of information out of a mass of visual data and at making decisions based on both the recorded information and acquired background knowledge. The DARPA project strives to mimic that skill: picking out important pieces of information from a sea of visual data and predicting how people will behave, based on their actions, under uncertain conditions.

DARPA wants to deploy this software initially in airports and bus stations, and if the pilot program is successful, the software could be installed at every red light, street corner, and public place in America. It could also capture feeds from video conferencing systems, video emails, and other forms of streaming media.

According to Forbes, Carnegie Mellon is just one of 15 research teams that are participating in the program to develop smart video software. The final version of the program is scheduled to be deployed in 2015.

Mark Geertsen, a spokesman for DARPA, said in a statement that the goal of the project is “to invent new approaches to the identification of people, places, things and activities from still or moving defense and open-source imagery.”

The first part of the project involves a program called PetaVision. This initiative is a cooperative effort between Los Alamos National Laboratory (LANL) and Portland State University with the support of the National Science Foundation. The goal of the initiative is to “achieve human-level performance in a ‘synthetic visual cognition’ system”; in other words, to create a computer program that will duplicate a human’s ability to see and recognize objects, specifically faces. It would incorporate advanced artificial intelligence to identify people and objects in a video feed by looking at their shape, color, and texture, as well as how they move.
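
PetaVision itself has not been published in usable form, but the first step it automates, finding faces in a video frame, can be illustrated with stock tools. The sketch below uses OpenCV's bundled Haar-cascade detector, a much simpler technique than LANL's neuromorphic approach, purely to make that step concrete.

```python
# Illustrative sketch only: plain-vanilla face detection with OpenCV's bundled
# Haar cascade. This is far simpler than the "synthetic visual cognition"
# PetaVision aims at; it is shown only to make the face-finding step concrete.
import cv2

# Pretrained frontal-face detector shipped with the OpenCV package.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(image_path):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # scaleFactor and minNeighbors trade detection rate against false alarms.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Each hit is a bounding box; a full system would crop these regions and
    # match them against a gallery of known faces.
    return [(int(x), int(y), int(w), int(h)) for (x, y, w, h) in faces]
```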

To do this type of advanced computing, the program is being developed on an IBM Roadrunner supercomputer performing one quadrillion (a million billion) mathematical operations every second.

While the initial software is being programmed by humans, the program has the ability to learn as it is being programmed.

According to the Los Alamos National Laboratory, the goal of the project is to recreate the visual functions of the human brain. The laboratory already plans a second phase: a program that would mimic the function of the entire brain.

The second part of the project is another program called Videovor. Little is known about this program, but the information that is available seems to indicate that it will be used to “summarize” data taken from video cameras.

The most time-consuming part of surveillance analysis is reviewing the accumulated video intelligence and determining its value. Videovor captures the video feed, analyzes it, and presents a summary of the useful information and events found in the feed.

All this would be done in real time, eliminating the need to wait for results.
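
Since no technical details of Videovor have been released, any concrete example is necessarily a guess. A classic low-tech stand-in for video summarizing is keyframe extraction: keep only the frames where the scene changes sharply and hand those to a human. The sketch below does this with color-histogram distances; the method and threshold are illustrative assumptions, not details of the program.

```python
# Illustrative sketch only: naive keyframe extraction as a stand-in for the
# kind of video "summarizing" attributed to Videovor.
import cv2

def keyframes(video_path, threshold=0.5):
    cap = cv2.VideoCapture(video_path)
    last_hist = None
    summary = []   # indices of frames worth human review
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # A coarse color histogram characterizes the overall scene content.
        hist = cv2.calcHist([frame], [0, 1, 2], None, [8, 8, 8],
                            [0, 256, 0, 256, 0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        # Keep a frame only when the scene shifts sharply from the last keyframe.
        if last_hist is None or cv2.compareHist(
                last_hist, hist, cv2.HISTCMP_BHATTACHARYYA) > threshold:
            summary.append(idx)
            last_hist = hist
        idx += 1
    cap.release()
    return summary
```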

The third part of the project is the development of a “geospatial oriented structure extraction” program, designed to automatically render a crude “wireframe” representation of the important events in the video from several angles, eventually eliminating the need for a human to condense hours of video into a few minutes of pertinent information.

This automated approach to video surveillance could one day replace using humans to monitor cameras. With Mind’s Eye installed, the computer system would be cheaper to maintain than human operators and would never need a lunch break or a day off. The computer could monitor every camera in a city around the clock, 365 days per year.

Also, current surveillance systems can only report what has happened in the past; they cannot forecast future behavior. Today, investigators can only see how a car was stolen or a person mugged after the fact. This new software is being designed to prevent crimes before they happen.

Buried in the footnotes of the Carnegie Mellon paper was a reference to P. W. Singer’s book, “Wired for War: The Robotics Revolution and Conflict in the 21st Century.” It offers an interesting glimpse into the direction the research team may be taking. The book examines the revolution taking place on the battlefield and how it is changing not only how wars are fought, but also the politics, economics, laws, and ethics that surround war itself.

The book talks about the explosion of unmanned systems on the battlefield. It notes that the number of unmanned systems on the ground in Iraq during the Second Gulf War went from zero to 12,000 in just five years. The book also notes that these new computer systems will soon make human fighter pilots obsolete: robotic scouts the size of houseflies now do reconnaissance work once conducted by Special Forces units, and military pilots fly combat missions from their cubicles outside Las Vegas.

However, critics suggest just as there are inherent dangers associated with turning over wars to machines, so too are there dangers associated with turning over national security and the criminal justice system to mechanical watchdogs.

The Mind’s Eye AI system poses a very real danger to individual civil liberties. Critics say relinquishing surveillance and law enforcement to a machine leaves a society open to a future where all activities will be monitored and recorded in the name of public safety. That surveillance would not be limited to public venues. As the courts have increasingly limited an individual’s “expectation of privacy,” automated monitoring of human behavior can take on increasingly invasive proportions.

As with so many other government programs, the scope of the Mind’s Eye project can be vastly expanded into areas far outside of its original intent.

Deployment of this project could be a major threat to an individual’s privacy rights and turn a Hollywood script into reality.