Sorry for the absence, but work has been a bear. I could not pass up these two.
'Stingray' Phone Tracker Fuels Constitutional Clash
For more than a year, federal authorities pursued a man they called simply "the Hacker." Only after using a little-known cellphone-tracking device—a stingray—were they able to zero in on a California home and make the arrest.
A Harris StingRay II, one of several devices dubbed 'stingrays.' (Photo: U.S. Patent and Trademark Office)
Stingrays are designed to locate a mobile phone even when it's not being used to make a call. The Federal Bureau of Investigation considers the devices to be so critical that it has a policy of deleting the data gathered in their use, mainly to keep suspects in the dark about their capabilities, an FBI official told The Wall Street Journal in response to inquiries.
A stingray's role in nabbing the alleged "Hacker"—Daniel David Rigmaiden—is shaping up as a possible test of the legal standards for using these devices in investigations. The FBI says it obtains appropriate court approval to use the device.
Stingrays are one of several new technologies used by law enforcement to track people's locations, often without a search warrant. These techniques are driving a constitutional debate about whether the Fourth Amendment, which prohibits unreasonable searches and seizures, but which was written before the digital age, is keeping pace with the times.
On Nov. 8, the Supreme Court will hear arguments over whether police need a warrant before secretly installing a GPS device on a suspect's car and tracking him for an extended period. In both the Senate and House, new bills would require a warrant before tracking a cellphone's location.
And on Thursday in U.S. District Court of Arizona, Judge David G. Campbell is set to hear a request by Mr. Rigmaiden, who is facing fraud charges, to have information about the government's secret techniques disclosed to him so he can use it in his defense. Mr. Rigmaiden maintains his innocence and says that using stingrays to locate devices in homes without a valid warrant "disregards the United States Constitution" and is illegal.
His argument has caught the judge's attention. In a February hearing, according to a transcript, Judge Campbell asked the prosecutor, "Were there warrants obtained in connection with the use of this device?"
The prosecutor, Frederick A. Battista, said the government obtained a "court order that satisfied [the] language" in the federal law on warrants. The judge then asked how an order or warrant could have been obtained without telling the judge what technology was being used. Mr. Battista said: "It was a standard practice, your honor."
Judge Campbell responded that it "can be litigated whether those orders were appropriate."
On Thursday the government will argue it should be able to withhold details about the tool used to locate Mr. Rigmaiden, according to documents filed by the prosecution. In a statement to the Journal, Sherry Sabol, Chief of the Science & Technology Office for the FBI's Office of General Counsel, says that information about stingrays and related technology is "considered Law Enforcement Sensitive, since its public release could harm law enforcement efforts by compromising future use of the equipment."
The prosecutor, Mr. Battista, told the judge that the government worries that disclosure would make the gear "subject to being defeated or avoided or detected."
A stingray works by mimicking a cellphone tower, getting a phone to connect to it and measuring signals from the phone. It lets the stingray operator "ping," or send a signal to, a phone and locate it as long as it is powered on, according to documents reviewed by the Journal. The device has various uses, including helping police locate suspects and aiding search-and-rescue teams in finding people lost in remote areas or buried in rubble after an accident.
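The geometry behind this kind of locating can be illustrated with a toy sketch. This is purely hypothetical—the article does not describe Harris's actual methods, and real signal propagation is far messier—but it shows the general idea: given distance estimates inferred from signal measurements taken at several operator positions, search for the point that best fits all of them.

```python
# Illustrative sketch only; not the actual stingray technique. Assumes
# an idealized model where each "ping" yields a usable distance estimate.
import math

def locate(measurements, extent=200):
    """Grid-search the (x, y) point whose distances to each measuring
    position best match the observed distances (least squared error)."""
    best, best_err = None, float("inf")
    for x in range(-extent, extent + 1):
        for y in range(-extent, extent + 1):
            err = sum((math.hypot(x - mx, y - my) - d) ** 2
                      for (mx, my), d in measurements)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Three pings taken from different positions; distances are
# hypothetical numbers consistent with a device at (40, 30).
obs = [((0, 0), 50.0), ((100, 0), 67.08), ((0, 100), 80.62)]
print(locate(obs))  # -> (40, 30)
```

In practice an operator would move the device and take repeated measurements, narrowing the estimate with each one—which matches the article's description of gear that "can be carried by hand or mounted in cars."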
The government says "stingray" is a generic term. In Mr. Rigmaiden's case it remains unclear which device or devices were actually used.
The best-known stingray maker is Florida-based defense contractor Harris Corp. A spokesman for Harris declined to comment.
Harris holds trademarks registered between 2002 and 2008 on several devices, including the StingRay, StingRay II, AmberJack, KingFish, TriggerFish and LoggerHead. Similar devices are available from other manufacturers. According to a Harris document, its devices are sold only to law-enforcement and government agencies.
Some of the gadgets look surprisingly old-fashioned, with a smattering of switches and lights scattered across a panel roughly the size of a shoebox, according to photos of a Harris-made StingRay reviewed by the Journal. The devices can be carried by hand or mounted in cars, allowing investigators to move around quickly.
A rare public reference to this type of technology appeared this summer in the television crime drama "The Closer." In the episode, law-enforcement officers use a gadget they called a "catfish" to track cellphones without a court order.
The U.S. armed forces also use stingrays or similar devices, according to public contract notices. Local law enforcement in Minnesota, Arizona, Miami and Durham, N.C., also either possess the devices or have considered buying them, according to interviews and published requests for funding.
The sheriff's department in Maricopa County, Ariz., uses the equipment "about on a monthly basis," says Sgt. Jesse Spurgin. "This is for location only. We can't listen in on conversations," he says.
Sgt. Spurgin says officers often obtain court orders, but not necessarily search warrants, when using the device. To obtain a search warrant from a court, officers as a rule need to show "probable cause," which is generally defined as a reasonable belief, based on factual evidence, that a crime was committed. Lesser standards apply to other court orders.
A spokeswoman with the Bureau of Criminal Apprehension in Minnesota says officers don't need to seek search warrants in that state to use a mobile tracking device because it "does not intercept communication, so no wiretap laws would apply."
FBI and Department of Justice officials have also said that investigators don't need search warrants. Associate Deputy Attorney General James A. Baker and FBI General Counsel Valerie E. Caproni both said at a panel at the Brookings Institution in May that devices like these fall into a category of tools called "pen registers," which require a lesser order than a warrant. Pen registers gather signals from phones, such as phone numbers dialed, but don't receive the content of the communications.
To get a pen-register order, investigators don't have to show probable cause. The Supreme Court has ruled that use of a pen register doesn't require a search warrant because it doesn't involve interception of conversations.
But with cellphones, data sent includes location information, making the situation more complicated because some judges have found that location information is more intrusive than details about phone numbers dialed. Some courts have required a slightly higher standard for location information, but not a warrant, while others have held that a search warrant is necessary.
The prosecution in the Rigmaiden case says in court documents that the "decisions are made on a case-by-case basis" by magistrate and district judges. Court records in other cases indicate that decisions are mixed, and cases are only now moving through appellate courts.
The FBI advises agents to work with federal prosecutors locally to meet the requirements of their particular district or judge, the FBI's Ms. Sabol says. She also says it is FBI policy to obtain a search warrant if the FBI believes the technology "may provide information on an individual while that person is in a location where he or she would have a reasonable expectation of privacy."
Experts say lawmakers and the courts haven't yet settled under what circumstances locating a person or device constitutes a search requiring a warrant. Tracking people when they are home is particularly sensitive because the Fourth Amendment specifies that people have a right to be secure against unreasonable searches in their "houses."
"The law is uncertain," says Orin Kerr, a professor at George Washington University Law School and former computer-crime attorney at the Department of Justice. Mr. Kerr, who has argued that warrants should be required for some, but not all, types of location data, says that the legality "should depend on the technology."
In the case of Mr. Rigmaiden, the government alleges that as early as 2005, he began filing fraudulent tax returns online. Overall, investigators say, Mr. Rigmaiden electronically filed more than 1,900 fraudulent tax returns as part of a $4 million plot.
Federal investigators say they pursued Mr. Rigmaiden "through a virtual labyrinth of twists and turns." Eventually, they say they linked Mr. Rigmaiden to use of a mobile-broadband card, a device that lets a computer connect to the Internet through a cellphone network.
Investigators obtained court orders to track the broadband card. Both orders remain sealed, but portions of them have been quoted by the defense and the prosecution.
These two documents are central to the clash in the Arizona courtroom. One authorizes a "pen register" and clearly isn't a search warrant. The other document is more complex. The prosecution says it is a type of search warrant and that a finding of probable cause was made.
But the defense argues that it can't be a proper search warrant, because among other things it allowed investigators to delete all the tracking data collected, rather than reporting back to the judge.
Legal experts who spoke with the Journal say it is difficult to evaluate the order, since it remains sealed. In general, for purposes of the Fourth Amendment, the finding of probable cause is most important in determining whether a search is reasonable because that requirement is specified in the Constitution itself, rather than in legal statutes, says Mr. Kerr.
But it is "odd" for a search warrant to allow deletion of evidence before a case goes to trial, says Paul Ohm, a professor at the University of Colorado Law School and a former computer-crime attorney at the Department of Justice. The law governing search warrants specifies how the warrants are to be executed and generally requires information to be returned to the judge.
Even if the court finds the government's actions acceptable under the Fourth Amendment, deleting the data is "still something we might not want the FBI doing," Mr. Ohm says.
The government says the data from the use of the stingray has been deleted and isn't available to the defendant. In a statement, the FBI told the Journal that "our policy since the 1990s has been to purge or 'expunge' all information obtained during a location operation" when using stingray-type gear.
As a general matter, Ms. Sabol says, court orders related to stingray technology "will include a directive to expunge information at the end of the location operation."
Ms. Sabol says the FBI follows this policy because its intent isn't to use the data as evidence in court, but rather to simply find the "general location of their subject" in order to start collecting other information that can be used to justify a physical search of the premises.
In the Rigmaiden example, investigators used the stingray to narrow down the location of the broadband card. Then they went to the apartment complex's office and learned that one resident had used a false ID and a fake tax return on the renter's application, according to court documents.
Based on that evidence, they obtained a search warrant for the apartment. They found the broadband card connected to a computer.
Mr. Rigmaiden, who doesn't confirm or deny ownership of the broadband card, is arguing he should be given information about the device and about other aspects of the mission that located him.
In the February hearing, Judge Campbell said he might need to weigh the government's claim of privilege against the defendant's Fourth Amendment rights, and asked the prosecution, "How can we litigate in this case whether this technology that was used in this case violates the Fourth Amendment without knowing precisely what it can do?"
http://online.wsj.com/article/SB10001424053111904194604576583112723197574.html#ixzz1YjyW8gII
A future for drones: Automated killing
One afternoon last fall at Fort Benning, Ga., two model-size planes took off, climbed to 800 and 1,000 feet, and began criss-crossing the military base in search of an orange, green and blue tarp.
The automated, unpiloted planes worked on their own, with no human guidance, no hand on any control.
After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look.
Target confirmed.
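The exercise above amounts to a cross-checking protocol: no single platform's detection counts until independent sensors agree. A minimal sketch of that fusion rule, assuming each platform reports a match confidence (the Georgia Tech software's real interfaces are not public):

```python
# Hypothetical sketch of the cross-checking workflow described above.
# Each platform independently inspects the candidate and reports a
# confidence score; confirmation requires unanimous agreement.

def confirm_target(detections, threshold=0.8):
    """Confirm only when every platform's confidence clears the
    threshold -- a unanimous-vote fusion rule."""
    return all(conf >= threshold for conf in detections.values())

# Simulated readings from the two aircraft and the ground vehicle.
readings = {"uav_1": 0.93, "uav_2": 0.87, "ground_vehicle": 0.91}
print(confirm_target(readings))  # True: all three platforms agree
```

The point of routing the decision through multiple platforms, as in the Fort Benning demonstration, is that a false positive from one sensor is vetoed by the others.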
This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial “Terminators,” minus beefcake and time travel.
The Fort Benning tarp “is a rather simple target, but think of it as a surrogate,” said Charles E. Pippin, a scientist at the Georgia Tech Research Institute, which developed the software to run the demonstration. “You can imagine real-time scenarios where you have 10 of these things up in the air and something is happening on the ground and you don’t have time for a human to say, ‘I need you to do these tasks.’ It needs to happen faster than that.”
The demonstration laid the groundwork for scientific advances that would allow drones to search for a human target and then make an identification based on facial-recognition or other software. Once a match was made, a drone could launch a missile to kill the target.
Military systems with some degree of autonomy — such as robotic, weaponized sentries — have been deployed in the demilitarized zone between South and North Korea and other potential battle areas. Researchers are uncertain how soon machines capable of collaborating and adapting intelligently in battlefield conditions will come online. It could take one or two decades, or longer. The U.S. military is funding numerous research projects on autonomy to develop machines that will perform some dull or dangerous tasks and to maintain its advantage over potential adversaries who are also working on such systems.
The killing of terrorism suspects and insurgents by armed drones, controlled by pilots sitting in bases thousands of miles away in the western United States, has prompted criticism that the technology makes war too antiseptic. Questions also have been raised about the legality of drone strikes when employed in places such as Pakistan, Yemen and Somalia, which are not at war with the United States. This debate will only intensify as technological advances enable what experts call lethal autonomy.
The prospect of machines able to perceive, reason and act in unscripted environments presents a challenge to the current understanding of international humanitarian law. The Geneva Conventions require belligerents to use discrimination and proportionality, standards that would demand that machines distinguish among enemy combatants, surrendering troops and civilians.
“The deployment of such systems would reflect a paradigm shift and a major qualitative change in the conduct of hostilities,” Jakob Kellenberger, president of the International Committee of the Red Cross, said at a conference in Italy this month. “It would also raise a range of fundamental legal, ethical and societal issues, which need to be considered before such systems are developed or deployed.”
Drones flying over Afghanistan, Pakistan and Yemen can already move automatically from point to point, and it is unclear what surveillance or other tasks, if any, they perform while in autonomous mode. Even when directly linked to human operators, these machines are producing so much data that processors are sifting the material to suggest targets, or at least objects of interest. That trend toward greater autonomy will only increase as the U.S. military shifts from one pilot remotely flying a drone to one pilot remotely managing several drones at once.
But humans still make the decision to fire, and in the case of CIA strikes in Pakistan, that call rests with the director of the agency. In future operations, if drones are deployed against a sophisticated enemy, there may be much less time for deliberation and a greater need for machines that can function on their own.
The U.S. military has begun to grapple with the implications of emerging technologies.
“Authorizing a machine to make lethal combat decisions is contingent upon political and military leaders resolving legal and ethical questions,” according to an Air Force treatise called Unmanned Aircraft Systems Flight Plan 2009-2047. “These include the appropriateness of machines having this ability, under what circumstances it should be employed, where responsibility for mistakes lies and what limitations should be placed upon the autonomy of such systems.”
In the future, micro-drones will reconnoiter tunnels and buildings, robotic mules will haul equipment and mobile systems will retrieve the wounded while under fire. Technology will save lives. But the trajectory of military research has led to calls for an arms-control regime to forestall any possibility that autonomous systems could target humans.
In Berlin last year, a group of robotic engineers, philosophers and human rights activists formed the International Committee for Robot Arms Control (ICRAC) and said such technologies might tempt policymakers to think war can be less bloody.
Some experts also worry that hostile states or terrorist organizations could hack robotic systems and redirect them. Malfunctions also are a problem: In South Africa in 2007, a semiautonomous cannon fatally shot nine friendly soldiers.
The ICRAC would like to see an international treaty, such as the one banning antipersonnel mines, that would outlaw some autonomous lethal machines. Such an agreement could still allow automated antimissile systems.
“The question is whether systems are capable of discrimination,” said Peter Asaro, a founder of the ICRAC and a professor at the New School in New York who teaches a course on digital war. “The good technology is far off, but technology that doesn’t work well is already out there. The worry is that these systems are going to be pushed out too soon, and they make a lot of mistakes, and those mistakes are going to be atrocities.”
Research into autonomy, some of it classified, is racing ahead at universities and research centers in the United States, and that effort is beginning to be replicated in other countries, particularly China.
“Lethal autonomy is inevitable,” said Ronald C. Arkin, the author of “Governing Lethal Behavior in Autonomous Robots,” a study that was funded by the Army Research Office.
Arkin believes it is possible to build ethical military drones and robots, capable of using deadly force while programmed to adhere to international humanitarian law and the rules of engagement. He said software can be created that would lead machines to return fire with proportionality, minimize collateral damage, recognize surrender, and, in the case of uncertainty, maneuver to reassess or wait for a human assessment.
In other words, rules as understood by humans can be converted into algorithms followed by machines for all kinds of actions on the battlefield.
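What such a conversion might look like can be sketched in a few lines. This is an illustration only—not Arkin's actual design—encoding the behaviors the article lists (proportionality, recognizing surrender, deferring under uncertainty) as explicit checks that run before any engagement decision; all thresholds and fields are invented for the example.

```python
# Illustrative sketch, not a real targeting system. Field names and
# thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Contact:
    combatant_confidence: float  # 0..1, from identification software
    surrendering: bool
    expected_collateral: int     # estimated harm to non-combatants
    military_value: int          # assessed value of the target

def engagement_decision(c: Contact, id_threshold=0.95):
    """Apply the article's listed rules in order of precedence."""
    if c.surrendering:
        return "hold: surrender recognized"
    if c.combatant_confidence < id_threshold:
        return "hold: reassess or wait for human assessment"
    if c.expected_collateral > c.military_value:
        return "hold: fails proportionality"
    return "engage permitted"

print(engagement_decision(Contact(0.99, False, 0, 5)))
```

The skeptics quoted below would object that the hard part is not the rule structure but filling in the inputs—deciding, with common sense, whether someone really is surrendering or what the collateral damage will be.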
“How a war-fighting unit may think — we are trying to make our systems behave like that,” said Lora G. Weiss, chief scientist at the Georgia Tech Research Institute.
Others, however, remain skeptical that humans can be taken out of the loop.
“Autonomy is really the Achilles’ heel of robotics,” said Johann Borenstein, head of the Mobile Robotics Lab at the University of Michigan. “There is a lot of work being done, and still we haven’t gotten to a point where the smallest amount of autonomy is being used in the military field. All robots in the military are remote-controlled. How does that sit with the fact that autonomy has been worked on at universities and companies for well over 20 years?”
Borenstein said human skills will remain critical in battle far into the future.
“The foremost of all skills is common sense,” he said. “Robots don’t have common sense and won’t have common sense in the next 50 years, or however long one might want to guess.”
http://www.washingtonpost.com/national/national-security/a-future-for-drones-automated-killing/2011/09/15/gIQAVy9mgK_print.html