Category Archives: Transhumanism

Early Facebook investor Peter Thiel: some people will live for centuries, rely on robots and take trips to the moon

Tech visionary Thiel sets out to spark a biotech revolution

fiercebiotech.com | Apr 17, 2012

By John Carroll

Peter Thiel, an early venture investor in Facebook and FierceMarkets, has handed out a round of grants of up to $350,000 to a slate of six startup biotech companies, each of which promises a game-changing approach to medicine. And he’s hoping that handing out checks to these startup dreamers will help ignite some radical thinking on the possibilities of our collective “amazing future.”

The list of radical “visionaries” includes Longevity Biotech, which is working on artificial protein technology to develop potent oral drugs; Arigos Biomedical, which is developing new technology to allow the long-term storage of organs; Immusoft, which is reprogramming human immune cells; Inspirotec, which is working on a low-cost device to gather and identify airborne agents; along with 3Scan and Positron, which are advancing new medical imaging technology.

“In the past, people dreamed of the future as a radically better, more technologically advanced place: You might live for centuries, delegate work to your robots, and take your vacations on the moon,” said Thiel, who established and funds the Thiel Foundation. “Now, many people expect their children to inherit a world worse than today’s. With Breakout Labs, we want to rekindle dreams of an amazing future. That’s why we’re supporting researchers who dream big and want to build a tomorrow in which we all want to live.”

Thiel set up Breakout Labs to fund early-stage research work, backing teams of radical thinkers working outside traditional academic and industry circles. And he says more companies can earn his backing throughout the year. The new venture is currently focused primarily on the intersection of biology and technology, though Thiel plans to expand the focus as time goes on.

“Super-Soldiers” Fight Disease With Bionic Implants

mobiledia.com | Mar 21, 2012

By Kate Knibbs

The U.S. military plans to implant soldiers with medical devices, making them harder to kill with diseases.

The military’s Defense Advanced Research Projects Agency, or DARPA, announced plans to create nanosensors that monitor soldiers’ health on the battlefield and keep doctors constantly abreast of potential health problems.

DARPA’s plan for nanosensors reflects a larger trend, as scientists are trying to harness technology to improve health care across the globe. Doctors are already quickly adopting mobile technology to improve patient care, carrying around iPads to better explain procedures and inventing smartphone apps to oversee drug users’ progress and watch for signs of stress in at-risk patients.

DARPA called the implants “a truly disruptive innovation,” highlighting how healthier soldiers would change the state of modern warfare because most medical evacuations occur due to ordinary illnesses and disease, not injuries. If the U.S. can lead the way in this kind of high-tech monitoring, it could give the military another leg up on adversaries still beset by everyday illness.

Nanotechnology continues to find a place in the medical field as well. Stanford University researchers are developing tiny robotic monitors that can diagnose illnesses, monitor vital stats and even deliver medicine into the bloodstream, similar to the devices that the military plans to create.

Monkey controls robot hand through brain implants in Chinese lab


Image: China Daily Information Corp – CDIC/Reuters

newscientist.com | Feb 24, 2012

by Caroline Morley

Jianhui manipulates objects with his hands and gets a drink as a reward. Unknown to him, not far away a robot hand mirrors his fingers’ moves as it receives instructions from the chips implanted in his brain.

Zheng Xiaoxiang of the Brain-Computer Interface Research Team at Zhejiang University in Zijingang, China, and colleagues announced earlier this week that they had succeeded in capturing and deciphering the signals from the monkey’s brain and translating them into real-time robotic finger movements.

The two sensors implanted in Jianhui’s brain monitor just 200 neurons in his motor cortex, Zheng says. However, this was enough to accurately interpret the monkey’s movements and control the robotic hand.

Humans have used electrodes to control prosthetic arms, but Zheng claims this research looks at the finer movements of the fingers.

“Hand moves are associated with at least several hundreds of thousands of neurons,” she said. “We now decipher the moves based on the signals of about 200 neurons. Of course, the orders we produced are still distant from the truly flexible finger moves in complexity and fineness.”
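The article does not describe the decoding method Zheng’s team used, so as a rough illustration of the general idea only, the sketch below maps simulated firing rates from 200 recorded neurons onto five finger joint angles with a least-squares linear decoder. All names, dimensions, and data here are hypothetical, not details from the Zhejiang University work.

```python
# Minimal sketch of motor-cortex decoding: map firing rates from ~200
# recorded neurons to finger joint angles with a linear model fitted on
# paired (neural activity, movement) samples. Purely illustrative; the
# article gives no implementation details.
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 200   # neurons monitored by the two implanted sensors
n_joints = 5      # one flexion angle per finger (hypothetical output)
n_samples = 1000  # paired training samples (hypothetical)

# Simulated training data: firing rates (spikes/s) and observed finger angles.
firing_rates = rng.poisson(lam=10, size=(n_samples, n_neurons)).astype(float)
true_weights = rng.normal(scale=0.05, size=(n_neurons, n_joints))
finger_angles = firing_rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, n_joints))

# Fit a least-squares linear decoder: angles ≈ firing_rates @ W.
W, *_ = np.linalg.lstsq(firing_rates, finger_angles, rcond=None)

# At run time, each new window of firing rates becomes joint commands
# for the robotic hand.
new_rates = rng.poisson(lam=10, size=(1, n_neurons)).astype(float)
predicted_angles = new_rates @ W
print(predicted_angles)
```

Real decoders are considerably more involved (spike sorting, temporal smoothing, nonlinear models), which is consistent with Zheng’s caveat that 200 neurons fall well short of capturing truly flexible finger movement.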

Darpa Implants Could Track Your Stress Level 24/7


Photo: U.S. Air Force

Wired | Feb 3, 2012

By Katie Drummond

Plenty of geeks are already obsessed with self-tracking, from monitoring sleep rhythms to graphing caffeine intake versus productivity. Now, the Department of Defense’s far-out research agency is after the ultimate kind of Quantified Self: Soldiers with implanted body sensors that keep intimate tabs on their health, around the clock.

In a new call for research, Darpa is asking for proposals to devise prototype implantable biosensors. Once inserted under a soldier’s skin, Darpa wants the sensors to provide real-time, accurate measurements of “DoD-relevant biomarkers” including stress hormones, like cortisol, and compounds that signal inflammation, like histamine.

Implantable sensors are only the latest of several Pentagon-backed ventures to track a soldier’s health. Darpa’s already looked into tracking “nutritional biomarkers” to evaluate troops’ diets. And as part of the agency’s “Peak Soldier Performance” program, Darpa studied how one’s genes impact physical ability, and tried to manipulate cellular mitochondria to boost the body’s energy levels.

Sensors alone won’t make troops stronger, smarter or more resilient. But they’d probably offer the kind of information that could. For one thing, the sensors would provide military docs an array of reliable info about the health of every single soldier. Plus, they’d tell leaders how a soldier’s body stood up to grueling physical training or a tough deployment. Tracking changes in the body’s endocrine system, for example, might tell a physician that a soldier is increasingly sleep deprived. Or observing chronically increased inflammation levels might tell a team leader that trainee number five isn’t cut out for the Navy SEALs.

Real-time sensors would also solve plenty of problems where warzone medical care is concerned. It’s not easy to take a urine test in the middle of a firefight. Darpa’s solicitation notes that health care often suffers because of “overnight shipping to a centralized laboratory,” and the “collection, processing and handling” that can mar specimens in transit.

Besides, urine samples and blood tests are hardly as personalized as an implanted sensor would be. A system that tracks several biomarkers could offer a robust and real-time analysis of how, say, a soldier’s sleeping patterns or dietary choices affect his or her physical performance.
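As a loose illustration of what “tracking several biomarkers” might look like on the data side, the sketch below screens a stream of readings against reference ranges and flags anything out of bounds. Cortisol and histamine come from Darpa’s solicitation as quoted above; the field names, units, thresholds, and alerting rule are hypothetical placeholders, not anything Darpa has specified.

```python
# Minimal sketch: screen multi-biomarker readings from an implanted
# sensor against reference ranges. Ranges and structure are hypothetical.
from dataclasses import dataclass

@dataclass
class Reading:
    soldier_id: str
    biomarker: str    # e.g. "cortisol", "histamine"
    value: float      # concentration in the sensor's (hypothetical) units
    timestamp: float  # seconds since epoch

# Hypothetical acceptable ranges per biomarker: (low, high).
REFERENCE_RANGES = {
    "cortisol": (5.0, 25.0),
    "histamine": (0.0, 1.0),
}

def flag_out_of_range(readings):
    """Return readings that fall outside their biomarker's reference range."""
    flagged = []
    for r in readings:
        low, high = REFERENCE_RANGES.get(r.biomarker, (float("-inf"), float("inf")))
        if not low <= r.value <= high:
            flagged.append(r)
    return flagged

stream = [
    Reading("S-001", "cortisol", 31.2, 1_328_000_000.0),
    Reading("S-001", "histamine", 0.4, 1_328_000_010.0),
]
print(flag_out_of_range(stream))  # only the elevated cortisol reading is flagged
```

A fielded system would obviously need per-soldier baselines and trend analysis rather than fixed cutoffs, which is the kind of longitudinal picture the article says the sensors would give physicians and team leaders.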

Far out as the idea sounds, scientists have already made impressive strides toward implantable biosensors. A team at Clemson University, with Pentagon funding, has devised a sensor that can be implanted for short periods to monitor the well-being of injured patients. Another group, at Tufts University, is making biosensors out of silk, which they think will be easier to introduce into bodily tissues. Some companies are even getting into niche implants, most notably those to monitor glucose levels among diabetics.

Still, plenty of challenges persist. For one, biocompatibility — the ability of the sensor to integrate into the body, without being “walled off” by surrounding tissues — is still a limiting factor in determining whether a sensor will even work, not to mention what it can measure and how long it’ll last. And Darpa’s ideal sensors don’t just need to be biocompatible. They’ve also got to offer extremely accurate information on several different biomarkers, and have a long enough lifespan to avoid frequent replacement.

Of course, a sensor that tracked every estrogen uptick and cortisol dip would be a self-tracker’s wet dream and a major aid for doctors — whether civilian or military. It’s also got some vaguely dystopian connotations, like the prospect of job hiring and firing based on, say, a body that’s got less than optimal stress responses.

But don’t panic just yet. For now, Darpa only wants prototypes tested on “biospecimens and animal models.”

Neuroscience could mean soldiers controlling weapons with minds


Medevac troops from the American 451st air expeditionary wing look out from their Pavehawk helicopter while heading to pick up casualties in Kandahar, Afghanistan. Photograph: Sean Smith for the Guardian

Neuroscience breakthroughs could be harnessed by military and law enforcers, says Royal Society report

Guardian | Feb 6, 2012

by Ian Sample

Soldiers could have their minds plugged directly into weapons systems, undergo brain scans during recruitment and take courses of neural stimulation to boost their learning, if the armed forces embrace the latest developments in neuroscience to hone the performance of their troops.

These scenarios are described in a report into the military and law enforcement uses of neuroscience, published on Tuesday, which also highlights a raft of legal and ethical concerns that innovations in the field may bring.

The report by the Royal Society, the UK’s national academy of science, says that while the rapid advance of neuroscience is expected to benefit society and improve treatments for brain disease and mental illness, it also has substantial security applications that should be carefully analysed.

The report’s authors also anticipate new designer drugs that boost performance, make captives more talkative and make enemy troops fall asleep.

“Neuroscience will have more of an impact in the future,” said Rod Flower, chair of the report’s working group.

“People can see a lot of possibilities, but so far very few have made their way through to actual use.

“All leaps forward start out this way. You have a groundswell of ideas and suddenly you get a step change.”

The authors argue that while hostile uses of neuroscience and related technologies are ever more likely, scientists remain almost oblivious to the dual uses of their research.

The report calls for a fresh effort to educate neuroscientists about such uses of the work early in their careers.

Some techniques used widely in neuroscience are on the brink of being adopted by the military to improve the training of soldiers, pilots and other personnel.

A growing body of research suggests that passing weak electrical signals through the skull, using transcranial direct current stimulation (tDCS), can improve people’s performance in some tasks.

One study cited by the report described how US neuroscientists employed tDCS to improve people’s ability to spot roadside bombs, snipers and other hidden threats in a virtual reality training programme used by US troops bound for the Middle East.

“Those who had tDCS learned to spot the targets much quicker,” said Vince Clark, a cognitive neuroscientist and lead author on the study at the University of New Mexico. “Their accuracy increased twice as fast as those who had minimal brain stimulation. I was shocked that the effect was so large.”

Clark, whose wider research on tDCS could lead to radical therapies for those with dementia, psychiatric disorders and learning difficulties, admits to a tension in knowing that neuroscience will be used by the military.

“As a scientist I dislike that someone might be hurt by my work. I want to reduce suffering, to make the world a better place, but there are people in the world with different intentions, and I don’t know how to deal with that.

“If I stop my work, the people who might be helped won’t be helped. Almost any technology has a defence application.”

Research with tDCS is in its infancy, but work so far suggests it might help people by boosting their attention and memory. According to the Royal Society report, when used with brain imaging systems, tDCS “may prove to be the much sought-after tool to enhance learning in a military context”.

One of the report’s most striking scenarios involves the use of devices called brain-machine interfaces (BMIs) to connect people’s brains directly to military technology, including drones and other weapons systems.

The work builds on research that has enabled people to control cursors and artificial limbs through BMIs that read their brain signals.

“Since the human brain can process images, such as targets, much faster than the subject is consciously aware of, a neurally interfaced weapons system could provide significant advantages over other system control methods in terms of speed and accuracy,” the report states.

The authors go on to stress the ethical and legal concerns that surround the use of BMIs by the military. Flower, a professor of pharmacology at the William Harvey Research Institute at Barts and the London hospital, said: “If you are controlling a drone and you shoot the wrong target or bomb a wedding party, who is responsible for that action? Is it you or the BMI?

“There’s a blurring of the line between individual responsibility and the functioning of the machine. Where do you stop and the machine begin?”

Another tool expected to enter military use is the EEG (electroencephalogram), which uses a hairnet of electrodes to record brainwaves through the skull. Used with a system called “neurofeedback”, people can learn to control their brainwaves and improve their skills.

According to the report, the technique has been shown to improve training in golfers and archers.

The US military research organisation, Darpa, has already used EEG to help spot targets in satellite images that were missed by the person screening them. The EEG traces revealed that the brain sometimes noticed targets but failed to make them conscious thoughts. Staff used the EEG traces to select a group of images for closer inspection and improved their target detection threefold, the report notes.
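The report gives no implementation detail beyond the workflow itself, but the triage idea is simple enough to sketch: images the analyst dismissed, yet whose EEG trace showed a strong target-related response, get queued for a second look. The scoring threshold and data layout below are hypothetical.

```python
# Minimal sketch of EEG-assisted triage: re-inspect images the analyst
# rejected but the EEG-derived target score flagged. Hypothetical data.

def select_for_reinspection(images, eeg_threshold=0.8):
    """Return IDs of images the analyst rejected but the EEG flagged."""
    return [
        img["id"]
        for img in images
        if not img["analyst_marked_target"] and img["eeg_target_score"] >= eeg_threshold
    ]

screened = [
    {"id": "sat-001", "analyst_marked_target": False, "eeg_target_score": 0.91},
    {"id": "sat-002", "analyst_marked_target": True,  "eeg_target_score": 0.95},
    {"id": "sat-003", "analyst_marked_target": False, "eeg_target_score": 0.12},
]
print(select_for_reinspection(screened))  # ['sat-001'] goes back for closer inspection
```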

Work on brain connectivity has already raised the prospect of using scans to select fast learners during recruitment drives.

Research last year by Scott Grafton at the University of California, Santa Barbara, drew on functional magnetic resonance imaging (fMRI) scans to measure the flexibility of brain networks. Grafton and his colleagues found that a person’s network flexibility helped predict how quickly they would learn a new task.

Other studies suggest neuroscience could help distinguish risk-takers from more conservative decision-makers, and so help with assessments of whether they are better suited to peacekeeping missions or special forces, the report states.

“Informal assessment occurs routinely throughout the military community. The issue is whether adopting more formal techniques based on the results of research in neuroeconomics, neuropsychology and other neuroscience disciplines confers an advantage in decision-making.”

IBM: Resistance is unnecessary, the Borg will be assimilated comfortably


“Star Trek” captain Jean-Luc Picard (Patrick Stewart) is fitted with gizmos for a fictional Borg transformation. The blending of humans and hardware will probably be more artful in real life by 2111. Paramount Pictures

IBM thinks about the next 100 years

MSNBC | Jun 16, 2011

By Alan Boyle

A hundred years from now, will we be assimilated by the machines? Or will we assimilate them? These are the kinds of issues facing International Business Machines as the company begins its second 100 years.

Right now, most folks are thinking about the past 100 years at IBM, which is celebrating the centennial of its founding on Thursday. But for Bernard Meyerson, the company’s vice president of innovation, it’s all about the next century.

“That’s pretty much what we think about,” Meyerson told me today.

Meyerson has plenty to look back on, including his own not-so-small part in IBM’s past innovations. When his cell phone dropped the connection during our telephone conversation, he called back and casually mentioned that he had a hand in creating the transistors built into that cell phone. And when I asked him to explain, he said, “I actually invented the technology known as silicon-germanium.”

It turns out that IBM has played a behind-the-scenes role in all sorts of technologies, ranging from semiconductor development to barcodes to Wi-Fi. “IBM is a funny company,” Meyerson said. “We don’t force you to put a little sticker on anything that says, ‘We’re the smart guys.'”

But enough about the past: What about the future? “Going forward, you have tremendous opportunities,” particularly when it comes to making sense of the huge databases that are being built up in all sorts of fields, Meyerson said. For example, imagine a system that can take medical records from the 285 million people around the world with diabetes, anonymize those records and analyze them, looking for potential new treatments or preventive measures.

“The fact is, there is no mechanism today that could do that, and the reason is that medical data is unstructured,” he said. There’s little consistency in how the records are kept, and medical conditions might be described in different ways by different doctors.

When you put together the volumes of data and the numbers of people that have to be covered in these massive, unstructured data sets, the figures mount up to quintillions of bytes. That’s the challenge facing new types of computing tools — for example, the Watson supercomputer, which won a highly publicized “Jeopardy” quiz-show match earlier this year. Now Watson is being put to work on a tougher task: making sense of medical records, which is just the kind of job Meyerson has in mind.
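As a toy illustration of the two steps Meyerson describes, the sketch below strips direct identifiers from each record and maps the different wordings doctors use for the same condition onto one canonical label before pooling. The field names, synonym table, and hashing scheme are hypothetical, not anything IBM or Watson actually does.

```python
# Minimal sketch: anonymize heterogeneous medical records, then normalize
# free-text condition names to a common label before aggregation.
# All field names and values are hypothetical.
import hashlib

CONDITION_SYNONYMS = {
    "type 2 diabetes": "diabetes_type_2",
    "t2dm": "diabetes_type_2",
    "adult-onset diabetes": "diabetes_type_2",
}

def anonymize(record):
    """Drop direct identifiers and replace the patient ID with a one-way hash."""
    cleaned = {k: v for k, v in record.items() if k not in ("name", "address")}
    cleaned["patient_hash"] = hashlib.sha256(record["patient_id"].encode()).hexdigest()[:12]
    del cleaned["patient_id"]
    return cleaned

def normalize_condition(record):
    """Map a doctor's free-text wording onto a canonical condition label."""
    raw = record.get("condition", "").strip().lower()
    record["condition"] = CONDITION_SYNONYMS.get(raw, raw)
    return record

records = [
    {"patient_id": "A17", "name": "Jane Q.", "address": "1 Example St", "condition": "T2DM"},
    {"patient_id": "B42", "name": "John P.", "address": "2 Example Ave", "condition": "Adult-onset diabetes"},
]
pooled = [normalize_condition(anonymize(r)) for r in records]
print(sum(r["condition"] == "diabetes_type_2" for r in pooled), "diabetes records pooled")
```

The hard part Meyerson is pointing at is building that kind of mapping at the scale of real clinical language, which is exactly the natural-language problem Watson is now being aimed at.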

Still other challenges await. Watson-style computers could digest the millions of data points involved in tracking the flow of highway traffic, then develop models to predict where the tie-ups could arise before they actually happen. The computers of the next century will have to handle a wide range of “big data” challenges, ranging from climate modeling to natural-language search engines for multimedia.

Meyerson doesn’t expect Watson to answer that challenge completely. A hundred years from now, Watson will almost certainly be considered a quaint antique, much like the tabulating machines that were made back in 1911.

“Watson specifically is not the issue, as much as the combination of Watson’s ability to interpret natural language, the capacity to store ‘big data’ and apply data analytics to come up with solutions for society,” he said. “In the absence of natural language, you’re going to have a short, unhappy life attempting this work. Without that key ingredient, how are you going to take the interaction of humans and machines to the next level and make it easy?”

What will the next level be in the year 2111? “Honestly, at 100 years I’m genuinely unsure,” Meyerson said. The past century has shown that the pace of technological advancement can be highly variable, depending on what kinds of breakthroughs come to the fore. But if Meyerson had to bet on one particular game-changing technology, it would be coming up with a direct interface between computing circuits and the human brain.

“If it turns out that there is a very natural way to communicate data back and forth without being obtrusive, then the whole world changes,” he told me. This wouldn’t be a Borg-like assimilation, in which humans look increasingly like machines. Rather, the machines would blend into the human body.

Does that sound like a grand dream for the next century? Or a nightmare?

Humans become ‘pets’ in rise of the machines: Apple co-founder


Steve Wozniak: “We’re going to become the pets, the dogs of the house.” Photo: Bloomberg

theage.com.au | June 6, 2011

by Tony Bartlett

Machines have won the war and the human race is destined to become little more than house pets.

That’s the future according to one of the smartest geeks on the planet, Steve Wozniak, who co-founded Apple Computer and is convinced that in his lifetime he will see computer intelligence equal to that of humans.

The Woz is to the technological world what The Fonz was to leather jackets and denim, and when he talks, the global industry listens.

As technology explodes, humans are not going to be needed so much in the future and will settle back into a life of ease, Mr Wozniak told a business congress on the Gold Coast on Friday.

“We’re already creating the superior beings, I think we lost the battle to the machines long ago,” he said.

“We’re going to become the pets, the dogs of the house.”

He said true artificial intelligence will creep up on mankind all of a sudden, like an accident.

“Every time we create new technology we’re creating stuff to do the work we used to do and we’re making ourselves less meaningful, less relevant.

“Why are we going to need ourselves so much in the future? We’re just going to have the easy life,” he said.

Mr Wozniak said the Singularity, the point at which a machine seems like a human being, has feelings, and can think and be motivated, seemed an impossible dream to him years ago.

When he started Apple, he said, he never thought a computer would be powerful enough to hold an entire song and today we can fit 50 movies on a little disc in an iPhone.

“You don’t realise it’s happened until it’s there and I think that awareness of machines is getting very, very close and we’re getting close to where a machine will really understand you,” Mr Wozniak said.

“My comment about the machines winning the war is partly a joke, but we’ve accidentally already put so much in place that we can’t get rid of from our lives.

“Once we have machines doing our high-level thinking, there’s so little need for ourselves and you can’t ever undo it – you can never turn them off.”