Friday, September 25, 2009
Thursday, September 10, 2009
Here’s the book’s prologue, about a technology called BrainGate (excerpt with permission):
...Imagine this: you are waking up. As your eyes focus, you see a white-haired man in a lab coat congratulating you on a successful surgery. You are still groggy from the anesthesia and can’t quite remember what happened. The man enthusiastically explains that he is a scientist and that your surgery has previously been performed only on rats and rhesus monkeys. But with the help of a neurosurgeon, it has now been performed on yet another animal – a guinea pig – and that happens to be you.
Before you can gather your thoughts, the scientist makes an odd request: “Could you please turn off the lights?” As you look around the room, you don’t see a light switch. But just as the thought crosses your mind, the lights go off. Smiling, he asks you to turn the lights back on. You think of it momentarily, and they snap on. He smiles again. “The brain implant has worked!”
If this scenario seems like science fiction, I assure you that it has far more science than fiction. In fact, this technology exists today. The scientist is John Donoghue, chair of the Neuroscience Department at Brown University. He, along with his colleagues, has invented an implantable device called BrainGate that allows people to use their minds to control electronics such as computers.
I was introduced to BrainGate when I began my doctoral program in cognitive science at Brown. As I soon learned, the brain uses electrical and chemical charges to communicate with itself and the rest of the body. The idea behind BrainGate is actually quite simple: by tapping into the brain’s electrical signals, doctors can channel them outward to control other electrical devices, in the same way your TV remote allows you to change the channel without leaving your couch. After numerous animal trials (if you imagine monkeys running down the hallowed halls of Brown turning out the lights using brain waves, you will not be far off), BrainGate was approved by the FDA for clinical trials on humans. The immediate goal was to provide more mobility for those with severe motor dysfunction, such as quadriplegics and Parkinson’s patients.
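To make the “tapping in” idea a bit more concrete: the published BrainGate trials decoded motor-cortex firing rates into cursor velocity using linear filters. The following is only a toy stand-in for that step, with every number and dimension invented; the real system’s calibration and filtering are far more involved.

```python
import numpy as np

# Toy sketch of linear neural decoding in the spirit of BrainGate-style
# systems: learn a mapping from neural firing rates to 2-D cursor velocity.
# All sizes, rates, and noise levels here are invented for illustration.

rng = np.random.default_rng(0)

n_neurons, n_samples = 20, 500
true_weights = rng.normal(size=(n_neurons, 2))      # each neuron "votes" for a direction

# Simulated calibration data: spike counts per time bin, plus the cursor
# velocities the patient was asked to imagine producing.
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))

# Calibration phase: fit the decoder by least squares on (rates, velocity) pairs.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Online phase: turn a fresh window of firing rates into a cursor command.
new_rates = rng.poisson(5.0, size=(1, n_neurons)).astype(float)
cursor_velocity = new_rates @ decoder
print(cursor_velocity.shape)  # (1, 2): an (x, y) velocity for the cursor
```

The point of the sketch is only that the mapping from brain signal to device command can be as plain as a matrix multiply once the decoder has been calibrated.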
Once I became familiar with these ideas, I urged one of Donoghue’s students to start a company. That company was quickly funded by a venture capitalist. It started human trials and quietly made its debut on the NASDAQ exchange a few years later. The first clinical trial in 2004 involved a paralyzed man who is now able to control a computer cursor with his mind. The lead surgeon, another professor at Brown, described the results as “almost unbelievable.” I suspect he added the word “almost” out of deference to Donoghue. Four other patients have since been implanted, all with remarkable success. The results were published in the venerable journal Nature in 2006 (Nature had published the animal trials in 2002).
Why does this story sound outrageous? It is mainly because – as the doctor who performed the surgery has said – the idea is too hard to believe. We think of the brain as something beyond our comprehension, so we dismiss the notion that it obeys the laws of science. Here is the way CBS’s 60 Minutes put it when it featured BrainGate in 2008: “Once in a while, we run across a science story that is hard to believe until you see it. That’s how we felt about this story when we first saw human beings operating computers, writing e-mails, and driving wheelchairs with nothing but their thoughts.”
The brain, however, is understandable. It is nothing more than a biological machine.
Wednesday, September 09, 2009
Tuesday, September 08, 2009
One of the most exciting, gee-whiz features being developed for mobile phones right now is the augmented reality browser. Rather than firing up a mobile Web browser like Safari or Opera, these apps generally add an information layer over the world as seen through your phone’s camera lens. Last year at TechCrunch50, Tonchidot’s Sekai Camera wowed the crowd with its AR browser demo, Layar is creating a lot of buzz in Europe, and this summer AR technologies finally started to hit the market. Yelp snuck an AR feature into its latest iPhone app, and a growing number of Android apps are embracing AR as well.
We are at the very early stages of what may very well become a common interface for mobile browsing, which means that it is still very primitive. You can only click on buildings or objects within your immediate view. Daniel Wagner, a virtual reality researcher at Graz University of Technology in Austria, is proposing two ways to make AR browsing better: panoramic and bird’s-eye zooming.
In his demo video, he shows two zooming techniques that let the user zoom out to see what else is around him, much as he would with an online map, select something to click on—maybe the museum two blocks over—and then zoom back in. The panoramic zoom gives the user a sense of other clickable items within a 360-degree view, whereas the bird’s-eye view gives a top-down picture that looks like a close-up satellite shot with clickable locations.
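Under the hood, the placement step these browsers all share is simple geometry: given the user’s GPS fix and the phone’s compass heading, compute the bearing to each point of interest and check whether it falls inside the camera’s horizontal field of view. A minimal sketch (the coordinates, the example museum, and the 60° field of view are all invented):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def in_view(poi_bearing, heading, fov_deg=60):
    """True if the POI's bearing falls inside the camera's horizontal field of view."""
    diff = (poi_bearing - heading + 180) % 360 - 180   # signed angle in (-180, 180]
    return abs(diff) <= fov_deg / 2

# Hypothetical user in Providence with the phone pointed roughly east,
# and a made-up museum POI a few blocks away.
user = (41.8268, -71.4025)
museum = (41.8270, -71.3950)
b = bearing_deg(*user, *museum)
print(in_view(b, heading=90))   # museum is nearly due east, so it is on screen
```

A panoramic zoom, in these terms, is just widening `fov_deg` to 360 so every POI’s bearing becomes visible at once.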
Okay, so now we have panning and zooming. What someone needs to figure out next is an elegant way to hyperlink from one hotlinked location to another. That way you could teleport, at least virtually.
Monday, September 07, 2009
Imagine a small device that you wear on a necklace that takes photos every few seconds of whatever is around you, and records sound all day long. It has GPS and the ability to wirelessly upload the data to the cloud, where everything is time-stamped and geotagged and the sound files are automatically transcribed and indexed. Photos of people, of course, would be automatically identified and tagged as well.
Imagine an entire lifetime recorded and searchable. Imagine if you could scroll and search through the lives of your ancestors.
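What “recorded and searchable” might mean in practice: each capture becomes a record stamped with time and place, with the audio transcript indexed for keyword search. A back-of-the-envelope sketch, where every field name and sample record is invented:

```python
from dataclasses import dataclass, field

@dataclass
class LifeRecord:
    timestamp: float          # seconds since epoch, from the device clock
    lat: float                # GPS fix at capture time
    lon: float
    photo_id: str             # pointer to the uploaded image in the cloud
    transcript: str = ""      # auto-transcribed audio for this interval
    people: list = field(default_factory=list)  # auto-tagged faces

def search(records, keyword=None, start=None, end=None):
    """Return records matching a transcript keyword and/or a time window."""
    hits = []
    for r in records:
        if keyword and keyword.lower() not in r.transcript.lower():
            continue
        if start is not None and r.timestamp < start:
            continue
        if end is not None and r.timestamp > end:
            continue
        hits.append(r)
    return hits

records = [
    LifeRecord(1252300000, 41.82, -71.40, "img_001", "meeting about the prototype"),
    LifeRecord(1252303600, 41.83, -71.41, "img_002", "lunch with Sam", ["Sam"]),
]
print([r.photo_id for r in search(records, keyword="prototype")])  # ['img_001']
```

Searching an ancestor’s life would be the same query run over decades of these records instead of two.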
Would you wear that device? I think I would. I can imagine that advances in hardware and batteries will soon make these as small as you like. And I can see them becoming as ubiquitous as wrist watches were in the last century. I see them becoming customized fashion statements.
Privacy disaster? You betcha.
But ten years ago we would have been horrified by what we now nonchalantly share on Facebook and Twitter every day. I always imagine what a family in the ’70s would think about all of their photo albums being posted on computers, available for the entire world to see. They’d be horrified; they couldn’t even imagine it. Heck, a life recorder is a smaller step in abandoning privacy than the ones we’ve already taken with the Internet and electronic surveillance in general.
It’s clunky today and doesn’t do most of the things I mentioned in the first paragraph above. But a true life recorder that isn’t a fashion tragedy isn’t that far away.
In fact I’ve already spoken with one startup that has been working on a device like this for over a year now, and may go to market with it in 2010.
The hardware is actually not the biggest challenge. How it will be stored, transcribed, indexed and protected online is. It’s a massive amount of data that only a few companies (Microsoft, Google, Amazon) are equipped to really handle anytime soon.
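Some rough arithmetic shows why the storage side is the hard part. Under invented but plausible assumptions for the era (one 200 KB photo every 5 seconds over a 16-hour waking day, plus compressed audio at 32 kbit/s around the clock):

```python
# Back-of-the-envelope data volume for a life recorder.
# All rates below are assumptions, not specs from any real device.

SECONDS_AWAKE = 16 * 3600
photos_per_day = SECONDS_AWAKE // 5                # 11,520 photos
photo_bytes = photos_per_day * 200 * 1024          # ~2.4 GB of images per day

audio_bytes = 24 * 3600 * 32_000 // 8              # ~350 MB of audio per day

per_day_gb = (photo_bytes + audio_bytes) / 1e9
per_lifetime_tb = per_day_gb * 365 * 80 / 1000     # an 80-year lifetime

print(f"{per_day_gb:.1f} GB/day, about {per_lifetime_tb:.0f} TB per lifetime")
# → 2.7 GB/day, about 79 TB per lifetime
```

Tens of terabytes per user, times millions of users, before transcription and face-tagging even start, is exactly the scale only a Microsoft, Google, or Amazon could absorb.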
But these devices are coming. And you have to decide if you’ll be one of the first or one of the last to use one.
Will you wear one? I will.
Thursday, September 03, 2009
Future of the Screen: Terminator-Style Augmented-Reality Glasses:
The most efficient possible display technology would be something that bypasses the eyes altogether and sends information straight to the brain. Sadly, cranial USB ports are still pretty hard to install. The second most efficient possible display technology anyone's devised projects images directly into the eye. The dream of a wearable virtual retinal display, or VRD, has been around for nearly two decades; it's on the horizon, but it's still going to be a while until it gets here.
The idea of VRD was first tossed around at the University of Washington's Human Interface Technology Lab back around 1991. Thomas Furness, who'd been working on helmet-based displays for the Air Force in the '80s, and research engineer Joel Kollin were part of the team that put together the initial (and enormous) prototype. The concept was that tiny, ultra-low-power lasers could paint an image onto the human retina by scanning across it at high speed, essentially treating it as a tiny TV screen. If you could assemble a set of microscopic red, blue and green lasers, stick them where they could project onto your eyes, and hook them up to a computer, you could still see whatever you'd normally see, but with three-dimensional, full-color displays of additional information or imagery overlaid on the visible world—an effect called "augmented reality." Think of Arnold Schwarzenegger's sunglasses in Terminator, and you're on the right track.
Prof. Steven Feiner, of Columbia University's computer science department, notes that the potential advantages of retinal displays are energy-efficiency and unobtrusiveness: 'What many of us want is something you're always wearing so that you can experience overlaid stuff, as opposed to having to put something on.' There is clearly some money to be made with augmented reality, and a Seattle-area company called Microvision has been working on commercial applications of the HITLab's VRD concepts since the early '90s. (More recently, the Japanese printer company Brother Industries has been developing a similar technology, which it calls 'retinal imaging display.') The military has paid Microvision to research VRD eyewear for soldiers and pilots, who need to have a lot of information instantly accessible in addition to what's in front of their eyes.
But there are plenty of day-to-day civilian uses for an unobtrusive, full-color "heads-up" display—one that wouldn't require looking away from its users' physical, nonvirtual surroundings. A mobile phone could have a "screen" as large as its user's visual field. Driving directions could appear in front of your eyes while you're looking at the road, even in bright daylight. Cooking wouldn't require shuttling your attention between the stove and a cookbook. Hearing-impaired people could see voice-recognition transcriptions of what people around them were saying. Surgeons could keep watch on their patients' vital signs and medical reference texts without looking away from an operation.
So where are your Terminator shades? In 1992, Furness and Kollin claimed that it would be at least five years until full-on VRD was a reality, and it's been considerably more than that. One problem is that people's eyes don't stand still—in practice, projecting an image onto a retina is like trying to project a movie onto a moving screen. Another is that, while the wearable part of the system may be small, the gear that needs to be hooked up to it is still gigantic; if it's not portable, it's not very useful.
Still, Dr. Bruce H. Thomas, the director of the Wearable Computer Lab at the University of South Australia, believes that "in the near future we might actually see head-mounted displays become consumer products because of iPods—a legitimate video delivery unit that lots of people carry around with them." In the meantime, primitive VRD has begun to appear in the real world. Microvision released the Nomad Expert Technician System in 2004. (It cost $4,000 a unit and only projected images in red; Honda ordered some for their training centers, but the NETS never caught on, and was discontinued by 2006.) And Brother announced last year that it was hoping to make their retinal imaging display device commercially available sometime in 2010. Maybe by then it'll be small enough for a non-Schwarzenegger-sized person to carry around.
Ars Technica reports: A series of technical breakthroughs has led to myriad new ways to look at what used to be peripheral — today there is a new display technology for every possible use, and a possible use for every new display technology.