Tuesday, May 21, 2013

Deep Learning and Memory Implants, Oh My! MTR's 10 Breakthrough Technologies of 2013

MIT Technology Review (MTR) magazine publishes an annual list of 10 technology breakthroughs. This year's list is laid out in their introduction to the full set at http://www.technologyreview.com/featuredstory/513981/introduction-to-the-10-breakthrough-technologies-of-2013/:
Our definition of a breakthrough is simple: an advance that gives people powerful new ways to use technology. It could be an intuitive design that provides a useful interface (see “Smart Watches”) or experimental devices that could allow people who have suffered brain damage to once again form memories (“Memory Implants”). Some could be key to sustainable economic growth (“Additive Manufacturing” and “Supergrids”), while others could change how we communicate (“Temporary Social Media”) or think about the unborn (“Prenatal DNA Sequencing”). Some are brilliant feats of engineering (“Baxter”). Others stem from attempts to rethink longstanding problems in their fields (“Deep Learning” and “Ultra-Efficient Solar Power”). As a whole, we intend this annual list not only to tell you which technologies you need to know about, but also to celebrate the creativity that produced them.
Each link in the introduction above leads to an article-length exploration of the breakthrough. Deep Learning and Memory Implants caught my eye. Deep learning uses computer processing power to simulate neural networks (a toy sketch of the idea appears after the quote below); the article focuses on Ray Kurzweil's work at Google, where deep learning has driven dramatic progress in speech and image recognition. The image recognition will be key to Google's self-driving cars and to indexing images and video; the speech software does the same for sound, including human speech. Kurzweil's next area of inquiry? Natural-language understanding:
    Kurzweil isn’t focused solely on deep learning, though he says his approach to speech recognition is based on similar theories about how the brain works. He wants to model the actual meaning of words, phrases, and sentences, including ambiguities that usually trip up computers. “I have an idea in mind of a graphical way to represent the semantic meaning of language,” he says.
    That in turn will require a more comprehensive way to graph the syntax of sentences. Google is already using this kind of analysis to improve grammar in translations. Natural-language understanding will also require computers to grasp what we humans think of as common-sense meaning. For that, Kurzweil will tap into the Knowledge Graph, Google’s catalogue of some 700 million topics, locations, people, and more, plus billions of relationships among them. It was introduced last year as a way to provide searchers with answers to their queries, not just links.
    Finally, Kurzweil plans to apply deep-learning algorithms to help computers deal with the “soft boundaries and ambiguities in language.” If all that sounds daunting, it is. “Natural-language understanding is not a goal that is finished at some point, any more than search,” he says. “That’s not a project I think I’ll ever finish.”
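As a rough illustration of what "simulating neural networks" means, here is a minimal sketch, assuming nothing more than Python and NumPy: a tiny two-layer network that learns the classic XOR function by gradient descent. It is a toy, nothing like the Google-scale systems the article describes, but the mechanism -- layers of weights nudged to reduce error -- is the same in spirit.
    # Toy two-layer neural network learning XOR with plain NumPy.
    # Real deep-learning systems use many more layers, vastly more data,
    # and specialized hardware; this only shows the basic mechanism.
    import numpy as np

    rng = np.random.default_rng(0)

    # Inputs and targets for XOR, a problem a single linear layer cannot solve.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Randomly initialized weights: 4 hidden units, 1 output unit.
    W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0
    for step in range(5000):
        # Forward pass: compute the network's predictions.
        h = sigmoid(X @ W1 + b1)      # hidden-layer activations
        out = sigmoid(h @ W2 + b2)    # outputs in (0, 1)

        # Backward pass: gradients of squared error via the chain rule.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Gradient-descent updates.
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

    print(np.round(out, 2))  # predictions should approach [0, 1, 1, 0]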
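And the Knowledge Graph Kurzweil plans to tap is, stripped of its 700 million topics and billions of relationships, at heart a store of entity-and-relationship facts. Here is a minimal sketch of that idea -- the facts and relation names are made up for illustration, not Google's actual data or API:
    # Toy sketch of a knowledge graph: facts stored as
    # (subject, relation, object) triples, queried directly so the
    # system can return an answer rather than a list of links.
    from collections import defaultdict

    triples = [
        ("Ray Kurzweil", "works_at", "Google"),
        ("Google", "headquartered_in", "Mountain View"),
        ("Knowledge Graph", "created_by", "Google"),
        ("Deep Learning", "is_a", "machine-learning technique"),
    ]

    # Index triples by subject for quick lookup.
    by_subject = defaultdict(list)
    for subj, rel, obj in triples:
        by_subject[subj].append((rel, obj))

    def answer(subject, relation):
        """Return every object linked to `subject` by `relation`."""
        return [obj for rel, obj in by_subject[subject] if rel == relation]

    print(answer("Ray Kurzweil", "works_at"))    # ['Google']
    print(answer("Google", "headquartered_in"))  # ['Mountain View']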
Imagine what kind of grammar checker natural-language understanding at that level would produce. Or maybe it could power an auto-writer, as well as writing-assessment tools -- ask the computer (speech recognition, remember; no need to use a mouse) to create a report on the latest trends, and it arrives in your inbox in a day or two, freeing your assistant to do proper things, like picking up your dry-cleaning and getting your lunch.

And then, if you really want to get sci-fi with it, consider the memory implant, the idea of which is to embed a chip that restores long-term memory for people who have lost that ability (perhaps from traumatic injury, or Alzheimer's). Once that's possible, would there be a service that offered to preserve your memories on a chip, so that should you suffer a stroke and receive a memory implant, it would provide not just the ability to form new long-term memories but also restore key prior ones? What if the implant could give you memories you never had but would like or need, such as, oh, maybe remembering the contents of that history textbook you never got around to reading before the final exam?