Imagine Watson with reasoning ability and better communication skills.
The Watson supercomputer may be able to beat reigning Jeopardy champions, but scientists at IBM (IBM) are developing new, super-smart computer chips modeled on the human brain -- and those might ultimately prove much more impressive.
These new silicon “neurosynaptic chips,” which will run on about the same amount of energy it takes to power a light bulb, will fuel a software ecosystem that researchers hope will one day enable a new generation of apps mimicking the human brain’s abilities of sensory perception, action and cognition.
It’s akin to giving sensors like microphones and speakers brains of their own, letting them process the data they take in through trillions of synapses and neurons and draw intelligent conclusions from it.
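To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of spiking computation such a chip performs: a single “leaky integrate-and-fire” neuron that accumulates weighted input spikes and fires when its charge crosses a threshold. This is not IBM’s actual programming model, which the article does not describe; the constants are invented for illustration.

```python
# Hypothetical sketch of one spiking neuron, the basic unit of
# "neurosynaptic" computation. Weights and thresholds are invented.

def simulate_neuron(input_spikes, weight=0.4, leak=0.9, threshold=1.0):
    """Accumulate weighted input spikes, leak charge over time, and
    emit an output spike whenever the potential crosses the threshold."""
    potential = 0.0
    output = []
    for spike in input_spikes:
        potential = potential * leak + weight * spike
        if potential >= threshold:
            output.append(1)   # fire
            potential = 0.0    # reset after firing
        else:
            output.append(0)
    return output

# A steady burst of input spikes drives the neuron to fire intermittently.
print(simulate_neuron([1, 1, 1, 0, 1, 1, 0, 0, 1, 1]))
```

Real neurosynaptic hardware wires vast numbers of such units together, so that computation happens in the pattern of spikes rather than in a stored sequence of instructions.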
IBM’s ultimate goal is to build a chip ecosystem with ten billion neurons and a hundred trillion synapses that consumes just a kilowatt of power and occupies less space than a two-liter soda bottle.
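Those figures imply a striking per-component power budget. A quick back-of-envelope check, using only the numbers stated in the article:

```python
# Derive the per-neuron and per-synapse power budgets implied by
# IBM's stated goal: 10 billion neurons, 100 trillion synapses, 1 kW.

neurons  = 10_000_000_000        # 10 billion
synapses = 100_000_000_000_000   # 100 trillion
power_w  = 1_000.0               # 1 kilowatt

print(f"Power per neuron:  {power_w / neurons * 1e9:.0f} nW")    # ~100 nW
print(f"Power per synapse: {power_w / synapses * 1e12:.0f} pW")  # ~10 pW
```

Roughly 100 nanowatts per neuron and 10 picowatts per synapse -- orders of magnitude below what conventional processors spend per operation.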
“We are fundamentally expanding the boundary of what computers can do,” said Dharmendra Modha, principal investigator of IBM’s SyNAPSE cognitive computing project. “This could have far-reaching impacts on technology, business, government and society.”
The researchers envision a wave of new, innovative “smart” products derived from these chips that would alter the way humans live in virtually all walks of life, including commerce, logistics, location, society, even the environment.
“Modern computing systems were designed decades ago for sequential processing according to a pre-defined program,” IBM said in a release. “In contrast, the brain—which operates comparatively slowly and at low precision—excels at tasks such as recognizing, interpreting, and acting upon patterns.”
These chips would give rise to a whole new “cognitive” type of processing, said Bill Risk, who works on the IBM Research SyNAPSE Project, marking one of the most dramatic changes to computing since the traditional von Neumann architecture, built on the sequential processing of zeros and ones, was adopted in the mid-1940s.
“These operations result in actions rather than just stored information, and that’s a whole different world,” said Roger Kay, president of Endpoint Technologies Associates, who has written about the research. "It really allows for a human-like assessment of problems."
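The contrast can be sketched in a few lines of toy Python: a von Neumann-style routine marches through a pre-defined sequence of steps, while a pattern-oriented routine reacts only when incoming events accumulate into something recognizable. Both functions are illustrative inventions, not IBM’s architecture.

```python
# Sequential, pre-defined program: the same steps in the same order,
# every time -- the von Neumann style the article describes.
def von_neumann_style(data):
    total = 0
    for value in data:   # fetch, execute, repeat
        total += value
    return total

# Event-driven: work happens only when events arrive, and the response
# depends on the accumulated pattern of past events.
def cognitive_style(events, threshold=3):
    activity = {}
    reactions = []
    for event in events:
        activity[event] = activity.get(event, 0) + 1
        if activity[event] >= threshold:   # a pattern has emerged
            reactions.append(f"recognized repeated '{event}'")
            activity[event] = 0
    return reactions

print(von_neumann_style([1, 2, 3]))
print(cognitive_style(["beep", "beep", "hiss", "beep"]))
```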
[...]
Providing a real-life example of how their partnership might one day work, Kay imagined a medical professional performing triage on a patient.
Digital computers would handle basic functions such as monitoring the patient’s vitals, while the cognitive computer would cross-reference data collected at the scene in real time with information stored on the digital computer to assess the situation and provide relevant treatment recommendations.
“It could be a drug overdose or an arterial blockage, a human might not know which is which [from the naked eye],” explains Kay. “But a [cognitive] computer could read the symptoms, reference literature, then vote using a confidence level that can kind of infer which one is more likely the case.”
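A toy sketch of the confidence-weighted “voting” Kay describes might look like the following. The symptoms and evidence weights are entirely hypothetical, invented for illustration rather than drawn from any medical source.

```python
# Hypothetical evidence table: how strongly each observed symptom
# points toward each of two competing diagnoses. Weights are invented.
SYMPTOM_EVIDENCE = {
    # symptom: (weight for overdose, weight for arterial blockage)
    "pinpoint pupils": (0.9, 0.1),
    "chest pain":      (0.2, 0.8),
    "slow breathing":  (0.7, 0.3),
}

def triage_vote(symptoms):
    """Tally evidence for each diagnosis and report the more likely
    one along with a simple confidence score."""
    overdose = blockage = 0.0
    for s in symptoms:
        od, bl = SYMPTOM_EVIDENCE.get(s, (0.0, 0.0))
        overdose += od
        blockage += bl
    total = (overdose + blockage) or 1.0
    if overdose >= blockage:
        return "drug overdose", overdose / total
    return "arterial blockage", blockage / total

diagnosis, confidence = triage_vote(["pinpoint pupils", "slow breathing"])
print(f"{diagnosis} (confidence {confidence:.0%})")
```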
By: Fox News