You can date the first modern era of computing, in which massive
mainframes like ENIAC were put to work on math and business problems too
complex for the simple counting machines that came before, to a series
of talks about computer science in the late 1940s.
Likewise, you can mark the moment technology started to move away
from those days of Big Iron toward the era of the personal computer as
Dec. 9, 1968, when Douglas Engelbart
introduced the computer mouse, word processing, hypertext and video
conferencing at an event in San Francisco dubbed “The Mother of All
Demos.”
On Nov. 19, IBM held what it hopes will be another such watershed conference at its
Almaden Research Center in San Jose — a colloquium on emerging computing
technologies modeled on how the human mind works. The talks, entitled
“Cognitive Systems: The New Era of Computing,” may well usher in just
such an era.
“What we think of this event as is a kind of open parenthesis on the cognitive computing era,” said Michael Karasick,
an IBM VP and head of the Almaden Research Center. “We don’t necessarily
know where it’s going, but we want to get people thinking about these
technologies and what’s now becoming possible.”
Cognitive computing is a branch of computer science that seeks to
create computers that process data much the way an organic brain
does. It’s more of an umbrella term than a
specific technology, touching on topics like machine learning,
artificial intelligence, and computational creativity.
Broadly speaking, these systems are better than traditional computers
at the things organic brains excel at. Chief among these is the ability
to learn, which lets them figure out how to perform tasks far too
complicated for a human developer to model by hand, such as language
processing or image recognition.
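To make that idea concrete, here is a minimal sketch in Python, using scikit-learn, of learning by example rather than by hand-written rules: nobody codes up what a handwritten “7” looks like; a small neural network infers it from labeled images. The example is illustrative only and is not drawn from IBM’s talks; the dataset, model, and parameters are our own choices.

    # A minimal sketch of learning from examples instead of hand-coded rules.
    # Assumes scikit-learn is installed; the digits dataset ships with it.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    digits = load_digits()  # 1,797 8x8 grayscale images of handwritten digits
    X_train, X_test, y_train, y_test = train_test_split(
        digits.data, digits.target, test_size=0.25, random_state=0)

    # No human writes rules describing each digit; the network infers them
    # from the labeled training examples.
    model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    model.fit(X_train, y_train)

    print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")

The same pattern, swapping the dataset and the model, is what scales up to the language processing and image recognition tasks mentioned above.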
By: Jon Xavier