With the continuous rise of big data, that's no longer good enough.
We are now entering the Cognitive Systems Era, in which a new generation of computing systems is emerging with embedded data analytics, automated management and data-centric architectures that move storage, memory, switching and processing ever closer to the data.
Whereas computers in today's programmable era essentially process long series of predefined if-then rules, cognitive systems learn, adapt, and ultimately hypothesize and suggest answers. Delivering these capabilities will require a fundamental shift in the way computing progress has been achieved for decades.
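To make the contrast concrete, here is a minimal Python sketch; the temperature readings, labels and threshold are invented purely for illustration. A programmable-era rule gives an answer fixed in advance by the programmer, while a toy classifier shifts its answer as it observes new examples.

```python
# Hypothetical illustration: a fixed if-then rule vs. a system that learns from data.
# All data and thresholds here are invented for the example.

def programmable_rule(temperature_c: float) -> str:
    # Programmable era: the answer is decided in advance by the programmer.
    return "alert" if temperature_c > 30.0 else "ok"

class LearningClassifier:
    """Cognitive-style sketch: the decision boundary is learned from labeled
    observations and keeps adapting as new examples arrive."""

    def __init__(self) -> None:
        self.alert_temps: list[float] = []
        self.ok_temps: list[float] = []

    def observe(self, temperature_c: float, label: str) -> None:
        (self.alert_temps if label == "alert" else self.ok_temps).append(temperature_c)

    def predict(self, temperature_c: float) -> str:
        # Classify by distance to the mean of each class seen so far.
        def mean(xs: list[float]) -> float:
            return sum(xs) / len(xs)
        if not self.alert_temps or not self.ok_temps:
            return "ok"  # no evidence yet
        d_alert = abs(temperature_c - mean(self.alert_temps))
        d_ok = abs(temperature_c - mean(self.ok_temps))
        return "alert" if d_alert < d_ok else "ok"

clf = LearningClassifier()
for t, label in [(21.0, "ok"), (24.0, "ok"), (35.0, "alert"), (38.0, "alert")]:
    clf.observe(t, label)

print(programmable_rule(28.0))  # always "ok": the threshold never moves
print(clf.predict(28.0))        # depends on the data it has seen so far
```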
The four characteristics of cognitive systems
They are data-centric
The volume of data produced today isn't just growing; it's arriving faster, taking more forms, and becoming increasingly uncertain in nature. Uncertainty arises from sources such as social media, imprecise data from sensors and imperfect object recognition in video streams. IBM experts believe that by 2015, 80 percent of the world's data will be uncertain.
Watson, the Jeopardy-winning system, is an early example. When Watson answers a question it analyzes uncertain data, and develops a statistical ranking and a level of confidence in its answers. It then goes "offline" for additional training to refine its capabilities. In the future, Watson will be able to engage in interactive dialog with people, develop evidence profiles revealing the source of its answers, and engage in continuous learning based on its own experiences.
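The ranking-and-confidence idea can be sketched in a few lines of Python. This is only a loose illustration of the pattern described above, not Watson's actual pipeline; the candidate answers, evidence scores and the 0.5 answering threshold are all invented.

```python
# Hedged sketch: combine evidence scores per candidate answer, normalize them
# into confidences, rank the candidates, and answer only above a threshold.
import math

def rank_with_confidence(evidence_scores: dict[str, list[float]]) -> list[tuple[str, float]]:
    """Sum per-candidate evidence scores and normalize them into
    confidences that sum to 1 (a simple softmax)."""
    combined = {cand: sum(scores) for cand, scores in evidence_scores.items()}
    z = sum(math.exp(s) for s in combined.values())
    confidences = {cand: math.exp(s) / z for cand, s in combined.items()}
    return sorted(confidences.items(), key=lambda kv: kv[1], reverse=True)

evidence = {
    "Toronto": [0.2, 0.4],  # weak support from two evidence sources
    "Chicago": [1.1, 0.9],  # stronger support
}
ranking = rank_with_confidence(evidence)
best, confidence = ranking[0]
# Only answer when the confidence clears the threshold; otherwise stay silent.
print(best if confidence > 0.5 else "no answer")
```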
These systems "scale in"
Historically, performance improvements in IT systems have come from scaling down (Moore's Law, which describes how semiconductors become denser, more powerful and more compact); scaling up (adding more powerful processors to a single system); and scaling out (linking more and more processors or entire systems together in parallel).
In cognitive systems, performance improvements will derive from scaling in: moving key components such as storage, memory, networking and processing onto a single chassis, closer to the data. Netezza and the new IBM PureSystems are the first commercially available examples of scaling in. In the future, these capabilities will move even closer to the data, scaling in computing elements first in a single drawer or card and eventually onto a single, three-dimensional chip module. This scale-in effect will reduce the latency that can occur when trying to move terabytes or exabytes of data around a computing system.
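A back-of-the-envelope calculation shows why that latency matters. The bandwidth figures below are rough assumed values, chosen only to show the orders of magnitude between a network hop and a link that sits close to the data.

```python
# Time to move 1 TB across links of different bandwidths.
# The bandwidth numbers are illustrative assumptions, not measured figures.
TERABYTE = 1e12  # bytes

links_bytes_per_second = {
    "10 GbE network hop":         1.25e9,  # ~10 Gb/s
    "PCIe-class in-chassis link": 16e9,    # assumed ~16 GB/s
    "on-package memory path":     200e9,   # assumed ~200 GB/s
}

for name, bandwidth in links_bytes_per_second.items():
    seconds = TERABYTE / bandwidth
    print(f"{name:28s} {seconds:8.1f} s to move 1 TB")
```

Under these assumptions, the same terabyte that takes minutes to cross a network link moves in seconds once the computing elements sit next to the data, which is the whole point of scaling in.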
They automate system and workload management
Deploying applications in an enterprise environment often requires that multiple virtual machines be configured manually, a complex, time-intensive process prone to error. For the new PureSystems, IBM Research scientists developed software tools to create and manipulate blocks of code so users can drag-and-drop the pieces they need for compute power, storage and software applications. And the blocks already know how to connect to one another and across multiple virtual machines.
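The actual PureSystems tooling isn't spelled out here, but the underlying pattern of "blocks that already know how to connect" can be sketched generically: each block declares what it provides and what it requires, and the wiring across virtual machines is resolved automatically. Everything in this Python sketch, including the block and service names, is a hypothetical illustration of that pattern.

```python
# Generic sketch of declarative composition: blocks declare provides/requires,
# and connections are resolved automatically. Not the actual IBM tooling.
from dataclasses import dataclass, field

@dataclass
class Block:
    name: str
    provides: set[str] = field(default_factory=set)
    requires: set[str] = field(default_factory=set)

def wire(blocks: list[Block]) -> list[tuple[str, str, str]]:
    """Connect each requirement to a block that provides it."""
    providers = {svc: b.name for b in blocks for svc in b.provides}
    connections = []
    for b in blocks:
        for svc in b.requires:
            if svc not in providers:
                raise ValueError(f"{b.name} needs '{svc}' but nothing provides it")
            connections.append((b.name, svc, providers[svc]))
    return connections

blocks = [
    Block("web-app",  provides={"http"},     requires={"database", "cache"}),
    Block("postgres", provides={"database"}),
    Block("redis",    provides={"cache"}),
]
for consumer, service, provider in wire(blocks):
    print(f"{consumer} -> {service} @ {provider}")
```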