vendredi 29 novembre 2013

A boy and his atom: the world's smallest movie

Technology rules our world. And to continue pushing boundaries, we need future generations to embrace technology's foundation — science. This pursuit, to get today's youth to admire scientists the way they admire athletes and actors, became IBM's mission. So they asked us: how do we spread the word about science? Our solution was to make the world's smallest movie. Each frame is made of hundreds of atoms (yes, real atoms), moved to their exact placements by the scientists at IBM Research — Almaden. The frames are combined into an animation, which is now the Guinness World Records™ record holder for World's Smallest Stop-Motion Film. The scientists themselves made the movie, along with promotional materials like movie posters (all also made with atoms). And buzz was generated by word of mouth — we premiered the film live in a Brooklyn science class before it went live to the world as part of the Tribeca Online Film Festival, then promoted it on social media networks with bonus content like additional video shorts and infographics. Proof that when you create something unique, it gets people talking.

The audience was drawn in by paid media like banners, print ads, and online video pre-roll, and by word-of-mouth social media initiatives and press outlets like Wired, BuzzFeed, and Mashable. All assets were compelling to the audience because they teased the most impressive aspect of our content piece: it was made with atoms.

IBM loved this simple, shareable way to spread the word about science and data storage, and the film was accepted into the Tribeca Online Film Festival and shown at the New York Tech Meet-up and the World Science Festival. The film, now in the top 1% of all most-watched YouTube videos, surpassed 1 million views in 24 hours and 2 million views in 48 hours, with more than 27,000 likes. It was trending on Twitter on its release day and totaled more than 21,000 social media mentions with 96% positive sentiment, increasing IBM social mentions by 137%. As of submission, the film had garnered 2.4 million news impressions (not including television coverage) and 23.6 million impressions overall, effectively reaching all targets — the science, tech, film, education and entertainment communities — with a strong global reach (33% of online activity from Europe, 10% from China and 8% from India, Australia and Japan).





 

IBM's supercomputer Watson is now your best shopping companion



It’s been more than two years since IBM’s Watson made its hit TV debut on Jeopardy! Now Watson powers the latest “cognitive, expert personal shopper,” developed by digital shopping company Fluid, an early-stage partner in the IBM Watson Developers Cloud.

Their Watson-powered app—the Fluid Expert Personal Shopper—pairs Watson’s transformational cognitive computing with dialogue-driven assistance, product recommendations and content, letting consumers pose queries through Watson’s natural language processing (NLP).

But, “Watson’s natural language capabilities are not what make it special,” Brooke Aguilar, VP of Global Business Development at Fluid, told brandchannel. “The big differentiator is that Watson is a learning machine and one that quickly learns from and adapts with each interaction. In the context of digital shopping this presents a tremendous opportunity for Fluid to give consumers highly engaging and rich shopping experiences that are personalized and become more so with continued use.”

The app incorporates consumer information to become smarter with each interaction and operates as a knowledgeable sales associate in the palm of your hand, fueling a new era of cognitive apps. 

IBM is sharing its technology with the global entrepreneurial community to help build the next generation of apps. “The move aims to spur innovation and fuel a new ecosystem of entrepreneurial software application providers—ranging from start-ups and emerging, venture capital backed businesses to established players," according to a press release.
 
The North Face is the first brand to sign on to the Fluid Expert Personal Shopper. “Say you’re on the TheNorthFace.com site. You’re planning a two-week camping trip to Wyoming’s Wind River mountain range in May. You could ask, ‘What gear and food will I need?’ and learn what you’ll want to know so you can have a safer, more comfortable experience—such as the fact that it often snows in the Wind Rivers in springtime.”

Since that now famous public debut in 2011, Watson has seen a 240 percent improvement in system performance and a 75 percent reduction in the physical resources needed to run it; the system can now operate from a single Power 750 server running Linux in a cloud computing environment.

The biggest challenge to the Fluid team in building the app was adapting “the Watson system and team to accommodate an entirely new vertical market focus,” said Aguilar. “Previously, the majority of effort commercializing Watson technology has been in healthcare. Adapting Watson to direct consumer access in a retail context requires new thinking and creative approaches.”

The future of apps like Watson in retail, Aguilar added, is to “serve the rich, relevant content so consumers can make smart, satisfying purchases in a totally natural context that begins to break down the consumer/computer barrier. It puts the power in the hands of the consumer like never before, and it gets better every time.”



By: Sheila Shayon
Link: http://www.brandchannel.com/home/post/2013/11/28/IBM-Watson-Fluid-App-112813.aspx

mercredi 27 novembre 2013

Airtel, HCL, Tata Motors deploy IBM’s cloud software

IBM said that Bharti Airtel, HCL and Tata Motors are using its cloud-based talent management software to improve productivity. 

Financial details of IBM’s arrangements with the three Indian companies were not disclosed. 

In partnership with IBM, HCL has replaced its homegrown candidate tracking system with a new solution that automates the recruitment process, according to a press statement.

Tata Motors has teamed up with IBM to study ‘job fit’ within the organisation by using an online system that can screen candidates for the right position. 

Bharti Airtel is using IBM’s Kenexa survey to understand factors driving employees to be more productive and stay with the company for a longer period of time. 

About 70 per cent of Chief Executive Officers cite human capital as the single biggest contributor to sustained economic value, according to a recent global study by IBM. 



lundi 25 novembre 2013

IBM's Big Bets: Cognitive and Analytics

IBM boasts an extensive portfolio of products and services to help organizations develop analytics solutions to gain business advantage and to improve the human condition.

At IBM Information on Demand (“IOD”) in Las Vegas the first week of November, IBM SVP and Group Executive Steve Mills, and other IBM executives, unveiled 30 announcements that identified new and updated products and services to augment the IBM portfolio.

IBM also spent some time at IOD educating attendees about the commercialization of its other big bet, cognitive computing (perhaps an even more strategic bet than analytics), operating under the now famous IBM Watson moniker. IBM, however, held its next major step, opening up the Watson ecosystem to developers, in abeyance until one full week after the end of IOD.

IBM’s two big bets, cognitive and analytics, are situated at entirely different points of their respective lifecycles. Analytics, and related information management solutions, are commercially here and now. Cognitive computing has only recently hatched from the research side of R&D, and has nearly 100% of its commercial life still ahead of it. How do analytics and cognitive fit together into the larger IBM strategy to continuously help customers apply technology to the benefit of business and people?

Analytics

Despite the flurry of announcements, one key message came through loud and clear at IOD: IBM believes that analytics is a primary game-changer for businesses, that customers should bet big on analytics, and preferably that they should do so by buying, and getting help applying, IBM's analytics and information management offerings. To IBM's credit, its analytics consultants receive no special compensation for selling IBM's products — their primary objective is customer success. IBM even puts its money where its mouth is by entertaining the use of value-based pricing when applicable.
The IOD keynotes stirred attendee passion for analytics through a mix of IBM executive, customer, and partner presentations and interviews. Mr. Mills aptly drove the point home at a news briefing, stating that we are entering an “era of decision-making excellence” and that we are just “… at the beginning of the revolution…” Despite IBM presenting a large number of analytics customer case studies, “You ain’t seen nothing yet,” according to Mr. Mills. He cited decreasing hardware costs as a contributing factor in the rise of analytics and predicted that businesses would eventually spend more on analysis and prediction than on process automation.
Mr. Mills had better be correct about the demand cycle, for IBM has placed a giant wager on analytics. IBM employs over 9,000 business analytics consultants and has helped customers implement over 3,000 big data deployments to date. Though IBM has experienced five consecutive declines in year-over-year quarterly revenue, business analytics has grown 8% year over year through the first three quarters of FY13. IBM has made, and will continue to make, organic R&D investments “measured in the billions.” Dating back to and including the 2009 purchase of SPSS, IBM has made 43 acquisitions over the past 40 months, with more than half fitting into analytics or related information management spaces.
IBM hinted that it is in no way ready to rest on its laurels, and customers should expect ongoing innovation in both analytics and related information management spaces such as databases, business process automation, integration, data governance, and content management. Some complain that IBM's portfolio is difficult to navigate because there are so many options and sub-brands. IBM exhibited awareness of the issue, and customers may see it realign and simplify the portfolio during 2014.
If the realignment, and the considerable evangelism on display at IOD (which will be called IBM Insight in 2014), make it simpler for customers to grasp the benefits and move toward solutions with IBM more quickly, IBM will happily continue to make such investments. Analytics in 2013, 2014, and probably for several years thereafter will be looked to by IBM's executives and shareholders as a linchpin for moving IBM revenues in a more northerly direction.

Cognitive

If analytics and all that it entails carries and will carry an important revenue load for IBM in the near and medium term, cognitive computing supplies the air cover to keep customers coming back to IBM for tech innovation-based transformation. Though IBM Watson continues in its technology transfer phase, make no mistake about it: IBM Watson is already attracting and attaching developers, and is already being used to help customers, albeit on a limited basis.
The most fascinating angle about IBM’s recent Watson announcement is the notion of “cognitive applications.” IBM’s new ecosystem program for Watson aims to recruit entrepreneurial ISVs, putting IBM into the pole position with a new platform for the first time in a generation. With a few exceptions, IBM largely left the packaged enterprise application market to the likes of SAP, Oracle, Microsoft, Infor, and Salesforce.com, choosing instead to win at the edge of those solutions where services, infrastructure, customization and hand-holding were required.  That may very well change with Watson.
Perhaps encouraged by the value-added nature of analytics, and by some of the industry-specific applications where IBM has succeeded, the notion of cognitive as an entrée into the next-generation enterprise application space portends an IT competitive overhaul. Given that the focus on SMAC – social, mobile, analytics, cloud – by the developer, VC, and tech entrepreneurial community has been racing ahead for several years now, some have begun to whisper, “But what’s next?” IBM’s greatest challenge with Watson over the coming years may be managing the explosion of interest from developers. Being a “platform” vendor is the right problem to have.

EMA Perspective

Virtually every Global 2000 company has implemented ERP, CRM, and SCM solutions. While all of these core solutions are experiencing a refresh cycle due to SMAC, and in doing so delivering improved business effectiveness, another slice of attention has gone towards derivative applications, including:
  • Analytic apps offer insight-driven business solutions that leverage existing data. Visit EMA’s Business Intelligence and Data Warehousing research for more detailed coverage.
  • By wrapping APIs around data integration, data governance, and business processes, organizations may now develop a fresh set of integrative applications. Just as big data opens the door to better insight, evolution in the integration space allows for a refresh in business process optimization. On a visionary basis this has to do with harnessing IoT, or the Internet of Things, but on a practical basis the integrative approach will help enterprises harness YoT – Your own Things. Many of IBM's comprehensive offerings on this front were evident at IOD.
  • Cognitive applications, which also tap largely into existing informational and process assets, promise an entirely rethought approach to business model re-engineering.
Investors worry about IBM revenue growth, and share prices have reflected the concern recently. Industry analysts, however, enjoy the freedom to look many years ahead when assessing vendor success probabilities. Between analytics, integration, and now particularly cognitive, over the long run IBM is positioned as well as any enterprise IT solution supplier.




By: Evan Quinn
Link: http://blogs.enterprisemanagement.com/blog/ibms-big-bets-cognitive-analytics/

IBM introduces Watson to the public sector cloud

IBM has strengthened its cloud computing offerings with a new high-end service that takes advantage of its famed Jeopardy!-winning Watson technology. 

The service, dubbed IBM Watson Developers Cloud, could help agencies by applying Watson’s cognitive computing intelligence to the federal government’s big data problems, from fraud analysis to intelligence surveillance and sensor-gathered data.

IBM is making its Watson technology available as a development platform in the cloud in the hopes of prompting third-party developers to create new applications that take advantage of its ability to learn from its interactions with data and reprogram itself for better results. IBM is providing a developer toolkit, educational materials and access to Watson’s application programming interface. 

Developers that build Watson-powered apps in the cloud can use their organization’s data or they can access the IBM Watson Content Store, which features third-party data. Additionally, IBM has committed 500 subject matter experts to the IBM Watson Developers Cloud effort. 

IBM’s high-profile Watson technology is leading the way towards a new era of cognitive computing systems. In September, Frost & Sullivan recognized IBM Watson Solutions with the 2013 North America New Product Innovation award, which is given to a company with an innovative product that leverages leading-edge technologies and produces value-added features and benefits for customers.

"The IBM Watson Engagement Advisor technology can listen to and respond to a series of follow-up questions and remember the previous questions that were posed," said Frost & Sullivan analyst Stephen Loynd. "In other words, IBM Watson combines technologies that allow the Engagement Advisor to understand natural language and human communication, generate and evaluate evidence-based hypothesis, as well as adapt and learn from user selections."

The announcement of IBM Watson Developers Cloud comes just weeks after IBM conceded its legal battle with Amazon Web Services over a 10-year, $600 million cloud computing services contract with the CIA. Analysts said it doesn’t make sense for IBM to compete head-to-head against Amazon on federal deals aimed at the lowest possible price.

"From a pure analytics standpoint, Watson is a great platform," said Shawn McCarthy, research director for IDC Government Insights. "So easing the way that people can have access to what it does is a good thing, and it allows IBM to focus on something it does really well as opposed to playing the commodity game, which is tough." 

The initial target market for IBM Watson Developers Cloud is the private sector, with IBM touting third-party applications in such areas as retail and health care. But analysts say the offering will impact big data problems in the public sector, too. McCarthy sees potential for Watson-powered apps in such areas as fraud analysis, which the White House is ramping up due to worries about scammers taking advantage of consumers signing up for its new health care plans. 

"Fraud issues could be huge. That could be anything from tax issues at the state and local level, to unemployment or other benefits — anything that people can dream up for fraud," McCarthy says. "A good analytics solution can help unravel where A and B don’t exactly line up."

Another possible application of the IBM Watson Developers Cloud is entity analytics, which is used by the Department of Homeland Security to find patterns in data by looking for commonalities about entities, whether they are people, phone numbers or license plates. 

"A good example of entity analytics is when a credit card is used here and goes to this address, and the address is suspicious because a phone call made from that address was used to contact a criminal or terrorist network," McCarthy explains. "Entity analytics is about comparing many sources of data and looking for commonalities and patterns. What Watson has is the ability to learn from the data flowing through it, so it could learn that this address is associated with this group of friends."
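At its core, the entity analytics McCarthy describes amounts to indexing attribute values and flagging the ones shared across independent data sources. The following is a minimal sketch of that idea in Python; the record fields and values are invented for illustration, not drawn from any real system.

```python
from collections import defaultdict

# Toy records from two independent feeds. Real entity-analytics systems
# ingest many heterogeneous sources (transactions, call records, plates).
card_charges = [
    {"entity": "card-1842", "address": "12 Elm St"},
    {"entity": "card-2201", "address": "9 Oak Ave"},
]
phone_calls = [
    {"entity": "phone-555-0101", "address": "12 Elm St"},
    {"entity": "phone-555-0188", "address": "44 Pine Rd"},
]

def shared_attributes(*sources):
    """Map each attribute value to the set of entities mentioning it,
    keeping only values that appear in more than one source."""
    seen = defaultdict(set)  # address -> {(source_index, entity), ...}
    for idx, source in enumerate(sources):
        for record in source:
            seen[record["address"]].add((idx, record["entity"]))
    return {
        value: {entity for _, entity in links}
        for value, links in seen.items()
        if len({idx for idx, _ in links}) > 1  # seen in 2+ sources
    }

links = shared_attributes(card_charges, phone_calls)
# "12 Elm St" links card-1842 and phone-555-0101 across the two feeds
```

A learning system like Watson would go further, weighting links by how often they co-occur, but the commonality-finding step above is the starting point.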



By: Carolyn Duffy Marsan

vendredi 22 novembre 2013

IBM Joins Fight Against Cancer in Developing Nations

Armed with big data and analytics, IBM enlists in the fight against cancer in developing nations by building a cancer registry.

IBM is working with the Union for International Cancer Control (UICC) to create a massive clinical data set on cancer patients by building cancer registries in developing nations.

Company officials said IBM's efforts will start where they are most needed, in Sub-Saharan Africa—where less than 1 percent of the region's population is covered by a cancer registry. With more than a billion people in the region, the new effort will improve cancer registration and, in time, treatment for patients in Africa while enriching knowledge about cancer for patients all over the world.

IBM said this data set could become the world's largest and most comprehensive clinical data set on cancer patients, and the company is donating its big data and analytics technology to the cause.

Cancer registries provide governments with incidence and mortality data so effective policies for cancer control can be developed. They also provide clinicians with information about patient outcomes to help identify tailored treatment options. Reliable and comprehensive data leads to the most effective interventions for saving lives, IBM said.

Gary Cohen, chairman of IBM Africa, announced IBM's donation of big data and analytics technology at the World Cancer Leaders' Summit in Cape Town, South Africa. "IBM's objective is to help find ways to level the field of access through innovation and knowledge, so that we can bridge the divide between the discovery of cancer and the delivery of treatment with positive outcomes—regardless of geography," he said, in a statement.

IBM officials said the initiative will begin in two to three countries in Sub-Saharan Africa, continue throughout the region, and extend to Southeast Asia and Latin America. The IBM collaboration supports UICC's work with the Global Initiative for Cancer Registries (GICR) in low- and middle-income countries. According to the World Health Organization, about 70 percent of all cancer deaths occur in developing nations. Experts predict that the Sub-Saharan region alone will see more than an 85 percent increase in its cancer burden by 2030.

"With IBM's expertise in big data and analytics, I can imagine a world in which the very latest scientifically proven means of detecting and treating cancer is available in all countries, benefitting patients wherever they are in the world," said Cary Adams, CEO of UICC, in a statement. "This information will provide unique and compelling insights on cancer, the likes of which we have not seen before."

According to the World Health Organization, more than 12 million people worldwide will be diagnosed with cancer this year, and approximately 8 million will die. Yet, Adams notes that this number is drawn from a database that is increasingly weak as the cancer burden moves, as predicted, from developed to developing countries. "Much of the world is tracking a growing burden of cancer with very incomplete information," he said. "Improving the collection of data is critical to our ability to address cancer around the world."

In many countries in the region, data about the incidence of cancer is collected through a paper-based system, which can consume hours of work for a single patient. All of the United States and Canada, 94 percent of Western Europe and 80 percent of Australia are covered by a cancer registry, according to leaders of the GICR initiative.

"IBM has always contributed its best assets and thinking to the world's biggest challenges, and there are few more serious than cancer," Dan Pelino, general manager of IBM Global Public Sector, said in a statement. "By helping UICC build cancer registries, we can shorten the time between discovery and treatment to save lives."

IBM joined UICC in 2012 to help the organization address the increasing data collection and analysis needs of the cancer community. IBM awarded an initial consulting grant that determined the business and technology plans required to build cancer registries. The next steps for IBM will be collaborating with the UICC and its GICR partners to plan and design the cancer registry in Sub-Saharan Africa, including the services, hardware, software, technical support and expertise to support the plan.

"Improved cancer registry data will reveal the population-based trends that are so important in shaping and adapting a cancer strategy," Dr. Isaac Adewohle, a gynecologist in Nigeria and president of the African Organization for the Research & Training in Cancer, said in a statement. "This will inform my daily practice in ways that my hospital data alone cannot." 

IBM has a history of teaming up with clinicians, researchers and public health organizations to help fight cancer through big data, cloud, analytics and other technologies. For example, IBM's Watson cognitive computing technology is advancing evidence-based treatment and research with Memorial Sloan-Kettering Cancer Center and MD Anderson Cancer Center.

IBM Research recently developed a microfluidic probe with a Swiss hospital to enhance cancer diagnosis, and nanotechnology to improve the treatment of breast cancer with the Institute of Bioengineering and Nanotechnology. IBM's World Community Grid provides free computational power to speed up cancer research as part of the Help Conquer Cancer project. And in collaboration with the Kenyan government, IBM has developed a plan to promote cervical cancer screening.





By: Darryl K. Taft
Link: http://www.eweek.com/database/ibm-joins-fight-against-cancer-in-developing-nations.html

Computers that emulate the brain may be the future, IBM says

You can date the first modern era of computing, in which massive mainframes like ENIAC were put to work on math and business problems too complex for the simple counting machines that came before, to a series of talks about computer science in the late 1940s.

Likewise, you can mark the moment technology started to move away from those days of Big Iron toward the era of the personal computer as Dec. 9, 1968, when Douglas Engelbart introduced the computer mouse, word processing, hypertext and video conferencing at an event in San Francisco dubbed “The Mother of All Demos.”

On Nov. 19, IBM held what it hopes will be another such watershed conference at its Almaden Research Center in San Jose — a colloquium on emerging computing technologies modeled on how the human mind works. The talks, entitled “Cognitive Systems: The New Era of Computing,” may well usher in a new era.

“What we think of this event as is a kind of open parenthesis on the cognitive computing era,” said Michael Karasick, IBM VP and head of the Almaden Research Center. “We don’t necessarily know where it’s going, but we want to get people thinking about these technologies and what’s now becoming possible.”

Cognitive computing is a branch of computer science that seeks to create computers that process data in ways that are more similar to how an organic brain processes data. It’s more of an umbrella term than a specific technology, touching on topics like machine learning, artificial intelligence, and computational creativity.

Broadly speaking, these systems are better than traditional computing at the things that organic brains excel at. Chief among those things is that they can learn, enabling them to figure out how to perform tasks that are far too complicated for a human developer to model on their own, like language processing or image recognition.



By: Jon Xavier

jeudi 21 novembre 2013

IBM Ranked #1 on The Graph500 Supercomputing List

IBM supercomputers have taken the top three spots on the latest Graph500 list released today during the Supercomputing Conference (SC'13) in Denver, Colorado. The biannual list ranks high-performance computing systems on their ability to process massive amounts of Big Data.




The top three positions have been awarded to Lawrence Livermore National Laboratory's Sequoia, Argonne National Laboratory's Mira and Forschungszentrum Juelich's (FZJ) JUQUEEN, which all use IBM Blue Gene/Q systems. Blue Gene supercomputers have ranked #1 on The Graph500 list since 2010 with Sequoia topping the list three consecutive times since 2012. IBM also was the top vendor on the most recent list, with 35 entries out of 160.

The Graph500 was established in 2010 by a group of 50 international HPC industry professionals, academics, experts and national laboratory staff. The benchmark targets five key industries: cybersecurity, medical informatics, data enrichment, social networks, and symbolic networks. All of these industries process and analyze large amounts of data, which is why the Graph500 focuses on graph-based data problems, a foundation of most analytics work, and on the ability of systems to process and solve complex problems. The Graph500 was established as a complement to the TOP500 list, which ranks supercomputers on performance speed via the LINPACK benchmark.
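The kernel the Graph500 benchmark actually times is a breadth-first search (BFS) over a large synthetic graph, scored in traversed edges per second (TEPS). A minimal Python sketch of that kernel follows; the four-vertex toy graph stands in for the billion-edge Kronecker graphs used in real runs.

```python
from collections import deque

def bfs_parents(adjacency, root):
    """Breadth-first search, the core kernel the Graph500 times.
    Returns a parent map, the same output the benchmark validates."""
    parent = {root: root}
    frontier = deque([root])
    while frontier:
        v = frontier.popleft()
        for w in adjacency[v]:
            if w not in parent:
                parent[w] = v      # first discovery fixes the BFS parent
                frontier.append(w)
    return parent

# Tiny undirected graph as an adjacency list.
adjacency = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
parents = bfs_parents(adjacency, 0)
# Every vertex reachable from 0 appears, pointing at its BFS parent.
```

Graph traversal like this is memory-bound and irregular, which is exactly why it complements the floating-point-heavy LINPACK as a measure of data-intensive performance.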



By: IBM News Releases
Link: http://www-03.ibm.com/press/us/en/pressrelease/42542.wss

IBM Uses RESTful APIs to Turn Watson into a Cloud Service

The combination of natural language processing and advanced text analytics is giving rise to a new class of cognitive applications that have the potential to radically transform the way entire industries operate. The most famous instance of a cognitive application is, of course, IBM Watson, the supercomputer that IBM built on top of Power processors to best the champions of the Jeopardy! quiz show.




Beyond playing games, however, the ability to easily query a knowledge base of expertise that gets smarter with each successive correct answer has the potential to put a massive amount of expertise directly into the hands of the average person. Now IBM is moving to put that power in the hands of developers with the launch of IBM Watson Developer Cloud, which includes a software development kit that lets developers build applications on top of IBM Power systems running in an IBM cloud, while exposing a set of RESTful APIs that make it possible to invoke those applications from within other applications. In addition, IBM plans to make third-party applications developed on this platform available to customers through a new IBM Watson Content Store.
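Invoking such a RESTful API from another application amounts to posting a JSON payload over HTTP. The sketch below builds such a request with Python's standard library; the endpoint URL and JSON shape are hypothetical placeholders for illustration, not IBM's published routes.

```python
import json
import urllib.request

# Placeholder endpoint: the real IBM Watson Developer Cloud routes are
# defined by IBM's SDK and documentation, not reproduced here.
WATSON_URL = "https://example.com/watson/v1/question"

def build_question_request(question_text):
    """Package a natural-language question as the kind of JSON-over-HTTP
    call a RESTful Watson-style service would accept."""
    body = json.dumps({"question": {"questionText": question_text}})
    return urllib.request.Request(
        WATSON_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_question_request("What gear do I need for a spring camping trip?")
# req is ready to send with urllib.request.urlopen(req); the response
# would be JSON carrying ranked answers and confidence scores.
```

Because the interface is plain HTTP and JSON, any application in any language can embed such a call, which is the point of exposing Watson through RESTful APIs rather than a proprietary protocol.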

The first three independent software vendors to participate in the program include Fluid, which builds online shopping applications for retailers; MD Buyline, a provider of supply-chain applications for hospitals; and Welltok, which built a social health application for end users. All three are scheduled to have applications available on the IBM Watson cloud in early 2014.

IBM is also partnering with providers of sources of data that can be consumed by Watson. They include Healthline, a provider of health information services; and Elance, an online marketplace for freelancers.

According to Rob High, chief technology officer and IBM Fellow in the IBM Software Group, IBM plans to work with several venture capital firms to help promote the development of both cognitive computing applications and additional sources of data services that could be consumed by Watson.

Just like any Web application, cognitive applications will likely be most useful when they are invoked within the context of a business process. For example, a healthcare provider recording a patient’s symptoms could dynamically query a cognitive application running in the cloud to determine the best course of treatment to pursue. Similarly, a law firm would be able to make use of a cognitive application to determine what laws best apply to any given case.

Today, people spend countless hours researching information that an application supporting natural language processing could put instantly at their fingertips, or a voice command away. To enable that, IBM wants to build an ecosystem of Watson applications in the cloud that could be dynamically invoked by thousands of other applications, High says.

The challenge with creating cognitive applications is that right now most developers don’t have access to supercomputers based on IBM Power systems. By making Watson available as a service in the cloud, IBM is removing hardware as a barrier to the development of next-generation cognitive computing applications. To make it a commercial success, High says, IBM has almost completely revamped the Watson system that appeared on Jeopardy!

No doubt other ecosystems of cognitive applications will emerge in time. But for developers looking to create truly unique applications that have the potential to transform the economics of entire industries, IBM Watson in the cloud provides an opportunity to get started building these applications without having to spend millions of dollars just to find out whether they will work or not.



By: Michael Vizard
Link: http://blog.programmableweb.com/2013/11/19/ibm-uses-restful-apis-to-turn-watson-into-a-cloud-service/

mercredi 20 novembre 2013

IBM Helps Cities in Basque Region of Spain Build A Sustainable Community and Turn Data into Insight

Today during Smart City Expo World Congress, IBM announced that it is working with the towns of Irun and Hondarribia, Spain, on a new Smarter Cities project. Using data from trash containers that know how much is thrown away, smart street lights that report out when they need maintenance, and parking places that know when they are empty, IBM’s smarter cities technology is providing real-time insight to help make better decisions.

The Bajo Bidasoa area, in the Basque region of Spain, with a population of 78,000, is leading the way in mining patterns in vast quantities of diverse data, using real-time data to make accurate predictions, and engaging citizens via social collaboration to make the area a better place to live and work.

At the foundation of the project, IBM’s Intelligent Operations Center software provides real-time insight into all city operations. It also powers the Smart City Center, an integrated command center where data is analyzed and shared. For example, city leaders can see the correlation between water consumption and waste generation, monitor and predict the effect of bad weather on incidents within the area, or visualize the amount of resources used across water, waste management, transportation, energy and public works departments.  

Bold city leaders from the towns of Irun and Hondarribia set out to work together to improve sustainability, encourage more citizen participation and provide greater transparency. They did this by working with technology partners IBM, Servicios de Txingudi, the local water and waste water management and street cleaning agency, and Smartland Technologies, a group of six companies including IBM Business Partner BuntPlanet.  

“The possibility of analyzing large amounts of data through new technology opens up enormous possibilities for better public sector management,” said the mayor of Irun, Jose Antonio Santano. “We live in an era of global crisis and it is precisely at this time when we need to sharpen our ingenuity to better know how to apply talent and technology for the benefit of our citizens.” 

The region has also made numerous advancements to improve its water systems under the leadership of Servicios de Txingudi by installing 32,000 sensors that collect water consumption data in real time. Water leaks decreased by 70 percent; water supply pumping costs decreased by 14 percent; and unnecessary water treatment decreased by 40 percent as a result of the ability to see and manage water systems in real time. The area is also generating renewable and efficient energy by installing small hydro plants, generating electricity from biogas obtained from wastewater treatment, installing solar panels on water tanks, and building a combined heat and power (CHP) facility that allows the water treatment plant to be energy independent when necessary.

To improve waste management and encourage more citizen participation in recycling, more than 750 compost bins were distributed to citizens, while volunteers are placing RFID tags on trash, allowing waste generation to be measured more accurately and providing better insight into which social or environmental conditions create more waste and how to prevent it. Citizens are also turning to smartphones to communicate with city leaders. Incidents such as a fallen tree, a traffic accident or a pothole can be reported and shared, including a photo and geographical information. Data is collected by the Smart City Center, where issues are resolved and tracked and citizens can check the status of each.

“We understand the importance of working with leading technology companies like IBM in order to meet citizen needs and respond to their problems,” said mayor of Hondarribia, Aitor Kerejeta. “But we are also very proud to have worked with local businesses, most from the Spanish region of Guipuzcoa, with entrepreneurs that promote economy and jobs that are close to us”. 

The community’s efforts to become smarter and more sustainable have also resulted in economic development as local entrepreneurs have worked alongside technology partners to create new technologies and in turn added jobs to the local economy. 

“The Bajo Bidasoa region of Spain has emerged as a model for other European cities as they apply leadership, collaboration and innovation to become a more sustainable, liveable area,” said Sylvie Spalmacin-Roma, vice president, Smarter Cities Europe.

IBM has deep expertise in working with cities of all sizes, helping solve their toughest challenges. By bringing together cloud computing, mobile and social, IBM is helping cities realize the potential to build more sustainable, efficient cities that are focused on the needs of citizens.





By: IBM News Releases
Link: http://www-03.ibm.com/press/us/en/pressrelease/42526.wss

OpenStack brings agility to the enterprise

In his keynote at the OpenStack Summit in Hong Kong earlier this month, Jonathan Bryce, the director of the OpenStack Foundation, referenced a user poll identifying the top 10 environments in which OpenStack is being deployed. Not surprisingly, the top uses today are web applications, agile development and DevOps.

What did not get coverage is how IBM is adopting OpenStack in the enterprise, bringing the same agility and flexibility of commodity cloud to enterprise systems. In October, IBM announced Power Virtualization Center (PowerVC) as the management center for PowerVM. Credit where credit is due, VMware has done a great job with vCenter, providing an easy-to-use, administrator-friendly approach to ESX virtualization management. PowerVC seeks to do the same for PowerVM and IBM Power Systems.

Built on the OpenStack Havana release, PowerVC uses the core Cinder, Nova and Neutron management components of OpenStack to manage the provisioning and configuration of LPARs, the creation and allocation of storage LUNs to LPARs, and the assignment of IP addresses and VLANs. Initial storage support is for IBM Storwize V7000 and SAN Volume Controller (SVC), directly allocating LUNs to LPARs via NPIV. Support for Shared Storage Pools (SSP) and other storage devices with OpenStack Cinder device drivers is also planned.

Power administrator productivity is greatly improved: administrators can finally allocate LPARs with SAN-attached storage in one step, eliminating the wait for storage admins to map and assign LUNs to the dynamically created NPIV WWNs. Even more impressive is the ability to manage LUN reassignment during Live Partition Migration (LPM) of LPARs from one system to another, remapping LUNs from the source system to the target system as required and validating the configuration to ensure the live migration succeeds.
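
The provision-then-attach sequence PowerVC automates can be mimicked in a few lines. This is a toy simulation of the workflow (the class and method names are invented for illustration, not PowerVC's or OpenStack's actual API):

```python
class StoragePool:
    """Stand-in for a Cinder-managed backend such as a Storwize V7000."""
    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.luns = {}

    def create_lun(self, name, size_gb):
        if size_gb > self.capacity_gb:
            raise ValueError("insufficient capacity")
        self.capacity_gb -= size_gb
        self.luns[name] = size_gb
        return name

class Lpar:
    """Stand-in for a Nova-managed logical partition."""
    def __init__(self, name):
        self.name = name
        self.attached = []

def provision_lpar(pool, name, disk_gb):
    # One-step flow: create the LPAR, carve out a LUN, attach it --
    # replacing the manual handoff to a storage administrator.
    lpar = Lpar(name)
    lun = pool.create_lun(name + "-boot", disk_gb)
    lpar.attached.append(lun)
    return lpar

pool = StoragePool(capacity_gb=500)
lpar = provision_lpar(pool, "lpar01", 100)
print(lpar.attached)     # ['lpar01-boot']
print(pool.capacity_gb)  # 400
```

In the real product the same single call would also drive the SAN zoning and NPIV mapping; the sketch only shows why collapsing those steps saves the administrator time.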

PowerVC 1.2 will be available in December 2013. In the meantime, you can find more information on IBM developerWorks and a sneak preview on YouTube.



By: Steve Strutt

The gap in skills for mobile, social and analytics continues!

Skills have been quite the topic lately. Why? The skills gaps are so large that any great company must address the challenge together with the industry.

Here are the results of the study showing the gaps!

Read last week’s blogs for IBM’s focus with universities on security and analytics!

Video link: http://www.youtube.com/watch?v=sdnFtmeVqg0




From: Sandy Carter blog
Link: http://socialbusinesssandy.com/2013/11/18/3190/


AMD, Cray, IBM, Intel and Nvidia Receive $25.4 Million in Contracts for Exascale Supercomputer Interconnect Design

The Department of Energy’s (DOE) Office of Science and the National Nuclear Security Administration (NNSA) have awarded $25.4 million in research and development contracts to five leading companies in high-performance computing (HPC) to accelerate the development of next-generation supercomputers.

Under DOE’s new DesignForward initiative, AMD, Cray, IBM, Intel Federal and Nvidia will work to advance extreme-scale computing technology on the path to exascale, which is vital to national security, scientific research, energy security and the nation's economic competitiveness.

Exascale Computing Is Crucial

“Exascale computing is key to NNSA’s capability of ensuring the safety and security of our nuclear stockpile without returning to underground testing. The resulting simulation capabilities will also serve as valuable tools to address nonproliferation and counterterrorism issues, as well as informing other national security decisions,” said Robert Meisner, director of the NNSA Office of Advanced Simulation and Computing.

As the nation’s largest funder of physical science research, DOE provides thousands of researchers at national labs and universities with access to some of the world’s most powerful supercomputers. These systems, which have peak speeds of quadrillions of calculations per second, are helping scientists study climate change, develop renewable energy sources, understand the makeup of our universe and develop new materials. But taking these research missions to the next level will require supercomputing systems that are 1000 times faster than today’s systems.
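
The arithmetic behind "exascale" is worth making explicit: quadrillions of calculations per second is petaflop territory (10^15 operations per second), and a 1,000-fold jump lands at 10^18, the "exa" prefix that gives exascale computing its name:

```python
petaflop = 10**15  # quadrillions of calculations per second: today's peak systems
exaflop = 10**18   # the next-generation target ("exa" = 10^18)
print(exaflop // petaflop)  # 1000, the speedup factor cited above
```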



“In an era of fierce international HPC competition, the development of exascale computing becomes critical not only to our national security missions but to the nation’s economic competitiveness in the global marketplace. This partnership between industry, the DOE Office of Science and NNSA supports the development of technology to overcome the obstacles on the road to exascale systems,” said William Harrod, FastForward program manager and research division director for DOE’s advanced scientific computing research program.

DesignForward Initiative

The DesignForward contracts, which cover a two-year performance period, will support the design and evaluation of interconnect architectures for future advanced HPC architectures. Such interconnects will tie together hundreds of thousands or millions of processors, as building blocks of supercomputers to be used in studying complex problems in unprecedented detail. The DesignForward focus will be on developing interconnects that are energy efficient, have high bandwidth, and minimize the time to move data among processors.

“A major disruption is facing high performance computing because energy constraints are causing our building blocks, microprocessors and memory to change dramatically. We need to collaborate with computer companies to ensure that future supercomputers meet DOE's mission needs in science, energy and national security. Berkeley Lab is pleased to place these contracts on behalf of DOE and its laboratories,” said Sudip Dosanjh, director of the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory (Berkeley Lab).



DesignForward is the follow-on to DOE’s FastForward project, a public-private partnership between DOE and HPC industry to advance extreme scale computing technologies with the ultimate goal of funding innovative, critical R&D technologies needed to deliver next generation capabilities within a reasonable energy footprint. FastForward is funded by DOE's Office of Science and NNSA, and technically managed by seven national laboratories.

Successful Public-Private Partnership

Under the new contract, Intel will focus on interconnect architectures and implementation approaches, Cray on open network protocol standards, AMD on interconnect architectures and associated execution models, IBM on energy-efficient interconnect architectures and messaging models and Nvidia on interconnect architectures for massively threaded processors.

“We are honored to be selected for this research program and to continue our work with DOE to shape the future of high-performance computing, started last year with FastForward. This public-private partnership extends AMD’s research into the use of next generation APUs to meet the demands of extreme-scale computing. We believe this program will benefit the DOE and provide technology insights into challenges throughout the computing industry,” said Alan Lee, AMD’s corporate vice president of research and advanced development.

The vendors will collaborate with DOE's exascale co-design centers to determine how changes in the system architectures will affect how well the scientific applications perform.

“U.S. leadership in HPC is essential to meeting the mission-critical needs of DOE and other federal agencies. We are proud to have been chosen by DOE as a continued partner in their strategic work toward the advancement of next-generation supercomputing technology. Intel has a long standing commitment to science, research and innovation, from supercomputing to personal computing,” noted Dave Patterson, president of Intel Federal.

“Partnerships and collaborations are an important element of exploring new ideas and overcoming the challenges of exascale computing, and we look forward to working with the DOE researchers and playing a role in the co-design efforts that will be key to the success of the DesignForward program,” said Peg Williams, Cray’s senior vice president of high performance computing systems.

“Exascale computing is vitally important for U.S. scientific research and economic competitiveness, and Nvidia’s expertise in massively-parallel heterogeneous systems will play a critical role in reaching exascale. Interconnection networks are a key component of exascale systems. We are excited to collaborate with other DesignForward recipients to enable an open ecosystem for next-generation high-performance system interconnects,” noted Bill Dally, chief scientist and senior vice president of research at Nvidia.



By: Anton Shilov
Link: http://www.xbitlabs.com/news/other/display/20131115202502_AMD_Cray_IBM_Intel_and_Nvidia_Receive_25_4_Million_in_Contracts_for_Exascale_Supercomputer_Interconnect_Design.html



 

 

lundi 18 novembre 2013

IBM Neo Heats Up Competition in BI Search

I wrote last week that one of the big trends in big data was a resurgence in the use of search and natural language processing for making BI as easy as Google. Last week at IBM's annual Information on Demand conference, IBM announced project Neo, heating up competition in BI Search and cloud BI.

Neo starts with a simple search box in which users can ask a question, such as "What is the relationship between budget and gross domestic sales, by story type?" Neo presents a list of possible data sources that can answer the question. For now, these data sources are restricted to data sets loaded into the cloud, in a DB2 columnar data store. IBM concedes that for the product to be fully embraced, Neo will need to support on-premises data sources as well, and has said that is part of the product roadmap.

Once the user selects the optimum data source, Neo generates an interactive visualization. The user can then refine the question, for example by changing "budget" to "units sold." The visualization can also be changed to display as a trend rather than a bubble chart. In addition to the visualization, Neo generates a number of potentially relevant infographics (shown along the top) based on statistical algorithms. So even though the user didn't ask about seasonality, the data shows a pattern, with fall being the best season for sales (early holiday shoppers, perhaps?).
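
The kind of unasked-for pattern Neo surfaces, seasonality in this case, can be approximated with simple grouping logic. A sketch with invented monthly sales figures:

```python
from statistics import mean

# Hypothetical monthly unit sales (month number -> units); the data is invented.
monthly_sales = {1: 80, 2: 75, 3: 90, 4: 95, 5: 100, 6: 98,
                 7: 92, 8: 94, 9: 130, 10: 140, 11: 150, 12: 110}

SEASONS = {"Winter": (12, 1, 2), "Spring": (3, 4, 5),
           "Summer": (6, 7, 8), "Fall": (9, 10, 11)}

def best_season(sales):
    # Average sales within each season, then return the strongest one.
    averages = {s: mean(sales[m] for m in months) for s, months in SEASONS.items()}
    return max(averages, key=averages.get)

print(best_season(monthly_sales))  # Fall
```

A statistical engine like the one behind Neo's infographics would also test whether the seasonal difference is significant before surfacing it; this sketch only does the grouping.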

Neo brings together a lot of intellectual property that IBM has acquired in recent years. The visualizations are powered by RAVE (Rapidly Adaptive Visualization Engine) technology and skills acquired through SPSS. The infographics are based on some of the capabilities in Analytic Catalyst, a module released in June of this year that makes advanced analytics easy for a casual business user. The natural language processing leverages Vivisimo, later rebranded Infosphere Data Explorer.

Not surprisingly, the initial demos of Neo are impressive. It's easy, visual, and powerful, for the most casual of decision makers. It could do for data what Google has done for the Internet. Today, the industry average for BI adoption is at 24% of employees, and ease of use is an oft-cited barrier to broader BI (take this year's survey to rate your BI adoption and ease of use).

What wasn't shown, though, is how the data sources get indexed and loaded to the cloud. IBM Cognos has previously tried to leverage the simplicity of search with its Go! Search interface launched in 2006. With that tool, content in PowerCubes and reports had to be indexed on a periodic schedule. Search was limited to key words, and the interface was existing reports, not nearly as visual as Neo. So Go! Search had a degree of complexity to implement, was less intuitive, appealing, and smart. Perhaps these are all reasons why it wasn't widely adopted? Just how well Neo overcomes the past limitations of Go! Search will only be known once the beta launches in January.

IBM also announced another new product, IBM Concert, which brings collaboration, workflow, and mobile together in a SaaS solution. This product is expected to be available in December. I see collaboration as still an emerging capability that customers are trying to figure out. Recognizing the influence that vendors like Facebook and Twitter have had on consumers, social and collaboration capabilities began appearing in BI tools and enterprise apps a few years ago. SAP first launched StreamWork, then later acquired SuccessFactors' Jam, while Microsoft acquired Yammer and TIBCO launched Tibbr. IBM has had strong collaboration capabilities in its Lotus Connections product, whose technology was first integrated with IBM Cognos in its version 10 release back in 2010. But I haven't found a single customer using those capabilities. Again, it's not clear if that's because it was poorly marketed and BI teams have other priorities, or if it reflects a larger, industry-wide problem.

Collaboration around data today is usually offline from the data, whether via email, in meeting rooms, or on conference calls. So first, capturing comments in collaboration software is a change in the current workflow. Second, making comments publicly requires an analytic culture in which it's safe to voice opinions and dissent, and to ask tough questions. Just imagine if, 30 years ago, the engineer who was worried about the Space Shuttle Challenger had posted a comment in the data analysis of O-ring tests to the effect of, "the data shows we shouldn't launch. Too cold out." In those days, the engineer could barely voice a concern in whispers, and only to his direct supervisor. (If you're as fascinated by that story, culture, and decision-making catastrophe as I am, I'll be watching the Science Channel's documentary this Saturday.) How far have we come since then? Is social something reserved more for personal and public opinion, or is it something that the industry is ready to embrace in BI?



By: Cindi Howson
Link: http://biscorecard.typepad.com/biscorecard/2013/11/ibm-neo-heats-up-competition-in-bi-search.html

IBM acquires Fiberlink as mobile-security strategy keystone

IBM Wednesday announced an agreement to acquire Fiberlink Communications, saying the purchase is a key part of a broader mobile-security strategy to provide assurance in transactions conducted via devices such as iPhones and Android smartphones.

Fiberlink provides mobile-device management (MDM) through its MaaS360 cloud-based offering, counting about 3,500 customers in industries that include financial services, healthcare and manufacturing. IBM's director of application, data and mobile security, Caleb Barlow, says the acquisition, expected to be concluded shortly, puts IBM on a path to compete with MDM vendors such as Symantec, AirWatch, MobileIron and Good Technology.

But Barlow also points out that Fiberlink should be considered part of IBM's broader strategy for mobile-device security, which includes IBM's recent acquisition of Trusteer, the security firm specializing in an anti-fraud, anti-malware approach that has been used in the banking industry in particular on the Web.

Through Fiberlink and Trusteer combined, "which is the key part of this," he says, IBM intends to provide a type of trust assurance for transactions done on mobile devices in business-to-business or business-to-consumer communications. With the Fiberlink acquisition, IBM is also solidifying its approach to supporting "Bring Your Own Device" environments.

IBM's intention is to develop a unified mobile-security framework through cloud- and agent-based means that provides not just management of devices but security checks against malware or device hijacking, for example, especially during any sensitive transaction process.

In addition, the goal would be to enable transmission of relevant mobile-device security-event information to IBM's security information and event management tool, QRadar.

Barlow acknowledges there is "some overlap" between what Fiberlink provides in application management and the managed mobile security service IBM started two years ago. "But it's fairly minimal," he says. IBM's main focus going forward is Apple iOS and Android, "but we're also looking at Windows Mobile."



By: Ellen Messmer
Link: http://news.techworld.com/security/3489128/ibm-acquires-fiberlink-as-mobile-security-strategy-keystone/

The promise of software defined storage is for companies of all sizes

Software defined storage is a new concept for some providers—but not for IBM Storage. The first step in software defined storage is virtualization, which IBM has been offering for 10 years. When you virtualize your storage, data on all your storage from any storage company can be managed as a single pool through one management interface with the same capabilities. The benefits are huge, including:
  • 47 percent reduction in time spent managing storage
  • 30 percent reduction in storage capacity growth
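
The single-pool idea behind storage virtualization can be illustrated with a toy sketch: several arrays from different vendors sit behind one interface, and callers never pick a backend (the classes below are invented for illustration, not IBM's implementation):

```python
class Backend:
    """A storage array from any vendor, reduced to a name and free capacity."""
    def __init__(self, vendor, capacity_gb):
        self.vendor = vendor
        self.capacity_gb = capacity_gb

class VirtualPool:
    """One management interface over many arrays."""
    def __init__(self, backends):
        self.backends = backends
        self.volumes = {}

    @property
    def free_gb(self):
        return sum(b.capacity_gb for b in self.backends)

    def create_volume(self, name, size_gb):
        # The caller never chooses a vendor; the pool places the volume
        # on whichever backend has the most room.
        for b in sorted(self.backends, key=lambda b: -b.capacity_gb):
            if b.capacity_gb >= size_gb:
                b.capacity_gb -= size_gb
                self.volumes[name] = b.vendor
                return b.vendor
        raise ValueError("pool exhausted")

pool = VirtualPool([Backend("IBM", 200), Backend("OtherVendor", 300)])
pool.create_volume("vol1", 250)
print(pool.free_gb)  # 250
```

Because every volume request goes through the pool, a feature added at the pool layer (tiering, replication, thin provisioning) applies to every vendor's array underneath, which is the point made below about Easy Tier and SANSlide.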

IBM’s newest member of the Storwize Family, the IBM Storwize V5000, makes software defined storage more affordable than ever. Any size company can now afford to experience the benefits of storage virtualization.

In addition, the Storwize V5000 has capabilities common to the Storwize family like Easy Tier automated tiering, thin provisioning and advanced remote mirroring and FlashCopy for data protection. This video covers these benefits (link: http://www.youtube.com/watch?v=Eq53TmFuvnI)

I mentioned that IBM Storage has supported the first step in software defined storage—storage virtualization—for many years. The Storwize V5000 also supports the next step: opening up the platform to allow others to provide innovation. The Storwize family now supports IP replication to reduce the cost of remote mirroring. In addition, SANSlide technology from a company called Bridgeworks is included to reduce network costs even further by increasing the utilization of the network.

This is exactly the promise of software defined storage. When you virtualize your storage with the Storwize V5000, all of your storage gets the benefits of IBM innovation like Easy Tier, as well as innovation from other companies like Bridgeworks.

It’s one thing to get a storage system with advanced capabilities, but when your storage can give other storage systems advanced capabilities they didn’t have before, that’s simply amazing.

Check out the below infographic, and then let me know: What storage management challenges is your company facing?





By: David Vaughn
Link: http://www.smartercomputingblog.com/smarter-storage/software-defined-storage/


IBM bets on big data visualization

IBM broadened its big data portfolio with a series of software enhancements, as well as visualization tools aimed at bringing analytics to the enterprise masses.

The company rolled out the additions as it kicked off its Information On Demand Conference in Las Vegas.

IBM touted visualization as a key differentiator, highlighting technologies such as Project Neo, software that takes raw big data logs and puts them into a graphical interface. Project Neo will enter beta in early 2014.

What IBM is gunning for is applications that can bring big data to your average marketing manager. Big Blue noted its Rapidly Adaptive Visualization Engine (RAVE) can provide analytical graphics for human resources, marketing and sales on any device.

Another visualization effort is IBM Concert on Cloud, which is a social analytics platform for remote employees on mobile.

Among the notable product points:
  • IBM launched an analytics cloud tool dubbed SmartCloud Analytics-Predictive Insights. With the software, enterprises can comb through terabytes of IT operations data in real time and head off problems before they become too large. The analytics technology runs on IBM's recently acquired SoftLayer infrastructure.
  • The company also launched a new version of its SmartCloud Virtual Storage Center, which automates tiering and moves resources to cloud storage. The software learns patterns over time.
  • Meanwhile, IBM sprinkled its BLU Acceleration technology throughout its portfolio. BLU Acceleration is meant to be used with in-memory systems and has been added to IBM's Power Systems as well as PureSystems.
  • InfoSphere Data Explorer will get a big data search tool that uses visualization.
  • InfoSphere Data Privacy for Hadoop will allow customers to anonymize data in Hadoop, NoSQL and relational systems.
  • PureData System for Hadoop has built-in archiving tools, administration and security levels.


By: Larry Dignan
Link: http://www.zdnet.com/ibm-bets-on-big-data-visualization-7000022741/

IBM and University Hospital Zurich to Collaborate on Tool for Diagnosing Cancer

IBM scientists are collaborating with pathologists at the University Hospital Zurich to test a new prototype tool to accurately diagnose different types of cancer.

According to a release, this work is based on a technology developed by IBM scientists called a microfluidic probe, which slightly resembles the nib of a fountain pen.

A critical step in the diagnosis of cancer is the analysis of a patient's biopsy tissue sample, which sometimes can be as small as a pinhead. Even with such a small sample, pathologists can test for the absence or presence of tumor cells and provide information pertaining to the course of treatment to doctors.

To analyze samples, pathologists typically stain the tissue sample with liquid reagents. The intensity and distribution of the color stain classify and determine the extent of the disease. While this approach provides insights into the tumor, it is increasingly being realized that significant variations exist within the tumor itself; mapping these variations may help understand the drivers for each tumor, and consequently assist in personalizing treatment strategies.

IBM scientists have developed a technology called a microfluidic probe which can interact with tissue sections at the micrometer scale to help unravel some of the molecular variations within tumors.

The collaboration between IBM and the University Hospital Zurich puts an emphasis on uncovering the heterogeneity of tumors. More specifically, the collaboration focuses on lung cancer, which is one of the most prevalent forms of cancer and has a high mortality rate.

"Pathologists are determined to obtain as much accurate information as possible from markedly small biopsy samples," said Alex Soltermann, a pathologist specializing in lung cancer at the Institute for Surgical Pathology of the University Hospital Zurich. "We hope to introduce new technologies, such as the microfluidic probe, into the clinical molecular pathology diagnostic framework to enable a range of investigations, which were previously thought to be infeasible. If we are successful, the tool will be a driver for personalized medicine, and translate into increased confidence in diagnosis and better detection of predictive cancer markers."

Peter Schraml, director of the tissue biobank at the Institute of Surgical Pathology, University Hospital Zurich, said, "In addition to assisting in diagnostics, this tool may provide insight into the biomarker distribution in tumor tissues, which can aid in understanding cancer progression."

The eight-millimeter-wide, diamond-shaped probe consists of a silicon microfluidic head ending with a small tip bearing two microchannels.

"For about a year we have been testing the probe in our lab, and initial results are very encouraging. We are now developing the technology in the context of important aspects of pathology," said Govind Kaigala, a scientist at IBM Research Zurich. "Over the next several months, we will install a prototype device at the hospital and work alongside pathologists."

The tool that houses the microfluidic probe is roughly the size of a tissue box. It is now at a stage where it may assist in studying the distribution of low numbers of cancer cells in biopsied samples.

The probe injects very small volumes of reagents on the tissue surface and then continuously aspirates the reagents to prevent spreading and accumulation. This approach is used to deliver and retrieve reagents locally in selected areas of a tissue section with pinpoint accuracy. This local interaction with the tissue sample helps in mapping the heterogeneity in the tissue.

"We are very excited to partner with IBM on the microfluidic probe technology to develop techniques for its use in the clinical pathology framework. This is a fine example of translational research that could also help answer some basic science questions," said Holger Moch, head of the Institute of Surgical Pathology at the University Hospital Zurich.

The microfluidic probes are designed and manufactured at the Binnig and Rohrer Nanotechnology Center on the campus of IBM Research - Zurich.

This research collaboration is funded by SystemsX.ch, the Swiss initiative in systems biology.
 
 
 
By: Professional Services Close - Up
 

IBM vision for computer of the future

International computer giant IBM has unveiled its vision for a "supercomputer in a sugarcube" powered by "electronic blood".

The aim is to base the design on the workings of the human brain in order to surmount some of the difficulties - delivering power and removing the excess heat - that computer engineers are currently encountering as they seek to make chips smaller and faster. 

The heat problem effectively doubles the running cost of a computer, because nearly as much energy has to be spent on cooling as on computing. 

Current figures suggest that just running the Internet accounts annually - in energy terms - for an equivalent amount of CO2 as the airline industry, and computing generally equates to a global spend of over $30 billion just making hot air. 

But, to make processors more compact and run even faster, chips will need to shrink, partly to reduce the distances - and hence time - over which information needs to be transmitted and to pack in more processing power. Doing so, though, will intensify the present problems. 

So IBM have turned to the human brain for inspiration because, despite consuming energy at the rate of only about 20 watts, the brain is, according to IBM researchers Patrick Ruch and Bruno Michel, "10,000 times more dense and efficient than any computer around today." 

This they put down to the fact that the brain solves several problems in one: the 3D structure of the brain makes it highly compact, and it uses a highly efficient circulatory (blood) system to deliver energy and keep the operating environment right for optimal performance of the components. 

According to Michel, the brain uses "40% of its volume for function and 10% for energy and cooling." The best current computers, on the other hand, devote only about 1% to processing! 

Instead, Michel and Ruch propose to pile up processors like the chip equivalent of a skyscraper to boost computing power; then, they'll keep them cool by circulating a fluid - "electronic blood" - through special channels within the chips to soak up heat; and - here's the really clever bit - they'll also use the same fluid to power the chips using a chemical reaction. 

Analogous to blood delivering sugar to hungry neurones, the approach being explored at IBM is to use a solution of a chemical like vanadium that can be "charged" before it is pumped into the computer. 

Then, as it passes over the processor components, the vanadium alters its chemical oxidation state, releasing electrons to the transistors to power them. 

The ultimate goal is to cram a super-computer that currently occupies half a football pitch into a volume the size of a sugar cube by 2060.



By: Chris Smith


IBM Corporation Research and Institute of Bioengineering and Nanotechnology Discover Breakthrough for Breast Cancer Drug Delivery

Today, scientists from IBM and Singapore's Institute of Bioengineering and Nanotechnology (IBN) published details of a breakthrough drug-delivery technique: the first biodegradable, biocompatible and non-toxic hydrogel that can deliver treatment more efficiently to people fighting breast cancer. 

Approximately 25% of all breast cancers are positive for human epidermal growth factor receptor 2 (HER2), the subtype targeted in this study. HER2-positive cancer is considered aggressive because it spreads quickly and has a lower survival rate.

Treatment of breast cancer varies according to the size, stage and rate of growth, as well as the type of tumor. There are currently three main categories of post-surgery therapies available: hormone blocking therapy, chemotherapy and monoclonal antibodies (mAbs) therapy.

In the case of antibodies, the drugs are paired with saline and delivered intravenously into the body. Targeting specific cells or proteins, the antibodies block specific cell receptors to destroy cancer cells and suppress tumor growth. However, these drugs are absorbed in the body and have limited lifetimes and effectiveness when injected directly into the bloodstream.

Recognizing this, IBM and IBN scientists developed a novel synthetic hydrogel, made up of over 96% water and a degradable polymer, that is capable of sequestering a range of cargoes, from small molecules to large ones, including mAbs.

It also exhibits many of the biocompatible characteristics of water-soluble polymers, which hold their form in the body without completely dissolving. This allows the hydrogel to act as a depot that slowly releases its contents in a targeted location, directly at the tumor site, over weeks instead of days. Once the drug has been delivered, the hydrogel biodegrades naturally and passes through the body.
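The depot behaviour can be pictured with a simple first-order release model. This is an illustrative sketch only; the half-lives below are assumptions for the sake of comparison, not kinetics reported in the study.

```python
import math

# Illustrative first-order model of the depot idea: drug leaves the
# gel (or the bloodstream) exponentially, characterised by a release
# half-life. Half-life values here are assumptions, not study data.


def fraction_remaining(t_days, half_life_days):
    """Fraction of the drug still present after t_days."""
    return math.exp(-math.log(2) * t_days / half_life_days)


# A freely injected antibody with an assumed ~1-day half-life is mostly
# gone within days, while a depot releasing with an assumed ~7-day
# half-life still holds most of its payload at the same point:
print(round(fraction_remaining(3, 1), 3))  # free drug after 3 days: 0.125
print(round(fraction_remaining(3, 7), 2))  # depot after 3 days: 0.74
```

Stretching the release half-life is what turns a dose that lasts days into one that keeps working for weeks, which in turn is why fewer injections can achieve a similar therapeutic effect.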

"Drawing from our experience in materials innovation for electronics technology, we are now applying these techniques to the quest for improved health," said Dr. James Hedrick, Advanced Organic Materials Scientist, IBM Research Almaden. "This hydrogel can help deliver drugs over an extended period of time without causing a significant immune response, effectively sending its contents directly to the tumor without harming healthy surrounding cells."


In animal studies conducted by Singapore's IBN, the antibody produced better results when paired with and delivered by the hydrogel, even at low concentration, than when delivered on its own.
  • Tumor Size: Over the course of 28 days, the tumor shrank by 77% when the drug was delivered with the hydrogel via subcutaneous injection at the tumor site, compared with 0% for the drug alone given intravenously.
  • Treatment Frequency: When paired with the hydrogel and injected subcutaneously at a site far from the tumor, the number of treatments needed fell from four to one while maintaining a similar therapeutic effect, compared with the plain antibody solution injected intravenously.
  • Weight: Delivering the drug directly at the tumor site meant that only cancerous cells were eradicated, leaving healthy cells alone. This resulted in stable to moderate weight gain during the study instead of the severe weight loss traditionally associated with cancer drug treatments.
  • Non-Toxic: The hydrogel showed high biocompatibility, with no cellular inflammation and only a minimal immune response, while degrading naturally and passing through the body within 6 weeks.
"We have developed new, effective materials for nanomedicine, which has been one of IBN's key research focus areas since 2003. The sustained delivery of Herceptin from our hydrogel provides greater anti-tumor efficacy and reduces injection frequency. Thus, our approach may help to improve patient compliance, offering a better alternative to existing breast cancer treatments. This technology can also be used to deliver other types of antibodies or proteins to treat different diseases," said Dr. Yi Yan Yang, Group Leader, Nanomedicine, Institute of Bioengineering and Nanotechnology, Singapore.

The IBM nanomedicine polymer program - which started in IBM's Research labs four years ago with the mission to improve human health - stems from decades of materials development traditionally used for semiconductor technologies. This advance will expand the scope of IBM and the Institute of Bioengineering and Nanotechnology's collaborative program, allowing scientists to simultaneously pursue multiple methods for creating materials to improve medicine and drug discovery. An industry and institute collaboration of this scale brings together the minds and resources of several leading scientific institutions to address the complex challenges in making practical nanomedicine solutions a reality.




Link: http://www.devicespace.com/news_story.aspx?full=1&StoryID=313988