Friday, December 6, 2013

IBM's Big Plans for Cloud Computing



Ambition is an impressive thing, particularly when a desire for world domination is combined with existential survival. 

Four heavyweight tech companies are translating that ambition into investments in their cloud computing services: IBM, Microsoft, Amazon and Google are all expected to spend more than $1 billion annually on their global networks in the coming years.

Even more important, however, is that all the companies are developing knowledge through their cloud services of how to run truly huge Internet-based computing systems — systems that may soon be nearly impossible for other companies to match. Any other company thinking of entering the business, such as China’s Tencent, will need to move fast or come up with something revolutionary.

IBM’s response? You ain’t seen nothing yet. 

In 2014, the company will make a series of announcements that will make all challengers shiver, according to Lance Crosby, chief executive of SoftLayer, a cloud computing company that IBM purchased earlier this year for $2 billion.

More than 100 products, like e-commerce and marketing tools, will be put inside the cloud as a comprehensive series of offerings for business, Mr. Crosby said. So will another 40 infrastructure services, like big data analysis and mobile applications development. 

“It will take Amazon 10 years to build all of this,” he said. “People will be creating businesses with this that we can only dream about.” 

Maybe. IBM already claims to lead in cloud computing, with $1 billion in revenue in the past quarter alone. That’s impressive, though the figure includes software sales that used to be attributed to a different category at the company, as well as revenue generated by companies IBM recently acquired, including SoftLayer.

On many other fronts, such as the number of machines it operates, the number of major companies running big parts of their business on IBM’s public cloud, and the new technology it appears to have built for cloud computing, IBM is arguably the laggard among the top four providers. As the SoftLayer purchase indicates, it has had to buy big for what the others have mostly grown internally.

What IBM does have, however, is a lot of money and resources it plans to throw at cloud computing. And given its experience in the early 1990s, when it faced a near-death experience after missing a major technology shift, the company may also have the stomach for a swift change.

The big push will begin in February, Mr. Crosby said, with a formal inauguration of its new cloud offerings by Virginia M. Rometty, IBM’s chief executive. 

IBM has also deployed 400 employees to OpenStack, an open source software project with more than 200 corporate members that competes with much of the proprietary cloud technology of Amazon, Microsoft and Google. This looks much like IBM’s involvement a decade ago in Linux, which helped that open source operating system win corporate hearts and minds.

In addition to the consolidation of online software and services, Mr. Crosby said, IBM is “absolutely” looking to sell its big mainframe computing capabilities as a cloud-based service. It also plans to draw on the insights it has gained from building and licensing technology used by Microsoft in the Xbox gaming console, and Google in its own network operations, he said, and will make more acquisitions for the cloud business.

“We make the processors in Google’s server racks,” he said. “We understand where gaming is going. Before I got here, I thought this was a big old tech company, too; I didn’t see all of the assets.”

It’s true that IBM is big. And it is also a tech company. And undeniably 102 years old, which makes it both a survivor and a creature of successful processes. Mr. Crosby has two bosses between him and Ms. Rometty, and numerous executive vice presidents above him who may agree on the eventual future but have their own views about the speed with which they’ll move there.


By: Quentin Hardy
Link: http://mobile.nytimes.com/blogs/bits/2013/12/04/ibms-big-plans-for-cloud-computing/

Tuesday, December 3, 2013

IBM storage GM: Flash impacts everything

Ambuj Goyal, a 31-year veteran of IBM, became the company's top storage executive in January of 2013 when he took over as general manager (GM) of IBM's System Storage & Networking business. Goyal's previous roles at IBM included GM of global development and manufacturing for the Systems and Technology Group, GM of IBM information management software, GM of workplace, portal and collaboration software, GM of solutions and strategy for software, vice president of services of software, and director of computer sciences. With his first year as the IBM storage GM winding down, we spoke with him about IBM's flash storage, how he sees the storage world changing, and where he sees it going in the near future.

What were your main goals for improving IBM storage when you took over as GM early this year?

Ambuj Goyal: There were a few things I wanted to change. We had a lot of products. We were doing a battle on speeds and feeds and capacity. What I tried to do was say, 'This is not a storage battle, this is a data battle.' Clients are looking at how they manage data. Their different workloads have different data management needs. When people think about data, the first thing they say is, 'Don't lose my data.' The second thing they say is, 'My data should be available to my workload.' And the third thing they say is, 'When I need capacity or performance, give it to me.' So we changed the focus to data management and that has rationalized the product portfolio.

Does that mean you have to get rid of products?

Goyal: When you get into speeds, feeds and capacity, everything feels like it is overlapping.
Let me give you three scenarios of data management. First is, I have business-critical data that I cannot survive without -- I cannot load my ledger, I cannot report earnings, whatever it is. I have to make sure the data is available and secure. The second scenario is, I have lots of data, let me understand the data and leverage data. Let me start quick and add value -- tell me what I want to keep and what I can throw away and what I can put into cost-effective scenarios and where I need real-time analytics. The third scenario is I have a clutter of data, I have lots of products, and I want to start a new project quickly to leverage all my existing data on depreciated capital.

In the first scenario, which is business-critical data, we lead with multi-copy data management. It's about how you access a created copy of the data [and] how fast you can create a copy. In the past it was called batch processing. Now it's called creating an analytics copy so you can do analytics associated with it. Our DS8000 family is useful for copy management of business-critical data.

In the world of start quick and add value, we have the world's most popular virtualization platform. Originally that was SVC (SAN Volume Controller) and we have significantly improved it over the last three or four years to create things like non-disruptive integration in a data center. So when you put it in it takes a short time for applications to run, and it gives you a huge amount of utilization improvement through data reduction. And this product line is now called Storwize.

In the third scenario, where you have lots of data, the cloud and big data, people are looking for amazing capacity and a grid-scale architecture. I want to make sure I have so many applications running, yet it should automatically self-adjust and provision itself so I don't have to get humans involved. But it still has things like encryption of critical data. That is the XIV family. The XIV family is now the most used in the OpenStack, big data, cloud and analytics scenarios.

Our strategy has shifted. We start with a workload, understand what the need for that particular data is, and then lead with a solution rather than speeds and feeds. Now many of our clients have shifted from saying, 'Give me the best dollars per gigabyte,' to saying, 'I want to buy the right data architecture.'

Our sales team has really been well-educated about that, and that's why we are taking competitive share now.

Taking share? I haven't seen any numbers that indicate that.

Goyal: I don't know about IDC or [Gartner] Dataquest; they will publish the numbers. But I look at competitive changes. We are starting to get into many clients where the normal answer was, 'We are standardized on something, show me your value before we will even consider you,' and we are starting to displace competitors.

In that sense we are starting to move forward. I'll give you an example. We just went into eBay, which is using our big data and cloud solutions for huge amounts of grid-scale data, and that's built on XIV.

Storage is more than 10 Gigabit Ethernet versus Fibre Channel versus 6 terabyte DASD (Direct Access Storage Device), or eMLC (Enterprise Multi-Level Cell) or SLC (Single-Level Cell) flash. Storage is about data management.

Where does flash fit in?

Goyal: Flash is impacting everything -- flash is not a product for us. Yes, we can sell a standalone product, but flash is leveraged behind Storwize and SVC, all-flash technology is leveraged in multi-copy data management scenarios with the DS8000, and flash is being leveraged in XIV for big data and cloud scenarios, as well.

We use flash to get the fastest ROI [return on investment] without any operational change in the data center. We just announced a DS8000 product line that is all-flash. There's a significant improvement in performance, a significant improvement for clients who want extremely consistent response time. An all-flash DS8000 is good because from an application perspective, the mainframe is using DS8000 and no software needs to change. You can get amazing response times and consistent response times with a reduction in floor space without changing a single line of software. If you roll in a new product with different APIs [application programming interfaces] and different management environments, then you will have to disrupt the data center and ROI will take a long time.  

What about the FlashSystem all-flash platform acquired from Texas Memory Systems?

Goyal: That goes into scenarios where clients say, 'I'm already doing things like data backup and replication and all the data loss prevention things in my software. All I want is the amazingly fast capability to access data.' In those scenarios we are seeing a huge interest in all-flash.

More than 1,000 organizations in six months have purchased FlashSystems, and we have exceeded 100 petabytes of flash.

What are the challenges your customers are seeing in backup?

Goyal: In backup, they want cheap and deep. For the data they want to access, it needs to be quick. They don't want to depend on a particular media. They say, 'Don't sell me tape, don't sell me flash, don't sell me a separate controller, I have a data management problem. I want to put away petabytes and petabytes of data, and then I want to tell you my recovery point and recovery time objectives, and then you decide the media and give me the cost associated with it.'

There are scenarios where we used to say the answer is a virtual tape library for backup if you only want disk, or real tape, or some other backup software scenario. Now we say [that] through a mechanism we are calling long-term file system, we can use any combination of tape, flash and disk. You tell us the capacity you need, and we will give you the lowest cost per gigabyte of storage and the best performance based on recovery point and recovery time objectives.
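The selection logic Goyal describes (state your recovery objectives, let the vendor pick the media) can be sketched in a few lines of Python. The `choose_media` function and its thresholds below are purely illustrative assumptions, not IBM's actual policy:

```python
def choose_media(rto_hours: float) -> str:
    """Pick a storage tier from a recovery time objective (RTO).
    Thresholds are made-up examples, not IBM's real rules."""
    if rto_hours <= 0.25:   # near-instant restore: flash
        return "flash"
    if rto_hours <= 8:      # same-day restore: disk
        return "disk"
    return "tape"           # cheap-and-deep archive: tape

# Tighter objectives map to faster (and costlier) media.
print(choose_media(0.1))   # flash
print(choose_media(72))    # tape
```

A real implementation would also weigh the recovery point objective and cost per gigabyte, but the shape of the decision, objectives in, media out, is the same.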

When storing data for archive -- think about for an audit or legal hold -- in those scenarios a combination of flash and tape works nicely. In a media asset management scenario where people need to play out movies, a combination of disk and tape is working nicely.

We don't want to have a separate media business associated with tape, disk or flash. I look at what problem I am trying to solve, and come up with the right software and media.

What is your cloud storage strategy?

Goyal: There are two ways to think about the cloud. One is being an arms supplier to people who are providing the cloud. Those can be MSPs [managed service providers], and there are lots of people building private clouds based on our XIV with a cloud option. The second way is through our SoftLayer offering.

Are you seeing any other storage trends in the market?

Goyal: Everything that we are doing is going open. Even when we do tape, the drives we put LTFS on can run on somebody else's libraries. Everything we are doing with respect to OpenStack, Cinder or Swift, we are not going to create proprietary APIs. Many clients say, 'We're stuck with proprietary APIs, and now our application is tied to the API associated with a proprietary vendor.' We want to be in a situation where we endorse open standards and win with execution.

So our strategy is to be open so people are not tied. Even with Storwize and SVC, you can put non-IBM storage behind it. Many of the flash vendors and our traditional competitors enable their storage behind our storage virtualization engine. I can change applications and change the storage associated with it because it's open. I'm trying to endorse everything open. Just like I said flash is pervasive behind every architecture we are doing, open is also becoming pervasive. eBay would never have bought our XIV product without OpenStack support.

We want to win with execution, not by controlling your data center.



By: Sonia Lelii & Dave Raffo
Link: http://searchstorage.techtarget.com/news/2240209881/IBM-storage-GM-Flash-impacts-everything?src=5186535

Indian companies get on to cloud to manage workforce

How leading Indian companies such as Bharti Airtel, HCL and Tata Motors are using the cloud to make smarter hires and increase productivity



The biggest asset of any organisation is its internal workforce, and keeping this group well trained and motivated is now more important than ever. According to a recent announcement by IBM, 70 per cent of CEOs cite human capital as the single biggest contributor to sustained economic value.

Marrying data with employees’ qualities

To enhance the value of their human capital and transform their workforces, leading Indian organisations such as Bharti Airtel Ltd, HCL and Tata Motors have turned to cloud-based social software. The aim of using IBM’s cloud-based social software is to combine industry-leading social business and analytics capabilities with human capital management offerings, giving organisations cloud-based capabilities to capture and analyse data shared by employees.

Bharti Airtel Limited, a leading global telecommunications company, was looking to implement evolved and proven methods to build organizational culture and employee engagement. By using Kenexa's employee engagement survey, which gives an accurate measure of employee opinions, Airtel is now able to understand the manageable factors that cause employees to be more productive, stay with the company longer and care more deeply about the work they do. As a result, leadership is able to use this feedback to change management strategy, align it with key cultural factors and ultimately improve business performance.

Right data to detect the right candidate!

Tata Motors on the other hand was looking for a service that would help them screen the right candidates for a particular position.
 
Tata Motors, a Fortune 500 company and leading automobile player, has an internal job posting practice called Opportunities++.

The company teamed up with IBM to study job fit within the organisation by using an online system that could also screen candidates for the right position. The goal was to select candidates on various parameters, such as the ability to provide great customer service. Once Tata pinpointed these characteristics, the company refocused its efforts on assessing and selecting the right people for open jobs.

“IBM is uniquely positioned to help organizations capture information, create insights and generate interactions that translate into real business value,” said Anmol Nautiyal, Director, Smarter Workforce, IBM India and South Asia. “By marrying powerful social and talent management offerings in the cloud, organisations can attract, develop and inspire their employees, which in the end help to ensure their growth and long-term business success.”

Major organizations in India usually hire once a year, picking up the best talent. However, with the country's growing youth population, the task gets more difficult every year. To solve this issue, HCL, a leading global technology and IT enterprise, was looking to deploy a cloud-based recruitment system that would allow it to improve recruiter productivity, cycle time, quality of hire and fulfilment rate while reducing the costs of recruitment.

HCL was also looking to improve customer satisfaction through a better candidate and recruiter experience. Working with IBM, HCL replaced its homegrown Smart Recruit candidate/requisition tracking system with a new Talent Acquisition solution that automates its entire recruitment process.

Recruiting, and then retaining, trained staff has been a major challenge for organizations. With technology and data entering the space, organizations are empowered to make wiser decisions.

IBM’s Smarter Workforce initiatives help businesses capture and understand data and use these insights to empower their talent, manage expertise and optimize people-centric processes. The initiative builds on IBM’s $1.3 billion acquisition of Kenexa in 2012, which brought talent management, recruitment, compensation, engagement, leadership and assessment offerings.




By: Saloni
Link: http://e27.co/indian-companies-get-on-to-cloud-to-manage-workforce/

Monday, December 2, 2013

Design: IBM's secret weapon in the coming SMACdown?

I’ve talked before about how Phil Gilbert – the former President and CTO of Lombardi Software who joined IBM when it bought his company – now has a role to develop a cross-company design practice in IBM. IBM Design is centred around a lab in Austin TX but with plans to spread wider. It’s the centre of excellence for IBM’s own take on Design Thinking, and is hoovering up design talent like you wouldn’t believe.

What I hadn’t realised – until I saw Gilbert present at last week’s Analyst Insights event – was the full extent to which IBM Design is really a kind of rediscovery of the company’s industrial design heritage.

Starting in IBM’s Software Group, Gilbert’s IBM Design group is applying IBM Design Thinking to existing and new products. This variant of Design Thinking – which itself might be roughly characterised as “an approach to design that’s focused first around the experience that the user has, rather than a product; and that takes an ‘outside-in’ approach to analysing problems and opportunities” – is being retooled in a way that enables IBM to scale the approach to very large teams and communities. The approach is being spread through IBM product teams through week-long intensive “designcamps”. The IBM Design group is focusing in particular on dramatically improving six areas of customer and user experience: find/install/setup, first use, everyday use, upgrade, API use, and maintenance.

Not content with spreading the religion amongst those building and improving products, IBM Design is also running distilled versions of its designcamps for senior executives in IBM’s Software Group. And Gilbert makes no secret of the fact that the ambition is to take IBM Design Thinking beyond the Software Group to other groups in IBM.

Well, this is all very nice. But so what?

The first part of why this is so important to IBM: one of the very legitimate ways its competitors have scored points against IBM in recent years is to highlight how complicated its technologies are to navigate, implement and use. As Gilbert himself says: “too many users are working for our products; we want to turn this around.” From what I’ve seen of IBM Design’s work, it has already started to have a pretty radical impact on the intuitiveness and look and feel of some of IBM’s products.

The second part is more forward-looking. It relates to the ways in which technology vendors large and small are currently investigating and investing in Social, Mobile, Analytics and Cloud (“SMAC”) technologies and platforms for their customers, to augment the infrastructure platforms they already have and help to deliver on a Digital Enterprise vision. Every vendor has to have a story about how SMAC technologies affect them and how they’re taking advantage. This is all well and good; but the truth is that there’s a very real danger for enterprises as they embark on explorations of the new platforms being assembled for them.

The danger is that enterprises will slide into platform investments that bring *many* more moving parts and more integration points; and at the same time more control-point tussles between vendors with each trying to make sure that their own social front-end, or application development/design repository, or device management toolset, or whatever becomes the ‘master’ in the customer’s environment. Make no mistake, this will happen. Just as it always has when new business technology platforms have emerged.

This is where I think the power of IBM Design has the potential, possibly, to strengthen IBM’s strategic position. Note that IBM Design’s mission is to “design an IBM that works together, works the same, and ‘works for me’”.

It’s that last part that really resonates in today’s environment, I think. In Phil Gilbert’s own words:  “The only platforms that matter are the platforms in our customers’ organisations.” If this turns out to be more than words, then it will be very powerful indeed.

Of course, the proof of the pudding is in the eating. IBM Design is off to a great start but it’ll be another year at least before we can say for certain the impact that IBM Design Thinking is having on IBM’s business and its customers’ businesses.



By: Neil Ward-Dutton

IBM supercharges Power servers with graphics chips

IBM will support Nvidia GPUs in its Power servers, mainframes, and supercomputers starting next year.

IBM achieved a computing breakthrough when the Watson supercomputer outperformed humans on the game show "Jeopardy," but the company now wants to supercharge its high-end Power servers by tapping into graphics processors for the first time.
 
Starting next year, IBM will use Nvidia's Tesla graphics chips in servers with Power chips, which have been used in Watson and supercomputers like Sequoia, IBM said on Monday.

Nvidia's Tesla graphics processors have been used alongside CPUs in some of the world's fastest supercomputers to accelerate technical computing. The addition of GPUs to Power servers would be new; previous servers were boosted by vector co-processors, FPGAs (field-programmable gate arrays) and other circuitry.

In addition to supercomputers, the combination of Power processors and Nvidia GPUs could speed up mainframes used for critical tasks like financial transaction processing.

The addition of Tesla to IBM's Power CPUs in servers will help customers process and analyze data faster, said Sean Tetpon, an IBM spokesman, in an email. IBM plans to deploy Power-based rack servers with Nvidia's GPUs as early as 2014, Tetpon said.

The first servers could combine Tesla with IBM's upcoming 12-core Power8 chip, which will ship next year. IBM claims the Power8 chip is up to three times faster than the Power7, which was released in 2010 and is used in the Watson supercomputer.

In August, IBM unexpectedly announced that it would open up its Power8 architecture and start licensing intellectual property to third parties looking to build Power servers or components. IBM also established the OpenPower Consortium, whose members include Nvidia, Google, Tyan and Mellanox. Tyan will be the first company outside IBM to build a Power server.

IBM is also making it easier to plug co-processors like GPUs into Power8 servers. It is providing a connector called CAPI (Coherent Accelerator Processor Interface) to which third-party component makers can attach graphics cards, storage devices, field-programmable gate arrays, networking equipment or other hardware.

IBM already uses Nvidia GPUs in its System x servers, which use Intel's x86 processors. The Power8 server architecture is built around the PCI-Express 3.0 data-transfer standard, which is already used for GPUs in PCs and x86 servers.

IBM is also adding native support for Nvidia GPUs to its version of the Java Development Kit (JDK).

A lot of applications in distributed computing environments are written using Java, and Nvidia's GPUs will be able to process more mainstream applications in environments like Hadoop, said Sumit Gupta, general manager of Tesla Accelerated Computing products at Nvidia.

"This expands us into the broader general enterprise market," Gupta said.

Currently, Nvidia GPUs are used mostly for scientific and math applications in supercomputers. IBM will support CUDA, Nvidia's proprietary parallel programming toolkit, in which code can be written for parallel execution across graphics processors.

IBM is "exploring many application areas" across its software portfolio that could be off-loaded to graphics processors, IBM's Tetpon said.

"Any existing or new compute applications that are developed with the NVIDIA CUDA programming model will be supported," Tetpon said.



By: Agam Shah
Link: http://podcasts.infoworld.com/d/computer-hardware/ibm-supercharges-power-servers-graphics-chips-231064

Graph500: IBM is #1 for Big Data Supercomputing

IBM has taken eight of the top 10 places on the latest Graph500 list, securing a huge share of the market for the world’s best computers for processing big data. Big data is critical to IBM’s current strategy, and the Graph500 list ranks supercomputers by their ability to process huge amounts of data. The top three positions went to Lawrence Livermore National Laboratory’s Sequoia, Argonne National Laboratory’s Mira and Forschungszentrum Juelich’s (FZJ) JUQUEEN, all of which use IBM Blue Gene/Q systems.


Blue Gene supercomputers have ranked #1 on the Graph500 list since 2010, with Sequoia topping the list three consecutive times since 2012. IBM was also the top vendor on the most recent list, with 35 entries out of 160. Competitor Dell featured 12 times, and Fujitsu seven.

The Graph500 was established in 2010 by a group of 50 international HPC industry professionals, academics, experts and national laboratory staff. The benchmark addresses five key industries: cybersecurity, medical informatics, data enrichment, social networks, and symbolic networks. All of these industries process and analyze large amounts of data, which is why the Graph500 focuses on graph-based data problems, a foundation of most analytics work, and on the ability of systems to process and solve complex problems.

The name also comes from the graph-type problems and algorithms at the core of many analytics workloads, such as those for data enrichment. According to LLNL, a graph is made up of interconnected sets of data with edges and vertices, which in a social media analogy might resemble a graphic image of Facebook, with each vertex representing a user and edges the connections between users. The Graph500 ranking is compiled using a massive data set test: the speed with which a supercomputer, starting at one vertex, can discover all other vertices determines its ranking.
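That vertex-discovery test is a breadth-first search (BFS). A minimal Python sketch of the kernel the benchmark times (heavily simplified; the real benchmark generates enormous synthetic graphs and reports traversed edges per second, or TEPS):

```python
from collections import deque

def bfs_order(adj, start):
    """Breadth-first search: visit every vertex reachable from start.
    Graph500's core benchmark times exactly this kind of traversal."""
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in adj.get(v, []):
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return order

# Tiny "social graph": vertices are users, edges are connections.
graph = {0: [1, 2], 1: [3], 2: [3], 3: []}
assert bfs_order(graph, 0) == [0, 1, 2, 3]
```

Unlike the dense floating-point math behind the Top500's Linpack benchmark, this workload is dominated by irregular memory access, which is why the two lists rank machines differently.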

The rankings are geared toward enormous graph-based data problems, a core part of most analytics workloads. Big data problems currently represent a huge $270 billion market and are increasingly important for data-driven tech businesses such as Google, Facebook and Twitter. While the definition of what actually constitutes ‘big data’ continues to evolve rapidly, businesses and startups need to understand and unlock additional value from the data that is most relevant to them, no matter the size.



By: Hayden Richards

Friday, November 29, 2013

A boy and his atom: the world's smallest movie

Technology rules our world. And to continue pushing boundaries, we need future generations to embrace technology's foundation — science. This pursuit, to get today's youth to admire scientists the way they admire athletes and actors, became IBM's mission. So they asked us: How do we spread the word about science?

Our solution was to make the world's smallest movie. Each frame is made of hundreds of atoms (yes, real atoms), moved to their exact placements by the scientists at IBM Research — Almaden. The frames are combined into an animation, which is now the Guinness World Records™ holder for World's Smallest Stop-Motion Film. The scientists themselves made the movie, along with promotional materials like movie posters (all also made with atoms).

And buzz was generated by word of mouth — we premiered the film live in a Brooklyn science class before it went live to the world as part of the Tribeca Online Film Festival, then promoted it on social media networks with bonus content like additional video shorts and infographics. Proof that when you create something unique, it gets people talking.

The audience was drawn in by paid media like banners, print ads, and online video pre-roll, and by word-of-mouth social media initiatives and press outlets like Wired, Buzzfeed, and Mashable. All assets were compelling to the audience because they teased the most impressive aspect of our content piece: it was made with atoms.

IBM loved this simple, shareable way to spread the word about science and data storage, and the film was accepted into the Tribeca Online Film Festival and shown at the New York Tech Meet-up and the World Science Festival. The film, now in the top 1% of most-watched YouTube videos, surpassed 1 million views in 24 hours and 2 million views in 48 hours, with more than 27,000 likes. It was trending on Twitter on its release day and totaled more than 21,000 social media mentions with 96% positive sentiment, increasing IBM social mentions by 137%. As of submission, the film had garnered 2.4 million news impressions (not including television coverage) and 23.6 million impressions overall, effectively reaching all targets — the science, tech, film, education and entertainment communities — with a strong global reach (33% of online activity from Europe, 10% from China and 8% from India, Australia and Japan).



IBM's supercomputer Watson is now your best shopping companion



It’s been more than two years since IBM’s Watson made its hit TV debut on Jeopardy! Now Watson powers the latest “cognitive, expert personal shopper,” developed by digital shopping company Fluid, an early stage partner in the IBM Watson Developers Cloud.

Their Watson-powered app, the Fluid Expert Personal Shopper, marries Watson’s cognitive computing with dialogue-driven assistance, product recommendations and content, letting consumers make queries using Watson’s natural language processing (NLP).

But, “Watson’s natural language capabilities are not what make it special,” Brooke Aguilar, VP of Global Business Development at Fluid, told brandchannel. “The big differentiator is that Watson is a learning machine and one that quickly learns from and adapts with each interaction. In the context of digital shopping this presents a tremendous opportunity for Fluid to give consumers highly engaging and rich shopping experiences that are personalized and become more so with continued use.”

The app incorporates consumer information to become smarter with each interaction and operates as a knowledgeable sales associate in the palm of your hand, fueling a new era of cognitive apps. 
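The “smarter with each interaction” behavior can be illustrated with a toy Python sketch (purely illustrative; Fluid’s and IBM’s actual systems are far more sophisticated): each click or query nudges a category-affinity profile that re-ranks the catalog on the next recommendation.

```python
from collections import Counter

class ToyPersonalShopper:
    """Toy recommender: re-ranks catalog items by learned category affinity."""
    def __init__(self, catalog):
        self.catalog = catalog          # list of (item, category) pairs
        self.affinity = Counter()       # category -> engagement count

    def record_interaction(self, category):
        """Each click or query nudges the shopper's profile."""
        self.affinity[category] += 1

    def recommend(self, n=3):
        # Items from frequently engaged categories float to the top.
        ranked = sorted(self.catalog,
                        key=lambda ic: self.affinity[ic[1]],
                        reverse=True)
        return [item for item, _ in ranked[:n]]

catalog = [("tent", "camping"), ("parka", "apparel"),
           ("stove", "camping"), ("sneakers", "apparel")]
shopper = ToyPersonalShopper(catalog)
shopper.record_interaction("camping")
shopper.record_interaction("camping")
print(shopper.recommend(2))  # → ['tent', 'stove']
```

After two camping-related interactions, camping gear outranks apparel; the profile keeps sharpening with continued use, which is the core idea behind the app.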

IBM is sharing its technology with the global entrepreneurial community to help build the next generation of apps. “The move aims to spur innovation and fuel a new ecosystem of entrepreneurial software application providers—ranging from start-ups and emerging, venture capital backed businesses to established players," according to a press release.
 
The North Face is the first brand to sign on to the Fluid Expert Personal Shopper. Say you’re on TheNorthFace.com, planning a two-week camping trip to Wyoming’s Wind River mountain range in May. You could ask, “What gear and food will I need?” and learn what you’ll want to know for a safer, more comfortable experience, such as the fact that it often snows in the Wind Rivers in springtime.

Since that now famous public debut in 2011, Watson has seen a 240 percent improvement in system performance and a 75 percent reduction in the physical resources needed to run it; the system can now operate from a single Power 750 server running Linux in a cloud computing environment. 

The biggest challenge to the Fluid team in building the app was adapting “the Watson system and team to accommodate an entirely new vertical market focus,” said Aguilar. “Previously, the majority of effort commercializing Watson technology has been in healthcare. Adapting Watson to direct consumer access in a retail context requires new thinking and creative approaches.”

The future of apps like Watson in retail, Aguilar added, is to “serve the rich, relevant content so consumers can make smart, satisfying purchases in a totally natural context that begins to break down the consumer/computer barrier. It puts the power in the hands of the consumer like never before, and it gets better every time.”



By: Sheila Shayon
Link: http://www.brandchannel.com/home/post/2013/11/28/IBM-Watson-Fluid-App-112813.aspx

mercredi 27 novembre 2013

Airtel, HCL, Tata Motors deploy IBM’s cloud software

IBM said that Bharti Airtel, HCL and Tata Motors are using its cloud-based talent management software to improve productivity. 

Financial details of IBM’s arrangements with the three Indian companies were not disclosed. 

In partnership with IBM, HCL has replaced its home grown candidate tracking system with a new solution that automates the recruitment process, according to a press statement. 

Tata Motors has teamed up with IBM to study ‘job fit’ within the organisation by using an online system that can screen candidates for the right position. 

Bharti Airtel is using IBM’s Kenexa survey to understand factors driving employees to be more productive and stay with the company for a longer period of time. 

About 70 per cent of Chief Executive Officers cite human capital as the single biggest contributor to sustained economic value, according to a recent global study by IBM. 



lundi 25 novembre 2013

IBM's Big Bets: Cognitive and Analytics

IBM boasts an extensive portfolio of products and services to help organizations develop analytics solutions to gain business advantage and to improve the human condition.

At IBM Information on Demand (“IOD”) in Las Vegas during the first week of November, IBM SVP and Group Executive Steve Mills and other IBM executives made 30 announcements covering new and updated products and services that augment the IBM portfolio.

IBM also spent some time at IOD educating attendees about the commercialization of its other big bet, cognitive computing, perhaps an even more strategic bet than analytics, operating under the now famous IBM Watson moniker. IBM, however, held the next major step, opening the Watson ecosystem to developers, in abeyance until a full week after the end of IOD. 

IBM’s two big bets, cognitive and analytics, are situated at entirely different points of their respective lifecycles. Analytics, and related information management solutions, are commercially here and now. Cognitive computing has only recently hatched from the research side of R&D, and has nearly 100% of its commercial life still ahead of it. How do analytics and cognitive fit together into the larger IBM strategy to continuously help customers apply technology to the benefit of business and people?

Analytics

Despite the flurry of announcements, one key message came through loud and clear at IOD: IBM believes that analytics is a primary game-changer for businesses and that customers should bet big on analytics, preferably by buying, and obtaining help applying, IBM’s analytics and information management offerings. To IBM’s credit, its analytics consultants receive no special compensation for selling IBM’s products; their primary objective is customer success. IBM even puts its money where its mouth is by entertaining value-based pricing when applicable.

The IOD keynotes stirred attendee passion for analytics through a mix of IBM executive, customer and partner presentations and interviews. Mr. Mills aptly drove the point home at a news briefing, stating that we are entering an “era of decision-making excellence” and that we are just “… at the beginning of the revolution…” Despite IBM presenting a large number of analytics customer case studies, “You ain’t seen nothing yet,” according to Mr. Mills. He cited decreasing hardware costs as a contributing factor in the rise of analytics and predicted that businesses would eventually spend more on analysis and prediction than on process automation.

Mr. Mills had better be correct about the demand cycle, for IBM has placed a giant wager on analytics. IBM employs over 9,000 business analytics consultants and has helped customers implement over 3,000 big data deployments to date. Though IBM has experienced five consecutive declines in year-over-year quarterly revenue, business analytics has grown 8% year over year through the first three quarters of FY13. IBM has made, and will continue to make, organic R&D investments “measured in the billions.” Dating back to and including the 2009 purchase of SPSS, IBM has made 43 acquisitions over the past 40 months, with more than half fitting into either the analytics or the related information management space.

IBM hinted that it is in no way ready to rest on its laurels, and customers should expect ongoing innovation in both analytics and related information management areas such as databases, business process automation, integration, data governance and content management. Some complain that IBM’s portfolio is difficult to navigate because there are so many options and sub-brands. IBM exhibited awareness of the issue, and customers may see it realign and simplify the portfolio during 2014.

If the realignment and the considerable evangelism on display at IOD (which will be called IBM Insight in 2014) make it simpler for customers to grasp the benefits and move toward solutions with IBM more quickly, IBM will happily continue to make such investments. Analytics in 2013, 2014 and probably for several years thereafter will be looked to by IBM’s executives and shareholders as a linchpin for moving IBM revenues in a more northerly direction.

Cognitive

If analytics and all that it entails carries, and will carry, an important revenue load for IBM in the near and medium term, cognitive computing supplies the air cover that keeps customers coming back to IBM for innovation-based transformation. Though IBM Watson is still in its technology transfer phase, make no mistake about it: IBM Watson is already attracting and signing up developers, and is already being used to help customers, albeit on a limited basis.

The most fascinating angle of IBM’s recent Watson announcement is the notion of “cognitive applications.” IBM’s new ecosystem program for Watson aims to recruit entrepreneurial ISVs, putting IBM in the pole position with a new platform for the first time in a generation. With a few exceptions, IBM has largely left the packaged enterprise application market to the likes of SAP, Oracle, Microsoft, Infor and Salesforce.com, choosing instead to win at the edge of those solutions, where services, infrastructure, customization and hand-holding are required. That may very well change with Watson.

Perhaps whetted by the value-added nature of analytics, and by some of the industry-specific applications where IBM has succeeded, the notion of cognitive computing as an entrée into the next-generation enterprise application space portends an IT competitive overhaul. Given that the developer, VC and tech entrepreneurial community has been racing toward SMAC – social, mobile, analytics, cloud – for several years now, some have begun to whisper, “But what’s next?” IBM’s greatest challenge with Watson over the coming years may be managing the explosion of interest from developers. Being a “platform” vendor is the right problem to have.

EMA Perspective

Virtually every Global 2000 company has implemented ERP, CRM, and SCM solutions. While all of these core solutions are experiencing a refresh cycle due to SMAC, and in doing so delivering improved business effectiveness, another slice of attention has gone towards derivative applications, including:
  • Analytic apps offer insight-driven business solutions that leverage existing data. Visit EMA’s Business Intelligence and Data Warehousing research for more detailed coverage.
  • By wrapping APIs around data integration, data governance, and business processes, organizations may now develop a fresh set of integrative applications. Just as big data opens the door to better insight, evolution in the integration space allows for a refresh in business process optimization. On a visionary basis this has to do with harnessing IoT, or the Internet of Things, but on a practical basis the integrative approach will help enterprises harness YoT – Your own Things. Many of IBM's comprehensive offerings on this front were evident at IOD.
  • Cognitive applications, which also tap largely into existing informational and process assets, promise an entirely rethought approach to business model re-engineering.
Investors worry about IBM’s revenue growth, and share prices have reflected that concern recently. Industry analysts, however, enjoy the freedom to look many years ahead when assessing vendors’ probabilities of success. Between analytics, integration and now, particularly, cognitive computing, over the long run IBM is positioned as well as any enterprise IT solution supplier.




By: Evan Quinn
Link: http://blogs.enterprisemanagement.com/blog/ibms-big-bets-cognitive-analytics/

IBM introduces Watson to the public sector cloud

IBM has strengthened its cloud computing offerings with a new high-end service that takes advantage of its famed Jeopardy!-winning Watson technology. 

The service, dubbed IBM Watson Developers Cloud, could help agencies by applying Watson’s cognitive computing intelligence to the federal government’s big data problems, from fraud analysis to intelligence surveillance and sensor-gathered data.

IBM is making its Watson technology available as a development platform in the cloud in the hopes of prompting third-party developers to create new applications that take advantage of its ability to learn from its interactions with data and reprogram itself for better results. IBM is providing a developer toolkit, educational materials and access to Watson’s application programming interface. 

Developers that build Watson-powered apps in the cloud can use their organization’s data or they can access the IBM Watson Content Store, which features third-party data. Additionally, IBM has committed 500 subject matter experts to the IBM Watson Developers Cloud effort. 

IBM’s high-profile Watson technology is leading the way towards a new era of cognitive computing systems. In September, Frost & Sullivan recognized IBM Watson Solutions with the 2013 North America New Product Innovation award, which is given to a company with an innovative product that leverages leading-edge technologies and produces value-added features and benefits for customers.

"The IBM Watson Engagement Advisor technology can listen to and respond to a series of follow-up questions and remember the previous questions that were posed," said Frost & Sullivan analyst Stephen Loynd. "In other words, IBM Watson combines technologies that allow the Engagement Advisor to understand natural language and human communication, generate and evaluate evidence-based hypothesis, as well as adapt and learn from user selections."

The announcement of IBM Watson Developers Cloud comes just weeks after IBM conceded its legal battle with Amazon Web Services over a 10-year, $600 million cloud computing services contract with the CIA. Analysts said it doesn’t make sense for IBM to compete head-to-head against Amazon on federal deals aimed at the lowest possible price.

"From a pure analytics standpoint, Watson is a great platform," said Shawn McCarthy, research director for IDC Government Insights. "So easing the way that people can have access to what it does is a good thing, and it allows IBM to focus on something it does really well as opposed to playing the commodity game, which is tough." 

The initial target market for IBM Watson Developers Cloud is the private sector, with IBM touting third-party applications in such areas as retail and health care. But analysts say the offering will impact big data problems in the public sector, too. McCarthy sees potential for Watson-powered apps in such areas as fraud analysis, which the White House is ramping up due to worries about scammers taking advantage of consumers signing up for its new health care plans. 

"Fraud issues could be huge. That could be anything from tax issues at the state and local level, to unemployment or other benefits — anything that people can dream up for fraud," McCarthy says. "A good analytics solution can help unravel where A and B don’t exactly line up."

Another possible application of the IBM Watson Developers Cloud is entity analytics, which is used by the Department of Homeland Security to find patterns in data by looking for commonalities about entities, whether they are people, phone numbers or license plates. 

"A good example of entity analytics is when a credit card is used here and goes to this address, and the address is suspicious because a phone call made from that address was used to contact a criminal or terrorist network," McCarthy explains. "Entity analytics is about comparing many sources of data and looking for commonalities and patterns. What Watson has is the ability to learn from the data flowing through it, so it could learn that this address is associated with this group of friends."
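McCarthy’s example can be illustrated with a toy Python sketch (not DHS’s or IBM’s actual system): records from different data sources are linked into one entity whenever they share an attribute value, such as an address.

```python
from collections import defaultdict

def link_entities(records):
    """Toy entity analytics: each record is a dict of attribute -> value;
    records that share any value are merged into one entity cluster."""
    by_value = defaultdict(set)
    for i, rec in enumerate(records):
        for value in rec.values():
            by_value[value].add(i)
    # Union-find merge of records connected through shared values.
    parent = list(range(len(records)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for linked in by_value.values():
        linked = sorted(linked)
        for other in linked[1:]:
            parent[find(other)] = find(linked[0])
    clusters = defaultdict(set)
    for i in range(len(records)):
        clusters[find(i)].add(i)
    return sorted(sorted(c) for c in clusters.values())

records = [
    {"card": "card-A", "address": "12 Elm St"},   # placeholder values
    {"phone": "555-0199", "address": "12 Elm St"},  # same address -> linked
    {"phone": "555-0123", "address": "9 Oak Ave"},  # unrelated
]
print(link_entities(records))  # → [[0, 1], [2]]
```

Records 0 and 1 share an address, so they collapse into one entity; a learning system layered on top could then weight which shared attributes are actually suspicious.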



By: Carolyn Duffy Marsan

vendredi 22 novembre 2013

IBM Joins Fight Against Cancer in Developing Nations

Armed with big data and analytics, IBM enlists in the fight against cancer in developing nations by building a cancer registry.

IBM is working with the Union for International Cancer Control (UICC) to create a massive clinical data set on cancer patients by building cancer registries in developing nations.

Company officials said IBM's efforts will start where they are most needed, in Sub-Saharan Africa—where less than 1 percent of the region's population is covered by a cancer registry. With more than a billion people in the region, the new effort will improve cancer registration and, in time, treatment for patients in Africa while enriching knowledge about cancer for patients all over the world.

IBM said this data set could become the world's largest and most comprehensive clinical data set on cancer patients, and the company is donating its big data and analytics technology to the cause.

Cancer registries provide governments with incidence and mortality data so effective policies for cancer control can be developed. They also provide clinicians with information about patient outcomes to help identify tailored treatment options. Reliable and comprehensive data leads to the most effective interventions for saving lives, IBM said.

Gary Cohen, chairman of IBM Africa, announced IBM's donation of big data and analytics technology at the World Cancer Leaders' Summit in Cape Town, South Africa. "IBM's objective is to help find ways to level the field of access through innovation and knowledge, so that we can bridge the divide between the discovery of cancer and the delivery of treatment with positive outcomes—regardless of geography," he said, in a statement.

IBM officials said the initiative will begin in two to three countries in Sub-Saharan Africa, continue throughout the region, and extend to Southeast Asia and Latin America. The IBM collaboration supports UICC's work with the Global Initiative for Cancer Registries (GICR) in low- and middle-income countries. According to the World Health Organization, about 70 percent of all cancer deaths occur in developing nations. Experts predict that the Sub-Saharan region alone will see more than an 85 percent increase in its cancer burden by 2030.

"With IBM's expertise in big data and analytics, I can imagine a world in which the very latest scientifically proven means of detecting and treating cancer is available in all countries, benefitting patients wherever they are in the world," said Cary Adams, CEO of UICC, in a statement. "This information will provide unique and compelling insights on cancer, the likes of which we have not seen before."

According to the World Health Organization, more than 12 million people worldwide will be diagnosed with cancer this year, and approximately 8 million will die. Yet, Adams notes that this number is drawn from a database that is increasingly weak as the cancer burden moves, as predicted, from developed to developing countries. "Much of the world is tracking a growing burden of cancer with very incomplete information," he said. "Improving the collection of data is critical to our ability to address cancer around the world."

In many countries in the region, data about the incidence of cancer is still collected through a paper-based system, which can consume hours to gather information for a single patient. By contrast, all of the United States and Canada, 94 percent of Western Europe and 80 percent of Australia are covered by a cancer registry, according to leaders of the GICR initiative.

"IBM has always contributed its best assets and thinking to the world's biggest challenges, and there are few more serious than cancer," Dan Pelino, general manager of IBM Global Public Sector, said in a statement. "By helping UICC build cancer registries, we can shorten the time between discovery and treatment to save lives."

IBM joined UICC in 2012 to help the organization address the increasing data collection and analysis needs of the cancer community. IBM awarded an initial consulting grant that determined the business and technology plans required to build cancer registries. The next steps for IBM will be collaborating with the UICC and its GICR partners to plan and design the cancer registry in Sub-Saharan Africa, including the services, hardware, software, technical support and expertise to support the plan.

"Improved cancer registry data will reveal the population-based trends that are so important in shaping and adapting a cancer strategy," Dr. Isaac Adewohle, a gynecologist in Nigeria and president of the African Organization for the Research & Training in Cancer, said in a statement. "This will inform my daily practice in ways that my hospital data alone cannot." 

IBM has a history of teaming up with clinicians, researchers and public health organizations to help fight cancer through big data, cloud, analytics and other technologies. For example, IBM's Watson cognitive computing technology is advancing evidence-based treatment and research with Memorial Sloan-Kettering Cancer Center and MD Anderson Cancer Center.

IBM Research recently developed a microfluidic probe with a Swiss hospital to enhance cancer diagnosis, and nanotechnology to improve the treatment of breast cancer with the Institute of Bioengineering and Nanotechnology. IBM's World Community Grid provides free computational power to speed up cancer research as part of the Help Conquer Cancer project. And in collaboration with the Kenyan government, IBM has developed a plan to promote cervical cancer screening.





By: Darryl K. Taft
Link: http://www.eweek.com/database/ibm-joins-fight-against-cancer-in-developing-nations.html

Computers that emulate the brain may be the future, IBM says

You can date the first modern era of computing, in which massive mainframes like ENIAC were put to work on math and business problems too complex for the simple counting machines that came before, to a series of talks about computer science in the late 1940s.

Likewise, you can mark the moment technology started to move away from those days of Big Iron toward the era of the personal computer as Dec. 9, 1968, when Douglas Engelbart introduced the computer mouse, word processing, hypertext and video conferencing at an event in San Francisco dubbed “The Mother of All Demos.”

On Nov. 19, IBM held what it hopes will be another such watershed conference at its Almaden Research Center in San Jose — a colloquium on emerging computing technologies modeled on how the human mind works. The talks, entitled “Cognitive Systems: The New Era of Computing,” may well usher in a new era.

“What we think of this event as is a kind of open parenthesis on the cognitive computing era,” said Michael Karasick, IBM VP and head of the Almaden Research Center. “We don’t necessarily know where it’s going, but we want to get people thinking about these technologies and what’s now becoming possible.”

Cognitive computing is a branch of computer science that seeks to create computers that process data in ways that are more similar to how an organic brain processes data. It’s more of an umbrella term than a specific technology, touching on topics like machine learning, artificial intelligence, and computational creativity.

Broadly speaking, these systems are better than traditional computing at the things that organic brains excel at. Chief among those things is that they can learn, enabling them to figure out how to perform tasks that are far too complicated for a human developer to model on their own, like language processing or image recognition.
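A minimal example of “learning from examples rather than hand-written rules” is the classic perceptron. The toy Python sketch below (a textbook illustration, not IBM’s cognitive technology) learns the logical AND function purely from labeled samples.

```python
def train_perceptron(samples, epochs=10):
    """Minimal online learner: a perceptron that improves with each
    example it sees, instead of following hand-written rules."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
            err = label - pred          # learn only from mistakes
            w[0] += err * x1
            w[1] += err * x2
            b += err
    return w, b

# Learn logical AND purely from labeled examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in data])  # → [0, 0, 0, 1]
```

No one told the program what AND means; it adjusted its weights from mistakes until its predictions matched the labels, which is the same principle, scaled up enormously, behind language processing and image recognition.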



By: Jon Xavier

jeudi 21 novembre 2013

IBM Ranked #1 on The Graph500 Supercomputing List

IBM supercomputers have taken the top three spots on the latest Graph500 list, released today during the Supercomputing Conference (SC'13) in Denver, Colorado. The biannual list ranks high-performance computing systems on their ability to process massive amounts of Big Data.




The top three positions have been awarded to Lawrence Livermore National Laboratory's Sequoia, Argonne National Laboratory's Mira and Forschungszentrum Juelich's (FZJ) JUQUEEN, which all use IBM Blue Gene/Q systems. Blue Gene supercomputers have ranked #1 on The Graph500 list since 2010 with Sequoia topping the list three consecutive times since 2012. IBM also was the top vendor on the most recent list, with 35 entries out of 160.

The Graph500 was established in 2010 by a group of 50 international HPC industry professionals, academics, experts and national laboratory staff. The benchmark targets five key industries: cybersecurity, medical informatics, data enrichment, social networks and symbolic networks. All of these industries process and analyze large amounts of data, which is why the Graph500 focuses on graph-based data problems, a foundation of most analytics work, and on the ability of systems to process and solve complex problems. The Graph500 was established as a complement to the TOP500 list, which ranks supercomputers on performance speed via the LINPACK benchmark.
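The Graph500’s core kernel is a breadth-first search over a very large generated graph, with the resulting parent tree validated for correctness. A minimal Python sketch of that kernel (the real benchmark runs on graphs with billions of edges):

```python
from collections import deque

def bfs_parents(adj, root):
    """Breadth-first search, the core Graph500 kernel: returns the parent
    of each reached vertex (the benchmark validates this parent tree)."""
    parent = {root: root}
    frontier = deque([root])
    while frontier:
        v = frontier.popleft()
        for w in adj.get(v, ()):
            if w not in parent:
                parent[w] = v
                frontier.append(w)
    return parent

# Tiny undirected graph given as an adjacency list.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(bfs_parents(adj, 0))  # → {0: 0, 1: 0, 2: 0, 3: 1}
```

The benchmark's TEPS metric (traversed edges per second) measures how fast a system can run exactly this kind of irregular, memory-bound traversal, rather than the dense floating-point arithmetic that LINPACK rewards.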



By: IBM News Releases
Link: http://www-03.ibm.com/press/us/en/pressrelease/42542.wss

IBM Uses RESTful APIs to Turn Watson into a Cloud Service

The combination of natural language processing and advanced text analytics is giving rise to a new class of cognitive applications that have the potential to radically transform the way entire industries operate. The most famous instance of a cognitive application is, of course, IBM Watson, the supercomputer that IBM built on top of Power processors to best the champions of the Jeopardy! quiz show.




Beyond playing games, however, the ability to easily query a knowledge base of expertise that gets smarter with each successive correct answer has the potential to put a massive amount of expertise directly into the hands of the average person. Now IBM is moving to put that power in the hands of developers with the launch of IBM Watson Developer Cloud. It includes a software development kit that will allow developers to build applications on top of IBM Power systems running in an IBM cloud, while exposing a set of RESTful APIs that make it possible to invoke those applications from within other applications. In addition, IBM plans to make third-party applications developed on this platform available to customers through a new IBM Watson Content Store.
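A hedged sketch of what invoking such a RESTful API might look like from Python: the host, path and payload shape below are placeholders for illustration, not IBM’s documented endpoints.

```python
import json
import urllib.request

def build_watson_request(question, token):
    """Compose (but do not send) a hypothetical RESTful Watson query.
    Endpoint and JSON shape are illustrative placeholders only."""
    payload = json.dumps({"question": {"questionText": question}}).encode()
    return urllib.request.Request(
        url="https://watson.example.com/v1/question",  # placeholder host
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + token,
        },
        method="POST",
    )

req = build_watson_request("What gear do I need for a spring trip?", "demo-token")
print(req.get_method(), req.full_url)
```

The point of the REST style is exactly what the article describes: any application that can issue an HTTP request like this one can embed a cognitive query inside its own workflow.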

The first three independent software vendors to participate in the program include Fluid, which builds on-line shopping applications for retailers; MD Buyline, a provider of supply-chain applications for hospitals; and Welltok, which built a social health application for end users. All three are scheduled to have applications available on the IBM Watson cloud in early 2014.

IBM is also partnering with providers of sources of data that can be consumed by Watson. They include Healthline, a provider of health information services; and Elance, an online marketplace for freelancers.

According to Rob High, a chief technology officer and IBM Fellow in the IBM Software Group, IBM plans to work with several venture capital firms to help promote the development of both cognitive computing applications and additional sources of data services that could be consumed by Watson.

Just like any Web application, cognitive applications will likely be most useful when they are invoked within the context of a business process. For example, a healthcare provider recording a patient’s symptoms could dynamically query a cognitive application running in the cloud to determine the best course of treatment to pursue. Similarly, a law firm would be able to make use of a cognitive application to determine what laws best apply to any given case.

Today, people spend countless hours researching information that an application supporting natural language processing could put at their fingertips instantly, or a voice command away. To enable that, IBM wants to build an ecosystem of Watson applications in the cloud that could be dynamically invoked by thousands of other applications, High says.

The challenge with creating cognitive applications is that right now most developers don’t have access to supercomputers based on IBM Power systems. By making Watson available as a service in the cloud, IBM is removing hardware as a barrier to the development of next-generation cognitive computing applications. To position Watson for that commercial success, High says, IBM has almost completely revamped the system that appeared on Jeopardy!

No doubt other ecosystems of cognitive applications will emerge in time. But for developers looking to create truly unique applications that have the potential to transform the economics of entire industries, IBM Watson in the cloud provides an opportunity to get started building these applications without having to spend millions of dollars just to find out whether they will work or not.



By: Michael Vizard
Link: http://blog.programmableweb.com/2013/11/19/ibm-uses-restful-apis-to-turn-watson-into-a-cloud-service/

mercredi 20 novembre 2013

IBM Helps Cities in Basque Region of Spain Build A Sustainable Community and Turn Data into Insight

Today during Smart City Expo World Congress, IBM announced that it is working with the towns of Irun and Hondarribia, Spain, on a new Smarter Cities project. Using data from trash containers that know how much is thrown away, smart street lights that report out when they need maintenance, and parking places that know when they are empty, IBM’s smarter cities technology is providing real-time insight to help make better decisions.

The Bajo Bidasoa area in the Basque region of Spain, with a population of 78,000, is leading the way in mining patterns in vast quantities of diverse data, using real-time data to make accurate predictions, and engaging citizens via social collaboration to make the area a better place to live and work. 

At the foundation of the project, IBM’s Intelligent Operations Center software provides real-time insight into all city operations. It also powers the Smart City Center, an integrated command center where data is analyzed and shared. For example, city leaders can see the correlation between water consumption and waste generation, monitor and predict the effect of bad weather on incidents within the area, or visualize the amount of resources used across water, waste management, transportation, energy and public works departments.  

Bold city leaders from the towns of Irun and Hondarribia set out to work together to improve sustainability, encourage more citizen participation and provide greater transparency. They did this by working with technology partners IBM, Servicios de Txingudi, the local water and waste water management and street cleaning agency, and Smartland Technologies, a group of six companies including IBM Business Partner BuntPlanet.  

“The possibility of analyzing large amounts of data through new technology opens up enormous possibilities for better public sector management,” said the mayor of Irun, Jose Antonio Santano. “We live in an era of global crisis and it is precisely at this time when we need to sharpen our ingenuity to better know how to apply talent and technology for the benefit of our citizens.” 

The region has also made numerous advancements to improve its water systems under the leadership of Servicios de Txingudi by installing 32,000 sensors that collect water consumption data in real time. Water leaks decreased by 70 percent, water supply pumping costs by 14 percent, and unnecessary water treatment by 40 percent as a result of the ability to see and manage water systems in real time. The area is also generating renewable, efficient energy by installing small hydro plants, generating electricity from biogas obtained from wastewater treatment, installing solar panels on water tanks, and building a combined heat and power (CHP) facility that allows the water treatment plant to be energy independent when necessary. 
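The value of real-time sensor data for leak detection can be sketched with a toy water-balance check in Python (illustrative only, not the actual Txingudi system): when the volume pumped into a district persistently exceeds the volume metered out, something in between is leaking.

```python
def flag_leaks(readings, threshold=0.15):
    """Toy leak detector: flag intervals where pumped supply exceeds
    metered consumption by more than `threshold` (default 15%)."""
    flagged = []
    for hour, pumped, metered in readings:
        loss = (pumped - metered) / pumped   # fraction unaccounted for
        if loss > threshold:
            flagged.append((hour, round(loss, 2)))
    return flagged

readings = [
    ("01:00", 100.0, 96.0),   # 4% loss: normal metering slack
    ("02:00", 100.0, 70.0),   # 30% loss: probable leak
    ("03:00", 120.0, 111.0),  # 7.5% loss: normal
]
print(flag_leaks(readings))  # → [('02:00', 0.3)]
```

With 32,000 sensors reporting continuously, even a simple balance rule like this narrows a leak down to a district and an hour instead of waiting for water to surface in the street.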

To improve waste management and encourage more citizen participation in recycling, more than 750 compost bins were distributed to citizens; meanwhile, volunteers are placing RFID tags on trash, allowing waste generation to be measured more accurately and providing better insight into which social or environmental conditions create more waste and how to prevent it. Citizens are also turning to smartphones to communicate with city leaders. Incidents such as a fallen tree, a traffic accident or a pothole can be reported and shared, including a photo and geographical information. The data is collected by the Smart City Center, where issues are resolved and tracked, and citizens can check the status of each.  

“We understand the importance of working with leading technology companies like IBM in order to meet citizen needs and respond to their problems,” said the mayor of Hondarribia, Aitor Kerejeta. “But we are also very proud to have worked with local businesses, most from the Spanish region of Guipuzcoa, with entrepreneurs that promote the economy and jobs that are close to us.” 

The community’s efforts to become smarter and more sustainable have also spurred economic development: local entrepreneurs have worked alongside technology partners to create new technologies and, in turn, have added jobs to the local economy. 

“The Bajo Bidasoa region of Spain has emerged as a model for other European cities as they apply leadership, collaboration and innovation to become a more sustainable, liveable area,” said Sylvie Spalmacin-Roma, vice president, Smarter Cities Europe. 

IBM has deep expertise in working with cities of all sizes, helping them solve their toughest challenges. By bringing together cloud computing, mobile and social technologies, IBM is helping cities become more sustainable, more efficient and more focused on the needs of citizens.  





By: IBM News Releases
Link: http://www-03.ibm.com/press/us/en/pressrelease/42526.wss

OpenStack brings agility to the enterprise

In his keynote at the OpenStack Summit in Hong Kong earlier this month, Jonathan Bryce, executive director of the OpenStack Foundation, referenced a user poll identifying the top 10 environments in which OpenStack is being deployed. Not surprisingly, the top uses today are web applications, agile development and DevOps.

What did not get coverage is how IBM is adopting OpenStack in the enterprise, bringing the agility and flexibility of the commodity cloud to enterprise systems. In October, IBM announced Power Virtualization Center (PowerVC) as the management center for PowerVM. Credit where credit is due, VMware has done a great job with vCenter, providing an easy-to-use, administrator-friendly approach to ESX virtualization management. PowerVC seeks to do the same for PowerVM and IBM Power Systems.

Built on the OpenStack Havana release, PowerVC uses the core Cinder, Nova and Neutron management components of OpenStack to manage the provisioning and configuration of LPARs, the creation and allocation of storage LUNs to LPARs, and the assignment of IP addresses and VLANs. Initial storage support is for IBM Storwize V7000 and SAN Volume Controller (SVC), directly allocating LUNs to LPARs via NPIV. Support for Shared Storage Pools (SSP) and for other storage devices with OpenStack Cinder drivers is also planned.
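Under the covers, provisioning through Nova means issuing a create-server request against the Compute API; a sketch of building that request body in Python follows. The structure matches Nova's standard v2 `POST /servers` call, but the names and UUIDs are illustrative only, and PowerVC's actual internal calls are not publicly documented.

```python
def nova_boot_request(name, image_ref, flavor_ref, network_uuid):
    """Build the JSON body for a Nova v2 'POST /servers' request.

    This illustrates the kind of request a management layer like
    PowerVC issues when provisioning a new instance (here, an LPAR);
    all identifiers are placeholders.
    """
    return {
        "server": {
            "name": name,
            "imageRef": image_ref,    # Glance image UUID to deploy
            "flavorRef": flavor_ref,  # compute template: CPU and memory
            "networks": [
                {"uuid": network_uuid}  # Neutron network for IP/VLAN
            ],
        }
    }

body = nova_boot_request(
    "lpar-web01",
    "4b7f3a10-1111-2222-3333-444455556666",
    "2",
    "aaaa0000-bbbb-cccc-dddd-eeeeffff0000",
)
```

Cinder volume creation and attachment follow the same pattern against the block storage API, which is how the LUN allocation described above is driven.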

Power administrator productivity is greatly improved: administrators can finally allocate LPARs with SAN-attached storage in one step, eliminating the wait for storage administrators to map and assign LUNs to the dynamically created NPIV WWNs. Even more impressive is the ability to manage LUN reassignment during Live Partition Migration (LPM) of LPARs from one system to another: PowerVC remaps LUNs from the source system to the target system as required and validates the configuration to ensure the live migration succeeds.
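The validation step described above can be modeled simply: before migrating, check that every LUN the LPAR uses is mapped to the source system, and work out which LUNs still need to be mapped to the target. The sketch below is a simplified model of that check, not PowerVC's actual implementation.

```python
def plan_lpm_lun_remap(lun_map, source, target):
    """Plan the LUN remapping step for a Live Partition Migration.

    lun_map maps each LUN id to the set of host systems it is
    currently mapped to. Returns the LUNs that must additionally be
    mapped to the target before migration can proceed; raises if the
    configuration is inconsistent on the source side. This models the
    validation PowerVC performs, in highly simplified form.
    """
    to_remap = []
    for lun, hosts in sorted(lun_map.items()):
        if source not in hosts:
            # The LPAR cannot be using a LUN its own host cannot see.
            raise ValueError(f"{lun} is not mapped to source {source}")
        if target not in hosts:
            to_remap.append(lun)  # must be mapped to target before LPM
    return to_remap

lun_map = {"lun-1": {"sysA"}, "lun-2": {"sysA", "sysB"}}
needed = plan_lpm_lun_remap(lun_map, "sysA", "sysB")
```

Here only `lun-1` needs remapping, since `lun-2` is already visible to both systems.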

PowerVC 1.2 becomes available in December 2013. In the meantime, you can find more information on IBM developerWorks and a sneak preview on YouTube.



By: Steve Strutt