Courtesy of InformationWeek
Lawrence Livermore National Laboratory and IBM on Wednesday
announced a partnership to provide high-performance computing
capabilities to businesses looking to tackle complex challenges.
The joint venture, called Deep Computing Solutions, aims to apply
supercomputing to complex industrial problems ranging from new
aircraft design to agriculture's impact on the environment.
Deep Computing Solutions will be located within LLNL's High
Performance Computing Innovation Center (HPCIC), established 12
months ago to help U.S. industries tap supercomputing capabilities
to compete in the world marketplace.
In March, the White House announced a
big data initiative, where federal resources will be
applied to harnessing the vast amounts of data gathered by the
government to address scientific, economic, environmental, and
medical challenges. The IBM-LLNL partnership comes in response to
that initiative.
"Lawrence Livermore is dealing with big data problems
today. It's frequently the case that [we are] doing things for the
first time, things that haven't been done before," said Jeff Wolf,
HPCIC's chief business development officer, in an interview
with InformationWeek. "Livermore's
motivation is to try to improve U.S. national security. We're
trying to transfer the knowledge we have to businesses."
IBM considers big data "the foundational stone" to solving
business problems, said James Sexton, program director of IBM's
T.J. Watson Computational Science Center. "It could be an
engineering design for a new aircraft or gas turbine engine. It
could be an agricultural system where you're managing the land and
want to minimize damage to the environment," said Sexton.
"How do you model? How do you simulate? How do you send back
[results] to the person who's going to make the decision?" said
Sexton. Big data on HPC systems is the answer to those questions.
There are two prerequisites to tackling big data problems,
Sexton said. One is to have raw data on what has happened in the
past, and the second is to have an understanding of "how the system
is actually functioning." Those serve as the basis for the
computational models to be applied.
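Sexton's two prerequisites can be sketched in a few lines of code. The example below is purely illustrative, not anything IBM or LLNL described: it assumes historical observations (prerequisite one) plus an assumed understanding of how the system behaves, here a simple linear trend (prerequisite two), and uses the two together to fit and run a predictive model. The data and the linearity assumption are hypothetical.

```python
# Illustrative sketch of the two prerequisites Sexton describes:
# (1) raw data on what has happened in the past, and
# (2) an understanding of how the system functions -- modeled here,
#     purely for illustration, as a linear trend over time.

def fit_linear_model(history):
    """Least-squares fit of y = a + b*t to past observations."""
    n = len(history)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(history) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, history))
    den = sum((t - t_mean) ** 2 for t in ts)
    b = num / den
    return y_mean - b * t_mean, b

def simulate(history, steps):
    """Project the fitted model forward -- the 'simulate' step."""
    a, b = fit_linear_model(history)
    n = len(history)
    return [a + b * (n + k) for k in range(steps)]

# Hypothetical past measurements (e.g., crop yield by year).
yields_by_year = [2.0, 2.1, 2.3, 2.4]
print(simulate(yields_by_year, 2))  # two-step projection
```

Real HPC workloads replace the toy trend line with physics-based or statistical models at vastly larger scale, but the structure — calibrate against the past, then simulate forward — is the same.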
Lawrence Livermore Lab will dedicate a portion of Vulcan, a
24-rack, 5-petaflop IBM Blue Gene/Q supercomputer, to such
problem-solving projects. Vulcan is due to be delivered to HPCIC
this summer. It will be used to support HPCIC and Deep Computing
Solutions, as well as unclassified National Nuclear Security
Administration research programs, academic alliances, and science
and technology projects.
Vulcan is part of the same contract that brought Sequoia, the
Blue Gene/Q supercomputer that recently ranked
No. 1 in the world, to LLNL.
Deep Computing Solutions is not intended as a profit-making
venture, Wolf said. Businesses, nonprofits, and government agencies
will be charged on a cost-recovery basis. "If a single company or
organization wanted to do this, there would be a much higher entry
cost," IBM's Sexton said. "As we continue to expand the program,
the price per petaflop will drop."
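The cost-recovery arithmetic behind Sexton's remark can be made concrete with a quick sketch. The dollar figure below is hypothetical (the article gives no costs); only the 5-petaflop capacity comes from the source. The point is structural: a fixed operating cost split across more participants drives the effective price per petaflop down.

```python
# Illustrative cost-recovery arithmetic. ANNUAL_COST is a made-up
# figure; PETAFLOPS is Vulcan's stated peak performance.

ANNUAL_COST = 10_000_000.0  # hypothetical yearly operating cost, USD
PETAFLOPS = 5.0             # Vulcan's capacity, per the article

def cost_per_petaflop(participants):
    """One participant's share of the fixed cost, per petaflop."""
    share = ANNUAL_COST / participants
    return share / PETAFLOPS

# As the program expands, each organization's effective price falls:
for n in (1, 5, 20):
    print(n, cost_per_petaflop(n))
```

A single organization bearing the full (hypothetical) cost would pay twenty times what each of twenty participants pays — the "much higher entry cost" Sexton describes.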
Computer and science experts from IBM Research and Lawrence
Livermore Lab will work with businesses to develop high-performance
computing solutions. "Today we're trying to bootstrap them," Sexton
said. "It's a proof of concept that we're trying to get going."