International Data Corporation (IDC) Lauds Innovations in Supercomputing with Innovation Excellence Awards
The HPC Innovation Excellence Award recognizes noteworthy achievements by users of high performance computing technologies. The program's main goals are to showcase return on investment (ROI) and scientific success stories involving HPC; to help other users better understand the benefits of adopting HPC and justify HPC investments, especially for small and medium-size enterprises (SMEs); to demonstrate the value of HPC to funding bodies and politicians; and to expand public support for increased HPC investments.
"IDC research has shown that HPC can accelerate innovation cycles greatly and in many cases can generate return on investment. The ROI program aims to collect a large set of success stories across many research disciplines, industries, and application areas," said Earl C. Joseph, Ph.D., IDC's program vice president for High-Performance Computing (HPC) and executive director of the HPC User Forum. "The winners achieved clear success in applying HPC to greatly improve business ROI, scientific advancement, and/or engineering successes. Many of the achievements also directly benefit society."
More than thirty finalists for the 2016 Innovation Excellence Awards were drawn from a broad pool of public and private sector organizations that have applied advanced supercomputing to realize breakthroughs of major scientific, economic, or artistic importance, often while saving millions (and even billions) of dollars. The HPC User Forum steering committee served as the initial judging panel for the awards.
HPC Innovation Excellence Award Winners 2016
The Centre for Computational Medicine, University of Toronto: From genomics to medical imaging, almost every discipline in health care is dealing with a "Data Deluge." Translating this into something that will ultimately benefit patients requires massive amounts of computation and storage in an environment that is fast, secure, and run with optimal efficiency. The University of Toronto's SickKids Centre for Computational Medicine uses a supercomputer operating at 107 trillion calculations per second to pinpoint the minute differences between individual children and identify the most precise treatment possible for each child under its care.
Disney Animation Studios
- Frozen: Software engineers used advanced mathematics and physics, with assistance from mathematics researchers at the University of California Los Angeles (and 4,000 computers), to design breathtaking, believable scenes.
- Tangled: This film employed a unique artistic style by blending features of computer-generated imagery (CGI) and traditional animation, while using non-photorealistic rendering to create the impression of a painting. Disney developed unique techniques and tools to make the computer "as pliable as the pencil," creating what was at the time the most expensive animated film ever made.
- Big Hero 6: Walt Disney Animation Studios created new software, called Denizen, to create over 700 distinctive characters. Another, called Bonzai, was used to create the city's 250,000 trees, and a new rendering tool, called Hyperion, offered new illumination possibilities. Disney had to assemble a new supercomputing cluster just to handle Hyperion's intense processing demands: over 2,300 Linux workstations in four data centers, backed by a central storage system with a capacity of five petabytes.
DreamWorks Animation
- How to Train Your Dragon 2: Over the five years before the film's release, DreamWorks Animation overhauled its production workflow and animation software. How to Train Your Dragon 2 was the first DreamWorks Animation film to use "scalable multi-core processing," developed together with Hewlett-Packard. This "next revolution in filmmaking" enabled artists for the first time to work on rich, complex images in real time, instead of waiting eight hours to see the results the next day. Programs named Premo and Torch allowed unique subtlety, improving facial animation and enabling "…the sensation of skin moving over muscle instead of masses moving together."
- Kung Fu Panda: The computer animation used in this film was more complex than anything DreamWorks had applied before. The studio found help through the Department of Energy's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, which awarded the company a grant to refine and test its redesigned software on the leadership-class supercomputer at Oak Ridge National Laboratory. The prototype software was successfully tested and immediately put to use. Knowledge gained from the INCITE grant informed an entirely new rendering architecture and has become essential in creating all of DreamWorks' animated films. But the biggest win was the raw speed at which the film's lighting artists could get frames back: iterations were sped up by an order of magnitude, and an iteration that once took hours could now be accomplished in mere seconds.
- Kung Fu Panda 2: Building on what was learned during the making of Kung Fu Panda, Kung Fu Panda 2 is the first DreamWorks Animation film to use dynamic, physics-based crowd characters, such as the wolves. The intensive computational graphics required seven million render hours to produce 14,000 frames.
- Monsters vs. Aliens: It took approximately 45.6 million computing hours to make this film, more than eight times as many as the original Shrek. Several hundred Hewlett-Packard workstations were used, along with a "render farm" of HP ProLiant blade servers with over 9,000 server processor cores to process the animation sequences. Animators used 120 terabytes (TB) of data to complete the film; a single explosion scene alone accounted for 6 TB of that. Since Monsters vs. Aliens, all feature films released by DreamWorks Animation have been produced in a stereoscopic 3D format, using Intel's InTru3D technology.
KINETICA/United States Postal Service: How does a company that makes daily deliveries to more than 154 million addresses, using several hundred thousand vehicles and employees, improve efficiencies using visualizations and analytics of real-time data? It's called Kinetica, an in-memory, GPU-accelerated database, and it's how the United States Postal Service (USPS) is optimizing its operations. Faced with the daunting task of managing the nation's postal service, which covers a larger geographical area than any other, with increasingly limited resources, USPS needs to continuously improve safety, efficiency, and services without overspending. The complexities and dynamics of USPS' logistics have reached all-time highs, while consumers have greater demands and more alternatives than ever before, requiring sophisticated services such as just-in-time supplies, tracking and delivery updates, and dynamic shipment routing. Improving end-to-end business process performance while concurrently reducing costs requires the ability to make fast business decisions based on live data. The USPS has accomplished this with Kinetica, optimizing operational efficiencies to save time and money.
Novartis/Amazon Web Services (AWS)/Cycle Computing: In 2013, Novartis ran a project that involved virtually screening ten million compounds against a common cancer target in less than a week. The company calculated it would take 50,000 cores and close to a $40 million investment to run the experiment internally. Partnering with Cycle Computing and AWS, Novartis built a platform leveraging Amazon Simple Storage Service (Amazon S3), Amazon Elastic Block Store (Amazon EBS), and four Availability Zones. The project ran across 10,600 Spot Instances (approximately 87,000 compute cores) and allowed Novartis to conduct 39 years of computational chemistry in nine hours, all for a cost of $4,232. Out of the ten million compounds screened, three promising compounds were identified.
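The scale of those figures can be sanity-checked with a little back-of-the-envelope arithmetic. The constants below are taken directly from the paragraph above; the utilization estimate at the end is an illustrative inference, not a figure from the announcement:

```python
# Back-of-the-envelope check of the Novartis/AWS screening figures.
HOURS_PER_YEAR = 365 * 24       # 8,760 hours in a (non-leap) year

compute_cores = 87_000          # ~cores across the 10,600 Spot Instances
wall_clock_hours = 9            # duration of the screening run
serial_years = 39               # quoted single-core-equivalent workload

# Total core-hours the cluster could deliver during the run.
cluster_core_hours = compute_cores * wall_clock_hours

# The quoted "39 years of computational chemistry" expressed in core-hours.
serial_core_hours = serial_years * HOURS_PER_YEAR

print(f"Cluster capacity over the run: {cluster_core_hours:,} core-hours")
print(f"39 years of serial compute:    {serial_core_hours:,} core-hours")

# Implied utilization is below 100%, which is plausible given Spot Instance
# churn and ramp-up time (an inference, not a claim from the source).
print(f"Implied utilization: {serial_core_hours / cluster_core_hours:.0%}")
```

The two totals agree to within a factor of about two, which is the right order of magnitude for a Spot-based cluster that is not fully busy for its entire wall-clock lifetime.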
University of Rochester Medical Center: By combining genetics, neurobiology, and supercomputing, researchers at the University of Rochester Medical Center, New York, identified a genetic mutation responsible for a potentially deadly seizure disorder found in infants and young children. Young people with the condition who survive beyond infancy often struggle for the rest of their lives with developmental disabilities, autism, and uncontrollable seizures. The researchers used a supercomputer cluster to quickly obtain a full genetic profile – more than 20,000 genes – for each study subject and to compare the results with data from other families. These findings opened up what was a "black box," enabling researchers to more fully understand the biological pathways associated with these disorders and why some patients do not respond to treatment.
IDC Central Europe GmbH
International Data Corporation (IDC) is the premier global provider of market intelligence, advisory services, and events for the information technology, telecommunications, and consumer technology markets. With more than 1,100 analysts worldwide, IDC offers global, regional, and local expertise on technology and industry opportunities and trends in over 110 countries. IDC's analysis and insight help IT professionals, business executives, and the investment community to make fact-based technology decisions and to achieve their key business objectives. Founded in 1964, IDC is a subsidiary of IDG, the world's leading technology media, research, and events company. To learn more about IDC, please visit www.idc.com. Follow IDC on Twitter at @IDC.