Gone are the days when incredibly powerful and lightning-fast server technologies, known as high performance computing (HPC) in tech parlance, were perceived to be apt only for specific "complex" computing scenarios: engineering, science and medical research projects where aspects of supercomputing were critical.
While this narrow perception of HPC made it difficult for mainstream verticals to relate to an extremely relevant technology, the then-high costs of deploying HPC solutions added another key challenge that restricted the technology's growth and adoption in the market.
Today, the scenario is changing fast, and enterprises across mainstream industries (automobiles, manufacturing, life sciences, banking, oil and gas, and the public sector) need faster ways of doing business. With the initial market barriers of cost and complexity resolved, Indian enterprises are moving towards wider and more mainstream adoption of HPC solutions. These solutions are fast becoming the "must-have" common technology requirement that equips their businesses with higher processing power.
"HPC is moving into the mainstream, since customers can build their cluster depending upon their budgets and other objectives. It could be anywhere from a 4-node cluster to a 1,000-2,000-plus-node cluster," says Rahul Bindal, vice-president, industry systems, systems and technology group, IBM India/South Asia. In addition to the growing commercial opportunity, HPC has significant social ramifications too. Take the case of weather modelling or tsunami warning systems: both rely on a good deal of HPC.
"HPC is no longer meant just for scientists, researchers and academia. Commercial enterprises are also looking for an edge to help solve highly complex problems, perform business-critical analysis or run computationally intensive workloads," says Karthik Ramarao, director-technology, systems practice, Sun Microsystems. HPC continues to penetrate the enterprise, leading to reduced product development cycles, faster decisions, and more intelligent what-if scenarios, he adds.
A look at some of the industries deploying HPC solutions makes interesting reading. With increasingly complex product designs and globally dispersed operations, manufacturers need to rapidly deploy effective tools that help them collaborate with multiple design teams and vendors, while getting the most from their computational resources. HPC solutions vastly improve engineering computational workflows by allowing disparate systems to be pooled and managed as a common computing resource.
Financial firms today face an unprecedented number of market-wrenching challenges, owing to growing volumes of algorithmic trading, the impending enactment of trading regulations and higher demands for real-time processing. They are under tremendous pressure to re-architect their compute systems to stay ahead of the competition.
To remain competitive, financial institutions are looking to simulate complex trading scenarios faster than their rivals. Not surprisingly, CIOs are looking to deploy HPC solutions that can scale on demand, deliver extremely low-latency, real-time performance, and ensure data security.
In the oil and gas sector, upstream organisations know that HPC is the most effective way to enable geoscientists to speed their most complex, data-intensive projects to completion. HPC helps geoscientists make informed and more accurate reservoir drilling decisions. Drilling for new oil or improving the yield of an existing oilfield is an expensive proposition; using HPC technologies, engineers can be more confident of their return on investment.
In the life sciences sector, pharma companies are hard-pressed to bring newer drugs to market faster. With new diseases coming to light every day, biologists are trying to find answers by running extremely complex life-science simulations. HPC can accommodate this industry's wide range of computational needs, and supply the various types of storage required to solve its fascinating problems.
Last but not least, the public sector has a growing array of high-performance computing needs, especially the need to conduct advanced simulations, enhance images, perform pattern recognition, and crack signals faster with more fidelity and confidence. The need is especially critical in defence, intelligence and internal security. And while computation requirements are increasing, so is the need to improve turnaround for decision makers, who have less time to make critical decisions. A key component of the solution is the ability to deal with increasingly large and diverse datasets originating not only from traditional computing systems, but also from new generations of wireless communication devices.
A few years ago, HPC was dominated by proprietary mainframes and large UNIX systems. Faisal Paul, country manager, HPC and Linux business, Hewlett-Packard India, says, "Today, all production HPC is built around industry standards-based infrastructure and, as much as possible, open source-based software stacks. These systems are easy to deploy and manage and provide the best return on investment to customers."
Companies like Tata Motors, Ashok Leyland, Shell, ONGC, General Electric, the Indian Institutes of Technology, Crest Animation and Maya Entertainment, among others, are discovering that HPC solutions can help them make significant productivity gains. That's not all. The advent and rapid rise of outsourced engineering, simulation and animation projects is further fuelling the Indian HPC market. India, says Paul, is on par with global trends in technology adoption and deployment.
At its simplest, HPC is the use of parallel processing to run advanced application programs efficiently, reliably and quickly. The term applies especially to systems that function above a teraflop, or 10^12 (a trillion) floating-point operations per second. HPC solutions connect a group of servers to create a compute "cluster" that reduces execution time for computation and increases the speed at which results are processed and returned to applications.
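The divide-the-work idea behind such clusters can be sketched in a few lines of Python. This is only an illustration, not any vendor's software: a single machine's cores stand in for cluster nodes, and the function names are invented for the example.

```python
# Parallel processing in miniature: split a workload into chunks,
# run the chunks concurrently, then combine the partial results.
from multiprocessing import Pool

def partial_sum(bounds):
    """One 'node's' share of the work: sum of squares over [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    # Split the range [0, n) into one chunk per worker.
    step = n // workers
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    # Each chunk runs in its own process; results are combined at the end.
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(1_000_000))
```

A real HPC cluster replaces the process pool with many physical servers and a scheduler, but the pattern of partitioning, concurrent execution and aggregation is the same.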
Ramarao says that, as per IDC's HPC research findings over the last four years, the global HPC market has seen revenue growth averaging 20% per year. HPC clusters continued to gain momentum across all HPC segments; revenue from clusters represented 68% of overall HPC server revenue for the third quarter of 2007. HPC system usage is growing in all end-user segments, including government, academia and industry. For many engineering and scientific studies, it has become cheaper and faster to use computer simulation instead of more costly physical experiments, and competitive pressures are pushing many R&D groups to complete their research in much shorter time frames.
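It is worth noting what that growth rate compounds to. A quick calculation (using only the 20%-per-year figure cited above) shows the market roughly doubling over the four-year period:

```python
# Compound the IDC-cited 20% average annual growth over four years.
growth = 0.20
years = 4
factor = (1 + growth) ** years  # 1.2 ** 4
print(f"Market size multiple after {years} years: {factor:.2f}x")
# Roughly a doubling (about 2.07x) of the market.
```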
Among the fast-growing vertical segments are biosciences, geosciences (oil and gas), computer-aided engineering (CAE), electronic design automation (EDA), defence and university research. With the growing demand for computer graphics, animation and gaming content, companies in the media and entertainment industry have been early adopters of HPC solutions.
Ramarao says the need for HPC has steadily increased over the years in India. From R&D-specific scenarios, HPC is increasingly finding acceptance in mainstream computing: verticals like telecom, finance and manufacturing are going in for HPC deployments.
The IT and ITeS industries in India have led the adoption of HPC solutions, whether for structural analysis, crash simulation or financial analysis. Even small life sciences companies that want to understand protein-folding characteristics have been adopting HPC.
Apart from traditional applications used by customers with a technical background, HPC has also gained traction amongst commercial customers for applications such as financial analysis and portfolio management, digital security and surveillance, and decision-support computing.
Initially, like any new technology, HPC had its own set of challenges to overcome before it gained wider market acceptance. As an indicator, a 10 Gflop system would probably have cost in the range of $30 million in 1991; a 10 Gflop system today (on a Windows platform) comes in at around $4,000.
"This is a huge enabler in helping HPC adoption move into the mainstream. Adoption of commodity processors and network components is on the rise. Users are able to realise the benefits of ease of use with commodity components, better control, faster turnaround time and affordability, and this is helping in a big, big way," says Pallavi Kathuria, director-server business, Microsoft India.
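The scale of that price/performance shift is easy to sanity-check with a back-of-the-envelope calculation. The dollar figures are the ones cited above; the script itself is only an illustration:

```python
# Price/performance of a 10 Gflop system, per the figures quoted above.
cost_1991 = 30_000_000  # USD, circa 1991
cost_today = 4_000      # USD, Windows-platform system today
gflops = 10

per_gflop_1991 = cost_1991 / gflops    # cost per Gflop in 1991
per_gflop_today = cost_today / gflops  # cost per Gflop today
improvement = cost_1991 / cost_today   # same compute power, fraction of the cost

print(f"1991: ${per_gflop_1991:,.0f}/Gflop; today: ${per_gflop_today:,.0f}/Gflop")
print(f"Price/performance improvement: {improvement:,.0f}x")
```

In other words, the same 10 Gflops that cost $3 million per Gflop in 1991 now costs about $400 per Gflop, a 7,500-fold drop.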
Recent advances in multi-core x86 processors are driving HPC adoption at a much faster pace. Well-built x86-based cluster systems deliver the compute power required by the majority of HPC applications, and this is available to HPC users at a fraction of the cost of the large supercomputers of yesteryear.
The message is clear: with networks expanding rapidly and data-processing requirements growing exponentially, HPC is the need of the hour, providing accelerated results for a broad range of industries, commercial and scientific alike.