The rising expectations of executives require IT to be even more agile when addressing the needs of the business
Enterprise IT continued to evolve in 2012 at a lightning-fast pace. Concepts like hybrid environments and cloud operating systems moved from being buzzwords that were discussed and planned for, to, in many cases, full-scale implementation.
Simultaneously, other trends began to take shape, trends that will heavily influence the way enterprises approach IT well into this year and beyond. These trends range from the way organisations handle big data to the ever-present need for mission-critical computing. They signify the rising importance of IT within the enterprise, and raise the question: “What does the future enterprise platform really look like?”
The role of IT has never been more important within the enterprise. In a recent Gartner and Forbes survey of boards of directors, the percentage of respondents who rated IT’s strategic business value contribution as high or extremely high doubled between 2010 and 2012. The rising expectations of executives require IT to be even more agile when addressing the needs of the business.
In response, we expect to see compute, storage and networking converge into integrated infrastructure over the next few years. Many organisations are also looking to standardise their infrastructure to become more efficient. While the long-term goal is simplification and standardisation, this represents a significant shift for IT. How and when companies move forward will depend on their ongoing virtualisation and cloud computing efforts.
From a process perspective, we’re also seeing the convergence of development, infrastructure and IT operations teams to strengthen the interdependence of these groups and reduce development and deployment time. More organisations will adopt an integrated DevOps approach to increase communication, collaboration and integration between these teams, and thus eliminate issues that stem from incomplete hand-offs or insufficient skills.
This will put additional strain on developers. Not only are they tasked with developing and managing code; many of tomorrow’s developers will also be challenged with troubleshooting infrastructure issues once the solutions they build are deployed. This makes the need for reliable platforms and stable operating systems even more critical. As their jobs become more complex, developers need an enterprise platform that frees them to focus on new projects instead of managing downtime.
This model will continue to gain traction within organisations as the IT department achieves prominence and enterprises look to streamline operations. But it won’t come without challenges. Enterprises will struggle with how to successfully adopt this approach to software development, and it’s up to open source solution providers to commit to making the transition as painless as possible.
Operating systems have always served two primary purposes: to enable software and developers to consume and take advantage of hardware innovations as they become available and to deliver a stable foundation on which applications can run. Moving forward, operating systems will continue to evolve in these ways to power the cloud. Take Linux as an example. Linux was developed on and for the internet and has evolved to support 8 out of every 10 cloud-based applications today. This is because it’s portable, secure and reliable.
The cloud demands choice and flexibility, and we believe that is what will keep Linux the operating system of the cloud well into the future. As organisations move to the cloud, the OS will continue to provide a critical foundation. The question is which systems are the best fit: those based on a traditional, walled-garden approach that fosters vendor lock-in, or those built on open source, born on the internet and tailor-made for the cloud.
Global data is estimated to increase 50-fold by 2020, and our customers recognise that they need to harness the increased volume, variety and speed of data if they are going to succeed. Not only are they concerned about how to store tremendous amounts of data; they’re also struggling with how to analyse it, because data is only valuable when you can gain insights from it to make decisions.
While businesses have always run on information, big data introduces data sets so large and complex that storing them for easy retrieval is cumbersome. This data comes from a variety of structured and unstructured sources, including business transactions, sensor data, audio, video, click streams, log files and more. IT must ensure that big data is an asset and not a cost by supporting the ability to store, aggregate, normalise, and integrate it from all sources across multiple systems.
But storing the data is only valuable if you can use it. Big data also outstrips our ability to apply a traditional business intelligence approach to working with the data to make decisions. While analytics workloads today are split between batch and real-time processing, companies are placing even greater focus on shifting more of their analysis to real time. IT must continue to invest in the right transactional and big data analytics systems to analyse the data and communicate the results that aid decision making.
The writer is vice-president and general manager, Platform Business Unit, Red Hat