The new chip is also designed to address today’s most pressing data protection concerns, Intel says.
Intel on Tuesday took the wraps off Ice Lake, its new 3rd Gen Intel Xeon Scalable processor for data centres. The CPU, built on a 10-nanometre process, delivers up to 40 cores per processor and, according to Intel, an average 46 percent performance gain on popular data centre workloads over the previous generation. Intel also claims it is the only data centre processor with built-in AI acceleration.
Ice Lake is designed to power the industry’s broadest range of workloads, from the cloud to the network to the edge. More than 50 OEM partners are launching over 250 server designs based on the new platform, and Intel said Ice Lake will also be deployed by the world’s top-tier cloud service providers, highlighting its adoption across all market segments.
“Our 3rd Gen Intel Xeon Scalable platform is the most flexible and performant in our history, designed to handle the diversity of workloads from the cloud to the network to the edge,” Navin Shenoy, EVP and GM of Intel’s Data Platforms Group said in a statement. “Intel is uniquely positioned with the architecture, design and manufacturing to deliver the breadth of intelligent silicon and solutions our customers demand.”
In addition to the near-50 percent increase in performance, the new chip includes Intel’s SGX (Software Guard Extensions) to “protect sensitive code and data with the smallest potential attack surface within the system.” The technology can isolate and process up to 1 terabyte of code and data in private memory areas that Intel calls enclaves. Ice Lake adds a couple more security features, including Total Memory Encryption and Platform Firmware Resilience, that together can “address today’s most pressing data protection concerns.” The chip also features cryptographic acceleration for businesses that run encryption-intensive workloads.
Lastly, the new chip boasts 74 percent faster AI performance than the previous generation, making it possible “to infuse AI into every application from edge to network to cloud.”
Ice Lake supports up to six terabytes of system memory, up to eight channels of DDR4-3200 memory, and up to 64 lanes of PCIe Gen4 per socket.