
Intel Launches Flexible Xeon Scalable Processors for a Changed Data Center Market

Formerly code-named “Ice Lake,” the 3rd Gen Intel Xeon Scalable family is designed to accommodate the varied needs of hybrid cloud, artificial intelligence, 5G, and edge computing workloads. By Rich Freeman

Intel has introduced a new collection of data center processors designed to accommodate the varied needs of hybrid cloud, artificial intelligence, 5G, and edge computing workloads.

“With our newest Xeon Scalable processors, our ecosystem can create solutions that matter, and our customers can place their workloads securely where they perform the best, from the edge to the network to the cloud,” said Intel CEO Pat Gelsinger this morning in a launch webcast.

Based on Intel’s 10nm process, the 3rd Gen Intel Xeon Scalable family (formerly code-named “Ice Lake”) comprises 57 SKUs spanning one-, two-, four-, and eight-socket configurations with as many as 40 cores. Per socket, the platform supports up to 6 TB of memory, eight channels of DDR4-3200 memory, and 64 lanes of PCIe Gen4 connectivity.

The new chips also support 200 series Intel Optane persistent memory SKUs, Intel’s P5800X Optane SSD, and Intel D5-P5316 NAND SSDs, as well as Intel Ethernet 800 Series adapters and Intel Agilex FPGAs.

On average, Intel says, the Ice Lake platform delivers a 46% performance improvement over previous-generation Xeon Scalable products on popular data center workloads, and an average 2.65x performance gain versus comparable five-year-old systems.

Cloud users in particular, according to Intel, can expect up to 50% better performance on database, e-commerce, web server, and other latency-sensitive workloads, as well as 65% faster analytics performance.

The new processors are designed to support bi-directional workload migration between on-premises and online environments as well. “For example, if a customer is using VMware on premise, migrating their virtual machines between the public cloud and their on-prem environment is seamless,” said Lisa Spelman, corporate vice president and general manager of Intel’s Xeon and Memory Group, during this morning’s broadcast.

The platform’s broad software compatibility, she added, enables consistent application performance across cloud environments.

Drawing on Intel Deep Learning Boost technology, Intel says, 3rd Gen Xeon Scalable processors deliver a 74% increase in AI inference performance over earlier generations, including up to 1.56x better image classification performance. In testing, Intel claims, the new chips performed up to 1.5x faster across 20 AI workloads than AMD’s EPYC 7763 CPU and up to 1.3x faster than NVIDIA’s A100 GPU.

The new platform’s four network-optimized “N-SKUs,” Intel says, deliver an average of up to 62% better performance than their predecessors on common network and 5G tasks. Combined with Intel Ethernet 800 series adapters and eASIC chips, the manufacturer adds, the new processors can double 5G MIMO performance within a similar power envelope.

Security features in the new portfolio include Intel’s Total Memory Encryption, Platform Firmware Resilience, and Software Guard Extensions (SGX) technologies. The last of those, which isolates data and application code in protected memory “enclaves,” was previously available on Intel Xeon E series processors and now ships on dual-socket Xeon Scalable processors as well, with the ability to protect up to 1 TB of code and data.

Intel Crypto Acceleration technology in the new platform increases the performance of encryption-intensive workloads like SSL web serving, Intel says, allowing organizations to secure sensitive data without slowing down applications.
