Dell has significantly expanded its AI infrastructure capabilities, positioning itself at the forefront of enterprise AI hardware solutions. The company's latest announcement showcases a major advancement in its AI Factory ecosystem, with new server offerings designed to handle increasingly complex AI workloads while giving customers unprecedented flexibility in deployment options.
[Image: Dell Technologies showcases its cutting-edge AI hardware solutions at the Dell Technologies World event]
Dell's AI Factory Evolution
Dell Technologies has unveiled a substantial expansion of its AI server portfolio, headlined by the new PowerEdge XE9780 and XE9785 servers featuring NVIDIA's cutting-edge Blackwell Ultra GPUs. This release represents a significant milestone in Dell's AI strategy, building upon the foundation established two years ago through Project Helix, Dell's initial collaboration with NVIDIA that introduced the concept of enterprise AI factories before NVIDIA widely adopted the terminology. The new servers are designed to address the growing demand from enterprises looking to process AI workloads locally, particularly for sensitive and valuable data that remains behind corporate firewalls.
[Image: Dell's AI Factory initiative emphasizes significant investment and innovation in AI technologies]
Blackwell Ultra Integration and Configurations
The PowerEdge XE9780 and XE9785 servers come in both air-cooled and liquid-cooled variants (XE9780L and XE9785L), with the liquid-cooled models supporting up to 256 NVIDIA Blackwell Ultra GPUs per rack. The air-cooled versions support up to 192 GPUs and are designed for easier integration into existing enterprise data centers. These servers leverage NVIDIA's HGX B300 architecture and deliver up to four times faster large language model (LLM) training than the previous generation of offerings.
Expanding Beyond NVIDIA
While the NVIDIA partnership remains central to Dell's AI strategy, the company is embracing silicon diversity across its product lines. At Dell Technologies World, the company announced PowerEdge XE9785 servers equipped with AMD's Instinct MI350 GPUs, which offer a viable and potentially more power-efficient alternative to NVIDIA-based systems. Additionally, Dell revealed one of the first mainstream deployments of Intel's Gaudi 3 AI accelerators in its PowerEdge XE9680 servers, configured with eight Gaudi 3 chips. This multi-vendor approach provides enterprises with greater flexibility in hardware selection based on their specific requirements and existing software ecosystems.
Future-Ready Hardware Portfolio
Looking ahead, Dell has announced plans to support NVIDIA's Vera CPUs and develop PowerEdge designs featuring NVIDIA's Rubin GPUs, further expanding its comprehensive AI hardware portfolio. The company is also introducing the PowerEdge XE7745 server, which will feature NVIDIA RTX Pro 6000 Blackwell Server Edition GPUs starting in July 2025. This platform is designed for physical and agentic AI use cases like robotics, digital twins, and multi-modal AI applications, supporting up to 8 GPUs in a 4U chassis.
Software Enhancements for AI Workloads
Beyond hardware, Dell has introduced new software capabilities under its AI Data Platform umbrella. Project Lightning addresses one of the biggest challenges with large AI models—fast data access and memory loading—by implementing a parallel file system that Dell claims offers twice the performance of comparable solutions. The company has also enhanced its Data Lakehouse, improving the structure used by many AI applications to access and manage large datasets more efficiently.
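To make that data-access bottleneck concrete, the minimal sketch below illustrates the general I/O pattern a parallel file system is meant to accelerate: issuing many concurrent range reads against a large model checkpoint instead of one serial scan. The file name, chunk size, and worker count are hypothetical, and this is a generic illustration rather than Dell's Project Lightning implementation.

```python
# Illustrative sketch only: generic parallel chunked reads of a large model
# checkpoint. File path, chunk size, and worker count are hypothetical; this
# is not Dell's Project Lightning.
import os
from concurrent.futures import ThreadPoolExecutor

CHECKPOINT = "model_weights.bin"   # hypothetical checkpoint file
CHUNK_SIZE = 256 * 1024 * 1024     # 256 MiB per read request

def read_chunk(path: str, offset: int, length: int) -> bytes:
    """Read one chunk at a fixed offset so requests can run concurrently."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

def parallel_load(path: str, workers: int = 8) -> bytes:
    """Issue many concurrent range reads instead of one serial scan."""
    size = os.path.getsize(path)
    offsets = range(0, size, CHUNK_SIZE)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = pool.map(
            lambda off: read_chunk(path, off, min(CHUNK_SIZE, size - off)),
            offsets,
        )
        return b"".join(chunks)

if __name__ == "__main__":
    weights = parallel_load(CHECKPOINT)
    print(f"Loaded {len(weights) / 1e9:.1f} GB of weights")
```

The point of the pattern is that a single serial read leaves most of the storage fabric's bandwidth idle, while many concurrent range requests let a parallel file system spread the load across its servers and cut cold-start model loading time.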
Dell's PC Division Innovations
In an interesting development from Dell's PC division, the company launched the Dell Pro Max Plus portable workstation, featuring the first use of a discrete NPU (Neural Processing Unit) in a mobile PC: the Qualcomm AI 100. This device targets organizations running customized inferencing applications at the edge, as well as AI model developers looking to leverage Qualcomm's NPU design. With its large onboard memory, the AI 100 enables the use of models with over 100 billion parameters, far exceeding the capabilities of even the most advanced Copilot+ PCs currently available.
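A rough back-of-the-envelope calculation shows why on-device memory is the limiting factor for models of that size. The quantization levels below are illustrative assumptions, not Dell or Qualcomm specifications.

```python
# Back-of-the-envelope memory estimate for hosting model weights on-device.
# The 100B-parameter figure and quantization formats are illustrative
# assumptions, not Dell or Qualcomm specifications.
PARAMS = 100e9  # 100 billion parameters

BYTES_PER_PARAM = {
    "fp16": 2.0,   # 16-bit floating point
    "int8": 1.0,   # 8-bit integer quantization
    "int4": 0.5,   # 4-bit integer quantization
}

for fmt, nbytes in BYTES_PER_PARAM.items():
    gb = PARAMS * nbytes / 1e9
    print(f"{fmt}: ~{gb:.0f} GB of memory for weights alone")
```

Even at aggressive 4-bit quantization, a 100-billion-parameter model needs roughly 50 GB for weights alone, well beyond the shared memory available to the integrated NPUs in today's Copilot+ laptops, which is why a discrete accelerator with its own large memory pool matters for this workload.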
Market Positioning and Strategy
Dell's comprehensive approach to AI hardware reflects the growing recognition that cloud and on-premises computing can effectively coexist. The company is positioning itself to support distributed hybrid AI applications that span public and private clouds, anticipating that such configurations will rapidly become mainstream. As Dell CEO Michael Dell stated, "We're on a mission to bring AI to millions of customers worldwide. Our job is to make AI more accessible." This customer-centric strategy, combined with Dell's extensive hardware portfolio, places the company in a strong position to meet the diverse AI infrastructure needs of enterprises across industries.