Develop, test, and run artificial intelligence (AI) and generative AI models on a trusted, accelerated computing platform built on an AI-optimized operating system.
AI infrastructure requires a combination of high-performance servers and flexible software that can both support intensive computing and process the workloads that streamline and fine-tune AI model training. The ability to package, install, and deploy the customized components of a generative AI platform accelerates time to value, so it is important to choose a suitable AI model, one that is transparent and purpose-built, that can be customized with limited data science expertise.
Leverage the high-performance computing (HPC) capabilities of Lenovo ThinkSystem SR675 V3 servers paired with Red Hat Enterprise Linux AI (RHEL AI) to accelerate AI innovation. This combination delivers the performance, scalability, and security enterprises need to harness the transformative power of artificial intelligence.
RHEL AI is validated and factory loaded on ThinkSystem SR675 V3 servers so customers can more easily develop, test, and run artificial intelligence (AI) and generative AI (gen AI) models on a trusted foundation model platform, built on an AI-optimized operating system that includes the Granite family of open-source models and the InstructLab model alignment tools.
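As an illustration of that workflow, the sketch below shows how a team might start working with the bundled Granite models through the InstructLab command-line tools on a freshly provisioned system. Treat it as a minimal sketch only: exact subcommand names, defaults, and available model downloads vary by RHEL AI and InstructLab release.

```bash
# Minimal sketch of getting started with InstructLab on RHEL AI.
# Subcommand names and defaults vary by release; check `ilab --help`.

ilab config init        # create the local configuration and taxonomy layout
ilab model download     # fetch a bundled Granite model for local use
ilab model serve        # serve the model on this machine (leave running)
ilab model chat         # in a second shell, chat with the served model to test it
```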
Designed to handle the most demanding AI workloads, Lenovo’s ThinkSystem SR675 V3 servers feature cutting-edge AMD EPYC™ processors and NVIDIA GPUs. When the server is paired with RHEL AI, you can run advanced generative AI models, such as the open-source Granite large language models, at optimal speeds, which means faster data processing and shorter AI model training times so your teams can iterate and innovate more quickly.
The ThinkSystem SR675 V3 server, optimized for AI and HPC tasks, integrates seamlessly with RHEL AI’s hybrid cloud capabilities. This combination allows enterprises to roll out AI deployments quickly across multiple cloud environments, ensuring high availability and performance for even the most resource-intensive AI models. With RHEL AI’s support for OpenShift, you can manage containerized AI workloads across a distributed infrastructure, ensuring smooth operations and minimal downtime.
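To make that concrete, the following is a minimal, hypothetical Kubernetes Deployment of the kind you might apply with `oc apply -f` on OpenShift to serve a containerized model on a GPU node. The namespace, labels, and image reference are illustrative placeholders, not a documented RHEL AI or OpenShift AI manifest.

```yaml
# Illustrative sketch only: names and image reference are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: granite-inference        # hypothetical workload name
  namespace: ai-serving          # hypothetical project/namespace
spec:
  replicas: 2
  selector:
    matchLabels:
      app: granite-inference
  template:
    metadata:
      labels:
        app: granite-inference
    spec:
      containers:
        - name: model-server
          image: registry.example.com/ai/granite-serving:latest   # placeholder image
          ports:
            - containerPort: 8000
          resources:
            limits:
              nvidia.com/gpu: 1    # request one NVIDIA GPU per replica
```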
The ThinkSystem SR675 V3 server, equipped with NVIDIA’s most powerful GPUs, provides the computational muscle necessary to accelerate AI training and inference. Combined with RHEL AI’s streamlined AI tools and open-source frameworks, this setup enables businesses to build and deploy AI models quickly. The ThinkSystem SR675 V3 provides the raw power to drive faster, more efficient AI workflows, whether training generative AI models or fine-tuning large language models.
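For the fine-tuning case, the InstructLab alignment tools included with RHEL AI follow a generate-then-train pattern. The sketch below assumes the setup from the earlier example; as before, subcommand names and options differ between releases.

```bash
# Hedged sketch of InstructLab model alignment on RHEL AI.
ilab data generate      # create synthetic training data from your taxonomy additions
ilab model train        # fine-tune the downloaded Granite model on the generated data
```

On a multi-GPU SR675 V3, the training step is where the NVIDIA accelerators carry most of the load.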
With Lenovo’s highly reliable ThinkSystem SR675 V3 servers, enterprises can deploy AI models securely and confidently, ensuring data integrity and business continuity. RHEL AI provides the robust security features that enterprise-grade deployments require, including image mode for Red Hat Enterprise Linux. Lenovo’s globally trusted hardware and Red Hat’s efficient and secure image mode deployment for RHEL AI provide the stability, enhanced security, and DevOps integration needed for mission-critical AI applications.
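Image mode manages the operating system as a bootable container image, so OS customization and updates follow the same build-push-deploy workflow as application containers. The Containerfile below is a rough sketch; the base image tag and added packages are illustrative and not the specific RHEL AI image preloaded on the SR675 V3.

```dockerfile
# Illustrative Containerfile for image mode for RHEL (bootc).
# Base image tag and packages are placeholders, not the RHEL AI image itself.
FROM registry.redhat.io/rhel9/rhel-bootc:9.4

# Layer in site-specific agents or tooling as ordinary RPM installs.
RUN dnf -y install tmux && dnf clean all

# Bake configuration into the image as you would for any container
# (assumes an etc/ directory in the build context).
COPY etc/ /etc/
```

Built with podman and pushed to a registry, such an image can be applied to a running host with `bootc switch` and reverted with `bootc rollback`, which is the DevOps-style update model referenced above.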