How Red Hat partners are powering the next wave of enterprise AI
The pace of AI innovation is accelerating, and with the launch of Red Hat AI 3, we're reminded that turning this potential into enterprise reality requires a robust, open ecosystem built on choice and collaboration. Our goal has always been to provide a consistent, powerful platform for AI that works with any model, on any accelerator, and across the hybrid cloud. Today, we're thrilled to highlight the momentum from our partners, who are working alongside us to build out the future of open, hybrid AI on Red Hat.

The Red Hat partner ecosystem is the engine that will deliver the generative AI (gen AI) and agentic capabilities that customers need for broad market adoption. It's about bringing together the best in hardware, software, and services to create a whole that is far greater than the sum of its parts. The launch of Red Hat AI 3 is centered on driving enterprise AI inference, expanding model choice and enabling open models for optimized cost and flexibility, so organizations can go from training to "doing." And Red Hat partners play a critical role in making this happen.

To deliver AI inference at scale using vLLM and Kubernetes, the llm-d open source project is now generally available as part of Red Hat OpenShift AI 3.0, powered by a coalition of leading gen AI model providers, AI accelerators and premier AI cloud platforms. Founding contributors include CoreWeave, Google Cloud, IBM Research, and NVIDIA, with additional support from partners like AMD, Cisco, Hugging Face, Intel, Lambda, and Mistral AI. Since the project's introduction earlier this year, Microsoft, Oracle and WEKA have also become active contributing members.

As large language models (LLMs) become the foundation for a wide range of gen AI applications, Red Hat is introducing the Partner Model Validation Guide to empower our partners and provide greater choice to customers. This guide outlines a standardized, step-by-step process for Red Hat partners to benchmark their LLMs for inclusion in the Red Hat OpenShift AI model catalog.
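To make the benchmarking idea concrete, here is a minimal, illustrative sketch of the kind of throughput and latency measurement a partner might run against a candidate model using vLLM's offline Python API (the same inference engine that llm-d builds on). This is not the Partner Model Validation Guide's actual procedure; the model name, prompts, and reported metrics below are assumptions chosen only for illustration.

```python
# Illustrative only: a rough throughput/latency measurement with vLLM's
# offline Python API. The model name, prompts, and metrics are placeholders,
# not the Partner Model Validation Guide's prescribed workload.
import time

from vllm import LLM, SamplingParams

MODEL = "mistralai/Mistral-7B-Instruct-v0.3"  # hypothetical candidate model
prompts = [
    "Summarize the benefits of hybrid cloud in two sentences.",
    "Explain what an AI accelerator does.",
] * 16  # small batch to exercise continuous batching

llm = LLM(model=MODEL)  # loads the model onto the available accelerator(s)
params = SamplingParams(temperature=0.0, max_tokens=128)

start = time.perf_counter()
outputs = llm.generate(prompts, params)  # returns one RequestOutput per prompt
elapsed = time.perf_counter() - start

generated_tokens = sum(len(o.outputs[0].token_ids) for o in outputs)
print(f"Requests:         {len(outputs)}")
print(f"Total time:       {elapsed:.2f} s")
print(f"Throughput:       {generated_tokens / elapsed:.1f} tokens/s")
print(f"Mean time/request:{elapsed / len(outputs):.2f} s (batch-level approximation)")
```

A real validation run would follow the workloads and metrics the guide specifies; the sketch above is only meant to show the general shape of benchmarking a model served with vLLM.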
Open the original post ↗ https://www.redhat.com/en/blog/how-red-hat-partners-are-powering-next-wave-enterprise-ai