Open source-powered AI/ML for the hybrid cloud
Enterprise-grade artificial intelligence and machine learning (AI/ML) for developers, data engineers, data scientists, and operations teams.
Try it in our no-cost Developer Sandbox
IT admins: try it in your own cluster

Overview
Open source software is at the heart of cutting-edge innovation like Generative AI, in addition to its already prominent role in powering Predictive AI. To deliver these innovations at global scale, enterprises have to deal with the complexities of security, privacy, compliance, reliability, scale, and performance. As a result, they usually end up with a hybrid cloud footprint, with data and applications deployed on environments ranging from on-premises data centers to hyperscaler cloud provider infrastructure. Operationalizing open source-powered AI/ML in intelligent applications that deliver dramatically enhanced customer experiences across a hybrid cloud environment requires a platform with capabilities for both machine learning operations (MLOps) and application development.

MLOps platform
An MLOps platform, with workflows inspired by DevOps and GitOps principles, integrates ML models into the software development process.
- A flexible and scalable platform with tools to build, deploy, and manage AI-enabled applications.
- Leverage the vast number of pre-trained models from open source providers.
- Utilize ML frameworks and serving formats such as PyTorch, TensorFlow, and ONNX for model development (a brief export sketch follows this list).
- Deliver inference in a hybrid cloud environment at high performance and throughput.
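
For example, a pre-trained model can be exported to the ONNX serving format so that an ONNX-compatible runtime can serve it across the hybrid cloud. The sketch below is a minimal illustration, assuming PyTorch and torchvision are installed; the ResNet-50 model and the output file name are illustrative choices, not a required workflow.

```python
# Minimal sketch (assumption: PyTorch and torchvision are installed).
# Export a pre-trained model to the ONNX serving format; the model choice
# and file name are illustrative.
import torch
import torchvision.models as models

# Load a pre-trained model from an open source provider (torchvision here).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()

# A dummy input fixes the tensor shape recorded in the exported graph.
dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX so an ONNX-compatible runtime can serve the model.
torch.onnx.export(
    model,
    dummy_input,
    "resnet50.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```

The exported file can then be packaged into a container image or uploaded to model storage for serving, depending on how your platform delivers inference.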


Application platform
A consistent Kubernetes-based application platform, running on any cloud, for developing, deploying, and managing existing and modernized cloud-native applications.
- A wide range of languages, runtimes, and frameworks to develop business logic.
- Integration technologies, such as API management and single sign-on (SSO), that allow applications to be exposed securely and at scale.
- Tools that support modern CI/CD DevOps practices.
- Developer tools that enable seamless onboarding and DevSecOps.
Hybrid cloud AI/ML platforms combine MLOps and application platform capabilities by:
- Providing developers, data engineers, data scientists and operations teams with consistency in how applications and models are developed, packaged, deployed, and managed.
- Developing, training, tuning, deploying, and serving models and applications as containerized workloads through common interfaces and tools, with mature cloud-native CI/CD practices, without dealing directly with the underlying complexities of Kubernetes configuration, orchestration, security, and compliance.
- Supporting containerized AI workloads and their specialized needs.
- Enabling an ecosystem of specialized best-in-class open source projects and ISV software that complement and extend the platform.

Hybrid cloud AI/ML platform capabilities
Learn about the capabilities of a hybrid cloud AI/ML platform, including AI workloads, an integrated MLOps and application development platform, and developer productivity tools.

AI workload support
Containerized workloads deployed across the hybrid cloud, based on the core AI techniques of machine learning (ML) and deep learning, and driven by data and information.

Integrated MLOps & App Dev platform
A common platform that brings IT, data science, and app dev teams together to support the end-to-end lifecycle of ML models and cloud-native applications.

Developer Tools & AI-enabled products
AI-enabled code generation, internal developer portals, and MLSecOps that enhance the developer experience through open source-powered developer tools.

AI/ML learning exercises
Try these self-directed learning exercises to gain experience and bring your creativity to AI with Red Hat OpenShift AI, Red Hat's dedicated platform for building AI-enabled applications. Learn about the full suite of MLOps tools to train, tune, and serve models for purpose-built applications.

Fundamentals of OpenShift AI
Learn the foundations of Red Hat OpenShift AI, which gives data scientists and developers a powerful AI/ML platform for building AI-enabled applications. Data scientists and developers can collaborate to move quickly from experiment to production in a consistent environment.

Real-time Data Collection and Processing
Create a demo application using the full development suite: MobileNet V2 with tensor input/output, transfer learning, live data collection, a data preprocessing pipeline, and model training and deployment on a Red Hat OpenShift AI developer sandbox.
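
If you want a feel for the transfer-learning step before starting the exercise, the minimal sketch below uses TensorFlow/Keras to put a new classification head on a pre-trained MobileNet V2. The class count, input size, and training settings are illustrative assumptions and do not reflect the exercise's actual dataset.

```python
# Minimal sketch: transfer learning with MobileNet V2 in TensorFlow/Keras.
# The class count, input shape, and training settings are illustrative.
import tensorflow as tf

NUM_CLASSES = 3  # hypothetical number of classes from live data collection

# Start from MobileNet V2 pre-trained on ImageNet, without its classifier head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)
base.trainable = False  # freeze the pre-trained feature extractor

# Attach a small classification head for the new task.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# `train_ds` would come from the live data collection and preprocessing pipeline.
# model.fit(train_ds, epochs=5)
```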

Data Engineering: Extract Live Data
Learn engineering techniques for extracting live data from images and logs of the fictional bike-sharing app, Pedal. You will deploy a Jupyter Notebook environment on Red Hat OpenShift AI, develop a pipeline to process live image and log data, and extract meaningful insights from the collected data.
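
As a taste of the kind of log processing involved, here is a minimal sketch that parses structured log lines into a pandas DataFrame and aggregates one metric. The log format, field names, and file path are hypothetical; they are not Pedal's actual schema.

```python
# Minimal sketch: extract ride counts per station from application logs.
# The log format and file path are hypothetical, not Pedal's actual schema.
import re
import pandas as pd

LOG_LINE = re.compile(
    r"(?P<timestamp>\S+) station=(?P<station>\S+) event=(?P<event>\S+)"
)

def parse_log(path: str) -> pd.DataFrame:
    """Parse log lines into a DataFrame of timestamp/station/event records."""
    records = []
    with open(path) as f:
        for line in f:
            match = LOG_LINE.match(line)
            if match:
                records.append(match.groupdict())
    return pd.DataFrame(records)

df = parse_log("pedal.log")
df["timestamp"] = pd.to_datetime(df["timestamp"])

# Example insight: rides started per station per hour.
rides_per_hour = (
    df[df["event"] == "ride_start"]
    .groupby(["station", pd.Grouper(key="timestamp", freq="H")])
    .size()
)
print(rides_per_hour.head())
```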

Latest AI/ML articles

Discover the comprehensive security and scalability measures for a...

Learn how to overcome compatibility challenges when deploying OpenShift AI...

Harness Llama Stack with Python for LLM development. Explore tool calling,...

A beginner's guide to Podman Desktop and Podman AI Lab.