news
IBM's Red Hat Pivots to Slop, CIQ Follows (Slop, Not Freedom)
-
Red Hat Official ☛ The new AI stack: Choice, control, and production-ready innovation [Ed: IBM Red Hat selling slop again - truly a shame.]
At Red Hat, our mission remains centered on open source principles: Collaboration, transparency, and choice. We believe that for AI to truly deliver on its promise of productivity and business value, it cannot remain a proprietary "black box." It must be grounded in the same solid engineering principles that made Linux, Kubernetes, and other open source innovations the foundation of the modern enterprise.
-
Red Hat ☛ How to manage Red Bait OpenShift Hey Hi (AI) dependencies with Kustomize and Argo CD [Ed: All about slop again]
Red Hat OpenShift AI provides a platform for data science and Hey Hi (AI) workloads, but managing its external dependencies can become complex, slowing down setup and maintenance.
To streamline this workflow, we created the odh-gitops repository. This repository provides a declarative, version-controlled, GitOps-ready template for deploying OpenShift Hey Hi (AI) dependencies on OpenShift. It includes predefined configurations to help you manage the entire dependency lifecycle.
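A declarative, Kustomize-based template of the kind described can be sketched as a minimal `kustomization.yaml`. The `odh-gitops` repository name comes from the article, but the paths and resource names below are illustrative assumptions, not the repository's actual layout:

```yaml
# kustomization.yaml -- hypothetical overlay sketch; the real odh-gitops
# repository's structure and resource names may differ.
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization

# Base manifests for the dependency operators (placeholder paths)
resources:
  - base/serverless-operator
  - base/service-mesh-operator

# Environment-specific patches layered on top of the shared base
patches:
  - path: patches/channel-override.yaml  # placeholder patch file
    target:
      kind: Subscription
```

Because the overlay is plain Git-tracked YAML, Argo CD can reconcile the cluster from it directly, which is what makes the template "GitOps-ready."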
-
Red Hat ☛ Zero trust GitOps: Build a secure, secretless GitOps pipeline
Within the Red Bait OpenShift ecosystem, OpenShift GitOps is one of the most popular Day 2 operators. It lets teams use Git as the single source of truth for managing cluster configuration and application deployments.
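The "Git as single source of truth" pattern can be sketched with an Argo CD Application manifest like the following; all names and the repository URL are placeholders, not details from the article:

```yaml
# application.yaml -- illustrative Argo CD Application for OpenShift GitOps;
# metadata names, path, and repoURL are placeholders.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: cluster-config
  namespace: openshift-gitops
spec:
  project: default
  source:
    repoURL: https://example.com/org/cluster-config.git  # placeholder repo
    targetRevision: main
    path: overlays/production
  destination:
    server: https://kubernetes.default.svc
    namespace: openshift-gitops
  syncPolicy:
    automated:
      prune: true      # delete resources removed from Git
      selfHeal: true   # revert manual drift back to the Git state
```

With `automated` sync enabled, the controller continuously reconciles the cluster against the Git directory, so manual changes outside Git do not persist.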
-
Red Hat Official ☛ Subscription watch: Managing your hybrid cloud estate
Managing a hybrid cloud environment spanning on-premises data centers, edge deployments, and multiple public clouds often results in subscription sprawl. Even in simpler environments, it can be challenging to maintain clear visibility into subscription use. Organizations frequently struggle to answer a basic question: “Exactly how much of our purchased Red Hat capacity are we actually using right now?”
-
PR Newswire ☛ CIQ Announces General Availability of RLC Pro AI, Enterprise Linux Built to Deliver More from Every GPU in Production
CIQ, the founding support and services partner of Rocky Linux, today announced the general availability of Rocky Linux from CIQ Pro AI (RLC Pro AI), an Enterprise Linux distribution purpose-built for AI inference and GPU-accelerated workloads, engineered to deliver more from every GPU in production. RLC Pro AI ships today with PyTorch and the full NVIDIA CUDA and DOCA-OFED stack, with expanded support for additional hardware partners and frameworks on the active roadmap.
AI infrastructure is now core to how enterprises operate. Organizations across every industry are moving GPU-accelerated workloads into production, and the operating system (OS) has become the constraint. The OS underneath AI workloads determines how much performance the hardware actually delivers. For most enterprises, that performance has been left on the table.