news
Latest From Red Hat's Official Site (Some Buzzwords, Too)
-
Red Hat Official ☛ Virtualization success stories: Join Red Hat OpenShift Virtualization's momentum in 2025
More organizations than ever are adopting Red Hat OpenShift Virtualization, underscoring its momentum as a trusted choice for enterprise virtualization workloads and modern infrastructure. Since the start of 2024, customer adoption has accelerated sharply: the number of customers has grown by 178%, production clusters have more than doubled (a 121% increase), and the number of VMs managed by OpenShift Virtualization has risen by more than 250%. These adoption figures make it clear that organizations are choosing OpenShift Virtualization as their reliable platform for running and scaling critical VM-based workloads, while retaining the flexibility to modernize applications on their own terms, whenever they’re ready.
-
Red Hat Official ☛ Trust and authenticity: In the kitchen and the software supply chain
Software is inherently complex, so an analogy from an area of life we can all relate to should help. Here's a conversation about cooking lasagna!
-
Red Hat Official ☛ The “No Math AI” podcast: demystifying the impact of the latest AI developments [Ed: Pushing buzzword and hype and apparently there's now "PhD student in AI".]
Hosted by Dr. Akash Srivastava, Red Hat chief architect and a pioneer of the InstructLab project, and Isha Puri, Massachusetts Institute of Technology PhD student in AI, “No Math AI” is a monthly podcast designed to make cutting-edge AI research more accessible. Whether you’re an AI practitioner, business leader or a tech enthusiast, this podcast offers insights into the real-world impact of AI advancements on business.
-
Red Hat Official ☛ Red Hat Summit 2025: A guide for first-time attendees
Spread the word - invite your team or network to join
-
Red Hat Official ☛ Cloud-native at your pace: Why the guide you choose matters
The journey to the cloud-native enterprise.
-
Red Hat ☛ How to scale smarter with OpenShift Serverless and Knative
Welcome back to our series about building a smart, cloud-native financial advisor powered by AI. In How to build AI-ready applications with Quarkus, we explored how to infuse the application with AI-powered chat and retrieval-augmented generation. In this installment, we'll evolve the WealthWise application into a serverless architecture using Red Hat OpenShift Serverless, enabling more responsive scaling, lower operational costs, and flexible deployment options across hybrid and multi-cloud environments.
Why move to serverless?
Over time, deployment models for software have evolved to make better use of available computing resources. The move from applications running on bare-metal servers to virtual machines allowed multiple monolithic systems to run on a single piece of hardware. The introduction of container technology meant that applications could be decomposed into smaller microservices, which could in turn be deployed to make the best use of available computing resources, as shown in Figure 1.
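As a rough illustration of the serverless model the article describes, a Knative Service manifest can declare scale-to-zero and concurrency-based autoscaling through annotations. This is a minimal sketch only; the service name and container image below are placeholders, not taken from the WealthWise article.

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: wealthwise        # hypothetical name, for illustration only
spec:
  template:
    metadata:
      annotations:
        # Allow the service to scale down to zero replicas when idle
        autoscaling.knative.dev/min-scale: "0"
        # Cap the number of replicas under load
        autoscaling.knative.dev/max-scale: "10"
        # Target concurrent requests per replica before scaling out
        autoscaling.knative.dev/target: "50"
    spec:
      containers:
        - image: quay.io/example/wealthwise:latest  # placeholder image
```

With `min-scale: "0"`, Knative routes requests through its activator while no replicas exist and spins pods up on demand, which is where the lower operational cost of idle workloads comes from.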
-
Red Hat ☛ The benefits of dynamic GPU slicing in OpenShift
In the era of AI and machine learning, efficient resource management is paramount. As Red Hat OpenShift administrators, we face the challenge of deploying intensive AI workloads on a platform where GPUs represent a significant cost. Traditional methods like pre-slicing with NVIDIA’s Multi-Instance GPU (MIG) can lead to resource wastage, especially when the static slices do not align with dynamic workload demands.
In this article, we will explore how dynamic GPU slicing—enabled by the dynamic accelerator slicer operator—can revolutionize GPU resource management in OpenShift by dynamically adjusting allocation based on workload needs.
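For context on what "slicing" looks like to a workload: with MIG enabled, the NVIDIA GPU operator advertises slice profiles as extended Kubernetes resources, and a pod requests one in its resource limits. The sketch below assumes the common A100 `1g.5gb` profile name; the pod name is hypothetical, and the exact resource names exposed by a dynamic slicer operator may differ from static MIG naming.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: mig-inference     # hypothetical name, for illustration only
spec:
  restartPolicy: Never
  containers:
    - name: inference
      image: nvcr.io/nvidia/cuda:12.4.1-base-ubuntu22.04
      command: ["nvidia-smi", "-L"]   # list the GPU slice visible to the container
      resources:
        limits:
          # One 1g.5gb MIG slice (NVIDIA GPU operator resource naming);
          # a dynamic slicer may expose a different resource name.
          nvidia.com/mig-1g.5gb: 1
```

The wastage problem the article raises follows from this model: a statically pre-sliced GPU exposes a fixed menu of such resources whether or not workloads actually fit them, whereas dynamic slicing adjusts the partitions to match the requests that arrive.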