Fedora / Red Hat / RHEL / IBM Leftovers
-
GNOME ☛ Richard Hughes: Prem’Day 2025 – Firmware Update Management with LVFS & fwupd
A few weeks ago I was invited to talk about firmware updates for servers using fwupd/LVFS at Prem’Day 2025. I gave the hardware vendors a really hard time, and got lots of instant feedback from customers in the audience via the “little green thumbs” that people could raise. The main takeaway from the Prem’Day community seemed to be that proprietary tooling adds complexity without value, and that using open ecosystems enables users to better operate their infrastructure.
-
Web Pro News ☛ Oracle Ships Unbreakable Enterprise Kernel 8
Oracle has shipped Unbreakable Enterprise Kernel 8 (UEK 8), the latest version of its Oracle Linux kernel, bringing improvements to memory management, file system support, and more.
Oracle is a major player in the enterprise Linux space and ships its own “purpose-built Oracle Linux kernel” as part of Oracle Linux. The latest release, version 8, is one the company is touting as its best yet.
“This is our eighth release of UEK, and I think it’s the best one yet,” said Greg Marsden, senior vice president of Linux software development, Oracle. “In addition to bringing in significant improvements like memory folios, UEK 8 is built on the foundation of the UEK-next project. For the past year, UEK-next has allowed both Oracle and our customers to test out the latest upstream features that are now production- and enterprise-ready with UEK 8. UEK has been around nearly as long as upstream stable kernels, and it continues to push the boundaries of Linux innovation and deliver the performance and stability that businesses depend on—while keeping Linux open and free.”
-
Red Hat ☛ Node.js function calling with LangGraph.js in Podman AI Lab
AI tool calling, also known as function calling, extends large language models (LLMs) to allow them to perform specific actions instead of just generating text responses. This article dives into a new recipe, recently added to the Podman AI Lab extension, dedicated to AI function calling with Node.js using the LangGraph.js framework.
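For readers new to the idea, here is a minimal sketch of tool calling in TypeScript using LangGraph.js's prebuilt ReAct agent; the getWeather tool, the model choice, and the configuration values are illustrative assumptions, not taken from the Red Hat recipe.

```typescript
// Minimal sketch of AI tool calling with LangGraph.js. Assumptions: the
// prebuilt ReAct agent and an OpenAI-compatible chat model are available;
// the "get_weather" tool is a made-up example, not the one from the recipe.
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";

// A tool the model is allowed to call instead of answering from memory.
const getWeather = tool(
  async ({ city }) => `It is currently 18°C and cloudy in ${city}.`,
  {
    name: "get_weather",
    description: "Look up the current weather for a city.",
    schema: z.object({ city: z.string().describe("City name") }),
  }
);

// Bind the tool to a chat model; the agent loops between the LLM and tools.
const agent = createReactAgent({
  llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
  tools: [getWeather],
});

// The model decides to call get_weather, sees its output, then answers.
const result = await agent.invoke({
  messages: [{ role: "user", content: "What is the weather in Brno?" }],
});
console.log(result.messages.at(-1)?.content);
```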
-
Red Hat Official ☛ Model Context Protocol: Discover the missing link in AI integration
Sound familiar?
-
Red Hat ☛ A practical example of the custom metrics autoscaler
The custom metrics autoscaler is the Red Hat version of the Kubernetes Event Driven Autoscaler (KEDA) community operator, based on the Cloud Native Computing Foundation (CNCF) project. It is designed to use metrics from different sources, such as pod CPU and memory usage or other cluster metrics, to decide whether to scale a workload up or down based on the metrics in question.
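To make that concrete, below is a minimal sketch of the kind of ScaledObject resource the operator consumes, built as a plain TypeScript object and written out as JSON (which kubectl also accepts). The "frontend" Deployment, namespace, replica bounds, and the 60% CPU target are illustrative assumptions, not values from the Red Hat article.

```typescript
// Sketch of a KEDA ScaledObject that scales a Deployment on CPU utilization.
// All names and numbers here are illustrative assumptions. Note that the
// cpu/memory triggers require resource requests on the target pods.
import { writeFileSync } from "node:fs";

const scaledObject = {
  apiVersion: "keda.sh/v1alpha1",
  kind: "ScaledObject",
  metadata: { name: "frontend-scaler", namespace: "demo" },
  spec: {
    scaleTargetRef: { name: "frontend" }, // the Deployment to scale
    minReplicaCount: 1,
    maxReplicaCount: 10,
    triggers: [
      {
        type: "cpu",
        metricType: "Utilization",
        metadata: { value: "60" }, // scale out above ~60% average CPU
      },
    ],
  },
};

// Write the manifest so it can be applied with:
//   kubectl apply -f scaledobject.json
writeFileSync("scaledobject.json", JSON.stringify(scaledObject, null, 2));
```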