Red Hat and Fedora Leftovers
-
Red Hat ☛ Run DialoGPT-small on OpenShift AI for internal model testing
With the rise of generative AI, many enterprises are exploring how to bring large language models (LLMs) into secure, internal cloud-native environments. When used with KServe, vLLM, and GPU support, platforms like Red Hat OpenShift AI provide a robust approach to serving models efficiently at scale.
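As a minimal sketch of the serving approach the article describes (the resource names, namespace, runtime, and storage URI below are illustrative assumptions, not taken from the article), a KServe InferenceService backed by a vLLM runtime on a GPU node might look like:

```yaml
# Hypothetical example: all names and the storageUri are placeholders
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: dialogpt-small          # assumed name for the internal test model
  namespace: model-testing      # assumed internal project/namespace
spec:
  predictor:
    model:
      modelFormat:
        name: huggingface       # format label matched against the runtime
      runtime: vllm-runtime     # assumed ServingRuntime wrapping vLLM
      storageUri: s3://models/dialogpt-small   # placeholder model location
      resources:
        limits:
          nvidia.com/gpu: "1"   # request one GPU for inference
```

Once admitted, KServe reconciles this into a model server deployment and exposes an inference endpoint for it; the exact runtime name and model format depend on how the cluster's ServingRuntimes are configured.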
-
Red Hat ☛ Unlocking the power of OpenShift Service Mesh 3
In this article, we will discuss how Red Hat OpenShift Service Mesh 3 facilitates advanced traffic management, observability, and security policies. As microservices have become the standard for modern applications, we've found that with great flexibility comes great complexity. What starts as a simple design of a few independent services can quickly grow into a tangled web of communication.
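Traffic management of the kind described above is typically expressed through Istio-style resources; a minimal, hypothetical weighted canary split between two versions of a service (service and subset names are assumptions for illustration) could look like:

```yaml
# Hypothetical canary split: host and subset names are illustrative
apiVersion: networking.istio.io/v1beta1
kind: VirtualService
metadata:
  name: reviews
spec:
  hosts:
    - reviews
  http:
    - route:
        - destination:
            host: reviews
            subset: v1
          weight: 90            # 90% of traffic stays on the stable version
        - destination:
            host: reviews
            subset: v2
          weight: 10            # 10% is canaried to the new version
```

The `v1`/`v2` subsets would be defined in a companion DestinationRule; shifting the weights gradually is the usual way to roll a new version out without a hard cutover.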
-
Red Hat ☛ Kubernetes MCP server: AI-powered cluster management
The Model Context Protocol (MCP) server extension for Kubernetes and OpenShift enables AI assistants like Visual Studio Code (VS Code), Microsoft Copilot, and Cursor to safely and intelligently interact with your Red Hat OpenShift and Kubernetes clusters. This guide walks through how to set up the Kubernetes MCP server, configure secure access with least-privilege ServiceAccounts, and leverage its capabilities to streamline cluster inquiries and troubleshooting through natural language commands.
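A least-privilege setup of the kind the guide describes can be sketched as a read-only ServiceAccount bound to a narrow ClusterRole (all names and the resource list below are assumptions for illustration, not the guide's own manifests):

```yaml
# Hypothetical read-only access for an MCP server; names are placeholders
apiVersion: v1
kind: ServiceAccount
metadata:
  name: mcp-readonly
  namespace: mcp-system
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: mcp-readonly
rules:
  - apiGroups: ["", "apps"]
    resources: ["pods", "pods/log", "services", "events", "deployments"]
    verbs: ["get", "list", "watch"]   # read-only: no create/update/delete
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: mcp-readonly
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: mcp-readonly
subjects:
  - kind: ServiceAccount
    name: mcp-readonly
    namespace: mcp-system
```

A short-lived token for the assistant can then be minted with `kubectl create token mcp-readonly -n mcp-system`, so the AI tooling can inspect the cluster but never mutate it.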
-
LWN ☛ Fedora considers an AI-tool policy
The Fedora project has posted a proposal for a policy regarding the use of AI tools when developing for the distribution.