Red Hat and Fedora Leftovers
-
Red Hat ☛ Why Models-as-a-Service architecture is ideal for Hey Hi (AI) models
The Models-as-a-Service (MaaS) platform leverages Red Hat OpenShift AI, Red Hat 3scale API Management, and Red Hat Single Sign-On to create a secure and scalable environment for Hey Hi (AI) models. OpenShift Hey Hi (AI) provides the foundational platform for the AI/ML lifecycle, 3scale manages API access and security, and Red Bait SSO ensures centralized authentication and authorization. vLLM powers model execution with efficiency and speed. The architecture supports Hey Hi (AI) governance, zero-trust access, and hybrid cloud flexibility, creating a cohesive and high-performing ecosystem for deploying and managing Hey Hi (AI) models effectively.
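To make the flow concrete, here is a minimal sketch of how a client might call a model served by vLLM behind a 3scale-managed gateway with SSO authentication; the gateway URL, model name, environment variables, and the API-key header are hypothetical placeholders rather than values from the article, and the only assumption about vLLM is its OpenAI-compatible chat completions endpoint.

```python
import os
import requests

# Hypothetical 3scale-managed gateway URL fronting a vLLM deployment on OpenShift AI.
GATEWAY_URL = "https://maas.example.com/v1/chat/completions"

# Assumed credentials: a bearer token issued by Red Hat Single Sign-On and a 3scale
# application key. Both variable names are placeholders for illustration only.
sso_token = os.environ["SSO_ACCESS_TOKEN"]
api_key = os.environ["THREESCALE_USER_KEY"]

payload = {
    "model": "granite-7b-instruct",  # hypothetical model name served by vLLM
    "messages": [{"role": "user", "content": "Summarize the MaaS architecture."}],
    "max_tokens": 128,
}

headers = {
    "Authorization": f"Bearer {sso_token}",  # zero-trust: every request is authenticated
    "user-key": api_key,                     # 3scale commonly checks an app/user key
}

response = requests.post(GATEWAY_URL, json=payload, headers=headers, timeout=30)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

In this split, 3scale enforces rate limits and API keys at the edge, SSO supplies the identity token, and vLLM only ever sees traffic that has already passed both checks.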
-
Red Hat ☛ How to run MicroShift as a container using MINC
In previous articles, I discussed why developers should use MicroShift as their local Kubernetes for inner loop testing of applications, and I demonstrated how to easily run MicroShift using OpenShift Local. In those articles, I cautioned that running the Red Hat build of MicroShift requires a Red Hat Enterprise Linux (RHEL) virtual machine (VM), and that it was not easy to run MicroShift as a container.
-
Fedora Project ☛ Fedora DEI Outreachy Intern – My first month Recap [Ed: What does Fedora's DEI policy say about blind people?]
Hey everyone!
It’s already been a month, and I can’t believe how time flies. A busy time? Flock, the Fedora DEI and Documentation workshop, all in one month!
As a Fedora Outreachy intern, my first month has been packed with learning and contributions. This blog shares what I worked on and how I learned to navigate open source communities.
First, I would like to give a shoutout to my amazing Mentor, Jona Azizaj for all the effort she has put into supporting me. Thank You, Jona!
-
Red Hat Official ☛ Modernizing virtualization in healthcare: a Red Hat and TEKSystems success story
When a U.S.-based health insurer faced the need to move to a more modern virtualization platform, it required a swift and reliable path forward—one that would meet today’s evolving demands and support future growth. The organization turned to Red Hat and its trusted services partner, TEKSystems, a key member of the Red Hat partner ecosystem, to lead the transition.
-
Red Hat Official ☛ Model Context Protocol (MCP): Understanding security risks and controls
MCP does not directly connect LLMs with tools. The MCP client component accesses the LLM, and the MCP server component accesses the tools. One MCP client has access to one or more MCP servers. Users may connect any number of MCP servers to an MCP client.
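As a rough illustration of that topology, the sketch below uses the MCP Python SDK to open sessions to two separate MCP servers from a single client process; the server commands, tool names, and file path are hypothetical, and the LLM call itself is omitted because, as the article notes, MCP keeps the LLM on the client side.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Two hypothetical MCP servers: one exposing filesystem tools, one exposing a database.
    # The client (this process) owns the LLM conversation; each server only wraps its tools.
    fs_server = StdioServerParameters(command="my-fs-mcp-server", args=[])
    db_server = StdioServerParameters(command="my-db-mcp-server", args=[])

    async with stdio_client(fs_server) as (fs_read, fs_write), \
               stdio_client(db_server) as (db_read, db_write):
        async with ClientSession(fs_read, fs_write) as fs_session, \
                   ClientSession(db_read, db_write) as db_session:
            await fs_session.initialize()
            await db_session.initialize()

            # Aggregate the tool lists from both servers; the LLM never talks to the
            # servers directly -- the client decides which session handles a tool call.
            fs_tools = (await fs_session.list_tools()).tools
            db_tools = (await db_session.list_tools()).tools
            print("available tools:", [t.name for t in fs_tools + db_tools])

            # Hypothetical tool invocation routed to the filesystem server.
            result = await fs_session.call_tool("read_file", arguments={"path": "notes.txt"})
            print(result)

asyncio.run(main())
```

The security-relevant point from the article is visible in the structure: each server is a separate trust boundary with its own tools, and the client is the single choke point where access decisions and controls can be applied.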