Red Hat Leftovers
-
Red Hat Official ☛ Getting hands-on with Red Hat OpenShift Virtualization
If you’re like me, before you buy into a product, you probably want to try it out first. The OpenShift Virtualization Roadshow lets you do just that in a lab environment where OpenShift Virtualization experts will be available to answer your questions.
-
Red Hat ☛ Ensure a scalable and performant environment for ROSA with hosted control planes
Red Hat OpenShift Service on AWS (ROSA) is a turn-key and performant solution to run managed Red Hat OpenShift on Amazon Web Services (AWS). It lets users focus more on their applications and less on the infrastructure or platform they run on, helping drive developer innovation.
Ensuring that OpenShift is performant and scalable is a core tenet of the OpenShift Performance and Scale team at Red Hat. Prior to its release (and still to this day), ROSA undergoes a vast array of performance and scale testing to ensure that it delivers industry-leading performance. These tests run the gamut from control plane and data path workloads to upgrades and network performance. They have been used to help measure and improve the performance of “classic” ROSA, but what happens when we move to hosted control planes?
-
Fedora Project ☛ Fedora Community Blog: Fedora London Meetup comes alive in 2024
Our London-based Fedora Community members got together in person to get to know the people behind the Fedora Account System and Matrix nicks.
-
InfoQ ☛ AI Lab Extension Allows Podman Desktop Users to Experiment with LLMs Locally [Ed: Red Hat riding moronic hype waves that generate lies and plagiarism]
One year after its 1.0 release, Podman Desktop announced the Podman AI Lab plugin, promising to help developers start working with Large Language Models (LLMs) on their machines. Podman AI Lab streamlines LLM workflows, featuring generative AI exploration, a built-in recipe catalogue, curated models, local model serving, an OpenAI-compatible API, code snippets, and playground environments.
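Because the plugin exposes an OpenAI-compatible API, existing client libraries can talk to a locally served model. Below is a minimal sketch of what that might look like in Python, assuming a model is already being served on the local machine; the endpoint URL, port, and model name are placeholders for illustration, not documented Podman AI Lab defaults.

```python
# Minimal sketch: querying a locally served model through an
# OpenAI-compatible endpoint. The base_url, api_key, and model name
# are assumptions for illustration, not Podman AI Lab defaults.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local inference endpoint
    api_key="none",                       # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder identifier for the locally served model
    messages=[{"role": "user", "content": "Summarize what a container is."}],
)
print(response.choices[0].message.content)
```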
-
Highlights from Red Hat Summit 2024: Expanding on Innovation [Ed: They emphasise buzzwords instead of actual substance]
Red Hat CEO Matt Hicks believes that the intersection of open source and AI is opening up an explosion of options, with open source and academia at the center of innovation. Open source is seen in LLM development, deployment, and container platforms, and is fueling machine learning, neural networks, and generative AI. Matt predicts that AI will not be built by a single vendor or a monolithic model, but through contributions to models and fine-tuning within the open-source community. InstructLab offers a platform for training LLMs, whether keeping data private or contributing back. Red Hat is also working on virtualization, with KubeVirt being in the top 10 Cloud Native Computing Foundation projects and being developed by 100 companies, including Dell, IBM, HPE, Amazon Web Services (AWS), NetApp, and NVIDIA. The company is focused on guiding the open-source community to build the necessary capability and durability for different ecosystems.
-
Ask Noah Show 390
This week Vincent Danen, VP of Product Security at Red Hat, joins the Ask Noah Show to talk about open source security practices. We take your calls, answer your questions, and give you an update about The Linux Challenge Coin.
-
6Q4: Red Hat’s Francis Chow On Software-Defined Vehicles, Edge, and Open Source
The auto industry is moving to software-defined vehicles to meet rapidly changing customer preferences and the constant need to update existing features or add advanced new ones. Here is a brief overview of the technical challenges and technologies that will make all of this happen.