Fedora / Red Hat / IBM Leftovers
-
Linux Format 322
Power up your PC with Fedora! Discover the cutting-edge goodies in the latest release of Fedora from the experts at Red Hat.
-
Red Hat Official ☛ Red Hat Device Edge for Industrial Applications: A Journey from Datacenter to Plant Floor
For the last 30 years, Red Hat Enterprise Linux (RHEL) has been the solid foundation of datacenter and cloud infrastructures, powering everything from websites and databases to enterprise applications.
-
Red Hat Official ☛ Red Hat OpenShift Incident Detection uses analytics to help you quickly detect issues
Your Red Hat OpenShift subscription now includes access to an Incident Detection capability that uses analytics to group alerts into incidents and help you quickly and easily understand the underlying issue and how to address it.
-
Red Hat Official ☛ Red Hat OpenShift 4.17: What you need to know
Red Hat OpenShift 4.17 Highlights
-
Red Hat Official ☛ Red Hat and Deloitte Collaborate to Modernize the Developer Experience
To create a unified experience for developers across teams and infrastructure, Deloitte and Red Hat are announcing an expanded collaboration, building Red Hat Developer Hub into the core of the Deloitte Engineering platform. Deloitte customers will now have access to a customized, all-in-one platform for developers, helping to accelerate development cycles, boost efficiency and empower teams to focus on creating enhanced user experiences.
-
Bryan Lunduke ☛ Red Hat: GNU/Linux is the Past, AI is the Future
Red Hat CEO says "AI needs to be everywhere", purchases AI company.
-
Red Hat Makes Bevy of Updates to OpenShift Platforms
Red Hat today made available an update to its OpenShift platform that adds a technology preview of a virtual artificial intelligence (AI) assistant, dubbed Lightspeed, that makes it possible to troubleshoot issues using a natural language interface.
-
LWN ☛ Anaconda’s new "Web UI" (Fedora Magazine)
Garrett LeSage has written an in-depth article for Fedora Magazine about a new web-based user interface (UI) for Fedora's Anaconda installer, planned to ship with Fedora 42. The article looks at the rationale for moving from GTK 3 to a web-based UI and provides a number of screenshots and demo screencasts, as well as instructions for trying out the new installer with Fedora Rawhide.
-
Red Hat ☛ LLMs and Red Hat Developer Hub: How to catalog AI assets
This article outlines how to use Red Hat Developer Hub to catalog AI assets in an organization. Cataloging AI assets can be particularly useful for platform engineers who are looking to consolidate their organization's list of approved models while enforcing access and usage restrictions, as well as for AI developers looking for a straightforward way to consume AI models.
-
Red Hat Official ☛ Guide to Red Hat observability with OpenShift 4.17
With Red Hat OpenShift 4.17, we continue to enhance the OpenShift observability offering. Observability plays a key role in monitoring, troubleshooting and optimizing OpenShift clusters. This article guides you through the latest features and integrations that help you improve the observability of your OpenShift environment.
-
Red Hat ☛ Deliver generative AI at scale with NVIDIA NIM on OpenShift AI
Earlier this year, Red Hat announced plans to integrate support for NVIDIA NIM microservices on Red Hat OpenShift AI to help streamline inferencing for dozens of AI/ML models on a consistent, flexible hybrid cloud platform. NVIDIA NIM, part of the NVIDIA AI Enterprise software platform, is a set of easy-to-use inference microservices for accelerating the deployment of foundation models and keeping your data secure.
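The excerpt does not show what consuming one of these microservices looks like, so here is a minimal, hedged sketch: NIM services expose an OpenAI-compatible HTTP API, and the in-cluster service URL and model name below are hypothetical placeholders rather than values taken from the article.

```python
# Hypothetical sketch: querying a NIM inference microservice deployed on
# OpenShift AI through its OpenAI-compatible endpoint. The service hostname
# and model identifier are placeholders, not taken from the article.
from openai import OpenAI

client = OpenAI(
    base_url="http://llama3-nim.demo.svc.cluster.local:8000/v1",  # assumed in-cluster service
    api_key="not-needed-in-cluster",  # substitute a real key if your deployment enforces auth
)

completion = client.chat.completions.create(
    model="meta/llama3-8b-instruct",  # placeholder model name served by the NIM container
    messages=[{"role": "user", "content": "Summarize our return policy in two sentences."}],
    max_tokens=128,
)
print(completion.choices[0].message.content)
```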
-
Red Hat ☛ How to template AI software in Red Hat Developer Hub
When developers or engineers want to code a new application or a project, they often have to go through the stages of learning a framework or adopting the best practices associated with it. Wouldn’t it be easier if there were a standard template that scaffolds the basic code for you, with all the necessities, so you can get up and running in a short time? In the world of AI development and deploying large language models (LLMs) with your applications, it is even more important to get familiar with the basics of AI. We can achieve this by using the software templates in Red Hat Developer Hub.
-
Red Hat Official ☛ Introducing Climatik: Power capping AI applications for data center sustainability [Ed: Buzzwords and nonsense]
This presents a critical challenge: how can data centers manage their energy consumption without sacrificing the performance needed to support their AI applications? The new Climatik project, developed by Red Hat in collaboration with project contributors from Intel, Bloomberg and IBM, aims to address this challenge by offering a scalable cloud native solution that optimizes energy efficiency through power capping.
-
Red Hat Official ☛ How to make generative AI more consumable [Ed: More buzzword nonsense like "cloud computing" and "AI"]
Think about some of the past trends in technology, and you’ll start to see some patterns emerge. For example, with cloud computing there’s no one-size-fits-all approach. Combinations of different approaches, such as on-premises deployments and different cloud providers, have led organizations to take advantage of hybrid infrastructure when deploying their enterprise applications. When we think about the future, a similar structure will be essential for the consumption of artificial intelligence (AI) across diverse applications and business environments. Flexibility will be crucial, as no single AI approach will meet the needs of every organization, and no single AI platform vendor can fulfill all of them. Instead, a combination of prebuilt models, custom-tuned solutions and secure integration with proprietary data will drive AI adoption. Thanks to open frameworks, software and infrastructure, companies of all sizes can now access and customize generative AI (gen AI) models, adapting them to their specific needs.
-
Red Hat Official ☛ FAQ: Red Hat to acquire Neural Magic [Ed: Investing in hype]
On Nov. 12, 2024, Red Hat announced that it has signed a definitive agreement to acquire Neural Magic, subject to regulatory reviews and other customary closing conditions.
-
Red Hat Official ☛ Creating cost-effective specialized AI solutions with LoRA adapters on Red Hat OpenShift AI [Ed: Title stuffed with buzzwords, nothing else]
Low-rank adaptation (LoRA) is a faster and less resource-intensive fine-tuning technique that can shape how a large language model responds to a request through model adaptation. Model adaptation helps refine and optimize a general-purpose model to better handle specialized requirements without running a full fine-tuning process. LoRA doesn’t modify the parameters of the base model; instead, it adds additional parameters during the tuning process. This collection of newly trained parameters is known as an adapter. These adapters can be merged into the model or, in our case, written to a separate file that is loaded at runtime. Multiple adapters can be created from a base model by using different training data sets.
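As a rough illustration of the adapter workflow described above (train a small set of extra parameters, save them separately from the unchanged base model, then attach or merge them at load time), here is a minimal sketch using the Hugging Face transformers and peft libraries; the base model name and adapter path are placeholders, and the article itself may use different tooling on OpenShift AI.

```python
# Minimal LoRA adapter sketch with Hugging Face transformers + peft.
# The model name and adapter path below are illustrative placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, PeftModel

base_name = "facebook/opt-350m"  # placeholder base model
base = AutoModelForCausalLM.from_pretrained(base_name)
tokenizer = AutoTokenizer.from_pretrained(base_name)

# Wrap the frozen base model with a small set of trainable low-rank matrices.
config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the adapter weights are trainable

# ... run a normal fine-tuning loop here on the specialized data set ...

# The adapter is written out separately from the (unchanged) base weights.
model.save_pretrained("adapters/support-tickets")

# At serving time, reload the base model and attach the adapter file.
served = PeftModel.from_pretrained(
    AutoModelForCausalLM.from_pretrained(base_name),
    "adapters/support-tickets",
)
# served = served.merge_and_unload()  # optional: fold the adapter into the base weights
```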
-
Red Hat Official ☛ An Introduction to TrustyAI [Ed: Red Hat jumping the shark, hugging hype and buzzword]
With increasing global regulation of AI technologies, toolkits for responsible AI are an invaluable and necessary asset to any MLOps platform. Since 2021, TrustyAI has been independent of Kogito, and has grown in size and scope amidst the recent AI fervor. As a major contributor to TrustyAI, Red Hat is dedicated to helping the community continue to grow and evolve.