Latest in redhat.com
-
Red Hat ☛ New: Local development with JetBrains IDEs in OpenShift Dev Spaces
While businesses emphasize centralized, secure development environments, developers often struggle with the high-end hardware requirements of traditional desktop IDEs. The Red Hat OpenShift Dev Spaces 3.19 release brings the best of both worlds to developers by introducing a local development experience for the following JetBrains IDEs:
- IntelliJ IDEA Ultimate
- WebStorm
- PyCharm
- RubyMine
- CLion
Now users can seamlessly integrate their cloud-native development environments with the local JetBrains IDE experience (Figure 1).
-
Red Hat Official ☛ New updates to Konveyor AI: Use AI-driven application modernization without fine-tuning a model [Ed: A mix of buzzwords and hype]
Bringing AI-powered modernization to developers
-
Red Hat Official ☛ Red Hat Advanced Cluster Management 2.13 expands hybrid cloud management capabilities
Red Hat today announced the general availability of Red Hat Advanced Cluster Management 2.13, delivering enhanced capabilities for managing hybrid cloud environments. This release introduces key features that simplify operations across traditional virtualization and cloud-native infrastructure. Red Hat Advanced Cluster Management 2.13 is also an Extended Update Support (EUS) term 2 release (Red Hat Advanced Cluster Management 2.11 was the previous one), meaning Red Hat offers an optional third year of extended support with backports of critical- and important-impact security updates and urgent bug fixes. That gives your central hub management a longer window of maintenance and security hardening, to keep your mind at ease.
-
Red Hat Official ☛ Meet vLLM: For faster, more efficient LLM inference and serving [Ed: Still dabbling in a bubble which will implode]
vLLM, originally developed at UC Berkeley, is specifically designed to address the speed and memory challenges that come with running large AI models. It supports quantization, tool calling and a smorgasbord of popular LLM architectures (Llama, Mistral, Granite, DeepSeek, and more). Let's look at the innovations behind vLLM, why more than 40,000 developers have starred it on GitHub, and how to get started today.