IBM/Red Hat Leftovers

  • Best practices for building images that pass Red Hat Container Certification

    Building unique images for various container orchestrators can be a maintenance and testing headache. A better idea is to build a single image that takes full advantage of the vendor support and security built into Red Hat OpenShift, and that also runs well in Kubernetes.

    A universal application image (UAI) is an image that uses Red Hat Universal Base Image (UBI) from Red Hat Enterprise Linux as its foundation. The UAI also includes the application being deployed, adds extra elements that make it more secure and scalable in Kubernetes and OpenShift, and can pass Red Hat Container Certification.

    This article introduces you to nine best practices you should incorporate into your Dockerfile when building a UAI. Each section in this article explains a practice, shows you how to implement the practice, and includes Red Hat certification requirements related to the topic.
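As a taste of what such practices look like in a Dockerfile, here is a minimal sketch (not the article's exact list; the label values, paths, and base image tag are illustrative assumptions):

```dockerfile
# Illustrative sketch of a certification-friendly Dockerfile.
# Label values and paths are examples, not requirements.
FROM registry.access.redhat.com/ubi9/ubi-minimal:latest

# Container Certification expects standard identifying labels.
LABEL name="my-app" \
      vendor="Example, Inc." \
      version="1.0" \
      release="1" \
      summary="Example universal application image" \
      description="Runs the example app on UBI"

# Certification also expects license files shipped in the image.
COPY licenses/ /licenses/

COPY app/ /opt/app/

# Run as a non-root user so the image works under OpenShift's
# restricted security context constraints.
USER 1001

ENTRYPOINT ["/opt/app/run.sh"]
```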

  • What is AI/ML and why does it matter to your business?

    AI/ML—short for artificial intelligence (AI) and machine learning (ML)—represents an important evolution in computer science and data processing that is quickly transforming a vast array of industries.

    As businesses and other organizations undergo digital transformation, they’re faced with a growing tsunami of data that is at once incredibly valuable and increasingly burdensome to collect, process and analyze. New tools and methodologies are needed to manage the vast quantity of data being collected, to mine it for insights and to act on those insights when they’re discovered.

  • High performance computing 101

    The data is in—massive amounts of it, and high computing power can help enterprises make some sense out of it. For a technology that has gone through ebbs and flows in popularity, high performance computing (HPC) may be expanding to use cases beyond those found in scientific research as more industries can tap into valuable insights gained from artificial intelligence, machine learning, and other emerging technologies.

    So, what does this mean to your organization? If you’re increasingly facing the need to translate large amounts of consumer data to track trends or calculate thousands of financial transactions a day to support business growth, is HPC something you should be considering?

  • Top 5 resources to learn about the IBM and Cloudera partnership

    Six months, four blogs, three videos, two conference presentations, and one amazing partnership — that is how I would describe the IBM and Cloudera partnership so far. This blog post highlights some of the best developer-focused resources to help you leverage your data to build AI-enabled applications.

    Earlier this year, IBM and Cloudera announced a partnership to create a new joint offering: Cloudera Data Platform for IBM Cloud Pak for Data, bringing together two leading data platforms. The benefits of using both platforms are outlined on the various product pages and focus on security, scalability, and, of course, combining the best technologies for data and AI.

    Soon after, a few of us on the IBM Developer and Hybrid Cloud Build Team were tasked with testing the products, building PoCs for customers, and creating assets to be consumed by external audiences.

    Below are our top five resources for learning about the IBM and Cloudera partnership. Before we get into it, I would like to give a shout-out to the folks that made it possible: Tim Robinson, Brett Coffmann, Dave Fowler, Marc Chisinevski, and Erik Beebe. Let’s get started!

  • CentOS project moves to development using GitLab

    The CentOS Project announced the launch of a collaborative development service based on the GitLab platform. The decision to use GitLab as the primary hosting platform for the CentOS and Fedora projects was made last year. Notably, the infrastructure is hosted not on the project's own servers but on GitLab's hosted service, where a dedicated section has been set up for CentOS-related projects.

    At the moment, work is underway to integrate this section with the CentOS project's user base, which will allow developers to sign in to the GitLab service using their existing accounts. Separately, it is noted that the existing Pagure-based infrastructure will continue to serve as the place to host the source code of packages ported from RHEL, as well as the basis for forming the CentOS Stream 8 branch. The CentOS Stream 9 branch, however, is already being developed in a new repository on GitLab and is notable for being open to contributions from the community. Other projects hosted on the Pagure-based infrastructure remain in place for now and are not forced to migrate.

  • Simplify Kafka authentication with Node.js

    Apache Kafka is a publish-subscribe messaging system that is commonly used to build loosely coupled applications. These types of applications are often referred to as reactive applications.

    Our team maintains a reactive example that shows the use of Kafka in a simple application. If you've looked at these types of applications, you know that although the components are decoupled, they need access to a shared Kafka instance. Access to this shared instance must be protected. This means that each component needs a set of security credentials that it can use to connect to the Kafka instance.

    As a Node.js developer, how can you safely share and use those credentials without a lot of work? Read on to find out.

    Note: You can learn more about using Node.js in reactive applications in the article Building reactive systems with Node.js.
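One common way to avoid hard-coding such credentials is to read them from the environment and assemble the client configuration at startup. The sketch below (an illustration, not the article's exact approach; the variable names and the KafkaJS-style config shape are assumptions) shows the idea:

```javascript
// Build a Kafka client configuration from environment variables so
// security credentials never live in source code. The env var names
// (KAFKA_BROKERS, KAFKA_USER, KAFKA_PASSWORD) are illustrative.
function buildKafkaConfig(env = process.env) {
  const { KAFKA_BROKERS, KAFKA_USER, KAFKA_PASSWORD } = env;
  if (!KAFKA_BROKERS || !KAFKA_USER || !KAFKA_PASSWORD) {
    throw new Error('Missing Kafka credentials in environment');
  }
  return {
    clientId: 'reactive-example',
    brokers: KAFKA_BROKERS.split(','), // comma-separated broker list
    ssl: true,
    sasl: {
      mechanism: 'plain', // or e.g. 'scram-sha-512', depending on the broker
      username: KAFKA_USER,
      password: KAFKA_PASSWORD,
    },
  };
}

module.exports = { buildKafkaConfig };
```

A client library such as kafkajs could then consume the result (`new Kafka(buildKafkaConfig())`), and each decoupled component gets its credentials injected at deploy time rather than baked into the image.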

More in Tux Machines

digiKam 7.7.0 is released

After three months of active maintenance and another bug triage, the digiKam team is proud to present version 7.7.0 of its open-source digital photo manager. See below for a list of the most important features coming with this release. Read more

Dilution and Misuse of the "Linux" Brand

Samsung, Red Hat to Work on Linux Drivers for Future Tech

The metaverse is expected to uproot system design as we know it, and Samsung is one of many hardware vendors re-imagining data center infrastructure in preparation for a parallel 3D world. Samsung is working on new memory technologies that provide faster bandwidth inside hardware for data to travel between CPUs, storage and other computing resources. The company also announced it was partnering with Red Hat to ensure these technologies have Linux compatibility. Read more

today's howtos

  • How to install go1.19beta on Ubuntu 22.04 – NextGenTips

    In this tutorial, we are going to explore how to install Go on Ubuntu 22.04. Golang is an open-source programming language that is easy to learn and use. It has built-in concurrency and a robust standard library. It is reliable, builds quickly, and produces efficient software that scales well. Its concurrency mechanisms make it easy to write programs that get the most out of multicore and networked machines, while its novel type system enables flexible and modular program construction. Go compiles quickly to machine code and offers the convenience of garbage collection and the power of run-time reflection. In this guide, we are going to learn how to install golang 1.19beta on Ubuntu 22.04. Go 1.19beta1 is not yet officially released, and there is still much work in progress, including on the documentation.

  • molecule test: failed to connect to bus in systemd container - openQA bites

    Ansible Molecule is a project that helps you test your Ansible roles. I'm using Molecule to automatically test the Ansible roles of geekoops.
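The "failed to connect to bus" error typically appears when systemd runs inside a container without the settings it needs. With a Docker-based Molecule driver, this is usually addressed via platform options along these lines (a sketch; the image name and exact options are assumptions and vary by driver and host):

```yaml
# molecule/default/molecule.yml (fragment)
platforms:
  - name: instance
    image: registry.example.com/systemd-enabled-image:latest
    command: /usr/sbin/init          # boot systemd as PID 1
    privileged: true                 # or grant only the needed capabilities
    volumes:
      - /sys/fs/cgroup:/sys/fs/cgroup:ro
```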

  • How To Install MongoDB on AlmaLinux 9 - idroot

    In this tutorial, we will show you how to install MongoDB on AlmaLinux 9. For those of you who didn't know, MongoDB is a high-performance, highly scalable document-oriented NoSQL database. Unlike SQL databases, where data is stored in rows and columns inside tables, MongoDB structures data in JSON-like records referred to as documents. The open-source nature of MongoDB makes it an ideal candidate for almost any database-related project. This article assumes you have at least basic knowledge of Linux, know how to use the shell, and, most importantly, host your site on your own VPS. The installation is quite simple and assumes you are running as the root account; if not, you may need to add 'sudo' to the commands to get root privileges. I will show you the step-by-step installation of the MongoDB NoSQL database on AlmaLinux 9. You can follow the same instructions for CentOS and Rocky Linux.
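The key step in this kind of install is typically adding MongoDB's official yum repository before running dnf. A sketch of the repo file (the 6.0 series and URLs follow MongoDB's documented pattern, but verify them against the version the tutorial targets):

```ini
# /etc/yum.repos.d/mongodb-org-6.0.repo (version series is an assumption)
[mongodb-org-6.0]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/redhat/9/mongodb-org/6.0/x86_64/
gpgcheck=1
enabled=1
gpgkey=https://www.mongodb.org/static/pgp/server-6.0.asc
```

With the repository in place, the install itself is typically `sudo dnf install -y mongodb-org` followed by `sudo systemctl enable --now mongod`.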

  • An introduction (and how-to) to Plugin Loader for the Steam Deck. - Invidious
  • Self-host a Ghost Blog With Traefik

    Ghost is a very popular open-source content management system. It started as an alternative to WordPress and went on to become an alternative to Substack by focusing on memberships and newsletters. The creators of Ghost offer managed Pro hosting, but it may not fit everyone's budget. Alternatively, you can self-host it on your own cloud servers. On Linux Handbook, we already have a guide on deploying Ghost with Docker in a reverse proxy setup. Instead of the Nginx reverse proxy, you can also use another piece of software called Traefik with Docker. It is a popular open-source cloud-native application proxy, API gateway, edge router, and more. I use Traefik to secure my websites with SSL certificates obtained from Let's Encrypt. Once deployed, Traefik can automatically manage your certificates and their renewals. In this tutorial, I'll share the necessary steps for deploying a Ghost blog with Docker and Traefik.
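The heart of such a setup is usually a set of Traefik labels on the Ghost container in the Compose file. A minimal sketch (the service name, domain, entrypoint, and certificate resolver name are placeholder assumptions, not the tutorial's exact values):

```yaml
# docker-compose.yml fragment: Ghost routed through Traefik with TLS
services:
  ghost:
    image: ghost:5
    environment:
      url: https://blog.example.com
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.ghost.rule=Host(`blog.example.com`)"
      - "traefik.http.routers.ghost.entrypoints=websecure"
      - "traefik.http.routers.ghost.tls.certresolver=letsencrypt"
```

Traefik watches the Docker socket, picks up these labels, and requests and renews the Let's Encrypt certificate for the host rule automatically.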