Python, Chatbots, and Podcasts
-
Advanced dependency management and building Python wheels with Meson
Everything here uses only Meson. There are no external dependency managers, Unix userland emulators, or special terminals that you have to use.
In theory this could work on macOS too, but the code is implemented in C++23 and Apple's toolchain is too old to support it.
-
dupeGuru: Find and Remove Duplicate Files on Any System
dupeGuru is a cross-platform (Linux, OS X, Windows) GUI tool to find duplicate files in a system. It’s written mostly in Python 3 and has the peculiarity of using multiple GUI toolkits, all using the same core Python code. On OS X, the UI layer is written in Objective-C...
-
Faster CPython at PyCon, part two
In part one of the tale, Brandt Bucher looked specifically at the CPython optimizations that went into Python 3.11 as part of the Faster CPython project. More of that work will be appearing in future Python versions. On day two of PyCon 2023 in Salt Lake City, Utah, Mark Shannon provided an overall picture of CPython optimizations, including efforts made over the last decade or more, with an eye toward the other areas that have been optimized, such as the memory layout of the interpreter's internal C data structures. He also described some additional optimization techniques that will be used in Python 3.12 and beyond.
-
Python: Get File Size from the System
There are several ways to get a file's size in Python, such as using "os.path.getsize()", "os.stat()", or "pathlib.Path().stat()".
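A minimal sketch of the three approaches mentioned above; the file name and contents are just illustrative, and all three report the size in bytes:

```python
import os
from pathlib import Path

# Create a small sample file so the example is self-contained.
path = "example.bin"
with open(path, "wb") as f:
    f.write(b"\x00" * 1024)

# Method 1: os.path.getsize() returns the size in bytes.
size1 = os.path.getsize(path)

# Method 2: os.stat() returns a stat result; st_size holds the size.
size2 = os.stat(path).st_size

# Method 3: pathlib.Path.stat() exposes the same st_size, object-style.
size3 = Path(path).stat().st_size

print(size1, size2, size3)  # all three agree: 1024 1024 1024

os.remove(path)  # clean up the sample file
```

Under the hood all three end up calling the same stat() system call, so the choice is mostly a matter of style: pathlib fits object-oriented code, while os.path.getsize() is the most direct one-liner.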
-
Democratizing AI with open-source language models [Ed: Openwashing and surveillance all-in-one]
Meta trained LLaMA on publicly available data sets, such as Wikipedia and Common Crawl. The code to run LLaMA is GPLv3-licensed, but to obtain the full weights of the model, users were required to fill out a form and agree to a "non-commercial bespoke license". Moreover, Meta proved to be quite selective in granting access. But within a week, the weights were leaked on BitTorrent, and LLaMA kickstarted the development of a lot of derivatives. Stanford University introduced Alpaca 7B, based on the LLaMA model with seven billion parameters and supplemented with instructions based on OpenAI's text-davinci-003 model of the GPT-3.5 family. Both the data set and the model were released under the CC BY-NC 4.0 license and thus do not permit commercial use. One reason for this is that OpenAI's terms of use disallow the development of models that compete with OpenAI.
-
Linux Out Loud 64: Linux Workflow
This week, Linux Out Loud chats about our Linux workflow. Welcome to episode 64 of Linux Out Loud. We fired up our mics, connected our headphones, and searched the community for themes to expound upon. We kept the banter friendly, the conversation somewhat on topic, and had fun doing it.