Open Source AI Hype and Openwashing Nonsense
-
Hackaday Podcast 218: Open Source AI, The Rescue Of Salyut 7, The Homework Machine
This week, Editor-in-Chief Elliot Williams and Kristina Panos have much in the way of Hackaday news — the Op Amp Challenge is about halfway over, and there are roughly three weeks left in the Assistive Tech challenge of the 2023 Hackaday Prize. Show us what you’ve got on the analog front, and then see what you can do to help people with disabilities live better lives!
-
The open-source AI boom is built on Big Tech’s handouts. How long will it last? [Ed: There's no "open-source AI boom"; they're proprietary and they actively abuse Free software, e.g. with GPL violations. This publisher is taking money from Microsoft entities, so such propaganda is not unexpected.]
Last week, a leaked memo reportedly written by Luke Sernau, a senior engineer at Google, said out loud what many in Silicon Valley must have been whispering for weeks: an open-source free-for-all is threatening Big Tech’s grip on AI. New open-source large language models—alternatives to Google’s Bard or OpenAI’s ChatGPT that researchers…
[...]
But this open-source boom is precarious. Most open-source releases still stand on the shoulders of giant models put out by big firms with deep pockets. If OpenAI and Meta decide they’re closing up shop, a boomtown could become a backwater.
For example, many of these models are built on top of LLaMA, an open-source large language model released by Meta AI. Others use a massive public data set called the Pile, which was put together by the open-source nonprofit EleutherAI. But EleutherAI exists only because OpenAI’s openness meant that a bunch of coders were able to reverse-engineer how GPT-3 was made, and then create their own in their free time.
-
The Climate Cost of the AI Revolution
ChatGPT and other AI applications such as Midjourney have pushed "Artificial Intelligence" high on the hype cycle. In this article, I want to focus specifically on the energy cost of training and using applications like ChatGPT, what their widespread adoption could mean for global CO₂ emissions, and what we could do to limit these emissions.
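The emissions question the article raises comes down to simple arithmetic: energy consumed multiplied by the carbon intensity of the grid supplying it. As a purely illustrative back-of-envelope sketch (not figures from the article), here is that calculation in Python; the per-query energy, query volume, and grid intensity values are placeholder assumptions chosen only to show the arithmetic.

# Illustrative sketch: emissions from electricity use are roughly
# energy consumed times the carbon intensity of the supplying grid.
# All figures below are placeholder assumptions, not measured values.

ENERGY_PER_QUERY_KWH = 0.003      # assumed energy per model query, in kWh
QUERIES_PER_DAY = 10_000_000      # assumed daily query volume
GRID_INTENSITY_KG_PER_KWH = 0.4   # assumed grid carbon intensity, kg CO2/kWh

def daily_co2_tonnes(energy_per_query_kwh: float,
                     queries_per_day: int,
                     grid_intensity_kg_per_kwh: float) -> float:
    """Estimate daily CO2 emissions in tonnes for a given inference workload."""
    daily_energy_kwh = energy_per_query_kwh * queries_per_day
    daily_co2_kg = daily_energy_kwh * grid_intensity_kg_per_kwh
    return daily_co2_kg / 1000.0

if __name__ == "__main__":
    tonnes = daily_co2_tonnes(ENERGY_PER_QUERY_KWH,
                              QUERIES_PER_DAY,
                              GRID_INTENSITY_KG_PER_KWH)
    print(f"Estimated inference emissions: {tonnes:.1f} tonnes CO2 per day")

With these placeholder inputs the sketch yields about 12 tonnes of CO₂ per day for inference alone; the same multiplication applies to training runs, and the grid-intensity term is why siting and scheduling on low-carbon electricity is one of the levers for limiting emissions.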
-
AI Hype Will Drive Datacenter GPU Prices Sky High
People are not going to be any more patient about adding generative AI to their workloads today than they were in the late 1990s and early 2000s about adding Web infrastructure to modernize their applications and put Internet-facing interfaces on them. The difference this time around is that the datacenter is not transforming itself into a general-purpose X86 compute substrate, but is instead becoming more and more an ecosystem of competing and complementary architectures, woven together to provide the best possible bang for the buck across a wider variety of workloads.