Psychological care should grant you freedom and protection
"Imagine an AI parenting companion that's always in your corner – ready to answer questions or talk about how you're feeling at any time of the day (or night)," posts one of the authors from the venture capital firm Andreessen Horowitz, who writes about large language models (LLM) and machine-learning companies, and seems to think this is a bright vision. More accurately, this is a nightmare when the app is proprietary and the company collects your data like Mamatech LTD does with its app Soula.
Soula claims to be the "first ethical AI in maternity" and "to empower women to flourish as individuals and mothers." While it's debatable whether software is the right solution here (I personally think we need more human support), I fully agree with the founder of Soula that we need to support pregnant and postpartum parents much more! It would therefore be fabulous to have truly ethical software that helps parents find information and advice during difficult phases of pregnancy and parenthood.
If Soula really does a job as important as it promises, empowering "women to flourish as individuals and mothers," it is all the more essential that it grants its users software freedom, i.e. the freedom to run, modify, copy, and share the software. Don't you agree that all parents should be able to benefit from the help that Soula promises? Shouldn't they be able to use it whenever, wherever, and however they see fit, and to share it with other caregivers who could also be supported by it? And shouldn't researchers be able to study what the machine-learning application does, refine it, and adapt it for similarly important use cases? The problem, beyond the question of how sensible it is to rely on a machine-learning application that, contrary to the "AI" label, is not intelligent at all, is that the app used to communicate with the LLM is proprietary.