
List:       kde-devel
Subject:    Re: Interest in building an LLM frontend for KDE
From:       "Joseph P. De Veaugh-Geiss" <joseph () kde ! org>
Date:       2023-12-04 11:09:43
Message-ID: 06c8b6ca-0d80-4f6f-94ed-705ea2c4f7b9 () kde ! org

Hi,

On 12/2/23 13:00, kde-devel-request@kde.org wrote:
> Hey Loren,
> 
> I agree with everyone else that as long as the model is ethically sourced and
> local-by-default (which are my two main concerns with modern LLMs), I have no
> qualms about including such a frontend in KDE.

Re ethical LLMs: Nextcloud has developed a traffic-light system based on 
three criteria 
(https://nextcloud.com/blog/ai-in-nextcloud-what-why-and-how/) to 
address some of the concerns discussed in this thread:

1. Is the software open source? (Both for inferencing and training)
2. Is the trained model freely available for self-hosting?
3. Is the training data available and free to use?

Perhaps KDE could adopt a similar system?
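For illustration, here is a minimal sketch of how such a traffic-light 
rating could be expressed in code. The names and the exact mapping from 
criteria to colors are hypothetical (Nextcloud's own scheme may differ in 
detail); the point is only that the three yes/no questions yield a simple, 
transparent rating:

```python
# Hypothetical sketch of a Nextcloud-style "traffic light" rating for an
# LLM, based on the three yes/no criteria above. All names are illustrative,
# not an actual Nextcloud or KDE API.

from dataclasses import dataclass


@dataclass
class ModelEthicsInfo:
    open_source_code: bool     # 1. inferencing and training software open source?
    model_self_hostable: bool  # 2. trained model freely available for self-hosting?
    training_data_free: bool   # 3. training data available and free to use?


def traffic_light(info: ModelEthicsInfo) -> str:
    """One plausible mapping: green if all three criteria are met,
    red if none are, yellow otherwise."""
    score = sum([info.open_source_code,
                 info.model_self_hostable,
                 info.training_data_free])
    if score == 3:
        return "green"
    if score == 0:
        return "red"
    return "yellow"
```

A fully open model would then rate "green", a model with open inference 
code but closed weights and data "yellow", and a fully closed service "red".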

> I think the hardest part will be the marketing around this new software. One
> of our current goals is "Sustainable Software", and current LLMs, as far as I
> understand, are extremely power-hungry and inefficient. If you do go ahead
> with this project, I would be very careful with how we promote it. It would
> have to somehow push KDE values while also avoiding the ethical, climate, and
> social minefield that currently surrounds this topic. Just something to keep
> in mind 🙂

I agree with the concerns Josh raises about the energy consumption of 
training LLMs (see, e.g., [1]). A further benefit of satisfying the 
criteria above is that it then becomes possible for us to measure the 
energy consumed in training and using the LLMs. KDE could then be 
transparent about the energy consumption of these tools and present this 
information to users.
Cheers,
Joseph

[1] "The real climate and transformative impact of ICT: A critique of 
estimates, trends, and regulations", 2021. Charlotte Freitag, Mike 
Berners-Lee, Kelly Widdicks, Bran Knowles, Gordon S. Blair, and Adrian 
Friday. https://doi.org/10.1016/j.patter.2021.100340

"AI has the greatest potential for impact given the complexity of 
training and inferencing on big data, and especially so-called deep 
learning. Researchers have estimated that 284,019 kg of CO2e are emitted 
from training just one machine learning algorithm for natural language 
processing, an impact that is five times the lifetime emissions of a 
car. While this figure has been criticized as an extreme example (a more 
typical case of model training may only produce around 4.5 kg of CO2), 
the carbon footprint of model training is still recognized as a 
potential issue in the future given the trends in computation growth for 
AI: AI training computations have in fact increased by 300,000x between 
2012 and 2018 (an exponential increase doubling every 3.4 months)."

> Thanks,
> Josh

-- 
Joseph P. De Veaugh-Geiss
KDE Internal Communications & KDE Eco Community Manager
OpenPGP: 8FC5 4178 DC44 AD55 08E7 DF57 453E 5746 59A6 C06F
Matrix: @joseph:kde.org

Generally available Monday-Thursday from 10-16h CET/CEST. Outside of 
these times it may take a little longer for me to respond.

KDE Eco: Building Energy-Efficient Free Software!
Website: https://eco.kde.org
Mastodon: @be4foss@floss.social
