
macallik OP ,

Personally, my (uneducated) opinion is that we already have plug-and-play functionality at the program level, i.e. I can add an OpenAI API key to various programs and make them 'smarter'. Since the Linux experience is often pretty piecemeal as is, this would be a solid enough approach for most.
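For illustration, a minimal sketch of that plug-and-play pattern, assuming the official openai Python client and an OPENAI_API_KEY environment variable (the model name is just a placeholder, not tied to any particular app):

    # Sketch of the "drop in an API key" pattern many apps follow.
    # Assumes the `openai` Python package and OPENAI_API_KEY set in the environment.
    import os
    from openai import OpenAI

    # The client also picks up OPENAI_API_KEY automatically if api_key is omitted.
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name for this example
        messages=[{"role": "user", "content": "Summarize this changelog for me."}],
    )
    print(response.choices[0].message.content)

Point being: the "smarts" sit behind one HTTP API, so any program that can make that call can bolt it on without the desktop environment being involved at all.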

In terms of AI being ingrained within a desktop environment, that seems harder for me to imagine. Like how the Office suite has AI functionality, would the KDE suite of apps allow for cross-program functionality? Would this require a substantial change in system requirements for local processing? Would there be an open-source LLM hosted in the cloud for chat purposes that also meets the privacy expectations of the average Linux user?

I understand people's apprehension towards Linux distros seemingly chasing the latest fad, but I think it's also worth hypothesizing about the alternative if AI and LLMs are here to stay and end up being a real differentiator.

nottheengineer ,

LLMs are big, so you either need a powerful PC to run them or use cloud services. Linux users tend to not be fans of either, so it’ll probably take a while before anything big happens.

Besides, for the things where an LLM actually makes sense (like a copilot-style code generator), there are already implementations.

waspentalive ,

I am a Debian user, and I can't really say I am not a fan of "Big". I have a laptop as my production machine but I also have as big a file server as I can afford. I would not want an AI that is part of my OS unless it is local. I do use ChatGPT and Stable Diffusion, but only for non-critical functions.
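To illustrate the local-only angle: several local runners expose an OpenAI-compatible endpoint, so the same client code can point at localhost instead of a cloud service. A minimal sketch, assuming Ollama is running on its default port with a llama3 model pulled (both assumptions about the setup):

    # Sketch: the same OpenAI-style client, pointed at a local server instead of the cloud.
    # Assumes Ollama is listening on localhost:11434 and `ollama pull llama3` has been run.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",  # the client requires a value, but the local server ignores it
    )

    response = client.chat.completions.create(
        model="llama3",
        messages=[{"role": "user", "content": "Explain what a symlink is."}],
    )
    print(response.choices[0].message.content)

Nothing leaves the machine in that setup, which is roughly what "only if it's local" looks like in practice.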
