Pennomi wrote a whole list of potential ideas. And honestly, while I agree that local LLMs on typical hardware are underpowered for most tasks, it's possible they'd offer it as an option for those who can run it.
People are getting all upset over this announcement without even knowing what the plan actually is, like the word "AI" is making them foam at the mouth or something. I'm just saying we should reserve judgement until we have an idea of what's happening.
And I replied to that comment, without any mouth foaming.
Yes, and then you asked for ideas, which were already in the comment you replied to.