cron@feddit.org to Technology@lemmy.world • OpenAI, Google, Anthropic admit they can't scale up their chatbots any further (English)
3 days ago
It's absurd that some of the larger LLMs now use hundreds of billions of parameters (e.g. Llama 3.1 with 405B).
This doesn't really seem like a smart use of resources if you need several of the largest GPUs available just to run one conversation.
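To put a rough number on that claim: a quick back-of-envelope sketch of how much memory a 405B-parameter model needs just to hold its weights, at a few common precisions (the bytes-per-parameter figures are standard; everything else here is illustrative arithmetic, not a measurement of any specific deployment):

```python
# Rough memory footprint of a 405B-parameter model's weights alone
# (ignores KV cache, activations, and framework overhead, which add more).
params = 405e9  # parameter count from the comment above (Llama 3.1 405B)

# Bytes per parameter at common precisions.
precisions = {"fp16/bf16": 2.0, "int8": 1.0, "int4": 0.5}

for name, bytes_per_param in precisions.items():
    gib = params * bytes_per_param / 2**30
    print(f"{name}: ~{gib:,.0f} GiB for weights")
```

At fp16 that is roughly 750 GiB of weights, which is why even 80 GiB-class accelerators have to be ganged together in groups to serve a single instance of a model this size.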
It might be my personal preference, but I find conversations on Mastodon hard to follow once there are more than a handful of replies.
Lemmy (or Reddit) keeps the flow of comments well ordered and perfectly readable.