nazokiyoubinbou (@nazokiyoubinbou@mastodon.social)

@arstechnica A "decision" implies a choice is even being made. If you walk up to a mirror with warps in it, the image may look normal when you stand just right, but distorted when you don't. The mirror has not decided to show you one or the other. The mirror just sits there, reflecting light.

We have GOT to stop treating things like LLMs as if they actually were "AI." They are not. The key point here is they should not be making "decisions" because they simply can't.
