Elder Millennial Internet Rant

Stop pretending that LLMs are fact-finding tools. They don’t work that way.

Hallucinations are a feature, not a bug.

Enhancing creativity (malevolent or not) is a use case.

In machine learning, “ground truth” is a term with a specific meaning.

If your objective were to democratize spam and flood the zone with “truthiness”, there is no better purpose-built tool for achieving that meh result than LLMs on the internet.

As someone who was once paid to fix people’s internet connections by un-checking and re-checking the TCP/IP box, and, when that didn’t work, uninstalling and reinstalling Dial-Up Networking (only if you had the Windows install CD!), I cannot understand why this is not more obvious to people.
