AI getting it wrong ...
Over the weekend, I had occasion to search Google repeatedly while troubleshooting my main server and trying to bring it back online. Not only were all of the AI answers just wrong (often subtly, but still wrong), but Safari on the iPad does not play well with the AI responses (the Google page would refresh a few times before the browser gave up). I had to append "-AI" to all of my queries to suppress the AI output. Sigh.
I did a plain-language search on Google again today, and the AI response was ... humorous, if the implications weren't so dire.
The MH-100QMNT has a quilted maple top (hence the "QM" – nothing "quantum" about it), and the "NT" stands for "no tremolo" – it's a hardtail with a Tune-o-Matic bridge. But that's not the biggest issue. The MH-100 definitely does not have a 3+3 tuner configuration (the Les Paul-style EC series does); it has a pretty standard 6-in-line angled headstock:
*[Image: MH-100QMNT headstock]*
I have more to write on this (don't I always?), but not enough time right now (also: always). In the meantime, a few links that echo my experiences:
- Groundbreaking BBC research shows issues with over half the answers from Artificial Intelligence (AI) assistants
- Largest study of its kind shows AI assistants misrepresent news content 45% of the time – regardless of language or territory
- Google AI Overviews put people at risk of harm with misleading health advice
- The Onion’s Tips for using AI (“[i]f the AI’s response seems incorrect, try changing your perception of reality so it is”) 🤣
It’s not there (yet?)...

