
ChainMail | Chatbot Blues

Artificial intelligence can't always be trusted. Let me tell you about the time an AI chatbot lied to me about a famous blues musician.

A few days ago, I decided to compile a list of pop songs about supply chains (I had my reasons; it's for a future column). So I asked ChatGPT, the generative AI chatbot, for suggestions. ChatGPT identified various appropriate songs, though mostly it focused on adjacent themes such as songs about factories and organized labor. Looking to add to the list, I asked ChatGPT specifically if it knew of any songs about delivering packages. In response, ChatGPT immediately identified an intriguing tune by a legendary bluesman.

ChatGPT: "Package Delivery Blues" by Big Bill Broonzy: This classic blues song is about a man who's struggling to make ends meet as a delivery driver, and he's feeling weighed down by the burdens of his job.

Wow! I was generally familiar with Broonzy, who was active from the 1920s to the 1950s, and looked forward to listening to "Package Delivery Blues," which suggested he had anticipated the plight of the Amazon driver. But the song wasn't on Spotify and didn't appear in Google searches. So maybe it was lesser known? I searched discographies, including lists of all Broonzy's 78 RPM singles and B-sides. Nothing. I rechecked Google. Where was this "classic blues song"?

I was at the library to do more research when I stopped in my tracks. I remembered a phenomenon known as the artificial intelligence hallucination problem, in which large language models confidently provide information that is totally false, as if seeking to maintain their all-knowing status through bull*****ing.

Tech companies acknowledge the issue. "The new AI systems are 'built to be persuasive, not truthful,'" The New York Times said, quoting an internal Microsoft document. "This means that outputs can look very realistic but include statements that aren't true."

I began to suspect that the chatbot was hallucinating "Package Delivery Blues," but its description of the song was so specific that I was still unsure. My next step was to re-interview ChatGPT and ask for evidence. Here's our exchange:

MICHAEL: You told me there is a song about supply chains called "Package Delivery Blues" by Big Bill Broonzy. He's a famous blues musician, but I can find no evidence that "Package Delivery Blues" exists. Can you provide proof?

ChatGPT: I apologize for the error in my previous response. Upon further research, I have not been able to find any evidence that Big Bill Broonzy recorded a song called "Package Delivery Blues." It seems that the information I provided was incorrect, and I am sorry for any confusion or inconvenience this may have caused. Thank you for bringing this to my attention.

At least the AI confessed.

But don't be fooled. ChatGPT and other such large language model programs sound smart but should be fact-checked. They aren't ready for the responsibility of teaching humans. At times they fabricate information. Be too trusting and you'll get the chatbot blues.

Read the complete Issue 34 of ChainMail here.

