26 December 2023

AIs are fraudsters, but with no apparent motive (?)

I've been playing around with Google's version of GPT (called "Bard"). At first I thought it was a bit more accurate than GPT, which I tried using quite a bit earlier this year. But then I asked it to recommend some recent books on the origin of life. Ho boy. Not only did it fail to mention Transformer, by Nick Lane, probably the most important popular book on this subject in the last decade, but it recommended a very interesting-sounding title by a well-respected biochemist, supposedly published in 2021. The only problem? The biochemist is a real person, author of important textbooks and many scientific articles, and he might well have written just the book described. But he didn't. There is no such book, as Bard readily admitted when challenged (with its fakey apologies).

Apparently this is a big problem, which I asked Bard about. It said that simply "cite checking" by algorithm isn't as easy as it sounds, and that the tendency of "large language models" to make shit up (my words) is persistent and very problematic.

Sean Carroll said he used the latest version of GPT to create a course syllabus on a subject he hadn't taught before. He said it looked great: it covered all the topics, gave the right emphasis, and included a terrific-sounding set of readings. The problem? Again, about half the works recommended didn't actually exist. Something is weird about this, though I'm not sure exactly what. Maybe AI is accessing some of the other universes in the multiverse, where these books do exist? Borgesian weirdness abounds.

AI has no consciousness; of that everyone seems pretty sure. But it definitely does unexpected things, so I hope that, as a species, we tread a bit lightly and don't give it the nuclear codes.

Cheers and Happy New Year, o denizens of the Brave New World. 




Gyromantic Informicon. Comments are not moderated. If you encounter a problem, please go to home page and follow directions to send me an e-mail.