If there’s one thing I love, it’s jumping on bandwagons. From COVID to cryptocurrency to COVID again to crypto again (can’t wait to see this year’s FTX patches!), I will absolutely take a current trend and apply a baseball veneer to it in order to get 700 words of content.
So, AI. I’m all in on AI. The only problem is I don’t know what I would use AI for. The good news is, there’s a solution! I can just ask ChatGPT what I should use it for:
Baseball writers can use AI to enhance their research, analysis, and storytelling capabilities.
AI-powered tools can help writers collect and analyze vast amounts of data, enabling them to uncover hidden patterns, insights, and trends that might not be apparent through traditional research methods. For example, AI can help writers identify statistical trends, player tendencies, and game strategies that can inform their analysis and reporting.
Additionally, AI can assist writers in generating data-driven insights and predictions, improving the accuracy of their reporting and enhancing the value of their content. AI-powered writing tools can also help writers identify opportunities for personalization, tailoring their content to the preferences and interests of specific audiences.
Overall, AI can enable baseball writers to produce more informed, engaging, and personalized content that resonates with readers and enhances their understanding and appreciation of the sport.
Oh, hell yeah! I love producing more informed, engaging, and personalized content that resonates with readers and enhances their understanding and appreciation of the sport. Great idea!
Seems like a fun tool, right? Maybe it is. So why not go all-in? I used Midjourney, an image generator rather than a text generator, to produce some images of “AI running SF Giants:”
Should I have used a word other than “running,” since that just got me a computer-generated image of a player running? Who’s to say?
But after I got the AI to make the above images, I thought I should take a look at the possible downsides of AI technology, so I got it to simulate “SF Giants after AI caused apocalypse.”
Oh damn, that got dark! There’s, like, giant skeletons in baseball stadiums and looming clouds and apocalypse lighting. What a unique vision of baseball in a post-society hellscape. I’m interested to learn more!
Ha ha, just kidding! None of it means a thing. Nothing.
ChatGPT did not receive my prompt of “100 words on how baseball writers can use ai” and think about it, consider some ways that it would be useful for a baseball writer, and then deliver them. Instead, it received my prompt, parsed it, and then wrote words that in its vast library of writing samples typically followed the words that came immediately before. There was no thinking behind it. There was no meaning.
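To make that “words that typically follow the words that came immediately before” idea concrete, here’s a toy sketch of the most stripped-down version of the trick: a bigram model that just counts which word tends to follow which in a tiny made-up corpus, then generates text by always picking the most common continuation. (This is an illustration of the general next-word-frequency idea, not ChatGPT’s actual architecture; the corpus and function names here are invented for the example.)

```python
from collections import Counter, defaultdict

# Tiny made-up corpus standing in for the model's "vast library of writing samples".
corpus = (
    "baseball writers can use ai to enhance their analysis . "
    "baseball writers can use data to enhance their reporting . "
    "baseball fans can use stats to enhance their understanding ."
).split()

# Count which word follows which: that's everything this model "knows".
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    # Pick the most common continuation -- no thinking, just frequency.
    return follows[prev].most_common(1)[0][0]

# Generate text by repeatedly asking "what usually comes next?"
word = "baseball"
out = [word]
for _ in range(6):
    word = next_word(word)
    out.append(word)

print(" ".join(out))  # fluent-sounding, but nothing behind it
```

The output reads like a plausible sentence, which is exactly the point: fluency falls out of frequency counts alone, with no thought anywhere in the loop. A real large language model is enormously more sophisticated, but the gap between “sounds right” and “means something” starts here.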
Imagine you’re playing a chess game. Maybe you’re not very good at chess; that’s fine, because chess is difficult and you’ve probably spent your whole life doing something else, which has been a fine decision. But hypothetically, you sit down to play chess, get a few moves in, and start thinking. If I move my bishop here, then he’ll move his knight there, and my queen will be in trouble. So you don’t do that. You move your pawn instead, and feel smart until the other guy’s bishop swoops in and takes your rook. Whoops! Forgot about that.
ChatGPT, on the other hand, wouldn’t do any of that. And why should it? You lost your rook. Instead, it consults its mammoth library of chess games that have been played since they’ve been recording chess games. It finds positions that are similar to yours, looks at what eventually winning players played, and then comes up with a move that resembles those past moves. There is no intentionality there because there is no thing that can be intentional.
So when I say that nothing ChatGPT does means a thing, what I mean is simple: It is incapable of meaning. When I write something, I do it hoping it’ll lead the reader to an interesting thought or idea. When ChatGPT writes something, it does it because those are the words that generally come after some other words. It is very good at parsing questions, and very good at staying on-topic, but the core of it is empty. Those Midjourney pictures up above? Same kind of deal: that’s what pictures like that usually look like, so here’s a new one.
You could say this is just the modern world. Netflix releases mediocre show after mediocre show after mediocre show because people have been trained to just watch a new thing, no matter what. Marvel and Star Wars are releasing new TV shows or movies year-round at this point, because people will watch them because they’re content. ChatGPT is the logical endpoint of our search for endless content, because we have eliminated the time and resources involved in creating it.
But this kind of AI-driven content is hollow and useless, pure emptiness that serves no purpose and enriches no lives. It cannot speak to any kind of human experience because not only is it not trying to, but it is incapable of doing any part of that. It is a shortcut to a place not worth going.
People have been having it write essays and short stories and books for them, and all of these things have been easy to reject, because computers cannot write anything worth reading; apparently, an awful lot of people don’t intuitively know that.
You should understand that anything even remotely related to art involves being intentional and having a purpose. And if you don’t, well, that’s the Netflix problem. Here’s a new show, you watched it, that show is gone, here’s another new show. They’re basically the same, right? You didn’t actually care about that first show, right? Great, here’s a third show. Keep watching forever.
And yet, this is the thing that is being pushed down our throats near-constantly. This is the future, people will constantly say, but the future of what? And why? In this future, will nothing be worth our time? Will we have an infinite pool of utterly meaningless, replaceable content ready to go at a moment’s notice? Will anything anyone says or does actually matter?
I think at some point soonish this trend will stop because people will get tired of it. Sure, I said that about Marvel six or seven years ago, but since I got that one wrong that just means I’m due. But this one touches and hurts everything, because it makes you question whether any of it is worthwhile. It may be a mild poison, but it’s still poison, and if I have one life rule, it’s got to be: Don’t Drink Poison.
Also, if I were a real smartass, I’d have gotten ChatGPT to write this whole damn thing, but I didn’t. Because I have fucking standards.