As we know, generative AI draws its training data from a diverse array of sources, often without the knowledge of those sources themselves. But where do humans get their data? Sometimes from intentional sources, and sometimes from unintentional ones.
I just published a post entitled “Revolutionary Changes in the Generative AI Pricing Model Coming Tomorrow: How Will They Affect the Output?” While portions of the post were written by (today’s) free generative AI tools, the majority of the words came from my brain.
My “bleary-eyed” brain.
I was in Mexico City all week for a marketing sprint. Definitely a tech company sprint, with t-shirts and everything.

But what generative process selected the word “bleary” to use in that post?
I’m not sure of the specifics of why (geddit?) I chose that word, because I hadn’t heard the word “bleary” during the sprint. I was listening to a lot of music this past week, but none of the songs included the word “bleary.”
- Many of the lyrics I heard were in Spanish.
- One lyric repeated the phrase “take me to church”; it was not a religious song.
- Another had some lyrics I choose not to print in this blog or anything else I write. (The song had an aquatic theme.)
It turns out the word “bleary” is one that Mick Jagger sang in a Rolling Stones song from the 1970s.
The lyrics in question:
I had an arrangement to meet a girl, and I was kind of late
And I thought by the time I got there she’d be off
She’d be off with the nearest truck driver she could find
Much to my surprise, there she was sitting in the corner
A little bleary, worse for wear and tear
Was a girl with far away eyes
(From https://genius.com/The-rolling-stones-far-away-eyes-lyrics)
The takeaway from this post: how can we ask generative AI tools about the specifics of their content generation techniques if we don’t even know how humans generate content?