The mood at the time was that the world was changing and generative AI bots and non-person entities could replace people.
Yes, I am familiar with the party line that AI wouldn’t replace anyone, but would empower everyone to do their jobs more effectively.
The layoff trackers told a different story.
As did the AI gurus who proclaimed that many jobs would soon be obsolete.
Strangely enough, “AI guru” was not one of the jobs that was going away. Odd, since it seems to me that giving inspirational talks would be the perfect job for a non-person entity.
But many people agreed that entry-level jobs were ripe for rightsizing, meaning that those at the beginnings of their careers would have a much harder time finding work.
“Hardware giant IBM plans to triple entry-level hiring in the U.S. in 2026, according to reporting from Bloomberg. Nickle LaMoreaux, IBM’s chief human resource officer, announced the initiative…. ‘And yes, it’s for all these jobs that we’re being told AI can do,’ LaMoreaux said.”
Because IBM has separated what AI can do from what it can’t do. IBM’s new positions are “less focused on areas AI can actually automate — like coding — and more focused on people-forward areas like engaging with customers.”
Guess what? Bots are not engaging. Well, maybe they’re more engaging than AI gurus…
Can you use people?
But I will go one step further and claim that human product marketers and content writers are more engaging than bot product marketers and content writers.
Believe me, I’ve tested this. Bredebot can fake 30 years of experience, but it’s not genuine.
If you want to engage with your prospects, don’t assign the job to a bot. That’s human work.
Sometimes I write pieces that cover multiple topics, in this case both a technical analysis of digital asset taxonomies and classifications in a multi-faceted sense, and a musical analysis of the multi-faceted genres present in a single song. Whoops, two songs. (One track.)
Because the source picture used to generate the song is not exclusively biometric in nature.
Identity and non-identity technologies. Gemini.
If you look in the lower right corner of the picture you can see a reference to digital asset taxonomy, a reference to a Bredemarket client that specializes in Adobe Experience Manager implementations.
Which brings us to Spotify.
Building the perfect Spotify playlists
Every month without fail I build at least one Spotify playlist for my listening pleasure. Normally these are a mixture of different decades and genres, all thrown together.
Saturday (February 21) I thought I’d be more thematic and create multiple playlists sorted by genre. So far I’ve created four:
Dance, including the Andrea True Connection and Britney Spears.
Electronic, including Kraftwerk and Röyksopp.
Folk, including the Brothers Four and R.E.M.
Punk, including Public Image Ltd. and Hole.
I tried to stay away from traditional categories such as country and disco, and didn’t try to distinguish between punk and hardcore, or genre and NEO-genre.
And I recognize that artists can span multiple genres. Some Devo songs are in my electronic playlist, others are in my punk playlist, and when I get to “Disco Dancer” it won’t go in either.
Which brings us to Elton John.
Building the imperfect digital asset taxonomy
Elton John’s career has spanned multiple genres. The bespectacled piano player has covered simple love songs, energetic power trios (17-11-70), bombastic orchestral episodes, crocodile rock, an island girl, and everything else. And that’s just in his first decade, before he became Disney soundtrack guy.
However, in most cases Elton, Bernie Taupin, and his other collaborators would stick with a particular genre for an entire song. Because that’s what good product marketers do: stick to a single message. Bad product marketers like me tend toward multiple message overload… I seem to have strayed from my point.
But perhaps you noticed the music I incorporated here.
How do you taxonomize THIS digital asset?
1973 in music
Before I describe the problem, let me set the scene.
“Goodbye Yellow Brick Road” was the second of two albums that Elton John released in 1973. It was a sprawling double album.
Elton’s predecessor album, “Don’t Shoot Me I’m The Only Piano Player,” was a number one album with a number one single, the aforementioned “Crocodile Rock.” The album also included the popular song “Daniel” and a character piece (in the Randy Newman tradition), “Texan Love Song.” (Lyricist Bernie Taupin often courted controversy.)
But much was going on outside the pop star world that Elton John seemingly occupied. Progressive music was reaching its peak. While Elton’s first 1973 album rhapsodized on elderberry wine and backed away from the Paul Buckmaster arrangements, 1973 saw releases from Emerson Lake and Palmer, Genesis, Jethro Tull, King Crimson, and Yes. Oh, and an album by Pink Floyd entitled “Dark Side Of The Moon.” When I asked Google Gemini about 1973 progressive albums, it replied in part, “If you enjoy odd time signatures and 20-minute compositions, 1973 is your playground.”
However, music is governed by Newton’s Third Law of Motion, and some definitely anti-progressive works were just starting to appear. The New York Dolls released an album, and underground recordings were circulating of a band called The Modern Lovers.
A significant portion of American teenagers didn’t care about any of this. For them, the ONLY album of importance was Led Zeppelin’s “Houses of the Holy.”
Side one of four, track one, songs one and two
Which brings us to “Funeral For A Friend”/“Love Lies Bleeding.” Technically a two-song medley, but distributed (both physically and electronically) as a single asset.
Elton had tons of fans who were all too happy to rush out, slap $9.98 on the counter to buy “Goodbye Yellow Brick Road” upon its release, and plunk side one of the first record on their turntables.
Time to listen
I suspect that the “WTF” acronym was invented in November 1973.
WTF?
Because the listeners weren’t hearing “the only piano player.”
And they weren’t hearing a Paul Buckmaster-conducted orchestra.
You couldn’t hear a recognizable piano until the 1:40 mark of the song. Slowly you hear Dee, Nigel, and Davey, and the song gradually (but not completely) transitions away from the Hall of the Progressive Masterpiece in the Court of the Multi-Coloured Bespectacled Lunatic on the Top of the Charts. (And no, “lunatic” is not too strong here, since this song falls between Elton John’s known suicide attempts in 1968 and 1975.)
Then, after the band (augmented by Hentschel) brings “Funeral for a Friend” to an energetic conclusion, the piano player transitions to the second song at the 5:22 mark. And Elton, who has been silent all this time, finally sings.
And time to reflect
Let’s review, shall we?
Although the tone is dark with themes of breakup and demise, portions of this sound like a typical Elton John pop song.
But before that, it begins with sounds that made American teenagers wonder if they had picked up a Yes album by mistake.
And while few portions of the songs are minimal like Mr. Richman, or include towering solos like Mr. Page, parts would have fit well into a New York studio performance three years earlier. And Elton at his best could outdress the Dolls.
That was fun. Now comes the challenge.
How do you classify THIS?
I’ve already implicitly noted that music classification is a tricky affair.
Take “MacArthur Park,” a song recorded by everyone from Richard Harris to Waylon Jennings to Donna Summer. There are over 200 versions of the song spanning multiple genres. And composer Jimmy Webb is challenging to classify.
Now look at Elton’s song and my four playlists.
The track isn’t folk, dance, or punk.
But is it electronic? Portions are decidedly NOT.
Multi-faceted
You could cheat and place it in two (or more) classifications. Heather Hedden addresses faceted classification:
“The idea of faceted classification as a superior alternative to traditional hierarchical classification, whereby an item (such as book or article) can be classified in multiple different ways instead of in just a single classification class/category, is not new. The first such faceted classification was developed and published by mathematician/librarian S.R. Ranganathan in 1933, as an alternative to the Dewey Decimal System for classifying books, called Colon Classification (since the colon punctuation was originally used to separate the multiple facets).”
A taxonomy, however, is different—ideally:
“[F]aceted taxonomies should…ideally be mutually exclusive, in contrast to the principle of faceted classification…”
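The distinction Hedden draws can be sketched in a few lines of Python. This is only an illustration, not any real catalog API: the track name, facet names, and values below are my own assumptions.

```python
# Faceted classification: one asset can carry several values per facet.
# Track, facets, and values are illustrative assumptions only.
faceted_catalog = {
    "Funeral for a Friend/Love Lies Bleeding": {
        "genre": ["progressive", "electronic", "rock"],  # multiple genres allowed
        "decade": ["1970s"],
        "artist": ["Elton John"],
    },
}

def facet_search(catalog, facet, value):
    """Return every asset whose given facet includes the value."""
    return [name for name, facets in catalog.items()
            if value in facets.get(facet, [])]

# A stricter taxonomy: each asset gets exactly one class, keeping the
# categories mutually exclusive.
strict_taxonomy = {
    "Funeral for a Friend/Love Lies Bleeding": "electronic",
}
```

Under the faceted scheme the track turns up in searches for “progressive,” “electronic,” and “rock”; under the strict taxonomy it lives in exactly one bucket.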
My solution
Returning to my Spotify playlist problem:
I could simply place the song in multiple playlists: for example, an electronic playlist and some type of guitar/rock/whatever playlist.
Or I could create a single hyphenated playlist, such as an electronic-guitar playlist. (Many Depeche Mode songs, beginning with “Route 66,” would be ideal here.)
For now I followed neither option, but added “Funeral for a Friend”/“Love Lies Bleeding” into my existing electronic playlist (because it starts electronically) and nowhere else.
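The rule I ended up following, assigning a multi-genre track to exactly one playlist based on the genre it opens with, can be sketched like this. The section breakdown is my own illustrative assumption:

```python
def playlist_for(track_sections):
    """Taxonomy-style rule: assign a single playlist, keyed to the
    genre of the track's opening section."""
    return track_sections[0]["genre"]

# Illustrative section breakdown of the track (timings approximate).
sections = [
    {"start": "0:00", "genre": "electronic"},  # synthesizer opening
    {"start": "1:40", "genre": "rock"},        # piano and band enter
]
```

Because the rule keys only on the opening section, the track lands in the electronic playlist and nowhere else.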
There are many controversial uses of land, one of which is data centers. And most of us use them.
When I use SaaS resources or generative AI tools, I’m making use of a data center…somewhere. For example, when I created the image at the top of this post with Google Gemini…and when I uploaded this post to WordPress so you could read it.
But what if the data center was next door to ME? Would I feel differently about data center use?
Warren County, Virginia (Front Royal) is more rural than other counties in the state, such as Fairfax County. And someone is proposing a data center in Warren County.
This prompted a letter to the editor from Cara Aldridge Young, a former high school classmate of mine. (And a talented editor herself, if your company needs one.) Young examined the negatives surrounding data centers:
“Data centers are not quiet, invisible neighbors. They are warehouse-scale buildings surrounded by substations, transmission lines, backup generators, cooling systems, security fencing, and 24-hour lighting. They require enormous amounts of electricity and millions of gallons of water for cooling. In a county that has already experienced drought restrictions and ongoing grid concerns, it’s fair to question whether we are equipped to support that scale of development without long-term environmental and infrastructure consequences.”
I don’t have a feel for how Warren County will respond to this request; I haven’t visited Front Royal in decades. But Young presumably isn’t the only resident with concerns about power, water, and the environment.
But I’m sure there are counties that would welcome the economic development, the tax revenue, and the jobs. (Well, not that many jobs.)
On my side of the country, the big infrastructure concern is warehouses, such as the Amazon distribution center in Eastvale, California.
One of Amazon’s buildings in Eastvale, California.
“Since launching the Gemini app, we’ve built tools to encourage creative expression through images and video. Today, we’re taking the next step: custom music generation. Lyria 3, Google DeepMind’s latest generative music model, is rolling out today in beta in the Gemini app.”
I used this prompt to generate a Lyria tune:
“Create music for a wildebeest using a laptop to compare a latent fingerprint to a tenprint fingerprint”
Google’s response:
“I’ve created a unique atmospheric track for you that blends electronic data-processing sounds with the deep, resonant textures of a wildebeest’s world. You can play the music directly here in our chat!”
Bredemarket has adopted two tactics to cut through the slop and ensure my clients’ messages reach those who need to hear them.
Tactic 1: Before I write, I ask
To bound the message I am about to create for an identity/biometric client (or any client), I ask a number of questions. These questions ensure that the message addresses the right people, their concerns, and their fears. I’ve shared seven of my questions elsewhere.
Seven Questions Your Content Creator Should Ask You.
When all the questions are answered, I have a clear roadmap to start writing.
I don’t feed the answers to Bredebot and have it churn out something. I pick the words myself.
Rewrite this. Don’t write it.
Now perhaps I might use generative AI to tweak a phrase or two, but I remain in complete control of the entire creative process.
The result?
I believe, and my clients also believe, that this careful approach to content results in pieces that are differentiated from the mass-churned content of others.
So my clients stand out and aren’t confused with their competitors.
After all, even though Bredebot fakes thirty years of experience in identity and biometrics, it doesn’t really have such experience. I do. That’s why I’m the biometric product marketing expert.
So if you want me, not a bot, to polish your biometric product marketing sentences “until they shine,” let’s talk about how we can move forward.
“Abstract 3D render of a human silhouette made of shimmering frosted glass, iridescent light refracting through, symbolizing secure data encryption and zero-knowledge proofs, elegant and high-end.”
Personally I think it’s TOO abstract, but perhaps that’s just me.
Sadly, the question “why would a robot fish?” was asked in a private Facebook group, so I cannot share it in full. But I can share my response.
“Some humans don’t fish for food, but for relaxation. But if robots need downtime, it doesn’t have to be at a stream with a pole.”
After thinking, I composed the prompt for the Google Gemini picture that illustrates this post.
“Create a realistic picture of a robot by a stream in the woods, fishing. The eyes and other parts of the robot’s head indicate that its internal controls are in maintenance mode, or that the robot is ‘relaxing.’”
My own content creation process with Bredemarket includes a “sleep on it” step which lets my brain reset before taking a fresh look at the content.
The generative AI equivalent is to take the output from the initial prompt, start a new independent chat, and write a second prompt to re-evaluate the output of the first prompt.
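As a sketch, that two-chat workflow might look like the following in Python. `call_model` is a hypothetical stand-in for whatever chat API you use; it is not a real Gemini or OpenAI call.

```python
def draft_then_review(call_model, draft_prompt):
    """Run two independent chat sessions: one drafts, one reviews.

    `call_model` is a hypothetical stand-in for your chat API: it takes
    a list of {"role": ..., "content": ...} messages and returns reply text.
    """
    # Session 1: produce the draft.
    draft = call_model([{"role": "user", "content": draft_prompt}])

    # Session 2: a brand-new message list with no shared history, so
    # the critique is not anchored to the first conversation.
    review_prompt = (
        "Act as an independent editor. Critique the following text "
        "for accuracy, clarity, and tone:\n\n" + draft
    )
    review = call_model([{"role": "user", "content": review_prompt}])
    return draft, review
```

The key detail is that the second call starts from an empty message list: the reviewing “chat” carries no momentum from the drafting one, just as a rested brain carries none from yesterday’s writing session.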
Another topic raised by Nadaa Taiyab during today’s SoCal Tech Forum meeting was ambient clinical intelligence. See her comments on how AI benefits diametrically opposing healthcare entities here.
There are three ways that a health professional can create records during, and/or after, a patient visit.
Typing. The professional has their hands on the keyboard during the meeting, which doesn’t make a good impression on the patient.
Structured dictation. The professional can actually look at the patient, but the dictation is unnatural. As Bredebot characterizes it: “where you have to speak specific commands like ‘Period’ or ‘New Paragraph.’”
Ambient clinical intelligence. The professional simply talks with the patient while the technology listens and handles the documentation.
“Ambient clinical intelligence, or ACI, is advanced, AI-powered voice recognizing technology that quietly listens in on clinical encounters and aids the medical documentation process by automating medical transcription and note taking. This all-encompassing technology has the ability to totally transform the lives of clinicians, and thus healthcare on every level.”
Like any generative AI model, ambient clinical intelligence has to provide my four standard benefits: accuracy, ease of use, security, and speed.
Accuracy is critically important in any health application, since inaccurate coding could literally be a matter of life or death.
Ease of use is of course the whole point of ambient clinical intelligence, since it replaces harder-to-use methods.
Security and privacy are necessary when dealing with personal health information (PHI).
Speed is essential also. As Taiyab noted elsewhere in her talk, the work is increasing and the workforce not increasing as rapidly.
But if the medical professional and patient benefit from the accuracy, ease of use, security, and speed of ambient clinical intelligence, we all win.