AI Automation…and Disclosure

A client recently asked me to perform some research. I performed the first part of the research manually, then performed the second part automatically using Google Gemini, and I informed the client that I had used AI for that second part.

This particular use case is separate from using AI for CONTENT, something I’ve been discussing for years. However, since ANALYSIS is part of Bredemarket’s services, I felt it best to disclose when someone other than me performed the analysis.

This post describes the two parts of my research (manual and automated), what I disclosed to my client, and why I disclosed it.

Part One (Manual)

My client required assistance in identifying people with a particular skill set (which I cannot disclose). To fulfill this request, I went to LinkedIn, performed some searches, read some profiles, and selected people who might possess the skills my client required.

After spending some time collecting the research, I forwarded it to the client.


Part Two (Automated)

Several hours after sending the initial research, I thought of a different approach to my client’s need. Rather than identifying PEOPLE with this skill set, I wanted to identify COMPANIES with this skill set.

But this time, I didn’t manually perform the research. I simply created a Google Gemini prompt asking for the companies with this skill set, their website URLs, their email addresses, and their phone numbers.
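(For illustration only, here is a rough Python sketch of what that kind of lookup could look like if scripted against the Gemini API. I actually typed the prompt into Gemini directly, and the skill set, model name, and prompt wording below are placeholders rather than what I sent for the client.)

# Hypothetical sketch only: automating a "find companies with this skill set"
# lookup through the Gemini API. The skill set, model name, and prompt wording
# are placeholders; the real client request is confidential.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # placeholder model name

prompt = (
    "List companies that offer <REDACTED SKILL SET> services. "
    "For each company, provide its name, website URL, a contact email "
    "address, and a phone number."
)

response = model.generate_content(prompt)

# Keep the prompt and the model's full response together so both can be
# disclosed to the client, word for word.
print(prompt)
print(response.text)

Whether you script it or type it, the point is the same: preserve the prompt and the complete response so they can be passed along to the client unedited.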

I, or rather my AI assistant, performed all of this well within my self-imposed 5-minute time frame.

Google Gemini.

The Disclosure

Once this was done, I created an email straight from Google Gemini, and sent this information to my client…

…including the prompt I used, and ALL the language that Google Gemini provided in its response.

Why Disclose?

Now some would argue that I’m shooting myself in the foot by disclosing my use of generative AI to answer the second part of my client’s question.

They would claim that I should have just

  • performed the five minutes of research,
  • cleaned it up so it sounded like it came from me,
  • sent it to the client, and
  • charged an outstanding consulting fee.

Don’t do that.

Deloitte did that…and paid for it in the long run.

“Deloitte’s member firm in Australia will pay the government a partial refund for a $290,000 report that contained alleged AI-generated errors, including references to non-existent academic research papers and a fabricated quote from a federal court judgment.”

Now in this case the refund was due to hallucinations in the AI-generated document.

But what of the fact that at least one of Deloitte’s report writers was the Deloitte equivalent of Bredebot?

Personally, I think that disclosure is also required in this instance.
