
Writing insights with AI could cost you.

  • Writer: Ian Lavis
  • Feb 27
  • 3 min read

Updated: Mar 19

Fast forward a year. Almost everyone is using AI to produce insights. It’s quick, it’s easy – and it’s risky.


We know the power of AI to create content much faster than a human can, with ever more impressive results.


But don’t be fooled.


Over-relying on AI could cost you money and damage your reputation.


The only way to avoid this is to treat AI like a new employee who needs careful management and has a tendency to make things up.


Red flags

The damage to an individual’s or organisation’s reputation from over-relying on AI could be huge. We’re already seeing legal cases where AI has produced inaccurate content and fictitious citations. There is also a risk of breaching confidentiality by feeding sensitive data into a generative AI tool and having it reproduced in published insights.


The problem is that tools such as ChatGPT don’t have to cite sources. They pull information from all over the place and cobble it together with no reference to where it actually came from. Claims can be misleading or false. In some instances, citations are invented.


The only way to avoid potential legal action for falsifying information or defaming someone is to laboriously check everything against trusted sources, if that is even possible.


Better still, go direct to experts, interview them, and produce something original that you know is from a trusted source.


Over-reliance on AI is rife on LinkedIn. There’s no end of articles produced by AI that simply rehash stuff that’s already out there. Most is generic rubbish, often falsely attributed to a non-existent human. The end result is often bland, inaccurate and potentially illegal.


What you need to do

AI is an incredibly powerful tool for organisations, managers and content creators, but use it wisely. Like any new employee, AI shouldn’t be given too much control over your content and data without careful supervision.


Here are five ways to avoid an AI disaster when producing insights:


  1. Be selective. AI can be a great tool for idea generation and structuring articles. It can help you get started, giving you some prompts for what to write about and how to present your work. It can also help you edit articles once written. It can’t be relied upon to produce entire articles without heavy editing and careful fact-checking.


  2. Ask the right questions. AI is only as good as what you put in. Don’t be too vague. Zero in on what you really want to know and you’ll get better results.


  3. Be sceptical. Don’t believe everything AI creates. It makes mistakes and can make things up.


  4. Check everything. If you do use AI to produce insights, check citations and check all content against trusted sources. Check that the information is timely and not out of date. If you don’t know where to start, speak to people who do.


  5. Be original. Go rogue and write insights yourself. Do your own research, interview subject matter experts and include their comments in your insights. Don’t simply rehash what’s been said a million times, which is exactly what AI does. Instead, go directly to the expert and get a fresh take on it. This not only provides original content but it comes from a trusted source. And it sets you apart.


For original, human-written insights, and help with your content, contact me today.
