The Era of AI

Felicity Harber v The Commissioners for HMRC [2023] UKFTT 1007 (TC)

In this recent case, the First-tier Tribunal (Tax Chamber) gave a stark warning to litigants about the use of AI in litigation. Ms Harber, a litigant in person, had failed to notify HMRC of, and pay, the Capital Gains Tax (CGT) due following the sale of a property. This led to a penalty of £3,265.11, which she appealed, seeking to rely on having a reasonable excuse for her failure to pay the CGT.

In an attempt to win the appeal, Ms Harber put forward nine fictitious cases she had found using ChatGPT, a generative Artificial Intelligence (AI) tool. When she put the cases to the Tribunal at the hearing, the judges were unable to locate any of them in any available database. Ultimately, Ms Harber lost her case: the Tribunal concluded that she did not have a reasonable excuse for her failure to notify HMRC of her liability for CGT. Although the Tribunal accepted that she had relied on the nine decisions innocently, it nonetheless gave a strong caution against litigants using AI technology, highlighting the dangers it poses.

The dangers of AI

As we witness the inevitable expansion and adoption of AI, we are reminded that such a significant tool is not without its risks. Generative AI produces a range of content, including images, text, videos and other media, from the information and data it receives. AI can be a legitimate legal research tool, especially for litigants in person who may not have access to professional legal databases. However, as the above case shows, one of its problems lies in accuracy. AI can distort facts and "hallucinate", producing plausible but entirely fabricated material (such as the nine non-existent cases), and the data it is trained on may contain bias and false patterns.

Whether you are a litigant in person or a legal professional, a cautious approach must be taken when using AI for research in legal cases. Relying on false information will not only affect the outcome for the parties concerned, but will also damage credibility and reputation. We anticipate that judges will take a tough approach towards the incorrect or inappropriate use of AI technology in the future.

 

Other Areas

Legal research is not the only area in which AI poses a risk. Other areas to watch out for include:

  • Privacy and confidentiality. To generate content, AI relies on data and information. In turn, AI may generate content which includes private or confidential data.
  • Fraud. AI tools are, and will increasingly be, used in fraudulent activities, for example by using real and/or fabricated data to create realistic scams.
  • Intellectual property. Given the wealth of data and content that AI tools draw upon, we are highly likely to see an increase in intellectual property litigation, for example over who owns the rights to AI-generated content.

If you have any further questions in relation to legal research, or litigation more generally, please contact our Dispute Resolution solicitors.

 

 

Disclaimer: This information is for guidance purposes only and should not be regarded as a substitute for taking professional legal advice. Please refer to the full General Notices on our website.
Madeleine Harding
Trainee Solicitor
