
Unchecked AI use could expose brokers to growing risk

By Julian Barnes
04 February 2026

The growing use of artificial intelligence across broking is delivering clear efficiency gains, but broker coach Trent Carter has warned that adopting it without proper processes could expose brokers to serious cyber security, compliance, and liability risks.

Speaking on the Finance Specialist podcast, hosts Liam Garman and Trent Carter said AI tools are increasingly being used by brokers and businesses to improve productivity, scale operations, and streamline workflows. However, they cautioned that many users are failing to consider the second- and third-order consequences of deploying AI without appropriate controls.

“I think everybody has the intent to go out there and do the best job for their client,” Carter said.

“In using AI, I think a lot of people have that intent, that this is going to give me a better quality, faster service and help my client achieve what they want to achieve faster, which means I’ll be more successful.

“I think it’s with the best intent that people use tech, but it doesn’t stop them falling afoul.”

AI and copyright

One issue businesses face is employees relying on generative AI output that can be inaccurate, misleading, or entirely fabricated, leading to copyright breaches, false advertising, and reputational damage.

Garman noted cases where AI-generated websites and marketing materials had been published without adequate fact-checking, exposing business owners to legal and regulatory action.

Because AI models are typically trained on massive datasets that often contain copyrighted material, users run the risk of creating and publishing generated outputs that are substantially similar, or identical, to copyright-protected works.

AI copyright law is an evolving field, but in Australia there are currently no copyright exceptions for data mining or machine learning.

“I know that it seems that every farmer and his mum’s now using agentic AI to support them, and help them grow their efficiency in their business,” Garman said.

“But I’ve read so many stories of businesses that have been done because their employees have accidentally plagiarised something. They’ve used ChatGPT or Gemini or Claude to come up with content and they’ve copied and pasted it and they’ve put that online. They’ve been done with copyright infringements or with fake advertising.”

Protecting yourself

The pair also pointed to increased regulatory scrutiny around data handling, including spot audits targeting how businesses store, manage, and dispose of customer information.

Garman said: “We’ve seen a number of spot audits in the real estate industry from the Office of the Australian Information Commissioner, who are going out to real estate agencies and making sure that they’re looking after their customer data correctly, that they’re storing it correctly, that they’re disposing of it in a timely manner.

“We know with data laws now that if you are breached and data that you no longer needed to hold is released online, you could be in some pretty big trouble.

“And there are examples of some overseas jurisdictions where governments have actually requested chat logs of businesses and how they’re interacting with ChatGPT. It got me thinking, because it opens a risk for a lot of businesses out there and especially brokers. If you have employees who are looking to cut corners and putting sensitive client information into these outward-facing AI models, you could be breaking some serious laws here.”

“People need to be pretty careful on how they’re engaging with the tool,” Carter added.

“I don’t think there’s a perfect rule out there yet, and it’s a little bit of trial and error that’s going to come through these court cases being prosecuted, with policy falling out the back of them.”

Carter pointed to two precautions that brokers can take to protect themselves in the space.

“If you’re a business owner or a broker, you need to have some level of cyber security insurance; it’s almost a must-have,” Carter said.

“Number two is that you need to develop a policy around AI and the responsible use of it in your business, and probably don’t ask AI to develop it.”

Both Carter and Garman noted that if brokers take reasonable steps to protect themselves and their clients’ data, legislation will protect rather than punish them should something happen.

Garman said: “With the Cyber Security Act, I’m not a lawyer and this isn’t legal advice, but the act basically says that you have to be doing your best where practically reasonable. What does that mean? It means that if you get hacked but you’ve done the best job you can, you won’t necessarily be held liable for it, because sometimes unfortunate things can happen.

“However, if you’ve been hacked and a hacker has been able to come into the communication between you and a client, using your email or CRM, and you didn’t have basic things such as two-factor authentication or multiple levels of zero trust there, I would say you could get in a lot of trouble.

“So just a good note for our business leaders out there is that yes, things like AI, CRMs and cyber tools can offer some great unique advantages, but if you don’t use them well, they can come with a pretty hefty price tag.”

Find out more about how brokers can accelerate their processes and take advantage of technology, including artificial intelligence (AI), at the Better Business Summit 2026, run in partnership with National Australia Bank (NAB).

Taking place in every state across March and April 2026, the Better Business Summit will unpack how brokers can take their business to the next level by harnessing technology, formalising processes, and leading with a growth mindset.

Tickets for the Better Business Summit 2026 are available now, but hurry! The event will sell out!

[Related: Cynario and Salestrekker partner on AI innovation]
