Digital advice is a hot topic at the moment, bringing with it the debate around human financial advisers potentially being replaced by ‘robo-advisers’.
A client, James, called into my office last week to explain that his daughter was using an Artificial Intelligence (AI) investment platform and that he was considering doing the same.
Over my twenty-year career, I have cultivated my client relationships through a personal philosophy of integrity and trust, so while I was concerned, I appreciated James coming to me.
AI financial platforms extract key components from hundreds of data sources and repackage them into market insights and reports, enabling mum-and-dad investors to compare and rate investment opportunities.
Additionally, real-time portfolio monitoring identifies patterns, analyses investments and flags market movements outside a set tolerance.
My ability to distil information down to its core factors is limited, because I'm only human, but that human element makes all the difference, which is what I explained to James.
Responding honestly to James’ questions, I then asked how he felt about loading all his sensitive data into a website.
He replied that my database stored that same information.
“True,” I said, “but James, you know me and my team. You don’t know who manages that AI database. You can’t know where your data is stored or even who has access to it. Are you comfortable with that?”
It was a critical question; cyber-security is a huge issue and remains one of AI’s greatest challenges.
The Australian Securities & Investments Commission (ASIC) has developed, and continues to evolve, guidelines for digital advice providers. These build on the regulations governing human advisers and include additional, specific licensing and compliance requirements.
Nevertheless, James conceded he was uncomfortable, adding that AI’s lack of human contact was also disconcerting.
James explained that his life was unpredictable, yet he’d always had faith that I’d respond to his needs appropriately. His concerns were addressed, and his goals and personal challenges were incorporated into his individual planning strategy. It was what he valued most about our client-adviser relationship.
James also wondered whether AI platforms could be programmed to favour particular products, or be limited in the range of products they offer.
In fairness, AI can compare more products than any human can and, by its nature, removes potential unconscious bias for or against a product.
I’m not trying to do myself out of a job, but AI’s ability to collect, understand, and disseminate information is unrivalled.
Consequently, AI is probably ideal for investors who simply want to enter their details and receive a product report.
But to me, client interactions are two-way conversations. AI can't replace that. Nor does it have soft skills like empathy, imagination or creativity. It understands only its programming and what its algorithms process.
“It’s not flawed,” I said. “It’s simply not human!”
James laughed.
I then explained that many financial advisers utilise information provided by AI while nurturing that irreplaceable client-adviser relationship.
Additionally, regulations require advisers to upskill continually, which is why I personally believe that financial planners who don't embrace technology may end up being replaced by those who utilise AI, rather than by AI itself.
Happily, James remained a client of mine, even saying that it was my willingness to openly discuss AI that convinced him!
Digital advice is here to stay, but it has its place.
Do you have financial goals but don’t know where to start? Perhaps you don’t know what you don’t know and just need guidance.
We will work with you to help you achieve your goals and build financial security. It starts with a two-way conversation that humans do so well!