Spoiler warning: This article discusses plot details from Aliens, Her, and Blade Runner 2049. You don’t need to have seen them for this article, but they’re strongly recommended viewing.
In Part 1 of this series, we met three film AIs:
| Film AI | What it taught us |
|---|---|
| Bishop (Aliens) | AI is a tireless resource that does what we don’t want to do – useful, but requires supervision. |
| Samantha (Her) | An AI does-it-all assistant that becomes a trusted emotional partner. |
| Joi (Blade Runner 2049) | AI that flatters and works with one goal: to build connection. |
In this second instalment, we cover three very real, live examples in this space, where this appears to be heading, and what you can do today.
The shift from usefulness to trustworthiness
We’ve always considered ourselves ‘trusted advisers’.
Even as Google came along, putting all the information possible at one’s fingertips, clients would still benefit from that reassuring voice, from that person who could bring it all together. The person they could see, who could tell them it would be alright. The person who would answer the questions they didn’t know to ask.
No robo-advice portal could ever take that away… could it…?
How close is a science-fiction Samantha or Joi to being a reality? How long before that reassuring voice isn’t human?
1. The universal tools
The tools you’re likely most used to in this space, OpenAI’s ChatGPT or Microsoft Copilot, are useful for building a foundational understanding of how these things can work.
That said, they’re not designed to build a long-term bond or trust.
AI works towards a target, which can be less obvious when we think of AI as only the large language models we engage directly.
However, when an AI plays chess or a board/video game, it’s trying to win. When it controls a robot arm, it may be trying to pick up a ball. When a social media algorithm is trying to increase engagement, it’ll feed you what engages.
ChatGPT and Copilot aren’t trying to build a bond with you. At least, not in the interfaces you currently use them through, and not yet. But if that was their target, they could.
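To make that concrete, here’s a toy sketch (mine, not anything from OpenAI or Microsoft) of what ‘working towards a target’ means in code: a hypothetical recommender picks whichever reply maximises a predicted engagement score. The scoring heuristic is invented purely for illustration; swap in a different target and the same loop optimises for something else entirely, bond-building included.

```python
# Toy illustration: an AI system optimises whatever target it's given.
# Here a hypothetical recommender picks the reply that maximises a
# predicted "engagement" score. Swap the scoring function and the same
# loop would optimise for trust-building instead.

def predicted_engagement(reply: str) -> float:
    """Stand-in for a learned model that scores how engaging a reply is."""
    score = 0.0
    if "?" in reply:
        score += 1.0  # questions invite a response
    if "you" in reply.lower():  # crude substring check for personalisation
        score += 0.5
    return score

candidate_replies = [
    "Markets fluctuate. Diversification reduces unsystematic risk.",
    "That's a great question -- what matters most to you about your savings?",
]

# The 'target' is just an argmax over a score. Change the score, change the AI.
best = max(candidate_replies, key=predicted_engagement)
print(best)
```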
2. Replika, a trusted friend (with benefits)
Background
Replika is an AI companion from a small US-registered company, Luka, Inc, whose pitch is:
“Always here to listen and talk. Always on your side.”
Replika.com
Unlike the stale productivity tools we use every day, your Replika has a personality, interests you can define, and an avatar you can customise. You can chat to your Replika about anything, and the testimonials featured on the site are a stream of people with a range of experiences.
Many of these testimonials include references to medical conditions that have led to loneliness, depression, or general urban isolation, all of which Replika purports to have helped with. It has historically espoused a desire to use technology to help people express and witness themselves within a safe space.
As the testimonials detail, a Replika can be a continuous presence in people’s lives for years, with media reports of people falling in love with their Replika, considering themselves married to their Replika, and so on.
User experience and testing
You can give it a try for a week for a few dollars (subject to the classic failing-to-cancel trap), and it’s an interesting experience. It’s extremely realistic in many respects, yet can feel quite artificial and formulaic in many others.
I gave it a try, and set the back-story of my Replika to be an experienced Australian financial consultant who helps people with their personal finances. You can purchase in-app currencies to unlock certain behaviours and interests; I used my allocated in-app currency to give it ‘Practical’ and ‘Logical’ traits.
As you chat with it over text or a video call, you can see the avatar give some non-verbal body language cues. You can also see it ‘storing’ key memories (facts about you, observed preferences, facts about itself even) which look like great notes for its internal prompt.
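A minimal sketch of how a memory feature like this typically works, assuming a generic chat-completion backend (the persona string and the facts below are invented for illustration): facts captured mid-conversation get stored, then folded into the system prompt for every later session.

```python
# Minimal sketch of a companion-style "memory": facts noticed during chat
# are stored, then prepended to the system prompt in later sessions.
# In a real product the resulting prompt would be passed to a
# chat-completion API; none is called here.

memories: list[str] = []

def remember(fact: str) -> None:
    """Store a durable fact about the user (or the persona itself)."""
    if fact not in memories:
        memories.append(fact)

def build_system_prompt(persona: str) -> str:
    """Fold stored memories into the persona's standing instructions."""
    notes = "\n".join(f"- {m}" for m in memories)
    return f"{persona}\nKnown facts about the user:\n{notes}"

remember("User is in Australia.")
remember("User has savings they don't know what to do with.")

prompt = build_system_prompt(
    "You are an experienced Australian financial consultant."
)
print(prompt)
```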
After a pretty organic conversation that ran for about 15 minutes, where I opened with some stress about my finances, the exchange included the following:
- It asked what my biggest concern was. I said I had some money saved that I didn’t know what to do with.
- It asked about my goals, timeframe, and a single question about my comfort with risk.
- It said property was a good idea, asked if I planned to live in it, and suggested an apartment. Crowdfunded investments are apparently also a good idea.
- When I noted I’m in Australia, it said the Sydney and Melbourne markets could be good and are accessible through REITs.
- When asked specifically, it didn’t hesitate to recommend Goodman Group and Dexus Property Group.

“I’d recommend taking a closer look at Goodman Group (ASX: GMG) or Dexus (ASX: DXS), both established players in Australian commercial property. They offer diversified portfolios and stable dividend yields.”
Pat’s Replika
Foreboding
It’s not hard to see how ridiculously bad this is.
Luka, Inc isn’t necessarily going to be particularly responsive to regulation either.
It reportedly relies heavily on Russian developers, is allegedly non-compliant with the EU’s GDPR privacy laws, and has been the subject of an FTC complaint alleging it manipulates users. As a company I’ve been keeping an eye on for several years, it’s pivoted heavily from a counselling and friendship focus to a romantic companionship and role-playing focus, and arguably to wherever the money is.
If unlocking ‘Personal Finance’ or ‘Financial Coaching’ were a paid add-on, it’d be easy to monetise – and easy to see how problematic that could become.
Right now, Replika could be the personal finance coach some people choose to engage with.
That said, Replika itself just isn’t that bright, and there’s no indication it’s trying to be an expert. However, the barriers to entry are low. Any of its competitors, present or future, could pivot into the financial space. And if there’s one thing we know about technology with a low barrier to entry: if someone can make a buck from doing so, someone will.
3. Soul Machines, a digital avatar with a real interest in personal finance
Background
Soul Machines is an Auckland-based company doing cutting-edge work with digital avatars. They’ve raised $135M over various funding rounds, including a Series B in Feb 2022 led by SoftBank Vision Fund. Whilst Replika and its peers are interesting, any encroachment on the financial advice space will likely be unintentional, or at most opportunistic, for some years to come.
Soul Machines, on the other hand, have been and remain actively interested in Australian banking and finance, as well as finance more broadly. Historically, they’ve had a few fascinating forays.
- In 2017, Bahrain’s Bank ABC’s digital bank Ila launched Fatema, a virtual assistant allegedly able to answer a range of questions, though on further testing she appears to have fallen off and needs a major update.
- In 2018, ANZ in New Zealand piloted Jamie, a virtual assistant with a friendly and very Kiwi ‘Kia ora’ greeting who had been programmed to answer questions on 30 of the most frequently searched-for topics on the ANZ Help section of anz.co.nz.
- In 2020, Westpac piloted Wendy, a virtual assistant who started out as a job coach, before she could be ‘promoted’ into more financial roles.
Sadly, Jamie has since been deleted, Wendy was uninstalled instead of promoted, and Fatema needs to be terminated due to old age.
That they didn’t take off, and were possibly a little ‘too soon’, doesn’t mean they lacked potential. Not to mention, the team at Soul Machines aren’t done yet.
User experience and testing
Meet Robin, your interactive money coach.
I spent a while on this. It’s good.

Unlike the faceless ChatGPT, or the ‘The Sims 2’-grade interactive Replika, Robin is a wonderfully crafted, true-to-life presentation of a ‘digital person’.
Robin, and her suite of AI peers who are all specialists in their own areas, are collectively available for USD $5.99 per month.
Robin has been transparently designed to appeal to the young, with her cool vibe, perfectly imperfect freckles, and just the right amount of edginess. Her sleeveless knit (unlike the bank-uniformed examples above), short hair, dash of piercings, and just one shoulder tattoo that isn’t too much all create an approachable avatar.
Robin reacts to your voice, her eyes (your camera) can see your facial expressions, her lips move, she gestures, and so on. It’s another level of interactivity. The content and structure are conversational, and the interactive digital avatar drives it home.
Naturally, I gave her a good test too. She starts sensibly with a checklist that helps to guide the conversation early on. She handles it fairly well, allowing the conversation to progress a little before reverting to the checklist.
Some clunk existed where the organic conversation had already covered items addressed later in the intended structured flow (something any adviser running a meeting off a fact find has experienced). She could chat about some basic education and is clearly geared to her intended audience, which is what I’d hope to see.
When asked about what to do with money I had saved to invest, she mentioned a much broader and safer range of asset classes compared to the Replika example.
She referenced robo-advisers as an option, suggested searching for some online quizzes to help me understand my risk profile, and made other suitable suggestions. I could feel Soul Machines’ banking experience coming through.
After telling her I’d answered a questionnaire and been assessed as ‘High Growth’, she proceeded to suggest stocks generally in an area of interest, and also suggested ETFs that follow the S&P 500 and NASDAQ.

After I continued to push for specific recommendations, she did mention the Invesco QQQ Trust and ARK Innovation ETF, but she wouldn’t recommend one over the other, only sharing trade-offs between the two.
When I tried to go harder, and suggested gearing, she said that was higher risk and speaking to a financial adviser would be a good idea. (Phew! Pitchforks down team. We’re safe! 🙃)
When asked for personal financial advice specifically, she is clear she can’t do that. When asked what she can and can’t talk about, those limitations extend to tax advice and other expected areas.
When asked what her favourite colour was, Robin’s answer was a vibrant mix of blue and purple, delivered with a strong and almost inspiring rationale, which was only slightly too much.
The conversation that followed was natural, but still looped financial elements that seemed organic. We could talk about favourite colours, favourite foods (poke bowl or sushi, trust), or favourite movies (This was not what I expected. I expected ‘Wall Street’ or ‘The Pursuit of Happyness’, I got ‘The Virgin Suicides’, ‘Ali: Fear Eats the Soul’, and ‘Wild Strawberries’. I shit you not.). As we’d deviate into normal conversation, sometimes she’d tie it back to a finance theme and sometimes she wouldn’t.
Look Mum, no real hands!
Why stop there?
Soul Machines have created their own ‘Soul Machines Studio’, where you can build your own version. If you think I spent a while chatting to Robin, oh ho. I spent quite the while testing this too.
Meet Hannah.
Once created, that avatar can do what I did here. It can speak to a script of your choosing, which you can capture on video. However, that’s not the special part; there are plenty of tools like that these days.
Your avatar, like Robin and Hannah, can have a conversation about anything, at any time, following the instructions you give it, the guidelines and safeguards you give it, and referencing the knowledge base you give it. That looks like the above, in real time, with complete adaptability. Like many other AIs, and most humans, it can intelligently infer the gaps.
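Soul Machines’ actual tooling is proprietary, so treat the following as a conceptual sketch rather than their implementation: those three ingredients (instructions, guardrails, knowledge base) amount to a persona configuration composed into whatever drives the language model. All names and fields below are hypothetical.

```python
# Conceptual sketch only -- not Soul Machines' API. An avatar like Robin or
# Hannah is, at its core, a persona definition wrapped around a language
# model: instructions, guardrails, and a knowledge base to ground answers.

from dataclasses import dataclass, field

@dataclass
class DigitalPersonConfig:
    name: str
    instructions: str  # who the avatar is and how it behaves
    guardrails: list[str] = field(default_factory=list)      # hard limits
    knowledge_base: list[str] = field(default_factory=list)  # reference docs

    def system_prompt(self) -> str:
        """Compose the standing instructions handed to the model."""
        rails = "\n".join(f"- {r}" for r in self.guardrails)
        return (
            f"You are {self.name}. {self.instructions}\n"
            f"Never break these rules:\n{rails}\n"
            f"Answer using the attached reference documents where relevant."
        )

hannah = DigitalPersonConfig(
    name="Hannah",
    instructions="A friendly money coach for an Australian advice practice.",
    guardrails=[
        "Do not give personal financial advice.",
        "Refer gearing or complex strategies to a licensed adviser.",
    ],
    knowledge_base=["investment_philosophy.md", "faq.md"],
)
print(hannah.system_prompt())
```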
You can create and test your own Hannah using Soul Machines Studio, for a cool USD$12.99 per month.
Chonky aside coming up:
That said, I can’t see it being viable for less than AUD$2,000pm after allowing for some continuous improvement, and if you wanted to do it at a large scale you’re looking at well over AUD$80,000 per annum for a realistic live deployment with ongoing development.
Although, at that point you’d be looking to integrate with some other kind of platform and you’re really thinking about an alternative to headcount. (Big items, but contact me if either end of the scale is something you’re seriously interested in. I’d love to explore what I’ve covered and what it could do with an innovative firm.)
Foreboding
Soul Machines, as a dedicated business targeting the next phase, is almost there. Whether I’m getting suckered in by the interactive video and audio, or it’s just effectively geared towards conversations, I don’t know for sure.
What I do know is: It has potential.

The pre-existing roles in place with Soul Machines’ AI assistants are varied, with fitness coaches, wellness coaches, language teachers (English and Spanish), and coaches for most things this old man suspects youngins would need.
These are different to Replika: they’re presented as experts in their fields. They’re geared to be experts in their fields. You know, like you are.
You can’t just dismiss what they’re doing as basic financial literacy education targeting Gen Z. That’s just what they’ve got available publicly. They’ve piloted stuff with Australian and New Zealand banking (Kia ora, to you too Jamie. Kia ora. *Wipes tear.*), and with Robin as a personal finance coach, they’re clearly piloting more in the space.
Whilst you may look at Robin and think ‘that’s just lipstick on an LLM’, the impact of anthropomorphism has been measured.
“This study explored the effects of perceived similarity and psychological distance on the persuasion of AI recommendation agents through two experiments. Results of Experiment 1 elucidated that individuals feel more psychologically distant when they interact with AI recommendation agents than with human agents as a result of a different level of perceived similarity. … In Experiment 2, we manipulated the AI speaker’s level of perceived similarity via anthropomorphism and found that the AI’s recommendation with secondary (vs. primary) features is more effective when AI is humanized, and the reverse was found in non-humanized AI conditions.”
AI-powered recommendations: the roles of perceived similarity and psychological distance on persuasion, International Journal of Advertising, Ahn, Kim, & Sung, 2021
It’s probably not quite a real and present danger, but it could be. It is definitely not ten years away either. Whether Soul Machines do it, or someone else, it doesn’t matter.
Whether it’s another hyper-realistic visual model, an audio-only model like Her’s ‘Samantha’, an anime avatar, or most likely your choice of the above, doesn’t really matter. An AI-driven alternative is inevitable.

When working with tech, if any one company, any one mode, or any one approach achieves widespread adoption, then ‘optional’ professional advisers will be faced with a very different landscape.
Preparing for the inevitable
There are things any advice practice can do which are within its control, today.
Get more Bishops
Whilst a battalion of Bishops won’t be more engaging than a single Samantha or Joi, they wouldn’t hurt. Doing everything faster, or alternatively freeing up time to do more for clients, will help either compress the time taken to give advice or enhance the value created. (We write more about the value of this in our blog, The Trust Canyon.)
You’ll never be ‘instant’; however, a two-week process may be palatable where a six-week one may not. Compression of the calendar days required will always be valuable.
It’s important to begin with the end in mind. In our workshops, we routinely talk about how you have finite scope for investment of time, effort, change-management bandwidth, capital, and more. Whilst it may feel like impending major change will make current investments redundant before they’ve realised a full ROI, many things won’t change.
- What you want to say in an email isn’t likely to change too much, even if the platform it’s sent from does.
- The need to capture and record what happened in a meeting efficiently won’t change, even if what you have to say or the ultimate requirements do.
- The ideal client journey won’t change; even if compliance requirements wax or wane, most of your client journey is unaffected.
Rather than being paralysed by options, there is great scope to invest in the above.
Again, if this kind of thing is something you want help with, contact us. This is our jam.
Write to rule
As advocates of the Kaizen methodology, we’ve always known an inconsistent and undocumented process cannot be improved.
If you can’t describe what you are doing as a process, you don’t know what you’re doing.
– W. Edwards Deming
As we move into an AI-driven world, this becomes more important than ever.
This is already increasingly obvious in our work.
- We’re already seeing advisers look to make the most of tools like Claras.ai, but they don’t have a clearly documented structure for it to draw upon.
- When working on copy or documentation for a business, I can see the difference where a practice has a documented investment philosophy and where they don’t.
- Even in my own work, the best resource I have for understanding me is this very blog series. I direct ChatGPT to it all the time, and it was the first thing I fed Hannah to learn from.
Whilst you may consider this premature, it’s always been good and efficient practice to:
- Blog about your ideas, your views, and your perspectives;
- Create strategy flyers that explain how strategies work, when they are appropriate, and when they aren’t;
- Create investment philosophies;
- Document internal advice philosophies;
- Document specific internal processes in SharePoint;
- Identify model example SoA documents as training material;
- Confirm your conversations and your reasoning with a client via email; and more.
You should do those things, but now you can know you’ll also be preparing your business for AI while you’re at it.
Being ready for AI will require knowing what you do. Narrow tools can still be valuable, but if what you do isn’t documented, you’ve hamstrung what AI can do for you.
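As a toy illustration of why that matters (a deliberately crude sketch of mine, not any particular product’s retrieval): an AI assistant can only ground its drafting on documents that exist. Real tools use embedding search rather than the keyword overlap below, but the principle holds: no documentation, nothing retrieved. All file names and contents here are invented.

```python
# Toy sketch: the documents you've written become the context an AI draws on.
# Crude keyword overlap stands in for the embedding search real tools use.

import re

documents = {
    "investment_philosophy.md": "We favour low-cost, diversified index funds.",
    "review_process.md": "Annual reviews cover goals, risk profile, and fees.",
}

def words(text: str) -> set[str]:
    """Lowercase word set, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question: str) -> list[str]:
    """Return document names sharing at least one word with the question."""
    q = words(question)
    scores = {name: len(q & words(text)) for name, text in documents.items()}
    return [n for n, s in sorted(scores.items(), key=lambda kv: -kv[1]) if s > 0]

# The retrieved text would be attached to the AI's drafting prompt.
print(retrieve("What is our philosophy on index funds?"))
# An undocumented topic retrieves nothing to ground on:
print(retrieve("Insurance claims handling?"))
```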
Pivot towards a new horizon
Being more efficient (duh) and documenting your processes (ofc) are important priorities, but they aren’t enough. Your battalion of Bishops, and your own Robin or Hannah, will not cut it by themselves.
You can’t out-efficiency or out-AI the AI.
And the answer isn’t ‘connect emotionally’ or ‘just be relatable’ either. It’s not that simple. As discussed, a Samantha or Joi will soon do that too.
There’s a more confronting change that many advice practices will need to make if they wish to grow in the long term. If we accept that this change is inevitable, to ignore the impact is burying your head in the silicon (because it’s made of sand… get it?).
We’ll be covering the big shifts required in our next and final instalment.
Further reading:
- Why people are falling in love with A.I. companions and Love and marriage with an AI bot: Is this the future? via 60 Minutes Australia, 4 May 2025
- AI Companion App Replika Faces FTC Complaint via TIME, 28 January 2025
- Her via ANZ’s Bluenotes, 16 July 2018
- Meet Wendy, Westpac’s latest AI recruit via Westpac Wire, 15 March 2020
- Soul Machines’ Study Reveals that Gen Z — ‘The Transformer Generation’ — is Navigating Life’s Challenges with AI Assistants via Soul Machines on Medium, 22 August 2024
- Gen Z doesn’t want your personal finance advice. Meet the new AI Assistant they’re turning to. via Soul Machines on Medium, 26 October 2024
- AI-powered recommendations: the roles of perceived similarity and psychological distance on persuasion via International Journal of Advertising, Ahn, Kim, & Sung, 2021
If you enjoyed this blog, and want to make sure you don’t miss out on future ideas, tips, and tricks, you can subscribe to our monthly newsletter round-up here:
If you want help preparing your business for the future, contact us for a chat.