Why Aren’t Businesses Seeing the Benefits of AI? The Data Privacy Challenge in Australia
Dr Jamie Sherrah
Speaking at a business event last year, an audience member asked me: “I see all these amazing things AI can do, but why isn’t that translating over into our business? Why aren’t we seeing the benefits at work?”
This question troubled me for some time. There is no shortage of apps and integrations for applying AI to your data. I came to the conclusion that a lack of data privacy is slowing AI adoption in Australian businesses.
Most AI apps are hosted overseas, particularly in the USA. Some, like DeepSeek, are in China. Australians just don’t feel comfortable putting sensitive data into AI tools, or sending confidential client information and company secrets offshore. Due to data security concerns, some businesses have restricted or banned the use of AI, often out of necessity due to regulatory requirements.
The DeepSeek Controversy: A Warning for AI Users
DeepSeek’s recent release has reignited the data privacy debate. Shortly after its launch, the Australian federal government banned its use by employees, and the University of Adelaide, where I serve as an Adjunct member, followed suit. While concerns about political bias exist, the real issue is data security; anything entered into the DeepSeek app is sent to China. The data is stored, used for training, and potentially accessible to unknown parties.
If DeepSeek raises red flags, what about other AI tools like ChatGPT? Most AI platforms use your data for model training, unless you adjust your privacy settings carefully. Additionally, most are hosted in the U.S., meaning your data is stored offshore. For industries like healthcare, where data sensitivity is paramount, this is a significant issue.
In fact, many Australian organisations are obliged to comply with the Privacy Act 1988, which requires that personal data disclosed to an overseas third party (such as an AI tool) receives the same legal privacy protections it would in Australia. Meeting that obligation is the responsibility of the Australian business. Clearly, Australian businesses have no say over how the laws operate in the USA, let alone China.
In fact, regarding the use of foreign AI tools, the Office of the Australian Information Commissioner (OAIC) has declared:
> Given the significant and complex privacy risks involved, as a matter of best practice, it is recommended that organisations do not enter personal information, and particularly sensitive information, into AI chatbots.
The Reality of Using Overseas AI Tools
When you enter sensitive data into ChatGPT or other overseas-hosted AI platforms, there are three key risks:
- Data storage & retention — Your data is stored offshore, potentially outside Australian legal protections.
- Model training & exposure — Unless you adjust privacy settings, your data may be used to train AI models, risking future disclosure.
- Third-party access — In some jurisdictions, data laws may allow foreign governments or entities to access stored information.
A Secure Path Forward: Keeping AI Data in Australia
How can we use AI safely in Australia? As far as data security goes, a great start is to ensure your data stays in Australia, where it can be governed by Australian privacy laws.
To achieve this, the AI models (LLMs) and data storage need to be hosted on cloud systems in the Australian region, or, for stricter security requirements, on premises. There are cloud application programming interfaces (APIs) and tools that developers can use to build AI applications for your business while keeping data within Australia.
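As a deliberately simplified sketch of what a data-residency guard might look like in practice, the snippet below checks a cloud region code before any prompt is allowed to leave the application. The function names are hypothetical, though the region codes are real (AWS Sydney is `ap-southeast-2`, AWS Melbourne is `ap-southeast-4`, and Azure's Australia East is `australiaeast`).

```python
# Illustrative sketch only: a minimal data-residency check an application
# could run before forwarding a prompt to an AI endpoint. Function names
# are hypothetical; the region codes are real Australian cloud regions.

AUSTRALIAN_REGIONS = {"ap-southeast-2", "ap-southeast-4", "australiaeast"}

def is_australian_hosted(region: str) -> bool:
    """Return True if the given cloud region keeps data in Australia."""
    return region.lower() in AUSTRALIAN_REGIONS

def guarded_prompt(region: str, prompt: str) -> str:
    """Refuse to forward a prompt unless the endpoint is Australia-hosted."""
    if not is_australian_hosted(region):
        raise ValueError(f"Blocked: region '{region}' is outside Australia")
    return prompt  # a real application would call the LLM API here

print(is_australian_hosted("ap-southeast-2"))  # Sydney -> True
print(is_australian_hosted("us-east-1"))       # USA -> False
```

A check like this is no substitute for a proper privacy assessment, but it illustrates the principle: make data residency an explicit, enforced property of the system rather than an assumption.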
If your organisation uses the Microsoft suite, Copilot may be available to you in your secure cloud environment within the Australian region. But are there off-the-shelf tools you can use that are hosted in Australia?
As AI is adopted into various areas of business, integration with platforms and workflows will be essential. And it will be imperative that these platforms are based in Australia and utilise Australian data.
At Inject AI we recognised the need for a sovereign AI platform, hosted in Australia, where Aussie businesses can safely use AI on their data. We recently launched a tool called “Hippo”, which is akin to Perplexity or ChatGPT but hosted in Australia. Tools hosted in Australia and built on Australian data are critical for Australian consumers and businesses: they keep data onshore, where it is governed by Australian privacy laws.
Take Control of Your AI Strategy
It’s time for businesses to get serious about AI — not just about adopting it, but about governing it responsibly. Here are three steps you can take today:
- Review Your Privacy Policy — Ensure your policies clearly outline how AI tools will be used and the safeguards in place to protect customer data.
- Implement an AI Usage Policy — Define acceptable AI use within your organisation and educate employees about risks.
- Assess Data Risks in AI Tools — Consider where your AI tools store data and whether an Australia-based solution is a better fit for sensitive information.
AI is a powerful force shaping the future of business, but data privacy must not be an afterthought. Making the shift to Australia-based AI solutions isn’t just about security; it’s about ensuring your business is future-proofed, compliant, and ready to thrive.
Dr Jamie Sherrah is an Adjunct AIML member. He is also the Managing Director of Inject AI, creator of Hippo.