ChatGPT for care organisations: Understanding the limitations


Chances are you’ve heard of ChatGPT by now. Launched in 2022 to great acclaim (and some scepticism), ChatGPT is a free-to-use AI system that answers questions with a chat-like response almost instantly. 

You can ask ChatGPT just about anything – it can draft emails, solve maths problems, come up with ideas for your kid’s first birthday party, and even suggest what you should have for dinner. 

ChatGPT can be a valuable tool for those working in care organisations too. Right now, you can use AI to automate routine tasks like filling out forms, updating records and processing paperwork. It can also help you streamline your team’s working schedules, reducing overtime costs and improving workforce management. 

And that’s just the start! AI has the potential to reinvent the way health and social care professionals learn and develop too.   

But, as with any new disruptive technology, ChatGPT comes with limitations. In this article, Assia Cheurfi, a Software Engineer at FuturU, covers some of the things you should be aware of when using ChatGPT.

ChatGPT limitations


Hallucinations

A hallucination is AI-generated information that's factually incorrect. ChatGPT is designed to respond to almost any question, even when it doesn't know the right answer.

So it’s worth being mindful that the response generated might not be entirely accurate, and shouldn’t be taken as a source of truth. Always check a ChatGPT response before considering it fact, and cross-reference against other sources where needed.


Knowledge cutoff

If you asked ChatGPT about any of the events going on in the world right now, or even what the weather's like today, it probably wouldn't give you the answer you're looking for; it would hallucinate a response instead. That's because it doesn't have access to current events.

Think of ChatGPT as a massive brain that learns from the information it takes in. (In fact, the neural networks behind it were loosely inspired by the human brain.) This information, or 'training data', is mostly taken from the web and can be anything from research papers and news articles to tweets.

Because the process of training this brain is time-intensive and requires a lot of computing power and storage, its knowledge is 'frozen' at a certain point in time. Here's an example:

A screenshot showing how ChatGPT doesn't always display up to date information.


Data privacy

Another thing to consider is that OpenAI, the company behind ChatGPT, can access the information you enter into the chat and may use it to train its systems. So you want to avoid sharing sensitive information like the medical history of the people you support, the names and addresses of colleagues, and any other confidential data.

You do have the option to opt out of OpenAI using your data. Simply head to Settings, then Data controls and toggle the Chat history & training option off.

While it’s worth keeping these limitations in mind, ChatGPT is a powerful (and free) tool we’d definitely recommend using. It’s easy to integrate into your work routine, and can automate or accelerate many tasks. Go ahead and try it out yourself!
