ChatGPT for care organisations: Understanding the limitations


Chances are you’ve heard of ChatGPT by now. Launched in 2022 to great acclaim (and some scepticism), ChatGPT is a free-to-use AI system that answers questions with a chat-like response almost instantly. 

You can ask ChatGPT just about anything – it can draft emails, solve maths problems, come up with ideas for your kid’s first birthday party, and even suggest what you should have for dinner. 

ChatGPT can be a valuable tool for those working in care organisations too. Right now, you can use AI to automate routine tasks like filling out forms, updating records and processing paperwork. It can also help you streamline your team’s working schedules, reducing overtime costs and improving workforce management. 

And that’s just the start! AI has the potential to reinvent the way health and social care professionals learn and develop too.   

But, as with any new disruptive technology, ChatGPT comes with limitations. In this article, Assia Cheurfi, a Software Engineer at FuturU, covers some of the things you should be aware of when using ChatGPT.

ChatGPT limitations


Hallucinations

A hallucination is generated information that's factually incorrect. ChatGPT was designed to always provide a response to a question, even when it doesn't know the right answer.

So it’s worth being mindful that the response generated might not be entirely accurate, and shouldn’t be taken as a source of truth. Always check a ChatGPT response before considering it fact, and cross-reference against other sources where needed.


Limited knowledge of current events

If you asked ChatGPT about any of the events going on in the world right now, or even asked what the weather's like today, it probably wouldn't give you the answer you're looking for – it would likely hallucinate a response instead. That's because it doesn't have access to current events.

Think of ChatGPT as a massive brain that learns from the information it takes in – in fact, its underlying architecture was loosely inspired by the human brain. This information, or 'training data', is mostly taken from the web and can be anything from research papers and news articles, to tweets.

Because the process of training this brain is time-intensive and requires a lot of computing power, the data is 'frozen' up to a certain point in time. Here's an example:

A screenshot showing how ChatGPT doesn't always display up to date information.


Data privacy

Another thing to consider is that OpenAI, the company behind ChatGPT, can access the information you enter into the chat and can use it to train their system. So you'll want to avoid giving it sensitive information, like the medical history of the people you support, the names and addresses of colleagues, or any other personal data.

You do have the option to opt out of OpenAI using your data. Simply head to Settings, then Data controls and toggle the Chat history & training option off.

While it’s worth keeping these limitations in mind, ChatGPT is a powerful (and free) tool we’d definitely recommend using. It’s easy to integrate into your work routine, and can automate or accelerate many tasks. Go ahead and try it out yourself!
