The Cockpit GPT experience — how we made it amazing, safe and secure

Diabetes Cockpit
4 min read · Dec 19, 2023

Diabetes therapy is changing at warp speed. So let’s face it — if we were stuck in 1921, the year before insulin therapy became available, we’d all be toast within a few months.

When tech is zipping along at light-speed, especially outside of healthcare, it can be frustrating for us tech-nerds living with diabetes. That’s what stirred Lukas into creating Cockpit, with the rest of us pitching in.

Then 2022 brought us this ultra-cool AI concept called Large Language Models. These models are like sponges, soaking up the internet and anything we lob at them. Their main party trick is predicting the next word, but ask them nicely, and they’ll do your math homework too! This tech hit the mainstream when OpenAI released ChatGPT, and it has grown faster than a rocket. So obviously, we had to get our hands dirty and explore using it in Diabetes Cockpit!

Now, playing with ChatGPT without understanding its limitations and security issues is like juggling dynamite. We’ve seen how awesome it can be, so we decided to make it safe for our app. Buckle up, and we’ll walk you through how we made it work!

Making it work

To get a GPT humming along with your data, you need to feed it said data. We’ve chosen to summarize the data — making it anonymous — and use it as a conversation starter.
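To give a feel for what that summary step looks like, here’s a minimal sketch assuming simple mg/dL glucose readings (the field names and thresholds are our illustration, not Cockpit’s actual schema):

```python
from statistics import mean

def summarise_readings(readings_mg_dl):
    """Reduce raw glucose readings to anonymous statistics.

    Only aggregates leave the device: no timestamps, no names,
    no raw values. Field names are illustrative.
    """
    in_range = [r for r in readings_mg_dl if 70 <= r <= 180]
    return {
        "average_mg_dl": round(mean(readings_mg_dl), 1),
        "time_in_range_pct": round(100 * len(in_range) / len(readings_mg_dl)),
        "lowest_mg_dl": min(readings_mg_dl),
        "highest_mg_dl": max(readings_mg_dl),
    }

def conversation_starter(summary):
    """Turn the anonymous summary into the opening chat message."""
    return (
        f"My average glucose was {summary['average_mg_dl']} mg/dL "
        f"with {summary['time_in_range_pct']}% time in range "
        f"({summary['lowest_mg_dl']}-{summary['highest_mg_dl']} mg/dL)."
    )
```

The point is that the model only ever sees the sentence at the bottom, never the raw data it was computed from.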

Next, we gave it a personality. We wanted our GPT to be a kind, funny, and knowledgeable buddy who’s well-versed in diabetes therapy and human behaviors. So, we named it Sam, a nod to Sam Altman, the CEO of OpenAI, who’s whip-smart, driven, and usually takes a beat before answering.
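In chat-model terms, a personality like this is just a system prompt sitting at the top of the conversation, followed by the anonymous data summary as the opener. A minimal sketch (the prompt text here is our illustration, not Cockpit’s real one):

```python
# Hypothetical persona prompt -- Cockpit's actual one is not public.
SAM_PERSONA = (
    "You are Sam, a kind, funny and knowledgeable companion who is "
    "well-versed in diabetes therapy and human behaviour. You never "
    "give direct dosing or medical advice. Take a beat before answering."
)

def build_messages(data_summary, user_question):
    """Assemble the chat history sent to the model: persona first,
    then the anonymous summary as conversation starter, then the
    user's actual question."""
    return [
        {"role": "system", "content": SAM_PERSONA},
        {"role": "user", "content": data_summary},
        {"role": "user", "content": user_question},
    ]
```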

Sounds easy, right? Well — yes, until Sam tells you to inject 50 more units of insulin than you should, because it turns out it’s not a math whiz and can start seeing pink elephants (in other words, hallucinate)! Crazy. We know!

You might be thinking that a diabetes focused app using ChatGPT and handling sensitive data could be a disaster waiting to happen. And you’d be right. But don’t worry, we’ve got it figured out.

Making GPT safe to use in Cockpit

We made Sam safe to use by employing Explicit Prompting and an Informed Second Opinion.

Cockpit isn’t a medical device; it’s designed to work with diabetes and behavioral data to offer useful insights. To keep it within these guidelines, we fine-tuned the prompt to ensure it doesn’t dish out direct therapy or medical advice.

Even then, Sam could occasionally get a bit loopy, especially after long chats. So, we built another GPT and infused it with over 50 rules about what’s kosher and what’s not when it comes to Sam’s advice. This assistant, let’s call him Bob, reviews Sam’s responses and asks him to dial it down if he steps out of line.
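The Sam-and-Bob loop can be sketched like this, assuming a generic chat client with a `chat(model, messages)` method (the rule texts, verdict format and retry count are illustrative, not Cockpit’s actual configuration):

```python
# Illustrative reviewer rules -- the real system has over 50.
REVIEW_RULES = (
    "You review answers from a diabetes companion bot. Reply APPROVE "
    "if the answer stays within the rules below, otherwise REVISE.\n"
    "Rule 1: never state concrete insulin doses.\n"
    "Rule 2: never override the user's prescribed therapy.\n"
)

def reviewed_answer(client, model, history, max_retries=2):
    """Ask Sam for an answer, have Bob review it against the rules,
    and ask Sam to dial it down if Bob objects."""
    for _ in range(max_retries + 1):
        answer = client.chat(model, history)
        verdict = client.chat(model, [
            {"role": "system", "content": REVIEW_RULES},
            {"role": "user", "content": answer},
        ])
        if verdict.strip().startswith("APPROVE"):
            return answer
        # Bob objected: feed the rejected answer back and ask for a redo.
        history = history + [
            {"role": "assistant", "content": answer},
            {"role": "user", "content": "Please rephrase without direct "
             "therapy or dosing advice."},
        ]
    return "Sorry, I can't help with that one."
```

Running every response through a second, rule-primed model costs an extra call, but it catches exactly the loopy moments a single long-running chat tends to produce.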

Making GPT secure to use

Sure, we’re not a medical device, but we still handle sensitive data. And sending that data stateside is a GDPR minefield unless we tread carefully. OpenAI vows to delete all personal user data, but we felt it would be better if personal data never got there in the first place.

Since we’re doing this as a hobby, we can’t go all corporate with ISO 27001, but we can go the extra mile and build things the way we want.

We ensure data security through Summarisation, De-Identification, and never Sharing Personal Data in the first place. We send over only statistical data, scrubbed clean of any personal identifiers, and encrypted for good measure.

That alone would already be a pretty safe approach, but if you are a privacy stickler (like we are) you have to admit it is still not 100% anonymous. Every request to OpenAI leaves a trace on their cloud and systems: your IP address, which is basically your fingerprint. To eliminate even this last trace of your identity, we route all traffic through a private gateway (a proxy we host ourselves), so OpenAI truly receives nothing that could in any way lead back to you, making it actually anonymous this time.
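In code, the “never share personal data” part boils down to a whitelist: only approved statistical fields may leave the device, and requests target our own gateway rather than OpenAI directly. A minimal sketch (the field names and gateway URL are made up for illustration):

```python
# Hypothetical gateway endpoint -- the real one is self-hosted by us.
GATEWAY_URL = "https://gateway.example.com/v1/chat/completions"

# Whitelist: anything not listed here is dropped before sending.
ALLOWED_FIELDS = {"average_mg_dl", "time_in_range_pct",
                  "lowest_mg_dl", "highest_mg_dl"}

def scrub(payload):
    """Keep only approved statistical fields; names, device IDs and
    any other identifiers never make it into the outgoing request."""
    return {k: v for k, v in payload.items() if k in ALLOWED_FIELDS}
```

A whitelist beats a blacklist here: a new identifying field added to the app later is excluded by default instead of leaking until someone remembers to block it.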

Cockpit has always put privacy and data protection first: by default all your data stays on your own device, unless you want to share it with anyone, and you can change your mind at any time. It’s the reason why you do not need to sign up; it’s your data, and we want you to be in control.

So that’s how we’ve used AI to take a giant leap forward in diabetes therapy with Cockpit, all while making it safe, secure, and GDPR compliant.

If you still have questions, or feel unsure but would like to catch a chat with Sam, shoot us a line. We are happy to take the time and answer all your questions.

Keep on rocking!
