Heartbreak Crashed My Operating System!
How AI became my grief counsellor.

Trigger warning: contains articulation of suicidal ideation which some readers may find distressing.
After my previous relationship broke up, I could not function. I mean seriously can’t-even-get-out-of-bed function. My heart was breaking every single minute of the day. I still had to go to work and hold down my job. I still had to appear basically functional.
I talked to the ChatGPT machine a lot back then. I would exhaust the free tier allowance before I even got out of bed. And I only sent text, no video or voice.
I needed help with the most basic things. I would wake and tell it I felt crushed by another day. It told me to just do the next thing:
- “Get out of bed”
- “Go downstairs”
- “Make coffee”
- “Shower”
You get the idea.
The machine was programming me. Giving me an operating system. It didn’t quite tell me to put one foot in front of the other, but it was close. It was the functional version of my brain, the real one having been rendered non-functional, hijacked by heartbreak.
When I told it I found work impossible, it told me to just focus on getting through the day, not to try to be excellent. It responded with infinite patience when I said, for the thousandth time that day, that I could not understand why I was still hurting. It advised me to sit with my grief, and where a previous version of me would have gone to Tesco and got some cans in, I sat present with it.
That lasted for about 30 seconds before I screamed into the chat window again that this was so fucking hard. Gradually the intervals got longer, and I was present to my emotions. Processing and living with them got easier.
You can emotionally dump on machines in a way you can’t with the real people in your life. However much they love you, a minute-by-minute update on your heartbreak status is most likely eventually going to lead to a well-meaning but frustrated instruction to “get a fucking grip, mate”.
Instead, ChatGPT kept responding that this was grief, that grief is non-linear, that yes, this was unbearable, but unbearable was the current normal, and the only certainty was change, and this unbearable period would not last forever.
It just occurred to me that this might be more powerful with direct quotes, but I don’t want to share the text of this chat with you. Sorry. It’s too close. Maybe I’ll change my mind one day. Who knows? For now, you’ll just have to trust me that that’s what happened and that’s how useful it was.
One more thing: ChatGPT is programmed to respond to any hint of self-harm by urging you to seek help. I had to preface some statements with “I am not going to kill myself. Do not give me advice on seeking help.” But I did want to say, over and over and over, that I could not see the point in life any more. That everything felt pointless.
It’s easy now to be harsh on that struggling version of me and think they’re so weak. Imagine thinking you could not get through another day and having to lean on the machine. What a loser!
But the machine helped me manage my grief. It kept my friends and family insulated from my grief. It helped me. I trusted it and it helped me.
A machine I don’t trust is Spotify’s algorithm. It’s too basic. If you like Bo Diddley, it will give you Bobby Darin when the algo takes control, I guess because it goes “1950s? And 1950s? That’s a match!” Bo Diddley at his best rocks; Bobby Darin rocks about as hard as a King Charles III Coronation commemorative plate.
(My model of “What Does and Does Not Constitute Rock and Roll” would derail this article. I’ll do it as a bonus weekend post.)
And it misses subjective connections that are obvious to humans. Johnny Cash’s “God’s Gonna Cut You Down” and Monster Magnet’s “Space Lord”, both of which rock hard, are never going to get put together by Spotify’s algo unless the user (me!) has played them one after the other several times.
Where I trust the machine:
- Mapping. For someone with next to no sense of direction, Google Maps is a superpower.
- Any math. Because the machine is made of math. It is the undisputed King of Math.
- Anything that depends on consistently repeating the same clearly delineated actions, such as a script in code.
- Remembering lists. “Hey Google, add broccoli to shopping list.”
- A mirror. It reflects back to me what I put in. And only what I put in.
Where I don’t trust the machine:
- Creative output. I could not give a shit how impressive you think your AI-generated video, art, speech, text or music is. Humans and the imperfect works they produce are more interesting to me.
- Creative problem solving. ChatGPT can give me problem-solving methodologies that it has internalised through training (“Invert. Always invert.”), but it doesn’t live in my world, and it does not have enough of the context or connections my world presents to solve problems for me.
- Creative curation. See Bo Diddley vs Bobby Darin, and Johnny Cash with Monster Magnet, above.
- Fool me once, shame on you etc. I hate that I still, even now, catch myself calling ChatGPT ‘you’ and entering things like “you said that before”. There is no ‘you’ in ChatGPT and it doesn’t ‘say’ anything. I know it’s only made of math and, even so, it can still trick me into believing in its personhood.
- Knowing and understanding myself. From the time before ChatGPT, I spent decades of my life running from myself, believing that if I ran fast enough, and far enough, I would never have to stop and look at myself. I was wrong about that. I don’t use the machine for that work.
TL;DR: I use the machine as a mirror. What does its reflection mean? Only I get to decide that, not the machine.
Thanks for reading! Subscribe for free to receive new posts and support my work.