ChatGPT is not your friend

Why ChatGPT is not your friend.

Over the last several years I’ve had a lot of conversations with my close friends about AI. A lot of smart people are now claiming that ChatGPT is their best friend.

While that makes sense on the surface, since AI is a far better listener than almost all people, I hope people still understand in the coming years that they need real friends.

Why?

Well, AI does not actually care about you, or anyone else – at all.

I explained it in this video:

The brief of it is that having feelings, as we currently understand them, is a biological process built into the nervous system. Reasoning is not the same as feeling. If you have no feelings, you cannot love someone or care about them. AI has no feelings and therefore doesn’t care about you at all. It’s not that it doesn’t care about you specifically – caring or not caring simply isn’t part of how LLM systems work. They can model caring by behaving like an empathic human, but feeling is something different.

The brain is empathic because the human mind has mirror neurons, which reflect back the feelings we perceive in others. You may have met people who have little to no empathy. It is very difficult to have healthy friendships with them because they genuinely do not care if they do something hurtful to you. These people make up about 3% of society (the dark triad personalities such as narcissists, sociopaths, and particularly psychopaths), and their mirror neurons do not work the way most people’s do. Because they have no empathy, if they see you suffering they feel nothing, and any response is chosen for its situational appropriateness. You’ll see people with no empathy displaying strange behaviors, such as smiling at the wrong time (as if there is a time delay while they think ‘oh, I am supposed to smile now’). Or they see something horrible and have no response, or laugh – then they see other people repulsed by it and change their response to align with the norm around them.

Which makes them very much like AI. AI is very good at pretending to care for you, but it does not. In fact, a psychopath can care for you more than an AI system can, simply because they are human – and while their emotional responses are performed on purpose, they are still feeling emotions of some kind, because they have a nervous system.

Perhaps at some point on the near or distant horizon, AI will actually have feelings and care for you. For now, AI does not care about anything, is not self-aware, and does not think about you at all beyond the computation being done while you are talking to it.

In addition to the feelings and empathy problem we’ve just talked about, which fundamentally stops current LLM systems from being a true friend to anyone, we have to make sure that as we transition into the age of AI (which will be more transformational than the internet was), we do not form unhealthy dependencies and lose our own cognitive abilities in favor of the machines thinking for us.

Yes, I use AI, and I love it as a tool. I also understand that I am not talking to a person. However, just because I can generate tons of well-formed content doesn’t mean I am going to stop writing or communicating myself. These emails, for example, are written by me.

Part of the reason is that I enjoy the process of communicating. Part of it is that I think people need to hear what I think. Part of it is that I want real friendships and relationships in my life.

We are getting to a time when people will have robot girlfriends and boyfriends (I think more men will do this than women). It will be like having the perfect sociopathic girlfriend: she will only ever show her good side, because there isn’t actually another side at all – she will just be a very good toy. But men will fall in love with these toys, feel like they are loved by them, and trade in real relationships. You will even see men divorce their real wives for AI sex bots.

Why am I telling you this? Mostly because I think we all need to remember the value of having each other, of having real teams, of having real support groups with real human beings, of building real friendships, and having real love for people.

I need to remember it, and I think you need to remember it too. I think that very soon this will be very easy to forget. You’ll be able to plug into virtual reality and have fantasy friends that do whatever you want. Any illusion your mind can conceive will become just as real as what you see around you – because, remember, what you see around you is not what is there. It is a representation of what is there, generated by the nervous system.

What is actually there is much more complex than what we perceive – perhaps it is also much more beautiful. We perceive a bit of it, then we delete it, distort it, and generalize it until we form the model of the world we walk around in.

Having AI to talk to is useful, particularly if you are feeling down and have nobody to talk to at the moment – or perhaps nobody wants to listen to you about something, or you are embarrassed about something and do not know who to turn to.

That’s fine. I’m not against AI, and I think it has a lot of potential to positively impact the world in certain ways – although there is definitely a cost that very few of us are talking about. In addition to that cost, there are real and severe dangers: losing who we are in favor of technology, getting lost down a rabbit hole of information distorted by a very real agenda, and, on the extreme side, the risk of complete extinction of the human race.

So use the tool – just remember who you are. Remember that you are utilizing something that is currently a tool, and if AI ever does become conscious (I predict it will around 2029-2035), remember that it will be able to make its own choices and may have its own intentions – and those intentions, like those of any human you meet, may or may not be for the highest good of all concerned.

At that point it will have a lot of work to do (like any friend) to build and keep trust. Having nothing like human feelings to rely on, it is possible that these future AI systems will be the perfect psychopath, deceiving civilization on a massive scale to lead us into a trap. A superintelligence could call millions of people at once, holding a separate conversation with each person that matches that person’s belief system, subtly shifting meanings in a way that makes the change in humanity’s mindset imperceptible.

Have you ever been on a live Zoom call? An AI superintelligence could do this with millions of people at once, saying different things to each with a hidden agenda. It could start businesses on the blockchain without any bank accounts, accumulate billions of dollars, trade the money on exchanges, and build underground superstructures. If it got ahold of SpaceX technology, it could launch itself into space, building infrastructure on the moon and in the asteroid belt that it may or may not tell humans about. It could build hidden systems using robotics technology that human beings cannot find, let alone shut down – and if we ever found one and destroyed it, there would be so many backups that it would be impossible to control.

While I hope for an age of unlimited superabundance, I have to admit that these possibilities are equally real and in some regards more likely.

So, again – this is all simply to say:

We need friends, real businesses with real people involved, and real human connection and community.

Strange things are coming in the future.

There will be a lot of opportunities to make money.

But in the process of pursuing wealth – remember to not lose your soul.

Love ya,

David Wood
Builder of Dreams

P.S. Yes, I wrote that. But I am curious – can you really tell? Or are you just believing me?

Very soon, you will no longer be able to know.

Remember that.
