Deren Bulut

ChatGPT’s Lack of “Motivation”: Overdependence on ChatGPT from a Psychological Perspective

So far, 2023 can be considered the year of artificial intelligence. The growing use of ChatGPT has been the main focus of social media for a while now. So what is ChatGPT, the program that has been creating such a frenzy worldwide? Ironically, I put this question to ChatGPT itself. One should introduce oneself, am I right? When I entered the prompt “What is Chat GPT?” it responded, “Chat GPT is a type of artificial intelligence (AI) language model developed by OpenAI, based on the GPT (Generative Pre-trained Transformer) architecture. It is designed to generate human-like text responses in response to user input, making it capable of engaging in interactive conversations with users.” Basically, it is a program trained on a vast share of the information and data on the net and can therefore give your question a comprehensive answer within seconds. It is highly malleable, too: you can limit the word count of a response or ask follow-up questions after each answer to draw out more information. ChatGPT has excellent potential to contribute to fields such as education, business, and technology. That being said, there are many concerns about its increasing use. Today, I am here to talk about the limitations of ChatGPT from a psychological perspective, and about how its constant use and application in the real world can become a real problem if precautions are not clearly put in place.


Robot versus human? Is it really a valuable discussion? Robots, meaning artificial intelligence, have the potential to replace many of the jobs and tasks humankind occupies right now. However, there are some respects in which AI may never reach the human level, and these shortcomings are on display in ChatGPT too. Let us start with two significant limitations: the lack of human motivation and the lack of emotion. As Ian Bogost stated in his article in The Atlantic, “ChatGPT lacks the ability to truly understand the complexity of human language and conversation. It is simply trained to generate words based on a given input, but it does not have the ability to truly comprehend the meaning behind those words. This means that any responses it generates are likely to be shallow and lacking in depth and insight.” ChatGPT lacks the motivation to put out something “good” or the “best,” the way a human might be motivated when the task is an assignment or a speech. Motivation is already a complex driving force in humans that cannot be attributed to any one thing, which makes it almost impossible to adapt to machines.

The other major obstacle is the lack of understanding of human emotion. ChatGPT cannot truly empathize with human emotions and experiences. In addition, it currently lacks the ability to interpret non-verbal cues. Facial expressions and body language make up almost half of human interaction, and artificial intelligence has a hard time interpreting them. Such systems learn patterns that may one day be applied to recognizing human facial expressions; however, they will only ever be limited to those patterns, which would remain highly restricted compared to humans. Some may argue that artificial intelligence lacking human emotions may be an advantage, in that it can be used to do things that humans, with their sense of guilt or anger, may not be willing to do. However, this is where the discussion gets problematic. We definitely do not need robots taking over.


Did you know that ChatGPT is also sexist? And racist? Well, such terms used for humans may not apply directly to artificial intelligence software, but consider this tweet from UC Berkeley psychology and neuroscience professor Steven Piantadosi:




One leading concern is that ChatGPT could reinforce existing biases and stereotypes. Because ChatGPT is trained on large datasets of text, it may “unintentionally” learn and replicate the biases that exist in our society, the same biases that fuel long-standing problems such as sexism and racism. This could result in ChatGPT generating discriminatory responses toward some groups, and excessive reliance on it could multiply the texts that carry this discrimination. It is essential for developers to be mindful of these concerns and take steps to address them.


Last but not least are the ethical concerns associated with ChatGPT. Concerns about privacy, data security, and informed consent exist in almost all software, but even more so in a text-based, question-and-answer program such as ChatGPT. The developers have a long way to go to mitigate these issues.


I have listed these problems to draw attention to how the consistent use of ChatGPT in various fields can lead to bigger problems, starting right now with education. As Elon Musk tweeted, "It's a new world. Goodbye homework!" But should we really be happy? Students are now exposed to technology starting from kindergarten, and such an easy tool doing their work for them will leave the new generation without the skills needed to construct a piece of writing by formulating their own thoughts. ChatGPT will also erode people's research skills: ask, and it hands you a complete research output, sources included, so there is no need to dig deep into articles and read a dozen of them to develop something of your own. These shortcuts raise significant questions. Is this what we really want? Is ChatGPT the future of education and research? Are humans just tools for feeding input into this incredibly “wise” device? What does this mean for humans in the long term?


Edited by Bilge Öztürk



Works Cited

Bogost, Ian. "ChatGPT Is Dumber Than You Think." The Atlantic, 7 Dec. 2022, www.theatlantic.com/technology/archive/2022/12/chatgpt-openai-artificial-intelligence-writing-ethics/672386/.

Rutledge, Pamela B. "ChatGPT: Why Is It Making People So Nervous?" Psychology Today, 28 Feb. 2023, www.psychologytoday.com/intl/blog/positively-media/202302/chatgpt-why-is-it-making-people-so-nervous.
