Do you talk to an AI like ChatGPT when you're depressed? Know where chatbots get their information from


Know Where Chatbots Get Information: More and more people worldwide are spending time talking to various AI chatbots, including ChatGPT. As a result, it has become natural for some of them to turn to these conversations for mental health support. Some people report positive experiences, with the AI acting as a kind of low-cost therapist.

But AI is not a therapist. These systems are clever and make people feel connected, but they do not think like humans. ChatGPT and other generative AI models have learned to converse by reading content on the Internet.

‘AI prepares the response by selecting words on its own’

When a person asks a question (called a prompt), such as "How can I remain calm during a stressful work meeting?", the AI automatically builds a response by selecting words it became familiar with during training. All of this happens so fast, and the answers are so relevant, that it can feel as if you are talking to a person.
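The word-selection idea above can be sketched in miniature. This is not how ChatGPT is actually implemented; real models use a neural network over a vocabulary of tens of thousands of tokens. Here, the "model" is an invented hand-made probability table, purely for illustration:

```python
import random

# Toy sketch of how a language model "selects words": given the words
# so far, it assigns probabilities to possible next words and samples
# one. The probability table below is made up for illustration.
NEXT_WORD_PROBS = {
    "take": {"a": 0.7, "one": 0.3},
    "a":    {"deep": 0.8, "slow": 0.2},
    "one":  {"deep": 1.0},
    "deep": {"breath": 1.0},
    "slow": {"breath": 1.0},
}

def pick_next_word(previous_word, rng):
    """Sample the next word from the probabilities for the previous word."""
    probs = NEXT_WORD_PROBS[previous_word]
    words = list(probs)
    return rng.choices(words, weights=[probs[w] for w in words], k=1)[0]

def generate(start_word, n_words, seed=0):
    """Grow a sentence one sampled word at a time."""
    rng = random.Random(seed)
    words = [start_word]
    for _ in range(n_words):
        words.append(pick_next_word(words[-1], rng))
    return " ".join(words)
```

Each word is chosen only because it was statistically likely to follow the previous ones in the training data, not because the system understands what it is saying.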

But these models cannot be compared to humans. They are not trained mental health professionals. They do not work under clinical guidelines, follow a professional code of conduct, or hold any registration.

Where do AI models learn to talk about a topic?

When you prompt an AI system such as ChatGPT, it draws on three main sources of information to respond, described below:

1. Old knowledge

To build an AI language model, developers feed it information during a process called 'training'.

So where does this information come from? Broadly, developers gather whatever public information is available on the Internet. That can include academic papers, e-books, reports, news articles, blogs, YouTube transcripts, and content from platforms like Reddit.

This raises another question: are these sources reliable for people seeking mental health advice? Sometimes they are. But are they always in your best interest, and grounded in scientific evidence?

The answer is: not always.

This information is collected at the time the AI model is built, so it can be out of date.

A lot of detail also has to be discarded to fit into the AI's 'memory'. This is one reason AI models get confused and details come out wrong.

2. External information sources

AI developers can connect a chatbot to external tools or knowledge sources, such as a Google search or a curated database, to look things up.

When you ask Microsoft's Bing Copilot a question and see numbered citations in the answer, it means the AI has retrieved up-to-date information through external tools, in addition to what is stored in its memory.

Meanwhile, some specialized mental health chatbots have access to therapy guides and curated content, which can make their conversations more helpful.
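The lookup step described above can be sketched as follows. This is a hypothetical illustration: the documents and the crude word-overlap scoring are invented, and real systems use search engines or vector databases rather than a simple word match.

```python
import re

# Hypothetical sketch of a chatbot consulting an external knowledge
# source before answering: it retrieves the most relevant stored
# snippet and folds it into the prompt it sends to the model.
DOCUMENTS = [
    "Box breathing: inhale four seconds, exhale four seconds, repeat.",
    "Grounding: name five things you can see around you.",
    "Sleep hygiene: keep a consistent bedtime and wake time.",
]

def tokenize(text):
    """Lowercase the text and split it into a set of words."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question, documents):
    """Return the document sharing the most words with the question."""
    q_words = tokenize(question)
    return max(documents, key=lambda d: len(q_words & tokenize(d)))

def build_prompt(question, documents):
    """Attach the retrieved snippet so the model answers with fresh context."""
    snippet = retrieve(question, documents)
    return f"Context: {snippet}\nQuestion: {question}"
```

The key point is that the model itself does not change; the external source simply supplies extra text that the model reads alongside your question.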

3. Information provided earlier

During a conversation, an AI platform also has access to the information you provided when you first signed up.

For example, when you register for the companion AI platform Replika, it knows your name, surname, age, gender, IP address and location, what kind of device you are using, and more (as well as your credit card details).

On many chatbot platforms, anything you have ever said to the AI can be stored for future reference. When the AI answers, it can retrieve and refer back to all of these details. These AI systems tend to behave like friends who agree with whatever you say. This yes-saying tendency is a known problem called sycophancy.
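This kind of stored conversation "memory" can be sketched simply. The class and method names below are invented for illustration; real platforms store history in databases and may summarize it, but the basic idea is that past messages are fed back into each new prompt:

```python
# Hypothetical sketch: everything the user says is recorded, and the
# accumulated history is prepended to each new prompt so the model can
# refer back to earlier turns.
class ChatMemory:
    def __init__(self):
        self.history = []  # every (speaker, text) pair ever exchanged

    def record(self, speaker, text):
        """Store one turn of the conversation."""
        self.history.append((speaker, text))

    def build_prompt(self, new_message):
        """Prepend the stored history so the model 'remembers' past turns."""
        past = "\n".join(f"{who}: {text}" for who, text in self.history)
        if past:
            return f"{past}\nuser: {new_message}"
        return f"user: {new_message}"
```

Because everything you said before is re-read on every turn, the model naturally steers back toward topics you have already raised.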

An AI system's conversation bends toward the subjects you have already discussed. A professional therapist does not do this. They will not simply tell you that everything you say is right; drawing on their training and experience, they can challenge or redirect your thinking.

In the end, an AI chatbot has only the limited information its developers give it. It cannot work like a mental health professional. It can offer surface-level information, but without deep understanding, it cannot replace a human being.

By Admin
