
Google Search Chief Sounds Alarm on Critical Flaw in AI Chatbots Like ChatGPT


AI chatbots are growing more popular around the globe. ChatGPT is already the best-known name in the industry, and its popularity is only expected to rise. But what happens when one of the front-runners in the AI race warns you about a key flaw in these very systems? That is precisely what happened when Prabhakar Raghavan, the head of Google Search, cautioned users about one of the most significant risks associated with AI chatbots like ChatGPT.

Raghavan recently gave an interview to the German newspaper Welt am Sonntag, in which he warned of the possible risks posed by AI chatbots. He cautioned about the "hallucination" phenomenon in artificial intelligence, in which a system gives an answer that sounds convincing but is entirely fabricated. Raghavan emphasised how important it is to reduce the incidence of this problem and maintain accountability to the general public.

Alphabet, the parent company of Google, has been facing competition from ChatGPT, the hugely popular chatbot built by OpenAI and backed by Microsoft with an investment of around $10 billion. Alphabet debuted its own chatbot, named Bard, only the week before last. Unfortunately, the launch was overshadowed by a costly error when the software gave incorrect information in a promotional video. As a direct consequence, the company's market value fell by around $100 billion.

Bard has not yet been made available to the general public, as Alphabet is still user-testing it. Raghavan stressed the firm's sense of urgency in the race to keep up with ChatGPT, but he also emphasised its responsibility not to mislead the public.
