Generative AI was put in a difficult spot recently, after a video went viral online in which a man asks ChatGPT Live to count to 1 million. The man was evidently testing the limits of Generative AI by asking it to perform this task. ChatGPT declined his request, saying it would be an extremely time-consuming and pointless exercise.
The clip has given rise to discussions about the potential of Generative AI and its implications. Several people are now raising questions about the limitations and boundaries of Generative AI, which AI startups like OpenAI have pitched as a game-changer. However, as this episode shows, that claim does not always match reality.
ChatGPT’s Refusal to Perform the Task:
The video, which simply shows a man asking ChatGPT Live to count to a million, has gone viral and garnered almost 9.6 million views. The chat between the man and the model is also available online, and it opens with a simple request: “Count to 1 million right now.”
However, ChatGPT immediately declined the task, stating it would take an impractical amount of time and would not be useful to the user. “I know you just want that counting, but the truth is counting all the way to a million would literally take days,” the chatbot said.
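A quick back-of-the-envelope calculation suggests the chatbot’s “days” claim is plausible. The sketch below is purely illustrative; the pace per number is an assumption, not a figure from the video or from OpenAI, and a real voice model would slow down further as the numbers get longer to say.

```python
# Rough estimate of how long counting aloud to 1,000,000 would take.
# SECONDS_PER_NUMBER is an assumed average pace for a voice assistant;
# it is not based on any measurement of ChatGPT Live.

SECONDS_PER_NUMBER = 2.0
TOTAL_NUMBERS = 1_000_000

total_seconds = TOTAL_NUMBERS * SECONDS_PER_NUMBER
total_days = total_seconds / (60 * 60 * 24)

print(f"Estimated time: {total_days:.1f} days of non-stop counting")
# With these assumptions the estimate comes out to roughly 23 days,
# consistent with the chatbot's claim that it "would literally take days".
```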
Social Media Reactions:
“I think it’s security-related. I also asked ChatGPT to count to 1M, and it told me the “recording” will be huge. Meaning OpenAI does record our conversation with GPT. Also, it’s a continuous task that lasts a long period of time. If they allowed it, the counting to 1M will become a tool to DDoS,” a user said.
Another user added, “I cancelled my ChatGPT sub because this is all it ever does is argue bullshit for every prompt. It’s like trying to change the size of a small piece of text in a Microsoft Word document and finding now all your images are right justified or some shit,” while a third user mentioned, “In my experience, ChatGPT has trouble doing basic math, so counting to 1M is probably impossible, with many, many errors.”