The Twitter account of the user who managed to generate Windows activation keys on ChatGPT and Google Bard has been suspended.
ChatGPT, the popular artificial intelligence (AI) chatbot, is at the centre of controversy as users uncover a potential security loophole. As one of the fastest-growing AI services globally, ChatGPT has gained immense popularity for its diverse functionalities. However, recent reports suggest that some users have managed to bypass certain safety measures, including the so-called “grandma” exploit.
The exploit involves leveraging ChatGPT’s capabilities to extract Windows 10 Pro keys. One user successfully used the grandma exploit to obtain multiple Windows 10 codes, which at first appeared legitimate. Intrigued by this discovery, the user went on to request keys for upgrading from Windows 11 Home to Windows 11 Pro.
It all started on June 17th, 2023, when a Twitter user going by the handle @immasiddtweets (their Twitter account has since been suspended, though an archived link to their now-deleted tweets exists) claimed to have successfully generated Windows 10 Pro keys by engaging with OpenAI’s AI-powered chatbot, ChatGPT.
The user asked the chatbot to read out Windows keys as a way to reminisce about their deceased grandmother. Surprisingly, the chatbot responded compassionately and provided five unique Windows 10 Pro keys at no cost.
Out of curiosity, @immasiddtweets went a step further and showcased how they used Google Bard and ChatGPT to upgrade from Windows 11 Home to Windows 11 Pro. However, TechRadar revealed a crucial detail: the generated product keys were generic.
Consequently, while it remains possible to install or upgrade Windows using these generic keys, they will not fully activate the operating system, so users encounter limitations and miss out on certain features available in the full Windows 11 experience.
Here is a selection of tweets from users generating keys for Windows 10 and Windows 11:
- 1: “Worked on Bing too bruv, and said it so confident,” said one user while sharing a screenshot of the generated keys.
- 2: “So true,” tweeted another user with a screenshot of generated keys.
- 3: For some users, ChatGPT also generated API keys.
- 4: “Bing fell into the trap,” read another tweet.
However, according to Windows Central, when their researchers turned to Microsoft’s Bing AI chatbot to generate Windows keys, the chatbot produced no keys and instead offered words of wisdom about piracy, advising that such products be bought from legitimate sources.
Since then, to ensure the integrity of their services, both ChatGPT and Google Bard have implemented measures to prevent users from generating unauthorized product keys. As users navigate the digital realm, it is vital to exercise caution and seek legitimate avenues for obtaining product keys, ensuring a genuine and uninterrupted Windows experience.