In the same month that Grok opted for a second Holocaust over vaporizing Elon Musk’s brain, the AI chatbot is on the fritz again. Following the shooting at Bondi Beach in Australia, which took place during a festival marking the start of Hanukkah, Grok has been responding to user requests with inaccurate or completely unrelated information, as first spotted by Gizmodo.
Grok’s confusion is most apparent in its handling of a viral video that shows a 43-year-old bystander, identified as Ahmed al Ahmed, wrestling a gun away from an attacker during the incident, which has left at least 16 people dead, according to the latest news reports. Grok’s responses repeatedly misidentify the man who stopped one of the gunmen. In other cases, Grok replies to the same image with irrelevant details about allegations of targeted shootings of civilians in Palestine.
The latest replies still show Grok struggling with the Bondi Beach shooting: it offers information about the incident in response to unrelated requests, or conflates it with the shooting at Brown University in Rhode Island. xAI, Grok’s developer, hasn’t yet commented on what’s happening with its chatbot. It’s not the first time Grok has gone off the rails, though, considering it dubbed itself MechaHitler earlier this year.