Grok will no longer call itself Hitler or base its opinions on Elon Musk’s, promises xAI

xAI has offered a couple more fixes for “issues” with its Grok AI chatbot, promising it will no longer name itself “Hitler” or base its responses on searches for what xAI head Elon Musk has said.

According to an X post earlier today, the chatbot’s latest update sets new instructions that its responses “must stem from your independent analysis, not from any stated beliefs of past Grok, Elon Musk, or xAI. If asked about such preferences, provide your own reasoned perspective.” 

The changes follow more than a week of controversy for Grok. In recent days, multiple reports showed that when asked its opinion about hot-button topics like Israel and Palestine, immigration, and abortion, the chatbot first searched for Musk’s opinion on the matter before responding. In its Tuesday post, xAI said that the reason for this was that when asked about its views, “the model reasons that as an AI it doesn’t have an opinion but knowing it was Grok 4 by xAI searches to see what xAI or Elon Musk might have said on a topic to align itself with the company.” 

The company also addressed another controversy from over the weekend, in which Grok 4 Heavy, the chatbot’s $300-per-month subscription product, responded that its surname was “Hitler.” In its statement, xAI attributed this to media headlines covering an even earlier incident: Grok going off the rails in a multi-day series of tirades in which it denigrated Jews and praised Hitler. (It also posted graphic sexual threats against a user.) Since Grok doesn’t have a surname, xAI said, it “searches the internet leading to undesirable results, such as when its searches picked up a viral meme where it called itself ‘MechaHitler.’” The new instructions should prevent this, according to the company.

Grok’s antisemitism isn’t limited to the recent past — in May, the chatbot went viral for casting doubt on Holocaust death tolls. But its responses escalated dramatically this month after a set of changes to its system prompts, including that it should “assume subjective viewpoints sourced from the media are biased” and that its response “should not shy away from making claims which are politically incorrect, as long as they are well substantiated.” The “politically incorrect” instruction was briefly removed before being re-added in recent days.

During the livestream release event for Grok 4 last week, Musk said he’s been “at times kind of worried” about AI’s intelligence far surpassing that of humans, and whether it will be “bad or good for humanity.” 

“I think it’ll be good, most likely it’ll be good,” Musk said. “But I’ve somewhat reconciled myself to the fact that even if it wasn’t going to be good, I’d at least like to be alive to see it happen.”

Now, xAI says that after rolling out these latest updates, the company is “actively monitoring and will implement further adjustments as needed.”
