
Reframing Our Fears: What Elon Musk’s AI Concerns Really Reveal

  • Published February 19, 2025

The question “Can AI take over the world?” is one of the most frequently posed when discussing the potential future of artificial intelligence. But is this question truly about AI, or does it touch on deeper, more primal fears we rarely address?

Elon Musk recently sparked a conversation about the role of AI in shaping society at the World Government Summit in Dubai, where he discussed a hypothetical AI designed with a focus on Diversity, Equity, and Inclusion (DEI). Musk warned that an AI programmed with such values could, theoretically, decide to eliminate powerful men in favor of achieving gender or diversity balance. He said:

“If hypothetically, AI is designed for DEI, you know, diversity at all costs, it could decide that there’s too many men in power and execute them.”

This statement may sound like the offhand musings of an influential figure, but it raises a more profound question: What if Musk’s fears about DEI in AI reflect a broader anxiety we all share? Could it be that our concern about AI taking control is less about the technology itself and more about a deeper fear of chaos and a loss of control?

In discussing the anxieties surrounding technology, media theorist Douglas Rushkoff sheds light on a complex historical context. Rushkoff, in his book Program or Be Programmed, suggests that technological advancement stems from a fear of the uncontrollable—particularly “the fear of nature, the fear of women, the fear of emotions and darkness.” These are primal fears rooted in the desire to dominate and control forces that seem chaotic or unpredictable.

According to Rushkoff, many tech leaders like Musk, driven by the desire to control this chaos, are crafting technology not just as a tool, but as a means to assert order over natural, emotional, and cyclical patterns. The focus on “controlling nature” and “dominating it” reveals an underlying fear of uncertainty.

When Musk warns that an AI might target those in power based on its programmed DEI values, we might focus on the gender or power dynamics he’s referencing. However, we should also consider that this type of question may stem from a broader unease. Musk’s question—“Can AI take over the world?”—might not only reflect fear of AI itself, but also a fear of what would happen if no one were in control.

The way we frame our questions about AI can offer insight into more than just technological trends; it can reveal something about human nature. Neurologist Erwin W. Straus wrote in his 1955 article, “Man, A Questioning Being,” that questions can be as revealing as dreams, showing more about the one asking than about the subject of the question itself.

In the case of AI, the frequent query, “Can AI take over the world?” may reflect a deeper existential worry about control. We ask this question because we are conditioned to assume that someone or something must govern our world. Whether it is a powerful individual like Musk or an AI created with DEI principles, the underlying belief is that control is essential. The notion that the world could be left to unfold without someone in charge is often dismissed.

Rather than continuing to ask, “Can AI take over the world?” perhaps it’s time to ask why we assume the world needs to be controlled in the first place. Why do we fear chaos, and who stands to benefit from our constant need to assert dominance—whether through human power or artificial intelligence?

AI is not inherently a tool for global domination. It is a tool that reflects the values and concerns of those who create it. So instead of focusing solely on the question of AI taking over, we could shift our focus toward asking:

“Why do we feel the need to control? What would happen if we embraced the unknown?”

The straightforward answer to the question of whether AI can take over the world is no. There will always be forces—be they nature, emotions, or unpredictable events—that technology cannot control. But the more pressing issue is how AI, and technology in general, can shape our thinking and discourse. If we allow ourselves to be driven by the fear of losing control, we risk letting technology dictate not only our lives but our very thought processes.

In the words of Rushkoff:

“Only the person who is aware of the programming is capable of questioning why it’s programmed that way.”

Forbes contributed to this report.