Dear Despots & Tyrants, Generative AI Could Be The Authoritarian Breakthrough In Brainwashing You've Been Waiting For

by Tyler Durden
Tuesday, Feb 28, 2023 - 08:26 PM

Via Climateer Investing blog,

Have you ever found yourself sitting at home in the Palace thinking: "If only there were an easier way to get people to do my bidding?"

Well, now there is: take your nudge game up a notch with the new "They'll think it's free will" starter pack...

From The Hill, February 27:

Generative AI is poised to be the free world’s next great gift to authoritarians. The viral launch of ChatGPT — a system with eerily human-like capabilities in composing essays, poetry and computer code — has awakened the world’s dictators to the transformative power of generative AI to create unique, compelling content at scale.

But the fierce debate that has ensued among Western industry leaders on the risks of releasing advanced generative AI tools has largely missed where their effects are likely to be most pernicious: within autocracies. AI companies and the U.S. government alike must institute stricter norms for the development of tools like ChatGPT in full view of their game-changing potential for the world’s authoritarians — before it is too late.

So far, concerns around generative AI and autocrats have mostly focused on how these systems can turbocharge Chinese and Russian propaganda efforts in the United States. ChatGPT has already demonstrated generative AI’s ability to automate Chinese and Russian foreign disinformation with the push of a button. When combined with advancements in targeted advertising and other new precision propaganda techniques, generative AI portends a revolution in the speed, scale and credibility of autocratic influence operations.

But however daunting Chinese and Russian foreign disinformation efforts look in a post-GPT world, open societies receive only a small fraction of the propaganda that Beijing and Moscow blast into their own populations. And whereas democratic powers maintain robust communities of technologists dedicated to combating online manipulation, autocrats can use the full power of their states to optimize their propaganda’s influence.

In 2019, China’s Xi Jinping demanded just that when he ordered his party-state to leverage AI to “comprehensively increase” the ability of the Chinese Communist Party to mold Chinese public opinion. Russia’s Vladimir Putin has similarly doubled down on AI-enabled propaganda in the wake of his Ukraine invasion, including a fake video of Ukrainian President Volodymyr Zelensky calling for Ukrainians to surrender. These efforts are buttressed by a dizzying array of Chinese and Russian agencies tasked with thought control, cultivating a competitive ecosystem of digital propaganda tools underwritten by multibillion-dollar budgets each year.

China and Russia are, in other words, fertile ground for generative AI to usher in a historic breakthrough in brainwashing — a recipe for more international catastrophes, greater human rights abuses, and further entrenched despotism.

As China refines and exports its techno-authoritarianism, would-be tyrants the world over are likely to cash in on the propaganda revolution.

Companies instead need to treat generative AI development with the caution and security measures appropriate for a technology with immense potential to fuel despotism, and refrain from open-sourcing technical specifics of their cutting-edge models.

The U.S. and allies should also invest aggressively into counter-propaganda capabilities that can mitigate the coming waves of generative AI propaganda — both at home and within autocracies.

Luckily, companies in the United States and allied nations have largely led the advance of generative AI capabilities...

Read more here...

Batteries sold separately. 

And why is it "lucky" that the U.S. has led the development?

[ZH: The message is clear - we Americans have nothing to fear from generative AI misuse... apart from its misuse by Russia and China... and if you believe that, we have a bridge we'd like to sell you.]