In September 2024, Hurricane Helene ripped through the mountains of North Carolina, triggering floods that toppled trees, swept away houses and left remote towns isolated after bridges and roads were damaged. The physical destruction was only part of the problem. In the days that followed, outside actors—including white supremacists, armed militias and conspiracy-driven groups—arrived in affected areas, complicating relief and stretching local law enforcement.
Many of these groups operate independently of officials. Chapters of Patriot Front and similar networks showed up to distribute food, clear debris and record themselves doing aid work, and those images and videos were then shared widely on social platforms. Reporters at 60 Minutes and elsewhere noted that recruitment was a primary motive for some of these visitors: the footage lets them present a clean-cut, helpful image that attracts followers while masking their extremist agendas, turning the crisis itself into a recruiting ground.
Researchers who study online networks say this is deliberate. John Kelly of Graphika pointed to a post from a self-described white nationalist Active Club showing masked men clearing branches after the storm with captions invoking racially coded slogans like “white unity.” Extremist organizers often downplay the more off-putting symbols and rhetoric during disasters, preferring content that appears mainstream and service-oriented so they can win sympathy and followers without immediate backlash.
Those humanitarian-style posts are frequently paired with misinformation. Conspiracy narratives that blame government agencies for failed rescues or paint official responses as corrupt spread quickly in the wake of disasters. Kelly and others emphasize that conspiracy theories are powerful recruitment tools because they resonate emotionally and can be adapted to a wide audience.
Foreign influence operations also exploit natural disasters. State-linked accounts and networks associated with China, Russia and Iran have been observed amplifying divisive American posts during crises to push narratives that align with their strategic goals. Analysts describe a two-tiered approach in some campaigns: an initial set of accounts seeds a message and a second wave of assets amplifies it, sometimes followed by coverage from state media or sympathetic officials.
A clear example involved an X post that contrasted damage in Asheville from Hurricane Helene with a peaceful street scene in Kyiv, arguing that domestic relief was meager while U.S. taxes supposedly funded foreign pensions. That post fit an existing domestic grievance and was later reshared by an account linked to Chinese influence activity—illustrating how foreign networks don’t always invent stories but often boost and spread ones that already resonate at home.
Generative AI has made the problem worse by making convincing fabrications easy to produce and distribute. Analysts pointed to a viral image from the Helene aftermath, an entirely AI-generated picture of a crying girl in a boat holding a puppy. Other AI-manufactured images have exaggerated or invented scenes of destruction, such as doctored photos of the Hollywood sign ablaze during separate wildfires, designed to grab attention and provoke strong emotional reactions.
Whether images and videos are authentic or fabricated, their effect can be similar: they attract eyeballs, inflame emotions and accelerate polarizing narratives. That attention is what extremists and foreign actors are often after—using the human response to disaster as a vector for recruitment, radicalization and influence. As disasters concentrate public focus, they create openings for actors to reframe events, spread falsehoods and recruit followers under the cover of humanitarianism.
Combating this dynamic requires vigilance from platforms, clearer coordination between relief agencies and local authorities, and greater media literacy among the public, so that people can distinguish genuine aid efforts from performance-driven operations and identify manipulated content. Without those defenses, crises will remain fertile ground for actors seeking to manipulate attention, build audiences and push divisive agendas.