
YouTube announced a new measure to combat conspiracy theories, linking viewers directly to third-party sources such as Wikipedia, a month after its top trending video suggested a Stoneman Douglas High School shooting survivor was an actor.

The Google-owned video platform, which has been under pressure to clamp down on the conspiracy-focused videos that flourish after major news events, announced the new “information cues” at South by Southwest (SXSW) on Tuesday.

“We’re always exploring new ways to battle misinformation on YouTube. We announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, like the moon landing,” a YouTube spokesperson told Fox News.

The new features, which will be rolled out in the coming months, are part of a range of initiatives that YouTube is considering to eradicate misinformation from the platform. The company did eventually remove the video suggesting Parkland survivor David Hogg was an actor and issued an apology.

Once the features are implemented, a video calling into question the events of Sept. 11 or the Apollo moon landing, for example, could be accompanied by a link to the relevant Wikipedia article on those events.

“Our goal is to start with a list of internet conspiracies listed on the internet where there is a lot of active discussion on YouTube,” YouTube CEO Susan Wojcicki said at SXSW, reports Wired.

However, simply placing links to verified information alongside conspiracy clips doesn’t mean that users will actually click on them. Moreover, YouTube’s own recommendation algorithm has been shown to push viewers down rabbit holes of increasingly vile, radical content regardless of political affiliation. And Wikipedia itself is battling sock puppets (the use of multiple accounts to skirt its rules), bad sourcing and fringe editors, according to the Southern Poverty Law Center.

“People can still watch the videos, but then they have access to additional information,” added Wojcicki.

Still, nothing in YouTube’s community guidelines specifically prohibits the uploading of conspiracy videos.

In breaking news situations, the web’s speed becomes a double-edged sword for tech companies trying to combat fake news and disinformation: Will YouTube be able to keep up with the spread of conspiracy theory videos during the next major news event?

That’s an open question.