
Meta Wants to Fix Wikipedia's Biggest Problem Using AI – Review Geek


Image: The Meta logo over the Wikipedia logo. (Meta, Wikipedia)

Despite the efforts of over 30 million editors, Wikipedia sure ain't perfect. Some information on Wikipedia lacks a genuine source or citation, and as we learned from the Pringles mascot hoax, that can have a wide-ranging impact on culture and "facts." But Meta, formerly Facebook, hopes to solve Wikipedia's big problem with AI.

Note: To be clear, this is an independent project by researchers at Meta AI, a division of the Meta corporation. The Wikimedia Foundation is not involved, and Wikipedia isn't using SIDE to automatically update articles.

As detailed in a blog post and research paper, the Meta AI team built a citation-checker AI called SIDE on a dataset of over 134 million web pages. Using natural language understanding, SIDE can analyze a Wikipedia citation and determine whether the cited source actually supports the claim it's attached to. It can also find new sources for information already published on Wikipedia.
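
Conceptually, that's a two-step workflow: check whether the cited source supports the claim, then rank better candidates if it doesn't. Below is a minimal, hypothetical sketch of the idea; it is not SIDE's actual code or API, just an illustration using the open-source sentence-transformers library, with the model name, similarity threshold, and example passages picked arbitrarily.

```python
# Toy sketch of a verify-then-suggest citation check.
# NOT Meta AI's SIDE implementation -- just an illustration of the concept
# using the sentence-transformers library. Model, threshold, and passages
# are arbitrary placeholders.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose text embedder

claim = ("Joe Hipp was the first Native American to compete for the "
         "WBA World Heavyweight Title.")

cited_passage = "A page that never mentions Joe Hipp or the Blackfeet Tribe."
candidate_passages = [
    "Joe Hipp, a member of the Blackfeet Nation, challenged for the WBA "
    "heavyweight championship in 1995.",
    "The Blackfoot Confederacy is made up of four nations in Montana and Alberta.",
]

claim_emb = model.encode(claim, convert_to_tensor=True)

# Step 1: verification -- does the cited passage actually support the claim?
cited_score = util.cos_sim(
    claim_emb, model.encode(cited_passage, convert_to_tensor=True)
).item()
if cited_score < 0.5:  # arbitrary threshold for flagging
    print(f"Flag citation for human review (support score = {cited_score:.2f})")

# Step 2: suggestion -- rank alternative passages by how well they support the claim.
cand_embs = model.encode(candidate_passages, convert_to_tensor=True)
scores = util.cos_sim(claim_emb, cand_embs)[0]
best = int(scores.argmax())
print(f"Suggested replacement: {candidate_passages[best]!r} "
      f"(support score = {scores[best].item():.2f})")
```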

Image: An example of how SIDE can fact-check and suggest new citations on Wikipedia. (Meta AI)

Meta AI highlights the Blackfoot Confederacy Wikipedia article as an example of how SIDE can improve citations. Scroll toward the bottom of that article and you'll learn that Joe Hipp was the first Native American to compete for the WBA World Heavyweight Title, a cool fact that is 100% true. But here's the problem: whoever added this factoid cited a source that has nothing to do with Joe Hipp or the Blackfeet Tribe.

In this case, Wikipedia editors failed to check the veracity of a citation (the problem has since been fixed). But if the editors had SIDE, they could have caught the bad citation early. And they wouldn’t need to look for a new citation, as SIDE would automatically suggest one.

At least, this is the hypothesis put forth by Meta AI researchers. While SIDE is certainly an interesting tool, we still can’t trust AI to understand language, context, or the veracity of anything published online. (To be fair, Meta AI’s research paper describes SIDE as more of a “demonstration” than a working tool.)

Wikipedia editors can now test SIDE and assess its usefulness. The project is also available on GitHub. For what it's worth, SIDE looks like a super-powered version of the tools that Wikipedia editors already use to improve their workflow. It's easy to see how such a tool could flag citations for humans to review, at the very least.

Source: Meta AI
