
# It’s terrifyingly easy for reporters to exploit Google’s News algorithms

I’ve spent the last eight months turning Google News into my personal playground. I manipulated the algorithm and made it surface my stories whether they were relevant to specific topics or not. This is a big problem.

I’m a regular reporter — a writer. I have no programming skills or formal education in computer science.

Google’s arguably the most technologically advanced AI company in Silicon Valley. It also happens to be worth more than two trillion dollars.

Google News reaches almost 300 million users. And I was able to game its algorithms by changing a single word on a web page. Scary, isn’t it?

We have “reinforcement learning” (RL) to thank for this particular nightmare.

Stupid in, stupid out

As Neural’s Thomas Macaulay recently wrote:

> [The reinforcement learning] technique provides feedback in the form of a “reward” — a positive number that tells an algorithm that the action it just performed will benefit its goal.

Sounds simple enough. It’s an idea that works with children (you can go outside and play once you’ve finished your chores) and animals (doggo does a trick, doggo gets a treat).
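To make the reward idea concrete, here’s a toy sketch (my own illustration in Python, not any platform’s actual code) of the kind of update rule a reward-driven system relies on: whatever estimate the algorithm keeps for an action gets nudged toward the reward that action just earned.

```python
# Toy reward update, assuming a single action and a made-up learning rate.
learning_rate = 0.1
estimated_value = 0.0  # how promising the algorithm currently thinks this action is

def update(value, reward):
    # Nudge the estimate toward the reward the action just earned.
    # reward = 1.0 for a "good" outcome (a click, a finished episode), 0.0 otherwise.
    return value + learning_rate * (reward - value)

estimated_value = update(estimated_value, reward=1.0)
print(estimated_value)  # 0.1 -- the action now looks a little more attractive
```

Repeat that millions of times across millions of users, and the actions that keep earning rewards come to dominate the estimates.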

Let’s use Netflix as an example. If you watch The Karate Kid, there’s a pretty good chance the algorithm will recommend Cobra Kai. And if 10 million people watch Tiger King, there’s a pretty good chance you’ll get a recommendation for it whether you’ve watched related titles or not.

Even if you never take one of the algorithm’s suggestions, it’s going to keep surfacing results because it has no choice.

The AI is designed to seek rewards, and it can only be rewarded if it makes a recommendation.
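Here’s roughly what that dynamic looks like in code (again, my own toy sketch with made-up titles and numbers, not Netflix’s system): the recommender’s only move is to recommend, so it keeps picking whatever currently looks most rewarding, even if you ignore every single suggestion.

```python
import random

# Estimated appeal of each title, learned from rewards (clicks, finished episodes).
scores = {"The Karate Kid": 0.0, "Cobra Kai": 0.0, "Tiger King": 0.0}

def recommend():
    # There is no "recommend nothing" option: the algorithm must pick a title,
    # because it can only earn a reward by making a recommendation.
    if random.random() < 0.1:                # occasionally explore something else
        return random.choice(list(scores))
    return max(scores, key=scores.get)       # otherwise exploit the current best guess

def record_outcome(title, watched):
    reward = 1.0 if watched else 0.0
    scores[title] += 0.1 * (reward - scores[title])

# Even if this user ignores every suggestion, the recommendations keep coming.
for _ in range(20):
    title = recommend()
    record_outcome(title, watched=False)
    print("Recommended:", title)
```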

And that’s something we can exploit. 

The data that feeds Netflix’s algorithms comes from its users. We’re directly responsible for what the algorithm recommends. Thus, hypothetically speaking, it would be trivial to exploit the Netflix recommendation system.

If, for example, you wanted to increase the total number of recommendations a specific piece of content got from the algorithm, all you’d have to do is sign up for X Netflix accounts, where X is however many it takes to move the needle, and watch that piece of content until the algorithm picked up the traffic.

Obviously it’s a bit more complicated than that. And there are safeguards Netflix can put into place to mitigate these threats, such as weighting data from older accounts more heavily and limiting the influence of accounts that don’t meet a minimum viewing-hours threshold.
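As a rough illustration of how such a safeguard could work (this is my own guess at the approach, not Netflix’s actual logic), you could ignore accounts below a viewing-hours threshold and discount views from brand-new accounts before counting any traffic toward recommendations:

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int          # how long the account has existed
    total_hours: float     # lifetime viewing hours
    views_of_title: int    # how many times it watched the title in question

MIN_HOURS = 20.0           # accounts below this threshold carry no weight

def weighted_views(accounts):
    total = 0.0
    for a in accounts:
        if a.total_hours < MIN_HOURS:
            continue                               # freshly created burner accounts are ignored
        age_weight = min(a.age_days / 365.0, 1.0)  # older accounts count more, capped at 1.0
        total += age_weight * a.views_of_title
    return total

# A thousand day-old burner accounts barely register next to a handful of real viewers.
burners = [Account(age_days=1, total_hours=0.5, views_of_title=5) for _ in range(1000)]
regulars = [Account(age_days=800, total_hours=400.0, views_of_title=1) for _ in range(50)]
print(weighted_views(burners + regulars))  # 50.0 -- only the regular accounts contribute
```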

At the end of the day, this isn’t a significant issue for Netflix because every piece of content on the platform has to be explicitly approved. Unlike Google News, Netflix doesn’t source content from the internet.

It’s the same with Spotify. We could sign up for 10 million free accounts, but that would take forever and we’d still just be upping streams for an artist who was already curated onto the platform by humans.

But the Google News algorithm is different. Not only does it source content from the internet and aggregate it based on popularity, it also sources important data points from journalists like me.
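To see why those data points matter, here’s a purely hypothetical ranking function. Google hasn’t published how Google News actually scores stories, so every name and weight below is invented; the point is only that when a score mixes an article’s popularity with signals pulled from the author’s profile, a single word in that profile can move a story up the results.

```python
# Purely hypothetical scoring function -- NOT Google's algorithm, which isn't public.
# It just shows how mixing popularity with author-profile signals makes a single
# profile word capable of shifting the ranking for a query.
def score_article(clicks, author_profile, query_terms):
    popularity = clicks / 1000.0
    profile_words = set(author_profile.lower().split())
    # Count how many query terms also appear in the author's profile text.
    author_match = sum(1 for term in query_terms if term.lower() in profile_words)
    return popularity + 0.5 * author_match

query = ["artificial", "intelligence", "queer"]
print(score_article(1000, "Editor covering artificial intelligence", query))        # 2.0
print(score_article(1000, "Queer editor covering artificial intelligence", query))  # 2.5
```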

How I exploited Google’s News algorithms to surface my own content

Last June, I wrote about a strange effect my TNW author profile had on the stories Google News surfaced for the search string “artificial intelligence queer.”

As one of the world’s few queer editors in charge of the AI section at a major tech news outlet, I take a keen interest in the intersection of artificial intelligence technologies and diversity issues.

AI and LGBTQ+ topics were also a popular combination for tech reporters to cover at the time because June is Pride month.

I was shocked to discover that a disproportionate number of the articles I’d written showed up in the search results.

[Screenshot: Google News search results]
