The Algorithms Will Not Save Us


#1

Open Thread is where Waypoint staff talk about games and other things we find interesting. This is where you'll see us chat about games, music, movies, TV, and even sports, and welcome you to participate in the discussion.


This is a companion discussion topic for the original entry at https://waypoint.vice.com/en_us/article/bj55y3/the-algorithms-will-not-save-us

#2

Oh man, YouTube is such a cesspool. Just look at the classic example of watching one Feminist Frequency video and getting inundated with alt-right and grumbleglop videos in your recs. The biggest shame is that there are creators I love on the service, like Mark Brown and nerdwriter. But logging onto the front page is garbage and I wish that YouTube would take more responsibility for what goes up there.


#3

It does feel a bit like there should be some database (internal? co-operative between outlets?) of what needs to be covered while covering games. Before the Storm is constantly highlighted here (after some brief initial coverage that didn’t focus on the labour situation, with the Japanese publisher and a US union bashing heads), but we didn’t see the Mario coverage talking about “It’s Mario time” labour exploitation. The focus on BotW didn’t circle around the queerphobic elements of the game, from a company that has, for a long time, pushed that regressive notion that all-ages means straight-only. When Nintendo US acted to fire staff based on GG petitioning, it was a story, but it’s not a continued point of commentary when their games come out.

This feels like some companies are not just getting a pass but can, via how they craft their branding of regressive politics, pretend to be apolitical, and as long as people are enjoying their games (which usually don’t even try to engage with marginalised communities), everyone just nods along.

The algorithms will not save us, but neither will manual curation be a magical step forward if it cannot contend with how major players leverage their position (and even childhood affection for these eternal brands and IPs), as negative coverage only points elsewhere (which, even when covering an incident, can read as putting blame on workers rather than executives). I thought it was a positive step when the Naughty Dog stuff came up that the link was talked about: this is a Sony subsidiary studio, and this is ultimately a Sony HR failure (they are not just publishing a game from an independent team; this is their studio, and their org chart covers HR there). Sony should not be able to just pretend there is nothing they need to be doing to resolve the situation.


#4

There is a solution to this, but it’s not one that any company is moving towards in any meaningful way. The answer is twofold:
1) hire humans to moderate content.
2) PAY & SUPPORT THEM WELL.

I used to do forum moderation professionally (for an actual paycheck from a corporation) and was severely under-paid. One of the companies I worked for even explicitly stated over and over again how important their online community is to them, and…still paid me half of what their lowest-paid salesperson was making.

Patrick is right: the algorithms will never ever be perfect (or even come close), and there is a massive need for oversight by trained human moderators.

But why would anyone ever want to be one?

Doing moderation is a very tough gig that can take a huge toll on your mental health, especially if you’re struggling in any other areas of your life. Combine that with extremely low pay and leadership that doesn’t understand what you do, why it’s important, or (crucially) how you contribute to the bottom line, and you have a situation where, even if your company does have some sort of moderation position, you’re given no support of any kind, have no room for advancement or pay increases, and have no say in what needs to be changed within the product to support better moderation.

Companies think so little of moderation that they frequently outsource it, allowing them to offer even LESS support and pay to moderators who are exposed to constant horrible abuse.

We need a huge overhaul and revamp of the way we think about moderating internet content. We need to treat it as a viable career path, with standards and guidelines and mentors. We need a moderator union that offers support and resources (considering the state of unions in general in the US, this is an even longer shot). We need huge internet companies to understand that their platforms will no longer exist if they don’t invest in better moderation, and to act accordingly.

Until those things happen, anyone who has any desire to be a professional moderator will burn out. There’s a reason there are no “expert moderators” that companies consult for training: they don’t exist.


#5

The (actual) problem is not that such a database doesn’t exist, but that the popular consensus about what needs to be covered doesn’t include any of the things you mention. Waypoint is the exception, not the rule.

I mean, I don’t think anyone is pretending that human beings are magic. Curation is necessary, but not sufficient. The problem is that tech companies essentially have their fingers in their ears when told how bad their algorithms are, until ad companies and bad PR force them to move an inch.

EDIT: I do think it’s worth talking about how our feelings toward a game or franchise can create predictable blind spots in our criticism, which can then be implicitly or explicitly exploited.


#6

I would frame this more as… big tech companies are often primarily ad companies, and they prioritise that (with that model of taking massive VC investment, i.e. debt, which requires recouping via the very worst of exploitative capitalist models), so the specifications for the algorithms are always going to tilt to that side (and algorithms are the only things they consider cost-effective to scale, which is required to repay that debt/expectation of growth). Yes, an engineer probably has some idea of user engagement in the spec document (possibly mislabelled as “satisfaction”, but if you only measure engagement then you’re only optimising for engagement), but really YouTube is an ad platform. Google search is an ad platform. They also happen to share user-generated videos or website links, but that’s not the core business model. The entire digital ad business is about using algorithms to label users and serve up custom ads while maximising the invasion of privacy that the system enables - something only possible because algorithms are cheap and can scale, so you can do something this intrusive and claim that, because no humans are involved, you’re not a creepy stalker company working to brainwash the public.
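
To make that “if you only measure engagement, you’re only optimising for engagement” point concrete, here’s a toy sketch (entirely made up, not anyone’s real ranking code; the names and numbers are just for illustration): when predicted watch time is the only signal in the objective, the ranker literally cannot distinguish outrage bait from anything else, because nothing else is measured.

```python
# Toy ranking sketch - not how YouTube actually works, just an illustration of
# "if you only measure engagement, you only optimise for engagement".
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # the only signal the objective asks for
    is_borderline_content: bool     # a signal nobody told the ranker to use

def rank_by_engagement(videos: list[Video]) -> list[Video]:
    # The objective only sees watch time, so outrage bait that keeps people
    # watching scores exactly as well as anything else.
    return sorted(videos, key=lambda v: v.predicted_watch_minutes, reverse=True)

candidates = [
    Video("calm explainer", 4.0, False),
    Video("outrage bait", 11.5, True),
    Video("music video", 3.2, False),
]

for v in rank_by_engagement(candidates):
    print(v.title, v.predicted_watch_minutes)
# "outrage bait" tops the feed; the borderline flag never enters the objective,
# so it never affects what gets recommended.
```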

When it comes to bad PR: wasn’t it Rightist outrage that got Facebook to drop their human moderation of the trending topics feed? So that’s a good example of how a company was using people as a final check on an algorithm but the bad PR forced them into removing the curation (as facts skew Centre-Left and Rightists got mad their fake news was being removed from the results).

I absolutely think most engineers working on these sorts of algorithms are quite aware of where they fail, but also that finding a solution which doesn’t impair the business requirements is extremely hard. The problem often starts out as edge cases, but as everyone builds content for the [partially obscured] algorithm, all the edge cases become a focus of future content; and the business requirement is increasing use of the platform, even if it’s less than optimal use (e.g. hate-sharing can still boost a platform). Yes, there’s some false view of a utopian apolitical neutral, but I’d say that’s not the only factor.


#7

YouTube, Twitter, Facebook, etc. have clearly spent their time concentrating on the accumulation of power without any consideration of what would happen at the end of this ride.
At best it’s a naive worldview along the lines of “open communication will bring us together, we just have to provide the medium”; at worst it’s “I got my billions, you all can fuck off”.

edit: I still feel like Waypoint could have handled the Kingdom Come coverage better, but after reading how many other people used the exact same “having cake and eating it too” metaphor in the ResetEra thread dedicated to the podcast, I feel like a real twit.


#8

yup.

not a big fan of any of these companies / “services”. avoid them whenever possible.


#9

That was Facebook falling face first into its own trap. They claimed that Trending Topics was always algorithm driven, and when sites figured out it wasn’t, their only viable move was to actually make that the case, with predictably disastrous results.


#10

There’s a hard battle coming.

At least in the states, a lot of our self-identity is wrapped up in the idea that consistent rules consistently enforced are the best way to ensure equality, and the more human judgement enters into the mix, the more biased the results become.

Rules aren’t impartial, and neither are systems. Any system has to choose which variables to monitor and which to ignore. When you’re judging anything with internal relationships and complexity, initially microscopic differences can lead to vastly different outcomes. Every algorithm is biased by what the framers decide is and is not important to measure.
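
To make that last sentence concrete, here’s a tiny made-up example (the field names and numbers are invented, not from any real moderation system): two “consistently enforced” rules look at the same report queue, differ only in which variable their framers decided to measure, and hand you different outcomes for identical input.

```python
# Toy illustration: same report queue, two "impartial" rules that differ only
# in which variable the framers decided was worth measuring.
reports = [
    {"id": 1, "reporter_follower_count": 50_000, "times_reported": 1},
    {"id": 2, "reporter_follower_count": 120, "times_reported": 9},
]

def priority_by_reach(report):
    # Framers decided the reporter's audience size is what matters.
    return report["reporter_follower_count"]

def priority_by_volume(report):
    # Framers decided the number of reports is what matters.
    return report["times_reported"]

print(max(reports, key=priority_by_reach)["id"])   # -> 1
print(max(reports, key=priority_by_volume)["id"])  # -> 2
# Both rules are applied consistently; the bias lives in the choice of variable.
```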


#11

Maybe I’m just incredibly cynical, but when it comes to companies like Nintendo and Valve, no one in mid-to-large gaming journalism is going to take a serious stand against them, because to do so is company suicide.

Using Waypoint as an example: if they decided that, going forward, they would no longer cover Nintendo or its games, or any games that were solely published on Steam or had anything to do with Valve, they would lose a ton of readers.

To put it simply, there are some companies that are too big to fight directly. The only thing that is actually going to hurt them is if they do something so disastrous that it gets dragged over into normal media.


#12

When machine learning, AI, and algorithms are only deployed for the sake of making more money, this sort of thing is the result.

I have no doubts all of these techniques will be used extremely effectively to try to sell people more lootboxes.


#13

I don’t think it’s necessary to abstain from Nintendo or Valve coverage in this case, just shed light on the stuff going on behind the scenes. The worst that happens is that Waypoint no longer gets pre-release support from them, but frankly I’m here for the post-release takes anyway. The preview cycle is practically useless at this point.


#14

This basically speaks for itself.


#15

I wish I was more surprised by this, but that’s pretty awful. I think what game makers don’t realize is that the mere existence of this stuff taints every other game with something similar, because we can never know how much they’re using techniques like this to increase revenue.


#16

This is all true, but it’s also missing one piece of the puzzle: very few users want to pay anything at all, and even fewer want to pay for curation.

It’s one thing to scream until you’re blue in the face that these companies are neglecting this, but we need to recognize that people will still demand free access to the platform, free posting, and free usage, while simultaneously complaining about ads and paywalls, and also that media isn’t curated well enough.


#17

I’ve been thinking a lot about Austin’s talk earlier this month, specifically about the example he gives of a factory where management and labor were effectively gaming each other: one for profit, the other for survival.

In a sense, every system can be boiled down to a game, or exploited like one. It’s just that we never think of them in those ways. In a world governed by systems you cannot control, the game is always going to be how to exploit and manipulate them for your own gain. There’s no real element of choice here: either you do so, or you don’t survive.

There’s an argument to be made that everything we see regarding algorithms now is not a bug, but a feature; the cost of bad PR pales compared to the upside of growth for these companies. There is no real financial incentive to do otherwise, and no regulatory incentives in sight.

I know that Austin said that success doesn’t look like success, but besides the most obvious, darkest possibilities, I’m not convinced that failure looks like failure either. The danger of AI and algorithms is not a future where robots nuke the planet – it’s one where every interaction in a game and reality is a loot box psychologically tuned to your specific preferences, to extract the maximum amount of revenue.

In a world governed by algorithms and AI, you will have to constantly fight against a system that thinks it knows you better than you know yourself. Welcome to the dumbest, laziest dystopia.


#18

I mean it’s probably not some actual database, just journalists talking to each other over DMs, Discord or something. It’s not like it’s a super big industry.

Honestly, I feel like selective curation is just incompatible with the type of mass-market/democratization of tools/services that companies like Valve or Google provide. People are going to abuse the tools they’re given, whether it’s ill-intended or not. I do think that Google can/should start monitoring videos produced by the channels they’ve partnered with, though.


#19

Maybe YouTube shouldn’t suggest content or promote anything. We have a subscribe button; we have watch later; we have friends who post videos to our Facebooks. Maybe we don’t need an algorithm to try to predict what it thinks we’ll like.
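
For what it’s worth, the recommendation-free version is almost trivially simple to describe (a rough sketch under my own assumptions, nothing like a real implementation): take the channels you subscribe to, show their uploads newest first, and stop there.

```python
# Rough sketch of a recommendation-free feed: subscriptions only, newest first.
from datetime import datetime

subscriptions = {"Mark Brown", "nerdwriter"}  # channels the user opted into

uploads = [
    {"channel": "Mark Brown", "title": "New video", "posted": datetime(2018, 2, 20)},
    {"channel": "SomeoneElse", "title": "Not subscribed", "posted": datetime(2018, 2, 21)},
    {"channel": "nerdwriter", "title": "New essay", "posted": datetime(2018, 2, 19)},
]

feed = sorted(
    (u for u in uploads if u["channel"] in subscriptions),
    key=lambda u: u["posted"],
    reverse=True,
)

for item in feed:
    print(item["posted"].date(), item["channel"], "-", item["title"])
# No prediction, no promotion: only what the user explicitly asked to see.
```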


#20

I threw up in my mouth a little. Actually a lot.