“Oh, your algorithm update is reducing sales and usage? Ship it!”

Photo: Solen Feyissa / Unsplash

“What you measure is what matters.” There are a number of similar quotes out there about how merely tracking a KPI at a company focuses employees on it, even before any explicit incentive structure is attached to the goal. For example, if boards care about ideals like diversity and culture, they should work with CEOs to make sure those stats are first-class citizens on company dashboards, right alongside sales and profits.

It’s even more difficult when you can’t agree on what the right metric should be. As I wrote earlier, one of the problems we face as an industry is measuring the current Web 3.0 with Web 2.0 dashboards. Misinformation, trolling, harassment, polarization and their downstream harms – none of them are as easy to define as CTR or CPM. I myself fell victim to this during my time running the consumer product team at YouTube. When Google’s leadership asked us to stop focusing solely on user growth and to build out monetization as well, one of the teams we defunded to staff the new effort was the one working on the comment system. Yes, YouTube comments, which at the time mostly amounted to insults, profanity, and worse. We all wanted them to get better, so why was this project such an easy sacrifice? Because it wasn’t tied to a first-order KPI such as revenue, uploads, or playbacks. So it had to wait.

But what if you have the right metric, for example one that measures a product’s negative externalities, but it turns out that the number is loosely inversely correlated with your business KPIs? For example, if polarizing content leads to more short-term engagement, which leads to more active users, which leads to more ad revenue? It’s not crazy to wonder about this, and while I don’t believe the correlation is that tidy, or that our social platforms are deliberately running along the efficient frontier of anger and profit, I do always wonder what margin pressures mean for, say, reasonable investment in trust and safety.
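To make that worry concrete, here is a toy sketch, in Python, of the check you’d want to run: whether a hypothetical externality score moves together with business KPIs across experiment arms. All metric names and numbers here are invented for illustration.

```python
# Toy sketch: does a (hypothetical) externality metric move together with
# business KPIs across experiment arms? All names and numbers are invented.
from statistics import correlation  # Pearson's r, Python 3.10+

# One entry per experiment arm: (polarization score, daily active users, ad revenue)
arms = [
    (0.12, 1_000_000, 50_000),
    (0.18, 1_040_000, 53_000),
    (0.25, 1_090_000, 57_500),
    (0.31, 1_120_000, 60_000),
]

polarization = [a[0] for a in arms]
dau = [a[1] for a in arms]
revenue = [a[2] for a in arms]

print(f"polarization vs. DAU:     r = {correlation(polarization, dau):.2f}")
print(f"polarization vs. revenue: r = {correlation(polarization, revenue):.2f}")
# If both values of r come out strongly positive, your externality metric is
# in direct tension with the KPIs the company is actually paid on.
```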

Casey Newton’s Platformer article about Facebook’s Responsible AI team provoked in me a mix of excitement and skepticism. My fear is that even if teams like this are genuinely able to study and challenge internally held beliefs about their products, they won’t be empowered to make changes that negatively impact business metrics. That is, we want responsibility, but only so long as it doesn’t endanger the share price. I don’t say this only (or even specifically) about Facebook, but more generally about the complexity of incentives within any company. And yes, it’s also true that companies already make decisions that balance user experience against monetization. During my time at Google and YouTube there were numerous experiments with ad load, placement, and so on, and the company never maximized for immediate revenue when doing so had a disproportionately negative impact on, say, user engagement or advertisers’ ROI. Long-term greedy, I guess, not short-term greedy.

But back to the question: how do you give a team like Responsible AI the ability to trade away dollars, engagement, or growth when it believes doing so has a positive impact on fairness, accountability, or whatever other metric it’s charged with managing? I have an idea: a budget.

Yes, a budget! A team like this should be empowered to spend up to an amount set annually. That doesn’t mean it has to spend all of it; indeed, many of the wins here may turn out to be revenue neutral or even revenue positive. But let the team make decisions consistent with its mandate without having to implicitly (or explicitly) defend why it’s causing the company to leave dollars on the table.
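As a thought experiment, here’s a minimal sketch of what such a budget could look like in code, assuming the simplest possible mechanics: the team logs the estimated annual revenue impact of each change against a pre-approved allowance. The class, the allowance, and the example launches are all hypothetical.

```python
# Hypothetical sketch of a "responsibility budget": the team debits the
# estimated revenue cost of each change against an annual allowance.
from dataclasses import dataclass, field

@dataclass
class ResponsibilityBudget:
    annual_allowance: float                     # dollars the team may "spend"
    ledger: list = field(default_factory=list)  # (change, revenue cost) entries

    def remaining(self) -> float:
        return self.annual_allowance - sum(cost for _, cost in self.ledger)

    def approve(self, change: str, est_revenue_impact: float) -> bool:
        """Record a change if its estimated revenue cost fits in the budget.
        A negative impact means the change makes money, so it costs nothing."""
        cost = max(0.0, est_revenue_impact)
        if cost > self.remaining():
            return False
        self.ledger.append((change, cost))
        return True

budget = ResponsibilityBudget(annual_allowance=25_000_000)
budget.approve("Demote borderline content in recommendations", est_revenue_impact=9_000_000)
budget.approve("Rate-limit abusive commenters", est_revenue_impact=-500_000)  # revenue positive
print(f"Remaining to spend this year: ${budget.remaining():,.0f}")
```

The point isn’t the accounting; it’s that the team never has to argue the nine million dollars away, only that the change fits its mandate and its allowance.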

Look, I know this is a strange concept, and it has all sorts of second-order effects: changes get made elsewhere in the company to recapture the “lost” revenue, which turn out to create other negative externalities. It reinforces the idea that fairness comes at the cost of revenue, and it may tempt other teams to abdicate their own “responsibility” and let that separate AI team just “fix” everything. Maybe we end up somewhere closer to carbon offsets, where each product team has to manage its own responsibility budget and there is a single internal market where responsibility credits can be traded. New challenges require new solutions, and in cases like these I think you need to manage the business anthropology, not just the business algorithms.
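And if you push the offset analogy to its conclusion, you get something like an internal market. Here’s a purely hypothetical sketch, again in Python, where each product team holds its own responsibility credits and can trade unused allowance to other teams:

```python
# Hypothetical sketch of an internal market for "responsibility credits":
# each team holds a balance and can trade unused allowance to other teams.
class ResponsibilityMarket:
    def __init__(self, allowances: dict[str, float]):
        self.balances = dict(allowances)  # team -> remaining credits, in dollars

    def spend(self, team: str, amount: float) -> bool:
        """Debit a team for a change that gives up revenue, if it has the credits."""
        if self.balances.get(team, 0.0) < amount:
            return False
        self.balances[team] -= amount
        return True

    def trade(self, seller: str, buyer: str, amount: float) -> bool:
        """Move unused credits from one team's balance to another's."""
        if self.balances.get(seller, 0.0) < amount:
            return False
        self.balances[seller] -= amount
        self.balances[buyer] = self.balances.get(buyer, 0.0) + amount
        return True

market = ResponsibilityMarket({"search": 5_000_000, "recommendations": 5_000_000})
market.trade(seller="search", buyer="recommendations", amount=2_000_000)
market.spend("recommendations", 6_500_000)  # can now fund a costlier fairness fix
print(market.balances)  # {'search': 3000000, 'recommendations': 500000}
```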
