The "Workslop" Discourse is Missing the Point
On the crisis of mattering at work
When an employee submits poor-quality AI-generated work and you have to fix it, or else send it back in some kind of awkward conversation, who’s to blame? Is it the coworker for being lazy? Is it AI for making it easy to submit lazy work? Is it your manager for not training people on the tools properly, or otherwise not modeling how to be deliberate with AI? The Harvard Business Review’s “workslop” article sparked this debate. But the data it cites points to something few people want to confront: this problem predates AI, and leadership can’t fix it.
The HBR article defined workslop as material that looks right but lacks substance. It costs companies an invisible tax of $186 per person per month in correction time, and it ultimately damages how employees think of their coworkers: people who submit workslop are seen as less creative, capable, reliable, trustworthy, and intelligent by their colleagues. The article distinguishes between “pilots” (high agency, high optimism) and “passengers” (low agency, low optimism, and the perpetrators of workslop) as a way of classifying whom quality AI content can come from.
The HBR article was done in collaboration with BetterUp, who published a separate report titled “Winning in the Age of AI” (sorry, this is the best link, and it does make you give your email to sign up for something), which isn’t really about AI. It’s about human performance. According to the report, human performance across sectors has been declining since 2019, across all three kinds of performance it tracks: basic performance, collaborative performance, and adaptive performance. Basic performance relies on focus, as it’s about task execution; collaborative performance relies on teamwork, as it’s about alignment and championing; and adaptive performance comprises creativity, connectivity, and cognitive agility. The decline began in 2019, before AI and before COVID, and it has coincided with a decrease in motivation, optimism, and agency. A quote near the end of the report helps quite a bit: “We can’t expect adaptive performance if people don’t believe their efforts matter”.
Why do people submit “workslop” as work? Is it because they’re lazy? That might be part of it, but the more likely answer is that workers are increasingly alienated from work that matters. The BetterUp data doesn’t explicitly answer this, but there is a telling pattern.
The issue is the application of industrial measurements of productivity to knowledge work. Metrics designed for factories and assembly lines, focused on the number of widgets churned out, have been grafted onto knowledge work. The difference is that knowledge work is inherently about insight, connection, and adaptation. Under the grafted industrial model, the metric for knowledge-work production becomes busyness (PowerPoint slides, reports that no one actually needs) rather than results. When success, especially in knowledge work, is measured by visible busyness and output volume rather than meaningful outcomes, workers will optimize for appearance. They become disconnected from work that matters because mattering isn’t being measured.
Workslop is the logical endpoint of this shift. Why invest cognitive effort in work that’s evaluated by whether it looks complete rather than whether it accomplishes anything? What better way to be done with the task than to offload it to AI? The same dynamic produces burnout: sustained effort directed at meaningless performance metrics rather than meaningful outcomes depletes workers while generating nothing of value. This systemic issue then leaves others to correct or otherwise manage poorly generated work; but the fault lies less with the person than with a workplace that makes meaningful effort hard to access.
To the credit of HBR and BetterUp, they try to get at the core of this. Both point to leadership as the way to overcome the problem. For HBR, it’s the leader’s responsibility to model deliberate AI adoption. In the BetterUp report, it’s up to leaders to “…express confidence in their employee’s abilities to manage AI” so that they “cultivate a stronger sense of agency”. Both miss the mark. They treat employees as a simple mechanism that will run properly if only a leader does things the right way, in the right sequence.
I’m left with unsettled questions: What would it look like for mattering to be defined by workers rather than from above? What if employees chose what to work on instead of executing work handed down to them? What if employees saw actual outcomes, like user impact or problem resolution, rather than optimizing for KPIs? These are the questions we avoid by arguing about AI tools or leadership instead of work structures.


The problem with talking about an unsettled topic is that it will generally leave others unsettled as well haha... There are a lot of key points here, which stem from the question of "why isn't everyone overachieving at their work?" The answers can vary a lot, and it's kind of interesting: getting people to work on things they genuinely care about does increase "productivity"; however, I think it's more of a correlation than causation. "Motivation" is a very fine line, and it's really easy to overfit to the two extremes (i.e. super-high-paying jobs where the work is marketed as "mattering more" and super-philanthropic jobs where people really care about making a difference). Both extremes have people closer to "things that matter", but it's only the second group that's actually motivated by meaningful work. It's debatable whether those in high-paying positions would continue to do their jobs if money were not involved.
My point for going down this rabbit hole is just that "value" is not a well-defined word in the present day (I'm sure it wasn't well defined in the past either, but I can't speak much about the before times haha..). People are definitely motivated by where they believe value lies, and I think until we answer that question as a society, the question of getting workers to "give it their all" will be equally elusive.
(Of course, if I completely missed the point in your post, that is fair as well haha..)