6 Comments
Rainbow Roxy:

This article comes at the perfect time. You've really nailed this. It makes me think of Pilates – it's so easy to just look like you're doing a pose perfectly, rather than truly engaging the core. Your point about signalling transformation over actual capability resonates. Frustrating, but your incentive analysis makes perfect sense.

Hodman Murad:

Love the Pilates analogy. The form is easy to copy, but the actual core engagement is invisible. And just like in fitness, faking it now means you'll fail when the real test comes. The incentive structure just makes theater feel safer in the short term.

Neural Foundry:

Nailed it on the performance review problem. The insight that invisible reliability work can't compete with launch metrics in quarterly reviews explains why so many teams end up in this trap. I've seen entire eng orgs optimize for demo-ready features while the monitoring infra quietly rots, and everyone's just playing the game rationally.

Hodman Murad:

That's the hardest part to accept. Everyone's making the optimal move within a broken game. The engineer who spends a quarter on monitoring infrastructure has nothing to show at review time, while their peer who shipped three features gets promoted. You can't blame individuals for that calculus.

Sam Illingworth:

Thanks Hodman, this is such a great reminder to those of us working in institutions and companies that are obsessed with virtue signalling rather than actually developing effective strategies and governance for how we should be using machine learning.

Hodman Murad:

The problem with virtue signaling is that it's often cheaper and faster than the real work, so it becomes the rational choice.
