As a result, it’s not surprising to me that many people assume that engineers and product designers have evil (or at least financially motivated) intentions. There’s an irony here, because my experience is the opposite. Most product teams have painfully good intentions, shaped by utopian visions of how the ideal person would interact with the ideal system. Nothing is more painful than sitting through a product design session with design personas that have been plucked from a collection of clichés.
CC BY 2.0-licensed image by Ruth Hartnup.
I’ve seen a lot of terribly naive product plans, with user-experience mockups that lack any sense of how or why people might interact with a system in unexpected ways. I spent years tracking how people did unintended things with social media, such as the rise of “Fakesters,” or teenagers who gamed Facebook’s system by inserting brand names into their posts, having realized that this would push their posts higher in the social network’s news feed. It has always boggled my mind how difficult it is for engineers and product designers to imagine how their systems might get gamed. I genuinely loved product work because I couldn’t help but think about how to break a system through unexpected social practices.
Think just as much about how a system might be corrupted, destroyed, manipulated, or gamed as about how to build the ideal version of it. Think about unintended consequences, not simply to stop a bad idea but to build resilience into the model.
As a developer, I always loved the notion of “extensibility” because it held out the ideal of building a system that could accommodate unimagined future development. Part of why I love the notion is that it’s bloody impossible to implement. Sure, I (poorly) comment my code and build object-oriented structures that allow for some level of technical flexibility. But, at the end of the day, I’d always end up kicking myself for not imagining a particular use case in my original design and, as a result, doing a lot more band-aiding than I’d like to admit. The masters of software-engineering extensibility are inspiring because they don’t just hold onto the task at hand; they have a vision for all sorts of future directions that may never come to fruition. That thinking is key to building anything, whether it be software, a campaign, or a policy. And yet, it’s not a muscle that we train people to develop.
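To make the extensibility ideal concrete: one common way to leave room for uses you haven’t imagined is a registry pattern, where new behavior is bolted on without editing the code that was written before that behavior existed. This is a minimal illustrative sketch, not anything from my own projects; all names here are hypothetical:

```python
from typing import Callable, Dict

# Registry of event handlers. The dispatch code below never has to change
# when a new kind of event is added later.
HANDLERS: Dict[str, Callable[[str], str]] = {}

def register(event: str):
    """Decorator that registers a handler function for an event type."""
    def wrap(fn: Callable[[str], str]) -> Callable[[str], str]:
        HANDLERS[event] = fn
        return fn
    return wrap

def dispatch(event: str, payload: str) -> str:
    # Unknown events fail loudly rather than silently, so unexpected
    # (or adversarial) inputs are at least visible.
    handler = HANDLERS.get(event)
    if handler is None:
        raise ValueError(f"no handler for event: {event!r}")
    return handler(payload)

@register("greet")
def greet(payload: str) -> str:
    return f"hello, {payload}"

# A use case the original author never imagined can be added afterward,
# without touching register() or dispatch():
@register("shout")
def shout(payload: str) -> str:
    return payload.upper()
```

The point isn’t the pattern itself; it’s that the dispatcher was written to survive futures its author couldn’t enumerate, which is exactly the muscle the paragraph above is describing.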
If we want to address some of the major challenges in civil society, we need the types of people who think 10 steps ahead in chess, imagine innovative ways of breaking things, and think with extensibility at their core. More importantly, we all need to develop that sensibility in ourselves. This is the hacker mindset.