Software is an instrument
On materials, philomathy, and the scarcity of caring
I loved this conversation between Max Schoening and Geoffrey Litt, two of Notion’s design leaders. Throughout the calm and composed exchange, Max & Geoffrey discuss at length the merits of “malleable software”, a phrase they seem to be contractually obligated to say in every public appearance. The chat is very interesting and feels like a rare peek behind the curtain of two very smart people talking about very smart things. It’s a joy to watch. What became clear after the hour-and-a-half conversation is that they both seem really passionate about two things: (1) learning and (2) software.
Learning
Now that AI is taking over knowledge work, I’m noticing AI companies (and their spokespeople) taking a stance in favor of the process by which we acquire knowledge, aka learning. Anthropic released a “thinking” cap that feels like a perverse tongue-in-cheek wink at the world, Cursor’s Head of Design urged us to “stay glass”, as an alternative to a black box that obscures our understanding of the code, and now Notion’s Max describes his joy for learning while Geoffrey talks about “Diff explainer,” an internal tool he made to better understand the codebase.
These companies seem to be drawing a line in the sand: it’s okay to lean on AI to do the busywork but not the “thinking” work, because thinking is worth it. To me, this feels like pacifists selling firearms. Yes, you can use AI tools in that way, but simply put: AI’s defining feature is thinking. Increasingly, it thinks real good. In some (most) topics, it thinks better than most. Beyond that, it does stuff. It does stuff when you’re tired of doing stuff and it does more today than it did last month. So, if you’re working on a tool that thinks and does more than most people can, it seems reasonable that most people will lean on it to think and do for them. Maybe not entirely…yet—but everyone’s watched this movie before and we can guess how it ends.
As I was listening to them nod aggressively to each other’s love for learning, I wondered if I loved learning too. Honestly, I don’t think I do? I mean, I love the feeling of knowing something. I also love discovery and exploration. I love finding the edges of things. But the key difference is that, for me, it’s only interesting if the knowledge on the other side pertains to people. Anything that gets me closer to some universal truth about how we live or who we are — that’s interesting!
I have very little interest in learning about subjects like physics, mathematics, or programming. Of course, you could probably make a very convincing argument that those are integral fields that relate to how we live and who we are. But I promise I’d zone out as soon as you started explaining why.
The thing I find so threatening about AI is that it seems to get us closer to the answers of any subject while skipping the learning part. I’m sure there will always be people who love learning and will bend AI to teach them something, but I can’t imagine that number of people isn’t shrinking over time.
Maybe this signals what could become the next status symbol: philomathy. In the conversation, Geoffrey made a passing reference to people working out and not bringing a forklift to the gym. I think he meant that we work out even though there are existing tools that are stronger than humans. So, why do we work out? I imagine he thinks that working out our minds will meet a similar fate: there will be smarter options out there, but being smart will continue to have value. Just like being strong has value today. A new kind of peacocking emerges.
Software
Man, these guys love computers! They read papers about computers, quote other computer people, and tinker away on (what I can only assume are) the loudest mechanical keyboards.
At one point Max scoffed at the idea that someone wouldn’t know who Bret Victor is, with such confidence that he would surely place him on Mount Rushmore along with Jesus and Michael Jordan. Thankfully, I watched a Bret Victor video about 15 years ago, so I smirked when I heard the name and learned I was part of the in-group.
Once again, this made me realize that perhaps I don’t like software. I mean, I love bending software to do things. But I have no interest in the software itself.
This may sound heretical since I’m a Software Designer, which of course makes software the material of my craft. But when it comes to the intersection of humanities and technology, I lean more towards the former.
Software has an irresistible quality: it’s infinitely scalable. Because it’s so easy to mold to our will and reproduce, compared to other materials, it has the ability to reach massive numbers of people. That’s what’s so compelling about it for me. It can serve as a bridge between me and the rest of the world.
I think of software less like a material and more like an instrument. Like Jimi Hendrix’s guitar, I have no qualms about lighting my software on fire if it means I can express an idea and connect with others in some way. In that sense, I’m closer to a musician than a luthier. Of course, a musician must be aware of their instrument’s capabilities, but that awareness is always in favor of the music, not the instrument.
—
After watching this video, I’m left with a deep sense of guilt and ambivalence. I feel like I should appreciate learning more. It seems like the right thing to do. It also feels irresponsible to design digital products while staying dispassionate about their material.
I’m happy Max and Geoffrey are thinking about it and expressing their thoughts so beautifully. This paradigm shift seems to be helping us understand what we care about and where we choose to spend our time.
For some reason, I still find myself resisting the idea of delving deeper into software, even though I’m now regularly opening the terminal, submitting PRs, and open-sourcing projects. Those feel like a means to an end: bringing us closer to our understanding of ourselves and each other.
So maybe that’s where I land: I don’t want to worship learning or software, I want to use them as props in a bigger, ongoing argument about what it means to be a person right now. If AI can think better and software can scale wider, then the scarce thing is the particular, messy way a human mind cares about something.
I don’t need to fall in love with code or “the craft” of learning to keep going; I just need to stay close to the questions that actually move me, and keep bending these increasingly alien tools toward that. If there’s any dignity left in this whole arrangement, it’s in insisting that the feathers stay ours: that even as the machines flap harder and higher, we’re still the ones deciding what’s worth flying toward.

