The Knox Papers
Foundational Essays on AI, Identity, and Human Systems
My flagship body of work: the essays where I build new models, reframe old assumptions, and introduce the architectures shaping how I see intelligence, identity, and the future.
AI doesn’t fail queer and BIPOC people because of “bias.” It fails because it cannot interpret our identities with accuracy, context, or cultural truth. This essay reveals the mechanisms behind that harm and introduces a new architecture to prevent identity collapse in the age of autonomous AI.
Most people use AI to write faster. I used it to learn who I was. Over the past year, writing with machines became a mirror that stripped away the noise, surfaced my patterns, and revealed a voice I didn’t know I had.
Amazon doesn’t have a convenience problem.
It doesn’t have a speed problem.
It doesn’t have an assortment problem.
Amazon has a discernment problem.
People don’t remember flawless. They remember feeling seen.
They remember the human who made the stress smaller, the journey lighter, the moment calmer. They remember the person who didn’t just process them, but acknowledged them.
Travel, at its core, is a human ritual. It is migration, reunion, escape, reinvention, obligation, hope. And any airline that forgets that truth — even unintentionally — begins to lose the one thing technology can’t replicate: emotional resonance.
This one was hard to write. Not because I’m mad. Because I still want the blanket.
For years, Target was the rare brand that made belonging feel casual. You’d walk in for toothpaste and walk out with a little affirmation tucked into your bag. Pride wasn’t seasonal — it was spatial. It lived in the aisles, woven into the colors, the tone, the quiet promise that this place sees you.
And then, 2023 happened. The backlash. The retreat. The silence.
When Target pulled its Pride collection, it didn’t just move product — it moved meaning. It told queer people what we’ve always half-suspected: our safety here was situational.
This isn’t about outrage. It’s about empathy debt — the emotional residue that builds when remorse never matures into reflection. You can’t pay it off with apologies. You pay it off with change.
For years, Nintendo has been the moral compass of interactive entertainment.
A company that taught the world that joy could be engineered, and that simplicity could feel transcendent.
They’ve spent decades proving that technology doesn’t have to exploit emotion to move people. It can honor it.
That’s why their recent stance on AI — a public commitment not to use it in game design — is so fascinating. Because in one sense, they’re right.
And in another, they’re missing their greatest opportunity yet.
I wrote AI, Craft & the Human Future of Retail because I’ve come to believe the real disruption of AI isn’t technical — it’s human. After years leading transformation work with Tredence and SymphonyAI, I watched automation accelerate faster than reflection, and intelligence scale faster than intention.
This paper is my attempt to restore that balance. It introduces The Knox AI Empathy System — a model and a mindset for designing organizations that think with care. It’s not about faster machines; it’s about wiser leaders.
Because in an age where intelligence is abundant, discernment has become the rarest skill of all.