The Dichotomy of AI: Building For Humanity While Embracing Technooptimism
An appeal to builders: AI might make us more productive, but that does not mean it will make us more human.
Fair warning: this month’s blog is long and deeply personal.
The Promise and Peril of Productivity
I’ve written before about my optimism regarding AI; its promise across enterprises, medicine, and countless other fields is truly remarkable. I use AI daily for tasks that feel genuinely transformative. My main use cases are around research, document formatting, light polishing and editing, and meeting recaps.
I work at a company that’s building AI with a responsible, human-centered, and creativity-focused approach, something I deeply believe in. I’m genuinely optimistic about what it can do.
A few random stats that show the incredible power of AI so far:
In AI-exposed industries, productivity growth skyrocketed to 27% between 2018 and 2024
AI cuts data analysis time by 50–60% and literature review time by 35–70% across scientific institutions
72% of research organizations now use AI to speed up research workflows, and 48% report significantly improved productivity
Incredible, hopeful, and all the other stuff we already say to hype AI.
That said, lately, on a more personal level, I’ve started to feel something else, something more challenging to name.
While AI has done all the amazing things listed above, it has done so with the input of human creativity, ingenuity, and wonder; it doesn’t replace them. Ergo, protecting and guarding our creativity, wonder, and human enlightenment is key to collective growth, regardless of the tech.
This entire conversation about AI and productivity prompts me to ask a fundamental question: What's the real purpose of learning, and, for that matter, what's the purpose of thought itself?
I genuinely believe that no matter how far technology takes us, the future will demand people who can combine serious technical skills with truly original thinking and creativity. Call them artists, multidisciplinary thinkers, polymaths, whatever you like.
It reminds me of something Confucius taught thousands of years ago:
“If a person can recite three hundred poems but is incapable of performing an entrusted official duty and exercising one’s initiative when sent abroad, what good are the many poems to that person?” (Analects)
He was essentially asking, what good is knowledge if you can't do anything with it, if you can't think for yourself? I think that sentiment is more relevant than ever.
Growing up in both the social and mobile revolutions, we were promised mass interconnectedness and productivity. And we got both, but with tradeoffs we never saw coming.
Mobile devices have created an always-on culture and, coupled with social media, have in some cases led to increased depression, loneliness, and other knock-on problems despite these gains.
Despite this incredible AI promise, and I’m confident it will make us more productive, I’m increasingly concerned about what we might be sacrificing. The mobile and social go-go years felt optimistic too, and builders focused on more. More of what?
At the time, it was unclear. We optimized apps to take up more and more of our time while promising productivity and learning, and I’m not sure I’ve found where the diminishing returns set in for myself.
This leads me to three key questions:
What is all this AI-led productivity truly for?
What tradeoffs are we willing to accept in exchange for it?
What is our end goal?
TLDR: My central point here is that we must shape AI and our future products meaningfully and purposefully for the good of humanity. A definition impossible for 8.2 billion people to agree on, but I will make a case anyway.
We continue to build systems that promise more leverage, greater output, and increased speed. But I keep asking myself, is faster and more productive always better? What are we gaining?
What Have We Learned?

Nearly 20 years ago, Mark Bauerlein published The Dumbest Generation, arguing that the digital age was eroding young people’s capacity for critical thinking, deep reading, and intellectual curiosity. That was in the early days of smartphones and social media. The warning signs were already there: shorter attention spans, a decline in real-world conversation, and the replacement of reading and reflection with entertainment and stimulation.
This doesn’t even begin to get at the adverse effects of social media on children and teens.
Nearly two decades later, many adults, myself included, still struggle with those same challenges. We skim instead of reading. We react instead of reflecting. And now AI arrives, not to reverse those habits but potentially to accelerate them.
I have to intentionally set aside time to read books, and I mostly do so with physical copies because holding a real book helps me reclaim the concentration habits I once had but have lost.
I strive to keep my creativity, wonder, and thinking.
Consider the Tradeoffs
So, what can we learn from the mobile and social revolution? Every wave of technology brings tradeoffs.
We’re at an inflection point. We’re building tools that can multiply human productivity, but are we also unintentionally building habits that disconnect us from ourselves?
Anecdotally, I’ve noticed it’s increasingly difficult to discern what people mean as more and more communication is generated by AI. You lose the essence of a person’s character, creativity, and meaning. When everything sounds the same, we have no orchestra. We have words, pretty words at that, but with no inner meaning, and that’s not how great music is made.
Given what we have learned, does it make sense to continue building products for maximum attention, or is the better road to build for maximum creativity?
The Human Cost: Distraction, Discomfort, and Creativity
And yet the number one consumer use case for AI, and a rapidly growing one, isn’t productivity at all. It’s companionship and advice about mental health.
To emphasize, AI’s most prominent consumer use case is providing emotional support and a sense of connection, especially for people experiencing loneliness or social isolation.
Don’t get me wrong, people finding help for their mental health challenges is a valuable thing. I myself got sober 17 years ago, and it changed my life, so if you need help, there’s help out there. However, while AI has many valuable uses, I don’t believe that replacing or accelerating the option to avoid human connection is one of them.
As long as we are a bunch of humans doing meaningful human work here in the world, emotional intelligence, creativity, connection…matter…the most.
Sherry Turkle cautions that technologies designed to offer emotional support, such as chatbots, risk taking the place of genuine human relationships rather than helping to foster them.
Dr. Anna Lembke talks about how the brain adapts to constant reward. The more we reach for quick pleasure, whether scrolling, bingeing, or optimizing, the harder it becomes to tolerate discomfort.
The result is a generation that is more distracted, more anxious, and often less equipped to sit with difficult emotions or face challenging situations.
But discomfort is exactly the fire in which creativity is forged.
In The War of Art, Steven Pressfield writes,
"Resistance will tell you anything to keep you from doing your work. It will perjure, fabricate; it will seduce you. Resistance is always lying and always full of s***."
If we don’t sit with the resistance, and instead reach for a tool that generates the thoughts for us and lets us avoid it, I fear we lose the process by which the best things come about.
AI often feels like it trains us to expect more output with less consideration, but that’s not how great, new, and genuine ideas or creativity emerge.
I see this in my own parenting as well. Dr. Becky Kennedy often reminds us that children don’t need protection from pain; they need help feeling safe while moving through it. Yet we live in a culture that avoids discomfort at all costs. We shield, smooth things over, and offer quick fixes, not because we are bad parents, but because the world around us nudges us in that direction.
We tell ourselves AI will save us time. But if we haven’t learned to be present with discomfort, if we haven’t learned to think deeply or wrestle with meaning, then more time may only mean more avoidance, more stimulation, more disconnection from the complex but beautiful parts of being alive.
Output over everything feels like a scary future, especially if we start to sacrifice the beauty of our own input.
If we haven’t learned to sit with discomfort or think deeply, more free time may mean more avoidance, not more presence. True creativity requires discomfort and deep thought.
To borrow from Socrates,
“The unexamined life is not worth living.”
Outsourcing Thought and the Challenge Ahead
At the same time, I notice something else in myself: a slowing of original thought, a dependency on output. Even my writing, which has always been a grounding practice, feels harder to access.
We now live in an age where wealth, comfort, and free time are more abundant than ever before. And yet many people report feeling more stressed, less connected, and often unsure what it is all really for.
According to a recent Pew Research Center study on emotional well-being, nearly 50% of adults in the U.S. say they often feel lonely or isolated, and a majority report experiencing significant stress and worry daily.
As a parent and a technology builder, I look at my children and ask, What kind of inner world am I modeling? What kind of future am I helping to create?
Fortunately, I started a meditation practice almost 20 years ago. I have practiced more or less daily since. It has been a lifeline, a way to return to stillness, to see what is going on inside.
But even with that, I still feel myself sliding. I still find it easier to reach for my phone than to sit with uncertainty. I still catch myself avoiding the very discomfort I try to help my children face.
The larger point here is that productivity and outsourcing the mundane and repetitive are great, and that seems to be the next promise of agentic AI. The point of all that, it would seem, is to allow us humans to do more strategic, creative, and meaningful work.
But that work depends on our ability to be strategic, creative, and meaningful people.
AI will not solve this tension for me or anyone else. It might make it easier to avoid asking these hard questions altogether. Emotional resilience has always been the key to growth for me.
The point is not just to be more productive for productivity’s sake. It’s to get better at living with meaning and purpose. The point of having more time and resources is to be more human.
An Appeal to Builders
There is a line in Infinite Jest I often think of:
“The truth will set you free. But not until it is finished with you.”
We are staring at uncomfortable and inconvenient truths about how we think, how we parent, and how we live. Not to mention what products we build and how we build them. AI will not save us from these questions. But it can push us further into distraction and false comfort if we are not careful.
Some of what I’ve shared might feel taboo in an industry that often celebrates AI with unwavering optimism. But I feel compelled to be honest.
I hold two ideas simultaneously (the dichotomy of AI, if you will): that technology itself is neither inherently good nor inherently bad, and that we as humans must actively guide it toward one or the other if we want a particular outcome. Call it technooptimism with a dash of practical navigation.
Technology can and should be a force for good, built with humanity at its core. That is a belief I will defend wholeheartedly.
Let’s choose to build the future we want, purposely, meaningfully, together. Let us shape technology, not let it shape us. That is the role of all of us builders.
For me, that means spending more time outside. Nature helps me slow down and be present beyond screens and a constant pinging. I have an orange sticky note on my monitor that reads, “GET OUTSIDE!”
It means resisting the endless consumption of content and distractions. Instead, I want to use the time and productivity AI offers to focus on less noise, more reflection, creativity, and meaningful connections.
These are the things I want to cultivate for myself and my family.
Because our minds, and more specifically our wisdom, will always be our greatest assets.
If technology can free us from busywork and allow us to live more fully, that would be real progress.
This is the challenge ahead. It’s not just about what we build, but who we choose to become.
This moment will be a turning point, but only if we are willing to stop, to feel, and to choose something purposeful.
Not only more efficient but more human.
Growth Corner
In keeping with this month’s theme, I’m sharing a sample of books I’ve read that have helped me think deeply and creatively, in no particular order of importance.
Infinite Jest by David Foster Wallace. Protip: If you choose to read this, I suggest not going to every footnote. Also, this novel is a journey to read, but it is worth it.
Bonus: DFW’s commencement speech recording, “This Is Water.” If nothing else, listen to this today.
The War of Art: Break Through the Blocks and Win Your Inner Creative Battles by Steven Pressfield
Reboot: Leadership and the Art of Growing Up by Jerry Colonna
Writing this month’s blog made me think of my favorite Tom Petty song. Enjoy ❤️