No one specifically asked for an iPhone. Yet, here we are. We all have one (or some form of it) – likely within arm’s reach of us at this very moment.
This is the result of technological innovation, plain and simple. And we as people have always just gone along with whatever gets created in our lifetimes: automobiles, the CD player, HD TVs, gas stoves, microwave ovens, you name it.
But is it really progress? In some cases, you could argue, most certainly. In others, it’s more of a convenience. Do we really need it? And, deep down, is it even good for us?
Don’t get me wrong.
I love that I can pull out my phone, open a map app, and instantly find my way in a new city, no matter how winding the roads or confusing the neighborhoods. And you know, being able to text my friends my exact location so we can meet up—even in a place none of us has been before—still feels a bit like magic to someone who grew up when paper maps and hand-written directions were the only options. Admittedly, I also love that I didn’t grow up in the Stone Age, and that I’ve had access to the medicines, nutrition, and 21st-century knowledge that allow us to live much longer, healthier lives.
But something is off with the technology being created today. More and more, it seems, we’re being pushed into a world where tech isn’t really helping us all that much – at least not broadly. And it may be extracting a steeper cost than we have traditionally acknowledged.
We Never Question Our Unbending Faith in Technology
For years, I’ve heard all the usual business figures, tech entrepreneurs, and government leaders talk endlessly about the need to be on the cutting edge of technology – you know, to “stay competitive.” That’s been the American mindset for decades, right? It’s not just AI companies like OpenAI or Google. Even the thoughtful Reid Hoffman, co-founder of LinkedIn, and countless other leaders in business and government repeat it like a mantra, with unquestioning faith.
But what does staying competitive really mean today? Is it a race to create the next algorithm, the next shortcut, the next disruption? It’s couched in this idea of societal wealth creation, but more and more it’s a very narrow group that’s benefiting. Sure, the technology founders will make billions. But what do the rest of us collectively gain from this? Today, we rarely pause to question the wisdom of endless innovation. Stay competitive… so we can do what exactly? If we never stop to account for the cost, at what point are we accelerating toward our own undoing?
And look, I’m certainly not the first to propose this, but we either have our true reckoning right now or it fast becomes too late.
Why do I suggest this? How about the following:
Point No. 1: Our Sedentary Lives and Declining Health
It’s hard to argue that the rise of technology hasn’t dramatically increased sedentary lifestyles. We walk less and move less. After all, we don’t take a quick stroll to the store. We don’t even bike. We drive. Or, now, we just get things delivered to us! And for years, none of us has even had to get up to change the TV channel like we used to!
And we are paying the price. Our obesity rates? 40.3 percent of Americans are now considered obese. Not just overweight. Those are obesity rates. Throw in our rates of cardiovascular disease (coronary heart disease accounted for 39.5 percent of cardiovascular deaths in the most recent data, from 2022), and the impact of technology alone on our physical health isn’t so pretty.
And what about mental health? Kids and adults alike spend hours in front of screens, often exceeding recommended limits. Even our interactions—once rooted in shared physical experiences—are usually virtual now, reducing real community engagement. In one recent poll, 57 percent of Americans considered themselves lonely. Let’s say that again: 57 percent!
Is technology to blame for all of this? Perhaps not. Could we attribute a large chunk of it to technology? There’s probably little doubt about that.
Point No. 2: Social Media (Division by Design)
I remember when Facebook first came out. It was a great way to connect with people with whom you may have lost touch. You know, those people from high school or college or even former work colleagues. In some cases, it even spurred in-person get-togethers. And yeah, it still does this to some extent. But the reality is that social media has simply become one thing: an outrage and anxiety machine.
All the major platforms’ algorithms are engineered to maximize engagement, often by stoking outrage, fear, and divisiveness. Wanna know why we’re so close to the brink of civil war in this country? It’s a machine programmed to make you hate someone else.
We’re all now funneled into echo chambers, fed a diet of rage and mistrust specifically tailored to our profiles. Fantastic, right?
We don’t discuss. We argue. We don’t share. We insult. Instead of fostering real connection, we’re now retreating into our own online worlds, where genuine relationships with flesh-and-blood human beings are becoming more and more rare. Thanks, Mark Zuckerberg.
Point No. 3: Thanks to Technology, ‘The Haves’ Have Prospered
If you want to see the ultimate arena for survival-of-the-fittest Darwinism, look no further than the technology industry. While you could certainly argue that past advances in tech were the rising tide lifting all boats, a lot of recent innovation has been purely about extracting or saving the maximum amount of money at the expense of employees, partners, and customers. It’s not about democratizing prosperity; it’s about magnifying the divide between the haves and have-nots.
Then there’s the “digital divide” itself, which places those without access to modern tech, whether by geography, income, or circumstance, at a growing economic, educational, and social disadvantage. And the more the tech giants consolidate power, the more they control culture, politics, and opportunity itself.
Point No. 4: Technology Means You Don’t ‘Kick Back and Relax’
I remember what many tech entrepreneurs predicted at the start of today’s modern Internet in 1994: people would only have to work four-day weeks! It seems almost a joke 30-plus years later.
Tech was supposed to make life easier. Instead, always-on communication, remote work, and digital collaboration have removed boundaries from the workday for millions. Burnout is soaring; vacations and downtime shrink as expectations grow to keep working, responding, producing, no matter the hour.
Have you seen the “996” routine that’s recently been adopted by workers in Silicon Valley? It started in China: the idea that people who work for startups or tech companies must now put in the hours of 9 a.m. to 9 p.m., six days a week. Again, wonderful.
In essence, we traded our 9-to-5 workday for relentless connectivity – to the job. Thanks, Marc Andreessen.
Point No. 5: Learning Without Learning
Ah, just “Google it.” Or maybe now it’s “just ChatGPT it.” Why bother learning anything today when the world’s smartest librarian is right at your fingertips to look it up for you? And because we no longer have to learn or struggle for knowledge ourselves, many of us never develop the essential skills: critical thinking, perseverance, memory recall. In essence, you never get to learn how to learn.
OK, you ask: What’s the downside of this?
I think the biggest loss is personal. For example, have you ever been in a conversation where you had to think on your feet, weigh the options, or come up with a witty comeback? Maybe you were on a date. What happens when you’ve never had to learn anything, never had to think anything through? You’ve got nothing. You have one chance in that fleeting moment, and because you’ve never had to put your mind to something, you come up empty. The cost of that moment alone could be life-changing.
Point No. 6: The Coming Revolution Will Be a Huge Drain on Energy and Water Resources
Let’s set aside the rationale for AI until the next point. Artificial intelligence relies heavily on data centers—vast, energy-hungry complexes filled with thousands of servers performing immense computations 24/7.
These data centers are the backbone of AI’s rapid growth, but come with ridiculous energy and water demands that are already straining resources and sparking local outrage.
Experts estimate that AI currently accounts for just 5 to 15 percent of data center power consumption, but with AI’s expansion, that share could rise to 35 to 50 percent by 2030. In the United States, Deloitte projects that power demand from AI data centers could grow more than thirtyfold by 2035, soaring to 123 gigawatts—roughly equivalent to the output of 123 large power plants.
Of course, it’s not just energy usage. That energy also spews more carbon into the atmosphere. According to a 2025 CarbonBrief analysis, data centers worldwide are responsible for about 2 percent of global carbon emissions, a figure expected to rise as demand for cloud computing and AI processing grows. In the U.S. alone, data centers consumed roughly 70 billion kilowatt-hours of electricity in 2023, producing an estimated 50 million metric tons of CO2—comparable to the emissions of approximately 10 million cars on the road.
10 million cars.
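If you want to sanity-check that comparison yourself, here’s a quick back-of-envelope sketch – my own rough math, not CarbonBrief’s. The 4.6 metric tons of CO2 per car per year is the EPA’s commonly cited figure for a typical passenger vehicle; everything else comes from the numbers above.

```python
# Back-of-envelope check of the figures above (rough math, not from the report).
kwh_2023 = 70e9      # reported U.S. data center electricity use in 2023, in kWh
co2_tons = 50e6      # reported emissions, in metric tons of CO2
tons_per_car = 4.6   # EPA's estimate for a typical passenger car, per year

cars = co2_tons / tons_per_car            # ~10.9 million cars
kg_per_kwh = co2_tons * 1000 / kwh_2023   # ~0.71 kg CO2 per kWh implied

print(f"{cars / 1e6:.1f} million cars, {kg_per_kwh:.2f} kg CO2/kWh")
```

The comparison holds up, give or take: about 10.9 million cars’ worth of emissions, implying a grid mix of roughly 0.71 kilograms of CO2 per kilowatt-hour.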
The water consumption side is equally grim. Data centers use millions of gallons of water daily to cool their servers and prevent overheating. Total U.S. data center water use tripled from 2014 to 2023, reaching about 17.4 billion gallons annually—roughly the yearly water use of well over 100,000 American homes. For instance, in our own backyard, Loudoun County, VA is home to roughly 200 data centers, and water usage by those facilities runs about 900 million gallons per year! Indeed, a single large data center can consume up to 5 million gallons per day, equivalent to the water use of a town of 10,000 to 50,000 people.
And the water itself in some of those areas? Undrinkable, say the residents.
This growing demand has sparked fierce community opposition. Voters and local officials have repeatedly blocked or delayed data center projects over concerns about excessive power use and water depletion; over $60 billion worth of projects have been halted or postponed by this backlash, which cuts across partisan lines. The Tucson City Council, for example, unanimously rejected a massive new Amazon data center in August, citing environmental and resource concerns. Communities in Virginia and Maryland have likewise rallied against proposed developments, fearing permanent damage to local infrastructure, water supplies, and quality of life.
Not convinced yet? How about the next point:
Point No. 7: AI Means Job Loss, and Potentially Worse
You’ve seen all the recent layoffs at the big companies – Microsoft, Google, Amazon, Salesforce, ServiceNow. They’re happening across the board. And there’s one reason: AI. Those companies are using AI for what it’s really meant for – saving money. Yes, companies simply don’t need what ServiceNow CEO Bill McDermott called “soul-crushing jobs.” (But aren’t those still paying jobs, Bill?)
How about recent college grads who haven’t been able to find work? The New York Times recently declared this group “the long-term unemployed.”
I’ve made the argument before: AI isn’t equivalent to the Industrial Revolution. It’s transforming work with even greater speed and impact, shedding countless jobs in fields ranging from manufacturing and truck (or Uber) driving to journalism, finance, and customer service. And unlike previous technological advances, the rapid displacement wrought by algorithms leaves little time for anyone to adapt.
What’s worse is that, as AI creeps further into decision-making, not only will jobs be lost, but vital human judgment, empathy, and creativity will be replaced by automated logic that no one fully understands. It’s this specter of uncontrolled AI, superintelligent, unregulated, poorly understood, and potentially existential, that experts suggest could wipe out humanity.
Let’s say that again – wiping out humanity. Even if that’s only a small risk, it seems like that’s a risk worth trying to mitigate, no? Yet, here we are, continuing like it won’t happen.
Conclusion
Like all advances in technology, the iPhone arrived without anyone asking whether we wanted it. Is it nice to have one? Of course. But did I live without it before? Yes. And arguably better than I do today. Has the economy expanded because of it? Sure. Is it a slam dunk that the iPhone has been an overall net positive for humans? Hardly.
The real question is: at what point does technology stop being a balancing act, with the scale tilting more and more away from what’s good for humanity? I mean, does humanity really need this AI revolution?
You know, it’s funny: you can already hear our elites, our current oligarchy of billionaires, pushing back on any slowing of technological innovation. “Technology is the backbone of progress!” they’ll say, citing history.
But they’re not really thinking about you, the average person, are they? They, after all, stand to make billions in the new world. Moreover, I’d argue the past isn’t prologue in this case. And if we’re really looking at the writing on the wall, history hasn’t prepared anyone for what’s coming, especially the human consequences of this revolution.
For that reason, isn’t it time we started asking: When is enough enough?