March 30, 2026
Daniel Goleman is the author of the international best-seller Emotional Intelligence and of Optimal: How to Sustain Personal and Organizational Excellence Every Day. He is a regular contributor to Korn Ferry.
Compared to humans—who have limited energy alongside basic needs for sleep, water, and connection—artificial intelligence represents something categorically different: a tireless capacity for constant activity. A recent Fortune article spotlighted the rise of “always-on” AI agents: systems designed to execute tasks autonomously, monitor workflows, and continue working long after their human counterparts have logged off. The pitch is compelling: What if your organization could move while you sleep? What if productivity were no longer limited by human need?
One firm projects that by 2027, 50% of companies currently using generative AI will have deployed autonomous agentic AI. These are systems that don't just assist with tasks but complete them end-to-end, without human prompting.
But here's what we know about people: the pressure to perform and produce is already prone to overtaking their work lives. As AI systems grow more autonomous, that pressure will only intensify.
This is where emotional self-regulation, one of the core competencies of emotional intelligence, becomes essential. Emotional self-regulation is often misunderstood as a form of suppression: a dampening of emotion, urgency, or even voice. But in practice, it is really about balance: the ability to feel urgency without becoming governed by it; the ability to keep disruptive emotions and impulses in check in order to maintain effectiveness under stressful or even hostile conditions; and the capacity to stay internally steady even when external pressure accelerates.
In the age of always-on AI, this ability may determine organizational survival.
The greatest risk isn't that AI “gets it wrong,” but that leaders are too overwhelmed or moving too fast to notice. Under pressure to produce and compete, leaders are vulnerable to scaling prematurely, automating before guardrails are in place, or deploying systems before the human infrastructure (governance, accountability, escalation pathways) has been built to support them. One study found that while 92% of companies plan to increase AI investment over the next three years, only 1% of leaders describe their companies as mature in deployment, meaning that AI is integrated and can drive meaningful outcomes. The gap between investment and readiness is already enormous. What fills that gap, too often, is an overwhelming sense of urgency that masquerades as a sound way forward.
Because emotions are contagious, particularly from leaders outward, a leader’s internal state becomes an organizational force. It’s been observed again and again: The most powerful person in any room has an outsized emotional impact on everyone else in it. When that person is operating from fear or reactive urgency, those states radiate outward, clouding judgment and productive dialogue. In measured doses, urgency is productive. But when it becomes chronic or outsized, it narrows thinking, erodes judgment, and leads to poor decision-making.
By contrast, the more measured and deliberate the leader, the more likely the organization is to stabilize. That stability is essential, not just for everyone’s overall wellbeing, but also if companies are to think critically, ethically, and collaboratively about AI adoption.
Despite its promise, always-on AI does not, as of now, eliminate human responsibility. Instead, it amplifies it. These systems still require oversight—ethical guardrails, accountability structures, and clear escalation pathways. They require grounded judgment about where automation creates value and where human discernment remains essential.
Leaders who lack emotional self-regulation tend to interpret slower adoption as failure. They push harder, accelerate rollouts, and mandate use before people are ready. Leaders with emotional balance read the same signals differently: they recognize hesitation as information, seeing it as a cue to examine risk, close gaps, and strengthen the system before overwhelming it with new technologies.
The scarcest resource in this next phase of AI adoption may not be innovation or infrastructure. It may be the capacity—especially among leaders—to slow down, examine motivations, and ask the questions no one else is asking.
When systems accelerate, leaders must decelerate. That is not a weakness. In the current moment, it may be the most sophisticated strategic move available.
Co-written by Elizabeth Solomon