The Automation Experiment That Almost Broke Us
May 10, 2024 · 841 words
Published by Steven Delaney
I've always been fascinated by automation. The idea of having systems that could handle routine tasks without human intervention seemed like the perfect solution to many of the challenges we faced as an MSP. So when I heard about a new automation platform that promised to revolutionize our operations, I was immediately interested.
The platform was impressive. It could automate user provisioning, password resets, software installations, patch management, and dozens of other routine tasks. The sales rep painted a picture of a fully automated MSP where technicians could focus on strategic work instead of routine maintenance.
I was sold. We implemented the platform across all our clients, and I was excited to see the results.
The Initial Success
The first few weeks were amazing. Tasks that used to take hours were completed in minutes. Our technicians had more time for strategic projects. Clients were impressed by how quickly we could respond to requests. The automation was working exactly as promised.
I was so excited that I started looking for more things to automate. User onboarding, software updates, security scans, backup verification, performance monitoring. If it could be automated, we automated it.
The Cracks Begin to Show
But then things started to go wrong. Not dramatically, but in small ways that began to add up:
Lost Context: Automated processes completed tasks without understanding the broader context. A password reset that should have been flagged for security review went through automatically (a sketch of the missing check follows this list).
Reduced Oversight: We were catching fewer issues because automated processes didn't have the human judgment to recognize when something wasn't quite right.
Client Confusion: Clients started receiving automated responses that didn't match their actual needs or situations.
Team Disconnection: Our technicians were losing touch with client systems because they weren't interacting with them as much.
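To make that first failure concrete, here's a minimal sketch of the kind of context check our platform lacked. This is hypothetical Python, not the platform's actual API; the ResetRequest fields, the should_escalate thresholds, and handle_reset are all illustrative assumptions about what "flag for security review" could look like in code.

```python
from dataclasses import dataclass

@dataclass
class ResetRequest:
    """A password-reset request, with the context our automation ignored."""
    username: str
    source_ip: str
    is_privileged: bool        # admin or service account?
    recent_failed_logins: int  # failures in the last hour
    requested_after_hours: bool

def should_escalate(req: ResetRequest) -> bool:
    """True when a reset deserves human security review instead of
    automatic processing. Thresholds here are illustrative."""
    return (
        req.is_privileged
        or req.recent_failed_logins >= 3
        or req.requested_after_hours
    )

def handle_reset(req: ResetRequest) -> str:
    if should_escalate(req):
        # Hand off to a technician; don't touch the account automatically.
        return f"ESCALATED for review: {req.username}"
    # Routine case: safe to automate.
    return f"AUTO-RESET completed: {req.username}"

# A 2 a.m. reset on an admin account should never sail through:
req = ResetRequest("jdoe-admin", "203.0.113.7", True, 0, True)
print(handle_reset(req))  # ESCALATED for review: jdoe-admin
```

The point isn't the specific thresholds; it's that the platform we bought had no equivalent of should_escalate at all. Every request took the happy path.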
The Breaking Point
The real wake-up call came when we had a major security incident that our automated systems completely missed. A client's network was compromised, but because the attack didn't trigger any of our automated alerts, we didn't discover it until days later.
The client was understandably upset. "I thought you were monitoring our systems," they said. "How could this happen without you knowing?"
That's when I realized the problem: we had automated the monitoring, but we had also automated the thinking.
The Reckoning
The security incident forced me to take a hard look at our automation strategy. I realized that we had made a fundamental mistake: we had focused on automating tasks without considering whether those tasks should be automated.
Some tasks are perfect for automation. User provisioning, software updates, routine maintenance. These are repetitive, well-defined processes that don't require human judgment.
But other tasks need human oversight. Security monitoring, client communication, problem diagnosis, strategic planning. These require context, judgment, and relationship understanding that automation can't provide.
The New Approach
We didn't abandon automation entirely, but we became much more selective about what we automated:
Automate the Routine: We kept automation for repetitive, well-defined tasks that don't require human judgment.
Humanize the Critical: We brought back human oversight for tasks that require context, judgment, or relationship management.
Hybrid Approach: We developed hybrid processes where automation handles the routine parts and humans handle the exceptions and edge cases (see the sketch after this list).
Regular Review: We implemented regular reviews of our automated processes to ensure they were still appropriate and effective.
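Here's a rough sketch of the hybrid pattern we settled on, using user provisioning as the example. Again, this is illustrative Python under assumed names (NeedsHuman, run_hybrid, and the department whitelist are mine, not our tooling's): automation takes the routine path, and anything it can't classify confidently lands in a human queue instead of being guessed at.

```python
from typing import Callable, Optional

class NeedsHuman(Exception):
    """Raised when a task falls outside what automation should decide."""

def provision_user(task: dict) -> str:
    """Automated path for the routine case."""
    if task.get("department") not in {"sales", "support", "engineering"}:
        # Unknown department: an edge case we deliberately don't guess at.
        raise NeedsHuman(f"unrecognized department: {task.get('department')}")
    return f"provisioned {task['username']} in {task['department']}"

def run_hybrid(task: dict,
               automated: Callable[[dict], str],
               human_queue: list) -> Optional[str]:
    """Automation handles the routine parts; exceptions go to a person."""
    try:
        return automated(task)
    except NeedsHuman as reason:
        human_queue.append((task, str(reason)))  # a technician picks this up
        return None

# Usage: the routine request is automated; the edge case lands in the queue.
queue: list = []
print(run_hybrid({"username": "asmith", "department": "sales"},
                 provision_user, queue))
run_hybrid({"username": "bjones", "department": "legal"},
           provision_user, queue)
print(queue)
```

The design choice that mattered was making "I don't know" a first-class outcome: the automated path raises rather than guesses, and a person handles the exception.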
The Lessons Learned
The automation experiment taught me several important lessons:
Automation is a Tool, Not a Solution: Automation can make processes more efficient, but it can't replace human judgment and relationship management.
Context Matters: Tasks that seem routine might actually require context and judgment that automation can't provide.
Balance is Key: The best approach combines automation for efficiency with human oversight for quality and client relationships.
Regular Evaluation: Automated processes drift. A workflow that made sense at rollout can quietly stop matching reality, so it needs scheduled review, not set-and-forget trust.
The Results
The new approach worked much better. We kept the efficiency benefits of automation while maintaining the quality and relationship benefits of human oversight. Our clients were happier because they were getting both fast service and personal attention. Our team was more engaged because they were doing meaningful work instead of just managing automated processes.
The Broader Lesson
The automation experiment taught me something fundamental about technology and business: just because you can automate something doesn't mean you should. The goal isn't to eliminate human involvement; it's to optimize the balance between efficiency and effectiveness.
The Bottom Line
Looking back, I'm glad we went through the automation experiment, even though it was painful at times. It taught us valuable lessons about the role of technology in service delivery and the importance of maintaining human relationships in a technology-driven business.
Today, we use automation strategically, not comprehensively. We automate what makes sense to automate, and we keep humans involved where human judgment and relationships matter most.
The result? We're more efficient than we were before, but we're also more effective. And that's the balance that really matters.
What's your experience with automation? Have you found the right balance between efficiency and human oversight? I'd love to hear your stories.

Steven Delaney
MSP Industry Expert • Houston, TX
Strategic insights and practical guidance for the modern Managed Service Provider.