
Upskilling vs Terminating: The 60-Day Assessment Framework

Your team's skills are obsolete. You have 60 days to decide: invest in upskilling or cut your losses. Here's the assessment framework that actually works.

Three months ago you gave someone a project. They're still "assessing the current state" and haven't delivered anything.

Meanwhile, someone else knocked out the same work in a week using AI, iterated based on feedback, and moved on to the next thing.

That gap tells you everything you need to know about who survives and who doesn't.

When AI changes the fundamental nature of marketing work, most of your team's skills become obsolete overnight. You face an uncomfortable choice: Invest time and money in upskilling people who might never get there, or cut your losses and find people who already operate at the speed you need.

You have about 60 days to figure this out. Here's how to make the call.

The Core Question

Can they get there fast enough?

Not "can they eventually learn this." Everyone can eventually learn anything given unlimited time and patience. You don't have unlimited time. Your competitors are already moving.

The question is whether someone can upskill fast enough to justify keeping them versus replacing them with people who already have these capabilities or building AI systems that make the role unnecessary.

Fast enough means weeks, not months. If someone needs six months of training to do what you need today, they're not getting there fast enough. The market is moving too quickly. By the time they're competent, the requirements will have changed again.

Look for self-directed learning velocity. People who already started teaching themselves AI tools without being told. People who experiment on their own time. People who bring you solutions instead of waiting for instructions.

If you have to force someone to learn what their job now requires, they're probably not going to make it.

Warning Signs They Won't Make It

Inactivity is the biggest red flag. When someone has been in a role for three months and hasn't shipped anything meaningful, they're telling you they can't operate at the pace required.

Excuses vary. "I'm still learning the systems." "I need more context before I can execute." "I want to make sure I get it right the first time." All translate to the same thing: They're not moving fast enough.

Compare this to people who produce results immediately. They ship imperfect work, get feedback, iterate, and improve. They're not waiting for perfect understanding. They're learning by doing.

The person who spends weeks building elaborate plans rarely executes them well. The person who tests quickly and adjusts based on results wins.

Watch what people do with AI tools. Some people integrate them immediately and start producing at higher volume. Others ignore them and keep working the old way. The second group is telling you they won't adapt.

Resistance to automation is fatal. When someone says "I don't have time to set up automations," they're really saying "I don't understand that spending an hour now saves ten hours later." That's a basic failure of strategic thinking.

Meeting addiction is another signal. People who need to schedule meetings for everything, who won't make decisions without group consensus, who spend 30 hours per week in conversations instead of executing. They fill their calendars with talk because they don't know how to fill their time with actual work.

AI doesn't need meetings. It needs clear instructions and it executes. If someone can't work that way, they're not compatible with AI-augmented workflows.

The Speed Problem

Someone takes 50 hours a week to do work that someone else completes in 20. What's happening?

They're not doing more work. They're obsessing over details that don't matter. Perfectionism about things that don't affect outcomes. Checking and rechecking work that was fine the first time. Meetings that could have been messages. Documentation nobody will read.

This is what happens when people don't understand the difference between important work and busy work. AI forces clarity on this because it doesn't do busy work. It only does what you explicitly tell it to do.

People who can't distinguish important from urgent, strategic from tactical, high-impact from low-impact don't survive in AI-powered environments. Because those environments reward ruthless prioritization.

If someone consistently underestimates how long work takes, or can't break large projects into small shipping increments, or needs hand-holding through every step, they're operating below the threshold for AI-native work.

The brutal reality: When someone needs three months to accomplish what AI helps others do in three days, keeping them is charity, not business strategy.

What "Getting There" Actually Means

Getting there means operating independently with AI tools to produce work that previously required supervision or collaboration.

Can they prompt effectively to get useful output? Can they evaluate that output critically? Can they iterate to improve it? Can they integrate it into existing workflows without constant guidance?

Can they identify what should be automated and actually build the automation? Can they train AI agents on your specific context? Can they troubleshoot when things don't work?

Can they think strategically about where AI creates leverage versus where humans add unique value? Can they design systems, not just execute tasks?

Most importantly: Can they do all this without you telling them exactly what to do at every step?

Self-directed execution is the core capability. If someone needs detailed instructions for everything, AI replaces them. Because AI follows detailed instructions perfectly and doesn't require management overhead.

The people who survive are the ones who take vague direction—"we need to improve our content output"—and come back with implemented solutions. Built the agents. Created the workflows. Started producing results. Then asked for feedback to refine.

The 60-Day Timeline

You don't have six months to figure out if someone can adapt. Markets move too fast. Competitors move too fast. The window for making this transition is narrow.

Give people 60 days to demonstrate capability. Not to learn everything. To show they can learn fast enough to keep up.

Set clear expectations. Here's what good looks like. Here's the pace we need. Here's what success means. Here are the tools available. Now go.

Then watch what happens. Some people will struggle initially but show rapid improvement. They're asking good questions. Shipping work. Incorporating feedback. Moving faster each week.

Others will spin their wheels. Lots of activity. Little output. Same problems every week. No visible improvement in pace or quality.

The first group is worth developing. The second group is telling you they won't make it.

Sixty days is enough time to see trajectory. Not enough time to achieve mastery. But enough to know if someone is on the path or stalled out.

The Team Morale Trap

The most common reason for keeping underperformers: They're great team players. Everyone likes them. Terminating them would hurt morale.

This is backwards thinking. You know what hurts morale? Watching high performers carry underperformers. Seeing someone produce nothing while getting paid the same as people producing everything.

High performers don't want dead weight on their teams. They want capable colleagues who pull their own weight. When you keep people who can't keep up, you're telling your best people their output doesn't matter.

The "team player" who can't deliver results isn't helping the team. They're harming it. Morale improves when you remove people who slow everyone else down.

Yes, termination is uncomfortable. So is keeping someone in a role they're failing at while everyone around them works harder to compensate.

When Termination Makes More Sense

If someone can only do what they're told, AI does it better. That's the test.

Traditional executors who need detailed instructions for every task are competing with software that follows instructions perfectly, works 24/7, never takes vacation, and costs essentially nothing.

The economics don't work. Even if they're doing good work by 2019 standards, they're not delivering value that justifies a human salary in 2025.

If someone shows no interest in learning AI tools, they're telling you they won't adapt. Believe them. Don't waste months trying to force development that isn't happening.

If someone has been in the role for 90 days and you're still wondering whether they can do the job, you have your answer. People who can operate at the required level prove it quickly.

Three strikes and you're out. Miss deadlines once, maybe it's circumstance. Twice, it's a pattern. Three times, it's who they are. Stop expecting different results.

The Replacement Options

Terminating someone only makes sense if you have a plan for replacing their output. Three options exist.

Hire someone who already has AI-native capabilities. Good luck. As discussed, these people barely exist in the hiring pool. If you can find them and afford them, great. Most companies can't.

Outsource to agencies or consultants who specialize in AI-powered marketing. They've already climbed the learning curve and can deploy proven systems quickly. More expensive per hour, but cheaper in total than repeatedly hiring the wrong people.

Build AI systems that eliminate the need for the role entirely. If someone's job is primarily execution, automate it. Project management. Content creation. Workflow coordination. All handled by properly configured AI agents.

Most realistic approach: Combination of all three. One or two strategic internal people. Agency partners for specialized execution. AI automation for routine work. Total team of three humans and six AI agents replacing ten traditional employees.

The Six-Month Learning Curve Reality

Even if you find or develop someone with the right capabilities, they need time to understand your business well enough to make good decisions.

AI execution is fast. Strategic alignment takes time. Anyone can prompt AI to create content. Creating content your customers actually care about requires deep understanding of those customers.

This is why the 60-day assessment matters. You're not evaluating mastery. You're evaluating learning velocity. Do they pick up context quickly? Do they ask good questions? Do they connect dots between what they're learning and what actions to take?

Someone who takes six months to understand basics won't work. Someone who grasps fundamentals in two weeks and deepens understanding continuously? Worth developing.

The assessment isn't "can they do everything perfectly right now." It's "are they improving fast enough to be valuable before the market moves past us."


Develop Career-Proof Marketing Skills at ACE

Upskilling isn't optional anymore. It's adapt or exit. The Academy of Continuing Education teaches ambitious marketers how to develop AI-native capabilities that make them indispensable instead of replaceable. Stop hoping your old skills stay relevant. Start building the capabilities that actually matter. Join ACE today.
