
After a Disappointing Digital Rollout: Four Things Steel Mills Do Differently Next Time

By: Fero Labs

Studies have shown that around 70% of digital transformation initiatives fail to meet their objectives, which means many steel mills evaluating new technology today have at least one previous deployment that didn't deliver as expected. It doesn't have to be this way.

Sometimes the system never integrated cleanly with plant data.
Sometimes the model couldn’t adapt to production variability.
Sometimes operators simply didn’t use it.
And in a number of cases, the software itself couldn’t meet the technical claims made during procurement.

These experiences leave a mark. They shape expectations, influence decision-making, and understandably make teams cautious about the next initiative. But steel mills that have moved past earlier disappointment tend to follow a different path the next time: one built on tighter scope, realistic evaluation, and clearer operational fit.

From our experience, here’s what that looks like in practice.

Start With an Accurate Diagnosis of What Actually Failed

When a deployment falls short, the easy conclusion is that the software “didn’t work.”
But in operations reviews, the root cause usually lands in one of a few specific areas: unclear objectives, underestimated integration, misaligned workflow, low end-user adoption, or overly optimistic vendor claims.

Procurement and deployment teams that recover quickly make a point of determining which of those factors applied, not just as a post-mortem exercise but to avoid repeating the same issue. A precise diagnosis is the first step toward better procurement requirements and a better deployment.

Define a Smaller, More Measurable Objective the Next Time

Broad digital initiatives tend to falter under their own weight. The plants that succeed after an earlier setback narrow the scope dramatically.

Instead of trying to “transform” a process area, they select a single use case tied to a measurable operational metric and then build on it over time. Typical examples include reducing alloy use on one product family, stabilizing a known bottleneck, or confirming whether an internal model can be replaced with something more adaptive.

This approach has two advantages:

1) It keeps the project manageable.
2) Early results arrive faster, restoring confidence across teams.

At one mill, the objective for the new deployment was to determine whether Fero could replace periodic retraining of their in-house model. Once validated, that narrow objective alone freed hundreds of engineering hours, allowing the team to focus on more exploratory initiatives.

Fit the Technology Into Existing Workflows

One of the dominant causes of poor adoption in previous deployments was workflow disruption.
If the technology required operators to change screens, reorganize routines, or rely on new inputs, adoption fell off quickly.

Successful re-deployments reverse the logic:
Where possible, technology fits into the workflow — not the other way around.

For example, we recently worked with a European steel mill that had no available screen space in the operator pulpit, so the solution was integrated into the existing interface rather than redesigning the station. It was a less elegant approach, but operators began using it immediately because nothing about their routine changed. This principle alone has been decisive in preventing repeat failures.

Create Meaningful Early Usage Instead of Long Onboarding

Lengthy training cycles and delayed go-lives tend to drain momentum, especially at sites with previous failed deployments.

Effective teams introduce tools in a way that produces real operational interaction within days or weeks — not months. Engineers validate early recommendations; operators engage only after the signals look credible; additional process areas come later. This staged introduction is pragmatic and avoids the “everything changes at once” pattern that contributed to earlier issues.

What This Means Moving Forward

Steel mills that have rebounded from disappointing digital initiatives share a clear pattern:
They avoid broad transformation ambitions, focus on tightly scoped objectives, fit tools into existing workflows as much as possible, and screen the new software with an engineering mindset.

None of these changes are dramatic. But together, they create the conditions for a deployment to succeed, even at sites where previous attempts fell short.

A failed deployment is not a forecast of future outcomes. It’s an opportunity to set different conditions the next time.