
Vertical Visionaries: Berk Birand - Fero Labs

By: Bowery Capital
• February 2024

Berk Birand, Co-Founder and CEO of Fero Labs, answers some of our questions.


Can you describe the mission of Fero Labs and what inspired you to start the company?

Our mission is to use AI to make the global industrial sector more efficient and sustainable. One of the realities we saw when we started Fero Labs was that factories, in the course of their operations, tend to consume more resources than they really need; we estimate they consume around 10% more raw materials, energy, and processing time than is inherently required when producing at industrial scale.

At Fero Labs, we believe many of these overconsumption issues can be mitigated by making better decisions within the four walls of the factory and we believe that AI/ML solutions are ideally suited to provide that efficiency increase.

We build software that sits on top of a factory’s equipment and collects operational data; the software then enables engineers and data scientists at these plants to generate recommendations that reduce waste and help them better manage their processes.


How did manufacturers optimize these processes prior to Fero Labs? What are the typical practices Fero Labs is coming in and looking to replace?

Typically there are two families of process optimization within the industrial sector. One of them is process control systems: engineers in a steel mill might set a temperature to a certain level, and the system then adjusts the furnace levels, the fuel inputs, and so forth to keep the temperature at that target.

This works very much like cruise control, where you pick a speed and the engine, gas, and brakes all adjust to achieve that outcome.
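
To make the cruise-control analogy concrete, here is a toy setpoint control loop in Python. It is purely illustrative: every number (temperatures, gain, plant response) is invented, and real plants use industrial PID controllers rather than anything this simple.

```python
# Toy proportional control loop: hold a furnace temperature at a setpoint.
# Purely illustrative -- all numbers here are invented for the example.
def control_step(current_temp: float, target_temp: float, gain: float = 0.05) -> float:
    """Return a fuel adjustment proportional to the temperature error."""
    return gain * (target_temp - current_temp)

temp = 1480.0      # hypothetical current furnace temperature
SETPOINT = 1520.0  # the target an engineer picks -- choosing it is the hard part

for _ in range(10):
    fuel_adjustment = control_step(temp, SETPOINT)
    temp += 8.0 * fuel_adjustment  # toy model of how the furnace responds

print(round(temp, 1))  # converges toward the 1520.0 setpoint
```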

The challenge here is to figure out what you should set that furnace temperature to. Engineers typically decide on that setpoint based on first principles - knowing the chemistry and the physics of the plant - or by Six Sigma or lean manufacturing methods, which were devised in the 1980s and 1990s and rely on classical statistics.

The Six Sigma or lean manufacturing approach leads to lots of scatter plots, lots of simple visualizations, and analysis in MS Excel. I am generalizing a bit and there are some specific tools for this kind of process improvement, but most of the work really boils down to visualization and basic two-dimensional linear models.
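
A "basic two-dimensional linear model" here is essentially an Excel trendline: an ordinary least-squares fit of one input column against one output column. A minimal sketch, with invented data:

```python
# Ordinary least squares on one input vs. one output -- the spreadsheet
# trendline behind much Six Sigma analysis. With tens of thousands of
# sensor tags, the problem is knowing which pair of columns to plot.
xs = [1480.0, 1495.0, 1510.0, 1525.0, 1540.0]  # e.g. furnace temperature
ys = [412.0, 418.0, 431.0, 437.0, 449.0]       # e.g. tensile strength

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
intercept = mean_y - slope * mean_x
print(f"y = {slope:.3f} * x + {intercept:.1f}")  # the Excel trendline
```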

Given this, the main incumbent Fero Labs is looking to replace is Excel and these Six Sigma-style tools. The challenge with these tools is that they do not work beyond a certain scale. When a large number of data points is being generated by tens of thousands of different sensor tags, these tools break down because there is no way to know which two dimensions to visualize.

These tools also fall short when the relationships are not straightforward, for instance when the relationship between a few different factors is non-linear. We are looking to explain those more complex relationships, not just simple linear ones.


In terms of the Fero platform, what are some features or tools that got users excited and have driven industry leaders to want to roll it out within their own organizations?

Our biggest innovation in this space has been our core machine learning models. We develop all of our machine learning models in-house and we have an entire R&D team doing research in this area.

We realized early on - maybe five or six years ago now - that one of the biggest obstacles in the adoption of machine learning in industrial sectors, and in the enterprise more broadly, was ultimately an issue of trust.

If I am a factory, or a bank, or an insurance carrier, I do not want to rely on a model I do not understand well enough to have conviction as to why certain recommendations are being made. In response to this, we have built all of our core machine learning technology to be explainable by nature.

This means instead of just providing a recommendation for an industrial steel formulation, we can say this is what you should do, and here is why you should do it, and it is because two weeks ago, this happened, and then this happened, so now we believe this is the right thing to do.
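
Fero’s actual models are proprietary, but a small sketch can illustrate what "explainable by nature" means: with a white-box model, each input’s contribution to a prediction can be read off directly. All feature names and coefficients below are hypothetical.

```python
# A generic illustration of an explainable, white-box prediction: the
# output decomposes into per-input contributions that can be shown to
# the engineer. Feature names and coefficients are invented, not Fero's.
coefficients = {"furnace_temp_c": 0.8, "carbon_pct": -12.0, "cast_speed": 3.5}
intercept = 40.0

def predict_with_explanation(inputs: dict) -> tuple[float, dict]:
    contributions = {name: coefficients[name] * value for name, value in inputs.items()}
    return intercept + sum(contributions.values()), contributions

prediction, why = predict_with_explanation(
    {"furnace_temp_c": 1520.0, "carbon_pct": 0.3, "cast_speed": 1.2}
)
print(f"predicted strength: {prediction:.1f}")
for feature, contribution in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {feature}: {contribution:+.1f}")  # the "here is why"
```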

That explainability of our models is one of the core reasons why customers adopt Fero vs. using other tools in the sector or building custom solutions in-house.

All of our models also provide a confidence score with each prediction. This has been hugely important, and it solves a problem that has become more prominent in light of the popularity of ChatGPT: many ML systems tend to be black boxes, which means sometimes they might give a great answer, and other times the answer is complete BS, but the model delivers it with such confidence that you never know whether to trust it.

This is why our models provide a confidence level for each prediction. Sometimes we can say this is what you should do, and we are very confident it is the right decision, while other times we warn the user that the model is not very confident in the prediction, so the engineers know not to rely too heavily on those outputs.
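
The interview does not spell out how these confidence scores are computed. One standard way to attach an uncertainty estimate to a prediction is ensemble disagreement, sketched below with toy models; everything here is illustrative rather than Fero’s actual method.

```python
# Ensemble disagreement as a confidence signal: tight agreement among
# members means high confidence; wide spread means be cautious.
import random
import statistics

random.seed(0)

def make_ensemble(n_models: int = 20):
    # Toy "models": each believes y = 2x plus its own random bias, standing
    # in for members trained on different resamples of the data.
    return [lambda x, b=random.gauss(0.0, 0.5): 2.0 * x + b for _ in range(n_models)]

ensemble = make_ensemble()

def predict_with_confidence(x: float) -> tuple[float, float]:
    predictions = [model(x) for model in ensemble]
    return statistics.mean(predictions), statistics.stdev(predictions)

mean, spread = predict_with_confidence(10.0)
print(f"prediction: {mean:.2f} +/- {2 * spread:.2f}  (~95% band)")
```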


What does Fero Labs consider a “good” confidence level for these industrial process predictions?

Our customers definitely want 100% - we try to explain to them that this is impossible, as no model is correct 100% of the time, but we typically aim for 90-95% confidence bands across all of our models. The accuracy of our predictions is a benefit we hear about from customers all the time, so our confidence levels are sound.


What was a go-to-market lever that Fero Labs was able to pull that really helped drive adoption in the early days of the business?

One of the interesting things we did was to adopt both a bottom-up and top-down go-to-market strategy. Many companies in our space focus on the executive sale and really try to win over the highest levels of the company. 

That works for a year or two because they are executives and of course they have some influence, but ultimately if the engineers at the plants do not like the software they will not use it and the company will end up churning as a customer.

Instead, we decided to start by selling at the plant level, specifically targeting mid-level managers at these plants. We convince them to buy Fero for a single plant, then another, and another, which allows us to scale faster at the outset. After that point, we go top-down and try to sell an enterprise contract at the executive level. It is very important to have that buy-in both on the ground and at the highest levels of the company.


Since Fero Labs launched its product, how has it improved and what is on the roadmap in terms of product expansion?

When we first started out, our core focus was on reducing waste, and ultimately the reason companies in the industrial sector want to reduce waste is to reduce their costs; it can be extremely costly to waste raw materials. Beyond raw materials, there can also be a lot of waste in terms of processing time; spending 10% more time to make the same batch of products can hurt the finances of an operation by saddling a plant with significant excess energy costs.

This avoidance of waste has been a big part of Fero since day one. But since we launched, one thing that has changed in the industrial sector more broadly is that sustainability has become a bigger factor in many companies’ decision making, especially at the board level.

As a result, we have added a number of features to track not just cost, but also progress against certain sustainability goals. All of our customers have very ambitious emissions reduction goals to achieve by 2030, and in order to get there in six or seven years, they need to start making changes today.

The modules we have built out since launch allow our customers to reduce both costs and emissions simultaneously. This is a pretty novel idea; we call it Profitable Sustainability, and it allows engineers to make optimization decisions at the plant that drive profitability while also helping the company make progress toward its 2030 sustainability goals.
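
As a toy illustration of the Profitable Sustainability idea, one way to trade off the two objectives is to score each candidate plant setting on a weighted combination of cost and emissions; the recipes, prices, and carbon figures below are invented, and Fero’s actual optimization is certainly more sophisticated.

```python
# Pick the candidate setting that minimizes cost plus priced emissions.
# All names and figures are hypothetical.
candidates = [
    {"name": "recipe_A", "cost_usd": 1000.0, "co2_tons": 2.0},
    {"name": "recipe_B", "cost_usd": 950.0, "co2_tons": 2.6},
    {"name": "recipe_C", "cost_usd": 980.0, "co2_tons": 1.9},
]
CARBON_PRICE = 80.0  # assumed internal carbon price, $/ton of CO2

best = min(candidates, key=lambda c: c["cost_usd"] + CARBON_PRICE * c["co2_tons"])
print(best["name"])  # recipe_C: slightly pricier than B, far lower emissions
```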


What were some of the biggest challenges Fero Labs faced in the course of growing the business to where it is today, and how did you overcome them?

One of the big challenges that almost all data companies face is the garbage-in, garbage-out problem.

We do not make any hardware, so all the data being fed into our software comes from systems and sensors that are already in place and databases that our customers were already using to store historical data.

The data being fed into Fero’s software comes from third-party systems that were not built for machine learning, so we sometimes find ourselves dealing with data quality issues; often these are pre-existing issues that had never been addressed.

The data sets in question are often massive and no human is going through them line-by-line and reviewing every single data point. But with machine learning, all of the data needs to be at a reasonably good quality level for it to really work.

To address this, we have built entirely new products within our core offering which allow engineers to do data QA and data cleaning. Many of our customers have seen a ton of value from this on its own, even before considering the lift in accuracy it brings to their ML outputs.

There are a lot of different approaches to data cleaning, but the way we have done it is to go back to our roots as a vertical SaaS company. We built a data cleaning module for the industrial sector designed so that Six Sigma engineers and steel metallurgists can do this work themselves without needing to write any code.

Launching these self-service, no-code methods for data cleaning has been a huge boost to the accuracy of our models.
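
As a sketch of the kind of rule such a no-code module might apply under the hood (the checks and thresholds here are illustrative, not Fero’s actual rules), two classic sensor QA checks are out-of-range values and "flatlined" stretches where a stuck sensor repeats one value:

```python
# Rule-based sensor QA: flag readings outside a physically plausible range
# and stretches where a sensor repeats one value (a stuck-sensor symptom).
# Thresholds and data are invented for illustration.
def qa_flags(readings: list[float], lo: float, hi: float, max_repeats: int = 5) -> list[str]:
    flags = []
    repeats = 1
    for i, value in enumerate(readings):
        if not lo <= value <= hi:
            flags.append(f"row {i}: value {value} outside [{lo}, {hi}]")
        repeats = repeats + 1 if i > 0 and value == readings[i - 1] else 1
        if repeats == max_repeats:
            flags.append(f"row {i}: sensor flatlined for {max_repeats} samples")
    return flags

temps = [1510.0, 1512.0, -40.0, 1511.0] + [1500.0] * 6
for flag in qa_flags(temps, lo=1200.0, hi=1700.0):
    print(flag)
```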


Were there any partnerships or collaborations that helped Fero Labs supercharge its reach as you were scaling the business? How do you think about partnerships in general in this sector?

The industrial sector is interesting because there are a few different categories of companies; in our world there are the suppliers and then there are the system integrators. The service the integrators provide is to pull together different solutions and then package them up as a single offering to their customers.

As a result, the industrial space is very accustomed to companies relying on a partnership network approach. We have a few partnerships, but I would not say that the system integrator partnerships have necessarily been the most successful ones.

We always want to maintain a close relationship with our customers, both for GTM learnings and for product development, and that can be complicated with system integrator partnerships; still, it is something we are continuing to explore.

What has been more successful are go-to-market partnerships with system integrator-type companies where the partnership is more demand-gen or lead-gen focused and less of a reseller arrangement. This means we are still talking directly to the customers and owning the relationship, but the SI makes the introduction and we share revenue on that basis.

Another type of non-traditional partnership channel has been through our investors. They have been great about leveraging their existing relationships, and of course Bowery has been extremely helpful in making introductions through their broader network.

Even though this is not a typical commercial partnership, we have still gotten a lot of value out of these investment partnerships.


What sets Fero Labs apart from its competitors? And how do you think you can sustain that competitive advantage?

In our space, and this is also applicable to the broader enterprise ML space, there was a bit of a gold rush a few years back where you saw a lot of startups that had the impression that they could take free, open source ML model libraries, wrap them in a nice visual package, and then sell them to industry.

This approach to company building did not work, as many of our customers are perfectly capable of taking those same open source models and then building their own systems internally. There is not much value to be offered if all you are doing is repackaging open source software.

Instead, what we have always tried to do - and what is still one of our core competitive advantages - is to invest heavily in our own proprietary research around data science and machine learning. Beyond just leveraging open source models, we do internal research of the kind that could be published in an academic journal.

By doubling down on research, we have been able to make explainable white-box ML models which have set us apart from the competition.

The other way we have been able to set ourselves apart stems from the fact that there are many ML companies out there that pretend to be software companies, but the product they are selling is really the labor of data scientists who are behind the scenes building and deploying custom models.

These companies brand themselves as software businesses, but behind the scenes they are more like a shared services company. That can work for a pilot, but it does not scale: when you get to the point where customers have hundreds of plants, there is no way a startup's internal team can service those factories manually.

This is why we chose to build a self-service piece of software that customers' subject matter experts can drive themselves; we provide support as needed, but we avoid getting stuck building custom models for each user.


Based on your experience founding Fero Labs, what advice would you give to entrepreneurs looking to build their own vertical SaaS business?

The thing we always have on our minds is how we can be as horizontal as possible within our given vertical. What I mean by that is we really try to build software which can take raw data from our customers' databases and then turn that data into dollars.

As a company, we initially tried to focus on individual components of the value chain and then build on those until we could support the companies we sell into as broadly as possible. In the industrial space, there are companies that just focus on doing data cleaning, or just focus on data harmonization, or just offer self-service analytics.

What we always wanted to do was to be the one-stop shop in terms of owning the entire value chain from data cleanliness to modeling to measuring value to reporting. Companies don’t want to learn how to use multiple products if they can just learn one.

Ultimately, the more pieces of the value chain you can assist a company with, the easier it becomes for executives to rationalize spending money on your software, because they can link it to outcomes (e.g., “we paid this much for the software and in return we have gotten X dollars in improvements”).

We have had many customers tell us that Fero Labs is the only digital investment they have made which had a quantifiable dollar return in terms of how we helped improve their efficiency. If you build software just for cleaning data, it is a lot harder for the buyer to measure the inherent value in a cleaner data set. It becomes much easier to measure ROI if you own more of the value chain and are closer to outcomes.

My advice would be to try to understand the value chain and see how much of that you can capture.

[Originally published by Bowery Capital]

If you liked “Vertical Visionaries: Berk Birand (Fero Labs)” and want to read more content from the Bowery Capital Team, check out other relevant posts from the Bowery Capital Blog.