Food for thought

The keys to connecting the shop, replicating success and scaling up

A post by
Pierrick Boissel

This article is a summary of a forthcoming white paper. Feel free to email us to receive it when it is available.

Over the last two years, we have met and spoken with more than 100 users, prospects, customers and industrial decision-makers across all industries.

The topic? Connecting the shop floor and working with production data in real time.

Surfacing process and production data in time to make "data-driven" decisions. A connected shop floor means real-time monitoring, alerting and machine recalibration, production orchestration, predictive maintenance, anomaly and scrap detection, and more.

In 90% of cases, the questions, the attempted fixes and the chosen solutions are the same. So we decided to share a few keys to help you find your way through the fog.

What we heard from you:

  • Real time is essential.
  • We don't know what's inside the machines because the PLC is closed. We bought special-purpose machines, and everything is locked down by the supplier.
  • My machines aren't connected, and the return on investment is uncertain.
  • Production networks fall outside the IT department's scope and are therefore left out of transformation plans (though this is starting to change).
  • We have no automation engineers on site. Maintenance, methods or production handles it, but they don't know the PLCs, and we don't dare touch them.
  • We have an MES; we spent a year implementing it. The environment is completely closed, we can't get anything out, we spend our time on custom development, and users aren't satisfied with the result.
  • We built something with open-source components (Node-RED, for example), but it isn't reproducible at scale and the system's robustness isn't guaranteed.

Achievable gains:

According to McKinsey, connecting the shop floor and getting the right data to the right person in real time is a significant source of ROI:

  • up to +90% productivity,
  • up to −50% unscheduled machine downtime,
  • up to −40% maintenance costs.

And that's without counting the financial and ecological gains from reduced consumption: waste, material and energy losses, whose costs are currently soaring in Europe.

We can also point to faster project scale-up, better inter-site communication and hybrid cloud/edge services, which are less demanding than a full-cloud approach.

It is therefore urgent to roll up our sleeves.

So what should you do, and above all, how?

Digitizing the shop floor means first understanding that you can move much faster by iterating, and above all that it is now possible. The problem has become a matter of software rather than hardware, and the good news is: solutions exist.

Here is a short summary of what we recommend at Niagara, to help guide your thinking:

Think access to machine data, even when it's closed: favor a platform that can be deployed anywhere, close to the machines (servers, gateways, HMIs, production PCs). Take the time to check the chosen solution's native, multi-protocol connectors.
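What "multi-protocol connectors" buys you can be sketched as a common read interface (the connector classes and tag names below are invented stand-ins; a real deployment would wrap actual OPC UA or Modbus client libraries): the platform reads every machine the same way, whatever protocol sits underneath.

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """Common interface: read any machine the same way, regardless
    of the underlying protocol."""
    @abstractmethod
    def read(self, tag: str) -> float: ...

class SimulatedOpcUaConnector(Connector):
    """Stand-in for a real OPC UA client; values are hard-coded here."""
    def read(self, tag: str) -> float:
        return {"ns=2;s=SpindleTemp": 68.5}.get(tag, 0.0)

class SimulatedModbusConnector(Connector):
    """Stand-in for a real Modbus client reading holding registers."""
    def read(self, tag: str) -> float:
        return {"hr:40001": 12.0}.get(tag, 0.0)

def poll(connectors: dict[str, Connector],
         tags: dict[str, str]) -> dict[str, float]:
    """Poll one tag per machine through whatever connector it uses."""
    return {name: connectors[name].read(tag) for name, tag in tags.items()}
```

The point of the abstraction is that application code never changes when a new protocol (or a new supplier's PLC) is added; only a new connector does.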

Think software accessibility: set up a no-code platform that lets any engineer or automation specialist read and work with their machine's data, with interfaces for configuring ML parameters without writing code (Automi, for example).

Think flexible: new technologies spell the end of the monolithic MES. Choose an open platform built on a microservices infrastructure. Edge, pub/sub and streaming technologies (Kafka, MQTT, etc.) will help you avoid overloading the production network.

Think scaling and industrialization: like everyone else, you don't want to fall into "pilot hell." Make sure you can deploy digital twins of your plants and spin up new instances easily. Then you can start standardizing your cross-plant IoT data models, either on a platform like Niagara or with services such as Azure Digital Twins or AWS IoT TwinMaker, for example. OPC UA is a robust standard, but make sure the solution you choose doesn't force you to use it, so that maintaining and managing that protocol remains your choice.
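As a sketch of what a standardized cross-plant data model might look like (the class and field names here are assumptions, not a Niagara or Azure schema), a small typed model plus a canonical topic convention already lets every site describe its machines identically:

```python
from dataclasses import dataclass, field

@dataclass
class Sensor:
    name: str
    unit: str

@dataclass
class Machine:
    machine_id: str
    sensors: list[Sensor] = field(default_factory=list)

@dataclass
class Plant:
    site: str
    machines: list[Machine] = field(default_factory=list)

def topics(plant: Plant) -> list[str]:
    """Canonical '<site>/<machine>/<sensor>' topics, identical across plants."""
    return [f"{plant.site}/{m.machine_id}/{s.name}"
            for m in plant.machines for s in m.sensors]

# Describing a second site is just another Plant instance with the same schema.
lyon = Plant("lyon", [Machine("press-01", [Sensor("temperature", "degC")])])
```

Once every plant speaks this one schema, deploying the same dashboards, alerts and ML pipelines to a new site is configuration rather than a new project.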

Think use cases: don't forget that the platform must let you roll out different use cases. Check that your chosen platform offers no-code IT connectors to push data to your analysis software or databases, for big-data analysis for example.

Think about IT teams, as well as security: choose a hybrid platform that lets group IT monitor the edge nodes deployed across sites, as well as access policies and incoming/outgoing flows.

To sum up, for those who skimmed:

#1 - Today's technologies allow hybrid (edge & cloud) tools to be deployed at scale, tackling the collection and processing of industrial data at the machine and unlocking many new uses.

#2 - To tackle the subject: think use case, forget the technology, define clear objectives and then a limited scope, and iterate. Widen the scope and start again.

#3 - No- and low-code tools are perfectly suited to fast iteration without having to teach coding on the factory floor.

#4 - Microservices, flexibility and scalability will help you avoid pilot hell and the infamous V-cycle of legacy tools.

What about you? What have you implemented in your factories?

Ready to take back control
of your industrial data?

Talk to an expert