
Prevent Data Loss from Remote Process Sites

By Mark Halder, MatrikonOPC, Product Manager


Losing critical process data is a very real threat in the process control industry today, especially at remotely distributed sites. Communications with these remote facilities can be interrupted in any number of ways.

From satellite or microwave transmissions dropping out, to Ethernet links from an ISP going down, to power outages and routine maintenance, any of these can cause data to be lost and become very difficult to recover.


Process Data Control

The first thing that must be considered when evaluating a solution is how it will be used. Who is going to be looking at and accessing this data? Will it be used by staff at the remote stations, or by people accessing a central repository?

Most of the time it will not be necessary to collect and transmit all of the production data; only the data that is critical to operational needs and metrics should be transferred. Once the data to be transferred has been selected, the speed at which it is collected needs to be considered.

Is it absolutely necessary that the data be collected every second, or would 15-second, one-minute or even longer update intervals suffice?

This depends on several factors:

– How fast does the process data change?

– What process data resolution is needed?

– How much storage space is available for the data?
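
To make the storage question concrete, here is a back-of-the-envelope sizing sketch in Python. The figures (500 tags, 16 bytes per uncompressed sample) are purely illustrative assumptions, not vendor specifications:

```python
# Rough sizing sketch with hypothetical figures: estimate raw on-disk
# storage for a set of logged tags at a given update rate.

BYTES_PER_SAMPLE = 16          # assumption: 8-byte timestamp + 8-byte float value
SECONDS_PER_DAY = 86_400

def daily_storage_mb(tag_count: int, update_interval_s: float) -> float:
    """Approximate uncompressed storage per day, in megabytes."""
    samples_per_day = SECONDS_PER_DAY / update_interval_s
    return tag_count * samples_per_day * BYTES_PER_SAMPLE / 1_000_000

# 500 tags at 1-second vs 15-second update rates:
print(f"1 s:  {daily_storage_mb(500, 1):,.0f} MB/day")   # ~691 MB/day
print(f"15 s: {daily_storage_mb(500, 15):,.0f} MB/day")  # ~46 MB/day
```

Under these assumptions, moving 500 tags from a 1-second to a 15-second update rate cuts the raw storage requirement from roughly 691 MB to roughly 46 MB per day.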

Another thing that must be carefully considered is how critical this data is. What will happen if the data connectivity is lost?

There are two different ways to examine data criticality:

1. If the data needs to be accessible 100% of the time and cannot tolerate any interruption, then some type of redundancy solution needs to be in place to guarantee that this can be achieved.

2. If the data needs to arrive at its destination 100% intact, but a delay in its arrival is acceptable, then a store and forward type of data transfer method might be preferable.

Data Architecture

There are many different ways to design this type of system, and each has its advantages and disadvantages. Above all else, it is essential that the architecture that best fits the requirements is selected. The two main types of data architectures are a ‘push architecture’ and a ‘pull architecture’.

The push architecture has control of data movement located at the point where the data originates, which is the remote facility or a branch site where remote data is consolidated. From there the data is sent across the network to the central data warehouse.

Pull architecture has control of the data movement located at the central data warehouse, usually in a corporate headquarters where many different people or systems can access the data.
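
To illustrate who initiates the connection in each case, here is a minimal sketch in Python. The `requests` library, the endpoints and the example.com URLs are hypothetical stand-ins for whatever transport a real product would use:

```python
# Minimal sketch of the two directions of control (hypothetical HTTP
# endpoints stand in for a real product's transport).
import requests

def push_from_remote(samples: list[dict]) -> None:
    """Push: the remote site decides what leaves its network and when."""
    # Outbound-only connection initiated from inside the secure remote network.
    requests.post("https://warehouse.example.com/ingest", json=samples, timeout=10)

def pull_from_central(site: str) -> list[dict]:
    """Pull: the central warehouse reaches into the remote site on its schedule."""
    # Requires an inbound path through the remote site's firewall.
    resp = requests.get(f"https://{site}.example.com/tags", timeout=10)
    resp.raise_for_status()
    return resp.json()
```

The point to notice is the direction of the initial connection: push needs only an outbound path from the secure remote network, while pull requires an inbound path through the remote site's firewall.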

The central differences between the two are:

Push Architecture
More Secure: Since remote facilities are usually located within their own isolated networks, they have much tighter security restrictions for remote access than systems on an enterprise network. This makes it much harder for an external attacker to get access to the control systems and cause damage.

As a general security policy, data should flow from a more secure location in the network to a less secure one. That way, if the less secure location were ever compromised, the data at its source would still be safe and inaccessible to the attacker.

Data Control: Having the data control maintained at the remote facility allows the staff at that location to have full control over what is sent to the data warehouse. This can be critical if the data warehouse systems are maintained by a different team within the company, or even by a different company entirely, as it prevents unauthorised people from seeing data that has not been explicitly configured for transfer.

Pull Architecture
Central Configuration: Having everything controlled in the data warehouse means that there is a central configuration for all of the data. Maintaining this data and configuration from a single location means that fewer people would need to be trained on the solution.

Lower Cost: With only a single point of control to maintain, the overall architecture will generally cost less. However, these cost savings may not materialise if communications to remote sites need to be made more secure.

Higher levels of security can be implemented, but this normally requires an additional security layer to restrict the tags that the central location can view and access.

Another main architectural decision is whether a store and forward solution and/or a redundant solution is needed.

Both of these solutions may be used simultaneously; however, this will increase the cost of the implementation.
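
As a sketch of the store and forward idea, the following Python fragment buffers every sample in a local SQLite file and only deletes a row once it has been handed to the uplink. The `send` callable and the table schema are assumptions made for illustration, not any particular product's API:

```python
# Store-and-forward sketch: write locally first, forward when the link is up.
import sqlite3
import time

class StoreAndForward:
    def __init__(self, path: str = "buffer.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS buffer (ts REAL, tag TEXT, value REAL)")

    def log(self, tag: str, value: float) -> None:
        # Always persist locally first, so nothing is lost during an outage.
        self.db.execute("INSERT INTO buffer VALUES (?, ?, ?)",
                        (time.time(), tag, value))
        self.db.commit()

    def forward(self, send) -> None:
        """Drain the buffer in time order; stop at the first send failure."""
        rows = self.db.execute(
            "SELECT rowid, ts, tag, value FROM buffer ORDER BY ts").fetchall()
        for rowid, ts, tag, value in rows:
            try:
                send({"ts": ts, "tag": tag, "value": value})
            except ConnectionError:
                break  # link is down; remaining rows stay buffered
            self.db.execute("DELETE FROM buffer WHERE rowid = ?", (rowid,))
        self.db.commit()
```

Usage would be to call `log()` for every new sample and `forward()` periodically: if the link is down, nothing is lost and the data simply arrives late.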

Process Data Visibility

It can be very beneficial for local operators to use the collected data to pull up KPIs and dashboards. If all of this data is being sent to a central historian and that connection is lost, these local KPIs and dashboards will fail, because they need a live connection to the main historian, which may be located hundreds of miles away.

In this situation, a store and forward solution with a local buffer of data that can be accessed from the remote facility could be used. This allows all of the reports and dashboards to run locally, preventing outages if the main connection to the historian is lost, and reducing bandwidth use, since the data is accessed locally rather than retrieved from the main data warehouse.
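
Continuing the hypothetical SQLite buffer sketched above, a local KPI query might look like the following. (A real deployment would keep a rolling retention window in the local buffer rather than deleting rows as soon as they are forwarded, so that recent history is always available locally.)

```python
# Local KPI query against the hypothetical SQLite buffer: works even
# when the link to the central historian is down.
import sqlite3
import time

def hourly_average(db_path: str, tag: str) -> float | None:
    """Mean value of one tag over the last hour, read from the local buffer."""
    db = sqlite3.connect(db_path)
    row = db.execute(
        "SELECT AVG(value) FROM buffer WHERE tag = ? AND ts >= ?",
        (tag, time.time() - 3600),
    ).fetchone()
    db.close()
    return row[0]  # None if no samples fell within the window

print(hourly_average("buffer.db", "FIC-101.PV"))  # hypothetical tag name
```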

Eliminating Process Data Loss with an Industrial Data Logger

A popular option to overcome such connectivity issues is implementing an industrial data logger. Such devices can connect to the SCADA system and transmit data directly to the historian automatically.


Ideally, a data logger will be compatible with a wide variety of common historians or databases used to capture the data (including Oracle, PHD, IP21, PI, or any other data warehouse).

If deployed in an industrial environment, it should also be a rugged piece of hardware, able to cope with wide ambient temperature swings and constrained spaces. As well as surviving a challenging physical environment, a data logger must be secure and reliable: safe from malicious attacks, yet able to transmit data across firewalls and over WANs to the centralised data warehouse.

In the modern industrial environment, losing critical process data can occur due to a variety of factors, human and technical, intentional and accidental. Fortunately, data logger technology is available to safeguard against this and, when utilised in conjunction with a well-considered strategy and architecture, can ensure that any loss of data need not mean disaster.
