Concept and History of Historian
Historian is a term for a database that manages sensor data (or tag data) generated at manufacturing and production sites.
As the word "history" suggests, it can be understood as software whose core function is viewing historical data over time.
Historian software stores time-stamped, time-series data from factory production processes, and is used for core production information, production status, performance monitoring, and quality assurance. In the early days of the industry, it was developed mainly by industrial equipment vendors or related software developers for storing sensor data and simple real-time monitoring. At that time, the stored time-series data had relatively wide time gaps (on the order of seconds) and data volumes were small (megabytes).
As time went on, however, various functions were added, and the software evolved into forms optimized for each industry group; today, dozens of large companies and specialized vendors release Historian software. Even so, the volume of data and the speed at which it could be captured still fell short of what customers wanted.
Given this history, the functions of the various Historian products are similar to one another, but each has also evolved into a product customized for the usage and performance demands of the business it serves.
Below is a list of companies and products currently in the Historian product category. (Source: https://en.wikipedia.org/wiki/Operational_historian)
- ABB Decathlon Historian
- Aspen Technology InfoPlus.21
- Canary Labs Enterprise Historian
- Enea’s Polyhedra Historian, a module of Polyhedra DBMS
- GE Intelligent Platforms Proficy Historian
- GP Strategies EtaPRO System
- Honeywell Uniformance PHD
- Iconics Hyper Historian
- Inductive Automation SQL Bridge module of Ignition SCADA
- National Instruments Citadel, used in LabVIEW DSC and other products
- OSIsoft PI System
- Schneider Electric InStep Software eDNA Real-Time Historian
- Schneider Electric Wonderware Historian
- Yokogawa Exaquantum Historian
IoT data explosion and changes in IIoT market
With the explosion of the Internet, the development of sensor technology, and advances in big-data processing, the concept of the Internet of Things (IoT) has spread rapidly since the 2000s.
The same trend has occurred in industry, where the Industrial Internet of Things (IIoT) is once again changing the world.
The IIoT's primary agenda is to ensure that all production components in the industrial field generate information and communicate quickly with one another, ultimately aiming at innovative, autonomous production processes. This is sometimes referred to as machine-to-machine (M2M) capability.
The evolutionary change in this industry begins with a simple fact: both the performance of sensors, the data producers, and the number of sensors installed are increasing rapidly.
As a result, the amount of sensor data generated from one factory will increase enormously, which is the core of the IIoT change.
Ironically, this explosion of underlying data has driven the convergence of the OT (Operational Technology) market, the industrial production domain where Historian products were most active, with Information Technology (IT), the traditional information and data management industry. Today's IIoT requires better and more real-time Business Intelligence (BI) when analyzing OT data, providing greater insight for optimizing the operational performance of industrial operations. This use case falls into the "big data" domain.
The background of this convergence is that Historian products offer limited performance and functionality for the enormous data volumes of the IIoT big-data market. Inevitably, customers are looking to the IT domain for solutions to manage this big data.
Until now, OT and IT have been divided into two camps, each performing its own role. But products that can perform both roles simultaneously are expected to be in high demand from customers.
What should Historian Products have in the future? The Next Generation Historian.
None of the current solutions provide all the functionality and performance appropriate for the evolving IIoT era. Current Historian solutions lack most of what IIoT needs: high-speed data entry, real-time data retrieval and compression, and data standardization, to name a few.
The following is a summary of what the next-generation Historian should provide, and which functionality the market will require as time goes by.
High-speed data entry
The increase in scan rates and in the number of sensors means that the number of sensor data points to be ingested per second is huge. Some customers generate more than 10 million data events per second, and high-speed storage is required to capture them.
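At these rates, writers cannot afford one insert per record; ingestion is typically batched so that per-record overhead is amortized across thousands of records. The sketch below is illustrative only: `BatchWriter` and its `sink` callback are hypothetical stand-ins for a historian's bulk-insert call, not any vendor's actual API.

```python
from typing import Callable

class BatchWriter:
    """Buffers incoming sensor records and flushes them in bulk.

    `sink` stands in for a bulk-insert call; a real historian client
    would send each batch in a single network round trip.
    """
    def __init__(self, sink: Callable[[list], None], batch_size: int = 10_000):
        self.sink = sink
        self.batch_size = batch_size
        self.buffer = []

    def write(self, record):
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink(self.buffer)   # one bulk call per batch
            self.buffer = []

# Usage: collect flushed batches in a list to observe the batching.
batches = []
w = BatchWriter(batches.append, batch_size=1000)
for i in range(2500):                    # 2,500 (tag, timestamp, value) records
    w.write(("sensor-1", i, 20.5 + i))
w.flush()                                # force out the final partial batch
print(len(batches))                      # 3 batches: 1000 + 1000 + 500
```

The design point is that the sink is called 3 times instead of 2,500: at scale, that difference dominates ingestion throughput.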
Real-time data retrieval
In the past, plain text files were adequate for storing simple data at high speed. Now, however, users want to retrieve data by time, sensor name, and special conditions.
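The access pattern users now expect can be sketched with a toy in-memory filter: select records by sensor name, time window, and a value condition. This only illustrates the query shape, not how a historian implements it internally.

```python
# Toy time-series records: (tag, unix_timestamp, value)
records = [
    ("temp-01", 100, 21.0),
    ("temp-01", 101, 21.4),
    ("temp-02", 101, 35.2),
    ("temp-01", 102, 22.9),
    ("temp-01", 103, 25.1),
]

def query(records, tag, t_start, t_end, predicate=lambda v: True):
    """Return (timestamp, value) pairs for `tag` with
    t_start <= timestamp < t_end that also satisfy `predicate`:
    the name, time, and condition filters a historian user expects."""
    return [
        (ts, v)
        for (t, ts, v) in records
        if t == tag and t_start <= ts < t_end and predicate(v)
    ]

# All temp-01 readings in [100, 103) that exceed 21.2 degrees:
hits = query(records, "temp-01", 100, 103, lambda v: v > 21.2)
print(hits)  # [(101, 21.4), (102, 22.9)]
```

A real historian answers this via indexes rather than a linear scan, which is exactly why indexing strategy (discussed below for Machbase) matters at billion-record scale.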
To meet next-generation historian performance and storage needs, additional storage servers or nodes are required. Flexible, simple scale-out is critical for these big-data needs.
Flexible scalability means that a large number of servers operate physically, which inevitably means server failures will occur. In a critical environment, a server failure must not be allowed to affect business operations.
Standardized data extraction and transformation
Many businesses have to deal with vendor lock-in, which limits their flexibility to build dynamic solutions as business needs evolve. Ideally, standard interfaces and functions should be adopted throughout the industry.
Real-time data compression
Storage has a cost, and as data volumes explode this cost becomes significant. With sensor data arriving at millions of timestamps per second, finding a way to store the data at the lowest possible cost is critical. High-speed, real-time data compression solves this need.
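Sensor streams compress unusually well because consecutive readings change slowly and repeat. The effect is easy to see with only the standard library; zlib here is just a stand-in for whatever codec a historian uses internally, not a claim about any product's algorithm.

```python
import struct
import zlib

# Simulate a slowly drifting sensor: 100,000 float64 samples
# with a repeating 500-sample pattern, as many physical signals have.
values = [20.0 + 0.001 * (i % 500) for i in range(100_000)]
raw = struct.pack(f"{len(values)}d", *values)   # 800,000 bytes uncompressed

compressed = zlib.compress(raw, 6)
ratio = len(compressed) / len(raw)
print(f"raw: {len(raw)} bytes, "
      f"compressed: {len(compressed)} bytes, ratio: {ratio:.2%}")
```

On redundant data like this the compressed size is a small fraction of the original, which is the point: compression cuts both storage cost and the I/O needed to read the data back.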
Interworking with various analysis solutions
There are many analytical solutions on the market. Interworking with them helps businesses reuse their existing assets, lowering their overall costs.
Machbase – A Next Generation Historian
Machbase is an excellent solution for the massive data processing requirements of today’s IIoT industry.
Comparing Machbase to the ideal next-generation historian:
- High speed data entry
- A major Machbase strength.
- Based on a patented high-speed data transfer protocol and memory- and disk-optimized data conversion technology, Machbase shows remarkable ingestion performance.
- On a typical x86 Linux server, with sensor records of 20 bytes or less, Machbase achieves more than 2 million record ingestions per second, and in an 8-node environment, more than 10 million records per second.
- Real-time data retrieval
- Machbase uniquely provides a technique for building user indexes in parallel with real-time data ingestion.
- Machbase supports real-time and high-speed “search” of input data and returns results with industry leading performance.
- Machbase’s log-structured merge (LSM) index is optimized for big-data retrieval, guaranteeing retrieval results within a few seconds even where more than a billion records are stored on a single node.
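The core LSM idea can be sketched in a few lines: writes land in a small mutable in-memory table, which is periodically flushed as an immutable sorted run; reads check the memtable first, then the runs from newest to oldest. This toy version (in-memory lists standing in for on-disk segments, linear scans where a real LSM would binary-search) shows only the read/write path, not Machbase's actual implementation.

```python
class ToyLSM:
    """Minimal log-structured merge sketch: fast unordered writes,
    periodic flush to immutable sorted segments."""

    def __init__(self, memtable_limit=4):
        self.memtable = {}        # recent writes, mutable
        self.segments = []        # flushed, immutable sorted runs (newest last)
        self.memtable_limit = memtable_limit

    def put(self, key, value):
        self.memtable[key] = value
        if len(self.memtable) >= self.memtable_limit:
            # Flush: sort the memtable once and freeze it as a segment.
            self.segments.append(sorted(self.memtable.items()))
            self.memtable = {}

    def get(self, key):
        if key in self.memtable:             # 1. newest data wins
            return self.memtable[key]
        for seg in reversed(self.segments):  # 2. then newer segments first
            for k, v in seg:
                if k == key:
                    return v
        return None

db = ToyLSM()
for i in range(10):
    db.put(f"tag-{i % 5}", i)   # later writes shadow earlier ones
print(db.get("tag-3"), len(db.segments))  # 8 2
```

Writes never rewrite old segments, which is what makes the structure friendly to sustained high-rate ingestion.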
- Flexible Scalability and High Availability
- Machbase Enterprise Edition provides a clustered environment and an easy scale-out environment
- Machbase scalability enables users to increase the performance and data storage space linearly!
- Multiple replication structures can be selected according to the user’s environment, and high availability ensures that service continues immediately even if any node terminates abnormally.
- All of Machbase’s core engines are developed in ‘C’, ensuring ultra-efficient use of CPU and memory.
- Compared to engines developed in Java or similar interpreted languages, users can reduce CPU consumption by 70% or more.
- Machbase is also designed to selectively adjust performance and resource utilization to suit user environments by providing various tuning parameters.
- Real-time data compression
- Machbase compresses all stored data and related indexes in real time.
- This saves up to 90% of the data storage space and reduces I/O costs required for data loading, enabling high-speed data retrieval.
- Standardized Data Extraction and Transformation
- Machbase directly supports the ANSI SQL language!
- Users do not need to learn another data access language and can reuse all their existing database knowledge.
- Machbase provides a standard SDK resulting in maximizing reusability of existing applications.
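Because the interface is plain ANSI SQL, the statements a user writes look like ordinary database SQL. The snippet below uses Python's built-in sqlite3 purely as a stand-in engine to show that standard SQL skills (DDL, parameterized inserts, aggregates with GROUP BY) carry over unchanged; it is not Machbase's SDK, and the table layout is illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Standard DDL -- nothing vendor-specific to learn.
cur.execute("CREATE TABLE sensor (tag TEXT, ts INTEGER, value REAL)")

rows = [("temp-01", 100, 21.0), ("temp-01", 101, 23.0),
        ("temp-02", 100, 35.0), ("temp-02", 101, 37.0)]
cur.executemany("INSERT INTO sensor VALUES (?, ?, ?)", rows)

# Standard aggregate query: average value per tag.
cur.execute("SELECT tag, AVG(value) FROM sensor GROUP BY tag ORDER BY tag")
result = cur.fetchall()
print(result)  # [('temp-01', 22.0), ('temp-02', 36.0)]
```

This is the reuse argument in practice: the same query text, driver pattern, and mental model transfer between any ANSI-SQL-speaking store.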
- Interworking with various analysis solutions
- In addition to standard development tools, Machbase provides various interfacing environments including a RESTful API, Python, and R.
- Machbase allows users to easily interact with their own tools and maintain an infrastructure that allows for convenient analysis of large amounts of data.
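Interworking through a REST API typically comes down to issuing an HTTP query against an endpoint. The example below only constructs such a request URL with the standard library; the host, port, path, and parameter names are hypothetical illustrations, not Machbase's documented REST interface.

```python
from urllib.parse import urlencode, urlunsplit

def build_query_url(host: str, sql: str) -> str:
    """Build a (hypothetical) REST query URL carrying a SQL statement.

    Any HTTP-capable analysis tool could issue this request and
    consume the JSON response -- that is the interworking story.
    """
    params = urlencode({"q": sql, "format": "json"})
    return urlunsplit(("http", host, "/machbase/query", params, ""))

url = build_query_url("db.example.com:5657",
                      "SELECT * FROM sensor WHERE tag = 'temp-01' LIMIT 10")
print(url)
```

Because the transport is plain HTTP and the payload is JSON, tools like R scripts, Python notebooks, or BI dashboards can pull data without a vendor driver.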