Unify 5.0 Release Notes - 15 Mar 2023
Capabilities added in Unify 5.0 include:
- Make industrial data actionable from the cloud with the new Time Series Data Mover capability, which comprises several features to retrieve, process, and share streaming time series data
- Respond quickly to issues by viewing pipeline status with the new pipeline status log
- Other updates and enhancements
Time Series Data Mover
The new Time Series Data Mover (TSDM) capability includes:
- A connector for PI Data Archive to import PI tags (points)
- Updates to Unify to store and process datastreams: new artifacts, similar to datasets, that contain time series data
- Stream pipelines to contextualize stream data by combining it with model data
- Unify transform blocks for no-code mashup of time series data with model data
PI Data Archive Connector
The connector supports two methods for importing PI archive data, both configurable. You can include filters for the PI tag PointSource and Name:
- Real-time: append new tag data based on the configured time frame
- Data is pulled into Unify as events; in this mode, all tags that have a new value since the last run are included. Conversely, a tag whose value hasn’t changed since the last event was pulled is not included in the latest event.
- You can choose how often the real-time process runs, for example, 30 or 120 seconds
- The countdown to the next real-time process starts after the last one completes
- Backfill: import historical data
- Used to “seed” the system with the recent history of tag values
- Backfill supports short time periods, up to a maximum of 31 days
For full installation instructions, see the On Premise PI Time Series Connector Installation Guide.
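The real-time import behavior described above (only tags with a new value since the last run appear in an event) can be sketched with a small, runnable Python example. The tag names, data layout, and `build_event` function are hypothetical illustrations, not the connector's actual API:

```python
from datetime import datetime, timedelta

def build_event(tag_history, last_run, now):
    """Return one real-time event containing only tags whose most recent
    PI value falls in the (last_run, now] window; unchanged tags are omitted."""
    event = {}
    for tag, samples in tag_history.items():
        new = [(ts, v) for ts, v in samples if last_run < ts <= now]
        if new:
            event[tag] = max(new)  # latest (timestamp, value) pair for the tag
    return event

# Simulated archive: the pump tag changed since the last run, the valve tag did not.
t0 = datetime(2023, 3, 15, 10, 58, 0)
history = {
    "pump.speed":  [(t0 + timedelta(seconds=3), 5.5)],   # new value at 10:58:03
    "valve.state": [(t0 - timedelta(minutes=5), 1.0)],   # unchanged since last run
}
event = build_event(history, last_run=t0, now=t0 + timedelta(minutes=1))
# event contains "pump.speed" but not "valve.state"
```

This delta behavior is also why a stream pipeline preview may not show a value for every tag: unchanged tags simply do not appear in the latest event.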
Updates to Unify
Unify includes several new features to process and share time series data.
- Pipelines to process streams - when creating a new pipeline, you can specify if it is a stream pipeline
- Two new transforms:
- “Datastream”, which represents the incoming stream data from PI Data Archive. This artifact is created by the connector configured as described above. Datastreams and all the transforms connected to them are easily identified by the dashed flow line on the canvas.
- Note: you can configure only one datastream per stream pipeline
- Note: in a stream pipeline, the preview includes only the latest event data, which may not contain a value for every PI tag, since the latest event includes only tags with new values since the last run.
- “DerivedStream”, which represents the outgoing, contextualized stream data from Unify
- A DerivedStream can be used with only one source datastream; a validation error is shown if you reuse the derived stream for a new source. This prevents appending processed events from a different stream source to the same derived datastream, which would break downstream data processing.
Figure 1: Stream Pipeline with a Datastream and Derived Stream in Unify 5.0.
- The following transforms cannot receive data from a datastream (i.e., cannot be placed to the right of a datastream)
- MapAttribute
- MapTemplate
- Graph Builder
- Derived Dataset
- Assign Value
- Flow Audit
- SQL queries that insert or modify data
- Once you’ve defined your stream pipeline, AutoSync continuously processes datastream events and outputs new processed data to the derived datastream. Note: stream pipelines do not need to be published, so you will not see that option on a stream pipeline.
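The one-source rule for derived streams noted above can be illustrated with a minimal Python sketch. The registry, error type, and function names here are hypothetical, not Unify's implementation:

```python
class StreamBindingError(ValueError):
    """Raised when a derived stream is reused with a different source."""

# Hypothetical registry mapping derived stream name -> bound source datastream.
bindings = {}

def bind_derived_stream(derived, source):
    """Enforce the one-source rule: a derived stream accepts events from
    exactly one source datastream; rebinding it to a new source is an error."""
    bound = bindings.get(derived)
    if bound is not None and bound != source:
        raise StreamBindingError(
            f"{derived!r} already receives events from {bound!r}; "
            f"appending events from {source!r} would corrupt downstream processing"
        )
    bindings[derived] = source

bind_derived_stream("contextualized_pump_data", "pi_pump_tags")
bind_derived_stream("contextualized_pump_data", "pi_pump_tags")  # same source: OK
```

Rebinding `"contextualized_pump_data"` to a different source would raise `StreamBindingError`, mirroring the validation error shown in the pipeline editor.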
AWS Redshift & Generic API Connectors
To share your contextualized time series data with other systems, you can use either the AWS Redshift connector or work with Element to build a custom egress connector.
- Connector for AWS Redshift to export time series data streams
- See the AWS Redshift Publish Connector Deployment Guide on the Element Community for more information
- Generic connector API for exporting time series data streams to other systems
- If you have other destination systems for time series data, please contact Element for more details
Pipeline Status
You can easily view logs for a specific pipeline by selecting the logs button on the pipeline page. If there’s an error with a pipeline, the circle indicator on the button will turn red.
Figure 2: Pipeline Logs in Unify 5.0.
Other Updates
- Continued performance and reliability enhancements for pipelines, particularly those triggered by evergreening events
Frequently Asked Questions
Contact Us if you have any questions or feedback on TSDM or Unify 5.0: [email protected]
Q: Is Unify shifting to focus just on time series data?
A: We’re not shifting focus; we’re deepening our focus on all types of operations data. One of the most common challenges we hear is the cost, complexity, and potential incompleteness of transferring data to the cloud. Element Unify is an open and flexible Industrial DataOps solution for data-driven decisions and data-powered operations, so adding time series data fits right in.
Q: Does Unify support other ingress/source systems for TSDM?
A: As noted above, our initial release of TSDM supports the PI Data Archive. We plan on adding support for Aspen IP.21 shortly. Please reach out if you are interested in IP.21 or support for other sources of time series data.
Q: Does Unify support other egress/destination systems for TSDM?
A: Our initial release of TSDM includes an API for creating a custom connector for other destination systems. Our next two targets for destination systems are AWS S3 and Azure ADX. Please contact us for more information.
Q: How can we seed the system with all our historical data (e.g., years of PI tags)?
A: Importing your full history of PI tags typically happens after your initial set-up is complete; the existing backfill capability is used for configuration and testing. We are currently planning a method to import larger batches of historical data. To import all your historical data, please reach out to discuss your needs and objectives.
Q: What does the timestamp column I see in the datastream preview signify?
A: It is the timestamp obtained from PI, not the time the event was pulled into Unify. For example:
A sensor on a pump records a value of 5.5mps at 10:58:03 AM on 15 Mar 2023. That value and its full timestamp are sent to PI. Assuming the connector is set to read from PI every minute, the next event is pulled at 10:59:00 AM on 15 Mar 2023 and includes the 5.5mps value. The timestamp shown in the Unify preview will be 2023-03-15 10:58:03 AM.
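Under the same assumptions as the example above, a short Python sketch (the tag name and record layout are hypothetical) makes the distinction between the PI timestamp and the pull time explicit:

```python
from datetime import datetime

# The value recorded by the pump sensor arrives in PI with its own timestamp.
reading = {"tag": "pump.speed", "value": 5.5,
           "timestamp": datetime(2023, 3, 15, 10, 58, 3)}

# The connector pulls the next event a little later (reading from PI every minute).
pull_time = datetime(2023, 3, 15, 10, 59, 0)

# The datastream preview shows the PI timestamp, not the pull time.
preview_row = (reading["timestamp"], reading["tag"], reading["value"])
```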
Q: Does TSDM support PI Event Frames & calculations?
A: PI Event Frames are not currently supported by TSDM. If including Event Frames is a priority for you, please contact us. If a PI tag value is calculated and stored as a tag, TSDM will pull it in as a PI tag value. Calculation definitions are not pulled in by TSDM but continue to be supported via the Unify PI AF Connector and can be combined with time series data in a stream pipeline.
Q: What logs are generated by this set of capabilities?
A: The pipeline generates logs for “run pipeline” events, including when the “run pipeline” job is enqueued, fails, or succeeds. Log statuses include Info, Warning, Error, and Success. You can see the pipeline log in the new pipeline status log:
Figure 3: Pipeline Logs in Unify 5.0.
Known Issues
The following are known issues in this version of Unify. Corrections to these issues will be included in a future version.
- Preview refresh: during a real-time data update, the preview pane may show a partial list of PI tag data. This is because the process is still in progress. Refreshing the page after a few moments should resolve the issue.
- Catalog pane: opening the right-side pane in a catalog view and then returning to a pipeline without closing it may cause the catalog pane to reappear. Closing the pane before navigating away from the page resolves the issue.
- Stream pipelines do not log a “run event”.
We look forward to hearing your feedback on this release!