Ingesting Data from OPC

Sean Ely

Last month we announced the new Data Sources — a robust and easy-to-use data transfer interface within TDengine — and gave a demo of how to ingest data from a PI System; earlier this week we gave a demo of how to ingest data from MQTT. In this blog post we show how to use TDengine Cloud to ingest data directly from an OPC UA server. The ability to configure data interfaces for ingestion is available in both TDengine Cloud and TDengine Enterprise.

Setup is as easy as downloading the data collection agent, installing it on the network where the OPC UA Server is, and providing the agent with the endpoint where you would like to send the data. From there, everything is configured from the TDengine Data Sources page — just configure the Server Endpoint and upload a CSV configuration file and watch the data flow into TDengine.

Setting up the OPC UA Demo with TDengine Cloud

Overall, the demo took the following steps:

  1. Create a Linux VM (Ubuntu 20.04 was used)
  2. Install Docker
  3. Install and connect to OPC UA Server (docker)
  4. Install and configure TaosX-Agent
  5. Create the CSV configuration for mapping the OPC UA data
  6. Set up the OPC UA Data Source in TDengine Cloud
  7. View the results in TDengine

Create the Linux VM and SSH into it

We created an Ubuntu 20.04 Linux VM on Microsoft Azure, downloaded the .pem key, and connected via SSH (example connection below).

ssh -i ~/.ssh/linux-vm_key.pem azureuser@

Install Docker

Once logged into the VM, set up Docker Community Edition:

sudo apt update
sudo apt install software-properties-common curl apt-transport-https ca-certificates -y
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/trusted.gpg.d/docker-archive-keyring.gpg
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
sudo apt install docker-ce docker-ce-cli uidmap -y
sudo usermod -aG docker $USER
newgrp docker

Setup an OPC UA Server with some random data for testing

The IoT Edge OPC UA PLC by Microsoft was used for this demo. This OPC UA server creates some sample nodes which generate random data with some anomalies. Find the image and documentation on Docker Hub.

From the default configuration provided by Microsoft, one addition was made for this demo: “--ut” was added to enable anonymous login. The docker run command is listed below.

docker run --rm -it -p 50000:50000 -p 8080:8080 \
  -v $PWD/certs:/certs/ \
  --name opcplc \
  mcr.microsoft.com/iotedge/opc-plc:latest \
  --pn=50000 --autoaccept --sph --sn=5 --sr=10 --st=uint \
  --fn=5 --fr=1 --ft=uint --gn=5 --ut

Connect to the OPC UA Server using an OPC UA Client

Before trying to connect to the OPC UA server, make sure port 50000 is open for inbound traffic in the network configuration for the Linux VM.

Next, download an OPC UA client; for this demo the Prosys OPC UA Browser was used. Launch the OPC UA Browser and enter a connection string in the format “opc.tcp://<ip>:<port>” (for this demo, the VM’s public IP with port 50000).

Next, browse through the OPC UA tree to the telemetry data to be transferred to TDengine:

Objects > OpcPlc > Telemetry

For this demo we will be using 4 nodes:

  1. FastUInt1 – Type Int
  2. DipData – Type Double
  3. SpikeData – Type Double
  4. StepUp – Type Int

Create the Dataset Configuration CSV file

With the information including the NodeID from the OPC UA Client, create a CSV config file for the OPC UA Dataset.

  1. The first row is a human-readable description of the records and is not used in the configuration.
  2. The second row is the meta header; its fields are parsed strictly to build the table schema described below.
| Column | Key | Required | Description |
|---|---|---|---|
| Point ID | point_id | Yes | The OPC-UA node ID |
| Data Type | type | Yes | The data type of the current node; it should be int, double, or another TDengine data type |
| STable Name | stable | No | By default “opc_{type}” is used as a template; for better context, users are encouraged to define their own STable naming |
| Table Name | tbname | Yes | The table name (child table name under the STable) |
| Enabled | enabled | No | Whether to collect the node or not; `true` by default |
| Timestamp Column Name | ts_col | No | Use the OPC message time as the primary timestamp value |
| Received Timestamp Column | received_time_col | No | Use the taosX collection time as the primary timestamp value |
| Value Column Name | value_col | No | The value column name for each node (under a STable) |
| Quality Column Name | quality_col | No | The quality column name for each node, like the value column |
| Additional Tag | tag::type::name | No | A tag field definition: “tag” is the identity prefix, “type” is a TDengine data type (usually VARCHAR for strings), and “name” is the tag name in the STable. A full example is `tag::varchar(64)::note`. You can add as many tags as you need |

Recommendations for configuring

  1. It is recommended to use different STable names for different data types.
  2. It is recommended to use individual table names for each node.
  3. It is recommended to use the received timestamp as primary timestamp value.

The CSV configuration used in this demo is shared here.
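As a minimal sketch of what such a configuration might look like for the four demo nodes, the following generates a CSV that follows the recommendations above (one STable per data type, one table per node, received timestamp as primary). The node IDs (ns=3;s=...) are placeholders, not values taken from the demo; copy the exact NodeId strings shown in your OPC UA client.

```shell
# Sketch only: the ns=3;s=... node IDs are placeholders -- copy the
# exact NodeId values from the OPC UA client before using this.
cat > opc_demo.csv <<'EOF'
Point ID,Data Type,STable Name,Table Name,Value Column,Received Timestamp
point_id,type,stable,tbname,value_col,received_time_col
ns=3;s=FastUInt1,int,opc_int,fast_uint1,val,ts
ns=3;s=DipData,double,opc_double,dip_data,val,ts
ns=3;s=SpikeData,double,opc_double,spike_data,val,ts
ns=3;s=StepUp,int,opc_int,step_up,val,ts
EOF

# Sanity check: human header + meta header + 4 nodes = 6 lines
wc -l < opc_demo.csv
```

Note how the int nodes and double nodes land in separate STables (opc_int, opc_double) and each node gets its own child table name.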

Install TaosX-Agent

Create a new connection agent in TDengine Cloud and copy the generated configuration into the agent.toml file:

scp -i ~/.ssh/linux-vm_key.pem ~/code/taosx-mqtt/taosx-agent-1.2.3-linux-x64.tar.gz azureuser@
tar -xzf taosx-agent-1.2.3-linux-x64.tar.gz
cd taosx-agent-1.2.3-linux-x64
cd /etc/taos/
sudo vi agent.toml
sudo systemctl start taosx-agent
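The contents of agent.toml come from the connection agent created in TDengine Cloud. As an illustration only (the exact field names may vary by version, so paste the snippet TDengine Cloud generates for your agent rather than typing this), the file pairs the agent with an instance endpoint and a token:

```toml
# Illustrative sketch only -- use the configuration generated on the
# TDengine Cloud page for your connection agent.
endpoint = "<instance-endpoint-from-tdengine-cloud>"
token = "<agent-token-from-tdengine-cloud>"
```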

Configure the OPC UA Data Source in TDengine Cloud

Now that the CSV configuration file has been created and the data collection agent has been installed, configure the data source in TDengine Cloud:

  1. Type: OPC-UA
  2. Select an existing database, or create a new database for the demo
  3. Provide the server endpoint in the format “<IP>:<port>”
  4. Authentication can be left blank for anonymous authentication for the demo
  5. Enable CSV config and upload the CSV file previously created
  6. Adjust any of the other configuration values as desired, or leave them as default and create the data source

View the Results in TDengine Data Explorer
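Once the data source is running, the ingested data can be inspected in the Data Explorer. With the default naming, each node becomes a child table under an STable named by the “opc_{type}” template (or whatever was set in the stable column of the CSV). Assuming a database named demo and the default opc_double STable (both assumptions, adjust to your setup), queries along these lines confirm that data is flowing:

```sql
-- Row counts per node (child table) under the double-type STable
SELECT tbname, COUNT(*) FROM demo.opc_double GROUP BY tbname;

-- A few rows from the STable itself
SELECT * FROM demo.opc_double LIMIT 5;
```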

Sean Ely previously worked as Head of Product at TDengine.