TDengine Reader
The TDengine Reader plugin is used to read data from TDengine, the time-series database developed by TAOS Data.
Prerequisites
For performance reasons, this plugin uses TDengine's JDBC-JNI driver, which directly calls the client library (libtaos.so or taos.dll) to send write and query requests to taosd instances. The dynamic library therefore needs to be set up before use.
First copy plugin/reader/tdenginereader/libs/libtaos.so.2.0.16.0 to the /usr/lib64 directory, then execute the following commands to create the soft links:

```shell
ln -sf /usr/lib64/libtaos.so.2.0.16.0 /usr/lib64/libtaos.so.1
ln -sf /usr/lib64/libtaos.so.1 /usr/lib64/libtaos.so
```

Example
TDengine comes with a demo database generated by the taosdemo tool. The following job reads some data from the demo database and prints it to the terminal.
The following is the configuration file:
```json
{
  "job": {
    "setting": {
      "speed": {
        "channel": 3
      },
      "errorLimit": {
        "record": 0,
        "percentage": 0.02
      }
    },
    "content": {
      "reader": {
        "name": "tdenginereader",
        "parameter": {
          "username": "root",
          "password": "taosdata",
          "beginDateTime": "2017-07-14 10:40:00",
          "endDateTime": "2017-08-14 10:40:00",
          "splitInterval": "1d",
          "connection": {
            "jdbcUrl": "jdbc:TAOS://127.0.0.1:6030/test",
            "querySql": [
              "select * from test.meters where ts <'2017-07-14 10:40:02' and loc='beijing' limit 10"
            ]
          }
        }
      },
      "writer": {
        "name": "streamwriter",
        "parameter": {
          "print": true
        }
      }
    }
  }
}
```

Save the above configuration file as job/tdengine2stream.json.
Execute Collection Command
Execute the following command to start the collection:

```shell
bin/addax.sh job/tdengine2stream.json
```

The command output is similar to the following:
Details
```
2021-02-20 15:32:23.161 [main] INFO VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2021-02-20 15:32:23.229 [main] INFO Engine -
{
  "content":
  {
    "reader":{
      "parameter":{
        "password":"*****",
        "connection":[
          {
            "querySql":[
              "select * from test.meters where ts <'2017-07-14 10:40:02' and loc='beijing' limit 100"
            ],
            "jdbcUrl":[
              "jdbc:TAOS://127.0.0.1:6030/test"
            ]
          }
        ],
        "username":"root"
      },
      "name":"tdenginereader"
    },
    "writer":{
      "parameter":{
        "print":true
      },
      "name":"streamwriter"
    }
  },
  "setting":{
    "errorLimit":{
      "record":0,
      "percentage":0.02
    },
    "speed":{
      "channel":3
    }
  }
}
2021-02-20 15:32:23.277 [main] INFO PerfTrace - PerfTrace traceId=job_-1, isEnable=false, priority=0
2021-02-20 15:32:23.278 [main] INFO JobContainer - Addax jobContainer starts job.
2021-02-20 15:32:23.281 [main] INFO JobContainer - Set jobId = 0
java.library.path:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
....
2021-02-20 15:32:23.687 [0-0-0-reader] INFO CommonRdbmsReader$Task - Begin to read record by Sql: [select * from test.meters where ts <'2017-07-14 10:40:02' and loc='beijing' limit 100
] jdbcUrl:[jdbc:TAOS://127.0.0.1:6030/test].
2021-02-20 15:32:23.692 [0-0-0-reader] WARN DBUtil - current database does not supoort TYPE_FORWARD_ONLY/CONCUR_READ_ONLY
2021-02-20 15:32:23.740 [0-0-0-reader] INFO CommonRdbmsReader$Task - Finished read record by Sql: [select * from test.meters where ts <'2017-07-14 10:40:02' and loc='beijing' limit 100
] jdbcUrl:[jdbc:TAOS://127.0.0.1:6030/test].
1500000001000	5	5	0	1	beijing
1500000001000	0	6	2	1	beijing
1500000001000	7	0	0	1	beijing
1500000001000	8	9	6	1	beijing
1500000001000	9	9	1	1	beijing
1500000001000	8	2	0	1	beijing
1500000001000	4	5	5	3	beijing
1500000001000	3	3	3	3	beijing
1500000001000	5	4	8	3	beijing
1500000001000	9	4	6	3	beijing
2021-02-20 15:32:26.689 [job-0] INFO JobContainer -
Job start time            : 2021-02-20 15:32:23
Job end time              : 2021-02-20 15:32:26
Total elapsed time        : 3s
Average traffic           : 800B/s
Record write speed        : 33rec/s
Total records read        : 100
Total read/write failures : 0
```

Parameters
| Configuration | Required | Type | Default Value | Description |
|---|---|---|---|---|
| jdbcUrl | Yes | list | None | JDBC connection information of the source database; note that TAOS here must be uppercase |
| username | Yes | string | None | Username of the data source |
| password | No | string | None | Password for the specified username of the data source |
| table | Yes | list | None | Names of the tables to be synchronized, in JSON array format. When multiple tables are configured, the user must ensure they all have the same structure |
| column | Yes | list | None | Collection of column names to be synchronized from the configured table; see the description in rdbmsreader |
| where | No | string | None | Filtering condition for the table |
| querySql | No | list | None | Custom SQL used instead of the specified table to fetch data. When this item is configured, Addax ignores the table and column settings |
| beginDateTime | Yes | string | None | Start of the data time range; the job migrates data from beginDateTime to endDateTime, in yyyy-MM-dd HH:mm:ss format |
| endDateTime | Yes | string | None | End of the data time range; the job migrates data from beginDateTime to endDateTime, in yyyy-MM-dd HH:mm:ss format |
| splitInterval | Yes | string | None | Interval used to split the job; one task is created per splitInterval |
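When querySql is not used, the table, column and where parameters describe what to read. The fragment below is only an illustration of how these parameters fit together; the column names are taken from the taosdemo meters table and the exact placement of table inside connection is an assumption here:

```json
"parameter": {
  "username": "root",
  "password": "taosdata",
  "column": ["ts", "current", "voltage", "phase"],
  "where": "loc = 'beijing'",
  "connection": {
    "table": ["meters"],
    "jdbcUrl": "jdbc:TAOS://127.0.0.1:6030/test"
  }
}
```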
splitInterval
Used to split the job into tasks. For example, 20d means one task is created for every 20 days of data. Configurable time units:

- d (day)
- h (hour)
- m (minute)
- s (second)
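The splitting behaviour can be sketched as follows. This is an illustrative Python model of how a time range is divided by splitInterval, not the plugin's actual Java implementation:

```python
from datetime import datetime, timedelta

# Seconds per splitInterval unit: d (day), h (hour), m (minute), s (second)
UNITS = {"d": 86400, "h": 3600, "m": 60, "s": 1}

def split_tasks(begin, end, interval):
    """Split [begin, end) into sub-ranges of at most `interval` length,
    one per task, e.g. interval '1d' yields one task per day."""
    step = timedelta(seconds=int(interval[:-1]) * UNITS[interval[-1]])
    fmt = "%Y-%m-%d %H:%M:%S"
    start = datetime.strptime(begin, fmt)
    stop = datetime.strptime(end, fmt)
    tasks = []
    while start < stop:
        tasks.append((start, min(start + step, stop)))
        start += step
    return tasks

# The example job above covers exactly one month with splitInterval "1d"
tasks = split_tasks("2017-07-14 10:40:00", "2017-08-14 10:40:00", "1d")
print(len(tasks))  # 31 daily tasks
```

With splitInterval set to 20d, the same one-month range would produce 2 tasks instead.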
Using JDBC-RESTful Interface
If you don't want to depend on the local library, or don't have permission to install it, you can use the JDBC-RESTful interface to access TDengine instead. Compared to JDBC-JNI, the configuration differences are:

- driverClass is specified as com.taosdata.jdbc.rs.RestfulDriver
- jdbcUrl starts with jdbc:TAOS-RS://
- the connection port is 6041
So the connection in the above configuration should be modified as follows:
```json
{
  "connection": [
    {
      "querySql": [
        "select * from test.meters where ts <'2017-07-14 10:40:02' and loc='beijing' limit 100"
      ],
      "jdbcUrl": ["jdbc:TAOS-RS://127.0.0.1:6041/test"],
      "driver": "com.taosdata.jdbc.rs.RestfulDriver"
    }
  ]
}
```

Type Conversion
| Addax Internal Type | TDengine Data Type |
|---|---|
| Long | SMALLINT, TINYINT, INT, BIGINT, TIMESTAMP |
| Double | FLOAT, DOUBLE |
| String | BINARY, NCHAR |
| Boolean | BOOL |
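The mapping above can be expressed as a simple lookup table. This Python sketch merely restates the table for quick reference; the actual conversion happens inside the plugin's Java code:

```python
# Illustrative lookup of the TDengine -> Addax internal type mapping
TDENGINE_TO_ADDAX = {
    "SMALLINT": "Long", "TINYINT": "Long", "INT": "Long",
    "BIGINT": "Long", "TIMESTAMP": "Long",
    "FLOAT": "Double", "DOUBLE": "Double",
    "BINARY": "String", "NCHAR": "String",
    "BOOL": "Boolean",
}

print(TDENGINE_TO_ADDAX["TIMESTAMP"])  # Long
```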
Currently Supported Versions
TDengine 2.0.16
Notes
- The TDengine JDBC-JNI driver version and the dynamic library version must match one-to-one. Therefore, if your TDengine version is not 2.0.16, you need to replace both the dynamic library and the JDBC driver in the plugin directory.