IBMStreams/streamsx.eventstore

Name: streamsx.eventstore

Owner: IBM Streams

Created: 2017-07-20 13:52:38

Updated: 2017-12-19 17:04:01

Pushed: 2017-12-20 00:25:53

Size: 212

Language: Scala

README

streamsx.eventstore

Description

This toolkit contains operators that enable you to connect IBM Streams to IBM Db2 Event Store.

Currently, this toolkit contains one operator: a sink operator called EventStoreSink for inserting IBM Streams tuples into an IBM Db2 Event Store table.

To use this operator, you must have an existing IBM Db2 Event Store database, and the IBM Db2 Event Store cluster or server must be running.

You can find precompiled EventStoreSink toolkits for various IBM Db2 Event Store releases here: https://github.com/IBMStreams/streamsx.eventstore/releases

EventStoreSink

Important: The tuple field names, types, and positions in the IBM Streams schema must exactly match the corresponding columns in your IBM Db2 Event Store table schema.

The EventStoreSink operator has three required parameters.

This sink operator can execute within a Consistent Region.

Changes

This is the first version of the toolkit, so there are no changes yet!

Supported versions
Data types

| SPL type | Support status | Event Store type |
| -------- | -------------- | ---------------- |
| boolean | Supported | Boolean |
| enum | Not supported | N/A |
| int8 | Supported | Byte |
| int16 | Supported | Short |
| int32 | Supported | Int |
| int64 | Supported | Long |
| uint8 | Supported | Byte |
| uint16 | Supported | Short |
| uint32 | Supported | Int |
| uint64 | Supported | Long |
| float32 | Supported | Float |
| float64 | Supported | Double |
| decimal32 | Not supported | N/A |
| decimal64 | Not supported | N/A |
| decimal128 | Not supported | N/A |
| complex32 | Not supported | N/A |
| complex64 | Not supported | N/A |
| timestamp | Supported | java.sql.Timestamp |
| rstring | Supported | String |
| ustring | Supported | String |
| blob | Not supported | N/A |
| xml | Not supported | N/A |
| list | Only one level supported | Array |
| bounded list type | Not supported | N/A |
| set | Only one level supported | Array |
| bounded set type | Not supported | N/A |
| map | Only one level supported | Map |
| bounded map type | Not supported | N/A |
| tuple | Not supported | N/A |

In the preceding table, “only one level supported” means that the elements of the list or set, and the keys and values of the map, are primitive data types.

Additional documentation

Java equivalents for SPL types

Installation

Using the distribution
  1. Clone the git repository to the VM where IBM Streams is installed.

  2. In the root of the cloned repository, create a directory called impl/lib to hold the generated JAR file for the sink operator.

  3. In the root directory, compile the sink operator (producing the JAR files, the project and target directories, and the toolkit.xml file) by running one of the following commands:

     sbt toolkit

     ./recompile.sh

  4. Use the following steps to add the directory as a toolkit location in Streams Studio:

     1. Open IBM Streams Studio.

     2. Open the Streams Explorer view.

     3. Expand the IBM Streams Installation section, and then expand the IBM Streams subsection.

     4. Right-click Toolkit Locations.

     5. Select Add Toolkit Location.

     6. Select the directory that contains the cloned repository and click OK.

Tip: If you don't want to use a clone, replace steps 1-3 by downloading a toolkit release of streamsx.eventstore that corresponds to your IBM Db2 Event Store release to a local directory. In step 4, use the directory where the toolkit is saved as the toolkit location.
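For reference, steps 1-3 above can be sketched as the following command transcript (the clone directory name follows the repository name; adjust paths for your VM):

```
git clone https://github.com/IBMStreams/streamsx.eventstore.git
cd streamsx.eventstore
mkdir -p impl/lib      # directory for the generated JAR file
sbt toolkit            # or: ./recompile.sh
```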

Building from source

The build instructions assume the following setup:

Updating to a new version

If you already installed the toolkit by following the instructions in Installing the toolkit from scratch and need to update to a new version, complete the following steps:

  1. Change to your toolkit folder on your virtual machine and run the following commands:

    git pull
    sbt toolkit
    

    Alternatively, change to the streamsx.eventstore directory and run:

    ./recompile.sh

  2. Refresh the toolkit location in Streams Studio (as specified in Using the distribution).

Installing the toolkit from scratch

In these instructions, your virtual machine is assumed to be the Streams Quick Start Edition (QSE) VM. These instructions were written for Streams 4.2.0.

  1. Install sbt on your virtual machine. See the instructions for Red Hat here: http://www.scala-sbt.org/0.13/docs/Installing-sbt-on-Linux.html.

  2. Clone the https://github.com/IBMStreams/streamsx.eventstore repository on the virtual machine file system. (It doesn't need to be in your Eclipse workspace.)

  3. Set up IBM Db2 Event Store on your remote machine and create a local configuration file that references the remote IBM Db2 Event Store daemon.

    For more information, see Setting up the reference to the IBM Db2 Event Store daemon on your virtual machine.

    For more information on installing IBM Db2 Event Store, see https://www.ibm.com/support/knowledgecenter/SSGNPV/eventstore/welcome.html.

  4. In the top level of the repository, run sbt toolkit or ./recompile.sh.

    Tip: You might need to create impl/lib/ in the repository for it to run properly.

  5. Create a new IBM Streams project. Add the location of the repository as a toolkit location.

  6. Start IBM Db2 Event Store on the remote machine. (Depending on the version of IBM Db2 Event Store that you have installed, this could be your local host or your cluster.)

  7. Write an IBM Streams test project with EventStoreSink as the sink operator.

Configuration and setup for sample project

Setting up IBM Db2 Event Store

For more information on installing IBM Db2 Event Store, see: https://www.ibm.com/support/knowledgecenter/SSGNPV/eventstore/welcome.html

On your IBM Streams system, either download the precompiled toolkit that corresponds to your IBM Db2 Event Store edition or clone the https://github.com/IBMStreams/streamsx.eventstore repository.

Note: If you clone the repository, you might need to edit the build.sbt file so that the IBM Db2 Event Store client JAR file corresponds to the IBM Db2 Event Store release where you want to insert data. For example, to get the client JAR for IBM Db2 Event Store Enterprise Edition from Maven, the build.sbt file has the line:

"com.ibm.event" % "ibm-db2-event-store-client" % "1.1.0"

To use IBM Db2 Event Store Developer Edition version 1.1.2, comment out "com.ibm.event" % "ibm-db2-event-store-client" % "1.1.0" and uncomment the following line:

"com.ibm.event" % "ibm-db2-event-store-desktop-client" % "1.1.2"
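Put together, the relevant fragment of build.sbt might look like the following sketch. The Seq wrapper and comments are illustrative; only the dependency coordinates are quoted from this README:

```scala
// build.sbt fragment (sketch): keep exactly one Event Store client
// dependency uncommented, matching the edition you are targeting.
libraryDependencies ++= Seq(
  // IBM Db2 Event Store Enterprise Edition client:
  // "com.ibm.event" % "ibm-db2-event-store-client" % "1.1.0",
  // IBM Db2 Event Store Developer Edition client:
  "com.ibm.event" % "ibm-db2-event-store-desktop-client" % "1.1.2"
)
```

After editing the dependency, rerun sbt toolkit (or ./recompile.sh) so the toolkit is rebuilt against the matching client JAR.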

Setting up the reference to the IBM Db2 Event Store daemon on your virtual machine

When you start IBM Db2 Event Store on a remote machine, the daemon should also start automatically.

To enable the EventStoreSink operator to connect to your remote IBM Db2 Event Store installation, you must determine the connection endpoint string. The connection endpoint string is a comma-separated list of entries of the form <hostname>:<portnumber>.

Enter this value for the connectionString parameter. For example: 9.26.150.75:1101,9.26.150.76:1101

Tip: To connect to IBM Db2 Event Store Developer Edition, use the external IP address for the work station where IBM Db2 Event Store is running. Use the same port number that is specified in the sample notebooks that are available in the Community section of the IBM Db2 Event Store end user client.

If you are running IBM Db2 Event Store Developer Edition on a Mac, you can find the external IP address in System Preferences > Network.

Event Store sink operator parameters

You can define the following parameters for the Event Store:

Optional output port

The EventStoreSink operator has an optional output port that reports whether each tuple was successfully inserted into the database. EventStoreSink looks for a Boolean field called “Inserted” in the output stream and sets it to true if the data was successfully inserted and false if the insert failed.

In addition to the “Inserted” field, the output includes the original tuple that was processed by the EventStoreSink operator.

The output stream can be used to tie in with an SQS operator, which can use the “Inserted” flag to remove successfully inserted tuples from the SQS queue. If a tuple is not inserted, it must remain in the queue so that it can be resubmitted for insertion.

To add the output port for the sink operator, add the output port explicitly in the operator properties and define the schema for the output stream.
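As an illustration, an invocation with the optional output port might look like the following SPL sketch. The stream names, schema, and addresses here are made up for this example; only the connectionString parameter name is taken from this README, and the operator's remaining required parameters are left as a placeholder comment:

```spl
// Hypothetical sketch, not from the toolkit documentation.
// The output schema repeats the input attributes and adds the
// boolean "Inserted" flag that EventStoreSink sets per tuple.
stream<rstring deviceId, float64 reading, boolean Inserted> InsertResults
    = EventStoreSink(Readings)
{
    param
        connectionString : "9.26.150.75:1101,9.26.150.76:1101" ;
        // ... plus the operator's other required parameters that
        // identify the target database and table.
}
```

A downstream operator can then filter InsertResults on Inserted == false to resubmit tuples whose insert failed.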

