Name: ibm-cloud-functions-refarch-data-processing-message-hub
Owner: International Business Machines
Description: An IBM Cloud Functions serverless reference architecture for responding to messages or handling streams of data records.
Created: 2018-03-05 15:53:01
Updated: 2018-04-21 02:17:48
Pushed: 2018-03-26 13:42:54
Size: 227
Language: Shell
This project deploys a reference architecture with IBM Cloud Functions to execute code in response to messages or to handle streams of data records. No code runs until messages arrive via IBM Message Hub (powered by Apache Kafka). When that happens, function instances are started and automatically scale to match the load needed to handle the stream of messages.
You can learn more about the benefits of building a serverless architecture for this use case in the accompanying IBM Code Pattern.
Deploy this reference architecture:
If you haven't already, sign up for an IBM Cloud account then go to the Cloud Functions dashboard to explore other reference architecture templates and download command line tools, if needed.
The application deploys two IBM Cloud Functions (based on Apache OpenWhisk) that read from and write messages to IBM Message Hub (based on Apache Kafka). This demonstrates how to work with data services and execute logic in response to message events.
One function, or action, is triggered by message streams of one or more data records. These records are piped to another action in a sequence (a way to link actions declaratively in a chain). The second action aggregates the messages and posts a transformed summary message to another topic.
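Sequences like this can also be created by hand with the CLI. A minimal sketch, assuming two already-deployed actions with the hypothetical names receive-consume and transform-produce:

```shell
# Hypothetical action names; the output of receive-consume becomes
# the input of transform-produce.
bx wsk action create message-processing-sequence \
  --sequence receive-consume,transform-produce
```

Invoking the sequence runs both actions in order, so the caller only ever deals with a single entry point.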
Choose “Start Creating” and select “Deploy template” then “Message Hub Events” from the list. A wizard will then take you through configuration and connection to event sources step-by-step.
Behind the scenes, the UI uses the wskdeploy tool, which you can also use directly from the CLI by following the steps in the next section.
Deploy using the wskdeploy command line tool
This approach deploys the Cloud Functions actions, triggers, and rules using the runtime-specific manifest file available in this repository.
Install the bx CLI and the Cloud Functions plugin.
Install the wskdeploy CLI.
Provision an instance of Message Hub named kafka-broker. On the "Manage" tab of your Message Hub console, create two topics: in-topic and out-topic. On the "Service credentials" tab, make sure to add a new credential named Credentials-1.
Copy template.local.env to a new file named local.env and update the KAFKA_INSTANCE, SRC_TOPIC, and DEST_TOPIC values for your instance if they differ.
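With the defaults above, the resulting local.env would look something like this sketch (the values shown are the names used earlier in this guide; substitute your own if they differ):

```shell
# local.env — sourced before deployment to configure the manifest.
KAFKA_INSTANCE=kafka-broker
SRC_TOPIC=in-topic
DEST_TOPIC=out-topic
```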
Deploy with wskdeploy
Clone a local copy of this repository:
git clone https://github.com/IBM/ibm-cloud-functions-refarch-data-processing-message-hub.git
cd ibm-cloud-functions-refarch-data-processing-message-hub
Make service credentials available to your environment:
source local.env
bx wsk package refresh
Deploy the packages, actions, triggers, and rules using your preferred language runtime:
cd runtimes/nodejs # Or runtimes/[php|python|swift]
wskdeploy
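Taken together, the steps above amount to the following end-to-end sketch (assuming you are already logged in with bx and have the Cloud Functions plugin and wskdeploy installed):

```shell
git clone https://github.com/IBM/ibm-cloud-functions-refarch-data-processing-message-hub.git
cd ibm-cloud-functions-refarch-data-processing-message-hub
source local.env        # load KAFKA_INSTANCE, SRC_TOPIC, DEST_TOPIC
bx wsk package refresh  # bind your Message Hub credentials as a package
cd runtimes/nodejs      # or runtimes/php, runtimes/python, runtimes/swift
wskdeploy               # deploy everything declared in the manifest
```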
Undeploy with wskdeploy
Undeploy the packages, actions, triggers, and rules:
wskdeploy undeploy
Deploy using the bx wsk command line tool
This approach shows you how to deploy the individual packages, actions, triggers, and rules with CLI commands. It helps you understand and control the underlying deployment artifacts.
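For example, the trigger that fires on new messages and the rule that connects it to the processing sequence might be created as sketched below. The trigger, rule, and sequence names are hypothetical, and the package name assumes the Bluemix_&lt;instance&gt;_&lt;credential&gt; binding produced by bx wsk package refresh:

```shell
# Hypothetical names; assumes the actions and sequence already exist.
# Fire a trigger whenever messages arrive on the source topic.
bx wsk trigger create message-received-trigger \
  --feed Bluemix_kafka-broker_Credentials-1/messageHubFeed \
  --param isJSONData true \
  --param topic in-topic

# Route trigger firings to the processing sequence.
bx wsk rule create message-rule \
  message-received-trigger message-processing-sequence
```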
This approach sets up a continuous delivery pipeline that redeploys on changes to a personal clone of this repository. It may be of interest when setting up an overall software delivery lifecycle around Cloud Functions that redeploys automatically as changes are pushed to a Git repository.