aws-samples/aws-go-wordfreq-sample

Name: aws-go-wordfreq-sample

Owner: AWS Samples

Description: A sample microservice built with the AWS SDK for Go, using concurrency and AWS services.

Created: 2015-10-02 22:25:21.0

Updated: 2017-12-07 19:27:27.0

Pushed: 2016-06-29 05:09:18.0


Size: 33 KB

Language: Go


README

aws-go-wordfreq-sample

Word Frequency is a sample service built with the AWS SDK for Go. The service highlights how the SDK can be used within a concurrent application. It takes advantage of Amazon Simple Storage Service (S3), Amazon Simple Queue Service (SQS), Amazon DynamoDB, and AWS Elastic Beanstalk to collect and report the top 10 most common words of a text file, all in the AWS Cloud.

This sample highlights how the SDK can be used to build an application that reads job messages from an SQS queue whenever a new or existing file is uploaded to S3. An S3 bucket is configured to notify an SQS queue with information about the uploaded file. This job message is read by one of potentially many instances of the Word Frequency service application. The service JSON decodes the message, extracting the bucket and key of the uploaded object. Once parsed and added to a job channel, a worker goroutine within a pool of workers reads the job from the channel, streams the object's content from S3, and counts the words. When complete, the worker sends the results to a results channel so that they can be recorded to DynamoDB and also sent to an SQS result queue for further processing.
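The sketch below is a rough, self-contained illustration of that flow, not the repository's actual code; the struct, channel, and field names are illustrative. It decodes an S3 bucket notification, as it would arrive in an SQS message body, into a bucket/key job and hands the job to a small pool of worker goroutines over a channel.

package main

import (
	"encoding/json"
	"fmt"
	"sync"
)

// s3Event mirrors the parts of an S3 event notification the service needs:
// the bucket name and object key of the uploaded file.
type s3Event struct {
	Records []struct {
		S3 struct {
			Bucket struct {
				Name string `json:"name"`
			} `json:"bucket"`
			Object struct {
				Key string `json:"key"`
			} `json:"object"`
		} `json:"s3"`
	} `json:"Records"`
}

// job identifies a single uploaded object for a worker to process.
type job struct {
	Bucket, Key string
}

// decodeJob extracts the bucket and key from a raw SQS message body.
func decodeJob(body string) (job, error) {
	var evt s3Event
	if err := json.Unmarshal([]byte(body), &evt); err != nil {
		return job{}, err
	}
	if len(evt.Records) == 0 {
		return job{}, fmt.Errorf("no records in message")
	}
	rec := evt.Records[0].S3
	return job{Bucket: rec.Bucket.Name, Key: rec.Object.Key}, nil
}

func main() {
	jobs := make(chan job)
	results := make(chan string)

	// Pool of worker goroutines pulling jobs off the channel.
	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobs {
				// In the real service the worker streams the object from S3
				// and counts its words; here the job is simply echoed.
				results <- fmt.Sprintf("processed s3://%s/%s", j.Bucket, j.Key)
			}
		}()
	}
	go func() { wg.Wait(); close(results) }()

	// A message body as an S3 bucket notification would deliver it.
	body := `{"Records":[{"s3":{"bucket":{"name":"my-bucket"},"object":{"key":"my-file.txt"}}}]}`
	if j, err := decodeJob(body); err == nil {
		jobs <- j
	}
	close(jobs)

	for r := range results {
		fmt.Println(r)
	}
}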

This package is made up of a set of executable commands.

uploads3

CLI application to upload a file from your local system to S3. It takes advantage of the S3 Upload Manager's concurrent multipart uploads.

Command line usage:

uploads3 my-bucket my-filename

An additional environment variable can be set to instruct the uploads3 command to wait for the file to be processed and print the results to the console when they are available.
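The snippet below is a minimal sketch of how a command like uploads3 might use the SDK's s3manager Upload Manager; the argument handling and the use of the filename as the object key are assumptions for illustration, not the sample's exact code.

package main

import (
	"fmt"
	"log"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

func main() {
	if len(os.Args) != 3 {
		log.Fatal("usage: uploads3 <bucket> <filename>")
	}
	bucket, filename := os.Args[1], os.Args[2]

	f, err := os.Open(filename)
	if err != nil {
		log.Fatalf("failed to open %s: %v", filename, err)
	}
	defer f.Close()

	// The Upload Manager splits large files into parts and uploads
	// the parts concurrently.
	sess := session.Must(session.NewSession())
	uploader := s3manager.NewUploader(sess)

	out, err := uploader.Upload(&s3manager.UploadInput{
		Bucket: aws.String(bucket),
		Key:    aws.String(filename), // illustrative: the filename is reused as the object key
		Body:   f,
	})
	if err != nil {
		log.Fatalf("upload failed: %v", err)
	}
	fmt.Println("uploaded to", out.Location)
}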

worker

Service application which reads job messages from an SQS queue, counts the top 10 words, records the results to DynamoDB, and also sends the results to an additional SQS queue for further processing.

Requires the following environment variables to be set.

Optionally, the following environment variables can be provided.
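As a rough sketch of the counting step, the program below streams an object from S3 and reports its ten most frequent words. The wordCount type, the topWords helper, and the standalone main are illustrative assumptions, not the worker's actual implementation, which also writes its results to DynamoDB and the result queue.

package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"sort"
	"strings"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

// wordCount pairs a word with the number of times it appeared.
type wordCount struct {
	Word  string
	Count int
}

// topWords streams the object's contents from S3 and returns the ten
// most frequent words, without buffering the whole file in memory.
func topWords(svc *s3.S3, bucket, key string) ([]wordCount, error) {
	obj, err := svc.GetObject(&s3.GetObjectInput{
		Bucket: aws.String(bucket),
		Key:    aws.String(key),
	})
	if err != nil {
		return nil, err
	}
	defer obj.Body.Close()

	counts := map[string]int{}
	scanner := bufio.NewScanner(obj.Body)
	scanner.Split(bufio.ScanWords)
	for scanner.Scan() {
		counts[strings.ToLower(scanner.Text())]++
	}
	if err := scanner.Err(); err != nil {
		return nil, err
	}

	all := make([]wordCount, 0, len(counts))
	for w, c := range counts {
		all = append(all, wordCount{Word: w, Count: c})
	}
	sort.Slice(all, func(i, j int) bool { return all[i].Count > all[j].Count })
	if len(all) > 10 {
		all = all[:10]
	}
	return all, nil
}

func main() {
	if len(os.Args) != 3 {
		log.Fatal("usage: topwords <bucket> <key>")
	}
	sess := session.Must(session.NewSession())

	top, err := topWords(s3.New(sess), os.Args[1], os.Args[2])
	if err != nil {
		log.Fatal(err)
	}
	for _, wc := range top {
		fmt.Printf("%s\t%d\n", wc.Word, wc.Count)
	}
}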

createTable

CLI application to show how the SDK can be used to create a DynamoDB table, which the worker uses to record job results.

Command line usage:

createTable my-tablename
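A minimal sketch of creating such a table with the SDK is shown below; the Path attribute name, hash-key schema, and throughput values are assumptions for illustration and may not match the table the worker actually uses.

package main

import (
	"log"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/dynamodb"
)

func main() {
	if len(os.Args) != 2 {
		log.Fatal("usage: createTable <tablename>")
	}
	tableName := os.Args[1]

	sess := session.Must(session.NewSession())
	svc := dynamodb.New(sess)

	// Create a table keyed by a string attribute; the real table's key
	// schema may differ, this is only an illustration.
	_, err := svc.CreateTable(&dynamodb.CreateTableInput{
		TableName: aws.String(tableName),
		AttributeDefinitions: []*dynamodb.AttributeDefinition{
			{AttributeName: aws.String("Path"), AttributeType: aws.String(dynamodb.ScalarAttributeTypeS)},
		},
		KeySchema: []*dynamodb.KeySchemaElement{
			{AttributeName: aws.String("Path"), KeyType: aws.String(dynamodb.KeyTypeHash)},
		},
		ProvisionedThroughput: &dynamodb.ProvisionedThroughput{
			ReadCapacityUnits:  aws.Int64(1),
			WriteCapacityUnits: aws.Int64(1),
		},
	})
	if err != nil {
		log.Fatal("create table failed: ", err)
	}

	// Block until DynamoDB reports the table as ready for use.
	if err := svc.WaitUntilTableExists(&dynamodb.DescribeTableInput{TableName: aws.String(tableName)}); err != nil {
		log.Fatal(err)
	}
	log.Println("table", tableName, "created")
}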
