kikinteractive/docker-airflow

Name: docker-airflow

Owner: Kik Interactive

Description: Docker Apache Airflow

Forked from: puckel/docker-airflow

Created: 2017-04-13 14:45:37.0

Updated: 2017-04-13 14:45:39.0

Pushed: 2017-04-10 18:51:18.0

Homepage:

Size: 83

Language: Shell


README

docker-airflow


This repository contains the Dockerfile for Apache Airflow, used for Docker's automated build published to the public Docker Hub Registry.

Information
Installation

Pull the image from the Docker repository.

    docker pull puckel/docker-airflow
Build

For example, if you need to install Extra Packages, edit the Dockerfile and then build the image:

    docker build --rm -t puckel/docker-airflow .
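As an illustration, a line like the following could be added to the Dockerfile to bake in an Airflow extra. The extra chosen here, postgres, is only an example, and the PyPI package name (airflow vs. apache-airflow) depends on the Airflow version the image pins:

```dockerfile
# Hypothetical addition: install the postgres extra so the Postgres hook and
# operators are available in the image. The extra name is illustrative; pick
# whichever Extra Packages your DAGs need.
RUN pip install "apache-airflow[postgres]"
```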
Usage

By default, docker-airflow runs Airflow with the SequentialExecutor:

    docker run -d -p 8080:8080 puckel/docker-airflow

If you want to run another executor, use the other docker-compose.yml files provided in this repository.

For the LocalExecutor:

    docker-compose -f docker-compose-LocalExecutor.yml up -d

For the CeleryExecutor:

    docker-compose -f docker-compose-CeleryExecutor.yml up -d

NB: If you don't want the example DAGs loaded (default=True), you have to set the following environment variable:

LOAD_EX=n

    docker run -d -p 8080:8080 -e LOAD_EX=n puckel/docker-airflow

If you want to use the Ad hoc query feature, make sure you've configured the connections: go to Admin -> Connections, edit “postgres_default”, and set these values so they match the values in airflow.cfg / docker-compose*.yml.

For encrypted connection passwords (with the Local or Celery executor), all containers must share the same fernet_key. By default, docker-airflow generates a fernet_key at startup, so you have to set an environment variable in the docker-compose file (e.g. docker-compose-LocalExecutor.yml) to use the same key across containers. To generate a fernet_key:

    python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
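To see why the key has to match across containers, here is a small sketch of how a fernet key encrypts and decrypts a secret; it uses the same `cryptography` package as the one-liner above (a dependency of Airflow itself):

```python
# Sketch: generate a fernet key the way the one-liner does, then check that it
# round-trips a secret -- the same mechanism Airflow uses for connection
# passwords. Requires the `cryptography` package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # url-safe base64, 44 characters once decoded
f = Fernet(key)
token = f.encrypt(b"my-connection-password")
assert f.decrypt(token) == b"my-connection-password"
print(key.decode())           # share this value with every container
```

A container started with a different key cannot decrypt passwords written by another, which is why the generated key must be pinned in the docker-compose file rather than regenerated at each startup.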

Check Airflow Documentation

Install custom python package
UI Links

When using OSX with boot2docker, use: open http://$(boot2docker ip):8080

Scale the number of workers

Easy scaling using docker-compose:

    docker-compose scale worker=5

This can be used to scale to a multi-node setup using Docker swarm.

Wanna help?

Fork, improve and PR. ;-)


This work is supported by the National Institutes of Health's National Center for Advancing Translational Sciences, Grant Number U24TR002306. This work is solely the responsibility of the creators and does not necessarily represent the official views of the National Institutes of Health.