IBM/MAX

Name: MAX

Owner: International Business Machines

Description: null

Created: 2018-03-01 23:27:23.0

Updated: 2018-05-12 06:16:02.0

Pushed: 2018-03-12 06:37:21.0

Homepage: null

Size: 55

Language: null

GitHub Committers

User | Most Recent Commit | # Commits

Other Committers

User | Email | Most Recent Commit | # Commits

README

Template for Model Asset eXchange (MAX) Models

MAX-Inception V3 Model
Description

Keras is a deep learning library that you can use in conjunction with TensorFlow and several other deep learning backends. This is the Keras Inception V3 model, with weights pre-trained on ImageNet. Since its first introduction, Inception has been one of the best-performing families of models on the ImageNet dataset.
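To make this concrete, here is a minimal sketch (not code from this repository) of loading the same pre-trained Inception V3 model directly with Keras and classifying one of the test images from the assets folder:

```python
# Minimal sketch: load Keras' Inception V3 with ImageNet weights and classify a test image.
# Uses only standard Keras application APIs; the image path assumes the repo's assets folder.
import numpy as np
from keras.applications.inception_v3 import InceptionV3, preprocess_input, decode_predictions
from keras.preprocessing import image

model = InceptionV3(weights='imagenet')  # ImageNet weights are downloaded on first use

img = image.load_img('assets/dog.jpg', target_size=(299, 299))  # Inception V3 expects 299x299 inputs
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

for _, label, prob in decode_predictions(model.predict(x), top=5)[0]:
    print(label, round(float(prob), 4))
```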

Model Metadata

| Domain | Industry | Framework | Datasets | Data Format |
| ------ | -------- | --------- | -------- | ----------- |
| Vision | General | TensorFlow and Theano | ImageNet | channels_first, channels_last |
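The Data Format column refers to Keras' image dimension ordering. A minimal sketch of inspecting and switching it, using standard Keras backend calls (not repository code):

```python
# Minimal sketch: "channels_last" means arrays shaped (height, width, channels), the TensorFlow default;
# "channels_first" means (channels, height, width), the historical Theano default.
from keras import backend as K

print(K.image_data_format())               # e.g. 'channels_last'
K.set_image_data_format('channels_first')  # switch if your input arrays use the other ordering
print(K.image_data_format())
```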

References
Licenses

| Component | License | Link |
| --------- | ------- | ---- |
| This repository | Apache 2.0 | LICENSE |
| Model Weights | CC BY License | ABC |
| Model Code (3rd party) | MIT | ABC |
| Test assets | Various | Asset README |

Quick Start

1. Quick Start

Download and run the image from Dockerhub

Dockerhub link and running instruction
Pre-requisite
Steps
Part A: Build, deploy and use the Model using Docker
  1. Build the Model
  2. Deploy the Model
  3. Use the Model
  4. Development
  5. Clean Up
Part B: Train, deploy and score the Model using Watson Studio Deep Learning
  1. Train and Deploy using Watson Studio
  2. Troubleshooting
Part A: Build, deploy and use the Model using Docker
1. Build the Model

Clone the MAX repository locally. In a terminal, run the following command:

git clone https://github.com/IBM/MAX.git

Change directory into the repository base folder: $ cd MAX.

To build the docker image locally, run:

docker build -t mae-keras -f Dockerfile .

The docker image uses the nvidia/cuda base image, so it will need to pull that from Dockerhub, which may take some time. Note that this docker image is currently CPU only (GPU support will be added later).

2. Deploy the Model

To run the docker image, which automatically starts the model serving API, run:

docker run -it -v $PWD:$PWD -w $PWD -v $HOME/.keras/models:/root/.keras/models -p 5000:5000 mae-keras

When run for the first time, the model weight files will be downloaded automatically. Because $HOME/.keras/models is mounted into the container, the downloaded weights are cached on the host and reused on subsequent runs.

3. Use the Model

The API server automatically generates an interactive Swagger documentation page. Go to http://localhost:5000 to load it. From there you can explore the API and also create test requests.

Use the model/predict endpoint to load a test image (you can use one of the test images from the assets folder) and get predicted labels for the image from the API.

You can also test it on the command line, for example:

curl -F "image=@assets/dog.jpg" -XPOST http://127.0.0.1:5000/model/predict

You should see a JSON response like the one below:

{
  "status": "ok",
  "predictions": [
    {"label": "beagle", "probability": 0.9201778173446655, "label_id": "n02088364"},
    {"label": "Walker_hound", "probability": 0.010086667723953724, "label_id": "n02089867"},
    {"label": "English_foxhound", "probability": 0.009787781164050102, "label_id": "n02089973"},
    {"label": "bluetick", "probability": 0.006095303222537041, "label_id": "n02088632"},
    {"label": "Eskimo_dog", "probability": 0.0025898281019181013, "label_id": "n02109961"}
  ]
}
4. Development

To run the Flask API app in debug mode, edit app.py to set the last line to:

app.run(debug=True, host='0.0.0.0')
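For context, a minimal sketch of what the end of app.py looks like with this change applied. The surrounding Flask setup is illustrative only; the real app.py defines the MAX prediction API on this app object, and only the final line matches the edit described above:

```python
# Minimal illustrative sketch; not this repository's actual app.py.
from flask import Flask

app = Flask(__name__)

if __name__ == '__main__':
    # debug=True enables auto-reload and the interactive debugger;
    # host='0.0.0.0' keeps the server reachable from outside the Docker container.
    app.run(debug=True, host='0.0.0.0')
```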

You will then need to rebuild the docker image (step 1) and then repeat steps 2 and 3.

5. Clean up
Part B: Train, deploy and score the Model using Watson Studio Deep Learning
1. Train and Deploy using Watson Studio

Watson Studio Deep Learning instructions go here

2. Troubleshooting

This work is supported by the National Institutes of Health's National Center for Advancing Translational Sciences, Grant Number U24TR002306. This work is solely the responsibility of the creators and does not necessarily represent the official views of the National Institutes of Health.