ctsit/imagemap

Name: imagemap

Owner: CTS-IT

Description: Painmap REDCap external module

Forked from: marlycormar/painmap

Created: 2017-12-21 21:44:49

Updated: 2018-02-02 20:07:32

Pushed: 2018-02-03 17:59:03

Size: 505

Language: HTML

README

REDCap module: Pain Map

This REDCap module improves the survey participant's experience by providing an easy way to indicate painful body parts and levels of pain. Participants either select the image representing their current pain level or click predetermined regions on a diagram of the human body.

Prerequisites
Installation
Features included

This module defines a new action tag: @IMAGEMAP. The possible values for this tag are PAINMAP_MALE (a representation of a generic male body), PAINMAP_FEMALE (a representation of a female body), and SMILE_SCALE (a six-face diagram); they correspond to the following images:

PAINMAP_MALE

PAINMAP_FEMALE

SMILE_SCALE

Usage

To display one of the images above in a survey or data entry form, add a new field of type Text Box and include one of the following options in the Action Tags / Field Annotation (optional) field:

@IMAGEMAP = PAINMAP_MALE 
@IMAGEMAP = PAINMAP_FEMALE
@IMAGEMAP = SMILE_SCALE

Each selectable body part is associated with a key; for example, the "Ankle (front-left)" region of the female body diagram is linked to the key "f34". To find the key for a particular body part, refer to the HTML map files located in the maps folder. After multiple body parts are selected, the field containing the @IMAGEMAP action tag holds a string of comma-separated keys, e.g. "f36,f17,f18,f21". Similarly, when using the faces diagram (i.e. @IMAGEMAP=SMILE_SCALE), the field holds the value corresponding to the face clicked, which ranges from 1 to 7.
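The stored value is plain text, so it is easy to post-process outside REDCap. The sketch below, in Python, splits an exported @IMAGEMAP value into individual keys and maps them to human-readable labels. Only the "f34" / "Ankle (front-left)" pair comes from this README; every other label here is a hypothetical placeholder, and the real key-to-label mapping lives in the module's HTML map files under the maps folder.

```python
# Split an @IMAGEMAP field value (e.g. exported from REDCap) into keys.
# Label mapping is illustrative only: "f34" is documented in this README;
# the remaining entries are hypothetical placeholders.
LABELS = {
    "f34": "Ankle (front-left)",  # from the README example
    "f36": "(see maps folder)",   # hypothetical placeholder
    "f17": "(see maps folder)",   # hypothetical placeholder
}

def parse_imagemap_value(raw: str) -> list[str]:
    """Return the individual body-part keys stored in an @IMAGEMAP field."""
    return [key for key in (part.strip() for part in raw.split(",")) if key]

keys = parse_imagemap_value("f36,f17,f18,f21")
labels = [LABELS.get(key, key) for key in keys]
print(keys)    # ['f36', 'f17', 'f18', 'f21']
print(labels)
```

For a SMILE_SCALE field the stored value is a single number rather than a key list, so `int(raw)` is sufficient there.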


This work is supported by the National Institutes of Health's National Center for Advancing Translational Sciences, Grant Number U24TR002306. This work is solely the responsibility of the creators and does not necessarily represent the official views of the National Institutes of Health.