Movember TrueNTH USA Shared Services
Pick any path for the installation:
$ export PROJECT_HOME=~/truenth_ss
$ sudo apt-get install postgresql python-virtualenv python-dev
$ sudo apt-get install libffi-dev libpq-dev build-essential redis-server
$ git clone https://github.com/uwcirg/true_nth_usa_portal.git $PROJECT_HOME
This critical step isolates the project from the system python, making dependency maintenance easier and more stable. It does require that you activate the virtual environment before interacting with python or the installer scripts. The virtual environment can be installed anywhere; the nested 'env' pattern is used here:
$ virtualenv $PROJECT_HOME/env
Activation is required to interact with the python installed in this virtual environment. Forgetting this step will result in obvious warnings about missing dependencies. This must be done in every shell session you work from:
$ cd $PROJECT_HOME
$ source env/bin/activate
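A quick way to confirm the environment is active (hypothetical session; the expected path reflects the nested env layout chosen above):

```
$ which python   # should point at $PROJECT_HOME/env/bin/python
```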
To create the PostgreSQL database that backs your Shared Services, issue the following commands:
$ sudo -u postgres createuser truenth-dev --pwprompt # enter password at prompt
$ sudo -u postgres createdb truenth-dev --owner truenth-dev
The OS default version of pip is often out of date and may need to be updated before it is able to install other project dependencies:
$ pip install --upgrade pip setuptools
Copy the default to the named configuration file:
$ cp $PROJECT_HOME/instance/application.cfg{.default,}
Obtain consumer_key and consumer_secret values from the Facebook App page and write them to application.cfg:
# application.cfg
[...]
FB_CONSUMER_KEY = '<App ID From FB>'
FB_CONSUMER_SECRET = '<App Secret From FB>'
To enable Google OAuth, obtain similar values from the Google API page.
- Under APIs Credentials, select OAuth 2.0 client ID
- Set the Authorized redirect URIs to exactly match the location of <scheme>://<hostname>/login/google/
- Enable the Google+ API
Write the values to the respective GOOGLE_CONSUMER_KEY and GOOGLE_CONSUMER_SECRET variables in the same application.cfg configuration file.
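The resulting entries in application.cfg mirror the Facebook ones above (placeholder values shown):

```
# application.cfg
[...]
GOOGLE_CONSUMER_KEY = '<Client ID From Google>'
GOOGLE_CONSUMER_SECRET = '<Client Secret From Google>'
```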
Instruct pip to install the correct version of all dependencies into the virtual environment. This idempotent step can be run anytime to confirm the correct libraries are installed:
$ pip install -r requirements.txt
The idempotent sync function takes the necessary steps to build tables, upgrade the database schema, and run seed to populate the database with static data. It is safe to run on existing or brand new databases:
$ FLASK_APP=manage.py flask sync
To update your Shared Services installation, run the deploy.sh script (this step replaces the two previous hand-run steps).
This script will:
- Update the project with the latest code
- Install any dependencies, if necessary
- Perform any database migrations, if necessary
- Seed any new data to the database, if necessary
$ cd $PROJECT_HOME
$ ./bin/deploy.sh
To see all available options run:
$ ./bin/deploy.sh -h
$ FLASK_APP=manage.py flask runserver
$ celery worker --app portal.celery_worker.celery --loglevel=info
Alternatively, install an init script and configure. See Daemonizing Celery
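As a sketch only, a minimal systemd unit for the worker might look like the following; the unit name, user, and paths are assumptions for illustration — consult the Celery daemonizing documentation for a complete, supported example:

```
# /etc/systemd/system/celery.service (illustrative)
[Unit]
Description=TrueNTH celery worker
After=network.target redis-server.service

[Service]
User=www-data
WorkingDirectory=/home/www-data/truenth_ss
ExecStart=/home/www-data/truenth_ss/env/bin/celery worker \
    --app portal.celery_worker.celery --loglevel=info
Restart=on-failure

[Install]
WantedBy=multi-user.target
```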
Should the need ever arise to purge the queue of jobs, run the following destructive command:
$ celery -A portal.celery_worker.celery purge
The value of SQLALCHEMY_DATABASE_URI defines which database engine and database to use. At this time, only PostgreSQL is supported.
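With the database created earlier, the URI takes the standard SQLAlchemy form (the password is whatever you entered at the createuser prompt):

```
# application.cfg
SQLALCHEMY_DATABASE_URI = 'postgresql://truenth-dev:<password>@localhost/truenth-dev'
```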
Thanks to Alembic and Flask-Migrate, database migrations are easily managed and run.
Note
Alembic tracks the current version of the database to determine which migration scripts to apply. After the initial install, stamp the current version for subsequent upgrades to succeed:
$ FLASK_APP=manage.py flask db stamp head
Anytime a database might need an upgrade, run the manage script with the db upgrade arguments (or run the deployment script). This is an idempotent process, meaning it is safe to run again on a database that has already received the upgrade:
$ FLASK_APP=manage.py flask db upgrade
Update the python source files containing table definitions (typically classes derived from db.Model) and run the manage script to sniff out the code changes and generate the necessary migration steps:
$ FLASK_APP=manage.py flask db migrate
Then execute the upgrade as previously mentioned:
$ FLASK_APP=manage.py flask db upgrade
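Putting the two steps together, a typical schema-change session looks like this (the file path is illustrative, and the -m message is optional):

```
$ $EDITOR portal/models/user.py   # e.g. add a column to a db.Model class
$ FLASK_APP=manage.py flask db migrate -m "add column"
$ FLASK_APP=manage.py flask db upgrade
```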
To run the tests, repeat the postgres createuser && createdb commands as above, with values for the {user, password, database} as defined in the TestConfig class within portal/config.py.
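For example, if TestConfig names a test user and database (placeholders below; substitute the actual values from the class):

```
$ sudo -u postgres createuser <test-user> --pwprompt
$ sudo -u postgres createdb <test-database> --owner <test-user>
```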
All test modules under the tests directory can be executed via nosetests (again from the project root with the virtual environment activated):
$ nosetests
Alternatively, run a single module's worth of tests, telling nose not to suppress standard out (vital for debugging) and to stop on the first error:
$ nosetests -sx tests.test_intervention
The test runner Tox is configured to run the portal test suite and test other parts of the build process, each configured as a separate Tox "environment". To run all available environments, execute the following command:
$ tox
To run a specific tox environment ("docs", the docgen environment, in this case), invoke tox with the -e option, e.g.:
$ tox -e docs
Tox will also run the environment specified by the TOXENV
environment variable, as configured in the TravisCI integration.
Tox will pass any options after -- to the test runner, nose. To run tests only from a certain module (analogous to the above nosetests invocation):
$ tox -- -sx tests.test_intervention
This project includes integration with the TravisCI continuous integration platform. The full test suite (every Tox virtual environment) is automatically run for the last commit pushed to any branch, and for all pull requests. Results are reported as passing with a ✔ and failing with a ✖.
UI integration/acceptance testing is performed by Selenium and is included in the test suite and continuous integration setup. Specifically, Sauce Labs integration with TravisCI allows Selenium tests to be run with any number of browser/OS combinations and captures video from running tests.
UI tests can also be run locally (after installing xvfb) by passing Tox the virtual environment that corresponds to the UI tests (ui):
$ tox -e ui
Project dependencies are hardcoded to specific versions (see requirements.txt) known to be compatible with Shared Services, to prevent dependency updates from breaking existing code.
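The pins use exact == specifiers; an excerpt might look like the following (the package names are ones this project uses, but the version numbers shown are illustrative only):

```
# requirements.txt (excerpt; versions illustrative)
Flask==0.12.2
Flask-Migrate==2.1.1
celery==4.1.0
```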
If pyup.io integration is enabled the service will create pull requests when individual dependencies are updated, allowing the project to track the latest dependencies. These pull requests should be merged without need for review, assuming they pass continuous integration.
Docs are built separately via Sphinx. Change to the docs directory and use the contained Makefile to build, then view in a browser starting with the docs/build/html/index.html file:
$ cd docs
$ make html