Commit eea490a: Merge branch 'release/0.2.0'

abought committed Nov 5, 2020
2 parents: df11dcd + c153d3b

Showing 46 changed files with 2,504 additions and 3,041 deletions.
3 changes: 2 additions & 1 deletion .envs/.local/.django
@@ -21,10 +21,11 @@ LZ_OFFICIAL_DOMAIN=localhost:8000
# ------------------------------------------------------------------------------
GOOGLE_ANALYTICS_ID=

# Sentry uses two buckets: one for python errors (backend) and one for JS (frontend)
# Our Sentry configuration uses two buckets: one for python errors (backend) and one for JS (frontend)
# ------------------------------------------------------------------------------
SENTRY_DSN=
SENTRY_DSN_FRONTEND=

# Set the location of the large lookup files (which are downloaded separately, after the build step)
# This path is relative to the docker container
ZORP_ASSETS_DIR=/app/.lookups
60 changes: 60 additions & 0 deletions .envs/.production/.django-sample
@@ -0,0 +1,60 @@
##### Example file. Rename to .django to use in production

# General
# ------------------------------------------------------------------------------
# DJANGO_READ_DOT_ENV_FILE=True
DJANGO_SETTINGS_MODULE=config.settings.production
DJANGO_SECRET_KEY=
# The admin site has restricted functionality, but we still make it hard for bots to find. In the future it could be served separately behind a VPN
DJANGO_ADMIN_URL=admin-changeme-something-very-hard-to-guess/
# The localhost entries work if we are using a reverse proxy
DJANGO_ALLOWED_HOSTS=.my.locuszoom.org,localhost,localhost:5000
LZ_OFFICIAL_DOMAIN=my.locuszoom.org

# Security
# ------------------------------------------------------------------------------
# TIP: It is better to handle the redirect via the server. However, django can handle the redirect if needed.
DJANGO_SECURE_SSL_REDIRECT=False

# Email
# ------------------------------------------------------------------------------
[email protected]

DJANGO_EMAIL_HOST=smtp.host.example
DJANGO_EMAIL_HOST_USER=locuszoom-noreply
DJANGO_EMAIL_HOST_PASSWORD=

# django-allauth
# ------------------------------------------------------------------------------
# This can be disabled if, e.g., you are running a private instance and don't want random people to upload things
DJANGO_ACCOUNT_ALLOW_REGISTRATION=True

# Gunicorn
# ------------------------------------------------------------------------------
# Recommended: 2-4x the number of CPU cores
WEB_CONCURRENCY=8

# Our Sentry configuration uses two buckets: one for python errors (backend) and one for JS (frontend)
# ------------------------------------------------------------------------------
SENTRY_DSN=
SENTRY_DSN_FRONTEND=

# Google Analytics
# ------------------------------------------------------------------------------
GOOGLE_ANALYTICS_ID=

# Redis
# ------------------------------------------------------------------------------
REDIS_URL=redis://redis:6379/0

# Celery
# ------------------------------------------------------------------------------

# Flower
## Set these to very hard to guess values
CELERY_FLOWER_USER=
CELERY_FLOWER_PASSWORD=
CELERY_WORKERS=12

# For now, re-use the local "uploads" mount point as a place to store the large lookup files required by annotations
ZORP_ASSETS_DIR=/lz-uploads/.lookups
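The "2-4x the number of CPU cores" guidance for `WEB_CONCURRENCY` above can be sanity-checked on the target host. A minimal sketch, assuming the common 2×cores+1 gunicorn heuristic (a general rule of thumb, not project policy):

```bash
# Assumption: uses the widely cited gunicorn heuristic workers = 2 * CPU cores + 1
python3 -c "import multiprocessing as mp; print(2 * mp.cpu_count() + 1)"
```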
8 changes: 8 additions & 0 deletions .envs/.production/.postgres-sample
@@ -0,0 +1,8 @@
# PostgreSQL
# ------------------------------------------------------------------------------
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
POSTGRES_DB=locuszoom_plotting_service
# Please make sure to set these.
POSTGRES_USER=
POSTGRES_PASSWORD=
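In cookiecutter-django-style deployments these variables are typically assembled into a single connection URL for Django; a hedged sketch of that composition (an assumption about the surrounding tooling, not code from this repo):

```bash
# Assumption: illustrates how the variables above are commonly combined into DATABASE_URL
export DATABASE_URL="postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}"
```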
7 changes: 5 additions & 2 deletions .gitignore
@@ -9,8 +9,7 @@ celerybeat.pid
staticfiles/

# Webpack built assets (and files that reference them)
assets/webpack_bundles/
webpack-stats.json
locuszoom_plotting_service/static/webpack_bundles/

# Sphinx documentation
docs/_build/
@@ -43,3 +42,7 @@ scripts/data_loaders/sources/*
# Configuration files that should not be checked into Git
.envs/*
!.envs/.local/
# Exclude production envs, EXCEPT samples
!.envs/.production/
.envs/.production/*
!.envs/.production/*sample
2 changes: 1 addition & 1 deletion .nvmrc
@@ -1 +1 @@
lts/erbium
lts/fermium
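For context, `lts/erbium` is the Node 12 LTS line and `lts/fermium` is Node 14. A quick sketch of how the pin is typically consumed, assuming nvm is the version manager in use:

```bash
# Both commands read the pinned version from .nvmrc when run in the repo root
nvm install
nvm use
```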
68 changes: 17 additions & 51 deletions README.md
@@ -1,6 +1,6 @@
# LocusZoom: Hosted Upload Service

Upload and share GWAS results with LocusZoom.js
Upload, analyze, and share GWAS results with LocusZoom.js. Try it at [my.locuszoom.org](https://my.locuszoom.org).


## Settings
@@ -11,10 +11,11 @@ For a basic guide to most settings, see the [cookiecutter-django docs](https://c


### Quickstart
The following commands will start a development environment. Some IDEs (such as Pycharm) are able to run the app via
run configurations, which may be more convenient than starting things through the terminal.
The following commands will start a development environment. Some IDEs (such as Pycharm) are able to run the app via run configurations, which may be more convenient than starting things through the terminal.

For production deployment instructions, see [docs/deploy/index.md](docs/deploy/index.md).

- In one tab, build assets:
- In one tab, build assets (in local development, this is currently done on the host system, outside of Docker):

`$ yarn run prod`

@@ -25,12 +26,15 @@ or with live rebuilding, if you intend to be changing JS code as you work:
- In a second open terminal::

```
$ docker-compose -f local.yml build
$ docker system prune && docker-compose -f local.yml build --pull
$ docker-compose -f local.yml up
```

On the first installation, you will also need to download some large asset files required for annotations. (see
deployment docs for the correct command to use with production assets)
(`docker system prune` is optional, but it can save your hard drive from filling up as you experiment with different build options)

On the first installation, you will also need to download some large asset files required for annotations. For local development, a "test" version is available that will only annotate a limited subset of biologically interesting genes; this subset is much smaller than the full database, and easier to use on a laptop.

(see deployment docs for the correct command to use with production assets)

```bash
$ docker-compose -f local.yml run --rm django zorp-assets download --type snp_to_rsid_test --tag genome_build GRCh37 --no-update
```

@@ -63,8 +67,7 @@ into your browser. Now the user's email should be verified and ready to go.

`$ docker-compose -f local.yml run --rm django python manage.py makemigrations`

Then verify the migration file is correct, and restart docker to apply the migrations automatically.
(in production, you must apply the migrations manually; see deployment guide for details)
Then verify the migration file is correct, and restart Docker to apply the migrations automatically. (in production, you must apply the migrations manually; see deployment guide for details)
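A hedged sketch of the manual production step, assuming a cookiecutter-django-style `production.yml` compose file (the exact file name and procedure are defined in the deployment guide, not here):

```bash
# Assumption: "production.yml" is illustrative; use the compose file from the deployment guide
docker-compose -f production.yml run --rm django python manage.py migrate
```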


## Development and testing helpers
@@ -78,7 +81,7 @@ This script generates fake studies for search results, but it notably does not r
It may be improved in the future to generate more realistic and complete fake data.

### Opening a terminal for debugging
Because all development happens inside a docker container, it is sometimes useful to open a terminal for debugging
Because all development happens inside a Docker container, it is sometimes useful to open a terminal for debugging
purposes. This can be done as follows.

On a running container::
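(The documented command is collapsed in this view of the diff; one common approach with the local compose file, offered as an assumption rather than the project's exact instruction:)

```bash
# Assumption: opens a shell inside the already-running django service container
docker-compose -f local.yml exec django bash
```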
@@ -114,55 +117,18 @@ A suite of unit tests is available::

`$ docker-compose -f local.yml run --rm django pytest`

## Celery

This app comes with Celery. The docker configuration will automatically launch celery workers when the app starts, but
the commands below may be useful when running the app in other environments.

To run a celery worker:

```bash
$ cd locuszoom_plotting_service
$ celery -A locuszoom_plotting_service.taskapp worker -l info
```

Please note: For Celery's import magic to work, it is important *where* the celery commands are run. If you are in the
same folder as *manage.py*, you should be OK.


## Sentry

Sentry is an error logging aggregator service. If a key (DSN) is provided in your .env file, errors will be tracked
automatically. You will need one DSN each for your frontend (JS) and backend (python) code.
automatically.
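These correspond to the two variables shown in the sample env files earlier in this commit:

```
# Backend (Python) errors
SENTRY_DSN=
# Frontend (JS) errors
SENTRY_DSN_FRONTEND=
```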

## Deployment

### Docker
This app uses docker to manage dependencies and create a working environment. See
This app uses Docker to manage dependencies and create a working environment. See
[deployment documentation](docs/deploy/index.md) for instructions on how to create a working, production server
environment. A local, debug-friendly docker configuration is also provided in this repo, and many of the instructions
environment. A local, debug-friendly Docker configuration is also provided in this repo, and many of the instructions
in this document assume this is what you will use.

The original docker configuration has been modified from cookiecutter-django; see their docs for more information
The original Docker configuration has been modified from cookiecutter-django; see their docs for more information
about default options and design choices.


### (future) Initializing the app with default data

Certain app features, such as "tagging datasets", will require loading initial data into the database.

This feature is not yet used in production, but the notes below demonstrate loader scripts in progress.

These datasets may be large or restricted by licensing rules; as such, they are not distributed with the code and must
be downloaded/reprocessed separately for loading.

- [SNOMED CT (Core) / May 2019](https://www.nlm.nih.gov/research/umls/Snomed/core_subset.html)

These files must be downloaded separately due to license issues (they cannot be distributed with this repo).
Run the appropriate scripts in `scripts/data_loaders/` to transform them into a format suitable for django usage.

After creating the app, run the following command (once) to load them in (using the appropriate docker-compose file)::

`$ docker-compose -f local.yml run --rm django python3 manage.py loaddata scripts/data_loaders/sources/snomed.json`

[![Built with Cookiecutter Django](https://img.shields.io/badge/built%20with-Cookiecutter%20Django-ff69b4.svg)](https://github.com/pydanny/cookiecutter-django/)
Empty file removed: assets/js/library.js
7 changes: 4 additions & 3 deletions assets/js/pages/gwas_summary.js
@@ -6,7 +6,7 @@ import {create_qq_plot, create_gwas_plot} from '../util/pheweb_plots';

import Tabulator from 'tabulator-tables';
import 'tabulator-tables/dist/css/bootstrap/tabulator_bootstrap4.css';
import { pairs, sortBy } from 'underscore';
import { toPairs, sortBy } from 'lodash';

function createTopHitsTable(selector, data, region_url) {
// Filter the manhattan json to a subset of just peaks, largest -log10p first
@@ -95,7 +95,7 @@ if (window.template_args.ingest_status === 2) {
return resp.json();
})
.then(data => {
sortBy(pairs(data.overall.gc_lambda), function (d) {
sortBy(toPairs(data.overall.gc_lambda), function (d) {
return -d[0];
}).forEach(function (d, i) {
// FIXME: Manually constructed HTML; change
@@ -111,7 +111,8 @@ } else {
} else {
create_qq_plot([{ maf_range: [0, 0.5], qq: data.overall.qq, count: data.overall.count }], data.ci);
}
}).catch(() => {
}).catch((e) => {
console.error(e);
document.getElementById('qq_plot_container').textContent = 'Could not fetch QQ plot data.';
});
});
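For context on the underscore-to-lodash swap above: lodash's `toPairs` returns `[key, value]` arrays just as underscore's `pairs` did, so the surrounding `sortBy` call is unchanged. A standalone sketch with made-up lambda values (not taken from the app's data):

```js
import { toPairs, sortBy } from 'lodash';

// Hypothetical gc_lambda object keyed by MAF bin, as an illustration only
const gc_lambda = { '0.01': 1.09, '0.1': 1.04, '0.5': 1.01 };

// toPairs -> [['0.01', 1.09], ['0.1', 1.04], ['0.5', 1.01]]
// Sorting by the negated numeric key puts the largest MAF bin first
const rows = sortBy(toPairs(gc_lambda), (d) => -Number(d[0]));
console.log(rows); // [['0.5', 1.01], ['0.1', 1.04], ['0.01', 1.09]]
```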
