Merge pull request #252 from bcgsc/release/v7.16.0
Release/v7.16.0
Nithriel authored May 4, 2023
2 parents ab9020a + bccb8f2 commit 48b6f12
Showing 16 changed files with 219 additions and 32 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/npm-test.yml
@@ -15,7 +15,7 @@ jobs:
services:
postgres:
# Docker Hub image
image: postgres:9.6-alpine # Should match Dockerfile.db
image: postgres:11-alpine # Should match Dockerfile.db
env:
POSTGRES_PASSWORD: postgres
ports:
2 changes: 1 addition & 1 deletion Dockerfile.db
@@ -1,4 +1,4 @@
FROM postgres:9.6-alpine
FROM postgres:11-alpine

RUN mkdir -p /tmp/psql_data/

2 changes: 1 addition & 1 deletion README.md
@@ -1,6 +1,6 @@
# Integrated Pipeline Reports (IPR) API

![centos build](https://www.bcgsc.ca/bamboo/plugins/servlet/wittified/build-status/IPR-API) ![build](https://github.com/bcgsc/pori_ipr_api/workflows/build/badge.svg?branch=master) [![codecov](https://codecov.io/gh/bcgsc/pori_ipr_api/branch/master/graph/badge.svg?token=9043E24BZR)](https://codecov.io/gh/bcgsc/pori_ipr_api) ![node versions](https://img.shields.io/badge/node-14%20%7C%2016-blue) [![postgres versions](https://img.shields.io/badge/postgres-9.6%20-blue)](https://www.orientdb.org/) [![DOI](https://zenodo.org/badge/322391719.svg)](https://zenodo.org/badge/latestdoi/322391719)
![centos build](https://www.bcgsc.ca/bamboo/plugins/servlet/wittified/build-status/IPR-API) ![build](https://github.com/bcgsc/pori_ipr_api/workflows/build/badge.svg?branch=master) [![codecov](https://codecov.io/gh/bcgsc/pori_ipr_api/branch/master/graph/badge.svg?token=9043E24BZR)](https://codecov.io/gh/bcgsc/pori_ipr_api) ![node versions](https://img.shields.io/badge/node-14%20%7C%2016-blue) [![postgres versions](https://img.shields.io/badge/postgres-11%20-blue)](https://www.orientdb.org/) [![DOI](https://zenodo.org/badge/322391719.svg)](https://zenodo.org/badge/latestdoi/322391719)


IPR is part of the [platform for oncogenomic reporting and interpretation](https://github.com/bcgsc/pori).
5 changes: 5 additions & 0 deletions app/models/reports/report.js
@@ -117,6 +117,11 @@ module.exports = (sequelize, Sq) => {
field: 'tumour_content',
type: Sq.FLOAT,
},
m1m2Score: {
name: 'm1m2Score',
field: 'm1m2_score',
type: Sq.FLOAT,
},
ploidy: {
type: Sq.TEXT,
},
8 changes: 8 additions & 0 deletions app/routes/report/variants.js
@@ -10,6 +10,8 @@ const logger = require('../../log');
const {KB_PIVOT_MAPPING} = require('../../constants');

const KBMATCHEXCLUDE = ['id', 'reportId', 'variantId', 'deletedAt', 'updatedBy'];
const MUTATION_REGEX = '^([^\\s]+)(\\s)mutation[s]?$';

const getVariants = async (tableName, variantType, reportId) => {
return db.models[tableName].scope('extended').findAll({
order: [['id', 'ASC']],
@@ -38,6 +40,9 @@ const therapeuticAssociationFilter = {
{[Op.is]: literal('distinct from \'msi\'')},
{[Op.is]: literal('distinct from \'tmb\'')},
]},
// Regex filter for excluding kbVariant entries that are a single
// term followed by 'mutation' or 'mutations'
[Op.not]: {kbVariant: {[Op.regexp]: MUTATION_REGEX}},
};

// PSQL natively ignores null on equal checks.
@@ -56,6 +61,9 @@ const cancerRelevanceFilter = {
{[Op.is]: literal('distinct from \'exp\'')},
{[Op.is]: literal('distinct from \'tmb\'')},
]},
// Regex filter for excluding kbVariant entries that are a single
// term followed by 'mutation' or 'mutations'
[Op.not]: {kbVariant: {[Op.regexp]: MUTATION_REGEX}},
};

const unknownSignificanceIncludes = ['mut', 'tmb'];
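The behaviour of `MUTATION_REGEX` can be checked outside Postgres. The sketch below approximates the pattern `^([^\s]+)(\s)mutation[s]?$` with POSIX character classes (grep -E has no portable `\s`); the helper name `matches` is ours, not part of the codebase.

```shell
# Approximate the kbVariant exclusion pattern with POSIX classes.
matches() { printf '%s\n' "$1" | grep -Eq '^[^[:space:]]+[[:space:]]mutations?$'; }

matches "KRAS mutation"       && echo "filtered out"  # single term + 'mutation'
matches "BRAF mutations"      && echo "filtered out"  # plural also matches
matches "KRAS G12D"           || echo "kept"          # does not end in mutation(s)
matches "three word mutation" || echo "kept"          # more than one leading word
```

Only two-word values ending in `mutation`/`mutations` are excluded; anything with additional leading words falls through the anchored single-token prefix.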
1 change: 1 addition & 0 deletions config/jest/jest.config.js
@@ -46,4 +46,5 @@ module.exports = {
'<rootDir>/test/keycloak.mock.js',
'<rootDir>/test/graphkb.mock.js',
],
testTimeout: 10000,
};
67 changes: 51 additions & 16 deletions demo/README.md
@@ -2,48 +2,83 @@

> :warning: **BEFORE YOU START!** The demo dump is created from a cleaned/stripped version of the production database. Only reports under the PORI project are kept. If any changes have been made to these reports since the last dump, they must be manually reviewed by the developer creating the dump beforehand to double-check nothing has been uploaded, edited or added that should not be included in public data (e.g. identifiable or proprietary information).
First create a dump of the production database (see migrationTools create).
FIRST: create a dump of the production database (see migrationTools create).

```bash
pg_dump -Fc -U <USER> -h <HOSTNAME> -d <DATABASE_NAME> > new_demo.dump
```

Then restore this dump as a new database. If you are running a local postgres server for which you have root access, the easiest way to do this is using the restore script. You can ignore the password parameters for now since they will not be kept anyway
THE MIDDLE STEP: restore the dump as a new database and run the node script to clean any non-public data from the dump.
There are two ways to do this: one requires superuser access and one does not. If you are running a local postgres server on which you have root access, you have superuser, and the easiest way is to use the restore script.
Instructions for both ways to do this are below.

FINALLY: create a dump of the newly cleaned database. This should be MUCH smaller than the original and is the one that will be included in the git repository.

```bash
POSTGRES_USER=$USER DB_DUMP_LOCATION=new_demo.dump SERVICE_PASSWORD=root READONLY_PASSWORD=root bash demo/restore_iprdb_dump.sh
pg_dump -Fc -U $USER -h localhost -d ipr_demo > demo/ipr_demodb.postgres.dump
```

The next step is to run the node script to clean any non-public data from the dump. To run this clean efficiently, triggers should be disabled beforehand. First connect to your newly created db and run the following:
Whichever way you decide to do this, note that if the database was dumped from an older version, you may need to migrate the schema to ensure it is up to date first.


```bash
npx sequelize-cli db:migrate --url postgres://${USER}@localhost:5432/ipr_demo
```

THE MIDDLE STEP:

IF YOU HAVE SUPERUSER:
1) run the restore script,
2) alter the tables to disable triggers,
3) run the clean script,
4) re-enable the triggers.

If you don't have superuser you won't be able to disable and re-enable the triggers.

You can ignore the password parameters for now since they will not be kept anyway.

```bash
POSTGRES_USER=$USER DB_DUMP_LOCATION=new_demo.dump SERVICE_PASSWORD=root READONLY_PASSWORD=root bash demo/restore_iprdb_dump.sh
```
```sql
ALTER TABLE reports DISABLE TRIGGER ALL;
ALTER TABLE reports_genes DISABLE TRIGGER ALL;
ALTER TABLE germline_small_mutations DISABLE TRIGGER ALL;
```

Then run this script against the restored database copy. In the example below, the dump was restored on a local postgres server and the script was run as follows:

```bash
node demo/clean_db_for_demo.js --database.name ipr_demo --database.hostname localhost --database.password '' --graphkb.password ''
```

The triggers must now be re-enabled; connect to your db and run the following:

```sql
ALTER TABLE reports ENABLE TRIGGER ALL;
ALTER TABLE reports_genes ENABLE TRIGGER ALL;
ALTER TABLE germline_small_mutations ENABLE TRIGGER ALL;
```

Note: If this was dumped from an older version, you may need to migrate the schema to ensure it is up to date first

```bash
npx sequelize-cli db:migrate --url postgres://${USER}@localhost:5432/ipr_demo
```
IF YOU DON'T HAVE SUPERUSER:

Finally you can create a dump of the newly cleaned database. This should be MUCH smaller than the original and is the one that will be included in the git repository
export values for:

- `POSTGRES_USER`
- `POSTGRES_PASSWORD`
- `SERVICE_PASSWORD`
- `IPR_GRAPHKB_PASSWORD`
- `DATABASE_HOSTNAME` (e.g. iprdevdb.bcgsc.ca)
- `DB_DUMP_LOCATION` (the filesystem location of the pg_dump output file)
- `DATABASE_NAME` (the name you want to use for the demo db)
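For example, the exports might look like the sketch below (all values are placeholders; substitute your own credentials and paths):

```shell
# Placeholder values -- replace with real credentials and paths.
export POSTGRES_USER=postgres
export POSTGRES_PASSWORD='<password>'
export SERVICE_PASSWORD='<password>'
export IPR_GRAPHKB_PASSWORD='<password>'
export DATABASE_HOSTNAME=iprdevdb.bcgsc.ca
export DB_DUMP_LOCATION=new_demo.dump
export DATABASE_NAME=ipr_demo
```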

1) run the restore script with the option to skip loading triggers,
2) ssh to the db's host (or set it in the command), then run the clean script,
3) run the restore script again with the option to load only the triggers.

```bash
pg_dump -Fc -U $USER -h localhost -d ipr_demo > demo/ipr_demodb.postgres.dump

DB_DUMP_LOCATION=$DB_DUMP_LOCATION READONLY_PASSWORD=root TRIGGERS_OPTION="no_triggers" bash demo/restore_iprdb_dump.sh

node demo/clean_db_for_demo.js --database.name $DATABASE_NAME --database.hostname $DATABASE_HOSTNAME

DB_DUMP_LOCATION=$DB_DUMP_LOCATION READONLY_PASSWORD=root TRIGGERS_OPTION="only_triggers" bash demo/restore_iprdb_dump.sh
```

Expect error messages as the script tries to create a user that already exists.
This should not affect the restore/clean of the db.
26 changes: 20 additions & 6 deletions demo/clean_db_for_demo.js
@@ -156,7 +156,7 @@ const addDemoUserToProject = async (queryInterface, transaction, demoUser, proje
}
};

const cleanUsers = async (queryInterface, transaction) => {
const cleanUsers = async (queryInterface, transaction, reportsToKeep) => {
console.log('DROP all non-admin non-manager groups');
await queryInterface.sequelize.query(
`DELETE FROM user_groups
@@ -237,7 +237,7 @@ const cleanUsers = async (queryInterface, transaction) => {
);
console.log(demoUser.id);
console.log(userMetadata);
if (userMetadata) {
if (!userMetadata) {
await queryInterface.sequelize.query(
`INSERT INTO user_metadata (
ident, user_id,
@@ -268,8 +268,22 @@ const cleanUsers = async (queryInterface, transaction) => {
await addDemoUserToProject(queryInterface, transaction, demoUser, 'TEST');
console.log('drop all other users');
await queryInterface.sequelize.query(
'DELETE FROM users WHERE username != :username',
{transaction, replacements: {username: 'iprdemo'}},
`UPDATE reports_therapeutic_targets set updated_by = :demouser WHERE updated_by is not null
and (report_id IN(:reportsToKeep));
UPDATE reports_summary_analyst_comments set updated_by = :demouser WHERE updated_by is not null
and (report_id IN(:reportsToKeep));
UPDATE reports_mutation_signature set updated_by = :demouser WHERE updated_by is not null
and (report_id IN(:reportsToKeep));
DELETE FROM users WHERE username != :username;`,
{transaction, replacements: {username: 'iprdemo', demouser: demoUser.id, reportsToKeep}},
);

console.log('DROP all user_projects where user has been deleted');
await queryInterface.sequelize.query(
`DELETE FROM user_projects WHERE user_id not in (select id from users);
DELETE FROM user_metadata WHERE user_id not in (select id from users);
DELETE FROM user_group_members WHERE user_id not in (select id from users);`,
{transaction},
);
};

@@ -355,7 +369,7 @@ const cleanDb = async () => {
);
await checkReportsCount(queryInterface, transaction, reportsToKeep.length);

await cleanUsers(queryInterface, transaction);
await cleanUsers(queryInterface, transaction, reportsToKeep);
await checkReportsCount(queryInterface, transaction, reportsToKeep.length);

console.log('anonymize reports_pairwise_expression_correlation patient id data');
@@ -399,7 +413,7 @@ const cleanDb = async () => {
replacements: {tcgaPattern: 'TCGA-%'},
},
);
// sort by ident to avoid chronological ordering
// sort by ident to avoid chronological ordering
nonTcgaPatients.sort((a, b) => {
return a.ident.localeCompare(b.ident);
});
Binary file modified demo/ipr_demodb.postgres.dump
Binary file not shown.
18 changes: 17 additions & 1 deletion demo/restore_iprdb_dump.sh
@@ -35,6 +35,22 @@ then
READONLY_PASSWORD=$SERVICE_PASSWORD
fi

if [ "$TRIGGERS_OPTION" = "" ];
then
SECTION=""
fi

if [ "$TRIGGERS_OPTION" = "no_triggers" ];
then
SECTION="--section=pre-data --section=data"
fi

if [ "$TRIGGERS_OPTION" = "only_triggers" ];
then
SECTION="--section=post-data"
fi
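The three `if` blocks above could equally be written as a single `case` statement; this is a behaviour-preserving sketch, not what the committed script does:

```shell
# Map TRIGGERS_OPTION to pg_restore --section flags:
#   pre-data = schema, data = rows, post-data = indexes/constraints/triggers.
case "${TRIGGERS_OPTION:-}" in
  no_triggers)   SECTION="--section=pre-data --section=data" ;;
  only_triggers) SECTION="--section=post-data" ;;
  *)             SECTION="" ;;
esac
```

Splitting the restore this way lets the data load run without triggers firing, then applies the post-data section (which carries the triggers) in a second pass.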



echo "*** CREATING DATABASE ***"

@@ -50,7 +66,7 @@ psql -U $POSTGRES_USER -d "$TEMPLATE_NAME" -c "CREATE EXTENSION IF NOT EXISTS \"

# import dump
PGPASSWORD=$SERVICE_PASSWORD createdb -U $SERVICE_USER -T $TEMPLATE_NAME $DATABASE_NAME
PGPASSWORD=$SERVICE_PASSWORD pg_restore -U $SERVICE_USER -n public --no-acl --no-owner -Fc "$DB_DUMP_LOCATION" -d "$DATABASE_NAME";
PGPASSWORD=$SERVICE_PASSWORD pg_restore -U $SERVICE_USER -n public $SECTION --no-acl --no-owner -Fc "$DB_DUMP_LOCATION" -d "$DATABASE_NAME"

# create the RO user for demos
psql -U $POSTGRES_USER -c "GRANT CONNECT ON DATABASE $DATABASE_NAME TO $READONLY_USER;"
20 changes: 20 additions & 0 deletions migrations/20230503180603-DEVSU-1981-add-m1m2score-to-reports.js
@@ -0,0 +1,20 @@
const REPORTS = 'reports';

module.exports = {
up: async (queryInterface, Sq) => {
return queryInterface.sequelize.transaction(async (transaction) => {
await queryInterface.addColumn(
REPORTS,
'm1m2_score',
{
type: Sq.FLOAT,
},
{transaction},
);
});
},

down: async () => {
throw new Error('Not Implemented!');
},
};
4 changes: 2 additions & 2 deletions package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion package.json
@@ -1,7 +1,7 @@
{
"private": true,
"name": "ipr-api",
"version": "7.15.0",
"version": "7.16.0",
"description": "Integrated Pipeline Reports API",
"main": "bin/server.js",
"scripts": {
19 changes: 18 additions & 1 deletion test/routes/report/report.test.js
@@ -24,7 +24,7 @@ let request;
const checkReport = (report) => {
[
'tumourContent', 'ploidy', 'subtyping', 'ident', 'patientId',
'sampleInfo', 'seqQC', 'reportVersion',
'sampleInfo', 'seqQC', 'reportVersion', 'm1m2Score',
'state', 'expression_matrix', 'alternateIdentifier', 'ageOfConsent',
'biopsyDate', 'biopsyName', 'presentationDate', 'kbDiseaseMatch',
'kbUrl', 'pediatricIds',
@@ -93,6 +93,7 @@ describe('/reports/{REPORTID}', () => {
templateId: template.id,
patientId: mockReportData.patientId,
tumourContent: 100,
m1m2Score: 22.5,
});
await db.models.reportProject.create({
reportId: report.id,
@@ -491,6 +492,22 @@ describe('/reports/{REPORTID}', () => {
expect(res.body).toHaveProperty('tumourContent', 23.2);
});

describe('PUT', () => {
test('M1M2 Score update OK', async () => {
const res = await request
.put(`/api/reports/${report.ident}`)
.auth(username, password)
.type('json')
.send({
m1m2Score: 98.5,
})
.expect(HTTP_STATUS.OK);

checkReport(res.body);
expect(res.body).toHaveProperty('m1m2Score', 98.5);
});
});

test('ploidy update OK', async () => {
const res = await request
.put(`/api/reports/${report.ident}`)
