This microservice provides access to and interaction with all sorts of Challenge data.
- Production API
- Resources API
- ES Processor - updates Challenge data in Elasticsearch
- Domain Challenge
Configuration for the application is at `config/default.js`.
The following parameters can be set in config files or in env variables:
- READONLY: sets the API in read-only mode. POST/PUT/PATCH/DELETE operations will return 403 Forbidden
- LOG_LEVEL: the log level, default is 'debug'
- PORT: the server port, default is 3000
- AUTH_SECRET: The authorization secret used during token verification.
- VALID_ISSUERS: The valid issuers of tokens.
- AUTH0_URL: AUTH0 URL, used to get M2M token
- AUTH0_PROXY_SERVER_URL: AUTH0 proxy server URL, used to get M2M token
- AUTH0_AUDIENCE: AUTH0 audience, used to get M2M token
- TOKEN_CACHE_TIME: AUTH0 token cache time, used to get M2M token
- AUTH0_CLIENT_ID: AUTH0 client id, used to get M2M token
- AUTH0_CLIENT_SECRET: AUTH0 client secret, used to get M2M token
- BUSAPI_URL: Bus API URL
- KAFKA_ERROR_TOPIC: Kafka error topic used by bus API wrapper
- AMAZON.AWS_ACCESS_KEY_ID: The AWS access key ID to use when connecting. If you use local DynamoDB you can set a fake value
- AMAZON.AWS_SECRET_ACCESS_KEY: The AWS secret access key to use when connecting. If you use local DynamoDB you can set a fake value
- AMAZON.AWS_REGION: The AWS region to use when connecting. If you use local DynamoDB you can set a fake value
- AMAZON.IS_LOCAL_DB: Whether to use Amazon DynamoDB Local instead of the AWS-hosted service
- AMAZON.DYNAMODB_URL: The local URL if using Amazon DynamoDB Local
- AMAZON.ATTACHMENT_S3_BUCKET: the AWS S3 bucket to store attachments
- ES: config object for Elasticsearch
- ES.HOST: Elasticsearch host
- ES.API_VERSION: Elasticsearch API version
- ES.ES_INDEX: Elasticsearch index name
- ES.ES_REFRESH: Elasticsearch refresh method. Defaults to the string 'true' (i.e. refresh immediately)
- FILE_UPLOAD_SIZE_LIMIT: the file upload size limit in bytes
- OPENSEARCH: flag to use the OpenSearch NPM package instead of the Elasticsearch one
- RESOURCES_API_URL: TC resources API base URL
- GROUPS_API_URL: TC groups API base URL
- PROJECTS_API_URL: TC projects API base URL
- CHALLENGE_MIGRATION_APP_URL: migration app URL
- TERMS_API_URL: TC Terms API Base URL
- COPILOT_RESOURCE_ROLE_IDS: copilot resource role ids allowed to upload attachment
- HEALTH_CHECK_TIMEOUT: health check timeout in milliseconds
- SCOPES: the configurable M2M token scopes, refer to `config/default.js` for more details
- M2M_AUDIT_HANDLE: the audit name used when performing create/update operations using an M2M token
- FORUM_TITLE_LENGTH_LIMIT: the forum title length limit
You can find sample `.env` files inside the `/docs` directory.
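As an illustration of how these parameters are typically consumed, here is a minimal sketch of the node-config pattern used by `config/default.js`. The fallback values shown are placeholders except where this README states a default; see the real file in the repo for the actual settings.

```js
// Minimal sketch of config/default.js - placeholder defaults, not the repo's real values.
module.exports = {
  READONLY: process.env.READONLY === 'true',
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',
  PORT: process.env.PORT || 3000,
  AUTH_SECRET: process.env.AUTH_SECRET,
  AMAZON: {
    AWS_REGION: process.env.AWS_REGION || 'us-east-1',
    IS_LOCAL_DB: process.env.IS_LOCAL_DB === 'true',
    DYNAMODB_URL: process.env.DYNAMODB_URL || 'http://localhost:8000'
  },
  ES: {
    HOST: process.env.ES_HOST || 'localhost:9200',
    ES_INDEX: process.env.ES_INDEX || 'challenge',
    ES_REFRESH: process.env.ES_REFRESH || 'true'
  }
}
```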
- Drop/delete tables: `npm run drop-tables`
- Creating tables: `npm run create-tables`
- Seed/Insert data to tables: `npm run seed-tables`
- Initialize/Clear database in default environment: `npm run init-db`
- View table data in default environment: `npm run view-data <ModelName>`, where ModelName can be `Challenge`, `ChallengeType`, `AuditLog`, `Phase`, `TimelineTemplate` or `Attachment`
- Create Elasticsearch index: `npm run init-es`, or to re-create the index: `npm run init-es force`
- Synchronize ES data and DynamoDB data: `npm run sync-es`
- Start all the dependent services for local deployment: `npm run services:up`
- Stop all the dependent services for local deployment: `npm run services:down`
- Check the logs of all the dependent services for local deployment: `npm run services:logs`
- Initialize the local environments: `npm run local:init`
- Reset the local environments: `npm run local:reset`
- The seed data are located in `src/scripts/seed`
- Make sure to use Node v10+ by running `node -v`. We recommend using NVM to quickly switch to the right version: `nvm use`
- 📦 Install npm dependencies

  ```bash
  # export the production AWS credentials to access the topcoder-framework private repos in AWS CodeArtifact
  aws codeartifact login --tool npm --repository topcoder-framework --domain topcoder --domain-owner 409275337247 --region us-east-1 --namespace @topcoder-framework

  # install dependencies
  yarn install
  ```
- ⚙ Local config

  In the `challenge-api` root directory create a `.env` file with the following environment variables. Values for the Auth0 config should be shared with you on the forum.

  ```bash
  # Auth0 config
  AUTH0_URL=
  AUTH0_PROXY_SERVER_URL=
  AUTH0_AUDIENCE=
  AUTH0_CLIENT_ID=
  AUTH0_CLIENT_SECRET=

  # Locally deployed services (via docker-compose)
  IS_LOCAL_DB=true
  DYNAMODB_URL=http://localhost:8000
  ```

  Values from this file are automatically used by many `npm` commands. ⚠️ Never commit this file or its copy to the repository!
- 🚢 Start docker-compose with the services that are required to run the Topcoder Challenge API locally

  ```bash
  npm run services:up
  ```
- ♻ Update the following two parts (see the sketch after this step for where the first change sits):

  - https://github.com/topcoder-platform/challenge-api/blob/develop/src/models/Challenge.js#L116

    `throughput: 'ON_DEMAND',` should be updated to `throughput: { read: 4, write: 2 },`

  - https://github.com/topcoder-platform/challenge-api/blob/develop/config/default.js#L27-L28
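For orientation, the `throughput` option in `Challenge.js` sits in the dynamoose schema options. The sketch below shows roughly where the edit lands; everything except the throughput line is a placeholder, not the repository's actual schema.

```js
// Rough sketch only - see src/models/Challenge.js for the real schema definition.
const dynamoose = require('dynamoose')

const ChallengeSchema = new dynamoose.Schema({
  id: { type: String, hashKey: true } // placeholder field, not the real field list
}, {
  // was: throughput: 'ON_DEMAND',
  throughput: { read: 4, write: 2 }
})
```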
- ♻ Create tables

  ```bash
  npm run create-tables
  # Use `npm run drop-tables` to drop tables.
  ```
- ♻ Init DB and ES

  ```bash
  npm run local:init
  ```

  This command does 3 things:

  - create Elasticsearch indexes (drop them if they exist)
  - initialize the database by cleaning all the records
  - import the data into the local database and index it in Elasticsearch
- 🚀 Start the Topcoder Challenge API

  ```bash
  npm start
  ```

  The Topcoder Challenge API will be served on http://localhost:3000
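Once the service is up you can run a quick smoke test from Node. This is only a sketch: it assumes the API exposes its routes under a `/v5/challenges` prefix, so adjust the path to whatever your deployment actually serves.

```js
// Quick smoke test against the locally running API.
// The /v5/challenges path is an assumption - adjust it to match your routes.
const http = require('http')

http.get('http://localhost:3000/v5/challenges', (res) => {
  console.log('Status:', res.statusCode)
  let body = ''
  res.on('data', (chunk) => { body += chunk })
  res.on('end', () => console.log(body.slice(0, 200)))
}).on('error', (err) => console.error('API not reachable:', err.message))
```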
- TBD
Test configuration is at `config/test.js`. You don't need to change it.
The following test parameters can be set in config file or in env variables:
- ADMIN_TOKEN: admin token
- COPILOT_TOKEN: copilot token
- USER_TOKEN: user token
- EXPIRED_TOKEN: expired token
- INVALID_TOKEN: invalid token
- M2M_FULL_ACCESS_TOKEN: M2M full access token
- M2M_READ_ACCESS_TOKEN: M2M read access token
- M2M_UPDATE_ACCESS_TOKEN: M2M update (including 'delete') access token
- S3_ENDPOINT: endpoint of the AWS S3 API, for unit and e2e tests only; defaults to `localhost:9000`
- Start local services in docker.
- Create DynamoDB tables.
- Initialize the ES index.
- Various config parameters should be properly set.

Seeding DB data is not needed.
To run unit tests alone: `npm run test`

To run unit tests with coverage report: `npm run test:cov`

To run integration tests alone: `npm run e2e`

To run integration tests with coverage report: `npm run e2e:cov`
Refer to the verification document `Verification.md`.
- After uploading attachments, the returned attachment ids should be used to update the challenge. Each attachment has a `challengeId` field linking to its challenge, and the challenge has an `attachments` field linking to its attachments; this speeds up challenge CRUDS operations.
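To make that two-way link concrete, here is a sketch with made-up ids and field values; these are hypothetical object shapes, not actual API payloads.

```js
// Hypothetical objects illustrating the challenge <-> attachment link described above.
const attachment = {
  id: 'attachment-id-1',          // id returned by the attachment upload call
  challengeId: 'challenge-id-1',  // links the attachment back to its challenge
  url: 'https://example-bucket.s3.amazonaws.com/spec.pdf'
}

const challenge = {
  id: 'challenge-id-1',
  // After the upload, the challenge is updated to reference its attachments,
  // so challenge reads don't need a separate attachment lookup.
  attachments: [{ id: attachment.id, url: attachment.url }]
}
```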
- In the `app-constants.js` Topics field, the topics currently in use are test topics; the suggested ones are commented out because those topics have not been created in TC dev Kafka yet.