topcoder-platform/project-processor-es

Project API Elasticsearch data loader.

Dependencies

Local setup

  1. Install node dependencies:

    npm install
  2. Run docker compose with dependent services:

    cd local
    docker-compose up

    This docker-compose file runs all the dependencies necessary for project-processor-es to work.

    Service        Name       Port
    Elasticsearch  esearch    9200
    Zookeeper      zookeeper  2181
    Kafka          kafka      9092

    docker-compose automatically creates the Kafka topics used by project-processor-es, listed in local/kafka-client/topics.txt.

  3. Set the environment variables required for M2M authentication (AUTH0_CLIENT_ID, AUTH0_CLIENT_SECRET, AUTH0_URL, AUTH0_AUDIENCE, AUTH0_PROXY_SERVER_URL):

    export AUTH0_CLIENT_ID=<insert required value here>
    export AUTH0_CLIENT_SECRET=<insert required value here>
    export AUTH0_URL=<insert required value here>
    export AUTH0_AUDIENCE=<insert required value here>
    export AUTH0_PROXY_SERVER_URL=<insert required value here>
  4. Initialize Elasticsearch indexes:

    npm run sync:es
  5. Start processor app:

    npm start

Commands

Lint & Tests commands

Command           Description
npm run lint      Run lint check.
npm run lint:fix  Run lint check with automatic fixing of errors and warnings where possible.
npm run test      Run integration tests.
npm run test:cov  Run integration tests with coverage report.

View data in Elasticsearch indexes

You may run the following command to output documents from the Elasticsearch indexes for debugging purposes.

npm run view-data <INDEX_NAME> <DOCUMENT_ID>
Examples
  • npm run view-data projects 1: view the document with id 1 in the projects index
  • npm run view-data timelines 1: view the document with id 1 in the timelines index
  • npm run view-data metadata 1: view the document with id 1 in the metadata index (this index has only one document, and all the data is stored inside that one document, which might be very big)

Kafka commands

If you've used docker-compose with the file local/docker-compose.yml during local setup to spawn kafka & zookeeper, you can use the following commands to manipulate kafka topics and messages (replace TOPIC_NAME with the name of the desired topic):

Create Topic

docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --partitions 1 --replication-factor 1 --topic TOPIC_NAME

List Topics

docker exec project-processor-es-kafka /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181

Watch Topic

docker exec project-processor-es-kafka /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic TOPIC_NAME

Post Message to Topic (from stdin)

docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic TOPIC_NAME
  • Enter or copy/paste the message into the console after starting this command.

Post Message to Topic (from file)

docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < test_message.json
  • All example messages are in ./test/data.
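Messages posted to these topics share one JSON envelope: topic, originator, timestamp, mime-type, and payload, as in the example files under ./test/data. A minimal sketch of building such an envelope; the buildMessage helper is illustrative, not part of the repository:

```javascript
// Build a message envelope in the shape used across these topics; the field
// set mirrors the example messages under ./test/data. The buildMessage helper
// itself is an illustrative assumption, not the repository's code.
function buildMessage(topic, payload) {
  return {
    topic,                              // e.g. 'project.action.create'
    originator: 'project-api',          // matches the KAFKA_MESSAGE_ORIGINATOR default
    timestamp: new Date().toISOString(),
    'mime-type': 'application/json',
    payload,
  };
}

// One line of producer input is one JSON-serialized envelope:
console.log(JSON.stringify(buildMessage('project.action.create', { id: 1, name: 'test project' })));
```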

Configuration

Configuration for the processor is at config/default.js. The following parameters can be set in config files or in env variables:

  • LOG_LEVEL: the log level; default value: 'debug'
  • KAFKA_URL: comma separated Kafka hosts; default value: 'localhost:9092'
  • KAFKA_GROUP_ID: the Kafka group id; default value: 'project-processor-es'
  • KAFKA_CLIENT_CERT: Kafka connection certificate, optional; default value is undefined; if not provided, SSL is not used and a direct insecure connection is used; if provided, it can be either a path to the certificate file or the certificate content
  • KAFKA_CLIENT_CERT_KEY: Kafka connection private key, optional; default value is undefined; if not provided, SSL is not used and a direct insecure connection is used; if provided, it can be either a path to the private key file or the private key content
  • CREATE_DATA_TOPIC: create data Kafka topic, default value is 'project.action.create'
  • UPDATE_DATA_TOPIC: update data Kafka topic, default value is 'project.action.update'
  • DELETE_DATA_TOPIC: delete data Kafka topic, default value is 'project.action.delete'
  • KAFKA_MESSAGE_ORIGINATOR: Kafka topic originator, default value is 'project-api'
  • MEMBER_SERVICE_ENDPOINT: used to get member details (defaults to https://api.topcoder-dev.com/v6/members)
  • AUTH0_URL: AUTH0 URL, used to get M2M token
  • AUTH0_PROXY_SERVER_URL: AUTH0 proxy server URL, used to get M2M token
  • AUTH0_AUDIENCE: AUTH0 audience, used to get M2M token
  • TOKEN_CACHE_TIME: AUTH0 token cache time, used to get M2M token
  • AUTH0_CLIENT_ID: AUTH0 client id, used to get M2M token
  • AUTH0_CLIENT_SECRET: AUTH0 client secret, used to get M2M token
  • esConfig: config object for Elasticsearch

Refer to the esConfig variable in config/default.js for ES-related configuration.
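As a rough illustration of how such a file is typically laid out (every value below is an assumption based on the defaults listed above; the actual contents of config/default.js may differ), a config file in the usual env-variable-fallback style might look like:

```javascript
// Illustrative sketch only, not the repository's actual config/default.js.
// Each value falls back from an env variable to the documented default.
module.exports = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',
  KAFKA_URL: process.env.KAFKA_URL || 'localhost:9092',
  KAFKA_GROUP_ID: process.env.KAFKA_GROUP_ID || 'project-processor-es',
  KAFKA_MESSAGE_ORIGINATOR: process.env.KAFKA_MESSAGE_ORIGINATOR || 'project-api',
  CREATE_DATA_TOPIC: process.env.CREATE_DATA_TOPIC || 'project.action.create',
  UPDATE_DATA_TOPIC: process.env.UPDATE_DATA_TOPIC || 'project.action.update',
  DELETE_DATA_TOPIC: process.env.DELETE_DATA_TOPIC || 'project.action.delete',
  esConfig: {
    // ES-related settings live here; see the real file for the full set.
    HOST: process.env.ES_HOST || 'localhost:9200',
  },
};
```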

Also note that there is a /health endpoint that checks the health of the app. This sets up an expressjs server that listens on the environment variable PORT. PORT is not part of the configuration file and needs to be passed as an environment variable.

Config for tests is at config/test.js; it overrides some of the default config.

Local Deployment with Docker

To run the project ES processor using docker, follow the steps below:

  1. Navigate to the directory docker

  2. Rename the file sample.api.env to api.env

  3. Set the required AWS credentials in the file api.env

  4. Once that is done, run the following command

docker-compose up
  5. When you are running the application for the first time, it will take some time initially to download the image and install the dependencies

Integration tests

Integration tests use different indexes (projects_test, timelines_test, metadata_test), which are not the same as the usual indexes (projects, timelines, metadata).

While running tests, the index names can be overridden using environment variables, or left as-is to use the default test indexes defined in config/test.js:

export ES_PROJECT_INDEX=projects_test
export ES_TIMELINE_INDEX=timelines_test
export ES_METADATA_INDEX=metadata_test
export ES_CUSTOMER_PAYMENT_INDEX=customer_payments_test

Running integration tests and coverage

To run tests alone:

npm run test

To run tests with a coverage report:

npm run test:cov

Verification

  • Start the Docker services, initialize Elasticsearch, and start the processor app.
  • Navigate to the repository root directory.
  • Send message: docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092 < ./test/data/project/project.action.create.json
  • Run command npm run view-data projects 1 to view the created data; you will see that the data was properly created:
info: Elasticsearch Project data:
info: {"createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:43:23.555Z","terms": [],"id": 1,"name":"test project","description":"Hello I am a test project","type":"app","createdBy": 40051333,"updatedBy": 40051333,"projectEligibility": [],"bookmarks": [],"external": null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members": [{"createdAt":"2019-06-20T13:43:23.555Z","updatedAt":"2019-06-20T13:43:23.625Z","id": 2,"isPrimary": true,"role":"manager","userId": 40051333,"updatedBy": 40051333,"createdBy": 40051333,"projectId": 2,"deletedAt": null,"deletedBy": null}],"version":"v2","directProjectId": null,"billingAccountId": null,"estimatedPrice": null,"actualPrice": null,"details": null,"cancelReason": null,"templateId": null,"deletedBy": null,"attachments": null,"phases": null,"projectUrl":"https://connect.topcoder-dev.com/projects/2"}
  • Run the producer and then write some invalid messages into the console to send to the project.action.create topic:

    docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.create --broker-list localhost:9092

    In the console, write messages, one message per line:

    { "topic": "project.action.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "invalid", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": 456, "status": "Active", "created": "2019-02-16T00:00:00", "createdBy": "admin" } }

    { "topic": "project.action.create", "originator": "project-api", "timestamp": "2019-02-16T00:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": "Code", "name": "test", "description": "desc", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0aa", "phases": [{ "id": "8e17090c-465b-4c17-b6d9-dfa16300b012", "name": "review", "isActive": true, "duration": 10000 }], "prizeSets": [{ "type": "prize", "prizes": [{ "type": "winning prize", "value": 500 }] }], "reviewType": "code review", "tags": ["code"], "projectId": 123, "forumId": -456, "status": "Active", "created": "2018-01-02T00:00:00", "createdBy": "admin" } }

    { [ { abc

  • Then in the app console, you will see error messages.

  • Send message to update data:

    docker exec -i project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --topic project.action.update --broker-list localhost:9092 < ./test/data/project/project.action.update.json

  • Run command npm run view-data projects 1 to view the updated data; you will see that the data was properly updated:

info: Elasticsearch Project data:
info: {"createdAt":"2019-06-20T13:43:23.554Z","updatedAt":"2019-06-20T13:45:20.091Z","terms": [],"id": 1,"name":"project name updated","description":"Hello I am a test project","type":"app","createdBy": 40051333,"updatedBy": 40051333,"projectEligibility": [],"bookmarks": [],"external": null,"status":"draft","lastActivityAt":"2019-06-20T13:43:23.514Z","lastActivityUserId":"40051333","members": [{"createdAt":"2019-06-20T13:43:23.555Z","deletedAt": null,"role":"manager","updatedBy": 40051333,"createdBy": 40051333,"isPrimary": true,"id": 2,"userId": 40051333,"projectId": 2,"deletedBy": null,"updatedAt":"2019-06-20T13:43:23.625Z"}],"version":"v2","directProjectId": null,"billingAccountId": null,"estimatedPrice": null,"actualPrice": null,"details": null,"cancelReason": null,"templateId": null,"deletedBy": null,"attachments": [],"phases": [],"projectUrl":"https://connect.topcoder-dev.com/projects/2","invites": [],"utm": null}
  • Run the producer and then write some invalid messages into the console to send to the project.action.update topic:

    docker exec -it project-processor-es-kafka /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.update

    In the console, write messages, one message per line:

    { "topic": "project.action.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "123", "track": "Code", "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }

    { "topic": "project.action.update", "originator": "project-api", "timestamp": "2019-02-17T01:00:00", "mime-type": "application/json", "payload": { "id": "173803d3-019e-4033-b1cf-d7205c7f774c", "typeId": "8e17090c-465b-4c17-b6d9-dfa16300b0ff", "track": ["Code"], "name": "test3", "description": "desc3", "timelineTemplateId": "8e17090c-465b-4c17-b6d9-dfa16300b0dd", "groups": ["group2", "group3"], "updated": "2019-02-17T01:00:00", "updatedBy": "admin" } }

    [ [ [ } } }

  • Then in the app console, you will see error messages.

  • To test the health check API, run export PORT=5000, start the processor, then browse http://localhost:5000/health in a browser, and you will see the result {"checksRun":1}
