This microservice is based upon clm-core and extends its basic functionality with additional features.

This service allows clients to execute a launch request (LTI, CMI5) and then display the content in the client system. It can also translate between the different launch specifications offered by the different tools. The group/user assignments determine whether content is launched in the context of a user.
- MariaDB, set up locally. This service uses a database (DB) to store documents persistently. To establish a connection with MariaDB, the database must be secured through username and password authentication. Therefore, in order to run this service, you must create a database within MariaDB and configure it with a username and password for access control.
  - MariaDB installation: https://mariadb.com/kb/en/getting-installing-and-upgrading-mariadb/
  - Setting the password of a user: https://mariadb.com/kb/en/set-password/
- Node.js 20.x: https://nodejs.org/en/download
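The database and user required for MariaDB above can be created with a few SQL statements. A minimal sketch; `clm` matches the database name used in the configuration examples below, while the user name and password are placeholders you should replace:

```sql
-- Minimal sketch: 'clm_user' and 'change_me' are placeholders;
-- pick your own user name and a strong password.
CREATE DATABASE clm;
CREATE USER 'clm_user'@'localhost' IDENTIFIED BY 'change_me';
GRANT ALL PRIVILEGES ON clm.* TO 'clm_user'@'localhost';
FLUSH PRIVILEGES;
```

The chosen values must later appear in the `MARIA_CONFIG` environment variable described below.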
root
├── api-docs # Open API 3.0.0 definition as .yaml file documenting all routes and data models this service offers.
├── docs # Developer documentation of all functions, classes, interfaces, types this service exposes as an npm package.
├── dist # The built TypeScript project transpiled to JavaScript.
└── src # Business-relevant logic for this web server.
The Entity Relationship Model of the Open Core is shown above.
The clm-ext-launch module does not use resources of its own but leverages various resources from other modules to facilitate the launching of objects in compliance with the respective standard:
User (clm-core)
- Used to enrich user information during the Launch Request.
- Checks for user-specific enrollments.
Enrollment (clm-ext-learning_objects)
- Verifies user-specific assignments in learning objects.
ServiceProvider (clm-ext-service_providers)
- Required to obtain all user-specific service providers and their associated tools.
Tool (clm-ext-tools)
- Necessary to determine the type of tool to be launched. Currently supports the launch standards CMI5, LTI 1.1, and LTI 1.3.
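The interplay of these resources can be sketched as follows. This is not the clm-ext-launch API; all types and the `resolveLaunchTool` helper are hypothetical and only illustrate the flow: an enrollment check first, then a lookup of the tool (and thus the launch standard) among the user's service providers.

```typescript
// Hypothetical sketch of the launch resolution flow described above.
// None of these types or functions are part of the clm-ext-launch API.

type LaunchStandard = 'CMI5' | 'LTI 1.1' | 'LTI 1.3';

interface Tool { id: string; standard: LaunchStandard; }
interface ServiceProvider { id: string; tools: Tool[]; }
interface Enrollment { userId: string; objectId: string; toolId: string; }

// Returns the tool to launch for a user/object pair, or undefined
// when the user has no enrollment for the object.
function resolveLaunchTool(
  userId: string,
  objectId: string,
  enrollments: Enrollment[],   // from clm-ext-learning_objects
  providers: ServiceProvider[], // from clm-ext-service_providers
): Tool | undefined {
  // 1. Check the user-specific assignment.
  const enrollment = enrollments.find(
    (e) => e.userId === userId && e.objectId === objectId,
  );
  if (!enrollment) return undefined;
  // 2. Find the tool among the user's service providers; its
  //    `standard` field decides which launch flow applies.
  return providers
    .flatMap((p) => p.tools)
    .find((t) => t.id === enrollment.toolId);
}
```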
This service functions as a web microservice that can be orchestrated through a gateway and as an npm package to provide functionalities to other CLM extensions. A microservice can build upon the classes/types/interfaces of this service to extend basic functionalities.
- The service's configuration can be customized via the `.env` file. Within this file, the `MARIA_CONFIG` variable should be updated with the appropriate values for your specific database setup. Refer to the `MARIA_CONFIG` row in the table below to see which pipe-separated value maps to which database setting.
- `npm install`
- Copy `.env.default` to `.env` and overwrite the needed properties.
The following table gives an overview of the settings you can change through environment variables:
| Name | Example | Required (Yes/No) | Description |
|---|---|---|---|
| PORT | 3002 | Yes | The port on which the service should be deployed. |
| DEPLOY_URL | HOST_PROTOCOL://HOST_ADDRESS:GATEWAY_PORT/api | Yes | The address where all microservices are to be orchestrated. A `/api` must be appended. |
| MARIA_CONFIG | MARIA_HOST_ADDRESS\|MARIA_PORT\|MARIA_DATABASE\|MARIA_USER\|MARIA_USER_PASSWORD | Yes | A pipe-separated string that must contain the parameters that were previously configured during the installation of MariaDB. |
| TOKEN_SECRET | secret | Yes | Used to sign and verify JWTs for authentication. Has to be the same across all modules of the Open-Core. |
| GATEWAY_URL | http://gateway/api | No | The URL of the application's gateway, which might be used for API routing. This is relevant when using Docker orchestration. Defaults to the example value. |
| LOGIN_HINT_ENCRYPT_KEY | secret | No | A secret key used to encrypt login hints. Relevant for LTI 1.3. Defaults to the example value. |
| KID | 1 | No | The key identifier of the public key to be used. Relevant for LTI 1.3. Defaults to the example value. |
| DISABLE_ERR_RESPONSE | true | No | Flag that controls whether detailed error responses should be returned. Defaults to the example value if not set. |
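Putting the required variables together, a minimal `.env` could look like the fragment below. The values are taken from the examples in this document (a single local service on port 3002 with the `clm` database); adjust them to your setup and never use `secret` as a real token secret:

```
PORT=3002
DEPLOY_URL=http://localhost:3002/api
MARIA_CONFIG=localhost|3306|clm|root|12345
TOKEN_SECRET=secret
```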
3.1 `npm run dev` for development with nodemon

3.2 `npm start` for deployment
- Subsequently, the JSON representation of the Open-API specification should be accessible at: `http://localhost:${PORT}/launch/swagger`
To access the API endpoints detailed in the Open-API specification, an API token is required. This token is generated during the initialization of the clm-core module. For further details, please consult the documentation at clm-core.
- Documentation about all exposed modules can be found under `/docs`.
- Include the package in your project's `package.json` dependencies:

  ```json
  "dependencies": {
    "clm-ext-launch": "git+https://$token:$token@$url_of_package#$branch_name"
  }
  ```
- To use database-dependent DAOs/DTOs, inject `MARIA_CONFIG` into the environment before importing the module:

  a) Manually in the code:

  ```typescript
  process.env.MARIA_CONFIG = "localhost|3306|clm|root|12345";
  import * as core from 'clm-ext-launch';
  ```

  b) Through the `.env` file:

  ```
  MARIA_CONFIG=localhost|3306|clm|root|12345
  ```

  ```typescript
  import * as core from 'clm-ext-launch';
  ```
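As a sketch of what the pipe-separated `MARIA_CONFIG` string encodes (field order as in the table above), the following hypothetical helper splits it into named connection settings. `parseMariaConfig` is not part of the clm-ext-launch package; it only illustrates the format:

```typescript
// Hypothetical helper: splits a MARIA_CONFIG string into its parts.
// Field order follows the table above:
//   host | port | database | user | password
function parseMariaConfig(raw: string) {
  const [host, port, database, user, password] = raw.split('|');
  return { host, port: Number(port), database, user, password };
}

const cfg = parseMariaConfig('localhost|3306|clm|root|12345');
// e.g. cfg.port === 3306 and cfg.database === 'clm'
```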
- Accessible routes for this microservice are available at `http://localhost:PORT/launch/swagger` after starting the service.
- Make sure to set up a reverse proxy to route traffic to the respective microservices as shown in the table.
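For the reverse proxy, a minimal sketch of an nginx rule, assuming the service listens on port 3002 and is mounted under the `/launch` prefix as in the examples above; host, port, and path depend on your deployment:

```nginx
# Hypothetical nginx rule; adjust host/port to your deployment.
location /launch/ {
    proxy_pass http://localhost:3002;
}
```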
The changelog can be found in the CHANGELOG.md file.
Please see the file AUTHORS.md to get in touch with the authors of this project. We will be happy to answer your questions at {[email protected]}
The project is made available under the license in the file LICENSE.txt