Commit

Merge branch 'v6.x'
danibix95 committed Nov 27, 2023
2 parents 8bdef5c + 4a23db3 commit c4555bf
Showing 13 changed files with 313 additions and 248 deletions.
1 change: 1 addition & 0 deletions .github/workflows/main.yml
@@ -5,6 +5,7 @@ on:
pull_request:
branches:
- main
- v6.x
paths:
- ".github/workflows/**"
- "lib/**"
10 changes: 10 additions & 0 deletions CHANGELOG.md
@@ -22,6 +22,16 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
- updated `@fastify/mongodb` to v8.0.0
- updated `@fastify/multipart` to v8.0.0

## 6.9.4 - 2023-11-22

### Added

- [#225](https://github.com/mia-platform/crud-service/pull/225): `MONGODB_MAX_IDLE_TIME_MS` env variable to control the MongoDB `maxIdleTimeMS` connection option (default set to 0 for backward compatibility, meaning opened connections remain open indefinitely)
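
A minimal sketch of what this option maps to in the MongoDB Node.js driver (the connection string handling and surrounding wiring are illustrative, not the service's actual bootstrap code):

```js
'use strict'

const { MongoClient } = require('mongodb')

// MONGODB_MAX_IDLE_TIME_MS is forwarded to the driver's `maxIdleTimeMS`
// connection-pool option; 0 (the default) means pooled connections are never
// closed for being idle, which preserves the previous behavior.
const client = new MongoClient(process.env.MONGODB_URL, {
  maxIdleTimeMS: Number(process.env.MONGODB_MAX_IDLE_TIME_MS ?? 0),
})

async function main() {
  await client.connect()
  // ...perform queries through client.db()...
  await client.close()
}

main().catch(console.error)
```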

### Fixed

- [#227](https://github.com/mia-platform/crud-service/pull/227): create indexes with limited promise concurrency to prevent connection-creation spikes at boot
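
The concurrency cap can be pictured as running the index-creation tasks in small batches rather than all at once; the sketch below (batch size and helper names are illustrative) has the same shape as the `promises.splice(0, 5)` batching visible in the `loadModels` code further down in this diff.

```js
'use strict'

// Illustrative sketch: run at most BATCH_SIZE index-creation tasks at a time,
// so that booting with many collections does not open a burst of connections.
const BATCH_SIZE = 5

async function createAllIndexes(tasks) {
  // `tasks` is an array of async thunks, e.g. () => createIndexes(collection, indexes, prefix)
  const pending = [...tasks]
  while (pending.length > 0) {
    const batch = pending.splice(0, BATCH_SIZE).map(task => task())
    // eslint-disable-next-line no-await-in-loop
    await Promise.all(batch)
  }
}
```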

## 6.9.3 - 2023-11-21

### Changed
2 changes: 1 addition & 1 deletion Dockerfile
@@ -43,7 +43,7 @@ LABEL maintainer="Mia Platform Core Team<[email protected]>" \
name="CRUD Service" \
description="HTTP interface to perform CRUD operations on configured MongoDB collections" \
eu.mia-platform.url="https://www.mia-platform.eu" \
eu.mia-platform.version="6.9.3"
eu.mia-platform.version="6.9.4"

ENV NODE_ENV=production
ENV LOG_LEVEL=info
2 changes: 1 addition & 1 deletion README.md
@@ -226,7 +226,7 @@ The CRUD service offers the functionality to modify a view by editing the underl

To enable this feature, you need to include the `enableLookup: true` property in the view configuration JSON. By default, this setting is set to false.

For more information on correctly configuring and understanding the capabilities of writable views, please refer to the [writable views documentation](./docs/50_Writable_views.md).
For more information on correctly configuring and understanding the capabilities of writable views, please refer to the [writable views documentation](./docs/50_Writable_Views.md).
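
As a sketch only, a view definition with the flag enabled might look like the following; the top-level keys mirror the ones read from collection definitions in `index.js` (`name`, `endpointBasePath`, `type`, `source`, `pipeline`, `enableLookup`), while the concrete values are invented for illustration.

```js
'use strict'

// Hypothetical writable-view definition (all values are illustrative).
module.exports = {
  name: 'orders-view',
  endpointBasePath: '/orders-view',
  type: 'view',
  source: 'orders', // underlying collection the view reads from
  enableLookup: true, // opt-in flag described above (defaults to false)
  pipeline: [
    {
      $lookup: {
        from: 'riders',
        localField: 'riderId',
        foreignField: '_id',
        as: 'rider',
      },
    },
  ],
}
```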

### Headers

1 change: 1 addition & 0 deletions docs/20_Configuration.md
@@ -38,6 +38,7 @@ Below you can find all the environment variables that you can edit.
| ENABLE_TRACING | Boolean | Optional | `false` | Specifies if OpenTelemetry tracing should be enabled. It is possible to find more documentation [here](https://docs.mia-platform.eu/docs/runtime_suite_libraries/lc39/main-entrypoint#opentelemetry-tracing-experimental) |
| ENABLE_STRICT_OUTPUT_VALIDATION | Boolean | Optional | `false` | Specifies whether service responses should be strictly compliant with the schema (when enabled, the service fails to return values if the underlying collection contains documents that do not comply with the schema) |
| MAX_MULTIPART_FILE_BYTES | Number | Optional | 100 | Sets the max size (MB) that can be processed in multipart requests |
| MONGODB_MAX_IDLE_TIME_MS | Number | Optional | 0 | Controls the MongoDB `maxIdleTimeMS` connection option (default set to 0 for backward compatibility, meaning opened connections remain open indefinitely) |
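
For instance, a megabyte cap such as `MAX_MULTIPART_FILE_BYTES` would typically be translated into the byte-based `limits.fileSize` option of `@fastify/multipart`; the registration below is only a sketch of that mapping, not the service's exact code.

```js
'use strict'

const fastifyMultipart = require('@fastify/multipart')

// Sketch: convert the MB value from the environment into bytes for the plugin.
async function registerMultipart(fastify) {
  const maxFileMegabytes = Number(process.env.MAX_MULTIPART_FILE_BYTES ?? 100)
  await fastify.register(fastifyMultipart, {
    limits: { fileSize: maxFileMegabytes * 1024 * 1024 },
  })
}

module.exports = registerMultipart
```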

:::warning
Using `ALLOW_DISK_USE_IN_QUERIES` (either with `true` or `false` values) with a MongoDB version lower than 4.4 will make all the GET calls unusable since the MongoDB cluster will raise an error for the unrecognized option `allowDiskUse`.
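
To make the warning concrete: the flag ends up as the driver-level `allowDiskUse` query option, which MongoDB servers older than 4.4 reject on `find` commands; the snippet below is a sketch of that forwarding, with the function name and wiring invented for illustration.

```js
'use strict'

// Sketch: ALLOW_DISK_USE_IN_QUERIES is forwarded to the driver as a query option.
// On MongoDB < 4.4 the `allowDiskUse` option is not recognized on find commands,
// so every GET (which issues a find) fails with an error.
async function listDocuments(collection, query, allowDiskUse) {
  const options = allowDiskUse !== undefined ? { allowDiskUse } : {}
  return collection.find(query, options).toArray()
}
```
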
2 changes: 1 addition & 1 deletion docs/50_Writable_Views.md
@@ -385,7 +385,7 @@ When a lookup field is an _array_ of references, the `$push`, `$addToSet` and `$

## One-to-many Relationship

This configuration example behaves similarly to the previous one, where a single rider is returned per order. In this case, however, multiple riders may be associated with a single order (let's assume this is logical just within the scope of this example). To achieve this different behaviour, it is sufficient to remove the `$unwind` operator from the view aggregation pipeline, so that multiple records are returned.
This configuration example behaves similarly to the previous one, where a single rider is returned per order. In this case, however, multiple riders may be associated with a single order (let's assume this is logical just within the scope of this example). To achieve this different behavior, it is sufficient to remove the `$unwind` operator from the view aggregation pipeline, so that multiple records are returned.

Below are provided the collection and view definition:

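As a sketch of the pipeline shape described in the one-to-many paragraph above (collection and field names here are hypothetical, not those of the documented example), keeping only the `$lookup` and omitting `$unwind` leaves the joined documents as an array:

```js
'use strict'

// Hypothetical aggregation pipeline for a one-to-many writable view:
// with no $unwind stage after the $lookup, `riders` stays an array,
// so each order can reference any number of riders.
module.exports = [
  {
    $lookup: {
      from: 'riders',
      localField: 'riderIds',
      foreignField: '_id',
      as: 'riders',
    },
  },
  // In the one-to-one variant, a `{ $unwind: '$riders' }` stage would follow here.
]
```
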
Empty file removed docs/50_Writable_views.md
Empty file.
1 change: 1 addition & 0 deletions envSchema.js
@@ -40,6 +40,7 @@ const noCryptSchema = {

const properties = {
MONGODB_URL: { type: 'string', description: 'the mongodb connection string' },
MONGODB_MAX_IDLE_TIME_MS: { type: 'number', description: 'idle time (in ms) to control the MongoDB maxIdleTimeMS connection option (default: 0, meaning there is no max idle time and connections remain open indefinitely)', default: 0 },
COLLECTION_DEFINITION_FOLDER: { type: 'string', description: 'a path where all collections are defined' },
VIEWS_DEFINITION_FOLDER: { type: 'string', description: 'a path where all views are defined' },
USER_ID_HEADER_KEY: { type: 'string', description: 'Header key used to know which user makes the request' },
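
These properties are consumed through `@fastify/env`, which validates `process.env` against the schema and exposes the defaulted values on the Fastify instance; the registration below is a simplified sketch of that mechanism (it assumes `envSchema.js` exports the JSON schema object), not the service's actual bootstrap.

```js
'use strict'

const fastifyEnv = require('@fastify/env')
const fastifyEnvSchema = require('./envSchema') // assumed to export the schema above

// Sketch: after registration, the validated values (with defaults applied) are
// available under fastify.config, e.g. fastify.config.MONGODB_MAX_IDLE_TIME_MS.
async function loadEnv(fastify) {
  await fastify.register(fastifyEnv, {
    schema: fastifyEnvSchema,
    confKey: 'config', // default decoration key used by @fastify/env
    data: process.env,
  })
}

module.exports = loadEnv
```
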
249 changes: 8 additions & 241 deletions index.js
@@ -14,63 +14,40 @@
* limitations under the License.
*/

/* eslint-disable no-await-in-loop */
'use strict'

const fp = require('fastify-plugin')
const fastifyEnv = require('@fastify/env')
const fastifyMultipart = require('@fastify/multipart')

const Ajv = require('ajv')
const ajvFormats = require('ajv-formats')
const ajvKeywords = require('ajv-keywords')

const lomit = require('lodash.omit')
const lunset = require('lodash.unset')

const { readdirSync } = require('fs')
const { join } = require('path')
const { ObjectId } = require('mongodb')
const { JSONPath } = require('jsonpath-plus')

const myPackage = require('./package')
const QueryParser = require('./lib/QueryParser')
const CrudService = require('./lib/CrudService')
const AdditionalCaster = require('./lib/AdditionalCaster')
const fastifyEnvSchema = require('./envSchema')

const httpInterface = require('./lib/httpInterface')
const JSONSchemaGenerator = require('./lib/JSONSchemaGenerator')
const createIndexes = require('./lib/createIndexes')
const { castCollectionId, getDatabaseNameByType } = require('./lib/pkFactories')
const loadModels = require('./lib/loadModels')
const joinPlugin = require('./lib/joinPlugin')

const { castCollectionId } = require('./lib/pkFactories')
const {
aggregationConversion,
SCHEMA_CUSTOM_KEYWORDS,
OBJECTID,
SETCMD,
PUSHCMD,
PULLCMD,
UNSETCMD,
ADDTOSETCMD,
} = require('./lib/consts')
const joinPlugin = require('./lib/joinPlugin')
const generatePathFieldsForRawSchema = require('./lib/generatePathFieldsForRawSchema')
const { getIdType, registerMongoInstances } = require('./lib/mongo/mongo-plugin')
const mergeViewsInCollections = require('./lib/mergeViewsInCollections')

const { compatibilityModelJsonSchema, modelJsonSchema } = require('./lib/model.jsonschema')
const fastifyEnvSchema = require('./envSchema')
const { registerMongoInstances } = require('./lib/mongo/mongo-plugin')
const { getAjvResponseValidationFunction } = require('./lib/validatorGetters')
const { JSONPath } = require('jsonpath-plus')
const { pointerSeparator } = require('./lib/JSONPath.utils')

const ajv = new Ajv({ useDefaults: true })
ajvFormats(ajv)
ajvKeywords(ajv)

const compatibilityValidate = ajv.compile(compatibilityModelJsonSchema)
const validate = ajv.compile(modelJsonSchema)

const PREFIX_OF_INDEX_NAMES_TO_PRESERVE = 'preserve_'
const VIEW_TYPE = 'view'

async function registerCrud(fastify, { modelName, isView }) {
if (!fastify.mongo) { throw new Error('`fastify.mongo` is undefined!') }
if (!modelName) { throw new Error('`modelName` is undefined!') }
@@ -228,216 +205,6 @@ async function iterateOverCollectionDefinitionAndRegisterCruds(fastify) {
}
}

function buildModelDependencies(fastify, collectionDefinition, collection) {
const {
defaultState,
} = collectionDefinition

const allFieldNames = collectionDefinition.fields
? collectionDefinition.fields.map(field => field.name)
: Object.keys(collectionDefinition.schema.properties)
const pathsForRawSchema = generatePathFieldsForRawSchema(fastify.log, collectionDefinition)

// TODO: make this configurable
const crudService = new CrudService(
collection,
defaultState,
{ allowDiskUse: fastify.config.ALLOW_DISK_USE_IN_QUERIES },
)
const queryParser = new QueryParser(collectionDefinition, pathsForRawSchema)
const additionalCaster = new AdditionalCaster(collectionDefinition)
const jsonSchemaGenerator = new JSONSchemaGenerator(
collectionDefinition,
{},
fastify.config.CRUD_LIMIT_CONSTRAINT_ENABLED,
fastify.config.CRUD_MAX_LIMIT,
fastify.config.ENABLE_STRICT_OUTPUT_VALIDATION
)
const jsonSchemaGeneratorWithNested = new JSONSchemaGenerator(
collectionDefinition,
pathsForRawSchema,
fastify.config.CRUD_LIMIT_CONSTRAINT_ENABLED,
fastify.config.CRUD_MAX_LIMIT
)

return {
crudService,
queryParser,
castResultsAsStream: () => additionalCaster.castResultsAsStream(),
castItem: (item) => additionalCaster.castItem(item),
allFieldNames,
jsonSchemaGenerator,
jsonSchemaGeneratorWithNested,
}
}

function createLookupModel(fastify, viewDefinition, mergedCollections) {
const lookupModels = []
const viewLookups = viewDefinition.pipeline
.filter(pipeline => '$lookup' in pipeline)
.map(lookup => Object.values(lookup).shift())

for (const lookup of viewLookups) {
const { from, pipeline } = lookup
const lookupCollection = mergedCollections.find(({ name }) => name === from)
const lookupIdType = getIdType(lookupCollection)
const lookupCollectionMongo = fastify.mongo[getDatabaseNameByType(lookupIdType)].db.collection(from)

const lookupProjection = pipeline?.find(({ $project }) => $project)?.$project ?? {}
const parsedLookupProjection = []
const lookupCollectionDefinition = {
...lomit(viewDefinition, ['fields']),
schema: {
type: 'object',
properties: {},
required: [],
},
}

Object.entries(lookupProjection)
.forEach(([fieldName, schema]) => {
parsedLookupProjection.push({ [fieldName]: schema })
const conversion = Object.keys(schema).shift()
if (schema !== 0) {
if (!aggregationConversion[conversion]) {
throw new Error(`Invalid view lookup definition: no explicit type found in ${JSON.stringify({ [fieldName]: schema })}`)
}
lookupCollectionDefinition.schema.properties[fieldName] = {
type: aggregationConversion[conversion],
}
}
})

const lookupModel = {
...buildModelDependencies(fastify, lookupCollectionDefinition, lookupCollectionMongo),
definition: lookupCollectionDefinition,
lookup,
parsedLookupProjection,
}
lookupModels.push(lookupModel)
}
return lookupModels
}

// eslint-disable-next-line max-statements
async function loadModels(fastify) {
const { collections, views = [] } = fastify
const mergedCollections = mergeViewsInCollections(collections, views)

fastify.log.trace({ collectionNames: mergedCollections.map(coll => coll.name) }, 'Registering CRUDs and Views')

const models = {}
const existingStringCollection = []
const existingObjectIdCollection = []

// eslint-disable-next-line max-statements
const promises = mergedCollections.map(async(collectionDefinition) => {
// avoid validating the collection definition twice, since it would only
// match one of the two, depending on the existence of schema property
if (!collectionDefinition.schema) {
if (!compatibilityValidate(collectionDefinition)) {
fastify.log.error({ collection: collectionDefinition.name }, compatibilityValidate.errors)
throw new Error(`invalid collection definition: ${JSON.stringify(compatibilityValidate.errors)}`)
}
} else if (!validate(collectionDefinition)) {
fastify.log.error(validate.errors)
throw new Error(`invalid collection definition: ${JSON.stringify(validate.errors)}`)
}

const {
source: viewSourceCollectionName,
name: collectionName,
endpointBasePath: collectionEndpoint,
type,
indexes = [],
enableLookup,
source,
pipeline,
} = collectionDefinition ?? {}
const isView = type === VIEW_TYPE
const viewLookupsEnabled = isView && enableLookup

fastify.log.trace({ collectionEndpoint, collectionName }, 'Registering CRUD')

const collectionIdType = getIdType(collectionDefinition)
const collection = await fastify
.mongo[getDatabaseNameByType(collectionIdType)]
.db
.collection(collectionName)
const modelDependencies = buildModelDependencies(fastify, collectionDefinition, collection)

let viewDependencies = {}
if (viewLookupsEnabled) {
const sourceCollection = mergedCollections.find(mod => mod.name === viewSourceCollectionName)
const sourceCollectionDependencies = buildModelDependencies(fastify, sourceCollection)

const viewIdType = getIdType(sourceCollection)
const sourceCollectionMongo = await fastify
.mongo[getDatabaseNameByType(viewIdType)]
.db
.collection(viewSourceCollectionName)
viewDependencies = buildModelDependencies(fastify, collectionDefinition, sourceCollectionMongo)
viewDependencies.queryParser = sourceCollectionDependencies.queryParser
viewDependencies.allFieldNames = sourceCollectionDependencies.allFieldNames
viewDependencies.lookupsModels = createLookupModel(fastify, collectionDefinition, mergedCollections)
}

models[getCollectionNameFromEndpoint(collectionEndpoint)] = {
definition: collectionDefinition,
...modelDependencies,
viewDependencies,
isView,
viewLookupsEnabled,
}

if (isView) {
const existingCollections = collectionIdType === OBJECTID ? existingObjectIdCollection : existingStringCollection
if (existingCollections.length === 0) {
existingCollections.push(
...(
await fastify
.mongo[getDatabaseNameByType(collectionIdType)]
.db
.listCollections({}, { nameOnly: true })
.toArray()
)
.filter(({ type: collectionType }) => collectionType === VIEW_TYPE)
.map(({ name }) => name)
)
}

if (existingCollections.includes(collectionName)) {
await fastify
.mongo[getDatabaseNameByType(collectionIdType)]
.db
.collection(collectionName)
.drop()
}

return fastify
.mongo[getDatabaseNameByType(collectionIdType)]
.db
.createCollection(
collectionName,
{
viewOn: source,
pipeline,
}
)
}

return createIndexes(collection, indexes, PREFIX_OF_INDEX_NAMES_TO_PRESERVE)
})
while (promises.length) {
await Promise.all(promises.splice(0, 5))
}
fastify.decorate('models', models)
}

function getCollectionNameFromEndpoint(endpointBasePath) {
return endpointBasePath.replace('/', '').replace(/\//g, '-')
}

const validCrudFolder = path => !['.', '..'].includes(path) && /\.js(on)?$/.test(path)

async function setupCruds(fastify) {