Merge branch 'main' into feat/perf-test
Demetrio Marino committed Jan 19, 2024
2 parents b199325 + 29c918d commit 4fb3bf4
Showing 13 changed files with 1,865 additions and 1,590 deletions.
26 changes: 26 additions & 0 deletions .github/dependabot.yml
@@ -10,6 +10,32 @@ updates:
schedule:
interval: "weekly"
open-pull-requests-limit: 10
groups:
fastify:
patterns:
- "@fastify*"
update-types:
- "minor"
- "patch"
miaPlatform:
patterns:
- "@mia-platform*"
update-types:
- "minor"
- "patch"
lodash:
patterns:
- "@lodash*"
update-types:
- "minor"
- "patch"
ajv:
patterns:
- "ajv"
- "ajv-*"
update-types:
- "minor"
- "patch"

- package-ecosystem: "docker"
directory: "/"
15 changes: 13 additions & 2 deletions CHANGELOG.md
@@ -18,17 +18,28 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
### Changed

- remove `additionalProperties` constraints from collection definition schema to allow greater flexibility in adding further config entries
- updated NodeJS version in Dockerfile to v20.8.1
- updated NodeJS version in Dockerfile to v20.11.0
- updated `@fastify/mongodb` to v8.0.0
- updated `@fastify/multipart` to v8.0.0

## 6.9.5 - 2024-01-19

### Added

- [#247](https://github.com/mia-platform/crud-service/pull/247): `xls` and `xlsx` export formats

### Changed

- updated service dependencies
- updated NodeJS version in Dockerfile to v18.19.0

## 6.9.4 - 2023-11-22

### Added

- [#225](https://github.com/mia-platform/crud-service/pull/225): `MONGODB_MAX_IDLE_TIME_MS` env to control MongoDB `maxIdleTimeMs` connection option (default set to 0 for backward compatibility, meaning opened connections remain open indefinitely)

### Fixed

- [#227](https://github.com/mia-platform/crud-service/pull/227): create indexes limiting promises concurrency to prevent connection creation spikes at boot

8 changes: 4 additions & 4 deletions Dockerfile
@@ -1,4 +1,4 @@
FROM node:20.10.0-bullseye-slim as base-with-encryption
FROM node:20.11.0-bullseye-slim as base-with-encryption

WORKDIR /cryptd

@@ -10,7 +10,7 @@ RUN apt-get update && \

########################################################################################################################

FROM node:20.10.0-bullseye-slim as build
FROM node:20.11.0-bullseye-slim as build

ARG COMMIT_SHA=<not-specified>
ENV NODE_ENV=production
@@ -30,7 +30,7 @@ RUN echo "crud-service: $COMMIT_SHA" >> ./commit.sha

# create a CRUD Service image that does not support automatic CSFLE
# and therefore it can be employed by everybody in any MongoDB product
FROM node:20.10.0-bullseye-slim as crud-service-no-encryption
FROM node:20.11.0-bullseye-slim as crud-service-no-encryption

# note: zlib can be removed once node image version is updated
RUN apt-get update \
@@ -43,7 +43,7 @@ LABEL maintainer="Mia Platform Core Team<[email protected]>" \
name="CRUD Service" \
description="HTTP interface to perform CRUD operations on configured MongoDB collections" \
eu.mia-platform.url="https://www.mia-platform.eu" \
eu.mia-platform.version="6.9.4"
eu.mia-platform.version="6.9.5"

ENV NODE_ENV=production
ENV LOG_LEVEL=info
15 changes: 10 additions & 5 deletions docs/60_Usage_Best_Practices.md
@@ -268,13 +268,18 @@ These data can be dynamic or static, but in the case of dynamic data, it is impo

When the dataset of a collection has a high rate of **UPDATE/DELETE** operations compared to the rate of incoming HTTP requests, we suggest avoiding pagination mechanisms in favor of a data streaming approach.

The `GET /export` method exposed by each endpoint associated with a collection opens a data stream in `nd-json` format in the HTTP response. By using this method, the CRUD Service will open **only one cursor** to the MongoDB cluster, and the `ResultSet` will remain unaffected by concurrent **UPDATE/DELETE** operations.
The `GET /export` method exposed by each endpoint associated with a collection streams data in the HTTP response, in one of several supported formats. By using this method, the CRUD Service opens **only one cursor** to the MongoDB cluster, and the `ResultSet` remains unaffected by concurrent **UPDATE/DELETE** operations.

:::info
[ndjson](http://ndjson.org/) is a format that ensures the streaming of data structures, where each record is processed individually and separated by a newline (`\n`) delimiter.
The export format can be specified through the `Accept` header; the supported formats (and their respective header values) are the following:

To properly read this format, it is necessary to specify the header `"Accept: application/x-ndjson"` within the HTTP request. This header informs the server that the client expects the response to be in `nd-json` format.
:::

| Format | Accept Header value | Description |
|--------|---------------------|-------------|
| `json` | `application/json` | Data is exported in JSON format. |
| [`ndjson`](http://ndjson.org/) | `application/x-ndjson` | Data is exported as newline-delimited JSON: each record is processed individually and separated by a newline (`\n`) delimiter. |
| `csv` | `text/csv` | Data is exported in CSV format using a comma as separator (the CSV includes a header with column names). |
| `xlsx` | `application/vnd.openxmlformats-officedocument.spreadsheetml.sheet` | Data is exported in XLSX format (the file includes a header with column names). |
| `xls` | `application/vnd.ms-excel` | Data is exported in XLS format (the file includes a header with column names). |
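
As a purely illustrative sketch (the host, collection name, and output path below are assumptions, and a Node.js 18+ runtime is assumed for the global `fetch`), a client could stream a CSV export to disk like this:

```js
'use strict'

const fs = require('fs')
const { Readable } = require('stream')
const { pipeline } = require('stream/promises')

async function downloadCsvExport() {
  // hypothetical CRUD Service endpoint for a `books` collection
  const response = await fetch('http://crud-service/books/export', {
    headers: { Accept: 'text/csv' },
  })
  if (!response.ok) {
    throw new Error(`export failed with status ${response.status}`)
  }
  // stream the response body straight to a file instead of buffering it in memory
  await pipeline(Readable.fromWeb(response.body), fs.createWriteStream('./books.csv'))
}
```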

In the given scenario, we can make a single HTTP request:

14 changes: 7 additions & 7 deletions lib/httpInterface.js
@@ -57,7 +57,7 @@ const BatchWritableStream = require('./BatchWritableStream')

const { SCHEMAS_ID } = require('./schemaGetters')
const { getAjvResponseValidationFunction, shouldValidateStream, shouldValidateItem } = require('./validatorGetters')
const { getFileMimeParser, getFileMimeStringify } = require('./mimeTypeTransform')
const { getFileMimeParser, getFileMimeStringifiers } = require('./mimeTypeTransform')

const OPTIONS_INCOMPATIBILITY_ERROR_CODE = 2
const UNIQUE_INDEX_MONGO_ERROR_CODE = 11000
@@ -437,8 +437,8 @@ async function handleGetListLookup(request, reply) {
const stateArr = state?.split(',')
const { replyType, streamValidator } = reply.context.config
const contentType = replyType()
const responseParser = getFileMimeStringify(contentType)
if (!responseParser) {
const responseStringifiers = getFileMimeStringifiers(contentType)
if (!responseStringifiers) {
return reply.getHttpError(UNSUPPORTED_MIME_TYPE_STATUS_CODE, `Unsupported file type ${contentType}`)
}

@@ -451,7 +451,7 @@
.stream(),
this.castResultsAsStream(),
streamValidator(),
responseParser(),
...responseStringifiers(),
reply.raw
)
} catch (error) {
@@ -486,8 +486,8 @@ async function handleGetList(request, reply) {
const { replyType, streamValidator } = reply.context.config
const contentType = replyType(accept)

const responseParser = getFileMimeStringify(contentType, {})
if (!responseParser) {
const responseStringifiers = getFileMimeStringifiers(contentType, {})
if (!responseStringifiers) {
return reply.getHttpError(NOT_ACCEPTABLE, `unsupported file type ${contentType}`)
}

@@ -519,7 +519,7 @@
.stream(),
this.castResultsAsStream(),
streamValidator(),
responseParser(),
...responseStringifiers(),
reply.raw
)
} catch (error) {
60 changes: 16 additions & 44 deletions lib/mimeTypeTransform.js
@@ -15,60 +15,32 @@
*/
'use strict'

const ndjson = require('ndjson')
const csvParse = require('csv-parse')
const csvStringify = require('csv-stringify')
const JSONStream = require('JSONStream')
const excel = require('./transformers/excel')
const ndjson = require('./transformers/ndjson')
const json = require('./transformers/json')
const csv = require('./transformers/csv')

module.exports = {
getFileMimeStringify: (contentType, parsingOptions = {}) => {
getFileMimeStringifiers: (contentType, parsingOptions = {}) => {
const mimeStringify = {
'application/x-ndjson': () => ndjson.stringify(),
'application/json': () => JSONStream.stringify(),
'text/csv': () => csvStringify.stringify({
encoding: 'utf8',
delimiter: ',',
escape: '\\',
header: true,
quote: false,
...parsingOptions,
cast: {
object: (value) => {
try {
return { value: JSON.stringify(value), quote: true }
} catch (errs) {
return value
}
},
},
}),
'application/x-ndjson': ndjson(parsingOptions).stringifier,
'application/json': json(parsingOptions).stringifier,
'text/csv': csv(parsingOptions).stringifier,
'application/vnd.ms-excel': excel(parsingOptions).stringifier,
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet': excel(parsingOptions).stringifier,
}

return mimeStringify[contentType]
},
getFileMimeParser: (contentType, parsingOptions) => {
const mimeParser = {
'application/x-ndjson': () => ndjson.parse(),
'application/json': () => JSONStream.parse('*'),
'text/csv': () => csvParse.parse({
encoding: 'utf8',
delimiter: ',',
columns: true,
skip_empty_lines: true,
relax_quotes: true,
escape: '\\',
...parsingOptions,
cast: (value) => {
try {
return JSON.parse(value)
} catch (errs) {
return value
}
},
}),
'application/x-ndjson': ndjson(parsingOptions).parser,
'application/json': json(parsingOptions).parser,
'text/csv': csv(parsingOptions).parser,
'application/vnd.ms-excel': excel(parsingOptions).parser,
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet': excel(parsingOptions).parser,
}

return mimeParser[contentType]
},

}
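
Note that every entry in these maps is now a factory returning an array of transform streams (the Excel stringifier, for instance, yields two), which is why `httpInterface.js` spreads `...responseStringifiers()` into its stream pipeline. A minimal usage sketch follows; the relative require path and the sample data are illustrative assumptions, not part of the commit:

```js
'use strict'

const { pipeline } = require('stream/promises')
const { Readable } = require('stream')
// assumed path, relative to the repository root
const { getFileMimeStringifiers } = require('./lib/mimeTypeTransform')

async function exportAs(contentType, records, destination) {
  const stringifiers = getFileMimeStringifiers(contentType)
  if (!stringifiers) {
    throw new Error(`Unsupported file type ${contentType}`)
  }
  // each factory returns an array of transforms, so it is spread into the pipeline
  return pipeline(Readable.from(records), ...stringifiers(), destination)
}

// e.g. exportAs('text/csv', [{ name: 'a' }, { name: 'b' }], process.stdout)
```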

55 changes: 55 additions & 0 deletions lib/transformers/csv.js
@@ -0,0 +1,55 @@
/*
* Copyright 2024 Mia s.r.l.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict'

const csvParse = require('csv-parse')
const csvStringify = require('csv-stringify')

module.exports = (parsingOptions) => ({
stringifier: () => [csvStringify.stringify({
encoding: 'utf8',
delimiter: ',',
escape: '\\',
header: true,
quote: false,
...parsingOptions,
cast: {
object: (value) => {
try {
return { value: JSON.stringify(value), quote: true }
} catch (errs) {
return value
}
},
},
})],
parser: () => csvParse.parse({
encoding: 'utf8',
delimiter: ',',
columns: true,
skip_empty_lines: true,
relax_quotes: true,
escape: '\\',
...parsingOptions,
cast: (value) => {
try {
return JSON.parse(value)
} catch (errs) {
return value
}
},
}),
})
48 changes: 48 additions & 0 deletions lib/transformers/excel.js
@@ -0,0 +1,48 @@
/*
* Copyright 2024 Mia s.r.l.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict'

const { Transform } = require('stream')
const XLSXTransformStream = require('xlsx-write-stream')

module.exports = () => ({
stringifier: () => {
let headerProcessed = false
const dataTransformer = new Transform({
transform(chunk, _, callback) {
if (!headerProcessed) {
headerProcessed = true
const columns = Object.keys(chunk)
this.push(columns)
}

this.push(
Object.values(chunk)
.map(documentValue => (
typeof documentValue === 'object'
? JSON.stringify(documentValue)
: documentValue
))
)
return callback()
},
objectMode: true,
})
const xlsxTransformer = new XLSXTransformStream()
return [dataTransformer, xlsxTransformer]
},
parser: () => { throw new Error('not implemented') },
})
23 changes: 23 additions & 0 deletions lib/transformers/json.js
@@ -0,0 +1,23 @@
/*
* Copyright 2024 Mia s.r.l.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict'

const JSONStream = require('JSONStream')

module.exports = () => ({
stringifier: () => [JSONStream.stringify()],
parser: () => JSONStream.parse('*'),
})
23 changes: 23 additions & 0 deletions lib/transformers/ndjson.js
@@ -0,0 +1,23 @@
/*
* Copyright 2024 Mia s.r.l.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict'

const ndjson = require('ndjson')

module.exports = () => ({
stringifier: () => [ndjson.stringify()],
parser: () => ndjson.parse(),
})
