[WIP] Add migration support for sqlite, better validation #4

Status: Open. Wants to merge 33 commits into base: master.

Commits:
All 33 commits authored by cameronblandford:

6c5d6cb  Begin working on migrations (Mar 26, 2019)
0f2b1d7  Add support for remaking and altering existing columns (Mar 27, 2019)
42a037f  Merge branch 'master' of https://github.com/221B-io/json-to-knex into… (Mar 29, 2019)
a37a49e  prettify validation example (Mar 29, 2019)
126a67d  Improve validation example (Mar 29, 2019)
bb08f16  Add postgres test file (Apr 1, 2019)
d359907  Show postgres working (Apr 2, 2019)
257a929  Move schemas into own file, add stringify util for readability/space (Apr 3, 2019)
b62b40e  Get postgres tests working (Apr 4, 2019)
5963782  Remove obsolete comments (Apr 4, 2019)
9c35a09  Add basics for private migrations table management (Apr 5, 2019)
6ea88ae  Add instructions on how to migrate between schemas (Apr 5, 2019)
7835289  Update README.md (Apr 8, 2019)
190af29  Add full schema migration function (Apr 8, 2019)
a280f84  Merge branch 'feature/migrations' of https://github.com/221B-io/json-… (Apr 8, 2019)
ab4c6a3  Add cli stubs (Apr 8, 2019)
c52291d  Refactor using utility function (Apr 8, 2019)
6354d73  Add non-internal migrating to db manager (Apr 8, 2019)
7c2c08f  Fix some bugs in the manager, add a test for the manager (Apr 8, 2019)
c08e9de  Add full, passing tests of functionality with postgres (Apr 9, 2019)
7fc93df  Fix tests, some bugs, add test stub for validation specifically (Apr 9, 2019)
2503f90  Fix async issue (Apr 12, 2019)
646ad6d  Add JSDoc stubs for methods (Apr 12, 2019)
8dc5439  Update docs (Apr 12, 2019)
7363f68  Add comments (Apr 12, 2019)
f7e0383  Remove unnecessary knex initialization (Apr 16, 2019)
7703aae  Change raw queries to knex chains (Apr 16, 2019)
3efc3be  Fix async issues, clean up (Apr 18, 2019)
a52bdfd  Fix bug where table and column changes would happen in wrong order (Apr 18, 2019)
184b749  Remove obsolete comments (Apr 22, 2019)
620e693  Attempt to add CI (Apr 22, 2019)
1a53631  Fix copy/paste typo (Apr 22, 2019)
f48c7f8  Start to fix tests (Apr 26, 2019)
8 changes: 8 additions & 0 deletions .travis.yml
@@ -0,0 +1,8 @@
language: node_js
node_js:
- "stable"
cache:
directories:
- "node_modules"
script:
- npm test
9 changes: 9 additions & 0 deletions README.md
@@ -1,5 +1,14 @@
# JSON to Knex


```bash
npm install json-to-knex
j2k init # creates json-migrations folder, .jtkconfig file
# write your schema
j2k migrate
# generates your tables for you!
```

## What it does

JSON to Knex dynamically creates tables by parsing a JSON structure into a series of chained knexjs functions using the input you specify.
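As a sketch of that translation (using a hypothetical schema shaped like the one in this PR's validation example, and a mock recorder in place of a real knex table builder, so this is not the library's exact API), each column definition maps onto chained builder calls such as `table.integer("age").unsigned()`:

```javascript
// Illustrative sketch only: a mock "table" that records chained calls,
// standing in for a real knex table builder.
const schema = {
  tables: [
    {
      name: "users",
      columns: [
        { name: "age", type: "integer", unsigned: true },
        { name: "firstName", type: "string" }
      ]
    }
  ]
};

function mockTable() {
  const calls = [];
  const chain = new Proxy({}, {
    get: (_, method) => (...args) => {
      calls.push(`${method}(${args.map(a => JSON.stringify(a)).join(", ")})`);
      return chain; // every builder method returns the chain, as knex does
    }
  });
  return { chain, calls };
}

const { chain: table, calls } = mockTable();
for (const col of schema.tables[0].columns) {
  let c = table[col.type](col.name);  // e.g. table.integer("age")
  if (col.unsigned) c = c.unsigned(); // extra properties become chained modifiers
}
console.log(calls);
// → [ 'integer("age")', 'unsigned()', 'string("firstName")' ]
```

With a real knex instance the same loop would run inside `knex.schema.createTable(name, table => { ... })`, which is what the library's `createTable` wraps.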
Empty file added cli/applyAll.js
Empty file.
8 changes: 8 additions & 0 deletions cli/applyNextMigration.js
@@ -0,0 +1,8 @@
const fs = require("fs");
const migrator = require("../lib/migrations");
const dbManager = require("../lib/db-manager");
// check for dbSchema.json

(async () => {
// stub
})();
8 changes: 8 additions & 0 deletions cli/init.js
@@ -0,0 +1,8 @@
const fs = require("fs");
const migrator = require("../lib/migrations");
const dbManager = require("../lib/db-manager");
// check for dbSchema.json

(async () => {
await dbManager.makeMigrationsTable();
})();
Empty file added cli/rollbackAll.js
Empty file.
Empty file added cli/rollbackMigration.js
Empty file.
96 changes: 59 additions & 37 deletions examples/validation-from-disk.js
@@ -1,53 +1,75 @@
const Ajv = require('ajv');
const fse = require('fse');
const path = require('path');
const Ajv = require("ajv");
const fse = require("fse");
const path = require("path");

const getSchema = (uri ) => {
return request.json(uri).then(function (res) {
const getSchema = uri => {
return request.json(uri).then(function(res) {
if (res.statusCode >= 400)
throw new Error('Loading error: ' + res.statusCode);
throw new Error("Loading error: " + res.statusCode);
return res.body;
});
}

};

// const loadSchema = getSchema;

async function go() {
const base = '../sql-json-schema/'
const base = "../sql-json-schema/";

const loadSchema = (uri) => {
const loadSchema = uri => {
return fse.readJson(path.join(base, uri));
}
};

const schemaSchema = await fse.readJson(path.join(base, 'schema.schema.json'))
const ajv = new Ajv({ loadSchema, });
const schemaSchema = await fse.readJson(
path.join(base, "schema.schema.json")
);
const ajv = new Ajv({ loadSchema });
require("ajv-merge-patch")(ajv); // add merge and patch compatibility

ajv.compileAsync(
schemaSchema
).then((validate) => {
const valid = validate(
{
"tables": [
{
"name": "users",
"columns": [
{
"name": "firstName",
"type": "float",
scale: 'something'
}
]
}
]
}
)
if(!valid) {
console.log('Error!');
ajv.compileAsync(schemaSchema).then(validate => {
const valid = validate({
tables: [
{
name: "users",
columns: [
{
name: "age",
type: "integer",
unsigned: false
},
{
name: "firstName",
type: "string"
},
{
name: "bio",
type: "text",
default: "This bio is currently empty."
},
{
name: "age",
type: "integer"
},
{
name: "mightBeValid",
type: "integer"
// , unrecognizedField: 5 // will invalidate
},
{
name: "friendId",
type: "integer",
unsigned: true
}
]
}
]
});
if (!valid) {
console.log("Error!");
console.log(validate.errors);
} else {
console.log('Valid!');
console.log("Valid!");
}
})
});
}

go()
// go();
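The check this example exercises (a known column type, and no unrecognized properties, as the commented-out `unrecognizedField` hints) can be approximated with a hand-rolled validator. This is an illustrative stand-in with assumed rules, not the Ajv-backed JSON Schema validation the example actually loads from disk:

```javascript
// Hand-rolled stand-in for the column validation above (hypothetical rules;
// the real example validates against a JSON Schema via Ajv.compileAsync).
const KNOWN_TYPES = new Set(["binary", "float", "integer", "string", "text", "uuid"]);
const KNOWN_FIELDS = new Set(["name", "type", "unsigned", "default", "scale"]);

function validateColumn(column) {
  const errors = [];
  if (typeof column.name !== "string") errors.push("missing name");
  if (!KNOWN_TYPES.has(column.type)) errors.push(`unknown type: ${column.type}`);
  for (const key of Object.keys(column)) {
    if (!KNOWN_FIELDS.has(key)) errors.push(`unrecognized field: ${key}`);
  }
  return errors;
}

console.log(validateColumn({ name: "age", type: "integer" }));
// → []
console.log(validateColumn({ name: "mightBeValid", type: "integer", unrecognizedField: 5 }));
// → [ 'unrecognized field: unrecognizedField' ]
```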
130 changes: 87 additions & 43 deletions lib/builder.js
@@ -1,6 +1,7 @@
const _ = require("lodash");
const validateColumn = require("./validation");

// Values that can go in a column's "type" field
const types = {
binary: (table, name, options) => {
return table.binary(name, options.length);
@@ -55,9 +56,15 @@ const types = {
},
uuid: (table, name) => {
return table.uuid(name);
},
foreign: (table, name, options) => {
debugger;
[Review comment from a Member: "Still working on this?"]

console.log(`Creating foreign key ${options.foreign}`);
return table.foreign(options.foreign);
}
};

// Fields that can go in a column
const typeProperties = {
primary: (chain, value) => {
if (value === true) {
@@ -77,9 +84,6 @@ const typeProperties = {
}
return chain;
},
foreign: (table, name) => {
return table.foreign(name);
},
references: (chain, value) => {
return chain.references(value);
},
@@ -100,69 +104,109 @@ const typeProperties = {
},
index: (chain, value) => {
return value ? chain.index() : chain;
},
onDelete: (chain, value) => {
console.log(chain);
console.log(value);
return chain.onDelete(value);
}
};

function createTable(chain, name, schema) {
/**
* Creates a table with columns specified by the schema
* @param {*} chain
* @param {*} name
* @param {*} tableSchema
*/
async function createTable(chain, name, tableSchema) {
// Usually start with knex.createTable....
return chain.createTable(name, table => {
const sawPrimary = false;
return await chain.createTable(name, table => {
_.forEach(tableSchema.columns, column => {
createColumn(table, column);
});
});
}

_.forEach(schema.columns, column => {
const columnName = column.name;
// e.g., id: { type: "increments" }
if (_.isPlainObject(column)) {
validateColumn(column);
if (_.has(types, column.type)) {
let columnChain = types[column.type](table, columnName, column);
const order = _.uniq(
_.concat(
["primary", "type", "unsigned", "references", "inTable"],
_.keys(column)
)
/**
* Creates a column in the passed-in table using the specified column config
* @param {*} table
* @param {*} column
*/
async function createColumn(table, column, alter = false) {
const columnName = column.name;
// e.g., id: { type: "increments" }
debugger;
[Review comment from a Member: "More...."]

if (_.isPlainObject(column) && _.has(types, column.type)) {
let columnChain = types[column.type](table, columnName, column);
// correctly orders the properties to be chained:
// the ones where order is important, then all the rest
const order = _.uniq(
_.concat(
["primary", "type", "unsigned", "references", "inTable"],
_.keys(column)
)
);

// for each item in the order
_.forEach(order, columnPropertyName => {
if (_.has(column, columnPropertyName)) {
// if it exists in the column def, apply the related chain method
const columnProperty = column[columnPropertyName];
if (_.has(typeProperties, columnPropertyName)) {
columnChain = typeProperties[columnPropertyName](
columnChain,
columnProperty
);
_.forEach(order, columnPropertyName => {
if (_.has(column, columnPropertyName)) {
const columnProperty = column[columnPropertyName];
if (_.has(typeProperties, columnPropertyName)) {
columnChain = typeProperties[columnPropertyName](
columnChain,
columnProperty
);
}
}
});
}
} else {
throw Error(
`Column '${
column.type
}' of '${name}:${columnName}' is not of recognized type`
);
}
});
});

// this is only for altering columns that already exist
if (alter) {
columnChain = columnChain.alter();
}
} else {
throw Error(
`Column '${
column.type
}' of '${name}:${columnName}' is not of recognized type`
);
}
}

function createTables(knex, schema) {
/**
* Creates a series of tables based off a db JSON schema
* @param {*} knex a knex instance
* @param {JSON} schema the JSON schema to be used as the blueprint
* @returns {*} a chainable knex schema object
*/
async function createTables(knex, schema) {
let chain = knex.schema;
_.forEach(schema.tables, tableConfig => {
for (let i in schema.tables) {
const tableConfig = schema.tables[i];
const tableName = tableConfig.name;
chain = createTable(chain, tableName, tableConfig);
});
chain = await createTable(chain, tableName, tableConfig);
[Review comment from a Member: "Is the await necessary here?"]

}
return chain;
}

function dropTablesIfExists(knex, schema) {
/**
* Removes all tables listed in the passed-in schema
* @param {*} knex a knex instance
* @param {Object} schema the JSON schema to be used
* @returns {*}
*/
async function dropTablesIfExists(knex, schema) {
let chain = knex.schema;
_.forEach(schema.tables, tableConfig => {
_.forEach(schema.tables.reverse(), async tableConfig => {
const tableName = tableConfig.name;
chain.dropTableIfExists(tableName);
await chain.dropTableIfExists(tableName);
});
return chain;
}

module.exports = {
createColumn,
createTable,
createTables,
dropTablesIfExists
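The property-ordering step in `createColumn` above, `_.uniq(_.concat([...], _.keys(column)))`, puts the order-sensitive chain methods first and then appends any remaining column keys, deduplicated. A plain-JS sketch of the same idea (the sample column is hypothetical):

```javascript
// Plain-JS equivalent of the _.uniq(_.concat(...)) ordering in createColumn:
// order-sensitive chain methods first, then any remaining keys, deduplicated.
function chainOrder(column) {
  const orderSensitive = ["primary", "type", "unsigned", "references", "inTable"];
  return [...new Set([...orderSensitive, ...Object.keys(column)])];
}

const column = {
  name: "friendId",
  type: "integer",
  unsigned: true,
  references: "id",
  inTable: "users"
};
console.log(chainOrder(column));
// → [ 'primary', 'type', 'unsigned', 'references', 'inTable', 'name' ]
```

Iterating that list and applying only the properties actually present on the column reproduces the `_.forEach(order, ...)` loop in the diff.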