Migrating from MongoDB to Postgresql #9711
-
Hello! Now that the PostgreSQL adapter is out of beta, I was wondering whether it is feasible for us to migrate from MongoDB to PostgreSQL. We already have a PostgreSQL server, so migrating would cut down on costs and reduce the number of technologies in our stack. I couldn't find any resources on this topic. The basic question is: how hard would it be, and what options would we have, to migrate the data we currently store in MongoDB (we use Payload CMS 2.x) to PostgreSQL (with Payload 3)? Thanks!
-
You can create a script that paginates the Payload v2 data, looping while `nextPage` is true:

```ts
import { getPayload } from "payload";
import { SLUGS } from "../../src/collections/Constants";
import config from "../../src/payload.config";

type MigrationTask = () => Promise<void>;

async function main() {
  const payload = await getPayload({ config });

  const failedDocs: { doc: unknown; error: unknown }[] = []; // documents that failed, with their errors
  const tasks: MigrationTask[] = [];
  let hasNextPage = true;
  let page = 1; // Payload pages are 1-indexed

  while (hasNextPage) {
    const batchDocs = await payload.find({
      collection: SLUGS.TesauroDerechoInternacional,
      pagination: true,
      page,
      limit: 100,
    });

    batchDocs.docs.forEach((doc) => {
      tasks.push(async () => {
        try {
          const document = await payload.create({
            collection: 'COLLECTION NAME ON v3',
            data: {
              ...doc,
              owner_type: '@IIRESODH',
            },
          });
          console.log('Document saved:', document);
        } catch (error) {
          console.error('Error processing document:', error);
          failedDocs.push({ doc, error });
        }
      });
    });

    if (batchDocs.hasNextPage) {
      page = batchDocs.nextPage as number;
    } else {
      hasNextPage = false;
    }
  }

  // Run the migration tasks in batches so we don't overload the server
  async function processInBatches(funcs: MigrationTask[], batchSize: number) {
    for (let i = 0; i < funcs.length; i += batchSize) {
      console.log('Remaining documents:', funcs.length - i);
      const batch = funcs.slice(i, i + batchSize);
      await Promise.all(batch.map((func) => func()));
    }
  }

  // Process the documents in batches of 5
  await processInBatches(tasks, 5);

  // Report any documents that failed to migrate
  if (failedDocs.length > 0) {
    console.log('Documents that failed to process:', failedDocs);
  }
}

(async () => {
  try {
    await main();
    console.log('Finished');
    process.exit(0);
  } catch (error) {
    console.error('Error:', error);
    process.exit(1);
  }
})();
```
Then send the data to the Payload v3 endpoint, or use the project's import.
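The other option mentioned above, sending each document to the v3 deployment's REST endpoint instead of calling the Local API, could look roughly like the sketch below. Everything here is an assumption to adapt to your project: `BASE_URL`, the example `posts` collection slug, and the `users API-Key` authorization scheme (Payload's API-key header format, which depends on how auth is configured in your v3 project).

```typescript
// Sketch: build a REST request for creating one document in a Payload v3
// collection. POST /api/<collection-slug> is Payload's REST create route.
const BASE_URL = "https://example.com"; // assumption: your v3 deployment URL

function buildCreateRequest(
  collection: string,
  doc: Record<string, unknown>,
  apiKey: string,
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: `${BASE_URL}/api/${collection}`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // assumption: API-key auth on a "users" collection
        Authorization: `users API-Key ${apiKey}`,
      },
      body: JSON.stringify(doc),
    },
  };
}

// Usage, inside the per-document task from the script above:
//   const { url, init } = buildCreateRequest("posts", doc, process.env.PAYLOAD_API_KEY!);
//   const res = await fetch(url, init);
//   if (!res.ok) throw new Error(`Create failed: ${res.status}`);
```

Separating request construction from the `fetch` call keeps the migration loop easy to retry and to dry-run against a log instead of a live server.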