Migrate codebase from CommonJS to ES Modules
- Convert all require()/module.exports to import/export across 260+ files
- Add "type": "module" to package.json to enable ESM by default
- Add migrations/package.json with "type": "commonjs" to keep db-migrate compatible
- Convert eslint.config.js to ESM with sourceType: "module"
- Replace __dirname/__filename with import.meta.dirname/import.meta.filename
- Replace require.main === module with process.argv[1] === import.meta.filename
- Remove 'use strict' directives (implicit in ESM)
- Convert dynamic require() in switch statements to static import lookup maps
(dns.js, domains.js, backupformats.js, backupsites.js, network.js)
- Extract self-referencing exports.CONSTANT patterns into standalone const
declarations (apps.js, services.js, locks.js, users.js, mail.js, etc.)
- Lazify SERVICES object in services.js to avoid circular dependency TDZ issues
- Add clearMailQueue() to mailer.js for ESM-safe queue clearing in tests
- Add _setMockApp() to ldapserver.js for ESM-safe test mocking
- Add _setMockResolve() wrapper to dig.js for ESM-safe DNS mocking in tests
- Convert backupupload.js to use dynamic imports so --check exits before
loading the module graph (which requires BOX_ENV)
- Update check-install to use ESM import for infra_version.js
- Convert scripts/ (hotfix, release, remote_hotfix.js, find-unused-translations)
- All 1315 tests passing
Migration stats (AI-assisted using Cursor with Claude):
- Wall clock time: ~3-4 hours
- Assistant completions: ~80-100
- Estimated token usage: ~1-2M tokens
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-14 09:53:14 +01:00
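The "static import lookup map" pattern mentioned above can be sketched as follows. The module names and provider keys here are illustrative stand-ins, not the actual files touched by this migration:

```javascript
// Before (CommonJS), a provider module was loaded lazily inside a switch:
//   switch (provider) { case 's3': return require('./s3.js'); ... }
// After (ESM), every provider is imported statically at the top of the file
// and dispatch happens through a plain lookup object. In real code these
// would be `import s3 from './s3.js';` etc.; object stand-ins keep this
// sketch self-contained and runnable.
const s3 = { name: 's3' };
const filesystem = { name: 'filesystem' };

const PROVIDERS = { s3, filesystem };

function storageApi(provider) {
    const api = PROVIDERS[provider];
    if (!api) throw new Error(`unknown storage provider: ${provider}`);
    return api;
}

console.log(storageApi('s3').name); // prints "s3"
```

Unlike dynamic `require()`, this makes the full module graph statically analyzable, at the cost of loading all providers up front.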
import assert from 'node:assert';
import backupSites from '../backupsites.js';
import BoxError from '../boxerror.js';
import DataLayout from '../datalayout.js';
import logger from '../logger.js';
import hush from '../hush.js';
import fs from 'node:fs';
import HashStream from '../hash-stream.js';
import path from 'node:path';
import ProgressStream from '../progress-stream.js';
import promiseRetry from '../promise-retry.js';
import safe from 'safetydance';
import stream from 'stream/promises';
import { Transform } from 'node:stream';
import tar from 'tar-stream';
import util from 'node:util';
import zlib from 'node:zlib';

const { DecryptStream, EncryptStream } = hush;
const { log, trace } = logger('backupformat/tgz');

// In tar, the entry header contains the file size. If we don't provide it those many bytes, the tar will become corrupt.
// Linux provides no guarantee of how many bytes can be read from a file. This is the case with sqlite and log files
// which are accessed by other processes when tar is in action. This class handles overflow and underflow.
class EnsureFileSizeStream extends Transform {
    constructor(options) {
        super(options);
        this._remaining = options.size;
        this._name = options.name;
    }

    _transform(chunk, encoding, callback) {
        if (this._remaining <= 0) {
            log(`EnsureFileSizeStream: ${this._name} dropping ${chunk.length} bytes`);
            return callback(null);
        }

        if (this._remaining - chunk.length < 0) {
            log(`EnsureFileSizeStream: ${this._name} dropping extra ${chunk.length - this._remaining} bytes`);
            chunk = chunk.subarray(0, this._remaining);
            this._remaining = 0;
        } else {
            this._remaining -= chunk.length;
        }

        callback(null, chunk);
    }

    _flush(callback) {
        if (this._remaining > 0) {
            log(`EnsureFileSizeStream: ${this._name} injecting ${this._remaining} bytes`);
            this.push(Buffer.alloc(this._remaining, 0));
        }
        callback();
    }
}

function addEntryToPack(pack, header, options) {
    assert.strictEqual(typeof pack, 'object');
    assert.strictEqual(typeof header, 'object');
    assert.strictEqual(typeof options, 'object'); // { input }

    return new Promise((resolve, reject) => {
        const packEntry = safe(() => pack.entry(header, function (error) {
            if (error) {
                log(`addToPack: error adding ${header.name} ${header.type} ${error.message}`);
                reject(new BoxError(BoxError.FS_ERROR, error.message));
            } else {
                resolve();
            }
        }));

        if (!packEntry) return reject(new BoxError(BoxError.FS_ERROR, `Failed to add ${header.name}: ${safe.error.message}`));

        if (options?.input) {
            const ensureFileSizeStream = new EnsureFileSizeStream({ name: header.name, size: header.size });
            safe(stream.pipeline(options.input, ensureFileSizeStream, packEntry), { debug: log }); // background. rely on pack.entry callback for promise completion
        }
    });
}

async function addPathToPack(pack, localPath, dataLayout) {
    assert.strictEqual(typeof pack, 'object');
    assert(dataLayout instanceof DataLayout, 'dataLayout must be a DataLayout');
    assert.strictEqual(typeof localPath, 'string');

    const stats = { fileCount: 0, linkCount: 0, dirCount: 0 };

    const queue = [ localPath ];
    while (queue.length) {
        // if (pack.destroyed || outStream.destroyed) break;
        const dir = queue.shift();
        const [readdirError, entries] = await safe(fs.promises.readdir(dir, { withFileTypes: true }));
        if (!entries) {
            log(`tarPack: skipping directory ${dir}: ${readdirError.message}`);
            continue;
        }
        const subdirs = [];
        for (const entry of entries) {
            const abspath = path.join(dir, entry.name);
            const headerName = dataLayout.toRemotePath(abspath);
            if (entry.isFile()) {
                const [openError, handle] = await safe(fs.promises.open(abspath, 'r'));
                if (!handle) { log(`tarPack: skipping file, could not open ${abspath}: ${openError.message}`); continue; }
                const [statError, stat] = await safe(handle.stat());
                if (!stat) { log(`tarPack: skipping file, could not stat ${abspath}: ${statError.message}`); continue; }
                const header = { name: headerName, type: 'file', mode: stat.mode, size: stat.size, uid: process.getuid(), gid: process.getgid() };
                if (stat.size > 8589934590 || entry.name.length > 99) header.pax = { size: stat.size };
                const input = handle.createReadStream({ autoClose: true });
                await addEntryToPack(pack, header, { input });
                ++stats.fileCount;
            } else if (entry.isDirectory()) {
                const header = { name: headerName, type: 'directory', uid: process.getuid(), gid: process.getgid() };
                subdirs.push(abspath);
                await addEntryToPack(pack, header, { /* options */ });
                ++stats.dirCount;
            } else if (entry.isSymbolicLink()) {
                const [readlinkError, site] = await safe(fs.promises.readlink(abspath));
                if (!site) { log(`tarPack: skipping link, could not readlink ${abspath}: ${readlinkError.message}`); continue; }
                const header = { name: headerName, type: 'symlink', linkname: site, uid: process.getuid(), gid: process.getgid() };
                await addEntryToPack(pack, header, { /* options */ });
                ++stats.linkCount;
            } else {
                log(`tarPack: ignoring unknown type ${entry.name}`);
            }
        }

        queue.unshift(...subdirs); // add to front of queue and in order of readdir listing
    }

    return stats;
}

async function tarPack(dataLayout, encryption, uploader, progressCallback) {
    assert(dataLayout instanceof DataLayout, 'dataLayout must be a DataLayout');
    assert.strictEqual(typeof encryption, 'object');
    assert.strictEqual(typeof uploader, 'object');
    assert.strictEqual(typeof progressCallback, 'function');

    const gzip = zlib.createGzip({});
    const ps = new ProgressStream({ interval: 10000 }); // emit 'progress' every 10 seconds
    ps.on('progress', function (progress) {
        const transferred = Math.round(progress.transferred/1024/1024), speed = Math.round(progress.speed/1024/1024);
        if (!transferred && !speed) return progressCallback({ message: 'Uploading backup' }); // 0M@0MBps looks wrong
        progressCallback({ message: `Uploading backup ${transferred}M@${speed}MBps` });
    });
    ps.on('heartbeat', function ({ elapsed, transferred }) {
        progressCallback({ message: `Still uploading backup (${elapsed}s, ${Math.round(transferred/1024/1024)}M)` });
    });

    // careful not to have async code between here and pipeline() for 'error' handling
    const pack = tar.pack();
    const hash = new HashStream();
    const destStream = uploader.createStream();

    let pipeline;
    if (encryption) {
        const encryptStream = new EncryptStream(encryption);
        pipeline = safe(stream.pipeline(pack, gzip, encryptStream, ps, hash, destStream));
    } else {
        pipeline = safe(stream.pipeline(pack, gzip, ps, hash, destStream));
    }

    let fileCount = 0;
    for (const localPath of dataLayout.localPaths()) {
        const [error, stats] = await safe(addPathToPack(pack, localPath, dataLayout), { debug: log });
        if (error) break; // the pipeline will error and we will retry the whole packing all over
        fileCount += stats.fileCount;
    }
    log(`tarPack: packed ${fileCount} files`);

    pack.finalize(); // harmless to call if already in error state

    const [error] = await pipeline; // already wrapped in safe()
    if (error) throw new BoxError(BoxError.EXTERNAL_ERROR, `tarPack pipeline error: ${error.message}`);

    const stats = ps.stats(); // { startTime, totalMsecs, transferred }
    log(`tarPack: pipeline finished: ${JSON.stringify(stats)}`);

    await uploader.finish();
    return {
        stats: { fileCount, size: stats.transferred, transferred: stats.transferred },
        integrity: { size: stats.transferred, fileCount, sha256: hash.digest('hex') }
    };
}

async function tarExtract(inStream, dataLayout, encryption, progressCallback) {
    assert.strictEqual(typeof inStream, 'object');
    assert(dataLayout instanceof DataLayout, 'dataLayout must be a DataLayout');
    assert.strictEqual(typeof encryption, 'object');
    assert.strictEqual(typeof progressCallback, 'function');

    const extract = tar.extract();
    const now = new Date();
    let entryCount = 0;
    extract.on('entry', async function (header, entryStream, next) {
        if (path.isAbsolute(header.name)) {
            log(`tarExtract: ignoring absolute path ${header.name}`);
            return next();
        }
        ++entryCount;
        const abspath = dataLayout.toLocalPath(header.name);
        let error = null;
        if (header.type === 'directory') {
            [error] = await safe(fs.promises.mkdir(abspath, { recursive: true, mode: 0o755 }));
        } else if (header.type === 'file') {
            const output = fs.createWriteStream(abspath);
            [error] = await safe(stream.pipeline(entryStream, output));
            if (!error) [error] = await safe(fs.promises.chmod(abspath, header.mode));
        } else if (header.type === 'symlink') {
            await safe(fs.promises.unlink(abspath)); // remove any link created from previous failed extract
            [error] = await safe(fs.promises.symlink(header.linkname, abspath));
        } else {
            log(`tarExtract: ignoring unknown entry: ${header.name} ${header.type}`);
            entryStream.resume(); // drain
        }

        if (error) return next(error);

        [error] = await safe(fs.promises.lutimes(abspath, now /* atime */, header.mtime)); // for dirs, mtime will get overwritten

        next(error);
    });
    extract.on('finish', () => log(`tarExtract: extracted ${entryCount} entries`));

    const gunzip = zlib.createGunzip({});
    const ps = new ProgressStream({ interval: 10000 });
    ps.on('progress', function (progress) {
        const transferred = Math.round(progress.transferred/1024/1024), speed = Math.round(progress.speed/1024/1024);
        if (!transferred && !speed) return progressCallback({ message: 'Downloading backup' }); // 0M@0MBps looks wrong
        progressCallback({ message: `Downloading ${transferred}M@${speed}MBps` });
    });
    ps.on('heartbeat', function ({ elapsed, transferred }) {
        progressCallback({ message: `Still downloading backup (${elapsed}s, ${Math.round(transferred/1024/1024)}M)` });
    });

    if (encryption) {
        const decrypt = new DecryptStream(encryption);
        const [error] = await safe(stream.pipeline(inStream, ps, decrypt, gunzip, extract));
        if (error) throw new BoxError(BoxError.EXTERNAL_ERROR, `tarExtract pipeline error: ${error.message}`);
    } else {
        const [error] = await safe(stream.pipeline(inStream, ps, gunzip, extract));
        if (error) throw new BoxError(BoxError.EXTERNAL_ERROR, `tarExtract pipeline error: ${error.message}`);
    }

    log(`tarExtract: pipeline finished: ${JSON.stringify(ps.stats())}`);
}

async function download(backupSite, remotePath, dataLayout, progressCallback) {
    assert.strictEqual(typeof backupSite, 'object');
    assert.strictEqual(typeof remotePath, 'string');
    assert(dataLayout instanceof DataLayout, 'dataLayout must be a DataLayout');
    assert.strictEqual(typeof progressCallback, 'function');

    log(`download: Downloading ${remotePath} to ${dataLayout.toString()}`);

    await promiseRetry({ times: 3, interval: 20000, debug: log }, async () => {
        progressCallback({ message: `Downloading backup ${remotePath}` });

        const sourceStream = await backupSites.storageApi(backupSite).download(backupSite.config, remotePath);
        await tarExtract(sourceStream, dataLayout, backupSite.encryption, progressCallback);
    });
}

async function upload(backupSite, remotePath, dataLayout, progressCallback) {
    assert.strictEqual(typeof backupSite, 'object');
    assert.strictEqual(typeof remotePath, 'string');
    assert.strictEqual(typeof dataLayout, 'object');
    assert.strictEqual(typeof progressCallback, 'function');

    log(`upload: uploading to site ${backupSite.id} path ${remotePath} (encrypted: ${!!backupSite.encryption}) dataLayout ${dataLayout.toString()}`);

    return await promiseRetry({ times: 5, interval: 20000, debug: log }, async () => {
        progressCallback({ message: `Uploading backup ${remotePath}` });

        const uploader = await backupSites.storageApi(backupSite).upload(backupSite.config, backupSite.limits, remotePath);
        const { stats, integrity } = await tarPack(dataLayout, backupSite.encryption, uploader, progressCallback);

        // use '.' instead of remote path since the backup can be moved to another path
        const integrityMap = new Map([ ['.', integrity] ]);
        return { stats, integrityMap };
    });
}

async function copy(backupSite, fromPath, toPath, progressCallback) {
    assert.strictEqual(typeof backupSite, 'object');
    assert.strictEqual(typeof fromPath, 'string');
    assert.strictEqual(typeof toPath, 'string');
    assert.strictEqual(typeof progressCallback, 'function');

    await backupSites.storageApi(backupSite).copy(backupSite.config, fromPath, toPath, progressCallback);
}

async function verify(backupSite, remotePath, integrityMap, progressCallback) {
    assert.strictEqual(typeof backupSite, 'object');
    assert.strictEqual(typeof remotePath, 'string');
    assert(util.types.isMap(integrityMap), 'integrityMap should be a Map');
    assert.strictEqual(typeof progressCallback, 'function');

    log(`verify: Verifying ${remotePath}`);

    const inStream = await backupSites.storageApi(backupSite).download(backupSite.config, remotePath);

    let fileCount = 0;

    const extract = tar.extract();
    extract.on('entry', async function (header, entryStream, next) {
        if (path.isAbsolute(header.name)) {
            log(`verify: ignoring absolute path ${header.name}`);
            return next();
        }
        log(`verify: ${header.name} ${header.size} ${header.type}`);
        if (header.type === 'file') {
            ++fileCount;
        }
        entryStream.resume(); // drain
        next();
    });
    extract.on('finish', () => log('verify: extract finished'));

    const hash = new HashStream();
    const gunzip = zlib.createGunzip({});
    const ps = new ProgressStream({ interval: 10000 });
    ps.on('progress', function (progress) {
        const transferred = Math.round(progress.transferred/1024/1024), speed = Math.round(progress.speed/1024/1024);
        if (!transferred && !speed) return progressCallback({ message: 'Downloading backup' }); // 0M@0MBps looks wrong
        progressCallback({ message: `Downloading ${transferred}M@${speed}MBps` });
    });

    if (backupSite.encryption) {
        const decrypt = new DecryptStream(backupSite.encryption);
        const [error] = await safe(stream.pipeline(inStream, ps, hash, decrypt, gunzip, extract));
        if (error) throw new BoxError(BoxError.EXTERNAL_ERROR, `tarExtract pipeline error: ${error.message}`);
    } else {
        const [error] = await safe(stream.pipeline(inStream, ps, hash, gunzip, extract));
        if (error) throw new BoxError(BoxError.EXTERNAL_ERROR, `tarExtract pipeline error: ${error.message}`);
    }

    const integrity = integrityMap.get('.');
    log(`verify: Expecting: ${JSON.stringify(integrity)} Actual: size:${ps.stats().transferred} filecount:${fileCount} digest:${hash.digest()}`);

    const messages = [];
    if (integrity.size !== ps.stats().transferred) messages.push(`Size mismatch. Expected: ${integrity.size} Actual: ${ps.stats().transferred}`);
    if (integrity.fileCount !== fileCount) messages.push(`File count mismatch. Expected: ${integrity.fileCount} Actual: ${fileCount}`);
    if (integrity.sha256 !== hash.digest()) messages.push(`Checksum mismatch. Expected: ${integrity.sha256} Actual: ${hash.digest()}`);

    return messages;
}

function getFileExtension(encryption) {
    assert.strictEqual(typeof encryption, 'boolean');

    return encryption ? '.tar.gz.enc' : '.tar.gz';
}

const _EnsureFileSizeStream = EnsureFileSizeStream;

export default {
    download,
    upload,
    verify,
    getFileExtension,
    copy,
    _EnsureFileSizeStream,
};