# huge-uploader-nodejs

`huge-uploader-nodejs` is a Node.js promise-based module made to receive chunked & resumable file uploads. It's made to work with its frontend counterpart `huge-uploader`.
From `huge-uploader`:

> HTTP and especially HTTP servers have limits and were not designed to transfer large files. In addition, network connections can be unreliable. No one wants an upload to fail after hours… Sometimes we even need to pause the upload, and HTTP doesn't allow that.
>
> The best way to circumvent these issues is to chunk the file and send it in small pieces. If a chunk fails, no worries, it's small and fast to re-send it. Wanna pause? Ok, just start where you left off when ready.
The frontend module chunks and sends, this backend module receives and assembles all the pieces back together at the end of the process.
## Installation & usage

```shell
npm install huge-uploader-nodejs --save
```
As an example, I'll give something in pure Node.js, without any framework. But it is obviously compatible with any framework out there.
```javascript
const http = require('http');
const uploader = require('huge-uploader-nodejs');

// you must specify a temp upload dir, a max filesize and a max size for the chunks
const tmpDir = './tmp';
const maxFileSize = 10;
const maxChunkSize = 10;

http.createServer((req, res) => {
    if (req.url === '/upload' && req.method === 'POST') {
        uploader(req, tmpDir, maxFileSize, maxChunkSize)
        .then((assembleChunks) => {
            res.writeHead(204, 'No Content');
            res.end();

            // assembleChunks is only returned when the last chunk has been received
            if (assembleChunks) {
                // the module cleans up the temp chunks once the file is assembled
                assembleChunks()
                .then(data => console.log(data))
                .catch(err => console.log(err));
            }
        })
        .catch((err) => {
            res.writeHead(500, 'Internal Server Error');
            res.end();
            console.log(err);
        });
    }
})
.listen(8080);
```
Also, if the uploader is not on the same domain, don't forget to set a CORS policy, either directly on Node or on the reverse proxy. Here's an example for Node:
```javascript
res.setHeader('Access-Control-Allow-Origin', '*');

if (req.url === '/upload' && req.method === 'OPTIONS') {
    res.setHeader('Access-Control-Allow-Methods', 'POST, OPTIONS');
    res.setHeader('Access-Control-Allow-Headers', 'uploader-chunk-number,uploader-chunks-total,uploader-file-id');
    res.setHeader('Access-Control-Max-Age', 86400); // 24hrs
    res.writeHead(204, 'No Content');
    res.end();
    return;
}
```
## Options

There aren't many options (all are required). As shown in the example, you pass the function:

- the request object,
- a directory to write to `{ String }`,
- the maximum total file size for the upload `{ Number }`,
- the maximum chunk size `{ Number }`.
Be warned that total file size is computed by multiplying `maxChunkSize` by the `uploader-chunks-total` header. So if the client splits files into chunks smaller than `maxChunkSize`, leading to a situation where `uploader-chunks-total > maxFileSize / maxChunkSize`, the upload will be refused although it might be smaller than `maxFileSize`.
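As an illustration, the refusal condition above boils down to the following check (a sketch with made-up numbers, not the module's actual internals):

```javascript
// Illustrative sketch of the size check described above: the upload is
// accepted or refused based on the declared chunk count, regardless of
// the chunks' real sizes.
const maxFileSize = 10; // MB, passed to the uploader
const maxChunkSize = 2; // MB, passed to the uploader

function isUploadAllowed(chunksTotalHeader) {
    // assumed total size = declared chunk count * maxChunkSize
    return chunksTotalHeader * maxChunkSize <= maxFileSize;
}

console.log(isUploadAllowed(5)); // true: 5 * 2 = 10 <= 10
console.log(isUploadAllowed(6)); // false: 6 * 2 = 12 > 10, refused
```

So a client sending 6 chunks of 1 MB each would be refused here, even though the file itself fits within `maxFileSize`.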
## Garbage collection
As said in the example, the module takes care of cleaning up successful uploads. But if an upload is paused and never resumed, its files will stay forever. So you should create a script, called via a crontab, that erases directories older than the time you're willing to keep them.
Example bash script:

```shell
#!/bin/bash
# delete upload directories that haven't been modified for more than a day
find /var/www/tmp -mindepth 1 -maxdepth 1 -type d -mtime +1 -exec rm -rf {} +
```
## How to set up with the frontend
This module is made to work with the `huge-uploader` frontend module. In case you would like to develop your own frontend, this module needs three specific headers to work:

- `uploader-file-id`: a unique file id that's used to create the temp upload directory for this upload,
- `uploader-chunks-total`: the total number of chunks that will be sent,
- `uploader-chunk-number`: the current chunk number (0-based index, so the last chunk is `uploader-chunks-total - 1`).
Any other header will be ignored. You can also send `POST` parameters; only parameters sent with the last chunk will be processed.
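To make the header contract concrete, here is a sketch of how a hand-rolled frontend could send one chunk (illustrative only — the endpoint, field names and parameters are assumptions, and the `huge-uploader` module already handles all of this, plus retries and pause/resume):

```javascript
// Byte range of a given 0-based chunk within the file
function chunkRange(chunkNumber, chunkSizeBytes) {
    const start = chunkNumber * chunkSizeBytes;
    return { start, end: start + chunkSizeBytes };
}

function isLastChunk(chunkNumber, chunksTotal) {
    return chunkNumber === chunksTotal - 1;
}

// Sketch of sending a single chunk with the three required headers
async function sendChunk(file, fileId, chunkNumber, chunksTotal, chunkSizeBytes) {
    const { start, end } = chunkRange(chunkNumber, chunkSizeBytes);
    const body = new FormData();
    body.append('file', file.slice(start, end));

    // POST parameters are only processed when sent with the last chunk
    if (isLastChunk(chunkNumber, chunksTotal)) {
        body.append('name', 'Mr Smith'); // hypothetical example parameter
    }

    return fetch('/upload', {
        method: 'POST',
        headers: {
            'uploader-file-id': fileId,
            'uploader-chunks-total': String(chunksTotal),
            'uploader-chunk-number': String(chunkNumber), // 0-based index
        },
        body,
    });
}
```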
## Contributing

There's surely room for improvement, so feel free to hack around and submit PRs! Please just follow the style of the existing code, which is Airbnb's style with minor modifications.