SPromiseMeSpeed
SPromiseMeSpeed is currently the fastest ES6 Promise JavaScript library and polyfill, promising a speedy implementation that is ~1-59x faster than Chrome's native promises and ~2-548x faster than Bluebird's promises (five hundred and forty-eight is not a typo). The purpose of SPromiseMeSpeed is to provide a speedy alternative to ES6 promises until browsers implement faster native promises. If all you need is a way to defer the stack level so you don't get the dreaded "Maximum call stack size exceeded" error, then consider using my other library, DeferStackJS, for slightly better performance.
Quick Start
To use, simply drop the following snippet of HTML code into your <head>
before all of the scripts that use SPromiseMeSpeed.
<script src="https://dl.dropboxusercontent.com/s/llt6sv7y2yurn2v/SPromiseMeSpeedDEBUG.min.js?dl=0"></script>
Alternatively, if you want faster page loading, add a defer attribute to every script tag to let the browser know that you don't call the evil document.write
inside your script (see the sketch below the Before/After).
Before:
...
After:
...
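For instance, a minimal sketch of the deferred setup (the second script, my-script.js, is a hypothetical example of your own code):

<!-- Sketch: the defer attribute added to both the library tag and the scripts that use it -->
<script defer src="https://dl.dropboxusercontent.com/s/llt6sv7y2yurn2v/SPromiseMeSpeedDEBUG.min.js?dl=0"></script>
<script defer src="my-script.js"></script>

Deferred scripts still execute in document order, so the library runs before the scripts that depend on it.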
RequireJS and NodeJS Setup
For dropping into either RequireJS or NodeJS, please use the spromisemespeed npm package, this minified file, or the corresponding source code file. To install via npm, use the following command.
npm install spromisemespeed
After installing via npm, one can use require("spromisemespeed"). Alternatively, one can drop the SPromiseMeSpeed.min.js file into the same directory as their NodeJS script and do require("./SPromiseMeSpeed.min.js"). Both methods are functionally equivalent. Otherwise, if one does not know how to use the command line, save the script corresponding to one's operating system to the directory where the NodeJS script will run and use the file manager to run the script (on Windows, it's a double-click).
- fastestsmallesttextencoderdecoder
- Microsoft Windows batch: install-FastestSmallestTextEncoderDecoder-windows.bat
- Bash for Apple macOS and Linux (e.g. Ubuntu): install-FastestSmallestTextEncoderDecoder-unix.sh
- fastestsmallesttextencoderdecoder-encodeinto
- Microsoft Windows batch: install-FastestSmallestTextEncoderDecoder-encodeInto.bat
- Bash for Apple macOS and Linux (e.g. Ubuntu): install-FastestSmallestTextEncoderDecoder-encodeInto.sh
API
SPromiseMeSpeed gives you one and only one new global on the window object: window.SPromise. window.SPromise is the exact same as the native window.Promise as documented by MDN EXCEPT:
- It is called without the new operator.
- All callbacks are invoked immediately after the promise is resolved, without waiting until the next tick.
- It has a window.SPromise.isPromise method you can use to determine whether something is a promise.
- It is a lot faster than native Chromium promises, even with (surprise, surprise!) await.
- It skims and cuts corners to boost performance by reusing the same reference passed to callbacks as the source.
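As a quick illustration of these differences, here is a minimal sketch (assuming only the behavior listed above; it is not taken from the linked examples):

// Called without the new operator, unlike the native Promise constructor
var p = SPromise(function(resolve, reject) {
    resolve(42);
});

// Callbacks run as soon as the promise is resolved, without waiting for the next tick
p.then(function(value) {
    console.log(value); // 42
});

// isPromise tells promises apart from everything else
console.log(SPromise.isPromise(p));  // true
console.log(SPromise.isPromise(42)); // false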
Example code snippets:
SPromise Test Example
SPromiseMeSpeed.min.js VS SPromiseMeSpeedDEBUG.min.js
The main difference between the two versions is that SPromiseMeSpeedDEBUG is intended for development. It adds many extra type checks and notifies you when the wrong type of object is passed to the SPromise API. For some errors, it even suggests how to fix them. However, these checks come at a cost: performance. If you have already written your code well enough not to need these checks, then use SPromiseMeSpeed.min.js for even better performance. SPromiseMeSpeed.min.js will run blind until it finishes or it hits a wall and crashes. For example, if you pass something that is not a function to window.SPromise from SPromiseMeSpeedDEBUG.min.js, it will print a pretty error message saying you passed the wrong type to the function. But with SPromiseMeSpeed.min.js, the console will say something along the lines of 'cannot call null or undefined' or 'null or undefined is not a function' (see the sketch after the snippet below). To use SPromiseMeSpeed without the DEBUG, insert the following alternative code into your <head>:
<script src="https://dl.dropboxusercontent.com/s/i8om2fcz5izdeoj/SPromiseMeSpeed.min.js?dl=0"></script>
RequireJS and ES Module API Documentation
For NodeJS, calling require("SPromiseMeSpeed.min.js") yields the function SPromise.
module.exports = function(handle) {/*...*/};
As an example snippet, the module can be required as shown below.
const SPromise = require("spromisemespeed");
Or, one can use the new and shiny ES6 module import statements.
// Variation 1
import { SPromise } from "spromisemespeed";

// Variation 2
import * as SPromiseModule from "spromisemespeed";
const SPromise = SPromiseModule.SPromise;
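Under either variation, usage is the same; here is a minimal sketch (assuming the named SPromise export described above):

import { SPromise } from "spromisemespeed";

SPromise(function(resolve) {
    resolve("hello");
}).then(function(msg) {
    console.log(msg); // "hello"
});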
npm Project
The DEBUG build of this project can be found on npm at this link. The faster unchecked build can be found on npm at this other link.
Benchmarks
If you are a sensible guy like me, then you shouldn't take my word for the speed of SPromiseMeSpeed. Rather, take the word of these benchmarks:
Casual Promising
Benchmark code (executed in the console at https://cdnjs.cloudflare.com/ajax/libs/bluebird/2.11.0/bluebird.min.js):
(async function(requestIdleCallback, console, performance){ "use strict"; var resStr = ""; var nativePromise = window.Promise, idleOptions = {timeout: 11}; function test(str, f){ return new nativePromise(function(acc){ var tests=[], tmp=0, SPromise = window.SPromise; var cycleCount=5, intV=requestIdleCallback(function theeFunc(){ "use strict"; if (--cycleCount < 0) { var res = tests.reduce((a, b) => a + b) / tests.length; resStr += "\n" + str + res + "ms"; acc(); return; } var k = performance.now(), i = 0, Len = 6; (function test(v){ f(function(k){ k(v+(((v<<2)%11)|0)|0); }).then(function(v){ i = i + 1|0; if (i < Len) { test(((v|0)%7)|0); } else { tmp = tmp + (v|0)|0; tests.push((performance.now() - k)/Len); } }); })(Math.random()*40|0); requestIdleCallback(theeFunc, idleOptions); }, idleOptions); }); } await test("NativeR1: ", function(x){return new nativePromise(x)}); await test("NativeR2: ", function(x){return new nativePromise(x)}); await test("NativeR3: ", function(x){return new nativePromise(x)}); await test("NativeR4: ", function(x){return new nativePromise(x)}); await (new nativePromise(_=>requestIdleCallback(_))); // to allow the CPU to take a break (new Function(document.body.innerText))(); var bbPromise = window.Promise; await test("BluebirdR1: ", function(x){return new bbPromise(x)}); await test("BluebirdR2: ", function(x){return new bbPromise(x)}); await test("BluebirdR3: ", function(x){return new bbPromise(x)}); await test("BluebirdR4: ", function(x){return new bbPromise(x)}); await (new nativePromise(_=>requestIdleCallback(_))); // to allow the CPU to take a break window.Promise = nativePromise; var SPromise = window.SPromise; await test("SPromiseR1: ", function(x){return SPromise(x)}); await test("SPromiseR2: ", function(x){return SPromise(x)}); await test("SPromiseR3: ", function(x){return SPromise(x)}); await test("SPromiseR4: ", function(x){return SPromise(x)}); console.log(resStr);})(requestIdleCallback, console, performance);
Console output (lower numbers = lower promise delay = better):
NativeR1: 0.04416666730927924ms
NativeR2: 0.018666667165234685ms
NativeR3: 0.026166668006529413ms
NativeR4: 0.024499996410061918ms
BluebirdR1: 0.27599999836335576ms
BluebirdR2: 0.22883333343391615ms
BluebirdR3: 0.22850000144292912ms
BluebirdR4: 0.2271666657179594ms
SPromiseR1: 0.006999998974303404ms
SPromiseR2: 0.006166667056580384ms
SPromiseR3: 0.005999997180576125ms
SPromiseR4: 0.006833329098299146ms
Synchronous Hellhole of Death Promising
Benchmark code (executed in the console at https://cdnjs.cloudflare.com/ajax/libs/bluebird/2.11.0/bluebird.min.js):
(async function(requestIdleCallback, console, performance){ "use strict"; var resStr = ""; var nativePromise = window.Promise, idleOptions = {timeout: 11}; function test(str, f){ return new nativePromise(function(acc){ var tests=[], tmp=0, SPromise = window.SPromise; var cycleCount=5, intV=requestIdleCallback(function theeFunc(){ "use strict"; if (--cycleCount < 0) { var res = tests.reduce((a, b) => a + b) / tests.length; resStr += "\n" + str + res + "ms"; acc(); return; } var i = 0, k = performance.now(); (function test(v){ f(function(k){ k(v+(((v<<5)%11)|0)|0); }).then(function(v){ i = i + 1|0; if (i < 131072) { test(v%7|0); } else { tests.push((performance.now() - k)/131072); tmp = tmp + v|0; clearInterval(intervalID); requestIdleCallback(theeFunc); } }); })(Math.random()*40|0); var last = 0; var intervalID = setInterval(function(){ if (last !== i) {last = i; return} i = 131072; clearInterval(intervalID); tests.push((performance.now() - k)/i); requestIdleCallback(theeFunc); }, 10); }, idleOptions); }); } await test("NativeR1: ", function(x){return new nativePromise(x)}); await test("NativeR2: ", function(x){return new nativePromise(x)}); await test("NativeR3: ", function(x){return new nativePromise(x)}); await test("NativeR4: ", function(x){return new nativePromise(x)}); await (new nativePromise(_=>requestIdleCallback(_))); // to allow the CPU to take a break (new Function(document.body.innerText))(); var bbPromise = window.Promise; await test("BluebirdR1: ", function(x){return new bbPromise(x)}); await test("BluebirdR2: ", function(x){return new bbPromise(x)}); await test("BluebirdR3: ", function(x){return new bbPromise(x)}); await test("BluebirdR4: ", function(x){return new bbPromise(x)}); await (new nativePromise(_=>requestIdleCallback(_))); // to allow the CPU to take a break window.Promise = nativePromise; var SPromise = window.SPromise; await test("SPromiseR1: ", function(x){return SPromise(x)}); await test("SPromiseR2: ", function(x){return SPromise(x)}); await test("SPromiseR3: ", function(x){return SPromise(x)}); await test("SPromiseR4: ", function(x){return SPromise(x)}); console.log(resStr);})(requestIdleCallback, console, performance);
Console output (lower numbers = lower promise delay = better):
NativeR1: 0.005961311340207942ms
NativeR2: 0.005754425048820622ms
NativeR3: 0.005737731933574963ms
NativeR4: 0.005725685119539747ms
BluebirdR1: 0.0004817428588488326ms
BluebirdR2: 0.0003913345338446561ms
BluebirdR3: 0.0003724441526742339ms
BluebirdR4: 0.00038145446783488524ms
SPromiseR1: 0.0002370834351062001ms
SPromiseR2: 0.00017644500722724388ms
SPromiseR3: 0.00017766571041022416ms
SPromiseR4: 0.00017798614484476616ms
Await Promising
Benchmark Code (executed in the console at https://cdnjs.cloudflare.com/ajax/libs/bluebird/2.11.0/bluebird.min.js):
(async function(performance, console){ "use strict"; var resStr=""; var nativePromise = window.Promise, idleOptions = {timeout: 10}; function acceptor(acc){acc();} function test(str, f){ return new nativePromise(function(acc){ var tests=[], SPromise = window.SPromise; var cycleCount=6, intV=requestIdleCallback(function theeFunc(){ "use strict"; if (--cycleCount < 0) { var res = tests.reduce((a, b) => a + b) / tests.length; resStr += "\n" + str + res + "ms"; acc(); return; } var k = performance.now(), i = 0; (async function test(){ await f(acceptor); if (i < 384) { i = i+1|0; test(); } else { tests.push((performance.now() - k)/384); requestIdleCallback(theeFunc, idleOptions); } })(Math.random()*40|0); }, idleOptions); }); } await test("NativeR1: ", function(x){return new nativePromise(x)}); await test("NativeR2: ", function(x){return new nativePromise(x)}); await test("NativeR3: ", function(x){return new nativePromise(x)}); await (new nativePromise(_=>requestIdleCallback(_))); // to allow the CPU to take a break (new Function(document.body.innerText))(); var bbPromise = window.Promise; await test("BluebirdR1: ", function(x){return new bbPromise(x)}); await test("BluebirdR2: ", function(x){return new bbPromise(x)}); await test("BluebirdR3: ", function(x){return new bbPromise(x)}); await (new nativePromise(_=>requestIdleCallback(_))); // to allow the CPU to take a break window.Promise = nativePromise; var SPromise = window.SPromise; await test("SPromiseR1: ", function(x){return SPromise(x)}); await test("SPromiseR2: ", function(x){return SPromise(x)}); await test("SPromiseR3: ", function(x){return SPromise(x)}); console.log(resStr);})(performance, console);
Console output (lower numbers = lower promise delay = better):
NativeR1: 0.008009982618912342ms
NativeR2: 0.010225694394547543ms
NativeR3: 0.01835503472749325ms
BluebirdR1: 4.231184895818134ms
BluebirdR2: 4.206684027773615ms
BluebirdR3: 4.192039930558167ms
SPromiseR1: 0.016039496535490295ms
SPromiseR2: 0.010410156290971728ms
SPromiseR3: 0.010818142375986403ms
[Caution: please don't read the following paragraph if you are easily disturbed by vivid images of emesis. Also note that this paragraph has become a bit less applicable since Chrome's await received internal optimizations.] The significance of the above tests is that trying to force a native method like await into using a user-created function like SPromise is comparable to trying to swallow someone else's barf. If you are going to swallow barf (as in await), you would likely want to swallow your own native barf instead of trying to swallow the barf of someone else (like Bluebird or SPromise). Yet in spite of this, SPromise makes the barf tasty (fast and performant) enough for Chrome to swallow it with greater efficiency.
Development
On Linux, the project can be developed by cloning it with the following command line.
git clone https://github.com/anonyco/SPromiseMeSpeedJS.git; cd SPromiseMeSpeedJS; npm run install-dev
Note the npm run install-dev step, which downloads closure-compiler.jar into the repository for minifying the files.
Now that the repository is cloned, edit the files as one sees fit. Once the files have been edited, run the following in the terminal from the root folder of the repository in order to minify the NodeJS JavaScript files.
npm run build
Continuity
I try my best to be a realist, and what's more realistic than death? I am going to die someday, and it may be tomorrow in a car crash. You never know. As I have no coder friends to look out for my projects, I'm looking for anyone who wants to be a collaborator on this project in the event of the unforeseen. Reach out to me at wowzeryest@gmail.com. If issues/pulls start piling up over the course of months, assume the worst. As I am trying my best to do my part to help the community, I encourage every developer to share their projects with other people to ensure continuity.