Travel
Create and manage your Javascript data at hyperspeed:
import db from 'travel';

db.users.push({ name: 'Francisco' });
db.users.push({ name: 'Sarah' });

console.log(await db.users);
// [{ name: 'Francisco' }, { name: 'Sarah' }]
With the same Javascript syntax, it performs CRUD operations wherever you want:
- Memory (default): start with just variables.
- REST API: make HTTP calls whenever the data is modified.
- IndexedDB: the browser's database, so you don't even need your own back-end.
- SQL Database: which can be connected anywhere.
- MongoDB Database: which can be connected anywhere.
- Help us by adding more data storage places, e.g. the filesystem, CSVs, etc.
Features:
- Consistent Javascript API. Work with any kind of store using the same API so you don't have to worry about your data sourcing. Start prototyping in memory and then switch to a REST API or IndexedDB with a single line.
- Extends the Javascript API so you get some extra capabilities, like filtering with an object .filter({ id: 2 }), but still has all the native methods available, like .filter(({ id }) => id === 2).
- Universal JS so you can use the same code on the front-end and on the back-end/CLI.
- Tiny footprint with no dependencies. Not only is the whole thing SIZE_HERE, but it also supports treeshaking to make it even smaller.
- Underlying query builder that optimizes the engines for many cases like .filter(), .find(), .slice(), etc. (see the sketch after this list).
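As a rough illustration of what such a query builder could do internally, the sketch below records chained calls and compiles them into a single SQL-like query. The buildQuery name and the compiled shape are not part of the library, only an example of the idea:

// Illustrative only: how a driver could compile chained calls like
// .filter({ id: 2 }).slice(0, 10) into one SQL-like query, instead of
// fetching everything and filtering in memory.
const buildQuery = (calls) =>
  calls.reduce(
    (query, call) => {
      if (call.type === 'filter') {
        const where = Object.keys(call.match)
          .map((key) => `${key} = ?`)
          .join(' AND ');
        return { ...query, where, params: Object.values(call.match) };
      }
      if (call.type === 'slice') {
        return { ...query, limit: call.end - call.start, offset: call.start };
      }
      return query;
    },
    { where: '1 = 1', params: [], limit: null, offset: 0 }
  );

console.log(buildQuery([
  { type: 'filter', match: { id: 2 } },
  { type: 'slice', start: 0, end: 10 },
]));
// { where: 'id = ?', params: [2], limit: 10, offset: 0 }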
Antifeatures:
- No big-data scale. This is to build your prototype, convert it into an MVP and launch it to production to grow your userbase. However, it does not have advanced features like JOINs, GraphQL queries, etc., so it's difficult to optimize for performance. See the migration guide if you want to move away from travel. I strongly believe you should aim to launch quickly and iterate hard, and only start optimizing when things look like they are growing.
- There are some small differences between the drivers that you have to be aware of to optimize for them. This is not needed in the beginning, but it might well let you grow another 10x if you squeeze it. See each driver's docs for the specifics.
For instance, to change the engine of the example in the beginning to a REST API:
import { rest } from 'travel';  // the driver import/constructor is shown schematically

const db = rest({ url: 'https://api.example.com/' });

// Make two POST requests to /users
db.users.push({ name: 'Francisco' });
db.users.push({ name: 'Sarah' });

// Make a GET request to /users
console.log(await db.users);
// [{ name: 'Francisco' }, { name: 'Sarah' }]
Getting started
First install it with npm:
npm install travel
Then you can import it and use it:
import db from 'travel';

// You need this `async` context to be able to run any `await` call
(async () => {
  // CREATE
  db.users.push({ name: 'Francisco' });
  db.users.push({ name: 'Sarah' });

  // READ
  console.log(await db.users);
  // [{ name: 'Francisco' }, { name: 'Sarah' }]

  // UPDATE; .find() => one, .filter() => many
  db.users.find({ name: 'Francisco' }).name = 'Paco';
  console.log(await db.users);
  // [{ name: 'Paco' }, { name: 'Sarah' }]

  // DELETE
  db.users.splice(0, 1);  // remove the first record; native array methods work here as well
  console.log(await db.users);
  // [{ name: 'Sarah' }]
})();
Documentation
The library behaves like a Javascript object, where each key should be treated as an array:
import db from 'travel';

(async () => {
  // The different "tables" (any key works as a table; the names are just examples)
  console.log(await db.users);     // []
  console.log(await db.books);     // []
  console.log(await db.comments);  // []
})();
For the database technologies (SQL, MongoDB) the tables should already be created wherever the database is hosted, but there is no need to initialize them as empty arrays in Javascript.
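For example, here is a minimal sketch of creating that users table ahead of time, assuming PostgreSQL and the pg client; the column names simply mirror the examples below:

// Create the "users" table outside of travel, e.g. with plain SQL.
// This sketch assumes PostgreSQL and the `pg` client; adapt it to your database.
import pg from 'pg';

const client = new pg.Client({ connectionString: process.env.DATABASE_URL });
await client.connect();
await client.query(`
  CREATE TABLE IF NOT EXISTS users (
    id    INTEGER PRIMARY KEY,
    name  TEXT,
    email TEXT,
    time  TIMESTAMPTZ
  )
`);
await client.end();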
Create
To add a single new record, you can use .push() as usual for arrays:
import db from 'travel';

(async () => {
  console.log(await db.users);  // []

  // Add the first record
  db.users.push({ name: 'Francisco' });
  console.log(await db.users);  // [{ name: 'Francisco' }]

  db.users.push({ name: 'Sarah' });
  console.log(await db.users);  // [{ name: 'Francisco' }, { name: 'Sarah' }]
})();
You can also add multiple records at once with .push():
import db from 'travel';

(async () => {
  console.log(await db.users);  // []

  db.users.push({ name: 'Francisco' }, { name: 'Sarah' });
  console.log(await db.users);  // [{ name: 'Francisco' }, { name: 'Sarah' }]

  const newUsers = [{ name: 'John' }, { name: 'Maria' }];
  db.users.push(...newUsers);  // DO NOT DO .push(newUsers), you need the "..."
  console.log(await db.users);
  // [{ name: 'Francisco' }, { name: 'Sarah' }, { name: 'John' }, { name: 'Maria' }]
})();
You can also initialize the data, which will delete the previous records and add the new ones by overwriting the whole table, as expected from plain Javascript assignment:
import db from 'travel';

// This will remove the previous records and set the new user records
db.users = [
  { id: 0, name: 'Francisco', time: '1990-01-01' },
  { id: 1, name: 'Sarah', time: '1990-01-02' },
  { id: 2, name: 'John', time: '1990-01-03' },
  { id: 3, name: 'Maria', time: '1990-01-04' }
];
Read
Read operations are the only ones that need the await keyword, and there are many of them. Let's start with a simple one, getting everything as a plain array of objects:
console.log(await db.users);
// [{ id: 0, name: 'Francisco', time: '1990-01-01T00:00:00Z' }, ...]
To avoid repeating ourselves too much, this is the initialization code that we'll use in all of the examples:
import db from 'travel';

db.users = [
  { id: 0, name: 'Francisco', email: 'francisco@gmail.com', time: '1990-01-01' },
  { id: 1, name: 'Sarah', email: 'sarah@gmail.com', time: '1990-01-02' },
  { id: 2, name: 'John', email: 'john@hotmail.com', time: '1990-01-01' },
  { id: 3, name: 'Jane', email: 'jane@hotmail.com', time: '1990-01-02' }
];

// All *await* operations need to be within an async context
(async () => {
  // READ OPERATIONS HERE
})();
Let's find all of the users with their birthday on January 1st 1990:
const younger = await db.users.filter({ time: '1990-01-01' });
console.log(younger);
// [{ id: 0, name: 'Francisco', ... }, { id: 2, name: 'John', ... }]
We can also match by their email provider with a bit of Regexp:
// NOTE: assumes object filters also accept RegExp values for matching
const gmail = await db.users.filter({ email: /@gmail\.com/ });
console.log(gmail);
// [{ id: 0, email: 'francisco@gmail.com', ... }, { id: 1, email: 'sarah@gmail.com', ... }]
Or we can apply a manual filter and just loop over each of them:
// A manual filter with a plain callback (the condition is just an example)
const earlyAdopters = await db.users.filter(({ id }) => id < 2);
console.log(earlyAdopters);
// [{ id: 0, name: 'Francisco', ... }, { id: 1, name: 'Sarah', ... }]
You can do all of these operations, but for a single user, by using .find() instead of .filter(), as usual with Javascript. Let's find the one with the id 0:
// Note: we are extending the capabilities of `find` only for this library
const me = await db.users.find({ id: 0 });
console.log(me);
// { id: 0, name: 'Francisco', email: 'francisco@gmail.com', time: '1990-01-01T00:00:00Z' }
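Since all of the native methods are still available, the same lookup can also be written with a plain callback; the stillMe variable below is just for illustration:

// Equivalent lookup using the native callback form of .find()
const stillMe = await db.users.find(({ id }) => id === 0);
console.log(stillMe.name);  // 'Francisco'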
That's awesome! But what if you want to limit your query to N users? In Javascript you can use .slice(), so let's just use it:
// Get the first 10 users
const firstPage = await db.users.slice(0, 10);
console.log(firstPage);
// [{ id: 0, ... }, { id: 1, ... }, ..., { id: 9, ... }]
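Since .slice() keeps the native Array semantics, pagination is just a matter of shifting the indexes. Assuming the table holds more records than the four from the init code:

// Get the second page of 10 users (indexes 10 to 19)
const secondPage = await db.users.slice(10, 20);
console.log(secondPage);
// [{ id: 10, ... }, { id: 11, ... }, ..., { id: 19, ... }]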
Drivers
First let's see some examples of using the different drivers:
import db from 'travel';
// NOTE: the driver constructors below (indexeddb, sql, mongodb, rest) are placeholders;
// the exact initialization API is still being decided (see the next section).
// Each block is an independent example.

// Start prototyping with a global data store
db.users.push({ name: 'Francisco' });
// Creates an array and pushes the item there

// Connect to the browser database storage for quick access on the front-end
const db = indexeddb({ name: 'mydb' });
db.users.push({ name: 'Francisco' });
// const store = event.target.result.createObjectStore("users");
// store.add({ name: 'Francisco' });

// Connect to a remote database that you have defined previously
const db = sql({ url: process.env.DATABASE_URL });
db.users.push({ name: 'Francisco' });
// "INSERT users (name) VALUES (?)" ["Francisco"] (prepared statements)

// Use a MongoDB instance from any host
const db = mongodb({ url: process.env.MONGODB_URL });
db.users.push({ name: 'Francisco' });
// db.users.insertOne({ name: 'Francisco' })

// Make REST API calls to the back-end
const db = rest({ url: 'https://api.example.com/' });
db.users.push({ name: 'Francisco' });
// POST api-url.com/users/ { "name": "Francisco" } (application/json)
TEST. Which one is a better API?
Plain object configuration:
- large codebase since it has all engines
- difficult to write a new driver
- non-scalable for other drivers
- makes changing engine difficult since you also have to change the options
- single global source

Drivers that are initialized with their options:
- too verbose
- cannot do multiple instances
- confusing since it's a pattern not well used
- single global source

Individual instances:
- difficult to write a new driver
- unique instances, bringing a big concurrent mess
- change the API to change the engine; consider removing the `db`

Import a file:
- cannot do multiple instances
- confusing without initializing since it's the same name
- changing engines is a bit more difficult
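To make the comparison concrete, here is a rough sketch of what each option could look like; the names (travel, indexeddb, engine, etc.) and the db1 to db4 variables are only illustrative, not a decided API:

// Rough, illustrative shapes only; none of these is a decided API.
// The db1 to db4 names just keep the four alternatives in a single file.

// Option 1: plain object configuration
import db1 from 'travel';
db1.options = { engine: 'indexeddb', name: 'mydb' };

// Option 2: drivers initialized with their options, then plugged into the global db
import db2, { indexeddb } from 'travel';
db2.engine = indexeddb({ name: 'mydb' });

// Option 3: individual instances created from a factory
import travel from 'travel';
const db3 = travel({ engine: 'indexeddb', name: 'mydb' });

// Option 4: import a driver-specific file directly (no explicit initialization)
import db4 from 'travel/indexeddb';
db4.users.push({ name: 'Francisco' });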
OLD DOCUMENTATION/PROJECT
Data navigation and exploration tool with Javascript:
Note: refactoring needed! This example is the next step for the code, but not yet working
const data = ;
Then read the data with as much nesting as desired:
console;console;console;console;console;
Or perform complex queries with simple dot and array notation:
;;;
Demo: