
tensornet


Pure JS classification of base64-encoded images with TensorFlow models.

Installation

npm i tensornet --save

Getting Started

Make sure @tensorflow/tfjs-core is installed and a valid TensorFlow backend is set. You also need to choose between the synchronous package jpeg-js and the asynchronous package sharp for image decoding.

# pure JS, fully synchronous (blocking) installation
npm i @tensorflow/tfjs-core jpeg-js
# for async, non-blocking usage
npm i @tensorflow/tfjs-core sharp

View the classify.test.ts file for an example setup.

import { classify, classifyAsync } from "tensornet";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";

await setBackend("wasm");

const classification = await classify(mybase64); // sync decode via jpeg-js
// or use native sharp for roughly 2x performance
const classificationA = await classifyAsync(mybase64);
// output example
// [
//   {
//     className: 'Siamese cat, Siamese',
//     probability: 0.9805548787117004
//   }
// ]
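Both classify and classifyAsync take a base64-encoded JPEG string. A minimal sketch of producing one from a file on disk with Node's fs module (the "./cat.jpg" path and the helper names are placeholders, not part of the tensornet API):

```typescript
import { readFileSync } from "node:fs";

// Encode raw image bytes as base64.
function bufferToBase64(data: Buffer): string {
  return data.toString("base64");
}

// Read a JPEG from disk and encode it for classify()/classifyAsync().
function jpegToBase64(path: string): string {
  return bufferToBase64(readFileSync(path));
}

// const result = await classify(jpegToBase64("./cat.jpg"));
```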

Why

The benefits of using pure JS to process the image fall into a few areas:

  1. Size and portability: the footprint is drastically smaller, since you do not need cairo or any native image-conversion dependencies.
  2. Speed: the calculations are done in-process, without bridging any native calls.
  3. Worker threads: tensors can be used in worker threads, which allows properly using TensorFlow wasm backends in an API service 🥳.
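The worker-thread point is the motivating use case. Below is a runnable sketch of the plumbing with node:worker_threads; the worker body is inlined and returns a stub result, whereas a real service would import tensornet inside the worker and call classifyAsync there. The function names are illustrative, not part of the package:

```typescript
import { Worker } from "node:worker_threads";

// Inlined worker body (CommonJS, via eval). In a real service this would
// import tensornet and run classifyAsync(base64) instead of the stub reply.
const workerSource = `
const { parentPort } = require("node:worker_threads");
parentPort.on("message", (base64) => {
  // const result = await classifyAsync(base64); // real classification here
  parentPort.postMessage([{ className: "stub", probability: 1 }]);
});
`;

// Offload a classification request to a worker thread and resolve with its reply.
function classifyInWorker(base64: string): Promise<unknown> {
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerSource, { eval: true });
    worker.once("message", (msg: unknown) => {
      resolve(msg);
      worker.terminate();
    });
    worker.once("error", reject);
    worker.postMessage(base64);
  });
}
```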

The TF models are checked in locally.

Benchmarks

Example results from tests run on a Mac M1 (64 GB):

| Name | Chars | Size | Sync | Async |
| ---- | ----- | ---- | ---- | ----- |
| jpeg | 26791 | 26.16 KB | 100ms | 50ms |


License

MIT
