Viv
A WebGL-powered toolkit for interactive visualization of high-resolution, multiplexed bioimaging datasets.
About
Viv is a JavaScript library for rendering OME-TIFF and OME-NGFF (Zarr) directly in the browser. The rendering components of Viv are packaged as deck.gl layers, making it easy to compose with existing layers to create rich interactive visualizations.
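For example, a Viv image layer can be combined with any ordinary deck.gl layer in a single Deck instance. The sketch below is illustrative only: it assumes the `ImageLayer` and `loadOmeTiff` exports, and the prop names, URLs, and value shapes shown are placeholders that may differ between Viv versions; consult the documentation for the exact API.

```js
// Illustrative sketch only; prop names/shapes may differ between Viv versions.
import { Deck, OrthographicView } from '@deck.gl/core';
import { ScatterplotLayer } from '@deck.gl/layers';
import { ImageLayer, loadOmeTiff } from '@hms-dbmi/viv';

// Placeholder URL; `data` holds the pixel source(s) that Viv layers consume.
const { data: loader } = await loadOmeTiff('https://example.com/image.ome.tif');

const image = new ImageLayer({
  id: 'viv-image',
  loader,                              // pixel source(s) from a Viv loader
  selections: [{ c: 0, t: 0, z: 0 }],  // which channel/time/plane to display
  contrastLimits: [[0, 65535]],        // per-channel intensity window
  colors: [[255, 255, 255]],           // per-channel RGB color
  channelsVisible: [true]
});

// Any other deck.gl layer composes on top of the image.
const annotations = new ScatterplotLayer({
  id: 'annotations',
  data: [{ position: [1024, 1024] }],
  getPosition: d => d.position,
  getRadius: 25,
  getFillColor: [255, 0, 0]
});

new Deck({
  views: [new OrthographicView({ id: 'ortho' })],
  initialViewState: { target: [1024, 1024, 0], zoom: -2 },
  controller: true,
  layers: [image, annotations]
});
```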
More details and related work can be found in our preprint. Please cite this preprint in your research:
Trevor Manz, Ilan Gold, Nathan Heath Patterson, Chuck McCallum, Mark S Keller, Bruce W Herr II, Katy Börner, Jeffrey M Spraggins, Nils Gehlenborg, "Viv: Multiscale Visualization of High-resolution Multiplexed Bioimaging Data on the Web." OSF Preprints (2020), doi:10.31219/osf.io/wd2gu
💻 Related Software
| Screenshot | Description |
| --- | --- |
| *(Avivator screenshot)* | **Avivator**: A lightweight viewer for local and remote datasets. The source code is included in this repository under `avivator/`. See our documentation for more details. |
| *(Vizarr screenshot)* | **Vizarr**: A minimal, purely client-side program for viewing OME-NGFF and other Zarr-based images. Vizarr supports a Python backend using the imjoy-rpc, allowing it not only to function as a standalone application but also to be embedded directly in Jupyter or Google Colab notebooks. |
💥 In Action
- Vitessce visualization framework
- HuBMAP Common Coordination Framework Exploration User Interface (CCF EUI)
- OME Blog: OME-NGFF and OME-NGFF HCS announcements
- ImJoy I2K Tutorial
- Galaxy Project, which includes Avivator as the default viewer for OME-TIFF files
💾 Supported Data Formats
Viv's data loaders support OME-NGFF (Zarr), OME-TIFF, and Indexed OME-TIFF*.
We recommend converting proprietary file formats to open, standard formats via the bioformats2raw + raw2ometiff pipeline. Non-pyramidal datasets are also supported, provided the individual texture can be uploaded to the GPU (< 4096 x 4096 pixels).
Please see the tutorial for more information.
*We describe Indexed OME-TIFF in our paper as an optional enhancement to provide efficient random chunk access for OME-TIFF. Our approach substantially improves chunk load times for OME-TIFF datasets with large Z, C, or T dimensions that otherwise may incur long latencies due to seeking. More information on generating an IFD index (JSON) can be found in our tutorial or documentation.
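The snippet below sketches how the loaders might be called for each format. The function names (`loadOmeTiff`, `loadOmeZarr`) follow Viv's documented API, but the URLs, the `offsets` option, and the return-value handling shown here are assumptions; consult the documentation for specifics.

```js
import { loadOmeTiff, loadOmeZarr } from '@hms-dbmi/viv';

// OME-TIFF: metadata is read from the embedded OME-XML.
const tiff = await loadOmeTiff('https://example.com/image.ome.tif');

// OME-NGFF: point the loader at the root of the Zarr store.
const ngff = await loadOmeZarr('https://example.com/image.ome.zarr');

// Indexed OME-TIFF: pass precomputed IFD offsets (JSON) so individual planes
// can be fetched without seeking through the whole file.
const offsets = await fetch('https://example.com/image.offsets.json').then(r => r.json());
const indexed = await loadOmeTiff('https://example.com/image.ome.tif', { offsets });

// Each loader resolves to { data, metadata }; `data` is the pixel source(s)
// consumed by Viv's deck.gl layers.
console.log(tiff.metadata, ngff.data, indexed.data);
```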
💽 Installation
$ npm install @hms-dbmi/viv
You will also need to install deck.gl and the other peerDependencies manually.
This step prevents users from installing multiple versions of deck.gl in their projects.
$ npm install deck.gl @luma.gl/core
Breaking changes may occur in minor version updates. Please see the changelog for details.
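As a quick sanity check (purely illustrative), both Viv and its deck.gl peer dependency should resolve from your project after installation:

```js
// Illustrative check that Viv and the deck.gl peer dependency both resolve.
import { Deck } from 'deck.gl';
import { ImageLayer } from '@hms-dbmi/viv';

console.log(typeof Deck, typeof ImageLayer); // expect 'function' for both
```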
📖 Documentation
Detailed API information and example snippets can be found in our documentation.
🏗️ Development
$ git clone https://github.com/hms-dbmi/viv.git
$ cd viv && npm install
$ npm start
Please install the Prettier plugin for your preferred editor; improperly formatted code will fail in our CI.
To run unit and integration tests locally, use `npm test`. Our full production test suite, including linting and formatting, is run via `npm run test:prod`.
To our knowledge, Viv can be developed with Node versions greater than 10. You can check which versions are currently tested in CI by navigating to our GitHub workflow.
🛠️ Build
- `@hms-dbmi/viv` library: `npm run build`
- Avivator viewer: `npm run build:avivator`
📄 Publish
First, check out a new branch like `release/version`. Update `CHANGELOG.md` and bump the version via `npm version [major | minor | patch]`. Commit locally and push a tag to GitHub.
Next, run `./publish.sh` to release the package on npm and publish Avivator.
Finally, make a PR for `release/version` and squash + merge into `master`.
🌎 Browser Support
Viv supports both WebGL1 and WebGL2 contexts, providing coverage across Safari, Firefox, Chrome, and Edge. Please file an issue if you find a browser in which Viv does not work.
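If you want to check a browser yourself, a standard feature-detection snippet (not part of Viv's API) looks like this:

```js
// Generic WebGL feature detection (not Viv-specific): prefer WebGL2 and
// fall back to WebGL1, mirroring the two context types Viv supports.
function detectWebGL(canvas = document.createElement('canvas')) {
  if (canvas.getContext('webgl2')) return 'webgl2';
  if (canvas.getContext('webgl')) return 'webgl1';
  return null;
}

console.log(detectWebGL() ?? 'WebGL unavailable');
```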