Data Visualization Experiments with Svelte and Three.js
This is an under-construction collection of time-series data visualization experiments built with Svelte and Three.js, currently driven by simulated financial OHLC (open-high-low-close) data. These experiments explore the limits of client-side performance, immersion, and data density.
The project's name is inspired by an interest in cybernetics and Svelte's former tagline, "Cybernetically enhanced web apps". Around 2020, I switched from Vue to Svelte, influenced by the "Rethinking Reactivity" talk by Svelte creator Rich Harris. I especially like Svelte's performance, philosophy (e.g., HTML as the mother language), and developer experience.
I try to think about performance holistically. This includes using the CSS contain property to limit layout, style, paint, and size recalculations to DOM subtrees; lazily loading or initializing asynchronous components when they become visible, via the Intersection Observer and Page Visibility APIs; and running performance-critical code in web workers, off the main thread. In the context of Svelte, this involves using the $state.raw() rune for large objects and arrays, avoiding the proxying overhead that $state() would add, and following an event-driven design that rarely uses the $effect() rune, and when it does, only together with the untrack() function to keep updates predictable.
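The Svelte side of this approach can be sketched roughly as follows; the feed module, its event names, and the component state are all illustrative, not the project's actual code:

```svelte
<script>
  import { untrack } from 'svelte';
  // Hypothetical data source for the sketch; emits snapshots of OHLC bars.
  import { createFeed } from './feed.js';

  // $state.raw() skips deep proxying: the array is updated by reassignment,
  // which is much cheaper for large datasets than per-property proxy traps.
  let bars = $state.raw([]);

  const feed = createFeed();
  // Event-driven updates instead of effects: the feed pushes new data in.
  feed.on('bars', (next) => {
    bars = next; // reassign the whole array; raw state is not mutated in place
  });

  // The rare $effect(): untrack() reads and writes `selected` without
  // subscribing to it, so this effect reruns only when `bars` changes.
  let selected = $state(0);
  $effect(() => {
    const count = bars.length;
    untrack(() => {
      if (selected >= count) selected = Math.max(0, count - 1);
    });
  });
</script>
```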
Performance expands the design space. For instance, code that is otherwise performance-optimized can afford design choices that are inherently expensive, such as applying backdrop-filter: blur() to semi-transparent elements layered over a real-time 3D data visualization. These semi-transparent, background-blurring UI overlays let the data shine through even when information is rendered on top of it, making the data omnipresent.
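A minimal sketch of such an overlay, with an illustrative class name:

```css
/* Semi-transparent panel floated over the WebGL canvas; the blurred
   backdrop keeps the 3D data visible underneath the UI. */
.overlay {
  position: absolute;
  background: rgb(0 0 0 / 0.4);  /* let the scene shine through */
  backdrop-filter: blur(8px);    /* expensive, budgeted for by fast code elsewhere */
  contain: layout paint;         /* keep recalculations local to the panel */
}
```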
Performance is a precondition for immersion because lag disrupts it. To maximize immersion, every experiment takes up the entire screen and offers a full-screen mode where the device supports it. The peripheral UI does not change unnecessarily, so as not to draw attention away from the main data.
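The full-screen behavior can be sketched with the standard Fullscreen API; the function name is illustrative, and the document object is passed in explicitly so the support check is easy to exercise:

```javascript
// Sketch of a full-screen toggle using the Fullscreen API.
// Returns false when the device or browser does not support it.
async function toggleFullscreen(doc = globalThis.document) {
  if (!doc || !doc.fullscreenEnabled) return false;
  if (doc.fullscreenElement) {
    await doc.exitFullscreen();                    // already full screen: leave it
  } else {
    await doc.documentElement.requestFullscreen(); // enter full screen
  }
  return true;
}
```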
To increase data density, the UI separates elements with borders rather than padding. A design system that relies primarily on borders is harder to implement when consistency and pixel-perfect layout matter, but the effort pays off: it leaves more space for the data and makes the interface more data-centric.
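One way to implement this, with illustrative class names: panels share single-pixel borders and carry no padding, so adjacent edges must be collapsed to avoid doubled lines.

```css
/* Separate panels with shared 1px borders instead of padding. */
.panel {
  border: 1px solid var(--line, #333);
  padding: 0;           /* no inner whitespace: data reaches the edge */
}
.panel + .panel {
  border-top: none;     /* collapse the shared edge between stacked panels */
}
```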
I'm fascinated by the idea of clicking on a hyperlink and being immersed in vast, fast, and dense data streams embedded in a high-performance interface resembling desktop applications. The ultimate goal of this project, which is yet to be reached, is to enhance temporal perception.