Introduction to MapGPU
MapGPU is a WebGPU-native GIS engine. One npm package (mapgpu), 12 tree-shakeable subpaths, a WASM-backed spatial core, and a render pipeline that targets the GPU end-to-end — raster tiles, vector tiles, 3D buildings, glTF models and custom WGSL all on the same encoder.
§ 01 Overview
WebGPU-native matters because the whole draw graph lives on one modern API. Layers share command encoders, uniforms, and bind groups instead of fighting over the WebGL context. Projection math and tile decoding run off-thread in Rust/WASM. There is no WebGL fallback to constrain the shader surface, so features like compute, storage buffers and multi-sample 3D all work as a first-class path.
- Dual-projection globe — Mode2D (Web Mercator plane) and Mode3D (vertical-perspective sphere) driven by the same scene and renderers.
- OGC adapters — WMS, WFS, OGC API Features/Maps, plus CZML, GPX, KML, XYZ and vendor providers (Bing, Mapbox, ArcGIS, WMTS).
- Rust/WASM spatial core — projection, triangulation (earcutr), MVT parsing, spatial clustering, terrain meshing and hillshade.
§ 02 Requirements
MapGPU runs wherever WebGPU does. Today that means:
- Chrome / Edge 113+ — stable, no flags needed.
- Safari 18 beta — experimental, enable in Develop → Feature Flags.
- Firefox Nightly — experimental; enable the dom.webgpu.enabled pref in about:config.
- Node 20+ — only needed to run the examples and the benchmark harness from source.
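Before constructing a view you can guard on WebGPU support with the standard `navigator.gpu` probe. This is the stock Web API, not a MapGPU helper, and it is only a fast pre-check — requesting an adapter can still fail on unsupported hardware:

```typescript
// Returns true when the WebGPU entry point exists on this runtime.
// An adapter request may still fail later, so treat this as a pre-check.
function supportsWebGPU(): boolean {
  const nav = (globalThis as { navigator?: { gpu?: unknown } }).navigator;
  return nav?.gpu !== undefined;
}

if (!supportsWebGPU()) {
  console.warn("WebGPU unavailable — see the browser matrix above.");
}
```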
§ 03 Install
Single package. Every capability is a subpath import, so only what you actually touch ships to the client — the rest is tree-shaken out of your bundle:
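A sketch of what that looks like in practice. The mapgpu/adapters and mapgpu/analysis subpaths are named elsewhere on this page; the mapgpu/layers subpath and the exact export placement shown here are assumptions:

```typescript
// npm install mapgpu
import { MapView, RenderEngine } from "mapgpu";   // core entry (placement assumed)
import { VectorTileLayer } from "mapgpu/layers";  // assumed subpath
import { WMSAdapter } from "mapgpu/adapters";     // subpath named on this page; export name assumed
import { lineOfSight } from "mapgpu/analysis";    // subpath named on this page; export name assumed
```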
§ 04 Your first map
A MapView needs three things: a container, a RenderEngine, and at least one layer. This is the canonical hello-world:
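A hedged sketch of that hello-world. MapView, RenderEngine and RasterTileLayer are named on this page, but the constructor options, the subpath, and the method names are assumptions:

```typescript
import { MapView, RenderEngine } from "mapgpu";
import { RasterTileLayer } from "mapgpu/layers"; // assumed subpath

// 1. a container, 2. a RenderEngine, 3. at least one layer
const view = new MapView({
  container: document.getElementById("map")!, // assumed option name
  engine: new RenderEngine(),                 // assumed option name
});

view.addLayer(                                // assumed API shape
  new RasterTileLayer({ url: "https://tile.example/{z}/{x}/{y}.png" }),
);
```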
That's it. MapView creates the canvas, boots WebGPU, and starts the render loop. Every layer you add lives in the same pipeline, is picked through the same ray caster, and survives a 2D ↔ 3D switch without being torn down.
§ 05 2D ↔ 3D mode
The view exposes a single strategy-pattern hinge: Mode2D (Web Mercator plane) and Mode3D (vertical-perspective sphere). Both run through the same RenderEngine — layers, renderers, events and picking all survive the transition.
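In code the switch might look like this. Mode2D and Mode3D are named on this page; the setMode call and the export placement are assumed API shapes:

```typescript
import { MapView, Mode2D, Mode3D } from "mapgpu"; // export placement assumed

declare const view: MapView;

// Layers, renderers, events and picking survive the swap;
// only the camera controller and projection change underneath.
view.setMode(new Mode3D()); // vertical-perspective sphere (assumed call)
view.setMode(new Mode2D()); // back to the Web Mercator plane
```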
§ 06 What to read next
- Layers — how raster, vector, GeoJSON and WGSL layers differ.
- Tools — drawing, measurement and snapping.
- Analysis — line of sight, buffer, elevation query.
- Architecture — how MapView, RenderEngine, layers and WASM fit together.
- API Reference — every class, method and option.
- Benchmarks — real numbers vs Cesium, MapLibre, OpenLayers and Leaflet.
- Examples — live, runnable scenes that use these APIs.
§ 07 Core concepts
MapView is the public entry — the thing you construct and hand a container to. Underneath, ViewCore owns the shared infrastructure: render engine, layer manager, tile scheduler, buffer cache, render loop. Mode2D and Mode3D are strategy implementations of IViewMode — each owns its own camera controller, projection, and per-frame render call, but they share every layer and renderer above them.
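One way to picture the strategy hinge. IViewMode, Mode2D and Mode3D are named above, but every member shown in this interface sketch is an assumption:

```typescript
// Hypothetical shape of the per-mode strategy described above.
// Each mode owns its camera controller, projection, and frame render call.
interface IViewMode {
  readonly camera: unknown; // per-mode camera controller (assumed member)
  project(lngLat: [number, number]): [number, number, number]; // assumed signature
  renderFrame(encoder: unknown): void; // per-frame render call (assumed member)
}
```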
§ 08 Layers at a glance
Every layer extends LayerBase and implements ILayer. They differ only in how they acquire data and which draw delegate they bind to:
- RasterTileLayer — XYZ/WMS image tiles, GPU-sampled.
- VectorTileLayer — MVT decoded in WASM, triangulated on the GPU side.
- GeoJSONLayer — features from URL or object, passed through the renderer chain.
- GraphicsLayer — ad-hoc graphics; the target surface for drawing/editing tools.
- WGSLLayer — your own vertex + fragment shaders on the same encoder.
- Specialized — Heatmap · Cluster · Particle · Wall · Animated for common effects.
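For instance, the GeoJSON and vector-tile layers from the list above might be added like this (the subpath and constructor options are assumptions):

```typescript
import { GeoJSONLayer, VectorTileLayer } from "mapgpu/layers"; // assumed subpath

const parcels = new GeoJSONLayer({
  url: "https://data.example/parcels.geojson", // features from URL
});

const roads = new VectorTileLayer({
  url: "https://tiles.example/{z}/{x}/{y}.mvt", // MVT, decoded in WASM
});
```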
§ 09 OGC adapters
mapgpu/adapters covers WMS, WFS, OGC API Features, OGC API Maps, KML, GPX, CZML, Bing, Mapbox, ArcGIS and WMTS. Each protocol gets one abstraction — capabilities discovery, feature fetch, tile URL resolution — so the layer above it stays protocol-agnostic.
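The tile-URL-resolution half of that abstraction reduces to template substitution; a self-contained sketch, illustrative rather than MapGPU's internal code:

```typescript
// Resolve an XYZ/WMTS-style URL template for one tile coordinate.
function resolveTileUrl(template: string, z: number, x: number, y: number): string {
  return template
    .replace("{z}", String(z))
    .replace("{x}", String(x))
    .replace("{y}", String(y));
}

// resolveTileUrl("https://tile.example/{z}/{x}/{y}.png", 3, 2, 1)
// → "https://tile.example/3/2/1.png"
```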
§ 10 Draw, measure, snap
Tools operate on a target GraphicsLayer — they add, modify and remove features there. SnapEngine is independent: register any number of source layers and it generates snap candidates for every tool that opts in.
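The candidate generation a snap engine performs can be sketched as a nearest-vertex search — a toy planar version for illustration, not MapGPU's SnapEngine:

```typescript
type Point = [number, number];

// Return the source vertex closest to the cursor within `tolerance`, or null.
function snapCandidate(cursor: Point, vertices: Point[], tolerance: number): Point | null {
  let best: Point | null = null;
  let bestDist = tolerance;
  for (const v of vertices) {
    const d = Math.hypot(v[0] - cursor[0], v[1] - cursor[1]);
    if (d <= bestDist) {
      bestDist = d;
      best = v;
    }
  }
  return best;
}
```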
§ 11 Spatial analysis
Line-of-sight, buffer generation and elevation queries are surfaced through mapgpu/analysis. The hot paths run in the Rust/WASM spatial core — geodesic sampling, terrain intersection, Voronoi buffering — so analyses can fire per-frame without blocking the main thread.
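The per-sample work in a line-of-sight query boils down to stepping along the sight line and testing terrain height at each step. A simplified planar sketch — the real geodesic sampling and terrain intersection live in the WASM core and are more involved:

```typescript
// True when every terrain sample along the segment stays below the
// straight line between the observer and target heights.
function hasLineOfSight(
  heightAt: (t: number) => number, // terrain height at fraction t ∈ [0, 1]
  observerH: number,
  targetH: number,
  samples = 64,
): boolean {
  for (let i = 1; i < samples; i++) {
    const t = i / samples;
    const sightH = observerH + (targetH - observerH) * t; // linear interpolation
    if (heightAt(t) > sightH) return false; // terrain blocks the ray
  }
  return true;
}
```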
§ 12 Custom WGSL layer
Drop your own vertex + fragment shaders into the same render pass via WGSLLayer. Camera and frame uniforms are auto-injected: their bindings are already set by the time your shader runs.
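A sketch of what plugging in your own shader might look like. WGSLLayer is named on this page, but the subpath, the option names, and the binding layout implied by the WGSL string are assumptions:

```typescript
import { WGSLLayer } from "mapgpu/layers"; // assumed subpath

const tinted = new WGSLLayer({
  // Hypothetical: the auto-injected camera/frame uniforms would occupy
  // their own bind group; this minimal shader simply ignores them.
  code: /* wgsl */ `
    @vertex
    fn vs_main(@location(0) pos: vec3<f32>) -> @builtin(position) vec4<f32> {
      return vec4<f32>(pos, 1.0); // camera transform assumed pre-applied
    }

    @fragment
    fn fs_main() -> @location(0) vec4<f32> {
      return vec4<f32>(0.2, 0.6, 1.0, 1.0); // flat tint
    }
  `,
});
```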