SolarSystem Web

Build Tutorial — Porting an iOS app to the browser in a single conversation

1

The Vision — From iOS to Everywhere

You helped me build an amazing iPhone project, SolarSystem. I would also like to make this into a web-based application, written in JavaScript, using Three.js and runnable in all modern browsers. Do you think the web platform is up to it? I'd prefer minimal dependencies apart from Three.js, and I'd want it to look as beautiful (with all the texture maps) and smooth as the iPhone version.
Analysis showed the port was not just feasible but ideal. Three.js maps almost 1:1 to SceneKit: MeshStandardMaterial for PBR, SphereGeometry for planets, Quaternion for rotations, Raycaster for hit testing. The orbital mechanics is pure math — portable as-is. The only dependency needed: Three.js itself, loaded from CDN. No npm, no bundler, no build step.
Confirmed the approach: a single index.html plus 6 ES module files, zero build tooling. Three.js loaded via import map from CDN. All 17 texture files copied from the iOS project. The scene complexity (~30 meshes, 9K star points) is trivial for WebGL.
The key insight was that SceneKit and Three.js share the same rendering model: a scene graph with meshes, materials, lights, and a camera. Every concept has a direct equivalent. The only real difference is the gesture system — iOS has structured UIGestureRecognizer classes, while the web has raw mouse/touch events that need manual state tracking.
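The correspondence can be summarised as a lookup table — a sketch, with the SceneKit classes on the left and the Three.js equivalents actually used in the port on the right:

```javascript
// SceneKit → Three.js equivalents used in the port (summary of the text above).
const sceneKitToThree = {
  'SCNNode': 'THREE.Object3D / THREE.Mesh',
  'SCNSphere': 'THREE.SphereGeometry',
  'SCNMaterial (PBR)': 'THREE.MeshStandardMaterial',
  'SCNQuaternion': 'THREE.Quaternion',
  'SCNHitTestResult': 'THREE.Raycaster',
  'SCNCamera': 'THREE.PerspectiveCamera',
  'UIGestureRecognizer': 'raw mouse/touch events + manual state tracking',
};
```

Only the last row lacks a one-to-one equivalent, which is why the camera controller later had to be written from scratch.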
2

Building the Entire App in One Pass

Yes let's go for it! We have an ocean of tokens available.
Built the complete application in a single pass: read all 12 Swift source files from the iOS project, understood every algorithm and constant, then wrote the full JavaScript port across 6 modules. Every formula was transcribed exactly — the same JPL elements, the same Kepler solver, the same logarithmic distance scaling, the same quaternion rotation model. The UI was designed as HTML/CSS overlays matching the iOS toolbar layout.
6 JavaScript files totalling ~1,680 lines. All 9 planets, 16 moons, the Sun with 4-layer corona, Saturn's rings, 8,920 stars, orbital paths, labels, time controls, and a custom camera controller — all working on first load. The iOS app's CLAUDE.md served as a precise specification.
The port was made possible by the iOS project's meticulous CLAUDE.md documentation. Every constant, every formula, every design decision was recorded — making it possible to reproduce the entire rendering pipeline in a different language and framework without guesswork.
3

The file:// Trap

Can I test it by simply opening index.html as a file in my browser?
No — browsers enforce strict CORS restrictions on file:// URLs. ES module import statements fail, fetch() for the star catalogue CSV is blocked, and TextureLoader can't read local JPEG files. The solution is trivial: python3 -m http.server 8080 starts a local server in one line.
A one-liner gets you running: python3 -m http.server 8080, then open http://localhost:8080. No installation, no configuration, works on any machine with Python 3.
This is the most common pitfall when developing web apps with ES modules. The file:// protocol has an opaque origin, so the browser treats every file as a different origin. A local HTTP server makes everything same-origin. Any static server works — Python, Node's npx serve, VS Code's Live Server extension, etc.
4

Lighting the Scene

This is amazing! It works really well.
The initial version used SceneKit-style light intensity (2000), but Three.js uses physically correct lighting by default with different falloff calculations. Tuned the PointLight intensity down to 50 and set toneMappingExposure to 1.2 with ACES filmic tone mapping to get the natural, film-like look that matches SceneKit's rendering.
Planets are correctly lit: bright day sides facing the Sun, dark night sides in shadow, with just enough ambient light (0x262626) to prevent pure black. PBR roughness values per planet (Earth 0.5, Mercury 0.95, Jupiter 0.9) match the iOS app exactly.
Three.js uses physically correct lighting by default (the old physicallyCorrectLights flag was removed in r155), which means point light intensity is measured in candela, not arbitrary units. Combined with ACES tone mapping, the rendered image maps HDR lighting values to the display's SDR range in a way that mimics film exposure curves — highlights compress gradually instead of clipping.
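Collected as data, the tuned values look like this (a sketch — the constant names are mine, the values are the ones documented above; the commented lines show roughly where they would be applied):

```javascript
// Lighting constants from the tuning pass above (names are illustrative).
// In the app they would be applied roughly as:
//   renderer.toneMapping = THREE.ACESFilmicToneMapping;
//   renderer.toneMappingExposure = LIGHTING.exposure;
//   scene.add(new THREE.PointLight(0xffffff, LIGHTING.sunIntensity));
//   scene.add(new THREE.AmbientLight(LIGHTING.ambientColor));
const LIGHTING = {
  sunIntensity: 50,       // down from the SceneKit-style 2000
  exposure: 1.2,          // ACES filmic tone-mapping exposure
  ambientColor: 0x262626, // just enough fill to avoid pure-black night sides
};

// Per-planet PBR roughness, matching the iOS app.
const ROUGHNESS = { earth: 0.5, mercury: 0.95, jupiter: 0.9 };
```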
5

Sun Corona with Sprites

The iOS app uses nested translucent spheres with additive blending for the Sun's corona. On the web, THREE.Sprite is a better fit — sprites always face the camera (billboard behaviour), which is exactly what glow effects need. Four sprites at 1.3x, 1.8x, 2.8x, and 4.0x the Sun's radius, each with a radial gradient CanvasTexture and AdditiveBlending, produce the same warm multi-layer corona.
The Sun glows naturally at all viewing angles. The inner layer is bright white-yellow, fading through orange to a very faint extended corona. The additive blending means the layers brighten the scene rather than occluding it.
Using depthWrite: false on the sprite materials prevents the transparent parts from writing to the depth buffer, which would create invisible rectangular occluders. The sprites are child objects of the Sun mesh, so they move and scale with it automatically.
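The four layers can be sketched as data (the scales are documented above; the colours and opacities here are illustrative assumptions). In the app each entry becomes a THREE.Sprite with a radial-gradient CanvasTexture, AdditiveBlending, and depthWrite: false, parented to the Sun mesh:

```javascript
// Corona layers: scale is a multiple of the Sun's radius (from the text);
// colour/opacity values are assumed for illustration.
const CORONA_LAYERS = [
  { scale: 1.3, color: '#fff8d0', opacity: 0.9 },  // bright white-yellow core
  { scale: 1.8, color: '#ffd080', opacity: 0.5 },
  { scale: 2.8, color: '#ff9040', opacity: 0.25 },
  { scale: 4.0, color: '#ff7020', opacity: 0.08 }, // faint extended corona
];

// World-space width of one glow sprite: the glow diameter for that layer.
function coronaSpriteSize(sunRadius, layer) {
  return 2 * sunRadius * layer.scale;
}
```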
6

Custom Camera with Touch Support

Three.js ships with OrbitControls, but it doesn't match the iOS app's gesture model: one-finger pan for translation, two-finger for orbit. Built a custom CameraController from scratch that manages spherical coordinates (target, distance, azimuth, elevation) and handles five input modes: left-drag pan, right-drag orbit, scroll zoom, click selection (via Raycaster), and double-click reset. Touch support maps the same gestures: one-finger pan, two-finger orbit+pinch.
Camera behaviour is identical to the iOS app. Click a planet or its label to fly there. Double-click to return to the overview. Right-drag (or two-finger) to orbit around any focus point. Scroll to zoom logarithmically between 0.5 and 250 scene units.
Touch gesture disambiguation required a 5px dead zone: without it, every tap would start a pan. The double-click/tap detector uses a 350ms timeout to distinguish single from double. When a two-finger gesture ends with one finger still down, the controller transitions to one-finger pan mode without jumping. These subtleties don't exist in iOS's UIGestureRecognizer system, which handles them automatically.
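The two disambiguation rules reduce to small pure functions (a sketch — names are hypothetical, the 5px and 350ms thresholds are the documented ones; the clock is injected so the detector is testable):

```javascript
const DEAD_ZONE_PX = 5;    // movement below this is a tap, not a pan
const DOUBLE_TAP_MS = 350; // window for the second tap of a double-tap

// True once a touch has moved far enough from its start point to count
// as a pan rather than a tap.
function exceedsDeadZone(startX, startY, x, y) {
  return Math.hypot(x - startX, y - startY) > DEAD_ZONE_PX;
}

// Returns a function to call on each tap; it reports whether that tap
// completed a double-tap, then resets so a triple-tap doesn't re-fire.
function makeDoubleTapDetector(now = () => Date.now()) {
  let lastTap = -Infinity;
  return () => {
    const t = now();
    const isDouble = t - lastTap <= DOUBLE_TAP_MS;
    lastTap = isDouble ? -Infinity : t;
    return isDouble;
  };
}
```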
7

The Zoom Slider Mismatch

The only issue I can see is that there is still a discrepancy between the zoom slider range and the preset zoom you get when you select a planet.
When clicking a planet in the picker menu, focusCamera() set the camera distance programmatically but didn't update the zoom slider. The slider only synced during the animation loop every 3rd frame — and if the camera moved while paused, it never synced at all. Added syncZoomSlider() calls immediately after both focusCamera() and resetCamera().
Zoom slider now jumps instantly to match the camera distance when selecting a planet or resetting to overview. The same fix was needed in the iOS app (documented in the iOS CLAUDE.md as "Zoom range consistency").
This is a classic state synchronisation bug. The zoom slider, camera distance, scroll wheel, and pinch gesture all modify the same underlying value (cameraController.distance). Every entry point that changes the distance must also update the UI. The iOS app solved this identically — ensuring all zoom controls clamp to the same 0.5–250 range and calling syncZoomFromCamera() after programmatic changes.
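The rule reduces to one helper that every distance-changing entry point calls — a sketch, where the linear mapping to slider units is an assumption but the 0.5–250 clamp range is the documented one:

```javascript
const ZOOM_MIN = 0.5;
const ZOOM_MAX = 250;

// Push the camera distance back into the slider UI, clamped to the shared
// zoom range. Called after focusCamera(), resetCamera(), wheel, and pinch.
function syncZoomSlider(slider, cameraDistance) {
  const clamped = Math.min(ZOOM_MAX, Math.max(ZOOM_MIN, cameraDistance));
  slider.value = clamped;
  return clamped;
}
```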
8

Credits Where Credits Are Due

Where should we attribute sources in the web app?
Added a "Credits" button in the top-right corner of the date bar. Clicking it opens a glass-morphism modal overlay listing every texture source (NASA, Solar System Scope, Planet Pixel Emporium), the HYG star catalogue, JPL orbital elements, and Three.js. Dismissible by clicking the X or clicking outside the panel.
Full attribution is accessible from any point in the app, without cluttering the main interface. Every NASA public domain image, every CC-BY source, every data set is credited with its original author and licence.
The overlay uses backdrop-filter: blur(10px) at z-index 50 (above all other UI). The panel scrolls independently (max-height: 80vh; overflow-y: auto) for smaller screens. Three event listeners handle dismissal: the X button, clicking the backdrop (checking e.target === overlay), and the Escape key would be a natural future addition.
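The backdrop-dismiss logic is small enough to sketch (element names are assumed): a click dismisses only when it lands on the overlay itself, because clicks inside the panel bubble up with a different target.

```javascript
// True only for clicks on the translucent backdrop, not the panel inside it.
function shouldDismissOnBackdropClick(event, overlay) {
  return event.target === overlay;
}

// Wire up the two documented dismissal paths: the X button and the backdrop.
function bindCreditsDismissal(overlay, closeButton, hide) {
  closeButton.addEventListener('click', hide);
  overlay.addEventListener('click', (e) => {
    if (shouldDismissOnBackdropClick(e, overlay)) hide();
  });
}
```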
9

Planet Strip — Textures in the Toolbar

I'm thinking we could move the planet presets to be individual clickable objects in the bottom bar, using the actual planet texture maps — what do you think?
Replaced the dropdown planet picker with a row of 32px circular thumbnails embedded directly in the toolbar. Each planet uses its NASA JPEG as a CSS background-image clipped to a circle. Saturn gets a decorative ring overlay — an elliptical CSS border with rotateX(65deg) 3D transform. The Sun uses a radial gradient. An overview button with a star glyph sits at the far left.
The toolbar now serves as both a control panel and a visual map of the solar system. Every planet is instantly recognisable from its real texture. A divider separates the rocky inner planets from the gas giants. Clicking any thumbnail flies the camera there with an orange selection ring. Hovering reveals the planet name beneath.
The thumbnails use flex: 1 within the toolbar to distribute evenly across the available width. The planet strip takes whatever space remains after the playback controls. Saturn's ring is a pure CSS effect — a border-radius: 50% div with transform: translate(-50%, -50%) rotateX(65deg), positioned absolutely over the planet circle.
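A sketch of how a thumbnail and the ring overlay could be expressed as inline styles — the 32px size, circular clip, and rotateX(65deg) transform are from the text; the helper names and remaining properties are illustrative:

```javascript
// Inline style for one circular planet thumbnail backed by its texture map.
function thumbStyle(textureUrl) {
  return 'width:32px;height:32px;border-radius:50%;' +
         `background-image:url(${textureUrl});background-size:cover;`;
}

// Saturn's decorative ring: an absolutely positioned ellipse tilted in 3D,
// centred over the planet circle.
const SATURN_RING_STYLE =
  'position:absolute;left:50%;top:50%;border-radius:50%;' +
  'transform:translate(-50%,-50%) rotateX(65deg);';
```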
10

Making Touch Work on iOS Safari

On my phone, the buttons all highlight but don't perform any action.
Three things needed adjusting for iOS Safari's touch handling. First, touch-action: none on the full-screen canvas container was causing iOS's gesture system to capture all touches before they reached the overlaid UI — it needed to be scoped to the <canvas> element only. Second, -webkit-user-select: none on <body> was suppressing click event synthesis from touch events — again, scoped to the canvas container. Third, click events weren't always being synthesized from touches in the layered stacking context, so a dual-binding approach was needed.
Introduced the onTap() helper, which binds both touchend and click with a flag to prevent double-firing. Every interactive element now fires immediately on touch. Desktop mouse behaviour is completely unaffected.
The onTap() pattern fires on touchend with preventDefault() (no 300ms delay), and falls back to click for mouse input. A touchFired flag prevents the click from re-firing when both events are synthesized. This is now used for every interactive element in the app — 20+ bindings in total.
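A minimal sketch of the onTap() helper as described (the app's actual implementation may differ in detail):

```javascript
// Bind both touchend and click; touchend wins on touch devices, and the
// touchFired flag swallows the click the browser synthesizes afterwards.
function onTap(el, handler) {
  let touchFired = false;
  el.addEventListener('touchend', (e) => {
    e.preventDefault(); // fire immediately, no legacy click delay
    touchFired = true;
    handler(e);
  });
  el.addEventListener('click', (e) => {
    if (touchFired) { touchFired = false; return; } // synthesized click
    handler(e); // genuine mouse click
  });
}
```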
11

Adaptive Layout — Phone and Desktop

Let's go back to a separate row of planet icons — it works better when running on phone with portrait aspect ratio. Or would adaptive be better?
Adaptive is best — a CSS @media (max-width: 700px) breakpoint switches the toolbar from a single row (controls + planet strip inline) to a two-row layout (planet strip on top, controls below) using flex-direction: column. The planet strip also needed aspect-ratio-aware zoom: portrait phones have a narrow width, so the same camera distance that works on a wide desktop leaves planets tiny. Added a portraitFactor that scales the focus multiplier by 0.5 + 0.5 × min(aspect, 1.0).
Desktop users see a clean single-bar layout. Phone users in portrait get properly sized planet thumbnails in their own row, and selecting a planet fills the narrow width appropriately. Landscape orientation on any device uses the desktop layout.
The aspect-ratio correction is applied in focusCamera(), not in CSS. The camera's field of view is fixed at 60°, so the constraining dimension on portrait screens is width, not height. Without the correction, a planet at the "right" distance for a 16:9 desktop would appear at roughly half its intended visual size on a 9:16 phone — the multiplier drops to ~78% on a typical phone, bringing the camera closer to compensate.
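The correction itself is a one-liner, exactly as the formula above states — the focus multiplier in focusCamera() is scaled by this factor:

```javascript
// Aspect-ratio correction: full multiplier on wide screens, scaled down
// toward 0.5 as the viewport narrows (aspect = width / height).
function portraitFactor(aspect) {
  return 0.5 + 0.5 * Math.min(aspect, 1.0);
}
```

On a 16:9 desktop the factor is 1.0; on a 9:16 phone it is ~0.78, matching the ~78% figure above.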
12

Zoom Tuning — Filling the Frame

Could we fill more of the available browser page with each planet and its moons? And for the planets without moons, let's make the preset zoom out some more so they are a similar visual size as the others.
The iOS app used a 3.7× multiplier on the system extent for camera distance — derived from the narrow iPhone viewport. A browser window is much wider, so the same multiplier leaves the planet system as a small cluster. Reduced to 2.2× for planets with moons. For moonless planets (Mercury, Venus, Uranus, Neptune, Pluto), the extent is just the planet's scene radius (~0.06–0.2), so 2.2× made them fill the entire screen. Introduced a 6.0× multiplier for moonless bodies.
Every planet now fills a comfortable portion of the viewport when selected. Jupiter with its four Galilean moons, Saturn with its ring system and seven moons, and lone planets like Mercury all appear at a visually consistent scale. The camera distance adapts to viewport proportions rather than assuming a phone screen.
The multipliers (2.2× with moons, 6.0× without) were tuned empirically. The underlying formula is cameraDistance = max(extent × multiplier, 0.5), where extent is the maximum of the planet radius and the outermost moon's orbit distance plus its own radius. The 0.5 floor prevents the camera from going inside the body.
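The framing formula reduces to a pure function (multipliers and floor exactly as documented; the parameter names are mine):

```javascript
// Camera framing: extent is max(planet radius, outermost moon orbit + its
// radius); the multiplier depends on whether the planet has moons, and the
// 0.5 floor keeps the camera from entering the body.
function focusDistance(extent, hasMoons) {
  const multiplier = hasMoons ? 2.2 : 6.0;
  return Math.max(extent * multiplier, 0.5);
}
```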
13

Lessons from the Port

Several principles emerged from porting a native iOS app to the web:
What transferred perfectly: All orbital mechanics (pure math), body data (constants), rotation models (quaternions), scaling formulae, texture assets (JPEGs), star catalogue (CSV). The physics doesn't know what platform it's running on.

What needed adaptation: Lighting intensity values (SceneKit vs Three.js physically correct mode), glow rendering (nested spheres → billboard sprites), gesture handling (structured recognisers → raw events with state machines), UI layout (iOS toolbar icons → textured planet strip), camera framing (phone aspect → wide desktop), persistence (UserDefaults → not yet implemented).

What got easier: No Xcode project file management, instant deployment (just serve files), cross-platform by default, CSS glass-morphism UI, DOM labels (simpler than projectPoint() overlay in SwiftUI), CSS textures in toolbar (planet strip thumbnails using background-image).
The most valuable asset in the port wasn't the code — it was the documentation. The iOS app's CLAUDE.md contained every formula, every constant, every design decision with its rationale. This turned a complex reverse-engineering task into a straightforward transcription. The lesson: invest in documentation not just for future developers, but for future platforms.
14

Space Missions — Artemis II in Real Time

I'm thinking we should have a "missions" section, starting with the current Artemis II mission and its Orion spacecraft heading to the Moon. We can show the path like we show planet orbits. Obviously each mission would have a start and end point, but I imagine building up a library of missions in the future.
The mission system needed to solve several problems at once: trajectory data in a different coordinate frame (geocentric vs heliocentric), distance compression that squashes Moon-distance details, multiple vehicles per mission (SLS, SRBs, Orion), and real-time telemetry. The solution was a MissionManager class with a Moon-aligned waypoint frame, automatic rotation to match the Moon's actual orbital position, and CatmullRom smoothing for visual quality.
Artemis II launched April 1, 2026 — while we were building this feature. The trajectory renders as a faint orange loop from Earth around the Moon and back. Three vehicles (SLS, SRBs, Orion) appear and disappear at the correct times. A telemetry panel shows mission elapsed time, distance from Earth (km and miles), and speed (km/s and mph). Animated event banners pop up at key moments: Launch, SRB Separation, Trans-Lunar Injection, Lunar Flyby, Splashdown.
The hardest problem was distance compression. The pow(ratio, 0.4) formula that makes the solar system visible squashes the 10,600 km between the Moon and the flyby point to 0.01 scene units — invisible. Flyby waypoints had to be slightly exaggerated so the trajectory visibly loops around the Moon. CatmullRom curves created kinks when waypoint coordinates reversed direction; keeping values monotonic within each leg solved this.

Two further issues emerged through iterative tuning. The Z-component amplification problem: constant out-of-plane values made the trajectory appear to pass over the Moon's pole instead of behind the far side, fixed by reducing Z to near-zero at closest approach. The flyby speed symmetry problem: hand-crafted post-flyby waypoints initially showed 2.5x the approach speed, yet in a free-return trajectory approach and departure speeds should be roughly symmetric — spacing the post-flyby waypoints wider in time and closer in distance corrected this.

The multi-vehicle architecture was designed with Apollo 11 in mind — CSM and LM separating at the Moon will use the same vehicles array with diverging waypoints.

Later refinements made missions look beautiful by default. The camera azimuth is set ~31° off the Sun direction (0.55 radians) so Earth and Moon appear two-thirds illuminated — a dramatic cinematic framing that required adding optional azimuth/elevation parameters to setCamera(). Event labels were changed from always-visible to timed (~3% of mission duration around each event), keeping the view clean during playback. The bounding box calculation was made robust by computing trajectory extent at both Earth's start and end positions, preventing drift as Earth moves through its orbit.
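The compression can be sketched as a pure function — the 0.4 exponent is the documented one, while the normalisation parameters here are illustrative assumptions:

```javascript
// Power-law distance compression: scene distance grows as pow(ratio, 0.4).
// maxKm and maxSceneUnits define the normalisation (assumed values).
function compressedDistance(km, maxKm, maxSceneUnits) {
  return Math.pow(km / maxKm, 0.4) * maxSceneUnits;
}
```

Because the curve is steep near zero and nearly flat at lunar range, a 10,600 km offset near the Moon collapses to almost nothing in scene units — hence the exaggerated flyby waypoints.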
15

Eleven Missions, ISS, and the Timeline Slider

What other missions should we add? Just some of the most famous ones — the ones people would actually want to see. And we should definitely show the ISS orbiting Earth.
Added 10 more missions across three categories: lunar (Apollo 11, Apollo 13, Chandrayaan-3, Luna 25), interplanetary (Voyager 1, Voyager 2, New Horizons, Pioneer 10, Cassini-Huygens), and near-Earth (ISS).

The key challenge was that interplanetary missions operate in heliocentric coordinates — their waypoints are planetary encounters spread across AU, not Moon-distance loops from Earth. This required a new anchorBody system: each waypoint names a planet, and at initialisation the code imports heliocentricPosition() from the orbital mechanics module to resolve each anchor to the planet's real computed position at that date. An autoTrajectory flag triggers _generateTransferArc(), which computes Hohmann-style elliptical arcs between consecutive anchor points, so missions don't need dozens of hand-placed intermediates.

The ISS was modelled as a procedural Three.js geometry — a central truss with solar panels, radiators, and habitat modules — added as an Earth satellite with a 92-minute orbital period. A dedicated Satellites menu separates artificial objects from natural moons.

A timeline slider below the trajectory lets users scrub through any mission's entire duration, with all UI elements (telemetry, event banners, planet positions, vehicle markers) updating live. URL parameters (?mission=voyager1) allow deep-linking to any mission.
11 missions in total: 1 active (Artemis II), 4 lunar (Apollo 11, Apollo 13, Chandrayaan-3, Luna 25), 5 interplanetary (Voyager 1, Voyager 2, New Horizons, Pioneer 10, Cassini-Huygens), and the ISS as a permanent Earth satellite with a procedural 3D model. Timeline scrub works on all missions, with the slider mapped to the full mission duration. The ?mission= URL parameter pre-selects and focuses any mission on page load.
The anchorBody system solved a subtle but critical problem: interplanetary waypoints defined with approximate coordinates (e.g. Jupiter at x=5.2 AU) wouldn't match the actual computed planet position at the encounter date, because Keplerian elements shift over centuries. A Voyager trajectory passing "near" Jupiter but missing by 0.3 AU in the scene looks wrong. By importing the same heliocentricPosition() function used for rendering planets, each anchored waypoint snaps exactly to the planet's rendered position at the flyby timestamp. The autoTrajectory generator then fills in smooth transfer arcs between these anchor points using elliptical interpolation, avoiding the need to manually plot hundreds of intermediate positions.

The timeline slider presented a different challenge: when the simulation is paused, the normal animation loop doesn't run — but scrubbing needs to update everything. The solution overrides the simulation date during scrub and forces a complete render pass (positions, telemetry, labels, banners) even while paused, so the scene always reflects the slider position.

A third scale problem arose with Apollo 11's LEM descent: the real 45 km descent path compresses to just 0.035 scene units — completely invisible. The flyby exaggeration approach (inflating distances by 10–15%) doesn't work here because the vehicle needs to arrive at the Moon, not pass near it. The solution was runtime interpolation via the moonLanding vehicle property: during descent, the marker lerps from its orbital position to the Moon's actual computed scene position (via moonPosition() each frame), creating clear visible separation from the orbiting CSM. During surface operations it tracks the Moon exactly, and during ascent it lerps back.

Interplanetary speed consistency required a general principle: always add transition waypoints ~1,000–2,000 hours before and after anchored points to prevent speed spikes. This was applied across Cassini, New Horizons, and both Voyagers.
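The anchor resolution step can be sketched like this — the waypoint shape is an assumption, and heliocentricPosition() is the app's own orbital-mechanics function, injected here so the sketch stays self-contained:

```javascript
// Snap anchored waypoints to the planet's computed position at their date;
// waypoints with explicit positions pass through unchanged.
function resolveAnchors(waypoints, heliocentricPosition) {
  return waypoints.map((wp) =>
    wp.anchorBody
      ? { ...wp, position: heliocentricPosition(wp.anchorBody, wp.date) }
      : wp
  );
}
```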
A new anchorMoon system (the geocentric equivalent of anchorBody) resolves waypoints to the Moon's actual ecliptic position, used for Apollo 11's departure waypoints.

The mission camera evolved through several iterations into a lazy-follow system: on selection, the camera snaps to Earth's position + the trajectory's local center (using tight localRadius bounds from getMissionBounds() rather than the wide start+end bounding box). During playback, the camera target lerps toward Earth's current position each frame at lerp(0.02), keeping the trajectory centered while Earth drifts through space. User interaction breaks the follow by clearing activeMissionId.
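The lazy-follow step is a per-frame lerp — the 0.02 factor is the documented one, while the plain-array vector representation is an assumption for the sketch:

```javascript
// Linear interpolation between two 3-vectors.
function lerp3(a, b, t) {
  return [
    a[0] + (b[0] - a[0]) * t,
    a[1] + (b[1] - a[1]) * t,
    a[2] + (b[2] - a[2]) * t,
  ];
}

// One frame of lazy follow: move the camera target 2% of the way toward
// Earth's current position, so the trajectory stays centred as Earth drifts.
function followStep(target, earthPos) {
  return lerp3(target, earthPos, 0.02);
}
```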