SolarSystem

Build Tutorial — A conversation-driven development narrative

The solar system in your pocket

1

The Vision

How about building a solar system simulation for iPhone? I'd love it to use real planetary physics to calculate where all the planets actually are right now, with a beautiful GPU-accelerated 3D render that you can pinch and drag around. The planets should look like the real thing — could we make that work?
SceneKit + Keplerian orbital mechanics was identified as the ideal combination. The physics is well-understood (JPL publishes orbital elements for all planets), the math is lightweight enough for CPU, and SceneKit handles GPU rendering with PBR materials. The key challenge was scale — the real solar system is mostly empty space.
Complete project scaffolded in one pass: 12 Swift files, Xcode project with pbxproj, CLAUDE.md, README.md, architecture.html, tutorial.html. All orbital elements for 9 planets + 16 moons entered from JPL data. The first build succeeded after fixing just two minor type issues.
The decision to use SceneKit over raw Metal saved thousands of lines of rendering boilerplate. SceneKit provides PBR materials, camera control, hit testing, and gesture handling — all GPU-accelerated and part of the standard iOS SDK.
2

First Light — Fixing the Scale

Could we test this thoroughly in the simulator, using command line args to set up starting conditions?
Systematic simulator testing with 16 different launch configurations revealed the first major bug: planet radii were calculated larger than orbit distances. Earth's scene radius was 17 units but its orbit was only 16.5 units away — planets were bigger than the solar system!
Switched to cube-root scaling for planet radii, producing Jupiter ≈ 0.25, Earth ≈ 0.11, Mercury ≈ 0.08 scene units. All 10 bodies verified rendering correctly with procedural textures, PBR lighting, and Saturn's rings.
The formula sceneRadius = (km/1000)^(1/3) × 0.06 compresses the 30x real range (Mercury to Jupiter) into a visually useful 3x range while preserving ordering. Later refined to sqrt scaling for better ratios.
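The cube-root formula can be sketched directly (the constant 0.06 is from the text; the helper name is illustrative):

```swift
import Foundation

// Cube-root compression of planet radii, as described above:
// sceneRadius = (km / 1000)^(1/3) × 0.06. Later chapters refine this to sqrt.
func sceneRadius(forRadiusKm km: Double) -> Double {
    pow(km / 1000.0, 1.0 / 3.0) * 0.06
}

let mercury = sceneRadius(forRadiusKm: 2_440)   // ≈ 0.08 scene units
let earth   = sceneRadius(forRadiusKm: 6_371)   // ≈ 0.11
let jupiter = sceneRadius(forRadiusKm: 69_911)  // ≈ 0.25
// The ~30x real range (Mercury → Jupiter) compresses to ~3x, ordering preserved.
```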
3

Making the Sun Shine

The Sun looks very flat in texture. Could we improve it?
Added procedural surface texture with granulation (800 convection cells), supergranulation (40 larger cells), and limb darkening. Multi-layer corona with 4 glow spheres using radial gradient textures and additive blending. Initially added sunspots, but removed them for accuracy — real sunspot positions require live NOAA data. Sun rotation tied to real sidereal period (25.05 days).
Sun now shows realistic surface detail up close, with a smooth multi-layer corona that fades naturally. Rotation is physically accurate — visible only at high time scales, matching the real 25.05-day period.
Each glow layer uses blendMode = .add with writesToDepthBuffer = false to avoid z-fighting. The radial gradient texture is generated via CGGradient.drawRadialGradient() and applied as both diffuse and transparent contents.
4

NASA Textures

For Earth, Jupiter, Saturn, the Moon, and Mars — are there publicly available images we could texture map onto the surfaces to make them look as realistic as possible?
Downloaded real imagery from NASA, USGS, and other sources: Earth (Blue Marble, 5400×2700), Moon (LRO, 1024×512), Mars (Viking MDIM21, 4096×2048), Jupiter (NASA Cassini PIA07782), and Saturn (Cassini composite). Added as bundled JPGs with equirectangular projection for sphere mapping.
Dramatic visual improvement. Real continents on Earth, actual cloud bands on Jupiter, Valles Marineris on Mars, real craters on the Moon. A huge step from procedural textures to real planetary imagery.
NASA imagery is public domain (US government work). All textures are equirectangular projection, which SceneKit maps onto SCNSphere automatically via the default UV mapping. Later, Jupiter and Pluto were upgraded to direct NASA public-domain sources for cleaner licensing.
5

Getting the Sizes Right

Would it make sense to make moons and planets correct relative size to each other? The Moon seems enormous next to Earth.
Refined the scaling system in three ways: (1) switched from cube-root to sqrt for planet radii, preserving more of the real size hierarchy; (2) moons now use real radius ratio to their parent with a minimum floor; (3) moon distances use compressed real orbital ratios instead of hardcoded values. The formula pow(realRatio, 0.4) × 1.5 preserves relative ordering while keeping moons visible near their parent.
Jupiter is now visibly 3.3× Earth. The Moon is 0.27× Earth (real ratio). Galilean moons are properly tiny around Jupiter. Moon distances scale proportionally — close moons stay close, distant ones (Iapetus) extend further out.
The camera framing calculation uses the outermost moon position as the system extent, then positions the camera at extent × 3.7 to fill roughly half the screen (derived from FOV 60°, half-screen subtending 15°, 1/tan(15°) ≈ 3.7).
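The 3.7 multiplier falls straight out of the trigonometry (a sketch; the variable names are illustrative):

```swift
import Foundation

// With a 60° vertical FOV, the half-FOV is 30°; filling half the screen means
// the system extent subtends half of that, 15°. distance = extent / tan(15°).
let fovDegrees = 60.0
let halfScreenAngle = fovDegrees / 2.0 / 2.0                // 15°
let multiplier = 1.0 / tan(halfScreenAngle * .pi / 180.0)   // ≈ 3.73

func cameraDistance(forExtent extent: Double) -> Double {
    extent * multiplier
}
```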
6

Labels That Actually Work

Labels still vary in size — a constant small but readable size would be great. Also, a constant distance on-screen would be best.
Refactored from 3D SCNText nodes to SwiftUI overlay labels. Each body's 3D position is projected to screen coordinates via SCNView.projectPoint(), then rendered as a SwiftUI Text view at a constant pixel size. Planets get 12pt medium weight, moons get 10pt slightly dimmer. Labels behind the camera or off-screen are culled.
Pixel-perfect labels at every zoom level. Tapping a label flies to that body. Labels are future-proof for adding info popups, animations, or interactive features.
The projectPoint() method returns a SCNVector3 where z > 1.0 indicates the point is behind the camera. The overlay uses GeometryReader with .position() modifier for absolute positioning, and .allowsHitTesting(true) with .onTapGesture for interaction.
7

Taking Control of the Camera

Planet shortcuts worked a few times, then stopped! Also, single finger drag for translation and another gesture for rotation feels more natural.
The root cause was SceneKit's built-in allowsCameraControl maintaining internal state that overrides programmatic camera changes. Replaced it entirely with a custom Coordinator that manages explicit camera state (target, distance, azimuth, elevation) with spherical-to-cartesian conversion. Swapped gestures: one-finger pan translates, two-finger pan orbits, pinch zooms.
Planet shortcuts now work reliably every time. Navigation feels natural — one finger slides the view like a map, two fingers rotate around the current focus. Double-tap returns to the full solar system overview. A house icon in the toolbar provides an always-visible home button.
The coordinator stores cameraTarget, cameraDistance, orbitAngleX (azimuth), and orbitAngleY (elevation). The updateCamera() method converts to cartesian: x = target.x + dist × cos(elev) × sin(azim). Planet shortcuts call setCamera(target:distance:) directly, bypassing any gesture state. The pendingFocus mechanism handles launch-arg focus by deferring until the coordinator's didSet fires.
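The spherical-to-cartesian conversion in updateCamera() can be sketched as follows (SIMD3<Float> stands in for SCNVector3; the y and z components complete the formula quoted for x):

```swift
import Foundation
import simd

// Explicit camera state: target, distance, azimuth (orbitAngleX),
// elevation (orbitAngleY), converted to a cartesian camera position.
struct CameraState {
    var target = SIMD3<Float>(0, 0, 0)
    var distance: Float = 40
    var azimuth: Float = 0
    var elevation: Float = 0

    var position: SIMD3<Float> {
        SIMD3(
            target.x + distance * cos(elevation) * sin(azimuth),
            target.y + distance * sin(elevation),
            target.z + distance * cos(elevation) * cos(azimuth)
        )
    }
}
// With azimuth = elevation = 0 the camera sits on the +z axis at `distance`,
// looking back at the target. Planet shortcuts just overwrite these fields.
```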
8

Saturn's Rings Done Right

The Saturn ring texture doesn't seem to have been mapped radially — the bands look straight rather than circular.
SCNTube cap UVs map linearly across the disc, not radially. Built custom ring geometry from scratch: a flat disc with 72 radial segments × 4 ring segments, with explicit UV coordinates where u maps from inner radius (0) to outer radius (1) radially, and v maps around the circumference. Applied Cassini colour map and transparency pattern.
Ring bands now curve concentrically around Saturn as they should, with realistic colour variation and transparency gaps from the Cassini imagery. The Cassini Division gap is visible in the alpha map.
The custom geometry uses SCNGeometrySource for vertices, normals, and texture coordinates, with SCNGeometryElement for triangle indices. The key insight: UV u coordinate runs radially (0 at inner edge, 1 at outer edge) so the 915×64 strip texture wraps correctly from inner to outer ring.
[Diagram] SCNTube (linear UVs): bands render as straight lines. Custom disc (radial UVs): bands curve concentrically — u runs inner → outer, v runs around the circumference.
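The vertex and UV layout is pure math before it ever touches SceneKit — a sketch of the disc generation (the arrays map one-to-one onto SCNGeometrySource vertex and texcoord data; the function name is illustrative):

```swift
import Foundation
import simd

// Flat ring disc with radial UVs: u runs inner radius (0) → outer radius (1),
// v runs around the circumference. The seam row is duplicated so the texture
// wraps cleanly at 360°.
func ringMesh(inner: Float, outer: Float,
              radialSegments: Int = 72, ringSegments: Int = 4)
    -> (positions: [SIMD3<Float>], uvs: [SIMD2<Float>]) {
    var positions: [SIMD3<Float>] = []
    var uvs: [SIMD2<Float>] = []
    for i in 0...radialSegments {
        let v = Float(i) / Float(radialSegments)
        let angle = v * 2 * .pi
        for j in 0...ringSegments {
            let u = Float(j) / Float(ringSegments)
            let r = inner + u * (outer - inner)
            positions.append(SIMD3(r * cos(angle), 0, r * sin(angle)))
            uvs.append(SIMD2(u, v))   // u: radial, v: circumferential
        }
    }
    return (positions, uvs)
}
```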
9

Textures for Every World

Could we texture map all the remaining planets and major moons from NASA data? Also, could the textures rotate realistically?
Downloaded equirectangular texture maps for all 9 planets and 5 major moons from NASA (Earth, Moon, Mars, Jupiter, Pluto, Europa), Solar System Scope CC-BY 4.0 (Mercury, Venus, Uranus, Neptune), Planet Pixel Emporium (Saturn), and Voyager/Galileo composites (Io, Ganymede, Callisto). Added IAU rotation data for every body — sidereal rotation period, axial obliquity, and prime meridian at J2000.0. Tidally locked moons (including Earth's) match their orbital period so the same face always points toward their parent. Venus rotates backwards at 177°, Uranus rolls on its side at 98°.
Every body in the simulation now has a real photographic texture rotating at its correct rate. Speed up time to 10,000x to see Jupiter's cloud bands spin (one rotation every 3.6 seconds), or watch the Moon keep its familiar face toward Earth. Saturn's rings stay fixed in the equatorial plane while the cloud bands rotate beneath.
The ring counter-rotation was a subtle bug — since rings are child nodes of Saturn, they inherited the parent's spin. Fixed by setting ringNode.eulerAngles.y = -spin each frame to cancel the parent's rotation while preserving the axial tilt.
10

8,920 Real Stars

Is the star field accurate, or just random points?
Downloaded the HYG v38 star catalogue (Hipparcos/Yale/Gliese amalgamation), filtered to 8,920 naked-eye stars. Each star mapped to the celestial sphere from its RA/Dec coordinates, sized by magnitude across 4 brightness tiers, and coloured by B-V index (blue-white O/B stars through yellow G stars to red M stars). The ~120 brightest named stars are labelled.
Recognisable constellations visible in the background. The Milky Way's galactic plane shows as a clear density concentration. Zoom in on any planet and see the real stars behind it. Sirius, Betelgeuse, Polaris, Vega — all in their correct positions with correct colours and relative brightnesses.
Star labels are occluded behind planet discs by projecting each body's position to screen space and computing an approximate screen radius. Stars whose projected position falls within a planet's disc are culled. Labels use the same deconfliction system as moons, with independent show/hide toggles for planets, moons, and stars.
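The occlusion test itself is a simple screen-space disc check — a sketch of the culling step, with illustrative names:

```swift
import simd

// A star label is hidden when its projected screen position falls inside any
// planet's approximate screen-space disc (centre + radius in pixels).
struct ScreenDisc {
    var center: SIMD2<Float>
    var radius: Float
}

func starLabelVisible(at point: SIMD2<Float>,
                      behind discs: [ScreenDisc]) -> Bool {
    !discs.contains { simd_distance(point, $0.center) < $0.radius }
}
```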
11

Polish and Persistence

Could we have separate toggles for planet, moon, and star labels? And persist all the settings? Also a zoom slider, and an app icon?
Split the single label toggle into three independent controls (planets, moons, stars) in a popup menu. All settings persisted to UserDefaults. Added a horizontal zoom slider with logarithmic mapping above the toolbar — labels hide during drag for performance. Generated a programmatic dark-mode app icon with stylised planets, orbits, and starfield. Cleaned up ~250 lines of dead code (old 3D labels, procedural planet textures replaced by NASA imagery, unused methods) and added descriptive comments throughout all 12 source files.
A polished, feature-complete orrery. Settings persist across launches. The zoom slider provides precise control from planet close-ups to the full solar system. The app icon captures the essence of the simulation in a single glance. Code is clean and well-documented for future development.
Performance was critical for the zoom slider. The original SwiftUI Slider caused frame drops by publishing changes every pixel. Replaced with a custom DragGesture control, and label projection is throttled to every 3rd frame. Labels are hidden entirely during zoom drags via an isZooming flag. Licence research led to replacing Jupiter and Pluto textures with direct NASA public-domain sources, keeping only Saturn's Planet Pixel Emporium textures (the best available for ring detail).
12

The Wobble Bug

Do Saturn's rings really change attitude quite often? At 10,000x they're oscillating every few seconds.
The rings were wobbling because of a subtle Euler angle composition bug. SceneKit applies eulerAngles in Y-X-Z order, so (tilt, spin, 0) means: yaw by spin, THEN pitch by tilt. This causes the tilt axis itself to rotate with each spin cycle — the rings appeared to oscillate at the planet's rotation frequency. At 10,000x, Saturn spins once every 3.8 seconds, making the wobble obvious.
Switched from Euler angles to quaternion composition: tiltQuat * spinQuat. The tilt quaternion is applied in world space (fixed), and spin happens around the tilted pole axis (local). Saturn's rings now hold perfectly steady while the cloud bands rotate underneath. All planets benefit — Earth's 23.4° tilt, Uranus's 97.8° sideways roll, all rock-solid at any time scale.
Quaternion multiplication order matters: tiltQuat * spinQuat applies spin first (in the body's pre-tilt frame), then tilt (in world space). The ring's counter-rotation also uses quaternions: simd_quatf(angle: -spin, axis: (0,1,0)) in the local frame cancels the parent's spin while inheriting the stable tilt. This is a classic 3D rotation pitfall — Euler angles are intuitive to write but compose incorrectly when axes depend on each other.
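The composition can be verified in isolation with simd quaternions (a sketch; the choice of z as the tilt axis is an assumption for illustration — any fixed world axis shows the same property):

```swift
import simd

// tiltQuat * spinQuat: spin happens about the pre-tilt local y axis,
// then the world-fixed tilt is applied — so the pole no longer precesses.
func orientation(tiltRadians: Float, spinRadians: Float) -> simd_quatf {
    let tilt = simd_quatf(angle: tiltRadians, axis: SIMD3(0, 0, 1))
    let spin = simd_quatf(angle: spinRadians, axis: SIMD3(0, 1, 0))
    return tilt * spin
}

// The rings' local counter-rotation cancels the parent's spin but inherits
// the stable tilt: tilt * spin * spin⁻¹ = tilt.
func ringCounterRotation(spinRadians: Float) -> simd_quatf {
    simd_quatf(angle: -spinRadians, axis: SIMD3(0, 1, 0))
}
```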
13

Back-porting from the Web — Phase 1 Foundations

It's been a while since we touched the iOS project. Meanwhile we built a web version in ../solarsystem-web with space missions, an ISS, and a bunch of other features. Would you be up for porting those enhancements back into iOS? I'd love to do it in stages and apply everything we've learned about unit testing and simulator verification along the way.
Proposed a six-phase plan: (1) foundations, (2) mission infrastructure plus Apollo 11 end-to-end as a proof of concept, (3) the remaining ten missions with waypoint data exported as JSON from the web, (4) the mission UI layer (menu, timeline slider, telemetry panel, event banners), (5) ISS, (6) polish. Phase 1 centralised the moon-distance compression constants so trajectory rendering can reuse them: SceneBuilder.moonDistExponent moved from 0.4 to 0.6, giving the Moon a scene distance of 17.6 Earth radii (real 60.3) instead of 8.8. Pure-math statics were marked nonisolated so unit tests don't need to jump to the main actor. The camera coordinator's setCamera gained optional azimuth and elevation parameters plus read-only accessors — groundwork for the Sun-side mission-framing camera in Phase 6.
Eight unit tests covering scene distance, sqrt radius clamps, moon compression, monotonicity, and the zero-parent-radius guard. Simulator verification showed Jupiter's Galilean moons, Saturn's moons, and Earth's Moon all spread out more realistically with the new exponent.
SceneBuilder is @MainActor-isolated for SceneKit access, which bled into every static call site. The fix was to mark the pure-math statics (sceneDistance, sceneRadius, moonSceneDistance, and their constants) nonisolated. This lets tests call them freely from Swift Testing's default non-main context while preserving main-actor safety for everything that actually touches SceneKit state.
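The isolation split looks roughly like this (a minimal sketch — the real SceneBuilder has more members, and the body shown uses the moon-compression formula from this phase):

```swift
import Foundation

// The type stays @MainActor for SceneKit work, but pure-math statics are
// nonisolated so Swift Testing can call them from its default non-main context.
@MainActor
enum SceneBuilder {
    nonisolated static var moonDistExponent: Double { 0.6 }

    nonisolated static func moonSceneDistance(realRatio: Double) -> Double {
        pow(realRatio, moonDistExponent) * 1.5
    }
    // Anything that actually touches SCNNode state stays main-actor isolated.
}
```

With the 0.6 exponent, the Moon's real distance of 60.3 Earth radii compresses to about 17.6 scene radii, matching the number quoted above.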
14

Apollo 11 — Phase 2 Mission Infrastructure

Let's do Phase 2. Start with Apollo 11 end-to-end — if it works, the rest are mechanical. And remember: thorough unit tests plus simulator verification before calling it done.
Built four new files: Mission.swift for the data shape (Mission / Vehicle / Waypoint / MissionEvent / MoonOrbit / MoonLanding), MissionData.swift with Apollo 11's three-vehicle timeline (Saturn V, Columbia, Eagle), CatmullRom.swift as a pure-Swift port of Three.js's centripetal spline (alpha = 0.5), and MissionManager.swift for scene-graph construction, per-frame updates, telemetry, and event detection. The trajectory line uses SCNGeometryElement with primitive type .line and a per-vertex colour gradient via the .color semantic, matching how the starfield renders its B-V colours. Moon-aligned waypoints rotate into the ecliptic frame at init using atan2(moonPos.y, moonPos.x) at flyby time. anchorMoon waypoints snap to the Moon's actual direction but at its semi-major-axis distance, so they land on the rendered Moon mesh instead of its eccentrically-offset true position. During runtime, Columbia orbits the Moon with cos/sin motion in a plane perpendicular to the Earth-Moon line, while Eagle walks a three-phase lifecycle: orbit, descent-to-surface snap, post-ascent orbit.
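The centripetal spline (alpha = 0.5, i.e. square-root knot spacing) can be sketched as a single Barry–Goldman segment evaluation; the knot clamp guards against coincident waypoints, one of the gotchas recorded later:

```swift
import Foundation
import simd

// Centripetal Catmull-Rom: evaluate the segment between p1 and p2 at t ∈ [0, 1].
func catmullRomCentripetal(_ p0: SIMD3<Float>, _ p1: SIMD3<Float>,
                           _ p2: SIMD3<Float>, _ p3: SIMD3<Float>,
                           t: Float) -> SIMD3<Float> {
    // alpha = 0.5 means knot spacing grows with sqrt of chord length.
    func dt(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> Float {
        max(sqrt(simd_distance(a, b)), 1e-4)    // clamp: coincident points
    }
    let t0: Float = 0
    let t1 = t0 + dt(p0, p1)
    let t2 = t1 + dt(p1, p2)
    let t3 = t2 + dt(p2, p3)
    let u = t1 + t * (t2 - t1)                  // map t onto [t1, t2]
    func lerp(_ a: SIMD3<Float>, _ b: SIMD3<Float>,
              _ ta: Float, _ tb: Float) -> SIMD3<Float> {
        a * ((tb - u) / (tb - ta)) + b * ((u - ta) / (tb - ta))
    }
    let a1 = lerp(p0, p1, t0, t1)
    let a2 = lerp(p1, p2, t1, t2)
    let a3 = lerp(p2, p3, t2, t3)
    let b1 = lerp(a1, a2, t0, t2)
    let b2 = lerp(a2, a3, t1, t3)
    return lerp(b1, b2, t1, t2)
}
```

At t = 0 the curve lands exactly on p1, and at t = 1 exactly on p2 — the endpoint property the unit tests below verify.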
Twenty unit tests all passing — CatmullRom endpoints land exactly on control points, time-parameterised sampling hits each waypoint at its declared timestamp, anchorMoon snaps to within 1% of the Moon's semi-major axis, events fire once and reset cleanly after a rewind past launch. In the simulator, -mission apollo11 -focus earth jumps to 16 July 1969, auto-selects 10,000x replay speed, and renders the trajectory arcing from Earth to the Moon and back. Columbia's marker tracks the trajectory, snaps into a lunar orbit around T+76h, and exits cleanly on the return leg.
One subtle bug slowed the first end-to-end run: cameraCoordinator.didSet was calling focusOnBody before updatePositions had ever run, so the camera locked onto Earth's default-initialised position at the origin — which is where the Sun is. The fix was to call updatePositions(projectLabels: false) synchronously inside the didSet, so every node is at its simulated-date position before focus maths read from it. Worth remembering for any feature that sets an initial camera target from node positions during app boot.
15

Keeping the Docs Live

Time to make sure CLAUDE.md, README.md, architecture.html, and tutorial.html are up to date — and let's keep them current at the end of every phase from now on.
Refreshed the port-status table and added a full Missions section to CLAUDE.md (architecture, mission data shape, coordinate handling, runtime phases, scene graph, telemetry, event detection, tests), noted the new -mission launch arg, and added six new gotchas to the Known Gotchas list (main-actor isolation for pure math, Moon SMA alignment, anchorMoon scale, event rewind reset, CatmullRom knot clamp, line primitive indexing). Added a Missions Pipeline Mermaid diagram to architecture.html and wove the mission group into the scene-graph diagram. Updated README.md with the missions feature callout and the -mission launch arg. Recorded the feedback as a persistent rule so future phases don't skip the doc-refresh step.
All four living documents describe the current state consistently: Phase 1 and Phase 2 complete, Apollo 11 shipping, and Phases 3–6 queued up. Future phases will end with the same doc-refresh pass.
The project conventions already required living docs (“must be updated whenever a new file, service, or architectural decision is added”), but under momentum it's easy to defer. Saving the rule to persistent memory with a clear “why” — contradictions between docs are worse than stale content — turns it into a checkpoint rather than an afterthought.
16

Ten More Missions — Phase 3 via JSON Export

Let's go! On to Phase 3 — export the waypoints from the web JS once as bundled JSON, then port the other ten missions. And keep the same unit-test and simulator-verification rigour.
Wrote tools/export-missions.mjs: a small Node 22 extractor that slices ../solarsystem-web/js/missions.js up to class MissionManager, stubs out its imports (THREE, sceneBuilder, orbitalMechanics, solarSystemData, textureGenerator), evaluates the remaining data declarations in a node:vm sandbox, and serialises ALL_MISSIONS to SolarSystem/Resources/Missions.json. 2,481 lines of JSON: 11 missions, 11 vehicles, 58 events, 213 waypoints. Rewrote MissionData.swift as a loader over a small DTO layer (MissionJSON / VehicleJSON / WaypointJSON) so the domain types stay free of Codable boilerplate. Added Vehicle.autoTrajectory and MissionManager.generateTransferArc — a port of the web's Hohmann-style arc generator — so Perseverance's two anchor points (Earth launch, Mars landing) expand into a smooth elliptical transfer. Registered Missions.json as a bundled resource via four new pbxproj entries.
Twenty-four unit tests all passing, including four new ones: every mission loads from JSON, the seven interplanetary missions are flagged heliocentric, Perseverance carries autoTrajectory: "transfer", and the transfer arc generates a monotonically increasing timeline. Simulator verification showed each trajectory clearly: Cassini's full VVEJGA gravity-assist loop, Voyager 1 heading to Jupiter, Parker's multi-loop solar approach, BepiColombo's Mercury orbit, Apollo 13's free-return swing, and Artemis II's crewed lunar flyby arc.
The DTO pattern turned out to be the right call even for "almost the same shape" JSON. The domain Mission / Vehicle have custom initialisers with ergonomic defaults, which precludes synthesised Decodable. Adding a thin DTO layer with toDomain() conversion keeps both sides clean: the JSON side can grow fields over time without touching the domain layer, and the domain layer keeps whatever shape is most useful to the rest of the app. The node:vm sandbox approach also beat every alternative for the one-shot extraction — no ESM loader hackery, no npm install, just run the file in a context where its imports resolve to empty stubs.
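The pattern in miniature — a Codable DTO plus toDomain(), keeping Codable out of the domain type (field names here are illustrative, not the project's exact schema):

```swift
import Foundation

// JSON-facing shape: synthesised Decodable, optional where the JSON may omit.
struct VehicleJSON: Codable {
    let id: String
    let name: String
    let autoTrajectory: String?

    func toDomain() -> Vehicle {
        // Ergonomic default applied during conversion, not in the JSON layer.
        Vehicle(id: id, name: name, autoTrajectory: autoTrajectory ?? "none")
    }
}

// Domain shape: no Codable, free to keep custom initialisers and defaults.
struct Vehicle {
    let id: String
    let name: String
    let autoTrajectory: String
}

let json = #"{"id": "perseverance", "name": "Perseverance"}"#.data(using: .utf8)!
let vehicle = try! JSONDecoder().decode(VehicleJSON.self, from: json).toDomain()
```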
17

Mission UI — Phase 4 Telemetry, Timeline, Events

Let's go for it. You're doing very well!
Built four SwiftUI overlays into a single MissionUIViews.swift file so the mission UI lives together. A MissionsMenu dropdown sits in the toolbar (airplane-departure icon, listing all 11 missions with checkmarks, plus a Stop replay item). A MissionTimelineSlider is pinned above the zoom slider with an orange thumb and MET readout; dragging pauses playback via a one-way timelineScrubbing flag that restores the prior pause state on release. A MissionTelemetryPanel glass-morphism HUD shows Mission Elapsed Time, distance (auto km or AU based on reference frame), and speed. A MissionEventBannerView slides up for four seconds whenever the simulation first crosses an event timestamp. The view model grew an updateMissionUIState() step that runs every third frame (the same cadence as label projection) to compute telemetry, publish the event banner, and arm a one-shot end-of-mission speed reset so the simulation stops racing past splashdown.
Thirty-two unit tests all passing, including seven new ones covering MET formatting at the day boundary, distance formatting across heliocentric AU / small km / thousands of km, and speed formatting above/below 100 km/s. Simulator verification showed Apollo 11's TLI event banner firing at T+2.83h, the telemetry panel reading T+10:19:39 / 39k km / 1.26 km/s during translunar coast, and the orange timeline slider tracking mission progress from T+0 through T+8d.
Three subtleties made the UI feel right. First, the event banner uses a fresh UUID() as its SwiftUI identity rather than the event name, so SwiftUI re-runs the slide-in animation when the same event fires twice (rewind + replay). Second, the timeline slider's timelineScrubbing Published property toggles isPaused through a saved wasPausedBeforeScrub flag, so the view model can't fight the drag gesture with per-frame elapsed-hours sync. Third, one test expression tripped Swift's type-checker timeout (2 * 24 + 14 + 32.0 / 60.0 + 8.0 / 3600.0) — the fix was splitting into explicitly-typed Double intermediates. Worth noting for any test file that mixes Int and Double literals in long arithmetic.
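The type-checker fix is worth seeing concretely — the same arithmetic from the failing test expression, split into explicitly typed Double intermediates:

```swift
import Foundation

// Original single expression (2 * 24 + 14 + 32.0 / 60.0 + 8.0 / 3600.0) mixed
// Int and Double literals and tripped the type-checker timeout. Splitting it
// into typed intermediates compiles instantly.
let days: Double = 2 * 24
let hours: Double = 14
let minutes: Double = 32.0 / 60.0
let seconds: Double = 8.0 / 3600.0
let expectedHours = days + hours + minutes + seconds   // ≈ 62.5356
```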
18

ISS — Phase 5 Satellites

Please deploy to my phone, then proceed with 5 and 6.
Added the ISS to SolarSystemData with real orbital elements (408 km altitude, 51.6° inclination, 92.7-minute period) as a regular moon of Earth, so the existing moon-positioning, label-projection, and rotation pipelines apply for free. SceneBuilder.createBodyNode was special-cased for body.id == "iss" to return a procedural cross-shaped model instead of a sphere: a central truss (long horizontal box), pressurised modules (shorter cross-bar), four pairs of blue solar panels laid flat in the orbital plane, and two white radiators. All constant-lit, zero external assets. Added showISS to the view model (UserDefaults-backed, off by default) and an antenna.radiowaves Satellites menu button in the toolbar. The label-projection path skips ISS when the toggle is off so "ISS" doesn't float next to Earth with no mesh beneath it.
The ISS renders as a recognisable cross-shaped silhouette at close zoom, complete with its characteristic solar panel pairs. Zooming in shows the 3D structure clearly; at overview zoom it reduces to a tiny bright dot just above Earth. Two unit tests added to confirm ISS is registered as an Earth moon with the right orbital elements.
Treating ISS as a moon was a clean shortcut — the per-frame position update, quaternion rotation, and label projection all work without any ISS-specific code paths. The only special-case is createBodyNode (procedural geometry) and one label-skip when hidden. This pattern will extend naturally to other satellites like Hubble or JWST: add the orbital elements, add to the relevant planet's moons array, and (optionally) define a procedural model.
19

Lazy-Follow Camera — Phase 6 Polish

Please deploy to my phone, then proceed with 5 and 6. Don't forget to keep the .md and .html files updated :)
For lunar missions, the camera now computes the trajectory's local bounding box (MissionManager.missionBounds) and snaps to Earth plus that local centre, with a Sun-side azimuth (Earth's ecliptic direction plus 0.55 radians, for a terminator-on-the-far-side view) and an elevation of 0.3 radians. The distance is sized to fit the trajectory's local radius with 1.4× padding for portrait viewports. A per-frame lerp at 0.02 keeps the camera target glued to earthScenePos + localCentre, so the trajectory stays centred as Earth drifts heliocentrically. Gesture handlers in the scene coordinator now fire a userInteractionHandler on .began, which the view model uses to clear lazyFollowActive — the user gets full manual control the moment they drag. Heliocentric missions bypass framing and call resetToOverview instead. Event labels (TLI, Lunar Flyby, Saturn Arrival, ...) project into the SwiftUI overlay within a ±3% window of the mission duration around their timestamp (clamped to 1–500 h), so at the default auto-speed each label appears for about two seconds of screen time, then fades.
Apollo 11 now launches directly into a tight, Sun-side frame of Earth and the Moon without needing -focus earth. Cassini stays in the full overview so the VVEJGA trajectory is clear. Dragging breaks the follow instantly — no fight between the gesture and the lerp. Thirty-seven unit tests green; build and device install clean.
Two things worth remembering. First, azimuth in SceneKit's coordinate system uses x/z (not x/y) because scene y is ecliptic z — using x/y would place the camera below the ecliptic plane. The exact formula needed a sign flip later on (see Step 21) to land on the correct side of Earth. Second, wiring user-break as userInteractionHandler on .began rather than every .changed delta avoids spamming the view model each frame of a long drag. One call is enough to flip lazyFollowActive off for the rest of the session.
20

Wrapping Up

Let's go!
Over the course of six phases, the iOS app went from a feature-complete orrery to a full mission-replay experience with an ISS satellite. All the work ported faithfully from the companion web app — same orbital mechanics, same moon-distance compression, same mission data, same UI shapes — but expressed in Swift, SceneKit, SwiftUI, and pbxproj-registered resources. Trajectory data comes from a one-shot JSON export so the two projects stay in sync without duplicating code.
Phase 1: foundations (scaling + camera API). Phase 2: mission infrastructure + Apollo 11. Phase 3: all 11 missions via JSON export. Phase 4: mission UI (menu, timeline, telemetry, events). Phase 5: ISS. Phase 6: lazy-follow camera + 3D event labels. Thirty-seven unit tests, clean build on simulator and device, every living document up to date.
The consistent pattern across phases: a bundled JSON source of truth, DTO->domain decoding, pure Swift math that's nonisolated for testing, SceneKit for rendering, SwiftUI overlays for UI. Each phase added one new concept (moon compression constants, the mission manager, the JSON pipeline, the UI layer, procedural geometry, the camera orchestration) on top of what the previous phase had established. Iterate one phase at a time with unit tests first, simulator verification second, and docs refreshed before moving on — that cadence let the whole port land without any phase blocking the next.
21

Post-Port Polish — Niggles, Sign Flips, Rocket Icons

Just a couple of niggles so far: 1. The planet presets are zoomed quite a long way out. 2. The mission icon should ideally be a rocket not a plane. 3. The missions don't start with the camera angled so the target planet is lit well by the sun, unlike in the web version. 4. The trajectory for Artemis doesn't go beyond the moon and round the back. Other than that, looks great!
Four fixes landed in one pass: tighter planet presets, a rocket icon for the missions menu, a sign-flipped Sun-side camera azimuth so mission targets launch well-lit, and an Artemis II trajectory that now continues around the far side of the Moon. Each was small individually, but each unlocked a clearly better feel.
Each fix verified in the simulator before device install: Jupiter frames tight with its moons in shot and the correct face lit; the rocket icon sits happily in the toolbar; Artemis II's arc now loops cleanly around the Moon and back. Two new CLAUDE.md gotchas captured (azimuth sign flip, trajectory depth bypass) so the same bugs can't come back silently.
The sign-flip bug is worth internalising. Spherical coordinates around a non-origin target feel natural (“azimuth 0 means look along positive Z from the target”), but the moment you want the camera to look toward the scene origin, the direction from target to origin is −target / |target|, not +target / |target|. Easy to get wrong, easy to miss when the simulator happens to show the target at a time when it was Sun-side of Earth anyway. The fix is an unconditional sign flip, applied once in each of the two focus paths.
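The sign flip in miniature (a sketch with illustrative names; the azimuth convention follows the scene's x/z plane described in Step 19):

```swift
import Foundation
import simd

// To look from `target` toward the scene origin (the Sun), the view direction
// is -normalize(target) — the negation is the whole bug fix.
func sunwardAzimuth(for target: SIMD3<Float>) -> Float {
    let toSun = -simd_normalize(target)   // target → origin, NOT origin → target
    return atan2(toSun.x, toSun.z)        // azimuth in the scene's x/z plane
}
```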
22

Event Banner to the Top

Let's move the mission highlights popup to the top center of the screen, so it doesn't hide the mission in progress.
Pulled the MissionEventBannerView out of the bottom control stack and gave it its own top-centre overlay VStack in ContentView, with an 88pt top spacer so it sits just below the date/info bar. Card width bumped from 300 to 320pt and horizontal alignment switched from leading to centred (text inside remains leading-aligned so it stays readable). Transition flipped from move(edge: .bottom) to move(edge: .top) so the banner now slides down into view from above, matching its new position.
Banner no longer obscures the spacecraft marker or Moon during events — TLI, Lunar Flyby, LOI all fire up top while the mission continues visibly below. Kept as a free-floating ZStack overlay rather than part of the main VStack so it doesn't push telemetry, timeline, or the toolbar around when it appears and disappears.
Two composition details: (1) the spacer at the top of the banner's VStack is a fixed-height Color.clear, not a Spacer(), because Spacer() would expand to fill available vertical space and the banner would drift to the bottom. A fixed clear rectangle reserves exactly the room the info panel needs, no more. (2) The inner VStack(alignment: .leading) text stays leading-aligned while the outer card uses default (centre) horizontal alignment in its parent row — two independent alignment decisions at two layers of the hierarchy.
23

Native macOS from the Same Source

Could this project — the iOS one — easily port to native macOS too? Without changing much code?
Audited the iOS code and found about 50 UIKit-typed references across five files — UIColor, UIImage, UIGraphicsImageRenderer, UIViewRepresentable, and the gesture recognisers. The estimate came back as "half a day", so I took a swing at it across six phases of platform abstraction.
34 unit tests green on both iOS Simulator and macOS. Build succeeds for platform=iOS and platform=macOS from the same target, same sources. Launch arguments — -mission apollo11, -focus jupiter, -timeScale 5000 — work identically on both via xcrun simctl launch … -- -mission apollo11 on iOS and open -n SolarSystem.app --args -mission apollo11 on Mac. Standard macOS navigation: left-drag to pan, right-drag to orbit, scroll / pinch to zoom, click to select, double-click to reset.
Three takeaways for porting SwiftUI + SceneKit iOS apps to macOS. First, the typealias layer is your friend — once PlatformColor / PlatformImage / etc. are in place, most of the search-and-replace is zero-risk. Second, SCNVector3's component type (Float vs CGFloat) is the single most common per-platform error; build a Double-based helper early and you'll dodge dozens of small breakages. Third, the macOS Swift type-checker has shorter timeouts for complex expressions than iOS — heavyweight closures that compile on iOS may time out on macOS and need extracting into named methods. All of these are one-time costs; once done, the same source tree builds cleanly forever.
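The typealias layer can be as small as this (a sketch of the pattern, not the project's exact file; the `init(dx:dy:dz:)` helper name is an assumption):

```swift
import SceneKit

#if canImport(UIKit)
import UIKit
typealias PlatformColor = UIColor
typealias PlatformImage = UIImage
#elseif canImport(AppKit)
import AppKit
typealias PlatformColor = NSColor
typealias PlatformImage = NSImage
#endif

// SCNVector3 components are Float on iOS but CGFloat on macOS; routing all
// construction through a Double-based helper removes the per-platform casts.
extension SCNVector3 {
    init(dx: Double, dy: Double, dz: Double) {
        #if os(macOS)
        self.init(x: CGFloat(dx), y: CGFloat(dy), z: CGFloat(dz))
        #else
        self.init(x: Float(dx), y: Float(dy), z: Float(dz))
        #endif
    }
}
```

Once this file exists, the remaining UIColor/UIImage references become a mechanical rename.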
24

Hunting a macOS Stutter with Instrumentation

Labels are better! Still stuttering once per second. Why not add some trace with timestamps and check for yourself?
Good instinct — guessing at performance bugs is usually slower than measuring. Added a -frameLog launch argument that prints any frame whose tick gap exceeds 20 ms (a dropped frame at 60 Hz) or whose work exceeds 5 ms, plus a once-per-second summary of fps and worst times. First run showed the baseline was a healthy 60 fps / 3 ms worst-work, punctuated by spikes logged as pos=265ms ui=yes — updatePositions was the culprit, specifically on UI-update frames. Split that phase into five sub-timers (bodies, stars, deconflict, mission-update, mission-UI) and ran again: every stutter was entirely in the bodies phase, with everything else sub-millisecond.

Drilled one level deeper by wrapping individual projectLabel calls, and the output was damning — slow projectLabel Earth (16.6ms), slow projectLabel ISS (16.5ms), repeating. 16.5 ms is exactly one 60 Hz display frame. SCNView.projectPoint was blocking the main thread waiting for a render-thread sync on every single call. Twenty-six bodies × 16.5 ms per call ≈ 430 ms of stalls crammed into every UI-update frame.

Replaced the projectPoint calls with manual matrix projection: cache projection × inverse(camera.simdWorldTransform) once per frame as a simd_float4x4, and project each body with a pure-SIMD multiply plus perspective divide plus NDC-to-screen mapping. About 100 ns per label, sub-millisecond for all twenty-six combined, and the output is already in SwiftUI top-left-origin coords so no per-platform Y flip is needed.
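The replacement projection, sketched in simplified form (names and signatures are illustrative; the real code caches the combined matrix once per frame):

```swift
import simd

/// Project a world-space point to SwiftUI's top-left-origin screen coords.
/// `viewProjection` is projection * inverse(camera.simdWorldTransform),
/// computed once per frame. Pure SIMD: no render-thread sync, ~100 ns/call.
func projectToScreen(_ world: SIMD3<Float>,
                     viewProjection: simd_float4x4,
                     viewportSize: SIMD2<Float>) -> SIMD2<Float>? {
    let clip = viewProjection * SIMD4<Float>(world.x, world.y, world.z, 1)
    guard clip.w > 0 else { return nil }   // behind the camera
    let ndcX = clip.x / clip.w             // perspective divide to -1...1
    let ndcY = clip.y / clip.w
    // NDC to screen points; Y inverts here because NDC is bottom-up while
    // SwiftUI's origin is top-left, so no per-platform flip is needed later.
    return SIMD2<Float>((ndcX + 1) * 0.5 * viewportSize.x,
                        (1 - ndcY) * 0.5 * viewportSize.y)
}
```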
Before: fps~46 worst-work=266ms. After: fps~60 worst-work=2ms. Glassy smooth at 10,000× playback. Both platforms still pass all 34 unit tests. The -frameLog flag stays in the codebase so the same diagnostic path can pick apart any future regression in one command. Nice side effect: the iOS label projection got the same speedup even though it wasn't stuttering visibly.
Four reusable lessons from the session. First, cheap instrumentation pays off quickly — the frame logger was thirty lines and located the bug in two iterations. Second, print healthy baselines alongside anomalies — the once-per-second summary line made it obvious that normal frames were 3 ms, so a 265 ms frame was 80× off, not 2× off. Third, exactly-one-display-frame timings are a signature — when you see durations matching 16.7 ms / 33.3 ms / 50.0 ms at 60 Hz, you're usually looking at render-thread sync, not CPU work. Fourth, bypass framework conveniences you can reproduce cheaply — projectPoint is opaque; a ten-line SIMD matrix multiply does the same job, is provably synchronous, and happens to also be about a hundred times faster. Whenever the framework method is called in a hot loop and does more work than arithmetic, that's a prompt to consider replacing it.
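A frame logger in that spirit really is about thirty lines; a sketch (thresholds from the text, class and method names assumed rather than the project's actual -frameLog code):

```swift
import QuartzCore

/// Minimal per-frame logger: flags dropped frames and heavy work,
/// and prints a once-per-second healthy-baseline summary.
final class FrameLogger {
    private var lastTick = CACurrentMediaTime()
    private var windowStart = CACurrentMediaTime()
    private var frames = 0
    private var worstWork = 0.0

    /// Call once per render tick with the seconds spent working this frame.
    func tick(workSeconds: Double) {
        let now = CACurrentMediaTime()
        let gap = now - lastTick
        lastTick = now
        frames += 1
        worstWork = max(worstWork, workSeconds)
        // 20 ms gap = a dropped frame at 60 Hz; 5 ms work = suspiciously heavy.
        if gap > 0.020 || workSeconds > 0.005 {
            print(String(format: "slow frame: gap=%.1fms work=%.1fms",
                         gap * 1000, workSeconds * 1000))
        }
        // Printing the healthy baseline makes anomalies obviously anomalous.
        if now - windowStart >= 1.0 {
            print(String(format: "fps~%d worst-work=%.1fms",
                         frames, worstWork * 1000))
            frames = 0; worstWork = 0; windowStart = now
        }
    }
}
```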
25

Labels That Stick — Building the Projection Matrix Ourselves

Labels still seem to have the label on top of the planet — see running app now. Actually tracks much better but is always overlaid on the planet so very hard to read. Would be great to take into account the radius of the object on-screen and offset by that (plus a bit).
Two follow-up issues from the projectPoint-bypass fix, both traced back to the projection matrix itself. First, labels drifted horizontally on macOS while vertical tracking was fine. Single-axis bugs point straight at a single matrix term — the [0][0] aspect component — so the fix was to stop reading camera.projectionTransform (which on macOS returns a matrix whose aspect doesn't track the live viewport) and build the matrix ourselves from camera.fieldOfView + view.bounds each frame. Hit one gotcha on the way: SCNCameraProjectionDirection only has .horizontal and .vertical cases, no .automatic despite the convention suggesting one. macOS defaults to horizontal, iOS to vertical; the two use mirror-image xScale / yScale formulae.
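Building the matrix from the field of view and live bounds looks roughly like this (a sketch; the near/far plane terms use the standard OpenGL-style perspective formula, which is an assumption about the project's convention):

```swift
import SceneKit
import simd

/// Perspective projection built from the camera's live parameters, so the
/// aspect term always tracks the current viewport. xScale / yScale mirror
/// between the two projection directions (macOS defaults to .horizontal,
/// iOS to .vertical; there is no .automatic case).
func makeProjection(camera: SCNCamera, viewport: CGSize) -> simd_float4x4 {
    let aspect = Float(viewport.width / viewport.height)
    let fov = Float(camera.fieldOfView) * .pi / 180
    let near = Float(camera.zNear), far = Float(camera.zFar)
    let xScale: Float, yScale: Float
    switch camera.projectionDirection {
    case .vertical:            // iOS default: fov spans the viewport height
        yScale = 1 / tan(fov / 2)
        xScale = yScale / aspect
    default:                   // .horizontal, the macOS default
        xScale = 1 / tan(fov / 2)
        yScale = xScale * aspect
    }
    return simd_float4x4(columns: (
        SIMD4<Float>(xScale, 0, 0, 0),
        SIMD4<Float>(0, yScale, 0, 0),
        SIMD4<Float>(0, 0, -(far + near) / (far - near), -1),
        SIMD4<Float>(0, 0, -2 * far * near / (far - near), 0)
    ))
}
```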
Second issue: even after the horizontal drift was fixed, labels still sat directly on top of the planets because the overlay had a fixed -16 pt vertical offset. Jupiter and the Sun at close zoom span hundreds of points — sixteen is nowhere near enough. Solution: offset each label by the body's actual on-screen radius plus a small margin, derived from the same cached projection matrix. A first attempt used an empirical r / clip.w * 300 formula (borrowed from the pre-existing star-occlusion code) but that undershot on widescreen Mac windows by 3–4×. Replaced with the correct worldRadius * pixelsPerUnit / clip.w, where pixelsPerUnit = yScale * (viewportHeight / 2) comes directly from the projection matrix. Now labels float just above every body at every zoom, from Mercury at overview to the Sun filling the screen.
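The corrected offset derivation as a sketch (the 8 pt floor comes from the text; the function name and the 6 pt margin default are assumptions):

```swift
import simd

/// On-screen radius in points for a body, derived from the same projection
/// matrix used to place its label, so it matches what SceneKit renders.
/// `yScale` is the [1][1] term of the projection matrix; `clipW` is the w
/// component of the body's clip-space position.
func labelOffset(worldRadius: Float,
                 clipW: Float,
                 yScale: Float,
                 viewportHeight: Float,
                 margin: Float = 6) -> Float {
    let pixelsPerUnit = yScale * (viewportHeight / 2)
    let screenRadius = worldRadius * pixelsPerUnit / clipW
    // 8 pt floor: stars and the procedural ISS model have no SCNSphere
    // geometry, so their nominal radius is zero, but they still need a gap.
    return max(8, screenRadius) + margin
}
```

Because pixelsPerUnit falls straight out of the projection matrix, the same code is correct at any window size on either platform.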
Labels stick to their bodies horizontally and vertically across the full zoom range. Both platforms pass all 34 unit tests unchanged. The 8 pt floor handles stars and the procedural ISS model (no SCNSphere geometry, so nominal zero radius) — they still get a readable gap.
Two takeaways. First, single-axis visual bugs are matrix-term bugs — when only the horizontal or only the vertical is off, go straight to the projection matrix and inspect its [0][0] / [1][1] values rather than fiddling with empirical constants. Second, derive visual constants from the matrix you're rendering with. The 300 in r / clip.w * 300 felt plausible on iOS but was just an eyeball fit for one particular viewport. The correct yScale * height / 2 is the same factor SceneKit uses internally to size the sphere on screen, so it's guaranteed to match the rendering — no fudging per platform or per window size.