pvm/docs/plans/2026-02-08-display-system-design.md
Mikkel Georgsen f5df3e1182 Add display system design doc (Orange Pi 5 + ROCK 4D signage)
Documents the architecture for PVM's managed digital signage system:
Orange Pi 5 as leaf node (local backend, screen management API, video
pipeline), Radxa ROCK 4D as thin-client display nodes (kiosk Chromium,
CEC control, mDNS auto-discovery). API-first design for web and tablet
management clients. libsql on leaf, PostgreSQL on core.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 14:50:04 +01:00


Display System Design

Overview

PVM venues need to push content (tournament clocks, waitlists, announcements) to multiple screens. This design covers the hardware, software, and architecture for a managed digital signage system built on cheap ARM SBCs.

System Topology

                    ┌─────────────┐
                    │  Core Server│  (Hetzner)
                    │  PVM Backend│
                    └──────┬──────┘
                           │ VPN tunnel (WireGuard)
                           │ - data sync (messaging pipeline)
                           │ - remote management (SSH, updates)
                           │ - content push (templates, media)
                           │ - remote screen management
                           │
                    ┌──────┴──────┐
                    │ Orange Pi 5 │  (1 per venue, ~€65)
                    │ "Leaf Node" │
                    │             │
                    │ • PVM local backend
                    │ • Screen management API
                    │ • Video generation pipeline
                    │ • Display node orchestrator
                    │ • mDNS discovery service
                    │ • Local web server (display content)
                    └──────┬──────┘
                           │ Local network (LAN/WiFi)
                           │ mDNS auto-discovery
                           │ WebSocket for live updates
                           │
              ┌────────────┼────────────┐
              │            │            │
        ┌─────┴─────┐ ┌───┴───┐ ┌─────┴─────┐
        │ ROCK 4D   │ │ROCK 4D│ │ ROCK 4D   │  (~€34 each)
        │ Screen 1  │ │Scrn 2 │ │ Screen N  │
        │ Kiosk     │ │Kiosk  │ │ Kiosk     │
        └───────────┘ └───────┘ └───────────┘

Key Principles

  • Orange Pi 5 is fully autonomous — venue operates with no internet.
  • Display nodes are stateless thin clients — boot, discover, display.
  • All display content served locally — no cloud dependency for screens.
  • CEC gives power/input/volume control over connected TVs.
  • Conflict resolution: leaf wins, timestamps for ordering.

Hardware

Orange Pi 5 (Leaf Node)

  • SoC: Rockchip RK3588S
  • CPU: 4x Cortex-A76 @ 2.4GHz + 4x Cortex-A55 @ 1.8GHz
  • GPU: Mali-G610 MP4
  • RAM: 4GB (minimum, 8GB recommended)
  • HDMI: 2.1, 8K@60Hz, CEC supported
  • Video decode: H.265/VP9 8K@60fps, AV1 4K@60fps, H.264 8K@30fps
  • Video encode: H.265/H.264 8K@30fps, 16x 1080p@30fps parallel
  • Price: ~€65
  • Role: Runs all venue-local services. Optional video generation pipeline via ffmpeg-rockchip (hardware-accelerated encode).

Radxa ROCK 4D (Display Node)

  • SoC: Rockchip RK3576
  • CPU: 4x Cortex-A72 @ 2.2GHz + 4x Cortex-A53 @ 1.8GHz
  • GPU: Mali-G52 MC3
  • RAM: 4GB
  • HDMI: 2.1, 4K@60Hz, CEC supported
  • Video decode: H.265/VP9/AV1 8K@30fps / 4K@120fps, H.264 4K@60fps
  • Video encode: H.265/H.264 4K@60fps (not used for display, but available)
  • Price: ~€34
  • Role: Thin client. Boots into kiosk Chromium, renders assigned URL at 4K@60Hz.

Bill of Materials (Per Venue)

| Item | Unit Cost | Qty | Notes |
|------|-----------|-----|-------|
| Orange Pi 5 (4GB) | ~€65 | 1 | Leaf node |
| Radxa ROCK 4D (4GB) | ~€34 | N | One per screen |
| SD cards / eMMC | ~€5-10 | N+1 | Storage per device |
| Power supplies | ~€5-8 | N+1 | USB-C PSU per device |

Minimum venue cost: ~€110 (1 leaf + 1 screen). Typical small venue (4 screens): ~€250. Large venue (12 screens): ~€550.
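
These totals follow from a simple model. A sketch in Rust, using the low-end unit prices from the bill of materials (actual street prices vary):

```rust
// Rough per-venue hardware cost model. Prices are the low-end figures
// from the bill of materials above; real street prices vary.
const LEAF_COST: u32 = 65;    // Orange Pi 5 (4GB)
const DISPLAY_COST: u32 = 34; // Radxa ROCK 4D (4GB)
const STORAGE_COST: u32 = 5;  // SD card / eMMC, low end
const PSU_COST: u32 = 5;      // USB-C power supply, low end

/// Total hardware cost for one leaf node plus `screens` display nodes.
fn venue_cost(screens: u32) -> u32 {
    let devices = screens + 1; // leaf + displays each need storage and power
    LEAF_COST + DISPLAY_COST * screens + (STORAGE_COST + PSU_COST) * devices
}

fn main() {
    println!("1 screen:   ~€{}", venue_cost(1));  // ~€119
    println!("4 screens:  ~€{}", venue_cost(4));  // ~€251
    println!("12 screens: ~€{}", venue_cost(12)); // ~€603
}
```

With low-end pricing, four screens come out near the ~€250 small-venue figure; larger venues land somewhat above the rough estimate once per-device storage and power are counted.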

Display Node Lifecycle

Boot Sequence

  1. Power on, custom Linux image boots (Armbian BSP, ~5-8s).
  2. Loads last provisioned background from local storage (immediate visual, no blank screen).
  3. Agent daemon starts, announces via mDNS (_pvm-display._tcp).
  4. Broadcasts: device ID (persistent, generated on first boot), MAC, IP, display capabilities.
  5. Waits for assignment from the Orange Pi 5.
  6. Once assigned, kiosk Chromium opens the assigned URL: http://leaf.local:3000/display/{screen-id}/{stream-slug}.
  7. WebSocket connection established for live updates + heartbeat.
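
The announcement in steps 3-4 can be sketched as follows. The TXT-record field names (`device_id`, `mac`, `ip`, `res`) are illustrative assumptions, and a real node would publish them through an mDNS library rather than by hand:

```rust
// Sketch of the mDNS announcement a display node broadcasts on boot.
// Field names are illustrative, not a fixed wire format.
struct Announcement {
    service: &'static str, // the service type the leaf listens for
    device_id: String,     // persistent, generated on first boot
    mac: String,
    ip: String,
    width: u32,
    height: u32,
}

impl Announcement {
    /// Render the announcement as mDNS TXT-record-style key=value pairs.
    fn txt_records(&self) -> Vec<String> {
        vec![
            format!("device_id={}", self.device_id),
            format!("mac={}", self.mac),
            format!("ip={}", self.ip),
            format!("res={}x{}", self.width, self.height),
        ]
    }
}

fn main() {
    let a = Announcement {
        service: "_pvm-display._tcp",
        device_id: "d4f1c9".into(),
        mac: "aa:bb:cc:dd:ee:ff".into(),
        ip: "192.168.1.42".into(),
        width: 3840,
        height: 2160,
    };
    println!("{} -> {:?}", a.service, a.txt_records());
}
```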

Steady State

  • Renders assigned webpage at 4K@60Hz.
  • WebSocket receives live data pushes (clock ticks, table updates, announcements).
  • Heartbeat ping every 5s back to leaf node (health monitoring).
  • CEC daemon listens for commands (power, volume, input switching).

Reassignment

Staff drag a screen onto a new stream in the web UI. The leaf node sends a WebSocket message to the display node: { type: "navigate", url: "/display/{screen-id}/{new-stream}" }. Chromium navigates to the new URL: an instant switch, no reboot.
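
The reassignment push reduces to plain message construction. A sketch (the { type, url } shape comes from the design above; the helper name is hypothetical):

```rust
// Build the reassignment message the leaf pushes over the WebSocket.
fn navigate_message(screen_id: &str, stream_slug: &str) -> String {
    format!(
        r#"{{"type":"navigate","url":"/display/{}/{}"}}"#,
        screen_id, stream_slug
    )
}

fn main() {
    let msg = navigate_message("screen-3", "tournament-clock");
    println!("{msg}");
    // {"type":"navigate","url":"/display/screen-3/tournament-clock"}
}
```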

Failure Modes

  • Leaf node unreachable: display continues showing last content (static). WebSocket reconnects with exponential backoff.
  • Display node reboots: boots to last background, re-announces via mDNS, leaf re-assigns automatically.
  • Network loss: same as leaf unreachable — last content persists.
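
The reconnect behaviour on leaf loss can be sketched as capped exponential backoff; the 500 ms base and 30 s cap are illustrative values, not part of the design:

```rust
use std::time::Duration;

// Reconnect backoff for a display node that has lost its leaf.
// Doubling with a cap; base and cap values are illustrative.
fn backoff_delay(attempt: u32) -> Duration {
    let base_ms: u64 = 500;
    let cap_ms: u64 = 30_000;
    let delay = base_ms.saturating_mul(1u64 << attempt.min(16));
    Duration::from_millis(delay.min(cap_ms))
}

fn main() {
    for attempt in 0..8 {
        println!("attempt {attempt}: wait {:?}", backoff_delay(attempt));
    }
    // 500ms, 1s, 2s, 4s, 8s, 16s, then capped at 30s
}
```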

Power Management

Default: Always-On

  • DPMS disabled at boot (xset -dpms; xset s off; xset s noblank).
  • Screen blanking disabled in Chromium flags (--disable-screen-saver).
  • Kernel: consoleblank=0 boot param.
  • No screensaver, no idle timeout — screen stays on 24/7 unless the management UI says otherwise.

Managed Power Schedules (via Web UI)

  • Per-screen or per-group schedules (e.g., "Screen 1: on at 17:00, standby at 03:00").
  • Uses CEC power:standby / power:on to control the TV.
  • The ROCK 4D stays powered on (agent daemon running) even when the TV is in standby — ready to wake it instantly.
  • Override button in web UI: "Wake all screens now" / "Sleep all screens now".

CEC Power State Tracking

  • Agent polls TV power state periodically (every 30s).
  • If TV is manually turned off by someone with a remote, leaf node knows and flags it in the UI ("Screen 3: TV powered off manually").
  • Optional: auto-wake if TV is turned off outside scheduled sleep window.
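
The poll reaction above reduces to a small decision function (enum and parameter names are illustrative; only the sleep-window and auto-wake concepts come from the design):

```rust
// Decide how to react to a polled CEC power state.
#[derive(Debug, PartialEq)]
enum PowerAction {
    None,
    FlagManualOff, // surface "TV powered off manually" in the UI
    AutoWake,      // optional: turn the TV back on
}

fn on_poll(tv_is_on: bool, in_sleep_window: bool, auto_wake: bool) -> PowerAction {
    match (tv_is_on, in_sleep_window) {
        (true, _) => PowerAction::None,     // TV on: nothing to do
        (false, true) => PowerAction::None, // off during scheduled sleep: expected
        (false, false) if auto_wake => PowerAction::AutoWake,
        (false, false) => PowerAction::FlagManualOff,
    }
}

fn main() {
    assert_eq!(on_poll(false, false, false), PowerAction::FlagManualOff);
    assert_eq!(on_poll(false, false, true), PowerAction::AutoWake);
    assert_eq!(on_poll(false, true, false), PowerAction::None);
    println!("ok");
}
```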

CEC Commands

Sent from the leaf node to display nodes via their agent daemon HTTP API:

  • power:on / power:off / power:standby
  • volume:set:{level} / volume:mute / volume:unmute
  • input:switch (force TV to the ROCK 4D's HDMI input)
  • status:query (returns TV power state)
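
These command strings parse naturally into a typed form on the agent side. A sketch (the enum is an assumption; only the string grammar above is from the design):

```rust
// Parse the leaf's CEC command strings into a typed command.
#[derive(Debug, PartialEq)]
enum CecCommand {
    Power(String), // on | off | standby
    VolumeSet(u8), // 0-100
    VolumeMute(bool),
    InputSwitch,
    StatusQuery,
}

fn parse(cmd: &str) -> Option<CecCommand> {
    let parts: Vec<&str> = cmd.split(':').collect();
    match parts.as_slice() {
        ["power", s] if ["on", "off", "standby"].contains(s) => {
            Some(CecCommand::Power(s.to_string()))
        }
        ["volume", "set", level] => level
            .parse()
            .ok()
            .filter(|l| *l <= 100)
            .map(CecCommand::VolumeSet),
        ["volume", "mute"] => Some(CecCommand::VolumeMute(true)),
        ["volume", "unmute"] => Some(CecCommand::VolumeMute(false)),
        ["input", "switch"] => Some(CecCommand::InputSwitch),
        ["status", "query"] => Some(CecCommand::StatusQuery),
        _ => None,
    }
}

fn main() {
    assert_eq!(parse("volume:set:40"), Some(CecCommand::VolumeSet(40)));
    assert_eq!(parse("power:standby"), Some(CecCommand::Power("standby".into())));
    assert_eq!(parse("volume:set:200"), None); // out of range
    println!("ok");
}
```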

Stream Model

A stream is a URL. The leaf node serves its own pages (tournament clock, waitlist, etc.) as URLs, and can also point a screen at any external URL.

Each stream has:

  • Name: friendly label (e.g., "Tournament Clock", "Welcome Screen")
  • URL: the content source
  • Default audio level: 0-100, muted by default
  • Priority: for scheduled overrides (e.g., "announcement" stream temporarily takes over all screens)
  • Fallback: URL to show if the primary stream source fails
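
A sketch of the stream record and two rules implied above: the highest-priority stream wins during overrides, and the fallback URL takes over when the primary source fails. Field and function names are illustrative:

```rust
// Stream record mirroring the fields listed above.
struct Stream {
    name: String,
    url: String,
    default_audio: u8, // 0-100, muted by default
    priority: u8,      // higher wins during scheduled overrides
    fallback: Option<String>,
}

/// Highest-priority stream wins (e.g. an announcement override).
fn active_stream(streams: &[Stream]) -> Option<&Stream> {
    streams.iter().max_by_key(|s| s.priority)
}

/// URL to display: the primary, or the fallback if the primary is down.
fn effective_url(stream: &Stream, primary_ok: bool) -> &str {
    if primary_ok {
        &stream.url
    } else {
        stream.fallback.as_deref().unwrap_or(&stream.url)
    }
}

fn main() {
    let clock = Stream {
        name: "Tournament Clock".into(),
        url: "/display/s1/tournament-clock".into(),
        default_audio: 0,
        priority: 1,
        fallback: Some("/display/s1/welcome".into()),
    };
    let announcement = Stream {
        name: "Announcement".into(),
        url: "/display/s1/announcement".into(),
        default_audio: 50,
        priority: 10,
        fallback: None,
    };
    let streams = vec![clock, announcement];
    assert_eq!(active_stream(&streams).unwrap().name, "Announcement");
    assert_eq!(effective_url(&streams[0], false), "/display/s1/welcome");
    println!("ok");
}
```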

Future upgrade: stream a native app's output by capturing the app framebuffer on the Orange Pi 5, encoding via ffmpeg-rockchip to HLS, and displaying in the browser on display nodes.

Screen Management API

The leaf node exposes a RESTful API. All management features are API-first. The web UI and future tablet app are consumers of this API.

Auth: Bearer token (PVM session token).

Screens (Display Nodes)

GET    /api/v1/screens                    # list all discovered screens
GET    /api/v1/screens/:id                # screen details + status
PATCH  /api/v1/screens/:id                # rename, assign group, set default stream
DELETE /api/v1/screens/:id                # forget/unpair a screen

Screen Control

POST   /api/v1/screens/:id/assign        # { streamUrl }
POST   /api/v1/screens/:id/power         # { action: on|standby|off }
POST   /api/v1/screens/:id/volume        # { level: 0-100, mute: bool }
POST   /api/v1/screens/:id/navigate      # { url } — raw URL override
POST   /api/v1/screens/:id/reload        # force browser refresh

Bulk Operations

POST   /api/v1/screens/bulk/assign       # { screenIds[], streamUrl }
POST   /api/v1/screens/bulk/power        # { screenIds[], action }
POST   /api/v1/screens/bulk/volume       # { screenIds[], level, mute }

Streams (Saved URL Presets)

GET    /api/v1/streams                    # list all configured streams
POST   /api/v1/streams                   # { name, url, defaultAudio }
PATCH  /api/v1/streams/:id               # update
DELETE /api/v1/streams/:id               # remove

Groups (Logical Screen Grouping)

GET    /api/v1/groups                     # list groups
POST   /api/v1/groups                    # { name }
PATCH  /api/v1/groups/:id                # rename, set default stream
POST   /api/v1/groups/:id/assign         # assign stream to all screens in group
POST   /api/v1/groups/:id/power          # power control all screens in group

Schedules

GET    /api/v1/screens/:id/schedules     # list schedules for a screen
POST   /api/v1/screens/:id/schedules     # { streamUrl, cron, priority }
DELETE /api/v1/screens/:id/schedules/:sid

GET    /api/v1/groups/:id/schedules      # list schedules for a group
POST   /api/v1/groups/:id/schedules      # { streamUrl, cron, priority }
DELETE /api/v1/groups/:id/schedules/:sid
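
The cron field in { streamUrl, cron, priority } suggests a matcher like the following, a deliberate simplification that supports only `*` and plain numbers per field (no ranges or steps):

```rust
// Minimal cron-field matcher for schedule entries.
fn field_matches(field: &str, value: u32) -> bool {
    field == "*" || field.parse::<u32>().map_or(false, |v| v == value)
}

/// Does a 5-field cron expression (min hour dom mon dow) fire at this time?
fn cron_fires(expr: &str, min: u32, hour: u32, dom: u32, mon: u32, dow: u32) -> bool {
    let fields: Vec<&str> = expr.split_whitespace().collect();
    if fields.len() != 5 {
        return false;
    }
    let values = [min, hour, dom, mon, dow];
    fields.iter().zip(values).all(|(f, v)| field_matches(f, v))
}

fn main() {
    // "Screen 1: on at 17:00" -> cron "0 17 * * *"
    assert!(cron_fires("0 17 * * *", 0, 17, 8, 2, 0));
    assert!(!cron_fires("0 17 * * *", 30, 17, 8, 2, 0));
    // Standby at 03:00 only on Mondays (dow 1)
    assert!(cron_fires("0 3 * * 1", 0, 3, 9, 2, 1));
    println!("ok");
}
```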

WebSocket Events

WS     /api/v1/events

Pushes real-time state changes to connected management clients:

  • screen:discovered — new device on network
  • screen:online / screen:offline — heartbeat lost/restored
  • screen:tv_power_changed — CEC detected TV state change
  • screen:assigned — stream changed (from another client)

Display Node Agent API

HTTP server running on each ROCK 4D, called by the leaf node:

POST /navigate     { url }           — switch displayed URL
POST /power        { action }        — CEC power command to TV
POST /volume       { level, mute }   — CEC volume command
POST /reload                         — force browser refresh
GET  /status                         — device health, TV power state, current URL
POST /update       { imageUrl }      — trigger OTA firmware update
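
On the agent side this API reduces to a small method/path dispatch. A sketch (the route enum is illustrative; a real agent would sit behind an actual HTTP server):

```rust
// Method/path dispatch for the agent's HTTP API.
#[derive(Debug, PartialEq)]
enum AgentRoute {
    Navigate,
    Power,
    Volume,
    Reload,
    Status,
    Update,
}

fn route(method: &str, path: &str) -> Option<AgentRoute> {
    match (method, path) {
        ("POST", "/navigate") => Some(AgentRoute::Navigate),
        ("POST", "/power") => Some(AgentRoute::Power),
        ("POST", "/volume") => Some(AgentRoute::Volume),
        ("POST", "/reload") => Some(AgentRoute::Reload),
        ("GET", "/status") => Some(AgentRoute::Status),
        ("POST", "/update") => Some(AgentRoute::Update),
        _ => None,
    }
}

fn main() {
    assert_eq!(route("GET", "/status"), Some(AgentRoute::Status));
    assert_eq!(route("GET", "/navigate"), None); // navigation must be a POST
    println!("ok");
}
```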

Orange Pi 5 Software Stack

OS: Armbian (Debian-based) with Rockchip BSP kernel (6.1 LTS) — needed for MPP/CEC driver support.

| Service | Role | Tech |
|---------|------|------|
| PVM Backend | Tournament/player data, local DB | Rust (axum), libsql |
| Display API | Screen management REST API + WebSocket events | Rust (axum) |
| Content Server | Serves display pages (tournament clock, waitlist, etc.) | SvelteKit (SSR) |
| Discovery Daemon | mDNS listener for _pvm-display._tcp, registers new screens | Rust, mdns-sd crate |
| Scheduler | Cron-like engine for time-based stream assignments | Rust, in-process |
| CEC Proxy | Sends CEC commands to display nodes via their agent daemon | Rust, HTTP calls to ROCK 4D agents |
| VPN Client | WireGuard tunnel back to core | wg-quick / systemd |
| Sync Agent | Messaging pipeline; pushes data packets to core | Rust |
| Video Pipeline (Optional) | ffmpeg-rockchip for rendering content to HLS | ffmpeg, triggered on demand |

Display Node (ROCK 4D) Software Stack

OS: Armbian BSP (minimal), custom image.

| Service | Role | Tech |
|---------|------|------|
| Agent daemon | mDNS announcement, heartbeat, receives commands from leaf | Rust (cross-compiled aarch64) |
| Chromium kiosk | Fullscreen browser, no UI chrome, renders assigned URL | Chromium, kiosk mode |
| CEC daemon | Controls TV via libcec, receives commands from agent | libcec |
| Power manager | Disables DPMS/blanking, enforces always-on unless told otherwise | systemd + xset |
| OTA updater | Pulls firmware/image updates from leaf node on boot | Rust, in agent daemon |

Data Architecture

┌─────────────────────┐         ┌──────────────────────┐
│     Core (Hetzner)  │         │  Leaf (Orange Pi 5)  │
│                     │  VPN +  │                      │
│  PostgreSQL         │◄────────│  libsql              │
│  • All venues       │  msg    │  • PVM domain data   │
│  • Full history     │ pipeline│  • Screen mgmt state │
│  • Analytics        │         │  • Schedules/groups  │
│                     │         │                      │
└─────────────────────┘         └──────────────────────┘
  • Leaf (libsql): Single database, all local state. PVM domain data (tournaments, players, waitlists) and screen management state (devices, streams, groups, schedules).
  • Core (PostgreSQL): Aggregated data from all venues. Full historical record. Analytics, reporting, billing. Screen management config for remote management.
  • Sync: Leaf to core via existing messaging pipeline. Eventual consistency with causal ordering and leaf-wins conflict resolution.
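
Leaf-wins resolution with timestamp ordering can be sketched as a merge function (the record shape is illustrative; only the leaf-wins rule and timestamp ordering come from the design):

```rust
// Leaf-wins conflict resolution with timestamps for ordering.
#[derive(Clone, Debug, PartialEq)]
struct Record {
    value: String,
    updated_at: u64, // e.g. unix millis; orders non-conflicting writes
    from_leaf: bool,
}

/// Merge two versions of the same record: the leaf's write wins a true
/// leaf-vs-core conflict; otherwise the newer timestamp is kept.
fn merge(a: Record, b: Record) -> Record {
    match (a.from_leaf, b.from_leaf) {
        (true, false) => a,
        (false, true) => b,
        _ => if a.updated_at >= b.updated_at { a } else { b },
    }
}

fn main() {
    let core = Record { value: "core edit".into(), updated_at: 200, from_leaf: false };
    let leaf = Record { value: "leaf edit".into(), updated_at: 100, from_leaf: true };
    // Leaf wins even though its write is older.
    assert_eq!(merge(core, leaf).value, "leaf edit");
    println!("ok");
}
```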

Monetization

Charge a small fee per ROCK 4D display device. The fee covers hardware cost plus margin and funds ongoing development. Be transparent with early customers that it exists to cover initial backend and development costs.

Future Upgrades

  • App streaming: Capture native app framebuffer on Orange Pi 5, encode via ffmpeg-rockchip to HLS, display in browser on ROCK 4D nodes.
  • AI-generated slides: instead of plain text on a black background, 3D animated gold text with subtle fire effects, rendered as a webpage. Venue infotainment that scales with customer size.
  • Tablet management app: Different UI, same API. Drag-and-drop screen management from a tablet on the venue floor.