Sound: yes. Video: no.
The first inbound stream hit the media server on Sunday over SRT. Connection stable, bitrate solid, audio clean — but no video.
When an encoder sends a stream, it ships two things together: the actual media data (audio and video) and a small service description — a manifest listing which tracks are present, where to find them, and how to decode them. The receiver opens this manifest first and takes exactly what it lists — nothing more.
In this stream, the manifest had a bug — specifically on the SRT output path. Audio track listed, video track not. The video data was physically on the wire — all 4.7 Mbit/s of it, every frame — but the manifest said nothing about it. The server read what it was told and delivered audio that played perfectly.
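The failure mode is easy to model. Here is a toy sketch of a receiver that trusts its manifest completely; the manifest and packet shapes are purely illustrative (not any real SRT or MPEG-TS structure), but the logic is the same: a track the manifest doesn't declare is never delivered, no matter how many of its bytes arrive.

```python
# Toy model of the failure: the receiver trusts the manifest,
# so payloads for an unlisted track are simply never surfaced.
# Manifest/packet shapes here are illustrative, not a real protocol.

def demux(manifest: dict, packets: list[dict]) -> dict[str, list[bytes]]:
    """Deliver payloads only for tracks the manifest declares."""
    delivered = {track["id"]: [] for track in manifest["tracks"]}
    for pkt in packets:
        if pkt["track"] in delivered:  # listed -> delivered to the player
            delivered[pkt["track"]].append(pkt["payload"])
        # unlisted -> the bytes are on the wire, but the receiver drops them
    return delivered

# The buggy manifest: only the audio track is declared.
manifest = {"tracks": [{"id": "audio"}]}  # no "video" entry
packets = [
    {"track": "audio", "payload": b"aac-frame"},
    {"track": "video", "payload": b"h264-frame"},  # physically present
]

out = demux(manifest, packets)
print(sorted(out))  # -> ['audio']  (video never surfaces)
```

Perfect audio, zero video: exactly what the server did, and exactly what it was told to do.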
Still investigating the exact cause, but it looks like an encoder-side issue: something about how this device builds its metadata when sending over SRT creates a sender/receiver mismatch.
Switched to RTMP. Video arrived on the first frame.
Next two hours: four gigabytes received, eight lost packets total — solid numbers for live production.
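For scale, those numbers translate into a loss rate. A back-of-the-envelope estimate, assuming SRT's common 1316-byte payload per packet (seven 188-byte TS packets; the actual size depends on encoder configuration):

```python
# Rough loss rate for the two-hour window.
# Assumes SRT's typical 1316-byte payload (7 x 188-byte TS packets);
# the real packet size depends on how the encoder is configured.

received_bytes = 4 * 10**9  # ~four gigabytes
payload_size = 1316         # bytes per SRT packet (assumed)
lost_packets = 8

total_packets = received_bytes // payload_size  # ~3 million packets
loss_pct = 100 * lost_packets / total_packets   # well under 0.001%

print(f"~{total_packets:,} packets, loss ~= {loss_pct:.5f}%")
```

Roughly three million packets with eight lost: on the order of 0.0003% loss, which is why the numbers read as solid for live production.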
There's a class of bugs you can't catch on a test bench, because the test bench always has the right equipment with the right settings. A live environment, like any production environment, is different: a different device, a different config, details you only learn about once the connection happens — and then a new phase starts: you sit down, you figure it out.
This is why a service needs to go live before the first paying client arrives — while there's still time to work through things calmly.
→ Data without correct metadata is invisible to the receiver, regardless of how much bandwidth is flowing (close to 5 Mbit/s in this case)
→ The same hardware can behave differently depending on the transport protocol, and that's only verifiable with a real test
→ The first real push always finds what the synthetic test misses
Let's go.