Why can’t my Disguise VX4+ with a VFC HDMI 2.0 card output 4 K @ 60 Hz 10‑bit 4:4:4?

(…and why the same hard limit applies to a Brompton Tessera SX40 or any other processor that exposes an HDMI 2.0 socket)


1 · The myth that “HDMI 2.0 handles everything”

Question: What if someone promises you True 4K 10-bit 4:4:4 over HDMI 2.0 on your Volume (video wall)?

(Source: Disguise VFC Cards datasheet)

“The spec says 18 Gb/s. Shouldn’t that be plenty for 4 K @ 60 Hz, 10‑bit, RGB 4:4:4?”

The connector is willing; physics is not. HDMI 2.0 tops out at 18 Gb/s of raw TMDS bandwidth, and that figure is before protocol overhead. 8b/10b encoding eats 20% of it, so the real-life HDMI 2.0 payload is about 14.4 Gb/s.

Let’s do the math:

  • Resolution: 3840 × 2160 px
  • Refresh: 60 Hz
  • Bit‑depth: 10 bits per channel
  • Chroma: 4:4:4 (RGB, no subsampling)

Quick math – Active pixels only:
3840 × 2160 × 60 fps × 30 bit ≈ 14.93 Gb/s
Add blanking pixels (a 4400 × 2250 total raster) plus the 10/8 TMDS encoding penalty and we land around 22 Gb/s – well beyond HDMI 2.0’s 18 Gb/s runway.
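The quick math above can be checked in a few lines of Python (a minimal sketch; variable names are mine, the numbers come straight from the 4K60 10‑bit 4:4:4 case):

```python
# Raw active-pixel payload for 4K @ 60 Hz, 10-bit RGB 4:4:4
# (active pixels only: no blanking, no TMDS encoding overhead).
width, height, fps = 3840, 2160, 60
bits_per_pixel = 3 * 10  # three channels x 10 bits each

payload_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Active payload: {payload_gbps:.2f} Gb/s")  # ~14.93 Gb/s
```

Already over the ~14.4 Gb/s that HDMI 2.0 can usefully carry – before we even account for blanking.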

Therefore something has to give:

  • Drop to 8‑bit ➜ 4 K @ 60 8‑bit 4:4:4 fits.
  • Keep 10‑bit but subsample to 4:2:2 or 4:2:0.

(A quick recap on 8b/10b and blanking is coming soon in another post.)
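To see why those two escape routes work, here is a rough comparison of the candidate formats against HDMI 2.0’s ~14.4 Gb/s usable payload (active pixels only – the chroma cost factors 3.0 / 2.0 / 1.5 are the standard per-pixel channel weights for 4:4:4 / 4:2:2 / 4:2:0):

```python
# Compare 4K60 formats against HDMI 2.0's effective payload:
# 18 Gb/s raw TMDS x 8/10 encoding efficiency = 14.4 Gb/s usable.
HDMI20_PAYLOAD_GBPS = 18 * 8 / 10

def payload_gbps(bit_depth, chroma_factor, w=3840, h=2160, fps=60):
    """chroma_factor: 3.0 for 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0."""
    return w * h * fps * bit_depth * chroma_factor / 1e9

formats = {
    "8-bit 4:4:4":  payload_gbps(8, 3.0),
    "10-bit 4:4:4": payload_gbps(10, 3.0),
    "10-bit 4:2:2": payload_gbps(10, 2.0),
    "12-bit 4:2:0": payload_gbps(12, 1.5),
}
for name, gbps in formats.items():
    verdict = "fits" if gbps <= HDMI20_PAYLOAD_GBPS else "too big"
    print(f"{name}: {gbps:5.2f} Gb/s -> {verdict}")
```

Only the 10‑bit 4:4:4 line busts the budget – exactly the two work‑arounds listed above.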


2 · “But my VX4+ is ‘quad‑4K’!”

Yes – but each VFC output is still plain HDMI 2.0. The VX4+ render pipeline could push 12‑bit, yet when the signal hits that socket it must squeeze into 18 Gb/s.

Designer’s EDID generator already tells you:

  • 4 K @ 60 → 8‑bit RGB 4:4:4.
  • 4 K @ 60 → 10‑bit YCbCr 4:2:2 (or 12‑bit 4:2:0).

(Disguise manual r27, page 1902)


3 · Same story on the Brompton SX40

Input limit: a 600 MHz TMDS character (pixel) clock → the HDMI 2.0 ceiling.

  • 4 K @ 60 8‑bit 4:4:4 → 594 MHz ✔
  • 4 K @ 60 10‑bit 4:4:4 → 742 MHz ✘
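Those two pixel-clock figures fall out of the standard CTA‑861 4K60 raster (4400 × 2250 total, blanking included) – deep color raises the TMDS character rate by the bit-depth ratio:

```python
# TMDS character rate for 4K60 with standard CTA-861 blanking,
# checked against the 600 MHz HDMI 2.0 ceiling.
H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60  # total raster incl. blanking
HDMI20_MAX_MHZ = 600

pixel_clock_mhz = H_TOTAL * V_TOTAL * FPS / 1e6   # 594.0 MHz at 8-bit
deep_color_mhz = pixel_clock_mhz * 10 / 8         # 742.5 MHz at 10-bit

for label, mhz in (("8-bit ", pixel_clock_mhz), ("10-bit", deep_color_mhz)):
    ok = "OK" if mhz <= HDMI20_MAX_MHZ else "over the ceiling"
    print(f"{label}: {mhz:.1f} MHz -> {ok}")
```

594 MHz just squeaks under; 742.5 MHz does not – which is all the SX40 is telling you.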

4 · What are my options in Virtual Production?

  • Bit‑depth first → 4 K @ 60 10‑bit 4:2:2. Trade‑off: full HDR, slight chroma softening; most LED walls won’t show it.
  • Perfect chroma → 4 K @ 60 8‑bit 4:4:4. Trade‑off: no subsampling, but fewer grey steps.
  • Have both → 4 K @ 60 10‑bit 4:4:4 over DP 1.4, HDMI 2.1, or dual 12G‑SDI links. Trade‑off: not possible through a VFC HDMI 2.0 card.

5 · Common questions I get on set

  • Can I slice the wall into four quarter‑feeds (each 4 K @ 60 8‑bit) and stitch them? Yep – each feed stays under 18 Gb/s.
  • Will 4:2:2 look bad on camera? Only on razor‑sharp edges or heavy crops. Feature films run 4:2:2 10‑bit all the time.
  • What if I drop to 50 Hz or 30 Hz? 50 Hz doesn’t help – the standard CTA timing widens the raster, so the pixel clock stays at 594 MHz – but at 30 Hz you could do 4 K 10‑bit 4:4:4, at the cost of the 60 p motion feel.
  • Why not HDMI 2.1? Because, today (2025), disguise offers no HDMI 2.1 VFC… yet.
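The refresh-rate question is worth running through the numbers, because it has a trap: in standard CTA‑861 timing, 4K50 uses a wider total raster (5280 × 2250) and ends up at the same 594 MHz clock as 4K60. A sketch (raster totals assumed from the common CTA timings):

```python
# 10-bit 4:4:4 TMDS character rate at different refresh rates.
# (h_total, v_total) are the standard CTA-861 rasters incl. blanking.
timings = {60: (4400, 2250), 50: (5280, 2250), 30: (4400, 2250)}
char_rates = {}
for fps, (h_total, v_total) in timings.items():
    rate = h_total * v_total * fps / 1e6 * 10 / 8  # deep-color penalty
    char_rates[fps] = rate
    verdict = "fits" if rate <= 600 else "over the 600 MHz limit"
    print(f"4K @ {fps} Hz 10-bit 4:4:4 -> {rate:.2f} MHz ({verdict})")
```

Only 30 Hz clears the ceiling – consistent with Designer’s own note that 4K 50/60 over HDMI 2.0 requires subsampling.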

6 · “True 4K” marketing in the wild

NovaStar’s H‑Series datasheet claims:

True 4K — 4K × 2K @ 60 Hz, RGB 4:4:4, 10‑bit.

Read the fine print:
They rely on dual HDMI 2.0 cables (each carrying a 2 K‑wide half of the image) or a DP 1.2 input using CVT‑RB (Coordinated Video Timing – Reduced Blanking). The headline is true, but only when you split the payload or use RB timing to trim blanking pixels. One cable alone won’t carry that load.


7 · Designer’s own disclaimer

“When using 4K 50/60 Hz over HDMI 2.0, chroma subsampling 4:2:2 is required. Designer currently produces 4:4:4 EDIDs, so you must change the setting in the GPU control panel.”
(Designer r27, page 1902)

Translation: HDMI 2.0 can’t push uncompressed 4 K @ 60 Hz 10‑bit 4:4:4. They’re not hiding anything; we just need to read the numbers.


8 · Key takeaways before you sign a quote

  1. Multiply resolution × refresh × bit‑depth × chroma → raw payload.
  2. Check the transport limit (HDMI 2.0, DP 1.2, 12G‑SDI…).
  3. Decide: lower bit‑depth, move to 4:2:2, or pick a fatter pipe.
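The three-step checklist can be packed into one helper. The usable-payload figures below are my own approximations (raw link rate × line-coding efficiency: 8b/10b for HDMI 2.0 and DP 1.2, 16b/18b FRL for HDMI 2.1, near-raw for 12G‑SDI) – treat them as ballpark, not gospel:

```python
# Step 1-3 of the checklist: raw payload vs. a transport's usable payload.
# Payload figures (Gb/s) are approximations, not official spec values.
TRANSPORT_PAYLOAD_GBPS = {
    "HDMI 2.0": 14.4,    # 18 Gb/s x 8/10 (8b/10b TMDS)
    "DP 1.2":   17.28,   # 21.6 Gb/s x 8/10 (8b/10b)
    "12G-SDI":  11.88,   # ~line rate, minimal coding overhead
    "HDMI 2.1": 42.67,   # 48 Gb/s x 16/18 (FRL)
}

def fits(transport, w, h, fps, bit_depth, chroma_factor):
    """chroma_factor: 3.0 (4:4:4), 2.0 (4:2:2), 1.5 (4:2:0). Active pixels only."""
    need = w * h * fps * bit_depth * chroma_factor / 1e9
    return need <= TRANSPORT_PAYLOAD_GBPS[transport], need

ok, need = fits("HDMI 2.0", 3840, 2160, 60, 10, 3.0)
print(f"4K60 10-bit 4:4:4 over HDMI 2.0: {need:.2f} Gb/s -> {'OK' if ok else 'no'}")
```

Run it before signing the quote, not after the camera test.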

Manufacturers aren’t lying; specs need context. If you keep the math handy, you’ll avoid nasty surprises at camera tests.

And above all, the next time someone promises “True 4K 10-bit 4:4:4 over HDMI 2.0,” raise an eyebrow… or two!

(If this post saved you from a scare, share it with your VP team—they’ll thank you for it.)