So what's the 12-bit setting got to do with this?
That setting controls the output bit depth over the HDMI 1.3 link. Unfortunately it has nothing to do with the color depth of the video itself, or with what the hardware can decode; it only tells you the maximum the player can send to the display. Look at this:
HDMI color depth:
- HDMI 1.0 supports 8-bit (RGB or YCbCr) color depths.
- HDMI 1.2 supports 8-bit (RGB or YCbCr) color depths.
- HDMI 1.3 supports 8-bit, 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths.
Note: HDMI Licensing previously used an alternative naming scheme that referred to 10-bit, 12-bit and 16-bit color as 30-bit, 36-bit and 48-bit color, reflecting the combined bit depth of all three channels (RGB or YCbCr). 16-bit support is optional in HDMI 1.3.
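
If the two naming schemes are confusing, the arithmetic is simple: the combined figure is just the per-channel depth times the three channels, and each extra bit per channel multiplies the number of shades. A quick sketch in Python:

```python
# Per-channel depth -> combined ("Deep Color") naming and color counts.
for per_channel in (8, 10, 12, 16):
    combined = per_channel * 3     # R + G + B (or Y + Cb + Cr)
    shades = 2 ** per_channel      # levels per channel
    colors = shades ** 3           # total representable colors
    print(f"{per_channel}-bit/channel = {combined}-bit combined: "
          f"{shades} levels per channel, {colors:,} colors")
```

Which is why 8-bit already gives the familiar 16.7 million colors, and 10-bit jumps to over a billion.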
So the WD Live can output up to 12-bit over HDMI 1.3, but the question is whether it can decode 10-bit H.264 (Hi10P). I think it can, even if it has to sacrifice resolution or fps, as it does with 3D video.
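
By the way, if you want to check whether a file really is 10-bit H.264 before throwing it at the box, something like this sketch works, assuming ffprobe (from FFmpeg) is installed; the file name is just a placeholder. Hi10P files report the "High 10" profile and a 10-bit pixel format such as yuv420p10le:

```python
import json
import subprocess

def video_stream_info(path):
    """Ask ffprobe for the first video stream's codec, profile and pixel format."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    # 10-bit pixel formats carry "10" in their name (e.g. yuv420p10le).
    ten_bit = "10" in stream.get("pix_fmt", "")
    return stream["codec_name"], stream.get("profile"), ten_bit

print(video_stream_info("sample.mkv"))  # "sample.mkv" is a placeholder
```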