8-bit or 12-bit?

What is the difference between 8-bit color mode and 12-bit???

I have a Samsung LCD 32LN650!!!

Thanks for your support

The technical difference is that 12bit offers a greater color depth.

In practice, though, it makes no difference at all.  There are no commercial sources for 12-bit material (and I’ve never seen *anyone* able to create 12-bit material); this has been true for many years now and is unlikely to ever change.  Just leave it at 8-bit and don’t worry.

Soheil1223 wrote:

What is the difference between 8-bit color mode and 12-bit???

 

A 12-bit capable decoder has 16 times more shades per primary color (4,096 vs. 256), and roughly 4,000 times more colors overall to work with, bringing the total number of representable colors into the tens of billions.
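For anyone who wants to sanity-check those figures, here’s a minimal Python sketch of the arithmetic (the three-channel RGB assumption is mine, not anything specific to this TV):

```python
# Back-of-the-envelope check of the "16x per primary / ~4,000x overall" claim.

def color_counts(bits_per_channel: int, channels: int = 3):
    """Return (shades per primary, total representable colors) for an RGB-style signal."""
    shades = 2 ** bits_per_channel
    return shades, shades ** channels

shades8, total8 = color_counts(8)     # 256 shades/primary, ~16.7 million colors
shades12, total12 = color_counts(12)  # 4,096 shades/primary, ~68.7 billion colors

print(f"8-bit : {shades8:,} shades per primary, {total8:,} colors total")
print(f"12-bit: {shades12:,} shades per primary, {total12:,} colors total")
print(f"ratio : {shades12 // shades8}x per primary, {total12 // total8:,}x overall")
```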

This is a gross over-simplification, but basically:

When the “Inverse” Discrete Cosine Transform is applied to decompress and “reconstitute” a video stream from its compressed source, there’s much more “precision” (well, 4 bits more) at its disposal per color channel.  This can improve the picture in several ways.  One of the most noticeable is a reduction in the artifact known as “color banding” in images that are very uniform but vary slightly in hue or brightness through a continuous gradient.  Think of a shot of a clear blue sky.  It’s almost ENTIRELY a single color, but there’s enough TINY variation that bands will appear.
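To put a rough number on the banding argument, here’s a minimal sketch that counts how many distinct codes a smooth ramp survives as when quantized to 8 vs. 12 bits; the 0.45 to 0.55 brightness range is just my stand-in for a nearly uniform patch of sky.

```python
# A rough illustration of color banding: a gentle gradient (like a clear sky)
# only spans a narrow slice of the brightness range, so at 8 bits it collapses
# into a handful of visible steps, while 12 bits keeps it much smoother.
# The 0.45..0.55 range and 4096 samples are illustrative assumptions.

def distinct_levels(bits: int, lo: float = 0.45, hi: float = 0.55, samples: int = 4096) -> int:
    """Count the distinct quantized codes a smooth ramp from lo to hi ends up using."""
    max_code = (1 << bits) - 1
    codes = {round((lo + (hi - lo) * i / (samples - 1)) * max_code) for i in range(samples)}
    return len(codes)

print("distinct steps at 8 bits :", distinct_levels(8))   # ~26  -> visible bands
print("distinct steps at 12 bits:", distinct_levels(12))  # ~410 -> effectively smooth
```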

This whole 8-bit vs 12-bit thing is apparently a very religious debate.

Many TVs accept 12-bit streams per channel, but don’t have a panel that can reproduce more than 8 bits.

Many folks argue that banding should be addressed in ENCODING.

I don’t get wrapped up in it… 

But none of it matters without a 12bit source (you can’t “manufacture” extra colors that aren’t there to begin with).
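As a minimal sketch of that “can’t manufacture colors” point (the shift-and-replicate expansion below is just one common assumption for how 8-bit values get padded out to 12 bits, not what any particular player actually does):

```python
# Padding an 8-bit source up to 12 bits gives bigger numbers, not more information:
# the result still only ever takes 256 of the 4,096 possible code values per channel.

eight_bit_values = range(256)                                 # every code an 8-bit channel can hold
padded_to_12_bit = {(v << 4) | (v >> 4) for v in eight_bit_values}

print("codes available at 12 bits:", 1 << 12)                 # 4096
print("codes actually used       :", len(padded_to_12_bit))   # still 256
```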

Which is why I say set it to 8bits and never worry about it again.

I have an LG HD TV which can be set to 8- or 12-bit, and quite honestly I can see no difference.

Same as Rich; my TV (Panasonic plasma) can be set to either and there’s no visual difference.  Without 12-bit media to test it with, it’s impossible to tell.  Set to 8-bit and forget about it.  ;)

+1

Without a proper source, it’s useless. It’s like testing an audio device with 128 kbps MP3s. :)

BTW: Has anybody seen a DVD with 12 bits? Or a Blu-ray? I’m thinking of the ISO files built by my laptop from the original disc, as I didn’t see any difference…

There are no commercial sources for 12-bit – no DVDs or Blu-rays.  They would have to change their manufacturing process and that ain’t gonna happen, not this late in the game.

You can master something yourself if you like – but it would have to start off with material that isn’t already sampled down to 8bit (IOW, you’d have to start with analog video – AFAIK, no consumer camcorder can record in 12bit, and I’m rather doubtful even many commercial ones can).  But even analog video would be a challenge because you’d then have to have a capture card that could do 12bit.  You *could* do it with still images – Photoshop will do 12bit color – but is that *really* all that significant?

I really wish both WD and TV makers had just left this feature off, as it will just confuse folks for no real benefit.

From my understanding, in order to pass HDMI 1.3 they have to support Deep Color, and must AT LEAST support 12-bit; 10-bit and 16-bit are optional.   Not sure if that applies to both SOURCE and SINK, or just SINK.

Hmmm, that’s interesting.  Have no idea why that’s in the standard, then, since no one commercially is going to do anything about it.  And does HDMI 1.3 certification even *mean* anything?  I mean, are you going to pass up something just because it wasn’t certified for 1.3?

Oh well, just another thing to add to the FAQ, if you so desire (namely – forgedabouit)

Some good info here.

http://www.abccables.com/info-hdmi-deep-color-.html

There are actually quite a few devices doing Deep Color: the PS3, quite a few video cards for PCs, etc.

Anything natively rendered on those devices will be a Deep Color source.

It was more this bit that was interesting:

It is extremely important for the consumer to understand that every movie that was ever transferred to a DVD or any other digital format has been transferred using an 8-bit color depth. While the newest Deep Color format may give an improvement in the quality of the picture, there is no content currently available, no archived material, no movies, and no TV shows, that can be trans-coded easily into the deep color system. With the current problems of Digital Rights Management (DRM) and High-bandwidth Digital Content Protection (HDCP) just starting to be worked out, it is highly unrealistic that the entertainment industry and Hollywood will hurry to produce software that is an exact copy of the original movie quality.

Yeah, that’s my point – 12-bit material ain’t coming any time soon (read: in MY lifetime).

What they don’t say is that Hollywood has a tremendous amount invested in the digital conversion process already, and that to change that to provide 12bit color would be VERY expensive (one of my many other hats was working with digital conversion.  Sigh.  I think I’ve lived too long).

Unless there is a compelling reason to change, they just ain’t gonna spend the money.  Right now, the major emphasis is on 3D, so my own gut is we are at least a decade (if not more) away from changing standards that would even allow 12 bit material to be produced.

So – it’s still a question of, fuggedaboutit.