Originally Posted by electrictroy
Here's what I come up with:
24 bit color per pixel
1/2 frame sent per second (interlaced)
24.9 million bits/second
That doesn't sound right. Help please. Where's the error in my math?
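Here's a guess at where the 24.9 million figure came from (an assumption on my part, since the intermediate steps aren't shown): 24 bits times the pixels in one 1920x1080 field is about 24.9 million bits. But that's bits per *field*, and interlaced video sends roughly 59.94 fields per second, not one:

```python
# Hypothetical reconstruction of the quoted math (an assumption,
# not shown in the original post).
bits_per_pixel = 24                  # 24-bit color
pixels_per_field = 1920 * 1080 // 2  # one interlaced field = half a frame
bits_per_field = bits_per_pixel * pixels_per_field
print(bits_per_field)                # 24883200 -- the "24.9 million bits"

# The likely missing step: fields arrive ~59.94 times per second.
fields_per_second = 59.94
print(bits_per_field * fields_per_second)  # ~1.49e9 bits/second
```

If that reconstruction is right, the error is treating one field as a full second of video.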
Are you asking about the total signal or the broadcast signal?
True HD is...
2200 pixels per line (that includes active video and horizontal blanking)
1125 lines per frame (that includes active picture and vertical blanking)
29.97 frames per second
10 bits for luminance
10 bits for chroma
So the math looks like this:
1125 * 2200 * 29.97 * 20 = 1,483,515,000 bits per second, or about 1.48 Gbps
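The total-signal math above can be checked in a couple of lines (Python is just being used as a calculator here):

```python
lines_per_frame = 1125    # active picture + vertical blanking
pixels_per_line = 2200    # active video + horizontal blanking
frames_per_second = 29.97
bits_per_pixel = 10 + 10  # 10 bits luminance + 10 bits chroma

bps = lines_per_frame * pixels_per_line * frames_per_second * bits_per_pixel
print(f"{bps:,.0f} bits/second = {bps / 1e9:.3f} Gbps")
# prints: 1,483,515,000 bits/second = 1.484 Gbps
```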
But broadcast is different: they stuff that into an MPEG stream. They use only 8-bit color (8 bits luma + 8 bits chroma = 16 bits per pixel) and only 1088 lines (1080 active, padded to a multiple of 16 for MPEG coding), and I think only 1920 pixels per line.
1088 * 1920 * 29.97 * 16 = 1,001,698,099 bits per second, or about 1.00 Gbps
It wouldn't matter whether the frame is "i" or "p"; the pixel rate, and therefore the bit rate, would still be the same.
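The same check works for the broadcast-frame numbers, and it also shows why "i" vs "p" makes no difference to the raw rate: 29.97 full frames per second and 59.94 half-height fields per second are exactly the same number of pixels per second.

```python
lines = 1088            # 1080 active lines, padded for MPEG coding
pixels_per_line = 1920
bits_per_pixel = 8 + 8  # 8 bits luma + 8 bits chroma

# Progressive: whole frames at 29.97 per second.
progressive = lines * pixels_per_line * 29.97 * bits_per_pixel

# Interlaced: half-height fields at twice the rate -- same pixel count.
interlaced = (lines // 2) * pixels_per_line * 59.94 * bits_per_pixel

print(f"{progressive:,.0f} bits/second")   # ~1,001,698,099 bits/second
print(abs(progressive - interlaced) < 1)   # True: i and p come out equal
```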