Help explain this HD malarky please!
Comments
moonrakerz - All broadcast HDTV is 1080i - you were close!
Camcorders - no, if you buy a £300 1080p camcorder, it will not look like Blu-ray!
It's the same as cameras. You can buy a 10MP camera in Argos for under £100, or you can buy a 10MP SLR for £500 and a lens for another £500 - the SLR's pictures will look far better. 1080p and 10MP are simply counts of resolution, the number of pixels. They completely ignore the lens, lighting etc. used to get the picture, still or moving.
So a classic case of numbers over quality...
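Purely to illustrate that "resolution is just a pixel count" point (this sketch isn't from the original post), a quick comparison in Python; the 10 MP frame size used here is just a typical example, not any specific camera:

```python
# Rough pixel-count comparison: resolution alone says nothing about lens or sensor quality.
# The 10 MP dimensions below are a common example size, chosen purely for illustration.

def megapixels(width, height):
    """Return the pixel count of a frame in megapixels."""
    return width * height / 1_000_000

print(f"1080p frame (1920x1080): {megapixels(1920, 1080):.2f} MP")   # ~2.07 MP
print(f"10 MP still (3872x2592): {megapixels(3872, 2592):.2f} MP")   # ~10.04 MP
```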
OK - if we wish to be pedantic! Interlace is definitely a "legacy" technology. Unfortunately it still persists even today. 1080i (interlaced) is a set which has had corners cut.
"moonrakerz - All broadcast HDTV is 1080i - you were close!"
BUT - the thread was headed HD, and it would be a spurious assumption that the OP is only ever going to watch broadcast TV and not, say, Blu-ray discs etc.
(Perhaps you should read every line of my post instead of "every other"!)
I thought broadcast was still at 720? Didn't realise the HD broadcast made the jump to 1080 already!
It's correct regarding the pixel count and the quality of the source image and how it was captured. There are all sorts of variables that affect picture quality, such as the quality of the optics and the image recording hardware used to capture it, the processing on top of that, and what you play your HD content through.
I've recorded some events with my 1080p camcorder (I use digital tape as I prefer it), and it wasn't a cheap £300 one either. The picture quality when played through my 1080p TV was excellent - crystal clear. Had the event been recorded by a professional studio using professional HD gear, lighting etc., the picture would have looked much better in terms of the source captured.
The megapixel count only really makes a huge difference when you want to blow up the image, but without a decent sensor it doesn't really matter how many megapixels you have. What you generally get on compacts and entry-level SLR cameras are sensors with more and more pixels crammed into the same space, which can degrade quality. You kind of get what you pay for; I'm not saying you should go out and spend £1.5k on a camera, because even that alone won't get you studio-quality pictures.
If £300 is your budget, shop around and see what you can find. Have a look in shops and see if you find anything you like, then look online for a better deal.
I didn't say £300 was my budget, and I'm not on about TVs either!
moonrakerz wrote: »(Perhaps you should read every line of my post instead of "every other"!)
Hello,
In my simple way of thinking:
1080i = 1920 x 1080 interlaced
1080p = 1920 x 1080 progressive or progressive scan
Interlaced works by drawing all the odd lines first, then going back and doing all the even lines (or maybe it's even first then odd but you get the idea - it takes two passes from top to bottom to draw the whole screen)
Progressive works by drawing all the lines in order from top to bottom in one go.
So both pictures have the same resolution, the only difference is the way they are displayed/drawn on screen.
It's generally accepted that 1080p gives better quality than 1080i especially for fast moving/changing scenes (eg Sport / movies), but it's all subjective and depends on the devices being used to record, playback and view the video.
NiVZ
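To make the odd/even-line idea above concrete, here is a minimal Python sketch (my own illustration, not part of NiVZ's post). A "frame" is just a list of scan lines, and the weave step is the simplest possible way of recombining two fields:

```python
# Minimal sketch: how an interlaced frame relates to its two fields.
# A "frame" here is just a list of scan lines; real video adds colour, timing, etc.

def split_into_fields(frame):
    """Split a progressive frame into its two interlaced fields."""
    top_field = frame[0::2]      # even-numbered lines (0, 2, 4, ...)
    bottom_field = frame[1::2]   # odd-numbered lines (1, 3, 5, ...)
    return top_field, bottom_field

def weave_fields(top_field, bottom_field):
    """Recombine two fields into one full-resolution frame (simple 'weave')."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.extend([top_line, bottom_line])
    return frame

# A toy 6-line "frame": progressive draws lines 0..5 in one pass,
# interlaced sends lines 0, 2, 4 in one pass and 1, 3, 5 in the next.
frame = [f"line {i}" for i in range(6)]
top, bottom = split_into_fields(frame)
print(top)                                  # ['line 0', 'line 2', 'line 4']
print(bottom)                               # ['line 1', 'line 3', 'line 5']
print(weave_fields(top, bottom) == frame)   # True - same resolution either way
```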
Okay, sounds like I wasn't technically correct (but then I'm no video engineer)
Found this explanation on Wiki which sounded more like what really happens:
"All mainstream analog and many digital television systems arrange the scan lines of each frame into two consecutive fields, one containing all even lines, another with the odd lines. The fields are then displayed in succession at a rate twice that of the nominal frame rate. For instance, PAL and SECAM systems have a rate of 25 frames/s or 50 fields/s, while the NTSC system delivers 29.97 frames/s or 59.94 fields/s. This process of dividing frames into half-resolution fields at double the frame rate is known as interlacing."
NiVZ
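As a small illustration of the "fields at twice the frame rate" arithmetic in that quote (again my own sketch, not from the post), using the standard PAL/SECAM and NTSC frame rates:

```python
# The field rate is simply twice the frame rate, as the quoted passage says.
# Frame rates below are the standard PAL/SECAM and NTSC figures.

SYSTEMS = {"PAL/SECAM": 25.0, "NTSC": 30000 / 1001}   # frames per second

for name, frame_rate in SYSTEMS.items():
    field_rate = frame_rate * 2   # two fields (odd lines + even lines) per frame
    print(f"{name}: {frame_rate:.2f} frames/s -> {field_rate:.2f} fields/s")

# PAL/SECAM: 25.00 frames/s -> 50.00 fields/s
# NTSC: 29.97 frames/s -> 59.94 fields/s
```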