What HDMI cable???
Comments
kwikbreaks wrote: »Yes, point taken. What I really don't know is how far there is between working perfectly and getting really obvious problems such as sparklies.
One (off-topic) thing I will ask you, though: if my media player is set to 1080p output (which it is), is the bandwidth requirement for the cable lower if playing 720p or even SD? I don't think it is, as every one of those 1920x1080 pixels gets transmitted down the cable regardless of the original source.
I'm not properly set up for HD yet. I'm having a house refurbished and have bought some kit in advance, but it isn't set up properly at home and the TV isn't in full use (it is sitting on the floor), so I've not spent a huge amount of time watching it. When I have, though, I haven't seen a hint of problems even though I'm using cheap 4m cables.
Well, that depends on many different things (not least whoever it is that's 'trying' to notice a difference). One of the links I'll post is from a test that found definite measured errors coming out the other end of an HDMI cable, yet the testers COULDN'T see any specific problems ~ in other words, there was the odd pixel showing the wrong colour. How many errors it would take to actually SEE a difference is obviously highly subjective (and dependent on many things).
As for your question ~
720p and SD content don't need as much bandwidth as 1080p, so they should work fine regardless (if they don't, then there's something seriously wrong with the cable).
However, SD content UPSCALED to 1080p will test a lot of cables.
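To put rough numbers on the bandwidth point, here's a quick sketch. The figures are illustrative only: the blanking factor is an assumption (real HDMI timings differ per video mode), but the ordering of the three rates is the point.

```python
# Rough uncompressed-video data-rate sketch. The point: if the player is set
# to 1080p output, the link carries a full 1080p raster regardless of whether
# the original source was SD or 720p.

def video_data_rate_gbps(width, height, fps=60, bits_per_pixel=24, blanking=1.3):
    """Approximate raw link data rate in Gbit/s.

    `blanking` is a rough allowance for horizontal/vertical blanking
    intervals; real HDMI modes use specific timings (assumed here).
    """
    return width * height * fps * bits_per_pixel * blanking / 1e9

for name, (w, h) in {"576p (SD)": (720, 576),
                     "720p": (1280, 720),
                     "1080p": (1920, 1080)}.items():
    print(f"{name:10s} ~ {video_data_rate_gbps(w, h):.2f} Gbit/s")
```

So a 1080p raster needs several times the link rate of SD, which is why a marginal cable can pass native SD happily yet struggle once everything is upscaled to 1080p before it hits the cable.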
OK.
First up, we have measured 'jitter' in digital cables ~
http://www.stereophile.com/features/368/index3.html
After measuring the first two products (the PS Lambda and the Panasonic SV-3700), I went back and repeated my measurements to make sure the analyzer was giving consistent results, and that my test setup was correct. When I remeasured the SV-3700, I got about half the jitter than when I first measured it! What caused this reduction in measured jitter?
Changing the direction of the digital interconnect between the transport and the jitter analyzer.
This phenomenon was easily repeatable: put the cable in one direction and read the RMS jitter voltage, then reverse the cable direction and watch the RMS jitter voltage drop. Although I'd heard differences in digital-cable directionality, I was surprised the difference in jitter was so easily measurable—and that the jitter difference was nearly double.
To confirm this phenomenon, I repeated the test five times each on three different digital interconnects. One was a generic audio cable, the other two were Mod Squad Wonder Link and Aural Symphonics Digital Standard, both highly regarded cables specifically designed for digital transmission. The generic cable wasn't directional: it produced the same high jitter in either direction. But both the Wonder Link and the Aural Symphonics had lower jitter levels overall, but different jitter levels depending on their direction. Moreover, the generic cable had higher jitter than either of the two premium cables—even in the latters' "high-jitter" direction.
http://www.stereophile.com/features/368/index8.html
Conclusions
There is now no question that jitter in CD transports and digital interfaces affects digital audio sound quality. Not only do different transports and interfaces sound different, they produce varying amounts of jitter and have their own "jitter signatures," seen in the jitter's spectral distribution.
Moreover, we can see that transport jitter goes right through the digital processor's input receiver (even the Crystal CS8412) and affects the amount of jitter at the DAC's word clock—the point where jitter makes an audible difference. If the word-clock timing is different, the sound will be different.
The revelation that digital interconnects and their direction can introduce large differences in measured jitter was quite a shock. The differences heard between digital interconnects—and in their directionality—have now been substantiated by measurement.
In conclusion then ~ the above experiment proved that jitter can and does have an effect in digital cables, to the point that it actually affects the DAC it's feeding (meaning that whether or not anyone could actually HEAR the difference, there is a definite MEASURED difference all the same).
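For what it's worth, the 'RMS jitter' figure quoted in tests like that is just the root-mean-square of the clock-edge timing deviations. A minimal sketch with made-up numbers (these are NOT the Stereophile measurements; the two lists are hypothetical deviations for one cable in each direction):

```python
import math

def rms_jitter_ps(deviations_ps):
    """Root-mean-square of clock-edge timing deviations, in picoseconds."""
    return math.sqrt(sum(d * d for d in deviations_ps) / len(deviations_ps))

# Hypothetical edge deviations for one cable in each direction:
direction_a = [40, -35, 50, -45, 38]   # wider spread
direction_b = [20, -18, 25, -22, 19]   # roughly half the spread

print(rms_jitter_ps(direction_a))  # roughly double the figure below
print(rms_jitter_ps(direction_b))
```

A halving of the spread of the timing errors shows up directly as a halving of the RMS figure, which is the kind of direction-dependent difference the article reports.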
Next up ~ MEASURED errors in the cable, and yet they COULDN'T actually SEE any discernible difference ~
http://www.audioholics.com/education/cables/long-hdmi-cable-bench-tests/evaluation-conclusion
"
We took two of the worst performing cables of the bunch, a RAM Electronics 50' HDMI cable ($130) and an equally challenged Tributaries (Series 9) 15 meter cable ($899). Both understandably fail eye pattern tests at even 720p resolutions. Both, unfortunately, also claim HDMI 1.3 support at up to 10.2 Gbps bandwidth and with Deep Color support. It was fairly obvious that both of these cables would fail real-world tests when connected to a 1080p source.
Except that they didn't.
I saw clean video on two separate displays. I even used two different sources - one HDMI 1.3 and the other sporting an older HDMI 1.2 chipset. Then I got real desperate and nabbed an old HDMI 1.0 source (A Helios NeuNeo player) and slapped it up to triple check the signal.
What?!? Scratching my head I searched in vain for a way to get them to fail. I couldn't. Not at 1080p or any other resolution. Finally I actually resorted to connecting the two huge cables end-to-end. That netted me sparkles at 720p/1080i and absolutely no picture at 1080p with our HDMI 1.0 player. OK, so there are some limits after all. That's good to know."
Conclusion ~ the cable had to be DOUBLED in length before the sparklies actually appeared! From this we can determine (on these cables at least) that at HALF the length at which the 'sparklies' start to appear, there ARE still actual errors in what's displayed on screen. At what point someone would actually 'see' these errors is highly subjective, but they're there all the same.
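To illustrate why measured errors can still be invisible, here's a back-of-envelope sketch. The bit-error rate used is an assumed figure for illustration, not anything measured in the Audioholics tests:

```python
# Back-of-envelope: how many wrong pixels does a given bit-error rate
# produce in one 1080p frame? (The BER value is assumed for illustration.)

def expected_errored_pixels(ber, width=1920, height=1080, bpp=24):
    """Expected number of pixels containing at least one bad bit per frame,
    assuming errors are rare and independent (so at most one per pixel)."""
    total_bits = width * height * bpp   # ~50 Mbit per 1080p frame
    return total_bits * ber

per_frame = expected_errored_pixels(1e-9)
print(per_frame)   # well under one bad pixel per frame at this assumed BER
```

At that assumed rate you'd get a handful of briefly wrong pixels per second out of two million on screen, which is exactly the "measurable but not obviously visible" regime described above.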
On a completely separate note ~
http://www.russandrews.com/images/articles/OriginalResearchPaperVersion16Feb09.pdf
Ben Duncan (a highly respected audio engineer) conducted a test specifically for Russ Andrews to PROVE that braided cables 'reduce' RFI ~
Here are the pics that go with it ~
http://www.russandrews.com/images/articles/WeavePaperGraphsDoc.pdf
The above HAS been verified by a university professor and 'proves' that braided cables do indeed reduce RFI (noise in the mains supply).
Furthermore (although I can't yet provide a link, I have a booklet at home), this affects amplifiers 'measurably'.
If you buy a 1080p-rated cable and you get sparkles at 1080p then it's obviously faulty - get your money back!
However, if it works - it works - end of.
Pay more money if you want possibly better bandwidth for future use or better build quality.
aliEnRIK wrote: »In conclusion then ~ the above experiment proved that jitter can and does have an effect in digital cables, to the point that it actually affects the DAC it's feeding (meaning that whether or not anyone could actually HEAR the difference, there is a definite MEASURED difference all the same).
Irrelevant
aliEnRIK wrote: »Conclusion ~ the cable had to be DOUBLED in length before the sparklies actually appeared! At what point someone would actually 'see' these errors is highly subjective, but they're there all the same.
Hmm, this is from the conclusion in the same article: "I have to come away saying that most cables under 4-5 meters will pass just about anything in today's arsenal of 1080p - and that's likely to include Deep Color if and when it ever makes an appearance (not likely soon due to current Blu-ray limitations)."
aliEnRIK wrote: »720p and SD content don't need as much bandwidth as 1080p, so they should work fine regardless. However, SD content UPSCALED to 1080p will test a lot of cables.
aliEnRIK,
Pondering kwikbreaks' question overnight brought back another one to my mind. One that I've wondered about before but never asked.
Suppose one has two pieces of good kit, both of which are capable of upscaling from SD to 1080p.
In my case – to make the question clearer by giving them identities, for the purposes of the argument – this would be a Sony RDR-HXD995 DVD recorder with built-in Freeview tuner and a Sony KDL-52w4000 LCD television with its own Freeview tuner. Same generation of Sony tuners, then (I think).
The aerial signal passes through the DVD recorder and goes to the television; so, they are also accessing the same RF signal.
If one wishes to view on the television the Freeview picture produced by the DVD recorder (by means of a given HDMI lead), should one get a better result by:
a) letting the DVD recorder pass the SD Freeview signal directly to the television and letting the television upscale it entirely?
b) letting the DVD recorder pass a 720p signal to the television and letting the television upscale that to 1080p?
or
c) letting the DVD recorder do the entire upscale to 1080p and then send that to the television?
In other words, should the fact that it is more demanding of an HDMI cable to ask it to transmit a 1080p signal than a 720p signal result in a poorer picture when viewed on the television, or would this effect be outweighed by the fact that it is being upscaled in more stages?
Curiously, I find we get a better picture (slightly sharper and a little richer in colour) by watching Freeview channels output by the DVD recorder in 1080p and sent by HDMI cable, than by watching them on the television's own Freeview tuner!
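On the cable side of that question, a rough sketch of what each option actually puts on the HDMI link (raw active-pixel rates at UK 50 Hz; blanking is ignored, so real link rates run somewhat higher; the figures are illustrative):

```python
# What each option in the question above asks of the cable
# (raw active-pixel data rate only; real HDMI timings add blanking).

def raw_rate_gbps(width, height, fps=50, bpp=24):
    return width * height * fps * bpp / 1e9

options = {
    "a) send SD, let the TV upscale":   (720, 576),
    "b) send 720p, let the TV upscale": (1280, 720),
    "c) send 1080p from the recorder":  (1920, 1080),
}
for label, (w, h) in options.items():
    print(f"{label}: ~{raw_rate_gbps(w, h):.2f} Gbit/s")
```

Even option (c), the most demanding, is comfortably within what a short, in-spec cable handles, so any picture difference between the three routes is more likely down to which box has the better scaler than to the cable itself.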
spaceboy,
And do all cheap cables produce them?
Some cables do, some cables don't. It depends (especially at 1080p) upon the quality of the cable, not the price at which it's sold – and the answer to your question depends also upon the point (which you haven't defined) at which, for any given length, you consider that the cable becomes (or ceases to be) cheap.
Rather like a compatible cartridge for an inkjet printer, it isn't cheap if it has to be binned and replaced with something better.
Hmm, this is from the conclusion in the same article,
Yes, they say 'most' cables.
You'll find that the VAST majority of the cables they tried were pretty/VERY expensive ones, and the few cheapie ones they tried were known to be pretty good in the American market.
If they'd tried a cheapie Ebuyer or eBay cable then you might have something.
Regardless, they MEASURED errors and nothing that was an 'obvious' error could be seen on screen.
They also tested them under pretty 'perfect' conditions. There are plenty of other factors involved in 'real world' viewing (the source used, jitter, RFI etc.).
I rest my case.