Audio Asylum Thread Printer - Get a view of an entire thread on one page
In Reply to: RE: REVIEW: Oppo DV-981HD Universal SACD/DVD-A Players posted by svisner on June 18, 2007 at 20:48:36
You said: " I purchased this unit because I wanted a DVD player with HDMI output and with upscaling that would allow me to take advantage of a new LCD monitor’s 1080I/1080P resolution. I am aware of some debate regarding the benefits some think do or do not accrue from this resolution as well as from upscaling."
Any TV with a resolution of 1920 x 1080 has to have a scaler of its own. It needs to be able to scale standard definition TV signals to 1920 x 1080, and it needs to be able to scale high def signals at 720p to 1920 x 1080. You don't need an upscaling DVD player in order to take advantage of your monitor's resolution; in fact you can't avoid taking advantage of its resolution, because it can't display signals at some other resolution without rescaling them first.
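The arithmetic behind that point is easy to check. Here's a minimal Python sketch (the resolutions are the standard broadcast/DVD formats, not figures from any particular player or TV):

```python
# Illustrative only: every common source format must be rescaled to
# fill a 1920 x 1080 panel, which is why the TV needs a scaler of its
# own regardless of what the player does.

PANEL = (1920, 1080)

sources = {
    "480i DVD (NTSC)": (720, 480),
    "576i DVD (PAL)": (720, 576),
    "720p HD": (1280, 720),
    "1080 HD": (1920, 1080),
}

for name, (w, h) in sources.items():
    sx, sy = PANEL[0] / w, PANEL[1] / h
    note = "native, no rescale" if (w, h) == PANEL else "rescale needed"
    print(f"{name:16s} -> x{sx:.2f} horizontal, x{sy:.2f} vertical ({note})")
```

Only the 1080 source maps onto the panel without resampling; everything else has to go through somebody's scaler.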
LCD and plasma displays are also progressive scan displays. Standard def TV signals are interlaced, so your TV has to have its own de-interlacing processor anyway, which means you don't even need a de-interlacing player.
That doesn't mean there's no value in getting a player that can scale to a 1080 resolution and provide a progressive signal output, but it does mean that such players aren't essential and you don't need to have one in order to get the full benefit of your screen's resolution. The question you need to consider is whether or not you get a better picture letting the player or the TV do the scaling and the de-interlacing. In my case, with a 1366 x 768 display, I feel my best results are obtained with my player (a Denon 2907) doing the de-interlacing and the TV doing the scaling. Your results may well be different but you should try comparing the player and the TV's performance on both of these aspects, and treat each separately. The results may surprise you.
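The comparison being suggested here, trying each task on each device and judging every combination separately, is small enough to enumerate. A quick sketch (the task and device labels are mine, purely for illustration):

```python
from itertools import product

# Enumerate every assignment of the two tasks to the two devices.
# There are four combinations in all; the suggestion above is to
# judge each combination on its own rather than comparing player
# against player with both doing everything.
tasks = ("de-interlacing", "scaling")
devices = ("player", "TV")

combos = list(product(devices, repeat=len(tasks)))
for combo in combos:
    print(", ".join(f"{task} by {dev}" for task, dev in zip(tasks, combo)))

print(len(combos), "combinations to compare")
```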
I'm not trying to say that the Oppo is a bad player because, from all reports I've heard, it's rather good and probably exceptional for its price. Given the cost of a hi-def 1920 x 1080 display capable of handling both 1080i and 1080p, however, it's quite possible that your TV does one or both of the de-interlacing and rescaling operations better than the Oppo. After all, hi-def displays need to do these things well if they are going to produce good results with standard def signals, and your new TV may surprise you with what it can do if you haven't tried feeding it a standard def, interlaced signal.
And even if you decide the TV does one or both of those things better than the Oppo, don't think you've wasted your money. The Oppo is still producing a video picture and providing a sound source. It apparently does both of those things very well.
My main reason for posting this is simply to try and clear up some of the confusion that seems to exist regarding the need for a DVD player to de-interlace and rescale the image. It simply isn't essential but it is possible that the player may do it better than the TV, especially if it's a new player with recent technology in these areas and an older display with older technology.
I think there's a tendency to think that you have to use every feature of a player to get the most out of it, but the real aim is to get the best picture quality you can, and to do that you should use whichever device does the best job on de-interlacing and on rescaling for those tasks. Both your TV and the Oppo can do both, so you need to find out which one does them best. Comparing the Oppo to other progressive scan players that rescale, with both players performing those functions, doesn't tell you anything about whether you'd be better off using the TV for one or both of those tasks; it only tells you which player does them best, and you've already made your decision on that.
David Aiken
Follow Ups:
The first upscaler I bought was Sony's first, the 975 or 375 or some such thing. Anyway, using the HDMI output and upscaling to my Sony 55" LCDRP (whatever the goddam model is, I dunno, it was new a couple of years ago), the only time it looked better was looking at high frequency test patterns---there was no improvement watching regular program material.
Since then I don't worry about it, I hook up whatever connections are convenient. My Sony 995 DVD jukebox is hooked to the TV through both HDMI and component, I use the component for watching Academy ratio pictures. The Toshiba upscaler also in the system goes out HDMI. The LD player goes composite and the HD cable box goes component.
Unlike claimed audio improvements, claimed video improvements can be captured by cameras, know what I mean?
Can they, and just what are you saying about audio improvements? I accept that a camera can't capture an audio improvement but other things can.
Let's draw a distinction between "claimed improvement", which was your term, and "improvement". An improvement for the purposes of this discussion is one which can be shown to exist with proof of some kind, whether it be a measurable result, photos demonstrating the change, or whatever. No one really argues about improvements.
On the other hand, everyone argues about "claimed improvements", those improvements for which no proof other than a user's report of their sensory experience is offered. Are you saying that such things don't exist for video performance because a camera can capture everything? Even photographs have a limit of resolution, and certain sorts of change won't show up in photographs. If it's digital, does the change involve gradations in colour that require a greater palette than the digital media offers? If it's 'analog' photography, does the grain structure of the film obscure the true level of sharpness and detail? I think it will take a little bit more than your one liner to convince me that photographs can resolve any argument about video detail.
"Claimed improvements" fall into 2 groups: those improvements which are real but for which no satisfactory proof has yet to be offered, and those which aren't real and arise from errors, confusion, etc on the part of the claimants. You can't demonstrate scientifically that all "claimed improvements" fall into the second of those groups and there's usually no way to determine whether a particular "claimed improvement" falls into the first or the second group. The only way the arguments about most "claimed improvements" ever get to be resolved to everyone's satisfaction is when a proof is found and they get moved from the "claimed improvement" bundle to the "improvement" bundle.
I admire your optimism about "claimed video improvements" but I require a lot more in the way of proof before I'll accept that statement. I guess that indicates that I don't know what you mean.
David Aiken
David--What I mean is that many video improvements can be captured by a still camera and demonstrated and studied at length and at leisure.
This is difficult, if not impossible, with audio as with audio you cannot capture a moment and study it. Well you can with measurements I suppose but not the sound itself.
I accept that "many video improvements can be captured by a still camera and demonstrated and studied at length and at leisure", or at least that some certainly can; however, your initial statement left out the "many" and didn't make any qualifications about the universality of your camera test.
I disagree with "This is difficult, if not impossible, with audio as with audio you cannot capture a moment and study it. Well you can with measurements I suppose but not the sound itself."
You can capture sound with a recording and one could set up a system that allowed you to 'freeze' the tape at a given moment and play that moment for a continuing period. What you would get is, essentially, coloured noise which gives you little idea of how that moment sounds 'in situ' within the flow of sound.
But that may not be too different to capturing a moment of video with a camera. When I pause a DVD, I find that the frozen image with many DVDs appears very grainy, while the picture looks anything but grainy when the disc is playing. The still does not capture a particular aspect of the playing video, and some things which appear to produce a quite noticeable improvement in sharpness and other aspects of picture quality simply don't show on the still shot. I had this experience when I got a new equipment rack for my HT system and set the components up in it with the centre speaker on a separate solid stand. Previously the centre speaker had shared a wide AV rack with the components. My centre speaker is large and crossed over to the sub at 40 Hz, and the shelves of the old AV rack had been glass. Not only the picture but the sound improved with the change.
While the still image did not show any of the things that I regarded as visual improvements, I tried recalibrating my picture settings using Digital Video Essentials. Another impression I had of the changes in picture quality was that the colours in the overall image looked brighter. The black levels didn't appear to have changed when I used DVE, but I had to reduce the TV's contrast setting by 1 step in order to restore my peak white levels. An interesting confirmation.
I'm wary of claiming that any one test is the be all and end all of testing for difference. Still photos of video don't reveal everything. Any single audio test doesn't reveal everything. I suspect that there's little difference in how testable claims in both areas are. Too many people run a limited range of tests because those are the only tests they have, see no difference in results, and claim that as a proof of no difference. It isn't. It says nothing about the things left untested. We also don't always have the tests we need at a given point in time. Tests for jitter were not available when CD players were introduced, and quite a few people ridiculed claims that there were audible differences between players that tested similarly on the tests available. When tests for jitter became available, differences were revealed. It's simply unscientific to assume that we have all the tests we need. We may have, but we can never be certain of that, and any time someone invents something which offers an improvement there's going to be the possibility that some new test comes along with the improvement. That won't happen in every case but it will certainly happen in some.
I don't think your comparison of video and audio in relation to tests holds up. There are grey areas for both when it comes to deciding whether or not something makes an improvement, and there probably always will be. We have no way of accurately deciding which 'improvements' in the grey area are real and which are not. We occasionally get to move something from the grey area to the proven area but, when it comes to those things left in the grey area, we're each left to make our own judgement. We can say some are more probably genuine improvements than others, and we each may draw the line between probable and improbable at different points, but wherever we draw the line the only sure bet is that we're going to make mistakes. Draw the line too conservatively and you'll ignore more genuine improvements than you will accept things that don't work. Draw the line too loosely and you'll pick up a lot of things that don't work along with the ones that do. Essentially you choose the sort of error you're prepared to make and the amount of time and money you're prepared to spend at the fringes. This is going to be equally true for video as it is for audio.
David Aiken
I have a Sony SXRD, and it doesn't do nearly as good a job upscaling as the Oppo. It does deinterlace very well, but not upscaling. Most people I have talked to have said that they preferred their DVD player to do the upscaling rather than their TV. While it isn't mandatory, a decent upscaling player DOES help give a better picture. It seems to be an area that manufacturers skimp on(?).
Jack
I've got a Denon 2910 with the Faroudja chip for de-interlacing, and a Loewe LCD TV with 1366 x 768 resolution. I use HDMI connections which means that I have to take a progressive signal from the player. I leave all upscaling to the TV because that works slightly better than having the player rescale to 720p or 1080i and then having the TV rescale that to 768.
There are so many variables in this area, and it drove me crazy when I first got the LCD screen and started asking which was the best way to go. A lot of the questions I was asking didn't seem to get discussed much, and got conflicting answers when I could find any. After some months I started to feel I had some kind of understanding of the issues. If I replaced my TV with a 1080 screen of some flavour, I'd start doing the comparisons on scaling once again, because it's possible that the 2910 might do a better job if I used it to scale to the native resolution of the display, something it can't do with a 1366 x 768 display.
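The reason a 480 -> 720 -> 768 chain can lose out to letting the TV scale once can be sketched by counting interpolation passes. This is only an illustration with my own example resolution chains, not a measurement of any player or TV:

```python
def scaling_steps(chain):
    """Count the resampling passes in a chain of vertical resolutions."""
    return sum(1 for a, b in zip(chain, chain[1:]) if a != b)

# Player upscales a 480-line DVD to 720p, then the TV rescales that
# to its native 768 lines: two interpolation passes.
player_then_tv = [480, 720, 768]

# Player outputs the 480-line signal untouched and the TV scales it
# straight to 768: one pass, one opportunity for softening.
tv_only = [480, 768]

print(scaling_steps(player_then_tv))  # 2
print(scaling_steps(tv_only))         # 1
```

Whether one good pass actually beats two depends on the quality of the scalers involved, which is exactly why the comparisons have to be made case by case.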
I don't think there's a single answer that will fit every case, but we all start out looking for one of those sorts of answers and then get annoyed when we find there isn't one and we're going to have to make our own comparisons and then make up our own minds :-(
David Aiken
David--Well, I regard myself as a novice regarding video. I appreciate your excellent post. It gives me some settings to try and results to compare.
I do think that the current setting (HDMI at 1080i) looks better than the component video it superseded - and certainly better than S-Video. I can't tell yet what improvement is due to what setting, and I take your point about interlacing.
Again, thanks for an excellent post - and for helping demystify some of this subject for others and for me.