2007-10-08, 13:40 | Link #61 |
King of Hosers
Join Date: Dec 2005
Age: 41
|
Quote:
Not to mention that, at the same time, I did say this. Quote:
Quote:
In the end I will agree with your choice, only because there will apparently be no HD broadcasts. I was under the impression that there would eventually (in the not-so-distant future) be HD broadcasts, which made releasing an SD broadcast at high res absurd. Evul broadcasters, must be movax's fault. |
2007-10-08, 17:18 | Link #64 |
Banned
Join Date: Nov 2003
Location: Hamburg
Age: 54
|
Well, that's been very interesting and - kinda enlightening to me. I've got to admit that I'm a little bit puzzled, but I guess I shouldn't be too surprised either.
So, to sum it all up: Over the last 24 hours I've talked to a lot of encoders about this issue, which I considered intriguing (because it touched on some research I did ~1 month ago on my vacation), namely the visual impression of above-SD resolutions and their use for "HD" encodes. What matters, what doesn't matter much, what works and what doesn't.

Initially I experimented a lot with DVD sources (720x480 - the defined "SD" resolution), particularly with Elfen Lied R2, one of the cleanest and best transfers I've seen so far. I remembered the fullscreen impression of Hayate no Gotoku (the first show I worked on that deserved a "HD" label with MY understanding of the word), and I wondered if I could somehow recreate this with the Elfen Lied DVD source, now armed with x264.

The results were sobering. Even though the source seemed fine, I was unable to recreate a comparable level of sharpness. Consequently, I did some upscale experiments, but again, to no satisfactory end. I really tried, and I know a thing or two about handling DVD sources, but it all came down to one bitter truth: the source itself might have been "clean", but it didn't really utilize the full resolution. Most background details were in themselves blobs of multiple pixels; there was no real clarity, no 1-pixel-width structures. Attempts to sharpen without creating halos (the usual bane of too much sharpening) were made again and again, but - nope.

The _best_ (subjective!) results were reached with a combination of supersampling to 1.5x source resolution, sharpening, mild linedarkening, mild antialiasing, downsizing, and finally (key) slight _linethinning_ (originally, only to warp away sharpening halos a bit). Initially, I couldn't understand that, really. Why did linethinning have such an impact on the visual impression?
Eventually I understood - it was because my eye was drawn to contrasts, not background details, and at fullscreen playback, with the usual SD linewidth of ~3-5 pixels, the lines looked "fat" and unrealistic, especially in conjunction with mild residual halos giving them an "embossed" look. By thinning the lines, one didn't gain details or anything, but the resulting visual impression was more pleasing.

So I began looking for sources which had "finer" structures. And there they were - 1280x720 captures of HD-mastered material. Here, filtering could be done quite differently: just cleaning up the lines, doing some very mild line removal and sometimes line mending (mild antialiasing). So the background details were multi-pixel blobs and not really sharp? SO WHAT. That's not where the eye was looking.

And this is why HD-mastered raws aired in SD (case 2) are still way superior to what you get as SD DVD sources: they are much more detailed, SD or not. See the bell tree I used for my initial screenshot. You simply DO NOT GET structures this fine on common SD sources. Therefore, having even those "crippled" HD versions resized back up either by the capper or (even better) by the stations BY FAR surpasses what's well known as normal SD. Because what really counts is the quality of the original material; the airing res is much less important. Upscaling back to HD res can be done cleanly, and it will still result in a balanced frame. The background structures will be harmed, but the visual impression of a sharp HD pic is salvageable. SD sources differ from other SDs already, and scaled-down HD material is most definitely NOT "normal" SD. See Kimikiss raws as a prime example of what I'm talking about.

So, to wrap it up, we need to agree to disagree. Type 2 cases _absolutely_ deserve an HD-resolution release in my opinion. Here, going SD is in fact generally harmful, because the lines tend to be thinned too much, leaving a strange look of unstructured color planes.
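The "sharpen without creating halos" part can be illustrated with a 1-D toy - my own sketch, NOT Mentar's actual filter chain. LimitedSharpen-style filters do something similar in 2-D: unsharp-mask, then clamp the result to the local min/max so the over/undershoot that reads as a halo never appears.

```python
def blur(xs):
    """3-tap box blur with clamped edges."""
    n = len(xs)
    return [(xs[max(i - 1, 0)] + xs[i] + xs[min(i + 1, n - 1)]) / 3
            for i in range(n)]

def unsharp(xs, strength=1.0):
    """Plain unsharp mask: push each sample away from its blurred value."""
    return [x + strength * (x - b) for x, b in zip(xs, blur(xs))]

def limited_sharpen(xs, strength=1.0):
    """Unsharp mask, but clamped to the local neighbourhood -> no halos."""
    n = len(xs)
    out = []
    for i, s in enumerate(unsharp(xs, strength)):
        neigh = (xs[max(i - 1, 0)], xs[i], xs[min(i + 1, n - 1)])
        out.append(min(max(s, min(neigh)), max(neigh)))
    return out

edge = [0, 0, 0, 0, 255, 255, 255, 255]   # a hard dark-to-bright line edge
print(unsharp(edge))          # over/undershoots past 0 and 255 - the halos
print(limited_sharpen(edge))  # stays within [0, 255]: edge kept, halos gone
```

The clamp is why "limited" sharpeners can be pushed harder than a plain unsharp mask before the embossed look appears.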
Whether you call them HD (I do, and consider it justified) or MHD (we can tag them "Middle HD" instead of the Mentar pun) - I don't care. They're an important and fairly large class of sources which deserve proper handling.

So far, every single encoder I've talked to who has compared the mkv with the avi agreed that the difference is very obvious (and no, Nich, it has nothing to do with the sharpness advantage of h264 - I've made so many tests that I could puke them out). On the other hand, all those encoders criticizing this decision had one thing in common: they hadn't even bothered to take a look at what they were talking about. Some of them even with a flabbergasting aura of pride, because they just KNEW that they were right. They had to be.

Feedback for these MHD releases has been overwhelmingly positive. I'll definitely continue doing them, and many other groups do the same thing. I'd invite the naysayers to find a calm minute when nobody is watching them, to download MHD and SD releases, and to try to make an honest assessment of the difference. And if anyone of them succeeds at what I failed at (making SDs look like MHDs without scaling to their resolutions), please gimme a holler. Peace. |
2007-10-09, 01:45 | Link #67 |
vehicularGao
|
I dunno what's really going on in this thread - I don't really have the time or care to crawl over every post - but I felt the need to point this out, in case someone hadn't already:
In my humble opinion, upscaling provides no increase in quality. The differences come in when you watch the broadcast content at a higher resolution, in which case quality will differ depending on the device doing the actual upscaling. Most likely the television network's upscaling equipment is faster and of higher quality than what's in the average HDTV. (protip: this paragraph is mostly filler)

So why do they downscale the content before rebroadcasting it? I would like to think that this is miscommunication. Usually companies like this don't do something without a reason. So, if true, it could be the result of inaction rather than poor action - it could simply be that they're gearing up for 720p but don't have all the steps in place. Then why upscale back? Depending on how the system works to get the content from the air to their television, it may be more efficient to use that bandwidth (not talking about internet bandwidth) and then be able to market on the point that they broadcast in 720p. Many early-tech HDTVs don't upscale either. This could be speculated on forever, so I'm leaving it incomplete at that.

The question here should be: should fansubs use the 720p content or the 480p content? The answer should be obvious. Even if it is higher-quality 480p, since it was produced at 720p, it is still only 480p now. So then it is only a matter of trying to retain as much quality as you can in your encodes (as usual). And if I recall correctly, you can get more quality per bit at the lower resolution. Computers don't need pre-upscaled content, so why upscale in advance like a television network?

Unrelated: hardware encoders are usually going to be better quality than software encoders. |
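The "upscaling adds no information" point above can be made concrete with a tiny sketch (my own toy, nearest-neighbour only): blowing a row of samples up and decimating it again returns exactly the input - the upscale carried nothing new, you just paid more bits to store it.

```python
def upscale2x(row):
    """Nearest-neighbour 2x upscale: duplicate every sample."""
    return [v for v in row for _ in range(2)]

def downscale2x(row):
    """Undo it: keep every other sample."""
    return row[0::2]

sd = [10, 40, 90, 160]          # a toy scanline
hd_ish = upscale2x(sd)          # twice the samples, same information
assert downscale2x(hd_ish) == sd  # nothing was gained by upscaling
print(hd_ish)  # [10, 10, 40, 40, 90, 90, 160, 160]
```

Real resizers interpolate rather than duplicate, but the information-theoretic point is the same: the new samples are derived entirely from the old ones.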
2007-10-09, 03:04 | Link #68 |
渡辺曜のお兄さん
Join Date: Apr 2007
Location: Australia
|
After reading through the whole thread, it all comes down to the labelling of what 'HD' is. Mentar argues HD is based on perception, while others argue it's based on the original intent of the broadcast - i.e. resolution.
I find the same situation arises when you try to define what a 'person' is. Is a foetus a 'person'? Is the conceptus a 'person'? Is it only a 'person' when it is born? How about when logical thought processes are in place - i.e. is a 4-year-old a 'person', whereas a newborn is not?

My suggestion: remove the label of HD altogether from releases, unless it really is a HD-mastered, HD-broadcast, HD-ripped encode. Then all is happy and prosperous in the world.

My opinion: I don't care if it's been re-upscaled as long as it looks damn good - and it does. I'm happy. |
2007-10-09, 06:38 | Link #69 |
Junior Member
Join Date: Oct 2007
|
Over the last month, I've been thinking about when such types of encodes are justified (if at all) and what the best choices are when encoding them. Seeing that Mentar has done his share of the research, I'd like to ask a few questions myself.
How can one easily recognize a source which was originally HD and then downscaled and upscaled by some broadcaster? How can we tell they didn't take a very clean SD source, upscale it, then warpsharp it (or apply some other line-thinning filtering)? I have one such source which lacks background detail, and it's known it doesn't air in HD, yet there we see nice thin lines and contrasts. I'm unable to tell whether this was originally an SD source which was upscaled and sharpened, or an HD source which was downscaled, then upscaled and sharpened.

As for checking whether a source is genuine HD: we can easily downscale/upscale, then Subtract() from the original (AviSynth), and we'll see clear thin lines as well as detailed backgrounds showing up in the subtracted result. If you apply the same procedure to some better upscales, like Mentar's MHD ShanaS02E01 and a few others, you can see that the lines have been enhanced, and thus downscaling/upscaling loses some of that detail (even if it's made-up detail!). Quote:
Source resolution? SD (704x396, for example), WS, or some other SD res? What kind of sharpening was applied - LimitedSharpenFaster, MSharpen? Line darkening - MSharpen or FastLineDarken or others? Downsize - back to what original resolution? (Then we'd have SD resolution, in which case I don't fully understand the purpose of this?) Linethinning - what filters are recommended for this? I've played a bit with some variations of warpsharp, but I can't settle on using them, as I notice how radically they change the image. Even if I use only a bit of them, they don't just change the lines, they change the whole image. Quote:
What antialiasing is preferred? Doesn't antialiasing itself upscale the image, filter it, then downscale? Quote:
So my questions sum up to this in the end: what are the recommended ways of downscaling, if we choose to go that way instead of leaving it in HD? And when do we know we should leave it at the captured resolution, and what filtering (both general ideas and specific recommended filters) is recommended for enhancing the lines in such captures? These questions have been bothering me for the past month, and I hope I didn't ask a bit too much. |
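The Subtract() check mentioned in this post can be sketched as a 1-D toy (plain Python, not an actual AviSynth script; a box downscale and nearest-neighbour upscale stand in for real resizers): genuine single-pixel detail leaves a large residual after a down/up roundtrip, while material that was already upscaled leaves almost none.

```python
def down2(row):
    """Box downscale: average adjacent sample pairs."""
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]

def up2(row):
    """Nearest-neighbour upscale: duplicate each sample."""
    return [v for v in row for _ in range(2)]

def roundtrip_residual(row):
    """Analogue of Subtract(src, roundtrip): detail lost by a down/up pass."""
    rt = up2(down2(row))
    return sum(abs(a - b) for a, b in zip(row, rt))

native_detail = [0, 255] * 8        # single-pixel alternating structure
born_upscaled = up2([0, 255] * 4)   # same length, but made at half resolution

print(roundtrip_residual(native_detail))  # large: real fine detail present
print(roundtrip_residual(born_upscaled))  # 0.0: nothing lost -> it's an upscale
```

The caveat from the post still applies: sharpening applied after the upscale injects new high-frequency content, so a sharpened upscale will also show a nonzero residual - the test separates "carries detail beyond the lower resolution" from "doesn't", not "genuine" from "fabricated" detail.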
2007-10-09, 10:05 | Link #70 |
Part 8
IT Support
|
Sounds like a Roman gladiator to me.
Anyway, Mentar: I think you're trying to redefine "HD". HD doesn't mean High Detail, which is what you are suggesting it does. It just means the video has been 720p+ for its entire life. ...That's my original opinion, anyway. I was originally very much "naughty Mentar, definitions are forever!" - but then at work today I read Wikipedia's page on the duck test. It was probably just my natural tendency to denounce upscaling as unholy that made me dislike the idea of calling good upscales 'HD'. If it looks like HD, it is HD! (Is it just a coincidence I had duck for dinner tonight too?) |
2007-10-09, 12:07 | Link #71 |
Senior Member
Join Date: Mar 2004
|
Since this is a matter of intent and semantics, just tag it HQ - I know some groups do it, and it sidesteps the issue of resolution definitions. After all, isn't the intent of releasing two different versions supposed to be that Version A (HD, HQ, h264, whatever) is better than Version B (SD, LQ, h264, whatever)?
|
2007-10-09, 15:28 | Link #72 |
Banned
Join Date: Nov 2003
Location: Hamburg
Age: 54
|
Quote:
Quote:
In the meantime I'll go with the real definition (see Wikipedia): HD = High Definition, a set of resolutions. Nothing more. There is no requirement that footage delivered in HD has to adhere to certain quality standards. In other words, a 1280x720 video of a black screen is HD as well. The problem with the addition "has been 720p for its entire life" is that if you use it, some commercial BluRay releases cease to be HD. A peculiar result, isn't it? Quote:
My opinion. Nobody is required to agree, though. |
2007-10-09, 15:55 | Link #73 |
Junior Member
Join Date: Oct 2007
|
So, to step back from all this flaming, which seems pointless: we've now established what is true HD and what is recovered HD (or what someone called MHD).
Mentar, could you give us some recommended processes and filters for enhancing station upscales of what used to be HD material, as well as for upscaling downscaled HD material? (The latter might be a bad choice due to chroma subsampling - maybe picking the station upscale is the better choice?) |
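For what it's worth, the chroma-subsampling worry here is easy to quantify with simple arithmetic (assuming the usual 4:2:0 sampling of these broadcasts and encodes): each chroma plane carries only half the luma resolution on both axes, so every extra downscale step cuts colour detail further.

```python
def chroma_plane(w, h):
    """Chroma plane size under 4:2:0 subsampling: half of luma on each axis."""
    return w // 2, h // 2

# A 1280x720 capture only ever carried 640x360 of colour information...
assert chroma_plane(1280, 720) == (640, 360)
# ...and a 704x396 downscale leaves just 352x198 of colour detail,
# which is all you would have to upscale colour from afterwards.
assert chroma_plane(704, 396) == (352, 198)
```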
2007-10-09, 16:12 | Link #74 |
Banned
Join Date: Nov 2003
Location: Hamburg
Age: 54
|
Just a brief warning, I'm writing this on 3 hours of sleep in the last 40 hours now...
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
I'm afraid I offered very little useful advice, but - that's the best I can come up with right now. |
2007-10-09, 16:35 | Link #76 |
Junior Member
Join Date: Oct 2007
|
Quote:
Quote:
Actually, your previous post clarified quite a few issues for me, thanks. |
2007-10-09, 16:51 | Link #77 |
Umai Fansubs cofounder
Join Date: Mar 2007
Location: Los Angeles
Age: 38
|
"Most people seem to recommend Spline36 or Bilinear for downscaling compared to lanczos3/4, and i'm under the impression that lanczos3/4 makes for more blurry images when downscaled, while bilinear makes edges more jaggy."
Uh, I thought it was the reverse O_o Bilinear blurs/smooths while lanczos sharpens |
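Both impressions can be checked against the kernels themselves - a sketch using the textbook definitions (bilinear = triangle kernel, Lanczos = windowed sinc): Lanczos has negative lobes, which over/undershoot at edges and read as "sharpening" (and, pushed too far, ringing), while the triangle kernel is non-negative everywhere and can only average, i.e. soften. Which one looks better for *downscaling* anime is a separate, partly subjective question, since those same negative lobes cause ringing on hard lines.

```python
import math

def lanczos(x, a=3):
    """Lanczos-a kernel (windowed sinc), as used by Lanczos3 resizing."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def triangle(x):
    """The bilinear interpolation kernel."""
    return max(0.0, 1.0 - abs(x))

taps = [i / 10 for i in range(-30, 31)]
# Negative lobes -> over/undershoot at edges: the "sharp"/ringing look
assert any(lanczos(t) < 0 for t in taps)
# Non-negative everywhere -> pure weighted averaging: the soft look
assert all(triangle(t) >= 0 for t in taps)
```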
2007-10-09, 16:55 | Link #78 |
Two bit encoder
Fansubber
Join Date: Jan 2006
Location: Chesterfield, UK
Age: 39
|
Here's an idea, how about labelling it as an SD upscale?
It's true that the common use of "HD" is nothing more than a set of resolutions, but it's also touted as a major step up in quality. If you ask someone what HD means to them, they will likely tell you something along the lines of "much improved quality". While you can upscale and sharpen all you want, it's still a far cry from a native HD source, and I sometimes wonder if what you do is counterproductive - e.g. people leeching a "HD" episode (which is actually an upscale) and thinking "this isn't the major step up in quality everyone talked about".

Also for your consideration: where you are encoding (or were encoding) 1024x768 fansubs, how about using 960x720 instead? I'll assume you use that resolution because you like HD, but on a 720p display it will cause rescaling anyway, which kind of defeats the objective of what you do. With 960x720 it should display 1:1, and if the end user wanted, it's easy enough to add borders to pad it out to 1280x720.
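The 960x720 suggestion checks out arithmetically - a quick sketch of the numbers (assuming a plain 1280x720 panel with no overscan):

```python
PANEL_W, PANEL_H = 1280, 720    # a typical 720p display

def displays_1to1(w, h):
    """A frame maps pixel-for-pixel iff it fits the panel on both axes."""
    return w <= PANEL_W and h <= PANEL_H

assert not displays_1to1(1024, 768)       # 768 lines must be rescaled to 720
assert displays_1to1(960, 720)            # shows 1:1 on a 720p panel
assert 960 / 720 == 1024 / 768 == 4 / 3   # identical 4:3 aspect ratio
print((PANEL_W - 960) // 2)               # 160px of padding per side -> 1280x720
```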
|
2007-10-09, 17:11 | Link #79 |
Banned
Join Date: Nov 2003
Location: Hamburg
Age: 54
|
Well - to be honest, to me, an "SD upscale" would be a (bad) upscale of a native SD source to HD resolutions. In other words, the abominations that gave "upscale" such a bad name.
So, we should go with something else. Rather "HD upscale" (HD mastered, downscaled, and eventually upscaled again) or MHD. Dunno. Quote:
Quote:
|
2007-10-09, 17:13 | Link #80 |
Banned
Join Date: Nov 2003
Location: Hamburg
Age: 54
|
Quote:
|