AnimeSuki Forums

Old 2011-01-28, 19:55   Link #61
Simon
気持ち悪い
 
 
Join Date: Dec 2005
Location: New Zealand
Quote:
Originally Posted by TheFluff View Post
Whether you have a modern card or not is completely irrelevant. All that matters is if you have a video decoder ASIC that supports h264, or if you do not. What kind of GPU you have is completely unrelated to this topic, since video decoding is not done by the GPU, and there are only like four or five different video decoder ASIC models out there, so it's not like you have a lot of variation.
OK, I'm confused: so when Nvidia talk about GPUs supporting different hardware acceleration features, they actually mean that the GPUs work with different ASICs? Is the decode ASIC sold with the GPU as a chipset then, or do Nvidia just tell OEMs what to use?

Or am I wrong to think of the ASIC as a discrete device, in that it's physically a block on the GPU die but functionally distinct?

Quote:
Haali's renderer is a video renderer only.
What's the reason to prefer it over say EVR? More reliable? More stable? More efficient?

Quote:
Anyway, the moral of this story is that hardware accelerated video decoding is a toy. It's useful in small plastic toys like cellphones and set-top multimedia boxes, but in a real computer where you have a real CPU at your disposal, it's fucking retarded to limit yourself to the performance of a small, slow and dumb 400MHz ASIC tacked on to your graphics card.
What about laptop battery life? Again using Nvidia as an example, they claim hardware-accelerated decoding chews significantly less power in total. Is this just marketing BS?
__________________
And what rough beast, its hour come round at last,
Slouches towards Akihabara to be born?

Last edited by Simon; 2011-01-28 at 20:09. Reason: Accuracy doesn't apply to video renderers. duh
Simon is offline  
Old 2011-01-28, 21:47   Link #62
TheFluff
Excessively jovial fellow
 
 
Join Date: Dec 2005
Location: ISDB-T
Age: 37
Quote:
Originally Posted by Simon View Post
OK, I'm confused: so when Nvidia talk about GPUs supporting different hardware acceleration features, they actually mean that the GPUs work with different ASICs? Is the decode ASIC sold with the GPU as a chipset then, or do Nvidia just tell OEMs what to use?
Almost everything on that chart isn't actually decoding, but other video related things like deinterlacing, sharpening and other filtering, which IS done on the GPU and not by a separate chip.

Quote:
Originally Posted by Simon View Post
Or am I wrong to think of the ASIC as a discrete device, in that it's physically a block on the GPU die but functionally distinct?
It's a separate chip on the graphics card. It's not removable, but it's not part of the GPU either. It could just as well go on the motherboard, but then you'd have to transfer decoded pictures back and forth between main memory and VRAM in order to do processing on them, and that's usually slow.

Note that when I said these things are small and cheap, I meant it. They usually cost less than $5 USD each when you buy in bulk, they only require a few megabytes of memory (the maximum required decoded picture buffer for h264 level 4.1 is just ~12 megabytes big, IIRC) and they don't require active cooling.
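In case anyone wants to sanity-check that figure, the arithmetic is straightforward: Level 4.1 allows a DPB of MaxDpbMbs = 32768 macroblocks, which at 1080p works out to 4 frames. A rough sketch (assuming 8-bit 4:2:0 and ignoring whatever alignment overhead a given chip adds):

Code:
#include <stdio.h>

/* Rough sanity check of the H.264 Level 4.1 decoded picture buffer size
 * for a 1080p stream. Assumes 8-bit 4:2:0 (1.5 bytes per pixel) and
 * ignores per-chip alignment/overhead. */
int main(void)
{
    const int max_dpb_mbs = 32768;          /* MaxDpbMbs for Level 4.1 */
    const int width = 1920, height = 1088;  /* 1080p rounds up to 68 macroblock rows */
    const int frame_mbs = (width / 16) * (height / 16);  /* 8160 macroblocks */

    int dpb_frames = max_dpb_mbs / frame_mbs;  /* 4 frames at this picture size */
    if (dpb_frames > 16)
        dpb_frames = 16;                       /* the spec caps the DPB at 16 frames */

    double frame_bytes = width * height * 1.5; /* 4:2:0, 8 bits per sample */
    printf("DPB: %d frames, ~%.1f MB\n",
           dpb_frames, dpb_frames * frame_bytes / (1024 * 1024));
    return 0;
}

which prints "DPB: 4 frames, ~12.0 MB".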

Quote:
Originally Posted by Simon View Post
What's the reason to prefer it over say EVR? More reliable? More stable? More efficient?
I'm probably just prejudiced since I used to do a lot of playback support when the EVR was new, but it used to be pretty buggy with certain drivers. Haali's renderer is more stable and more configurable, in my experience. Quality-wise they should be about on par.

Quote:
Originally Posted by Simon View Post
What about laptop battery life? Again using Nvidia as an example, they claim hardware-accelerated decoding chews significantly less power in total. Is this just marketing BS?
No, that's true; using a dedicated decoder chip that only does one thing is a lot more power efficient than using a general-purpose CPU. That's why it's useful in the aforementioned small plastic toys like cellphones, iPads, netbooks, etc etc. Of course, if you want to watch on a small plastic toy, you have to deal with small plastic toy video quality as well.

When high bitdepth encoding starts to become popular later this year (x264 can already encode it, and a matching decoder for FFmpeg is under development) I will laugh and laugh and laugh at all the poor fools who built media PCs with slow CPUs and relied on the graphics card to do the job, because suddenly their prized hardware acceleration will be completely unable to help them.
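If you want to check ahead of time whether a given file is one of those high-bitdepth encodes (and therefore a lost cause for the decoder ASIC), something along these lines does it with a recent libavformat. A minimal sketch only, using current FFmpeg API names; error handling mostly omitted:

Code:
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavutil/pixdesc.h>

/* Print the profile and pixel format of the first video stream, so you can
 * spot Hi10P (profile 110, yuv420p10le) before expecting a fixed-function
 * decoder to cope with it. */
int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    if (argc < 2 || avformat_open_input(&fmt, argv[1], NULL, NULL) < 0) {
        fprintf(stderr, "usage: %s file.mkv\n", argv[0]);
        return 1;
    }
    avformat_find_stream_info(fmt, NULL);

    int idx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (idx >= 0) {
        const AVCodecParameters *par = fmt->streams[idx]->codecpar;
        const char *pix = av_get_pix_fmt_name(par->format);
        printf("profile %d%s, pix_fmt %s\n",
               par->profile,
               par->profile == 110 ? " (High 10)" : "",
               pix ? pix : "unknown");
    }
    avformat_close_input(&fmt);
    return 0;
}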
__________________
| ffmpegsource
17:43:13 <~deculture> Also, TheFluff, you are so fucking slowpoke.jpg that people think we dropped the DVD's.
17:43:16 <~deculture> nice job, fag!

01:04:41 < Plorkyeran> it was annoying to typeset so it should be annoying to read

Last edited by TheFluff; 2011-01-28 at 21:59.
TheFluff is offline  
Old 2011-01-28, 23:23   Link #63
sneaker
Senior Member
 
Join Date: Dec 2008
Luckily even cheap low-power CPUs like the Core i3 have no problem handling HD content. But yeah, I wouldn't rely on any graphics card myself. Even though they can handle all current encodes, you never know what the future will bring (like the aforementioned 10-bit encoding). And you don't have to mess around with buggy drivers and decoders - it simply works.

Last edited by sneaker; 2011-01-28 at 23:36.
sneaker is offline  
Old 2011-02-04, 13:31   Link #64
tyranuus
Team Spice and Wolf UK
 
 
Join Date: Feb 2010
Location: England
Age: 36
Quote:
Originally Posted by TheFluff View Post
Whether you have a modern card or not is completely irrelevant. All that matters is if you have a video decoder ASIC that supports h264, or if you do not. What kind of GPU you have is completely unrelated to this topic, since video decoding is not done by the GPU, and there are only like four or five different video decoder ASIC models out there, so it's not like you have a lot of variation. Their performance is very similar as well; the only one that is notable at all is that one model that got used on some of NVidia's cards that has some bug that makes it unable to decode certain h264 resolutions.
Whilst what you're saying is to some extent true, you have the simple fact that older cards tend to support a smaller range of files (I'm not just talking x264 here). Newer cards tend to support a wider range of files, making them better suited to use inside a media PC. I've seen varied levels of DXVA acceleration between cards during testing, so trying to argue there is no difference between new cards and older card generations seems a little fundamentally flawed (and yes, I'm talking h264 MKV here). Newer ASICs sometimes have bugfixes compared to older-generation devices, which you highlight yourself. You're also ignoring the possibility of using hardware-accelerated filters on the video to try and improve the picture.

As an example, an HTPC review I read a little while ago flagged up that whilst all current ATI 5*** series cards could decode 1080p with no issues, even with an additional hardware denoise filter on top, on the Nvidia side the GT210 wasn't actually strong enough; you needed a GT220 or above.

I might be wrong here, as this isn't my 'area', but additionally, a quick flick around Wiki (apologies, I know it's not a great source) and the web would suggest there are quite a few more than 4-5 different chip designs out there capable of decoding h.264 and available in ASIC form for integration into larger devices.


Quote:
You cannot get them to decode faster, nor can you get them to decode bigger images or things outside level 4.1. It's fairly easy to create encodes that are incompatible with hardware decoding chips; some fansub groups do it intentionally.
When you have hardware manufacturers claiming hardware-accelerated L5.1, that points to a flaw in your argument. How well or how exactly it works is another side of the coin, but the simple fact remains that there are players and cards capable of leveraging hardware decoding against L5.1. ATI, as an example, have been claiming L5.1 hardware support since early 2010.

Quote:
Haali's renderer is a video renderer only. As in, it only deals with video. Additionally, it only deals with rendering uncompressed video, not decoding it, so it most certainly does not have issues with "some files and input formats", and even more certainly it doesn't have issues with audio.

(what the hell is a "HD audio codec" anyway)
Admittedly my bad there, I was referencing Haali's splitter, which does, or did, have some known issues with high-definition audio codecs. I tend to think EVR will still remain the most popular renderer, however.
Seeing as you seem fairly clued up, I'm actually kinda surprised you had to ask what I meant by HD audio though, seeing as it's one of the more common ways of referring to DTS-MA/Dolby TrueHD. An increasing number of people are running MKVs with embedded DTS-MA/TrueHD ripped straight from the Blu-ray, especially as it's now so easy to do so, and space is no longer a constraint like it used to be.

As an example, people mention some of the issues here whilst discussing other issues:
http://www.avsforum.com/avs-vb/showthread.php?t=1247893

Quote:
Anyway, the moral of this story is that hardware accelerated video decoding is a toy. It's useful in small plastic toys like cellphones and set-top multimedia boxes, but in a real computer where you have a real CPU at your disposal, it's fucking retarded to limit yourself to the performance of a small, slow and dumb 400MHz ASIC tacked on to your graphics card. Whoops, that 15mb decoding buffer needed? Too big for the poor widdle chip. Welp, no video for you.
I still think you're being overly hardheaded. I concede software decoding is much more flexible, but you give the impression that hardware decoding is pointless/useless, whereas in reality hardware decoding has made HD video playback a real possibility on a wide range of machines which haven't a hope in hell of playing it smoothly via the CPU.
A HUGE number of people out there have machines, even relatively modern ones (lower Core2 or equivalent X2, as an example), that couldn't handle decent HD rips well, especially not 1080p ones, without dropping frames. We're not just talking toys here, but a wide range of otherwise perfectly reasonable desktops and laptops that simply don't have enough raw grunt to deal with 1080p effectively. If the machine is suitable for its other intended uses (most likely internet and office work), then what makes more sense - overhauling the system for several hundred pounds/dollars, or fitting a cheap video card with a built-in decoder that'll do the work for you? Even with limitations, I don't call that a joke, I call it progress.

Additionally, a weaker CPU and a capable decoding card can in many cases be run passively or with low-speed fans quite successfully, whereas a more powerful machine tends to produce more heat and will usually require more potent cooling.
Either way you look at it, there are some real benefits to hardware decoding, and stating they are only useful for mobile devices seems rather shortsighted; people's machines and their usage patterns or design aims can be as individual as the user. Personally, I tend to use relatively high-end PCs or laptops, and even then, being able to decode video with minimal heat and noise generated whilst doing so is still pretty appealing!
__________________
Total Anime watched= Enough. What can I say? I'm a convert...
***
PRAY FOR SPICE AND WOLF III and faster Yenpress novel releases!
Reading: None at the moment

Last edited by tyranuus; 2011-02-04 at 14:13.
tyranuus is offline  
Old 2011-02-05, 04:48   Link #65
TheFluff
Excessively jovial fellow
 
 
Join Date: Dec 2005
Location: ISDB-T
Age: 37
Quote:
Originally Posted by tyranuus View Post
Whilst what you're saying is to some extent true, you have the simple fact that older cards tend to support a smaller range of files (I'm not just talking x264 here). Newer cards tend to support a wider range of files, making them better suited to use inside a media PC. I've seen varied levels of DXVA acceleration between cards during testing, so trying to argue there is no difference between new cards and older card generations seems a little fundamentally flawed (and yes, I'm talking h264 MKV here). Newer ASICs sometimes have bugfixes compared to older-generation devices, which you highlight yourself. You're also ignoring the possibility of using hardware-accelerated filters on the video to try and improve the picture.

As an example, an HTPC review I read a little while ago flagged up that whilst all current ATI 5*** series cards could decode 1080p with no issues, even with an additional hardware denoise filter on top, on the Nvidia side the GT210 wasn't actually strong enough; you needed a GT220 or above.
I kept trying to read this part of your post but you go on about how you think using hardware accelerated video filtering is a good idea and using the word "whilst" so I can't really take it seriously, sorry.

Quote:
Originally Posted by tyranuus View Post
I might be wrong here, as this isn't my 'area', but additionally, a quick flick around Wiki (apologies, I know it's not a great source) and the web would suggest there are quite a few more than 4-5 different chip designs out there capable of decoding h.264 and available in ASIC form for integration into larger devices.
I have no idea about ATI (since lol ATI and lol ATI drivers) but nVidia have exactly three (3) different decoder chips that support H.264, across their entire range of products. See appendix A in the VDPAU documentation.

A quick look at ATI's UVD documentation hints that the situation is similar there as well; there seem to be three different feature sets (i.e. three different generations of decoder chips).
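For the curious, VDPAU will actually tell you the limits of whatever decoder ASIC you have, per profile. A minimal query sketch against libvdpau's X11 interface, error handling mostly skipped, looks roughly like this:

Code:
#include <stdio.h>
#include <stdint.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

/* Ask the VDPAU driver what the decoder ASIC can do for H.264 High profile:
 * max level, max macroblocks per picture, and max width/height. */
int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    VdpDevice dev;
    VdpGetProcAddress *get_proc;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc) != VDP_STATUS_OK)
        return 1;

    VdpDecoderQueryCapabilities *query;
    get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

    VdpBool supported;
    uint32_t max_level, max_mbs, max_w, max_h;
    if (query(dev, VDP_DECODER_PROFILE_H264_HIGH, &supported,
              &max_level, &max_mbs, &max_w, &max_h) == VDP_STATUS_OK && supported)
        printf("H.264 High: max level %u, %u macroblocks, %ux%u\n",
               max_level, max_mbs, max_w, max_h);
    else
        printf("H.264 High profile not supported\n");
    return 0;
}

(vdpauinfo prints the same information for every profile, if you'd rather not compile anything.)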

Quote:
Originally Posted by tyranuus View Post
When you have hardware manufacturers claiming hardware-accelerated L5.1, that points to a flaw in your argument. How well or how exactly it works is another side of the coin, but the simple fact remains that there are players and cards capable of leveraging hardware decoding against L5.1. ATI, as an example, have been claiming L5.1 hardware support since early 2010.
Where are they claiming that? It's bullshit. What they do support (with certain driver versions) is a bigger DPB, so you can use up to 16 reference frames at 1080p (nVidia supports this too), but just like nVidia they have a hard limit of max 2048 pixels in either direction and max 8192 macroblocks for the entire image. There is also a hard limit on the decoding speed (more specifically ~60fps at 1080p).
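Put those limits in code form and it's pretty clear what does and doesn't fit (a rough illustrative check only; the exact numbers vary a little between chip generations):

Code:
#include <stdio.h>

/* Rough check of whether an encode fits a typical fixed-function H.264
 * decoder as described above: <= 2048 pixels per dimension, <= 8192
 * macroblocks per picture, and a DPB of at most 16 reference frames. */
static int fits_hw_decoder(int width, int height, int ref_frames)
{
    int mbs = ((width + 15) / 16) * ((height + 15) / 16);
    return width <= 2048 && height <= 2048 && mbs <= 8192 && ref_frames <= 16;
}

int main(void)
{
    printf("1920x1080, 16 refs: %s\n", fits_hw_decoder(1920, 1080, 16) ? "ok" : "no");
    printf("2560x1440,  4 refs: %s\n", fits_hw_decoder(2560, 1440, 4) ? "ok" : "no");
    return 0;
}

1080p with 16 reference frames passes; anything over 2048 pixels in either direction fails no matter how few reference frames it uses.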

Quote:
Originally Posted by tyranuus View Post
Seeing as you seem fairly clued up, I'm actually kinda surprised you had to ask what I meant by HD audio though, seeing as it's one of the more common ways of referring to DTS-MA/Dolby TrueHD. An increasing number of people are running MKVs with embedded DTS-MA/TrueHD ripped straight from the Blu-ray, especially as it's now so easy to do so, and space is no longer a constraint like it used to be.

As an example, people mention some of the issues here whilst discussing other issues:
http://www.avsforum.com/avs-vb/showthread.php?t=1247893
You seem very confused. "HD audio" is audio with a bitdepth over 16 bits per sample and/or a sampling rate over 48kHz, much like HD video is video with a resolution higher than 720x576 (or so). It does not require any specific codec. Using DTS-HD or Dolby's TrueHD is generally a bad idea anyway since they compress considerably worse than FLAC, and all three are lossless anyway so it's not like converting between them makes a difference in quality. I don't see what this has to do with the discussion about DXVA, however.

Quote:
Originally Posted by tyranuus View Post
I still think you're being overly hardheaded. I concede software decoding is much more flexible, but you give the impression that hardware decoding is pointless/useless, whereas in reality hardware decoding has made HD video playback a real possibility on a wide range of machines which haven't a hope in hell of playing it smoothly via the CPU.
A HUGE number of people out there have machines, even relatively modern ones (lower Core2 or equivalent X2, as an example), that couldn't handle decent HD rips well, especially not 1080p ones, without dropping frames. We're not just talking toys here, but a wide range of otherwise perfectly reasonable desktops and laptops that simply don't have enough raw grunt to deal with 1080p effectively. If the machine is suitable for its other intended uses (most likely internet and office work), then what makes more sense - overhauling the system for several hundred pounds/dollars, or fitting a cheap video card with a built-in decoder that'll do the work for you? Even with limitations, I don't call that a joke, I call it progress.
That sure is a lot of words, but if you cannot decode 1080p h264 in software on a Core2Duo (no, it doesn't matter which Core2Duo) you are doing something wrong. Even my old Athlon X2 that I bought in ~2005 or so could play 1080p with software decoding. In fact, any multicore CPU bought after early 2007 or so should be able to decode 1080p in software (toy CPUs like Intel Atoms not included).

Also, you keep using the word "whilst". Just for your information, using that in a forum post makes you look like a pretentious faggot. Hope this helps.
__________________
| ffmpegsource
17:43:13 <~deculture> Also, TheFluff, you are so fucking slowpoke.jpg that people think we dropped the DVD's.
17:43:16 <~deculture> nice job, fag!

01:04:41 < Plorkyeran> it was annoying to typeset so it should be annoying to read
TheFluff is offline  
Old 2011-02-05, 12:51   Link #66
tyranuus
Team Spice and Wolf UK
 
 
Join Date: Feb 2010
Location: England
Age: 36
Quote:
Originally Posted by TheFluff View Post
I kept trying to read this part of your post but you go on about how you think using hardware accelerated video filtering is a good idea and using the word "whilst" so I can't really take it seriously, sorry
I'm sorry, but if you're that hung up on the word 'whilst'...it does render this discussion a little pointless.
At the end of the day, my personal conviction is that if something does the job in hardware with reduced power consumption and heat, acceptably for most purposes, then it's a step forward. Everything with PCs has always been a trade-off between the factors of heat, power and noise (usually in that order). This is simply another step along that road.

There are always going to be exceptions where something different is needed, or something a little bit more... flexible, but I'm surprised you don't seem to see benefits from being able to offload decoding and filtering to hardware even on a normal PC. Most people I know want a quiet (and cheap) PC that performs reasonably well for the tasks at hand, regardless of what's going on; hardware decoding works towards that goal, which is why everyone seems to be adding it. It makes a bigger difference on phones/ULV laptops/netbooks/AIO boxes, but there is still a fair place for it in a standard PC.


Quote:
I have no idea about ATI (since lol ATI and lol ATI drivers) but nVidia have exactly three (3) different decoder chips that support H.264, across their entire range of products. See appendix A in the VDPAU documentation.

A quick look at ATI's UVD documentation hints that the situation is similar there as well; there seem to be three different feature sets (i.e. three different generations of decoder chips).
If you only meant between card generations then yes, you're correct, that's about where things stand. From the way you'd phrased your response previously I'd thought you meant in general, rather than purely in terms of GPU add-on ASICs.
I believe ATI may have gone through a couple of sub-revisions of UVD, suggesting revisions of the ASICs, but either way that's not really that important. In terms of drivers they're both pretty lol at points now; Nvidia certainly seem to have nosedived a bit. Then again, I am a little biased after massive Nvidia driver-caused DPC latency issues on a recent platform I used, which were an absolute nightmare.

Quote:
Where are they claiming that? It's bullshit. What they do support (with certain driver versions) is a bigger DPB, so you can use up to 16 reference frames at 1080p (nVidia supports this too), but just like nVidia they have a hard limit of max 2048 pixels in either direction and max 8192 macroblocks for the entire image. There is also a hard limit on the decoding speed (more specifically ~60fps at 1080p).
Try as an example:
http://blogs.amd.com/play/2010/04/28...E2%80%99s-new/
Multiple players (at the least, I recall it being mentioned in the release notes for MPC-HC and VLC) support ATI's acceleration.
Note also the particular stipulation that they are no longer limited to a hard limit of 2048, but support 4kx2k.

Now whether it works properly I don't know, I've already said that, but it does appear to disagree with your statement.

Quote:
You seem very confused. "HD audio" is audio with a bitdepth over 16 bits per sample and/or a sampling rate over 48kHz, much like HD video is video with a resolution higher than 720x576 (or so). It does not require any specific codec. Using DTS-HD or Dolby's TrueHD is generally a bad idea anyway since they compress considerably worse than FLAC, and all three are lossless anyway so it's not like converting between them makes a difference in quality. I don't see what this has to do with the discussion about DXVA, however.
HD audio can refer to both. In audiophile/music production terms, yes, it does refer to the likes of 24-bit or 96kHz/192kHz and the like, but in general usage it also refers to the mass-market lossless audio codecs (DTS-MA/TrueHD, and occasionally the label gets stuck on raw PCM as well). This is not my choice of naming scheme, but simply the one in general use right now.
The difference in file size between a FLAC encode and a DHD encode often actually isn't that big, but FLAC is usually a fair bit more efficient than DTS-MA. Still, with space at a lower premium than it used to be, sometimes time/ease does come up.

You're right though; although the audio format is still relevant to playback in general terms (especially for those who want to bitstream), this section all sprang up from the misunderstanding about which part of Haali's you were referring to, rather than anything to do with DXVA, so apologies for sidetracking there!

Quote:
That sure is a lot of words, but if you cannot decode 1080p h264 in software on a Core2Duo (no, it doesn't matter which Core2Duo) you are doing something wrong. Even my old Athlon X2 that I bought in ~2005 or so could play 1080p with software decoding. In fact, any multicore CPU bought after early 2007 or so should be able to decode 1080p in software (toy CPUs like Intel Atoms not included).
The sheer number of people out there I've seen commenting on audio desyncs, stuttering, frame drops or other general complaints using X2s, early lower-end C2 chips or laptop chips (Atom excluded) suggests that's not always the case, especially given the vast variations in bitrate, encode quality and settings out there.

You've got more efficient decoders out there like CoreAVC which will run most stuff on just about anything modern, but the most common software decoder is going to be FFDShow, and that does sometimes struggle to decode decent-quality rips at 1080p on older CPUs. CoreAVC'll do it, but again, in the scheme of things not many people have that.
Most people want to whack a codec pack/ffdshow on and use WMP, or perhaps use MPC-HC, and have it work fine from the word go. Most machines will have some background tasks running, be it MSN, AV, whatever, meaning that 100% of the CPU won't always be available. You could argue that not having an exceptionally clean PC and the fastest software decoder in this situation is doing it wrong, but it is a reasonable representation of the average PC.

If people are doing things like running subs or even karaoke subs, increasing CPU usage elsewhere, then that's even more apparent (and given we're talking on an anime forum here, that IS relevant - it's not the video decoding itself, but it's part and parcel of the viewing experience). In this sort of 'average joe' situation, with his average PC, offloading the decoding makes perfect sense. Most videos out there will play properly, it removes a lot of the dependency on the host system, and it frees up more resources for other tasks the system may or may not be handling.
As an example, despite the fact my machines are MORE than capable of software decoding (well, bar my download box), I tend to use hardware acceleration for the aforementioned heat and power reasons, and also because it means I can watch a video whilst doing whatever else I want at the time with virtually no impact. Kinda nice to be able to watch a video and convert files or run an encode in the background with virtually no impact on the time taken. Makes waiting a lot more tolerable.


Quote:
Also, you keep using the word "whilst". Just for your information, using that in a forum post makes you look like a pretentious faggot. Hope this helps.

Heh, not the intention, just trying to have a realistic discussion on the subject. I wouldn't say using the word TWICE is a massive overuse though, not like I used it in every sentence!
__________________
Total Anime watched= Enough. What can I say? I'm a convert...
***
PRAY FOR SPICE AND WOLF III and faster Yenpress novel releases!
Reading: None at the moment

Last edited by tyranuus; 2011-02-05 at 15:47.
tyranuus is offline  
Old 2011-02-05, 14:45   Link #67
sneaker
Senior Member
 
Join Date: Dec 2008
Most (all?) people having problems with older dual cores and 1080p simply don't use a multi-threaded decoder.
sneaker is offline  
Old 2011-02-05, 15:17   Link #68
tyranuus
Team Spice and Wolf UK
 
 
Join Date: Feb 2010
Location: England
Age: 36
I know FFDShow is multithreaded out of the box, but I've got a feeling FFmpeg itself actually isn't, surprisingly (which has a knock-on effect). Could be wrong here, but I remember a massive discussion on XBMC and other related forums about increased system requirements during the shift between the Camelot and Dharma builds, which the devs highlighted as being down to the fact that they could no longer use a multi-threading patch that had been used in Camelot with the current version of the decoder. [If I'm wrong there, don't shoot me Fluff ]
Subtitles at higher res with animation also seem to be surprisingly CPU-heavy sometimes; one of my friends was trying to run 720p on an old single-core 3700+ and couldn't work out what was killing the system with CoreAVC in use. Turned out the subs were sapping almost 50% of the CPU at times.

I'll actively admit that's really moving outside the bounds of stuff I know anything about though, as generally I can't stand coding, Linux and related.
__________________
Total Anime watched= Enough. What can I say? I'm a convert...
***
PRAY FOR SPICE AND WOLF III and faster Yenpress novel releases!
Reading: None at the moment

Last edited by tyranuus; 2011-02-05 at 15:43.
tyranuus is offline  
Old 2011-02-05, 15:22   Link #69
sneaker
Senior Member
 
Join Date: Dec 2008
The ffdshow versions I used were pre-configured to single-threaded only, and as far as I know the multi-threaded ffmpeg h.264 decoder is still a patch (ffmpeg-mt) and not in the main branch. (Not sure though.)
VLC also didn't use multiple threads (or at least not frame-based multi-threading) until sometime last year. I don't know when multi-threading was introduced to MPC-HC, or whether it depends on who builds the binary.
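For reference, "using a multi-threaded decoder" just means the threading fields get set before the codec is opened. With a libavcodec build that has threading (ffmpeg-mt back then, mainline nowadays), it looks roughly like this - a minimal sketch using current API names, not any particular player's code:

Code:
#include <libavcodec/avcodec.h>

/* Open an H.264 software decoder with frame/slice threading enabled.
 * thread_count = 0 lets libavcodec pick one thread per CPU core. */
AVCodecContext *open_threaded_h264(void)
{
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!codec)
        return NULL;

    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return NULL;

    ctx->thread_count = 0;
    ctx->thread_type  = FF_THREAD_FRAME | FF_THREAD_SLICE;

    if (avcodec_open2(ctx, codec, NULL) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }
    return ctx;  /* feed it packets with avcodec_send_packet() / avcodec_receive_frame() */
}

A player that opens the decoder single-threaded leaves the second core mostly idle, which is exactly the difference people see on these older dual cores.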
sneaker is offline  
Old 2011-02-05, 22:02   Link #70
cyberbeing
Senior Member
 
 
Join Date: May 2006
Location: California
For a long time now, all builds of FFDshow Tryouts have had both the single-threaded libavcodec and the multi-threaded ffmpeg-mt selectable for h.264 decoding. MPC-HC also uses ffmpeg-mt for its built-in h.264 decoder.

An AMD X2 at 2GHz+ should be able to decode 1080p h.264 just fine with any multi-threaded decoder. I would know, as I still have a computer using a 2.4GHz AMD X2 4800+ (socket 939) from 2005. You're looking at around 40-50% CPU during low-motion and 60-70% CPU during high-motion scenes when playing 1080p at ~20Mbps average with 5.1 audio and basic softsubs, using FFDshow with ffmpeg-mt.
__________________
cyberbeing is offline  
Old 2011-02-06, 05:22   Link #71
TheFluff
Excessively jovial fellow
 
 
Join Date: Dec 2005
Location: ISDB-T
Age: 37
Quote:
Originally Posted by tyranuus View Post
The sheer number of people out there I've seen commenting on audio desyncs, stuttering, frame drops or other general complaints using X2s, early lower-end C2 chips or laptop chips (Atom excluded) suggests that's not always the case, especially given the vast variations in bitrate, encode quality and settings out there.
That is neither the fault of the release nor of the decoder, since these people usually have a million spyware programs and three different conflicting antivirus suites installed. As I said, they are doing it wrong.
__________________
| ffmpegsource
17:43:13 <~deculture> Also, TheFluff, you are so fucking slowpoke.jpg that people think we dropped the DVD's.
17:43:16 <~deculture> nice job, fag!

01:04:41 < Plorkyeran> it was annoying to typeset so it should be annoying to read
TheFluff is offline  
Old 2011-02-06, 16:34   Link #72
PositronCannon
Senior Member
*Fansubber
 
 
Join Date: May 2010
Location: Spain
Age: 33
Eh... I haven't tried a lot of 1080p (I don't even have a use for anything above 480p since I watch everything at that res anyway), but the few 1080p vids I've watched did struggle with certain scenes. This is on a 1.6GHz Core2Duo with CoreAVC, a clean system, and not much of anything beyond core system processes running in the background. Hell, I've seen certain scenes in 720p bringing MPC-HC close to 85% CPU usage already.
PositronCannon is offline  
Old 2011-02-06, 23:42   Link #73
cyberbeing
Senior Member
 
 
Join Date: May 2006
Location: California
Since the AMD X2 desktop processors perform similarly to the Core 2 Duo mobile processors, it's the same deal where you would want 2GHz+ - with laptop limitations in general, 2.4GHz to be really safe. This of course doesn't apply to the Core i3/i5/i7 mobile processors released recently.

PositronCannon, if your laptop doesn't have a 1920x1080 screen, there would be no point in watching 1080p anyway. Just limit yourself to 720p, which you seem to be able to play back just fine, and which is likely a close match to your screen resolution.
__________________

Last edited by cyberbeing; 2011-02-07 at 00:03.
cyberbeing is offline  
Old 2011-02-07, 01:37   Link #74
PositronCannon
Senior Member
*Fansubber
 
 
Join Date: May 2010
Location: Spain
Age: 33
Quote:
Originally Posted by cyberbeing View Post
PositronCannon, if your laptop doesn't have a 1920x1080 screen, there would be no point in watching 1080p anyway. Just limit yourself to 720p, which you seem to be able to play back just fine, and which is likely a close match to your screen resolution.
Of course. I was just saying that this "any Core2Duo will play 1080p fine" claim seems a bit exaggerated to me. Unless mobile models aren't included in that claim.
PositronCannon is offline  
Old 2011-02-07, 09:08   Link #75
namaiki
Senior Member
 
Join Date: Dec 2009
Location: Sydney, Australia
Quote:
Originally Posted by PositronCannon View Post
Hell, I've seen certain scenes in 720p bringing MPC-HC close to 85% CPU usage already.
Did you test with the CPU locked at full clock? (High Performance power profile) Otherwise the CPU will downclock, which can skew readings.

My laptop with a Core 2 Duo L7500 @ 1.6GHz hasn't seen any lag in 1080p video that wasn't related to subtitles (cough DirectVobSub cough), but I don't have/play much that is 1080p.

Last edited by namaiki; 2011-02-07 at 09:37.
namaiki is offline  
Old 2011-02-07, 09:44   Link #76
PositronCannon
Senior Member
*Fansubber
 
 
Join Date: May 2010
Location: Spain
Age: 33
Yeah, I've always had it at 100% minimum. Still, I think it was probably just some particular encode(s). Generally it stays around 50% with occasional peaks at maybe 70%, certainly more than enough to play anything 720p without an issue. Even before using CoreAVC I still barely had any noticeable frame drops.
PositronCannon is offline  
Old 2011-02-07, 09:46   Link #77
namaiki
Senior Member
 
Join Date: Dec 2009
Location: Sydney, Australia
Quote:
Originally Posted by PositronCannon View Post
Yeah, I've always had it at 100% minimum.
I'd still let it downclock when not benching. Anyway, I'll stop wasting your time.
namaiki is offline  
Old 2011-02-07, 09:52   Link #78
PositronCannon
Senior Member
*Fansubber
 
 
Join Date: May 2010
Location: Spain
Age: 33
...that probably is a good idea, lol.

And yeah, every 1080p I've tried had subtitles, so it was probably that.
PositronCannon is offline  
Old 2011-02-07, 09:56   Link #79
PositronCannon
Senior Member
*Fansubber
 
 
Join Date: May 2010
Location: Spain
Age: 33
...that was in response to the downclocking suggestion. xD I have plenty of time to "waste" anyway.

Anyway, nuff spam on my part. :V
PositronCannon is offline  
Old 2011-02-16, 05:15   Link #80
NvIon
Junior Member
 
Join Date: Aug 2009
http://www.anandtech.com/show/4181/n...ts-this-year/3

Quote:
NVIDIA's video decoder gets an upgrade in Kal-El to support H.264 at 40Mbps sustained (60Mbps peak) at a resolution of 2560 x 1440. This meets the bandwidth requirements for full Blu-ray disc playback. NVIDIA didn't just make the claim however, it showed us a 50Mbps 1440p H.264 stream decoded and output to two screens simultaneously: a 2560 x 1600 30" desktop PC monitor and a 1366 x 768 tablet display.
Looks like I'll be able to play all H.264 High Profile videos, even L5.1 encodes with 16 reference frames, off my smartphone/tablet device without transcoding/re-encoding ever again, on a very power-efficient SoC that uses no more than 1 watt of power.

The future is so great, while so many Luddites are still stuck in the past with CPU decoding, wasting power all the time - using 35/65/95 watts to decode.
NvIon is offline  
 
