AnimeSuki Forums


AnimeSuki Forum > AnimeSuki & Technology > Tech Support

Old 2007-07-26, 21:55   Link #41
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
Quote:
Originally Posted by WanderingKnight View Post
But! It doesn't mean they'll make them open source, which can have its significant drawbacks. I hope there's a sane response from ATI and they do what they should have done a lot of time ago (open source them).
Hee hee, they (ATI and Nvidia) are so paranoid about each other that I can't see them open sourcing their drivers, which is a real shame. I can still remember all the crap they both pulled over the benchmarking of their products.

ATI must surely realise they need to do something about the open-source OS market. I mean, have they seen how hard it is to get Beryl working with an ATI card?
__________________
grey_moon is offline   Reply With Quote
Old 2007-07-26, 22:02   Link #42
hobbes_fan
You could say.....
 
 
Join Date: Apr 2007
Good to hear. I really want to buy an HD2400XT for my HTPC, but so far there's nothing for Linux users. Nvidia, by contrast, already had everything together for the high-definition 8xxx series within weeks of release, and the drivers were stable.

I can never understand why ATI is so bad at this. Considering they and their owner AMD are lagging behind their competition (Intel and Nvidia), you'd think they'd be racing to make their products as accessible on as many platforms as possible.

AMD's market share is about 20%; ATI's is probably around 30% (speculation). No one I know using Linux will touch ATI with a ten-foot pole. Granted, Linux users make up only about 10-15% of all home OSes, but it's still silly of them to flat out neglect Linux support for so long.

In my experience ATI products offer better performance out of the box than Nvidia's; it's just that Nvidia has better support and more overclocking headroom.

I'll hold off buying an 8500GT for a couple of months; hopefully ATI will have released something by then, so I at least have a choice when upgrading my 7300GT.

Now if only there were some way to get proper on-the-fly Dolby Digital or DTS encoding on Linux. Highly unlikely, though.
hobbes_fan is offline   Reply With Quote
Old 2007-07-27, 10:21   Link #43
SeijiSensei
AS Oji-kun
 
 
Join Date: Nov 2006
Location: Mucking about
Age: 64
Quote:
Originally Posted by WanderingKnight View Post
But! It doesn't mean they'll make them open source, which can have its significant drawbacks. I hope there's a sane response from ATI and they do what they should have done a lot of time ago (open source them).
The usual response to this request is that the drivers incorporate proprietary software from ATI's development partners for which permission to open is not available. I hear the same is supposedly true about NVidia.

I use the proprietary NVidia driver which works fine for me. Luckily there are pre-built kernel modules at Livna so updates generally happen in sync with kernel updates. I'm glad Dell is leaning on ATI, but I'm happy with my NVidia setup as it stands.

There have also been repeated discussions about a standardized "application binary interface" that would let companies write proprietary drivers without needing kernel-source "shims." I believe Linus opposes the whole notion, though, so I doubt it's going to happen any time soon.

I just wish there were some good resolution to the problem of proprietary binaries working with Linux. These issues have been around for nearly a decade now and show little sign of being resolved. If anything, the debate over GPLv3 suggests the situation may worsen, not improve.
__________________
SeijiSensei is offline   Reply With Quote
Old 2007-07-28, 16:35   Link #44
WanderingKnight
Gregory House
*IT Support
 
 
Join Date: Jun 2006
Location: Buenos Aires, Argentina
Age: 25
Send a message via MSN to WanderingKnight
Linus replies to Con Kolivas and addresses the scheduler issue.

Quote:
"There's been a lot of recent debate over why Linus Torvalds chose the new CFS process scheduler written by Ingo Molnar over the SD process scheduler written by Con Kolivas, ranging from discussing the quality of the code to favoritism and outright conspiracy theories. KernelTrap is now reporting Linus Torvalds' official stance as to why he chose the code that he did. 'People who think SD was "perfect" were simply ignoring reality,' Linus is quoted as saying. He goes on to explain that he selected the Completely Fair Scheduler because it had a maintainer who has proven himself willing and able to address problems as they are discovered. In the end, the relevance to normal Linux users is twofold: one is the question as to whether or not the Linux development model is working, and the other is the question as to whether the recently released 2.6.23 kernel will deliver an improved desktop experience."
I've read the article and the mailing-list excerpts, and Linus does show a certain degree of favoritism, but apparently with reason.
__________________


Place them in a box until a quieter time | Lights down, you up and die.
WanderingKnight is offline   Reply With Quote
Old 2007-07-29, 05:31   Link #45
Jinto
Asuki-tan Kairin ↓
 
 
Join Date: Feb 2004
Location: Fürth (GER)
Age: 33
Honestly, I don't know which strategy is better. The Completely Fair Scheduler works by letting the task with the longest wait time run next. It supports different levels of task importance and includes detection of sleeper tasks. (The problem with sleeper tasks is that they immediately give the CPU back to other tasks while still in sleep mode, triggering a useless scheduling event, and scheduling consumes CPU time too. With lots of sleeping tasks this can be a considerable performance killer, since the scheduler would needlessly waste time cycling through sleepers; that is why this approach needs a sleeper detector.)
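The "run whoever has waited longest" idea above can be sketched in a few lines. This is a toy model for illustration only, not Ingo Molnar's actual kernel code; the `vruntime` name is borrowed from CFS, but the `Task`/`pick_next` structure here is made up:

```python
class Task:
    def __init__(self, name, weight=1.0):
        self.name = name
        self.weight = weight   # stands in for priority: higher weight = bigger CPU share
        self.vruntime = 0.0    # "virtual runtime": CPU time consumed, scaled by weight

def pick_next(tasks):
    # The CFS idea: always run the task that has received the least
    # weighted CPU time so far, i.e. the one that has effectively waited longest.
    return min(tasks, key=lambda t: t.vruntime)

def run_slice(task, slice_ms=10.0):
    # Charging elapsed time divided by weight makes high-weight tasks
    # accumulate vruntime more slowly, so they get picked more often.
    task.vruntime += slice_ms / task.weight

# Two equal-weight tasks: fairness (strict alternation) falls out of min() alone.
a, b = Task("a"), Task("b")
order = []
for _ in range(6):
    nxt = pick_next([a, b])
    order.append(nxt.name)
    run_slice(nxt)
```

With equal weights the two tasks simply alternate; giving one task a higher weight would let it run proportionally more slices before the other catches up.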

The staircase deadline scheduler lets tasks run according to their priority, which gives each one a certain place on the staircase and a certain quota of CPU time. When a process partly uses its quota, it drops in priority to the next level/stair. Each level/stair also has its own quota, so if the quota for an entire level is used up, all its tasks drop to the next lower level. Tasks that completely use their quota are placed in a special "expired" area, where their priority is restored to its original value (but they cannot run there). This proceeds until all tasks are either in the expired area or on the lowest stair with that level's quota used up. Then all tasks are moved to the expired area with their priority levels restored, the expired area becomes the active area, and the whole descent down the staircase starts again.
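A toy model of that staircase mechanic, heavily simplified: the stair count and quota are made-up numbers, and the real SD scheduler interleaves tasks at much finer granularity, but the descend-then-expire shape is the same:

```python
def staircase_epoch(tasks, levels=3, quota=2):
    """One epoch of a simplified staircase-style scheduler: every task
    starts on the top stair with a per-stair time quota; using the quota
    drops it one stair, and tasks that fall off the bottom land in the
    'expired' set (where their original priority would be restored)
    until the next epoch begins."""
    active = {t: 0 for t in tasks}           # task -> current stair (0 = highest)
    expired, order = [], []
    while active:
        task = min(active, key=active.get)   # the task on the highest stair runs first
        order.extend([task] * quota)         # task burns this stair's quota
        stair = active.pop(task) + 1
        if stair < levels:
            active[task] = stair             # dropped one stair, still runnable
        else:
            expired.append(task)             # out of stairs: parked until next epoch
    return order, expired

# Two tasks descend the stairs together, then both end up expired.
order, expired = staircase_epoch(["A", "B"])
```

Once `active` is empty, a real scheduler would swap the expired set back in and start the next epoch; here one epoch is enough to show the descent.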

In completely fair scheduling, wait time and task priority decide who gets CPU time; in staircase deadline scheduling, it is priority and quota (the deadline). Both algorithmic ideas seem efficient, imo, though the staircase deadline approach looks a little more complex. On the other hand, the staircase deadline scheduler is by design well suited to sleepers: a high-priority task that sleeps uses little quota but still drops a stair each time it runs, so the scheduler never burns low-priority tasks' processing time on sleep cycles; the sleepers all end up on the same low stair each time they are given the CPU.
Since I don't know how the sleeper-task detector in the Completely Fair Scheduler works, the two are hard to compare directly. Both approaches seem well suited to fair scheduling, though; the completely fair scheduler might even be better, depending on how well its sleeper detector works.

Now let me explain how an ideal scheduler would behave:

1) Schedule the processing time of every runnable task completely fairly according to its needs and priority (e.g., a sleeping task has no need to be executed).

2) Do 1) with the lowest possible overhead; the scheduler itself should consume as little processing time as possible.

3) Find the best trade-off between 1) and 2). Fairness can usually be increased by making the scheduler more complex, which increases the scheduler's own load on overall performance; a simpler scheduler costs less but may not be very fair.
Where that optimum lies depends on how the system is used, so it differs from system to system.

On multiprocessor systems I could imagine the staircase deadline approach growing more in complexity than the completely fair scheduler. (But that's my gut instinct, not a scientific conclusion.)
Jinto is offline   Reply With Quote
Old 2007-07-29, 05:46   Link #46
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
Quote:
Originally Posted by WanderingKnight View Post
Linus replies to Con Kolivas and addresses the scheduler issue.

I've read the article and the mailing list excerpts, and Linus is showing a certain degree of favoritism, but, apparently, with reason.
Yup, once I'd read the mail discussion I was with Linus on this one.

Even if we ignore the statements about Con's attitude towards the bug reporters, I personally would always take a proven maintainer over an unknown.

http://kerneltrap.org/node/14008
__________________
grey_moon is offline   Reply With Quote
Old 2007-08-04, 23:04   Link #47
WanderingKnight
Gregory House
*IT Support
 
 
Join Date: Jun 2006
Location: Buenos Aires, Argentina
Age: 25
Send a message via MSN to WanderingKnight
Good news for Red Hat fans (*points at SeijiSensei* ).

Quote:
"It looks like Red Hat is going to release their Global Desktop Linux in September and give Ubuntu a challenge for the Linux desktop market. Red Hat Global Desktop 'would be sold with a one-year subscription to security updates.'"
"It looks like another choice for the proverbial Aunt Tillie. The release is being delayed in order to provide greater media compatibility, 'to permit users to view a wide range of video formats on their computers.'"
I wouldn't mind trying it out... but charging for it? I understand charging enterprise customers for support, but how is that supposed to compete with Ubuntu in the desktop market, when the latter gives you free updates for 18 months and lifetime availability of future versions?
__________________


Place them in a box until a quieter time | Lights down, you up and die.
WanderingKnight is offline   Reply With Quote
Old 2007-08-07, 03:05   Link #48
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
Lenovo are offering SLED on some of their laptops; hopefully the market will see some Compiz/Beryl eye candy preinstalled to give Vista a run for its money.

http://news.bbc.co.uk/1/hi/technology/6933859.stm

So out of the big three we have:
Dell - Ubuntu
Lenovo (IBM) - SuSE
HP... Come on Red Hat, pull your finger out!
__________________
grey_moon is offline   Reply With Quote
Old 2007-08-08, 16:08   Link #49
WanderingKnight
Gregory House
*IT Support
 
 
Join Date: Jun 2006
Location: Buenos Aires, Argentina
Age: 25
Send a message via MSN to WanderingKnight
Having new vendors enter the field is good, and having different distros for each one is better yet... more vendors, more drivers, more compatibility.
__________________


Place them in a box until a quieter time | Lights down, you up and die.
WanderingKnight is offline   Reply With Quote
Old 2007-08-08, 18:14   Link #50
TakutoKun
Mew Member
*IT Support
 
 
Join Date: Aug 2007
Location: Ontario, Canada
Age: 29
Quote:
Originally Posted by grey_moon View Post
Lenovo are offering SLED on some of their laptops; hopefully the market will see some Compiz/Beryl eye candy preinstalled to give Vista a run for its money.

http://news.bbc.co.uk/1/hi/technology/6933859.stm

So out of the big three we have:
Dell - Ubuntu
Lenovo (IBM) - SuSE
HP... Come on Red Hat, pull your finger out!
The main problem with Linux/Unix systems is simplicity and compatibility. Yes, Linux is getting easier to use and to install programs on. However, many programs distributed as an RPM still have to be installed from the shell. In a Linux class I was in last semester, I led a session on installing a game; on average, it took 15-20 minutes to install a simple Pong-like game using shell commands.
TakutoKun is offline   Reply With Quote
Old 2007-08-08, 18:42   Link #51
WanderingKnight
Gregory House
*IT Support
 
 
Join Date: Jun 2006
Location: Buenos Aires, Argentina
Age: 25
Send a message via MSN to WanderingKnight
Quote:
The main problem with Linux/Unix systems is simplicity and compatibility. Yes, Linux is getting easier to use and to install programs on. However, many programs distributed as an RPM still have to be installed from the shell. In a Linux class I was in last semester, I led a session on installing a game; on average, it took 15-20 minutes to install a simple Pong-like game using shell commands.
First of all, if you're looking for simplicity, take a look at something like Apt (Debian-based systems), or, simpler still, the shiny Synaptic package-manager GUI (Ubuntu). But if you want to do it the old way, downloading the source and compiling it yourself, most of the time a simple:

Code:
./configure && make && make install
should be enough. I admit I've had trouble with some source trees, but 90% of the time running that good ol' command gets the job done.
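For anyone unsure what the `&&` chaining in that one-liner buys you: each step runs only if the previous one exited successfully, so a failed `./configure` never reaches `make`, and a failed `make` never reaches `make install`. A stand-in demonstration, using `true`/`false` in place of the real build steps:

```shell
demo() {
    # 'false' stands in for a failing step (say, a ./configure error);
    # '&&' stops the chain there, so the echo after it never runs.
    false && echo "make ran"
    # 'true' stands in for a successful step; the chain continues.
    true && echo "configure ok"
}
demo
```

Running `demo` prints only "configure ok", which is exactly why the chained form is safer than running the three commands on separate lines and ignoring failures.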

BTW, I should remind you of the first post of this thread:

Quote:
This is not a thread for the discussion of the disadvantages of Linux in comparison to other operative systems.
Just not to derail the discussion any further.
__________________


Place them in a box until a quieter time | Lights down, you up and die.
WanderingKnight is offline   Reply With Quote
Old 2007-08-08, 20:04   Link #52
TakutoKun
Mew Member
*IT Support
 
 
Join Date: Aug 2007
Location: Ontario, Canada
Age: 29
Quote:
Originally Posted by WanderingKnight View Post
First of all, if you're looking for simplicity, take a look at something like Apt (Debian-based systems), or, simpler still, the shiny Synaptic package-manager GUI (Ubuntu). But if you want to do it the old way, downloading the source and compiling it yourself, most of the time a simple:

Code:
./configure && make && make install
should be enough. I admit I've had trouble with some source trees, but 90% of the time running that good ol' command gets the job done.

BTW, I should remind you of the first post of this thread:

Just not to derail the discussion any further.
I should add that those are about the only disadvantages of Linux for me. Other than that, I love the security and stability, and I find OpenSUSE works well on laptops.
TakutoKun is offline   Reply With Quote
Old 2007-08-08, 21:19   Link #53
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
Sorry, but I have to answer that point...

More hardware vendors will mean more ease of use for end users. They are in the business of selling their products, and just as a major non-OSS software vendor (Novell) has invested in usability, I believe the hardware vendors will too. More users = more money to invest in making more users. For these bad boys it isn't about the FSF and Stallman, it's about selling stuff, and one barrier they all know they need to break through is usability.

WK mentioned command-line apt; I personally haven't gone near the command line on this laptop, everything was done via Add/Remove. I wanted to experience Ubuntu from the user's perspective. OpenSUSE makes installing apps even easier with YaST: no need to worry about dependencies, etc.

I think what is happening is a glorious thing, because, as WK pointed out, it will increase driver compatibility.

User uptake of Linux has already happened; the user just doesn't know it. Lots of us use it in one form or another without realising. Hardware vendors selling it as a mainstream OS give us buyers more choice *cheer* and give us users more clout when software and hardware developers design their products *cheer*. As long as it doesn't flop like Yogi Bear with no food, I can't think of anything negative about this news.
__________________
grey_moon is offline   Reply With Quote
Old 2007-09-06, 15:16   Link #54
SeijiSensei
AS Oji-kun
 
 
Join Date: Nov 2006
Location: Mucking about
Age: 64
ATI to release open-source video drivers

From the cited blog posting:

Quote:
AMD is making the commitment to do two major things:

* To develop of a fully functional 2D and 3D driver that supports all of their newer radeon chipsets. This will be done in full collaboration with the open source community and will have the direct participation of hackers from companies like Red Hat and Novell.
* To release documentation that anyone can use to build and support drivers for their chips.
This is quite stunning news given how long the video card manufacturers have said it wasn't possible to release open-sourced drivers because of the need to protect trade secrets. Releasing the specs themselves into the public domain is especially impressive since it will enable third-parties to write competing open-sourced drivers. Who knows, perhaps someone will take a stab at writing alternative drivers for Windows and Macs?

Of course, the ATI decision is being made by its new management after its purchase by AMD. AMD sees its primary competitor, Intel, offering open-sourced drivers for its video and wireless chipsets. It seems unlikely that nVIDIA will not eventually follow suit if only for competitive reasons.

Proprietary video drivers have been one of the last major hurdles to developing truly open systems. I'm rather glad now that Linus didn't back down from his "no binary interfaces" position. Who would have thought that hardware manufacturers like Intel and AMD would be the ones to fold?
__________________
SeijiSensei is offline   Reply With Quote
Old 2007-09-06, 20:53   Link #55
Ledgem
Love Yourself
 
 
Join Date: Mar 2003
Location: Northeast USA
Age: 28
I wonder if it has to do with AMD's unveiling of their new Fusion-type processors, due out in 2009... basically, the plan is to have one or more "Bulldozer" cores dedicated to processing graphics. I'd imagine that will require a completely different set of drivers, and if it's successful it might signal the end of graphics "cards" as we know them. I doubt it'll knock graphics cards out directly, but it'd be interesting to see...

I was going to link a neat article that I read about the Bulldozer cores, along with the Falcon (desktop) processor and the Sand Tiger (server) but I can't find it in the history of any of my computers. Could've sworn I read it yesterday...
__________________
Ledgem is offline   Reply With Quote