AnimeSuki Forums

Old 2007-08-16, 04:47   Link #61
Tiberium Wolf
Senior Member
 
 
Join Date: Dec 2004
Location: Portugal
Age: 44
052569 > Getting 2 GB of mem is pretty cheap already.

You can just go to the Windows Task Manager's Performance tab and watch the peak. If you run what you usually run, you can see what you are using at most.
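If you would rather script this than sit watching Task Manager, a rough Python sketch (assuming the third-party psutil package is installed; the polling loop is just an illustration) could poll memory usage and remember the peak:

[CODE]
# Rough sketch: poll system memory usage and remember the peak seen so far.
# Assumes the third-party "psutil" package is installed (pip install psutil).
import time
import psutil

peak_used = 0
try:
    while True:
        used = psutil.virtual_memory().used  # bytes of physical RAM in use
        peak_used = max(peak_used, used)
        print(f"current: {used / 2**20:.0f} MiB, peak: {peak_used / 2**20:.0f} MiB")
        time.sleep(5)
except KeyboardInterrupt:
    print(f"Peak physical memory used: {peak_used / 2**20:.0f} MiB")
[/CODE]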

Yes... XP or earlier... I haven't used Vista yet.

I am lost in your investment analogy, or whatever it was...


Ledgem > Man... that's the program's fault. Anyway, the times I experienced exceeding my RAM were back when I had my old PC with 768 MB of RAM. Some programs stopped the operation they were doing and others simply terminated. It depends on the program.


---------------------------------

I am not saying that in every situation everyone should just turn off the swap file. But PCs that act like servers, or that are used for work, should turn it off, since you are working with important things and don't want them to go bad for some reason.
__________________
Tiberium Wolf is offline   Reply With Quote
Old 2007-08-16, 05:35   Link #62
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
If a program runs out of memory you actually get an "out of memory" message, but I think it depends on what exceeds the maximum memory, since the programs could slurp up 99% and then a core system process might decide it needs another 2%. Now if it is something essential to the OS, I'm not sure what would happen...
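To see what that looks like from a single program's point of view, a small illustrative Python sketch (hypothetical, and it will genuinely eat your RAM if you run it) keeps allocating until the allocation is refused:

[CODE]
# Sketch: allocate memory until the OS refuses; Python surfaces this as MemoryError.
# Warning: running this will make the machine swap heavily first, and on Linux the
# OOM killer may terminate the process instead of raising MemoryError.
chunks = []
try:
    while True:
        chunks.append(bytearray(100 * 2**20))  # grab 100 MiB at a time
except MemoryError:
    print(f"Out of memory after roughly {len(chunks) * 100} MiB allocated")
[/CODE]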
__________________
grey_moon is offline   Reply With Quote
Old 2007-08-16, 08:06   Link #63
TakutoKun
Mew Member
*IT Support
 
Join Date: Aug 2007
Location: Ontario, Canada
Age: 39
I would not recommend removing the pagefile on server systems. As grey_moon mentioned, you will receive an "out of memory" error, and on a server such an error could cause problems leading to loss of performance, service, fault tolerance, security, and so on. There is also a slight difference in how a server system should be tuned: performance should be directed towards Background Services and the System Cache.

I am not sure if anyone has mentioned this, but one other problem with the page file is security. When you restart your system, Windows by default does not clear the page file, leaving potentially important information in it that could be detrimental to your organization. This, however, can be solved by modifying your Local Policy with the MMC.
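For reference, that policy corresponds to a registry value. As a rough sketch, assuming the usual ClearPageFileAtShutdown DWORD under Session Manager\Memory Management (1 = wipe the pagefile at shutdown, 0 = leave it), Python's built-in winreg module can read it:

[CODE]
# Sketch: read the "clear page file at shutdown" setting on Windows.
# Assumes the standard ClearPageFileAtShutdown value; absent is treated as "do not clear".
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    try:
        value, _ = winreg.QueryValueEx(key, "ClearPageFileAtShutdown")
    except FileNotFoundError:
        value = 0  # value not present: pagefile is not cleared at shutdown
    print("Pagefile cleared at shutdown:", bool(value))
[/CODE]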
TakutoKun is offline   Reply With Quote
Old 2007-08-16, 08:49   Link #64
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
Ughuuuu, I set Eraser to Gutmann-wipe my page file once; I had a few cups of coffee waiting for it to shut down
__________________
grey_moon is offline   Reply With Quote
Old 2007-08-16, 09:17   Link #65
TakutoKun
Mew Member
*IT Support
 
Join Date: Aug 2007
Location: Ontario, Canada
Age: 39
Quote:
Originally Posted by grey_moon View Post
Ughuuuu, I set Eraser to Gutmann-wipe my page file once; I had a few cups of coffee waiting for it to shut down
That is similar to the wait times experienced when burning CD-Rs at 1x.
TakutoKun is offline   Reply With Quote
Old 2007-08-17, 06:53   Link #66
SeijiSensei
AS Oji-kun
 
 
Join Date: Nov 2006
Age: 74
I don't run Windows any more, but wouldn't a simple solution to the slow page file problem be to put it on a USB memory key? A 2 GB device should be more than enough swap for any Windows machine with at least 512 MB of memory. Plus it would overcome the slow transfer-from-swap problem some of you mentioned.

In Linux I almost always devote a swap partition at least equal in size to the machine's RAM. Still, it's so efficient that I rarely find much swap in use. At the moment, with Firefox, Thunderbird, Azureus, a bunch of shells, and a few other windows, I'm using about half of my 1 GB of physical memory; the rest is automatically devoted to disk caching. Of my 2 GB of swap space, 72k is in use!

As for disk fragmentation, the ext3 filesystem on my computer that Azureus uses is about 7% fragmented. This is pretty high for ext3; my root filesystem has only 0.6% fragmentation, but it's also only 15% full, compared to 86% for the Azureus filesystem. As free disk space declines, it becomes harder to find a large enough block of storage to write files contiguously. I'd guess that Azureus's practice of pre-allocating all the storage for a torrent might exacerbate this problem as well.
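If you want the same kind of numbers from a script instead of from top or free, a minimal Linux-only Python sketch that parses /proc/meminfo might look like this:

[CODE]
# Sketch: report RAM and swap usage on Linux by parsing /proc/meminfo (values are in kB).
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.strip().split()[0])  # keep the numeric kB figure
    return info

m = meminfo()
print(f"RAM used : {(m['MemTotal'] - m['MemFree'] - m.get('Cached', 0)) / 1024:.0f} MB")
print(f"Swap used: {(m['SwapTotal'] - m['SwapFree']) / 1024:.0f} MB of {m['SwapTotal'] / 1024:.0f} MB")
[/CODE]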
SeijiSensei is offline   Reply With Quote
Old 2007-08-17, 07:10   Link #67
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
Quote:
Originally Posted by SeijiSensei View Post
I don't run Windows any more, but wouldn't a simple solution to the slow page file problem be to put it on a USB memory key? A 2 GB device should be more than enough swap for any Windows machine with at least 512 MB of memory. Plus it would overcome the slow transfer-from-swap problem some of you mentioned.
USB 2.0 has a maximum bandwidth of 480 Mbps, but I've never seen a USB storage device come close to that, whilst plain old PATA can reach 1064 Mbps. I believe there is a CPU overhead with the USB interface, and the data has to be encapsulated into packets, adding more overhead (probably what the CPU is busy with).
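The rough arithmetic behind those figures (interface maximums, not real-world throughput), as a quick Python check:

[CODE]
# Quick arithmetic: interface bandwidth in Mbps divided by 8 gives MB/s (theoretical peak).
for name, mbps in [("USB 2.0", 480), ("PATA / ATA-133", 1064)]:
    print(f"{name}: {mbps} Mbps = {mbps / 8:.0f} MB/s theoretical maximum")
# USB 2.0 -> 60 MB/s, ATA-133 -> 133 MB/s; real flash keys of that era were far slower still.
[/CODE]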

Memory keys (I am guessing you mean flash drives) have limited write cycles, about 10k to 100k depending on when the device was bought. I have heard there are some that manage 1 million writes per cell now, but I don't have any references for that. A flash drive will still fragment due to wear levelling, but that isn't an issue since there are no head seek times involved.

Oh, can the swap file be moved onto a removable disk?
__________________
grey_moon is offline   Reply With Quote
Old 2007-08-17, 10:30   Link #68
Gundam Zero Force
Senior Member
 
Join Date: Nov 2006
Location: Seattle, WA
Age: 37
While I don't believe defragmenting will solve major issues, I know of one time it did...

On my old PC everything was extremely slow, to the point it froze up. So I tried a defrag, and after that the freezing stopped. I think I was lucky, though, because freezing issues most of the time can mean more serious problems.

On my new PC, while I defrag quite often, the computer has always been fast and I haven't seen any big improvements. But I figure with all the file swapping and moving I do each week, I should defrag just to be safe.

Last edited by Gundam Zero Force; 2007-08-17 at 12:06.
Gundam Zero Force is offline   Reply With Quote
Old 2007-08-17, 10:40   Link #69
TakutoKun
Mew Member
*IT Support
 
Join Date: Aug 2007
Location: Ontario, Canada
Age: 39
Quote:
Originally Posted by Gundam Zero Force View Post
While I don't believe defragmenting will solve major issues, I know of one time it did...

On my old PC everything was extremely slow, to the point it froze up. So I tried a defrag, and after that the freezing stopped. I think I was lucky, though, because freezing issues most of the time can mean more serious problems.

On my new PC, while I defrag quite often, the computer has always been fast and I haven't seen any big improvements. But I figure with all the file swapping and moving I do each week, I should defrag just to be safe.
This is a pretty good example of file system fragmentation versus performance. Of course, it might not work out this way every time, but I am glad to see that it worked for you.
TakutoKun is offline   Reply With Quote
Old 2007-08-17, 13:13   Link #70
Tiberium Wolf
Senior Member
 
 
Join Date: Dec 2004
Location: Portugal
Age: 44
Oh, I just tried chkdsk on my download drive. The 4th pass takes 45 minutes to do 3%. See how serious my fragmentation is!?
__________________
Tiberium Wolf is offline   Reply With Quote
Old 2007-08-17, 14:48   Link #71
TakutoKun
Mew Member
*IT Support
 
Join Date: Aug 2007
Location: Ontario, Canada
Age: 39
Quote:
Originally Posted by Tiberium Wolf View Post
Oh, I just tried chkdsk on my download drive. The 4th pass takes 45 minutes to do 3%. See how serious my fragmentation is!?
It is time to clean up.
TakutoKun is offline   Reply With Quote
Old 2007-08-17, 15:41   Link #72
Tiberium Wolf
Senior Member
 
 
Join Date: Dec 2004
Location: Portugal
Age: 44
Quote:
Originally Posted by TakutoKun View Post
It is time to clean up.
Nah... too many files to do the CRC calc.
__________________
Tiberium Wolf is offline   Reply With Quote
Old 2007-08-17, 19:41   Link #73
grey_moon
Yummy, sweet and unyuu!!!
 
 
Join Date: Dec 2004
Quote:
Originally Posted by Tiberium Wolf View Post
Oh, I just tried chkdsk on my download drive. The 4th pass takes 45 minutes to do 3%. See how serious my fragmentation is!?
It is Friday, so unless the job involves consuming alcohol it takes longer to finish (unless it is 30 minutes before home time, in which case it gets finished no matter what state it is in...)
__________________
grey_moon is offline   Reply With Quote
Old 2007-08-18, 03:44   Link #74
Jinto
Asuki-tan Kairin ↓
 
 
Join Date: Feb 2004
Location: Fürth (GER)
Age: 43
Quote:
Originally Posted by Tiberium Wolf View Post
I was saying to close the gaps with contiguous files (not spread-out files). It's like in the Windows defrag display: you might have all blue blocks, but with several free (white) blocks in between that are very small. It would be better if all the data was compacted, not leaving those small gaps. Those gaps would lead to fragmentation in the future.
If you read my comment on why I think the test posted somewhere on page 2 is flawed, you'd know why this strategy of just-in-time defragging is not necessarily a good idea, and why files that are used in a sequence are better off staying close together (even if that means some of them are fragmented in the process). Sequentially fragmented is still better than sequentially used files that are randomly spread over the disk.

In the sequentially fragmented case you'd jump forward over little gaps while doing the read; in the other case, of defragged files that are spread over the whole disk, there are big gaps between the files (which are read sequentially), so the head has to jump back and forth across huge distances within one file sequence. And starting a program or Windows typically is a case of reading files sequentially (in a specific order).
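A toy way to see the point: model the head position as a block number and compare the total seek distance for "fragmented but nearby" against "contiguous but scattered far apart". This is a made-up illustration with invented block numbers, not a real disk model:

[CODE]
# Toy model (illustration only): total head travel when reading blocks in the given order.
def total_seek_distance(block_positions):
    # Sum of absolute jumps between consecutive reads.
    return sum(abs(b - a) for a, b in zip(block_positions, block_positions[1:]))

# Two files read in sequence, each split into 3 fragments, all fragments close together:
fragmented_but_local = [100, 105, 110, 102, 107, 112]

# The same two files, each perfectly contiguous, but placed far apart on the disk:
contiguous_but_scattered = [100, 101, 102, 90000, 90001, 90002]

print("fragmented but local    :", total_seek_distance(fragmented_but_local))     # 28
print("contiguous but scattered:", total_seek_distance(contiguous_but_scattered)) # 89902
# The scattered layout loses badly even though each file is perfectly defragmented.
[/CODE]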
__________________
Folding@Home, Team Animesuki
Jinto is offline   Reply With Quote
Old 2007-08-18, 05:53   Link #75
Tiberium Wolf
Senior Member
 
 
Join Date: Dec 2004
Location: Portugal
Age: 44
Quote:
Originally Posted by Jinto Lin View Post
If you read my comment on why I think the test posted somewhere on page 2 is flawed, you'd know why this strategy of just-in-time defragging is not necessarily a good idea, and why files that are used in a sequence are better off staying close together (even if that means some of them are fragmented in the process). Sequentially fragmented is still better than sequentially used files that are randomly spread over the disk.

In the sequentially fragmented case you'd jump forward over little gaps while doing the read; in the other case, of defragged files that are spread over the whole disk, there are big gaps between the files (which are read sequentially), so the head has to jump back and forth across huge distances within one file sequence. And starting a program or Windows typically is a case of reading files sequentially (in a specific order).

I saw that, but I was only talking about the files being contiguous and not having many small gaps, with this done while the PC was idle. I was not talking about the defrag program putting the most-used files at the beginning of the HD; that's another issue. But then again, if you fill up the HD it will slow down anyway. The best option is to have one HD just for it. I mean, the most-used files are the OS, driver and codec files. One HD just for them; nothing else stays on that HD.
__________________
Tiberium Wolf is offline   Reply With Quote
Old 2007-08-18, 05:56   Link #76
TakutoKun
Mew Member
*IT Support
 
Join Date: Aug 2007
Location: Ontario, Canada
Age: 39
Quote:
Originally Posted by Tiberium Wolf View Post
I saw that, but I was only talking about the files being contiguous and not having many small gaps, with this done while the PC was idle. I was not talking about the defrag program putting the most-used files at the beginning of the HD; that's another issue. But then again, if you fill up the HD it will slow down anyway. The best option is to have one HD just for it. I mean, the most-used files are the OS, driver and codec files. One HD just for them; nothing else stays on that HD.
I also support the idea of having one hard disk for the OS and one for your files. Overall, you should see a performance increase and, of course, you will have better disk management.
TakutoKun is offline   Reply With Quote
Old 2007-08-18, 09:07   Link #77
toru310
Senior Member
 
Join Date: Apr 2006
Location: Philippines
Ermm, this may be a little off topic, but when I accidentally pressed Disk Cleanup I was amazed that I could recover so much hard disk space, about 3 GB. I'm also wondering, what are the "compressed old files"? Finally, does Disk Cleanup harm my files? Oh yeah, if I remember clearly it also lists temporary internet files? Does that mean the cookies?

I'm also getting confused about what to do first: Disk Cleanup or disk defragment?

Now, about this disk defragmenter: the program basically just arranges the files so that the actuator arm won't stress itself, right? So does this also give me some free bytes on my hard drive?
toru310 is offline   Reply With Quote
Old 2007-08-18, 10:22   Link #78
hobbes_fan
You could say.....
 
 
Join Date: Apr 2007
Disk cleanup first, defrag last. Basically you want data packed together as tightly as possible; if you defrag first and disk cleanup last, there will be gaps between the data, making your defrag a pointless exercise.

A defrag won't give you any more room; it just packs data tightly together.
hobbes_fan is offline   Reply With Quote
Old 2007-08-18, 10:24   Link #79
WanderingKnight
Gregory House
*IT Support
 
 
Join Date: Jun 2006
Location: Buenos Aires, Argentina
Age: 35
Send a message via MSN to WanderingKnight
Quote:
Ermm, this may be a little off topic, but when I accidentally pressed Disk Cleanup I was amazed that I could recover so much hard disk space, about 3 GB. I'm also wondering, what are the "compressed old files"? Finally, does Disk Cleanup harm my files? Oh yeah, if I remember clearly it also lists temporary internet files? Does that mean the cookies?
Compressing old files is at your discretion. Personally, I wouldn't recommend it... especially because I don't know how Windows handles the compression. My hunch is that it'll probably screw up the organization of your files... besides, how "old" does the system reckon a file must be before it becomes compression material? I wouldn't trust it too much (then again, I probably wouldn't trust any of the bundled Windows apps like Disk Cleanup or the defrag tool).

Also keep in mind that if the system attempts to compress 3 GB of scattered data (not scattered in the sense of fragmentation, but in the sense of it being thousands, or even millions, of different files), the machine will probably become unusable for a considerable amount of time while it attempts the compression.
__________________


Place them in a box until a quieter time | Lights down, you up and die.

Last edited by WanderingKnight; 2007-08-18 at 14:57.
WanderingKnight is offline   Reply With Quote
Old 2007-08-18, 10:37   Link #80
Jinto
Asuki-tan Kairin ↓
 
 
Join Date: Feb 2004
Location: Fürth (GER)
Age: 43
Quote:
Originally Posted by Tiberium Wolf View Post
I saw that but I was only talking about the files being contiguous and not having many small gaps and this done while the pc was idle. I was not talking about the defrag program putting most used files in the beginning of the HD. That's another issue. But then again if you fill up the HD it will slow down anyway. The best option is to have 1 HD for it. I mean the most used files are the OS, drivers and codecs files. 1 HD just for them. Nothing else stays in that HD.
Actually, you described a method of utilising those gaps by stuffing small files into them (files which have to be taken from somewhere else to be stuffed into the gaps). And that practice will most likely pull many small files out of a certain file sequence (order).

@Migufuchi Fusutsu,

(1) "Compress old files" means that files which are rarely used get compressed. But in the process of compressing them they become more vulnerable to data consistency issues (ever had a broken zip file?). That's why I would not recommend compressing old (rarely used) files; there's a small sketch of that brittleness after point (4).

(2) Your temporary internet files do not contain the cookies, imo. At least in Internet Explorer there are two separate delete options... one for the file cache and one for the cookies. Though I do not know how the Disk Cleanup option interprets "temporary internet files"; maybe it does include the cookies.

(3) Of course, if you plan to do a disk cleanup, then do it before disk defragmentation.

(4) Defragmentation does not necessarily spare your disk actuator from travelling long distances. It depends on how fragmented your disk was before the defragmentation, and on how cleverly the defragmentation tool arranges the defragged files on your disk. If the tool does not at least recognise and enforce data locality within folder structures, or if you have lots of small files or many files in few folders, then the defragmentation will bring only a little advantage (if any).
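To illustrate point (1): one flipped bit can render an entire compressed stream unreadable, whereas the same flip in a plain text file would only garble a single character. A rough Python sketch of that brittleness (using zlib as a stand-in for whatever compression Windows actually applies, which is an assumption):

[CODE]
# Sketch: a single flipped bit ruins an entire compressed stream.
import zlib

original = b"some rarely used old file contents " * 1000
compressed = bytearray(zlib.compress(original))

compressed[len(compressed) // 2] ^= 0x01  # simulate one bit of on-disk corruption

try:
    zlib.decompress(bytes(compressed))
except zlib.error as e:
    print("Whole compressed file is unreadable:", e)
[/CODE]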
__________________
Folding@Home, Team Animesuki
Jinto is offline   Reply With Quote