Nothing Appearing on Monitor
Ok, I had this problem a few weeks back, but after rebooting my computer and unplugging and reconnecting the cables it worked again. This time absolutely nothing I try is working, and I don't know much about hardware, so here it goes.
I turn on my computer and hear it booting up, but my monitor says it's getting no signal. I hooked the monitor up to our other computer and it worked there, and I also tried the other computer's monitor on mine without success.
Is this a video card problem, or could something else be wrong on the inside?
I have an Nvidia GeForce 6100 integrated graphics card. The computer was purchased about 4 years ago.
Seems like your graphics card is failing... If it's an integrated graphics card, you may need to replace the motherboard, but it sounds like you have a discrete GPU (meaning a graphics card installed in an AGP or PCI Express slot on your motherboard).
Keep in mind that the GeForce 8600 series has a serious problem with GPU failure... so you may want to replace your graphics card, which is an easy fix.
Sorry, I edited my post a little late to clarify what I have. I also have an unused PCI Express slot elsewhere on my motherboard.
Is this an nVidia motherboard? I know a lot of their motherboards from around that time had problems. Your best bet is to install a graphics card and use that instead, since buying a new motherboard would also mean upgrading to a CPU and RAM that are compatible with it.
Since you have a PCI Express x16 slot, there shouldn't be a problem installing one, and PCI Express 2.0 cards are backward compatible, so they should work even if the slot isn't 2.0.
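If you're not sure whether it's an nVidia board, one way to check without opening the case is to query WMI from Windows. This is just a rough sketch, assuming the machine will still boot into Windows once you get video back and that you have Python installed; WMIC itself ships with XP and later:

import subprocess

# Ask WMI for the motherboard maker and model; a board with an
# nForce/NVIDIA chipset will usually show it in one of these fields.
out = subprocess.check_output(
    ["wmic", "baseboard", "get", "Manufacturer,Product"]
)
print(out.decode(errors="ignore"))

You can also just type the wmic command straight into a Command Prompt if you don't have Python on that machine.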
4 years old? If you can, get a new Windows XP Pro computer. But I think that'll be hard to find now, lol. Without a doubt, though, you need a new video card.
So it should work if I used a new video card, even though my old video card is integrated? I don't know if it's an nVidia motherboard; I'd have to check when I get home, but I don't have net access at home right now to post the info :(
Yes, if you have an integrated card you can choose to use either the integrated one or a card in an expansion slot. Typically, the motherboard will detect the expansion card and automatically use it instead of the integrated one. Even if it doesn't, all you have to do is flip a couple of settings in your BIOS.
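Once you've got the card installed and a picture back, you can double-check which adapter Windows is actually using. Again, just a sketch along the same lines, assuming Python and WMIC are available:

import subprocess

# List every video controller Windows sees; the new card should
# show up here alongside (or instead of) the integrated GeForce 6100.
out = subprocess.check_output(
    ["wmic", "path", "win32_videocontroller", "get", "Name,Status"]
)
print(out.decode(errors="ignore"))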
Alright, thanks for the assistance everyone.