2013-08-03, 21:04 | Link #1 |
Juanita/Kiteless
Join Date: Apr 2006
Location: New England
Age: 40
Next gen gaming is coming - Anyone have a next gen plan for their PC?
The PS4 and Xbox One release in a few months, and next gen gaming is being phased in. Are there any PC gamers here who plan to buy/build a new computer close to the start of next gen, or do some beefy upgrades for it?
My current specs are very good: a Radeon HD 7950 with 3 GB of VRAM, an Intel i5-3570K (the best i5 there is right now) with four cores at 3.4 GHz, and 8 GB of DDR3-1600 RAM. I've been told my CPU will be good for PC gaming for multiple years, and it can always be overclocked (isn't this type of CPU simple to overclock, as well?). My RAM will be fine for a while. Once I see games that will utilize 6 GB or 8 GB of RAM, I'll want more, since 8 GB wouldn't be enough for those games with the OS and background applications needing RAM too, so I'll just double my RAM once those games come along.

One thing I want to double-check: with Crossfire and SLI, the VRAM on the GPUs doesn't stack, right? If you have two 2 GB GPUs, you don't end up with 4 GB of VRAM, just 2 GB, right?

So now the GPUs. Someone told me I could go quite a distance with two of my current card in Crossfire (so I'd have 3 GB of VRAM). While that's probably true, I'm thinking I might want two GPUs with more VRAM to Crossfire/SLI, so that I get farther into next gen before needing to upgrade the GPU again. What I may do is buy a GPU from the latest Radeon or GeForce line in late 2014 or early 2015 with 4 GB of VRAM, then buy another of the same card 12-18 months later and pair them. Since the VRAM amounts don't stack, I'll want at least 4 GB per card; I may get one with more, depending on what's available in my price range when I'm ready for the first card.

What do you techies think? Sound like a good plan? Feel free to add input and critique things in this thread. What are some of your plans for the next gen we are going into? Anyone upgrading/building/buying a new system late this year or within a year of the launch of the next gen consoles?
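On the VRAM question: with Crossfire/SLI alternate-frame rendering, each card holds its own full copy of the textures and framebuffers, so the usable pool is the per-card amount, not the sum. A toy sketch of that (the function name and numbers are mine, purely for illustration):

```python
# Toy illustration of why VRAM doesn't stack across Crossfire/SLI:
# each card mirrors the same data, so the usable pool is bounded by
# the smallest card's VRAM, not the total across cards.
def effective_vram_gb(cards_gb):
    # Mirrored data: limited by the smallest card in the pair.
    return min(cards_gb)

print(effective_vram_gb([2, 2]))  # two 2 GB cards -> 2, not 4
print(effective_vram_gb([3, 3]))  # two HD 7950s -> 3
```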
2013-08-03, 21:54 | Link #2 |
Senior Member
Join Date: Oct 2008
Location: Quebec
Age: 32
But if you can't afford the newest and best, then yes, buying a used card (the current model will most likely be discontinued by then) is a cheap way to boost your performance. But you need the right PSU, case, airflow, motherboard, and connectors. Next year will bring new AMD CPUs and GPUs.
2013-08-03, 22:33 | Link #3 |
Juanita/Kiteless
Join Date: Apr 2006
Location: New England
Age: 40
I'd like more input on crossfire and SLI. I've never done it before.
If more people advise against Crossfire/SLI, I may just go with one card. I will wait to see how my specs run The Witcher 3; if they run it very well, I won't get a new GPU until 2015. I won't double my RAM until games use 6-8 GB. How much RAM does Windows 7 64-bit use? I run the Home Edition.
2013-08-03, 23:06 | Link #4 | |
Senior Member
Join Date: Oct 2008
Location: Quebec
Age: 32
|
If The Witcher 3 is as graphically demanding as The Witcher 2 was, your graphics card might not be enough.

Last edited by spikexp; 2013-08-03 at 23:36.
2013-08-03, 23:24 | Link #5 | |
Juanita/Kiteless
Join Date: Apr 2006
Location: New England
Age: 40
|
I'll wait to see the recommended specs for The Witcher 3. I think my GPU will be good enough to run the game at 1080p on high settings at a pretty good frame rate. Right now my system runs Skyrim with multiple 2K HD texture packs at 1080p at a near-constant 60 fps, and that game was made with the Xbox 360 as the lead platform, not primarily as a PC game.
2013-08-04, 06:07 | Link #6 |
Senior Member
Join Date: Oct 2004
Age: 35
|
I follow a simple rule: upgrade the GPU every two to three generations. Sometimes it doesn't happen on schedule, depending on the improvements and changes they've made to the GPUs during my eternal slumber in my mancave.

Unless, of course, there is a really awesome super duper deal like the price mistake Amazon had last month with the ASUS GTX 680 selling for $197 brand new. Then of course I'll jump on that deal ASAP! NO QUESTIONS ASKED!
2013-08-04, 07:17 | Link #7 |
Senior Member
Author
Join Date: Oct 2007
Location: Philippines
Age: 47
|
Thinking of a few things that should be done in a couple of years (assuming the system holds up):

1.) Go from 4 GB of RAM to 8 GB, as Skyrim nearly maxed me out at 3.1 GB (and that'll be the only game I have to play, at least until I get lucky).
2.) A mid-range GPU that's been around for a year or two. One card with at least 1 GB of dedicated VRAM and a 128-bit memory bus is enough for me.
3.) Another 500 GB or 1 TB hard drive.
4.) Have to consider a quad-core Phenom.
5.) No need to overclock just to enjoy my games.
2013-08-05, 18:50 | Link #8 |
Senior Member
Join Date: Dec 2005
Location: US
|
Guidelines for upgrading a computer are fairly simple.

I don't think it's in any way tied to console generations. Developers look at what computers are in the market and then make their games with appropriate requirements; they don't come up with an arbitrary set of requirements and hope the market will buy computers to satisfy them. Since console launches don't really affect PC sales (if anything, they probably reduce them), they don't have much effect on PC game requirements.

As for upgrading, 3-4 years is generally what works.

For CPU, quick overview: http://en.wikipedia.org/wiki/Intel_Tick-Tock
For GPU, Nvidia: http://en.wikipedia.org/wiki/Compari...ocessing_units
AMD: http://en.wikipedia.org/wiki/Compari...ocessing_units

RAM, HD, etc.: get what you need and no more, obviously.

I'm running a Lynnfield (Nehalem) right now. I had originally planned on upgrading to a Haswell + Nvidia 760 this winter holiday (Black Friday?), which seemed fairly reasonable. But since I'm not really playing any games right now, I'm going to see if I can't make it to Skylake (DDR4, PCIe 4.0, huehuehue).

(As for why PCIe 4.0: triple monitors, of course. Anybody who buys a gaming-capable PC today should be getting triple monitors. It's kind of a waste if you don't.)

Last edited by ImperialPanda; 2013-08-05 at 19:27.
2013-08-05, 20:40 | Link #9 | |
Senior Member
Join Date: Jul 2010
|
2013-08-06, 04:03 | Link #10 |
Senior Member
Join Date: Jan 2006
|
I don't want to just blindly splurge on the best hardware available, so I'll wait for test results after the actual games come out to find out what exactly will be sufficient. The Battlefield 4 benchmarks are what I'm most interested in ATM.

Multi-GPU setups sometimes have problems; you'd better get used to turning the second card off for some games to work properly. Not all motherboards support it, either, and it consumes a lot of power. If you have the rest of the hardware to make it happen, it can sometimes be worthwhile to double up your current GPU by buying a used one. Personally, I'd always go for a single-GPU setup. That's essentially what the next gen consoles have as well.
2013-08-06, 15:45 | Link #11 |
Juanita/Kiteless
Join Date: Apr 2006
Location: New England
Age: 40
|
I think I ought to go with a single card and keep it simple. I'll wait for The Witcher 3, see how well it runs on my PC, and compare that to how it runs on the Xbone and PS4. If I don't get a new GPU in 2014, I intend to get one in 2015. I think a card with at least 4 GB of VRAM could go strong for about three years; it would still be usable past that, certainly, but three years is what I'd count on.
2013-08-06, 22:38 | Link #12 |
Senior Member
Author
Join Date: Jul 2008
Location: USA
|
I never thought SLI was worth it as far as price/performance goes. They say you can get a MAX of 90% more frames; I would want double (100%) ALL THE TIME! If you make a good purchase on a single card, it should last you until the next single card worth getting comes out. In the situations I've come across, if adding a second card of the same type for SLI is cheap but only offers, say, a 15 FPS increase in the latest big game, why would I go with that instead of a new card offering 50 (at a fair price, which is what I'd consider a card worth getting)? SLI just never made sense to me in that regard.

Where it could make sense for some people is super-high-resolution rigs, like multiple monitors or 30" monitors, where I don't think a single card would net you fantastic performance in the newest games (and I wouldn't expect it to be future-proof anyway).
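The price/performance point above can be put in numbers. A quick sketch, where the ~90% best-case scaling comes from the post and the price and base frame rate are made-up example figures:

```python
# Rough SLI price/performance math. The 90% best-case scaling figure is
# from the post above; card price and base FPS are made-up assumptions.
base_fps = 60.0       # hypothetical single-card frame rate
card_price = 300.0    # hypothetical price per card
sli_scaling = 0.90    # best-case extra frames from the second card

sli_fps = base_fps * (1 + sli_scaling)             # 114.0 FPS
single_fps_per_dollar = base_fps / card_price      # 0.20 FPS/$
sli_fps_per_dollar = sli_fps / (2 * card_price)    # 0.19 FPS/$

print(sli_fps, single_fps_per_dollar, sli_fps_per_dollar)
```

Even with best-case scaling, frames per dollar drops below a single card, which is essentially the argument being made here.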
2013-08-06, 23:39 | Link #13 |
Pretentious moe scholar
Join Date: Oct 2006
Location: Vancouver, Canada
Age: 37
|
The Xbox One's GPU is supposed to do 1310 billion floating-point operations per second (Gflops) and have 68.2 gigabytes per second (GB/s) of memory bandwidth to main memory, plus a 32 MB cache capable of 192 GB/s to take some of the load off the main memory. The PS4's GPU is supposed to be 1843 Gflops with 176 GB/s of bandwidth to main memory (but no cache). A Radeon HD 7950 at stock speeds is 2867 Gflops and 240 GB/s of memory bandwidth, again with no cache.

Now, the PS4/Xbone GPUs may be based on newer architectures and will be running more optimized code, but I'm not sure that's going to make a huge difference. The 360 and PS3 run Crysis 2 at 30 FPS, 1280x720, at about medium settings. The 360's GPU is 240 Gflops with 22.4 GB/s of memory bandwidth and a 10 MB high-speed cache; a Radeon HD 6450 (newer architecture, 240 Gflops, 28.8 GB/s) will do 24.7 frames per second at equivalent settings. It's only because PC gamers like to run everything maxed out at 1080p+ that we've needed so much graphics power relative to consoles.

Bottom line: if game makers decide to build their games with 1080p in mind this generation (last gen, most developers built for 720p, since that was the common TV resolution when the consoles came out), I think you'll be in good shape for quite a while. The only benchmark I've seen so far for the HD 7950 with a next gen game is the Battlefield 4 alpha, which does 30 FPS at 1920x1080, 4x AA, very high quality. And that's presumably not particularly optimized code, since it's an alpha version.
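The Gflops figures quoted above can be sanity-checked with peak-throughput arithmetic: shader count x 2 operations per clock (fused multiply-add) x clock speed. The shader counts and clocks below are my assumptions taken from public spec sheets, not from this thread:

```python
# Sanity check of the peak-Gflops figures quoted above.
# Peak Gflops = shaders * 2 ops/clock (fused multiply-add) * clock in GHz.
# Shader counts and clocks are assumed from public spec sheets.
def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

print(peak_gflops(1792, 800))  # Radeon HD 7950 at stock 800 MHz -> 2867.2
print(peak_gflops(1152, 800))  # PS4 GPU (18 CUs x 64 shaders)  -> 1843.2
print(peak_gflops(768, 853))   # Xbox One GPU (12 CUs x 64)     -> ~1310
```

The results line up with the 2867/1843/1310 numbers in the post, which suggests all three are simple peak-FMA figures rather than measured throughput.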
2013-08-07, 16:16 | Link #15 |
Juanita/Kiteless
Join Date: Apr 2006
Location: New England
Age: 40
|
What do you guys think about VRAM on GPUs? My card has 3 GB, which is pretty good. For someone who wants a very capable gaming PC (without spending $500-$600 or more on GPUs every two years), what do you think is a good amount of VRAM to have? I said that when I upgrade my GPU in about a year to a year and a half, I'd like one with at least 4 GB of VRAM, but how much will a high amount of VRAM even matter? Just like I found out that my i5 will be good for some years and that you don't need a top-of-the-line i7 for PC gaming, can a similar thing be said about GPUs and VRAM these days?