Solved

How many years would this setup last for gaming?

I'm building a desktop. Is this a good setup for gaming?
Will the 3GB be enough for Ultra settings on a 1080p monitor (until late 2016 at least)?
GTX 780 3GB
8GB RAM
Asus Z87-A motherboard
1TB HDD
Intel Core i5 (4th gen)
  1. Not Ultra settings, but probably High settings.
  2. Best answer
    That build will do fine for a couple of years. A GTX 780 3GB will be able to run all games on Ultra settings for two years.... that is very fair to say on a 1920x1080 60Hz display.
  3. fkr said:
    Not Ultra settings, but probably High settings.


    Would overclocking, or perhaps more VRAM, make max settings an option for longer?
  4. skit75 said:
    That build will do fine for a couple of years. A GTX 780 3GB will be able to run all games on Ultra settings for two years.... that is very fair to say on a 1920x1080 60Hz display.


    Would overclocking, or perhaps more VRAM, make max settings an option for longer?
  5. Not really.

    That rig will be just fine for the majority of games in 2016 on a 1920x1080 60Hz display. It should max out most games on that display probably into 2017. A GTX 780 is still overkill on that display, right now, for ALL games.
  6. Battlefield 4, Arma III, Crysis 3, and Metro: Last Light are all sitting right at 60 FPS right now, so saying you will still be able to play at Ultra in a couple of years is not something I am comfortable guaranteeing.

    http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-8.html

    Above is the GTX 780 Ti review, with numbers for all the other cards as well.

    Overclocking will always help; you may get another 10% out of the card. I would just be prepared to drop to High settings at some point, or go SLI.

    More VRAM will help, as games like Watch Dogs are already butting heads with 3GB.

    The R9 290 and 290X are also options.
  7. skit75 said:
    Not really.

    That rig will be just fine for the majority of games in 2016 on a 1920x1080 60Hz display. It should max out most games on that display probably into 2017. A GTX 780 is still overkill on that display, right now, for ALL games.


    That's what I've been reading, except for a heavily modded Skyrim, apparently.
    I hope no vanilla game in the next 2-3 years will need more than 3GB on a 1080p 60Hz monitor.
  8. fkr said:
    Battlefield 4, Arma III, Crysis 3, and Metro: Last Light are all sitting right at 60 FPS right now, so saying you will still be able to play at Ultra in a couple of years is not something I am comfortable guaranteeing.

    http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-8.html

    Above is the GTX 780 Ti review, with numbers for all the other cards as well.

    Overclocking will always help; you may get another 10% out of the card. I would just be prepared to drop to High settings at some point, or go SLI.

    More VRAM will help, as games like Watch Dogs are already butting heads with 3GB.

    The R9 290 and 290X are also options.


    I'm surprised to see the R9 290 run better than a GTX 780.
    I heard Watch Dogs is badly optimized, though. Hopefully GTA V won't be.
    Based on your experience, would you say there is a difference between Ultra and High?
    Also, are games still smooth at 30+ FPS?
  9. The last gaming build I did was an i7-4770K with a GTX 780, and I was getting 88-96 FPS in BF4 using the Ultra preset on a 1920x1080 144Hz BenQ display.

    I thought the Watch Dogs VRAM thing was going to be fixed? Are they now saying that a game built on console technology and ported to PC is still going to require all that VRAM, and that it was "by design"? Hahah.... never saw that coming ;)
  10. skit75 said:
    The last gaming build I did was an i7-4770K with a GTX 780, and I was getting 88-96 FPS in BF4 using the Ultra preset on a 1920x1080 144Hz BenQ display.

    I thought the Watch Dogs VRAM thing was going to be fixed? Are they now saying that a game built on console technology and ported to PC is still going to require all that VRAM, and that it was "by design"? Hahah.... never saw that coming ;)


    Interesting. Do you notice differences between Ultra and High settings?

    I don't know what was up with Watch Dogs; never played it, just heard it was demanding.
    I hope they don't make the same excuses for The Witcher 3. I couldn't even run the first one on my current setup, but I hope to play them smoothly with this build.
  11. There is a big difference between Ultra settings and High in terms of how much computing power you need, although the difference in quality is not as great, and a game on High is still very good looking.

    30 FPS is what you need to make sure you are not getting a lagging, jittery screen. Games at 30 FPS are still very playable.

    I think you would be gaming very happily with a GTX 780 or an R9 290/X in two years' time, but I would not promise that you would be at Ultra with great framerates.

    When games start to push the 3GB limit, you know it is not far behind that all games will reach that point. Just go read the new R9 285 article today: it says the GTX 760 needs to be re-evaluated because of its 186 memory interface, which is just not enough anymore for its price point.

    I mean, two years ago for $400-500 you would have been buying a GTX 670, or if you upped your budget, maybe a GTX 680.

    The 680 would still hold up today, but not a 670. It is just my 2 cents, but all of these cards would still game today.

    Hell, I run CF 7950s.
  12. fkr said:
    There is a big difference between Ultra settings and High in terms of how much computing power you need, although the difference in quality is not as great, and a game on High is still very good looking.

    30 FPS is what you need to make sure you are not getting a lagging, jittery screen. Games at 30 FPS are still very playable.

    I think you would be gaming very happily with a GTX 780 or an R9 290/X in two years' time, but I would not promise that you would be at Ultra with great framerates.

    When games start to push the 3GB limit, you know it is not far behind that all games will reach that point. Just go read the new R9 285 article today: it says the GTX 760 needs to be re-evaluated because of its 186 memory interface, which is just not enough anymore for its price point.

    I mean, two years ago for $400-500 you would have been buying a GTX 670, or if you upped your budget, maybe a GTX 680.

    The 680 would still hold up today, but not a 670. It is just my 2 cents, but all of these cards would still game today.

    Hell, I run CF 7950s.


    Sorry for asking so much, but does that mean the 770 4GB is a better future-proofing choice?
    Or is that totally dependent on the game?
    Also, thanks for all the great info.
  13. I think the R9 290 is the best card out there right now. It has a 512-bit memory interface and is based on the newest architecture, Hawaii. It also has 4GB of VRAM.

    If you want to stick with Nvidia, then a 780 is the best.

    Just having more VRAM is not the answer; you must also look at the speed it runs at (MHz) and the memory interface it runs through (be it 256-, 384-, or 512-bit).

    If you are worried about the reliability of a card, just get an MSI, since they cover a card even if you are not the original owner. They have replaced GPUs for me that should not have been replaced, because my daughter spilled juice on the card.

    Please ask any questions you want; I do not get tired of answering them.
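The interplay of memory clock and bus width can be sketched with a quick back-of-the-envelope calculation. The clocks and widths below are the published reference specs for these cards; treat the numbers as illustrative, not benchmarks:

```python
# Theoretical peak memory bandwidth in GB/s:
# effective memory clock (MHz) x bus width (bits) / 8 bits-per-byte / 1000.
def bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    return effective_clock_mhz * bus_width_bits / 8 / 1000

# (effective GDDR5 clock in MHz, bus width in bits), per reference specs
cards = {
    "R9 290  (512-bit)": (5000, 512),
    "GTX 780 (384-bit)": (6008, 384),
    "GTX 770 (256-bit)": (7010, 256),
}

for name, (clock, width) in cards.items():
    print(f"{name}: {bandwidth_gbs(clock, width):.1f} GB/s")
# R9 290: 320.0, GTX 780: 288.4, GTX 770: 224.3
```

This is why a narrower bus can only keep pace if its memory clock is proportionally higher: capacity alone says nothing about how fast the card can actually move textures.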
  14. fkr said:
    I think the R9 290 is the best card out there right now. It has a 512-bit memory interface and is based on the newest architecture, Hawaii. It also has 4GB of VRAM.

    If you want to stick with Nvidia, then a 780 is the best.

    Just having more VRAM is not the answer; you must also look at the speed it runs at (MHz) and the memory interface it runs through (be it 256-, 384-, or 512-bit).

    If you are worried about the reliability of a card, just get an MSI, since they cover a card even if you are not the original owner. They have replaced GPUs for me that should not have been replaced, because my daughter spilled juice on the card.

    Please ask any questions you want; I do not get tired of answering them.


    MSI isn't sold in my country. There is a warranty on the Nvidia cards; I haven't checked AMD at all, though.
    I'll ask, though.
    Thank you!
    Hope your daughter is more careful with her juice..
  15. Stay away from the 4GB GTX 760 and 4GB GTX 770 cards. Both show only minimal performance gains over the 2GB versions, and only at multi-monitor resolutions. They are a waste for a single 1920x1080 display.
  16. skit75 said:
    Stay away from the 4GB GTX 760 and 4GB GTX 770 cards. Both show only minimal performance gains over the 2GB versions, and only at multi-monitor resolutions. They are a waste for a single 1920x1080 display.


    I'm just a bit worried about the VRAM on the 780 for future games.
    I decided to wait until the new cards are out this month and see what happens.
    Thanks, though.
  17. I agree. If anything, there will be a price drop for current cards.

    I would do the R9 280/X, or wait.

    It seems Nvidia is skipping the 8xx series and going straight to the 9xx series, so hopefully we see another good push in GPU performance.
  18. The GTX 780 & 780 Ti make full use of that 3GB of VRAM. I will go out on a limb here and say that future games will not be able to saturate that 3GB on a 1920x1080 display before the 780's architecture is ready to be upgraded.

    In other words, you will likely feel the 780 is too slow for games before you run out of VRAM capacity at 1920x1080.
  19. skit75 said:
    The GTX 780 & 780 Ti make full use of that 3GB of VRAM. I will go out on a limb here and say that future games will not be able to saturate that 3GB on a 1920x1080 display before the 780's architecture is ready to be upgraded.

    In other words, you will likely feel the 780 is too slow for games before you run out of VRAM capacity at 1920x1080.


    While I tend to agree, I always worry about future-proofing when I see a current-gen game that is already hitting the limit; I am of course referring to Watch Dogs. I know it is a console port, but it is written for the x86 instruction set, and future console games are now going to be able to use a lot more VRAM. One of my favorite websites, HardOCP, did a write-up recently about the state of the game today and what is needed at every resolution, and it came to the conclusion that you have to have 3GB of VRAM to play at Ultra settings. Here is the link to the 1080p tests with a bunch of current-gen cards:

    http://www.hardocp.com/article/2014/08/18/watch_dogs_performance_image_quality_review/6#.VAdknFxT42U
  20. Good link.... I visit there also.

    So I guess the question becomes: do you actually think Watch Dogs' way of using VRAM is the way of the future, or do you chalk it up to lazy development and a rushed-to-market product that could have been more efficient at launch?

    I happen to believe Watch Dogs is an anomaly, and I wouldn't build a rig around an exception to the rule. There are certainly more visually beautiful games out there that do not have the bloated VRAM issues that Watch Dogs has.
  21. skit75 said:
    Good link.... I visit there also.

    So I guess the question becomes: do you actually think Watch Dogs' way of using VRAM is the way of the future, or do you chalk it up to lazy development and a rushed-to-market product that could have been more efficient at launch?

    I happen to believe Watch Dogs is an anomaly, and I wouldn't build a rig around an exception to the rule. There are certainly more visually beautiful games out there that do not have the bloated VRAM issues that Watch Dogs has.


    It is definitely the anomaly right now and is by no means a new standard. I do think that once a game breaks a certain barrier in GPU requirements, other games tend to follow in its footsteps and ramp up their usage too. Since this thread is asking about two years down the road at Ultra, I would not be surprised to see 3GB of VRAM become the new minimum by then. Two years ago 1GB used to be plenty and now 2GB is the minimum, so in two more years I just see that moving to 3GB.
  22. fkr said:
    skit75 said:
    Good link.... I visit there also.

    So I guess the question becomes: do you actually think Watch Dogs' way of using VRAM is the way of the future, or do you chalk it up to lazy development and a rushed-to-market product that could have been more efficient at launch?

    I happen to believe Watch Dogs is an anomaly, and I wouldn't build a rig around an exception to the rule. There are certainly more visually beautiful games out there that do not have the bloated VRAM issues that Watch Dogs has.


    It is definitely the anomaly right now and is by no means a new standard. I do think that once a game breaks a certain barrier in GPU requirements, other games tend to follow in its footsteps and ramp up their usage too. Since this thread is asking about two years down the road at Ultra, I would not be surprised to see 3GB of VRAM become the new minimum by then. Two years ago 1GB used to be plenty and now 2GB is the minimum, so in two more years I just see that moving to 3GB.


    But let's say I went SLI in the future... does that count for nothing if the VRAM is only 3GB?
  23. You would get some extra processing power from SLI but your available VRAM capacity would still be 3GB.
  24. skit75 said:
    You would get some extra processing power from SLI but your available VRAM capacity would still be 3GB.


    This is why I like the R9 290: it uses the fastest, 512-bit memory interface and you get 4GB.

    Basically, in SLI/CF the VRAM contents are mirrored from one card to the other, so both cards hold the same data at the same time.
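That mirroring point can be sketched as a rough model. The per-card numbers and the 80% scaling factor below are made-up illustrations, not benchmarks; the takeaway is only the shape of the math: usable VRAM stays at one card's capacity while compute scales imperfectly with card count.

```python
# Rough model of SLI/CrossFire scaling (illustrative numbers only):
# VRAM does not pool, because each GPU mirrors the same frame data,
# while compute scales at less than 100% efficiency per extra card.
def sli_effective(vram_gb_per_card, tflops_per_card, num_cards, scaling=0.8):
    usable_vram = vram_gb_per_card                        # mirrored, not added
    compute = tflops_per_card * (1 + (num_cards - 1) * scaling)
    return usable_vram, compute

vram, compute = sli_effective(vram_gb_per_card=3, tflops_per_card=4.0, num_cards=2)
print(vram, compute)  # still 3 GB usable; compute roughly 1.8x one card
```

So two 3GB cards in SLI give you more frames per second, but the same 3GB texture budget, which is exactly why a 4GB single card can age better at Ultra.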