
2GB VRAM enough for 1920x1080?

9 Sep 2015, 17:35 PM
#21
avatar of The Prussian Officer

Posts: 76

Permanently Banned

Never run 16x AA; you need a magnifying glass to see the difference over 4x, and 2x is enough for RTS games anyway :snfBarton:


I know haha, but it was just to try what would happen if I did put it on 16x AA. The results were horrible with an incendiary barrage, for example. Meanwhile I can run War Thunder on movie settings with 16x AA, Battlefield 4 at ultra 1440p with 8x AA, and Far Cry 4 at 1080p ultra with 4x AA.

Simply a case of Relic not optimizing well.
10 Sep 2015, 01:13 AM
#22
avatar of Werw0lf

Posts: 121

edit: are you planning to OC?

OC is potentially on the agenda; it depends on the real-world performance returns. If the standard clock provides sufficient CPU grunt for CoH2 and isn't a performance bottleneck, then no, not until it's required. I have owned some legendary clockers over the years; the most recent notable bang-for-buck, reliable OCers were the Intel Core 2 Q6600 and AMD Phenom II X4 955BE. But I OC on an as-required basis, and only if substantial (worthwhile) performance gains can be had without sacrificing stability or reliability. Three years is my personal longevity target.

I will go ahead and upgrade the core system regardless, with some savings made by reusing existing hardware: e.g. a venerable Corsair (Seasonic rebadge) HX-720 modular PSU (I also have a newer HX-520 available), 2TB Hitachi and Seagate 7200rpm HDDs (both good; the Hitachi is the faster real-world performer, and runs cooler and quieter), a Thermaltake V9 case (plus two additional 120mm case fans ..haha.. not so old yet that I can't enjoy a bit of bLinG), an iHBS312 BD writer, etc., with an OS upgrade to Win10 x64 added into the mix when it happens. I currently run Win7 SP1 x64.

I have a decent enough CPU air cooler, but a new CPU cooler of some type will be a prerequisite. I've always been a fan of the performance and bang for the buck offered by Arctic Cooling PWM fans for case, CPU and GPU, all else being equal, although I'm not 'brand blinded', e.g. I'm currently running a Xigmatek S1283 CPU (air) cooler. I am favouring liquid cooling this time around, though. In a climate that is hot, humid and sub-tropical for three-quarters of the year, it's just so much more efficient, acknowledging that a quality, quiet unit is also significantly more expensive than air.

On the replacement GPU, the jury is still out.

Lust is for AMD's new-die Fury class; desire is trying to persuade logic to go for something pricier in the recycled-die GTX 970/R9 390 class than the GTX 960/R9 380 I envisage paying for; and reason is shouting "just wait!". I spoke last night with my mate mentioned in an earlier post; he is currently running a 4GB GTX 980 and planning to upgrade within the next 12 months if a cost-vs-benefit path eventuates within that time frame, so there's a possibility there.

As it is, my venerable HD 6870 performs adequately enough to play this game surprisingly smoothly at 1920x1080, but of course with the caveat/encumbrance that it is detail-level restrictive. And it's most apparent that 1GB of VRAM, typical of its era, is insufficient for this game at that resolution.

I feel the need to comment that I am appalled at Relic's lack of optimization of this title, especially two years after its release. If only the optimization effort were on a par with SEGA's marketing of it!

10 Sep 2015, 01:37 AM
#23
avatar of Cyanara

Posts: 769 | Subs: 1

So glad I got my GTX970 before the dollar crashed through the floor :p

Based on your initial question, I might try doing some personal benchmarks of VRAM usage and chuck them into my performance guide as a graph.

But bottom line: 4GB of VRAM will get you a video card with much better longevity, especially with the new generation of consoles increasing the amount of memory cross-platform games can utilise. You can overclock an older card, but you can't put more RAM onto it.
10 Sep 2015, 02:45 AM
#24
avatar of Cyanara

Posts: 769 | Subs: 1

Ok, I just did some initial benchmarking. Hopefully this data helps you make a decision that's right for you.

  • Resolution: 1920x1080
  • No AA or VSync
  • All other in-game graphics options set to max.
  • All values are from a fresh boot of the game. Playing two maps in a row meant the VRAM wasn't fully flushing: usage went up by 500MB on the same map.
  • VRAM usage may increase as the game goes on. These readings were all taken at the beginning.


La Gleize (6-8 Players) - Changing Textures
Higher: 3.3GB
High: 2.8GB
Medium: 1.6GB
Low: 1.2GB

Changing Snow Quality and Image Quality had no noticeable impact on VRAM usage.

Angoville (2 Players) - Changing Textures
Higher: 1.9GB
Low: 0.984GB


So there's totally room to play with less than 4GB of VRAM, especially if 1v1 is your thing. You just need to know your limits with setting your texture options.
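If anyone wants to log the same numbers themselves, something like the sketch below should do it. It's only a rough sketch: it assumes an NVIDIA card with nvidia-smi on the PATH, a single GPU, and the log filename vram_log.csv is just an example. Start it before launching the game and stop it with Ctrl-C, then graph the CSV however you like.

    import csv, subprocess, time

    # Sample used VRAM once per second and log it so it can be graphed later.
    with open("vram_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "vram_used_mib"])
        start = time.time()
        while True:
            # Query the driver for used VRAM in MiB (single GPU assumed).
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True)
            writer.writerow([round(time.time() - start, 1), int(out.stdout.strip())])
            f.flush()
            time.sleep(1.0)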
10 Sep 2015, 04:59 AM
#25
avatar of Werw0lf

Posts: 121

Thanks for taking the time out to run those tests to assist me, Cyanara. :thumb:

Yer the current exchange rate bites hard.

I've already made the inscribed-in-stone decision that 4GB is the minimum VRAM regardless of which I buy... even if I weaken and do go NOW with an R9 380 or GTX 960.

Right now, despite the R9 series' recycled-die status and TDP, the R9 380 is favoured of those two. But of course it then provokes the temptation that a no-frills reference GTX 970 can be had for just $90 more than the preferred MSI R9 380 option, and $50 cheaper than the next closest-priced GTX 970! What's a poor lad to do?!! Comparing the cheapest of each, the 380/970 differential is $110, roughly a 30% premium: not insubstantial, but still 'at a stretch' justifiable under the 'buy once, not twice' maxim.
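Back-of-envelope on those differentials (a sketch only; the base prices here are inferred from the $110 / ~30% figures rather than being exact shelf prices):

    premium_aud = 110       # cheapest GTX 970 vs cheapest R9 380, as above
    premium_frac = 0.30     # that differential as roughly a 30% premium
    cheapest_380 = premium_aud / premium_frac   # implies ~AUD 367 for the cheapest 380
    cheapest_970 = cheapest_380 + premium_aud   # implies ~AUD 477 for the cheapest 970
    print(round(cheapest_380), round(cheapest_970))  # 367 477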

I'm torn, because experience over the years, from motorbikes to tools and most everything else, has taught me to 'buy the whisky you prefer to drink, not the one you'd prefer to pay for', at the best price you can source it, as the minimum acceptable compromise point.

In terms of total system build and longevity, in that scenario I'm begrudgingly inclined to stretch the budget, and the Galax GTX 970 would win the day. Unfortunately, moving to an R9 390 compounds that by another $80 on top, so despite its 8GB onboard, the 970 comes out on top in the 'just how far am I prepared to go above the planned GPU budget?' wrist-wringing contest. The prime negative of Galax's budget (no backplate) version of the NVIDIA reference-design 970 is that it's almost certainly going to be louder; unresearched as yet, but possibly considerably so versus either a STRIX or Frozr dual-fan config.

Now I'll give it all a few days' cool-off to consider, so emotion (impulse and desire) can detach from the buying decision. Although $450-ish used to be the required and relatively cheap spend for the best OEM buy back in the day (came naked in a static bag with a driver disk, no packaging) if you wanted the HOT Diamond Monster 3dfx Voodoo 4MB or Diamond Viper NVIDIA RIVA 128 4MB, spending that amount on just a GPU today is a grudge spend for me, as I elaborated in an earlier post.

Again guys, many thanks for ALL the comments and suggestions, which I know are meant to assist me and anyone else reading with a similar enquiry in future. :wave:
10 Sep 2015, 09:48 AM
#26
avatar of scratchedpaintjob
Donator 11

Posts: 1021 | Subs: 1

A GTX 970 with a stock cooler is pretty bad. Not only is the 3.5+0.5GB RAM split a thing in some modern games even at full HD, but Maxwell also doesn't support important parts of DX12. Furthermore, a stock cooler means that it is loud and warm....
12 Sep 2015, 03:35 AM
#27
avatar of Werw0lf

Posts: 121

A stock cooler means that it is loud and warm....

Not intended as argumentative, rather just matter of fact.

Investigation peeling back the marketing hype reveals that the reference-design cooler performs quite adequately. Just like my almost identical current HD 6870 implementation, it isn't loud, although it's obviously not going to be as quiet as the Frozr-type solutions, which turn their fans off when the GPU workload is virtually non-existent, i.e. surfing. The GPU temp remains well within spec and is indeed not even especially hot. More importantly, the affordability of stock versus an average OC + proprietary-cooling design outweighs the real-world in-game performance difference in terms of impactful significance. In other words, warts 'n all, the stock GTX 970's lesser performance is insignificant relative to the sexier Frozr or STRIX implementations, yet it is worlds apart performance-wise from an OC + proprietary-cooling R9 380 for not all that much more in $$. Moving to an R9 390 is a significant price-point step again, and although it has benefits, it is not without its own warts.

If you want a GTX 970, the vanilla reference design is good value in terms of performance vs price compared with the alternatives. I still think it's overpriced for what it is, warts 'n all, but hey... NVIDIA are in a "take it or leave it" selling position.

Acknowledged on the 970's 3.5GB VRAM issue. I wasn't aware of the restrictive DX12 support issues? Segmenting suX, doesn't it?! The former is a completely unnecessary, deliberate implementation to hobble performance in order to delineate segment price points relative to performance. I suppose now that we have a set-in-concrete, good-guy/bad-guy, cosy GPU cartel, it's all we can expect as long as the public keeps paying demand pricing for oversized, glamorized boxes. And they will, because the primary target demographic is renewed regularly as youth blossoms into young adulthood with the capacity and willingness to pay.

Although I will buy either brand based upon price, performance and compatibility, generally I am more disposed to AMD GPUs and graphics. ATI (now AMD) have been around a long, long time, and always made the very best chipsets and own-brand-spec graphics cards. But can you believe that after more than a decade, ATI and now AMD still haven't fixed the mouse cursor corruption issue, which still occurs with the very latest Catalyst drivers! When it occurs in game, it's effectively game over, requiring a system reboot to fix. I just finished a 1v1 automatch I lost primarily due to this issue. One extremely frustrating and highly motivating factor to buy an NVIDIA product this time around.

I'm not prepared to pay AUD$750+ for a GTX 980, so that being the case, other than the imperfect GTX 970, what's your solution?




12 Sep 2015, 11:05 AM
#28
avatar of scratchedpaintjob
Donator 11

Posts: 1021 | Subs: 1

Well, the stock cooler is ~20 degrees warmer than the G1 Gaming in the test they did. It's a little bit louder, but seeing that the G1 Gaming is one of the louder designs, one could say that the reference design is loud and warm. You will likely get 20% more clock with a custom design.

I can only speak for Germany, but you can get the stock design for 315€ and the custom designs from 330€ to 350€ for the good ones. That is not a lot more, especially if you plan to sell it in 2 years or so; the resale value is drastically higher with a good design.

About the drivers: AMD has stepped up in this department, maybe not in your case, but they are generally getting better, while NVIDIA's drivers are getting worse IMO. Do you have a dual-screen setup? Have you contacted AMD's customer support?

The GTX 970 is slow when a game needs more than 3.5GB of VRAM, and it doesn't support asynchronous compute. Therefore I cannot recommend buying it at all.

The R9 390 Nitro (not the X!) is 340€ in Germany, so 15€ more than a stock GTX 970. Is this vastly different in Australia?

Edit:
Mouse cursor corruption may not be a problem with the R9 390
https://www.reddit.com/r/AdvancedMicroDevices/comments/3hyj5o/r9_390_and_cursor_corruption_bug/
12 Sep 2015, 12:52 PM
#29
avatar of Werw0lf

Posts: 121

Edit:
Mouse cursor corruption may not be a problem with the R9 390

Re the mouse cursor bug: ATI/AMD know about it. Have for years. I've had a succession of primarily ATI/AMD GPUs over the past decade, as well as a few NVIDIAs (more than one PC plus a laptop), and ATI, now AMD, did and do nothing about it. Google will reveal. So I wouldn't count on it, unless the R9 390s use a completely different version of Catalyst? And then I'd have to witness it myself. Coming from someone who is generally a fan of AMD/ATI GPUs, that's harsh criticism.

I think this post sums up the real world best, rather than the 'it hasn't happened to me so you must be crazy/wrong' posts: "Happened to me with my 5770, and my 7870. Time will tell for the 390." I do sincerely hope the latter crowd are in fact right this time around.

AUD/EUR at the moment = 0.63

The difference in price between the cheapest reference card (Palit/Galax) and an ASUS/MSI STRIX/Frozr is AUD 70~80, or EUR 45~50, on average. A little cheaper for an XFX/Sapphire or other proprietary twin-fan arrangement, but still AUD 50 more (EUR 30). IME a 20% performance difference would be 'highly unlikely'. Even ASUS, who are never shy of self-promotion, claim less than 8% more speed for their OC version STRIX. Quote: "1202 MHz boost clock for up to 7.7% faster game performance". MSI do make Frozr-cooled reference cards, but not for the 970. Interpolating, up to 10 degrees cooler on average than reference still seems ambitious. MSI make no speed-advantage claims other than the maths you can do yourself on their OC as a percentage of standard clocks.

IME GPUs now depreciate like crazy as soon as the NEW next best thing arrives. Marketing sees to that, and segmentation has also done its bit to eliminate resale other than gifting. Unless you're a bleeding-edge buyer cycling every 12 months, any card is pretty much valueless by the time it's past its use-by date, as no one else wants it then either.

Copy that on the 'we feel cheated' (and rightly so, IMO) 3.5GB effective VRAM issue and the other compromises.

The R9 390 locally (not the X) is AUD$520 => EUR 330; it's come down $20 in price over the past week. The Sapphire-brand unit can be had marginally cheaper, but I won't touch Sapphire-brand cards. Been there, have the t-shirts. IME they use cheaper-grade components on their PCBs and are prone to premature failure. Not worth the hassle to save a paltry $20 when you're spending $500 or more. Reliability is always a paramount criterion AFAIC.
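For anyone checking the conversions, the rough sums behind those figures (a sketch only; the 0.63 rate and AUD prices are the ones quoted above, and 75 is just the midpoint of my 70~80 range):

    AUD_TO_EUR = 0.63  # approximate rate quoted above

    prices_aud = {
        "ref vs STRIX/Frozr premium": 75,        # midpoint of the AUD 70~80 range
        "ref vs twin-fan (XFX/Sapphire) premium": 50,
        "R9 390 (non-X), local": 520,
    }

    for item, aud in prices_aud.items():
        print(f"{item}: AUD {aud} ~= EUR {aud * AUD_TO_EUR:.0f}")
    # AUD 75 -> EUR 47, AUD 50 -> EUR 32, AUD 520 -> EUR 328 (the ~EUR 330 above)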

If I were going to spend on either the 4GB (effectively 3.5GB) GTX 970 Frozr/STRIX or an 8GB R9 390 pitched at the same price point, AMD would be highly likely to get my money again, despite the mouse cursor bug and the essentially high-TDP, three-year-old recycled die. In my heart, $500 is really considerably more than I want to spend on JUST a GPU.





12 Sep 2015, 13:28 PM
#30
avatar of Khan

Posts: 578

Should be fine. I get a steady 55+ FPS on my R7 260x at 1920x1080 with Image Quality and Textures set to High with Low AA.