VRAM Redux
#1
What are the chances we will ever see a redux of the "GTX 770 4GB vs 2GB Showdown"?

http://alienbabeltech.com/main/gtx-770-4...-tested/3/

Suggested VRAM requirements continue to be overblown, and that article is the only one I have found that addressed the issue in a straightforward and logical manner.
#2
apoppin is at babeltechreviews nowadays if you want to discuss the article with him.

http://www.babeltechreviews.com/community/index.php
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#3
It was a great article, one I remember very well.
#4
(06-09-2015, 06:15 AM)gstanford Wrote: apoppin is at babeltechreviews nowadays if you want to discuss the article with him.

http://www.babeltechreviews.com/community/index.php

Yes, I'm aware of the move.

(06-09-2015, 12:31 PM)ocre Wrote: It was a great article, one I remember very well.

Seen the subject covered anywhere else?
#5
(06-10-2015, 09:02 AM)JackNaylorPE Wrote: Yes, I'm aware of the move.

(06-09-2015, 12:31 PM)ocre Wrote: It was a great article, one I remember very well.

Seen the subject covered anywhere else?
Yes, though no single source of that magnitude. The situation has changed somewhat,
but it is very game-dependent.

These games are VRAM-limited. They are somewhat new, but especially look at Unity:
http://www.gamersnexus.net/guides/1888-e...ark-vs-2gb

It can be worse in modern console ports that try to load a lot of stuff into VRAM. But most of the time, we don't see big differences just going from 2GB to 4GB on the same exact card.

Here are a few more games where 2GB has a spectacular showing:
http://www.guru3d.com/articles_pages/gig...iew,7.html
https://www.pugetsystems.com/labs/articl...emory-154/

You're probably not going to see a major difference in many games, especially older ones. It still seems overblown for the most part. The chip itself just isn't powerful enough to need all that RAM.
#6
The main interest for me personally is that it adds another dimension to card selection. Though it seems nVidia is working to discourage the practice, two lesser cards in SLI have been a more attractive purchase than the top card.

(2) 560 Ti's beat the 580 by 40% while being $100 less
(2) 650 Ti's beat the 680 for less money

The mid-level cards were then slowed down a bit and nVidia made the approach less cost-effective:
(2) 770's were faster than the 780 / 780 Ti
(2) 970s are faster than the 980 / 980 Ti, and were cheaper at least while the game promos were going on and you could sell off the second game coupons.

But today I see peeps arguing against the 980 Ti because 6 GB is not enough at 4K. Personally, I don't think any card or pair of cards is adequate at 4K, and I'm not really interested until the AAA games can all do 60 fps at 144 Hz. For whatever reason, the ability of a game to use more than X GB is somehow taken to mean there is an observable impact if the card has less than X GB.

This was confirmed in the article here with the Max Payne test, and in the Guru3D article, which concluded:

"What is interesting to see is that the 4GB version utilized over 3GB memory here, the 2GB version obviously can only utilize 2GB. That has no effect on FPS or game rendering experience whatsoever though."

However, the GamersNexus piece does introduce a significant new wrinkle, though it's game-specific. The effect seems very similar to RAM speed / CAS latency: in some games it means nothing, in most it's marginal, and in some it can mean 10+% better performance.
#7
Actually, 2 650 Ti BOOSTs beat the 670 for less money.
Valve hater, Nintendo hater, Microsoft defender, AMD hater, Google Fiber hater, 4K lover, net neutrality lover.
#8
(06-11-2015, 01:21 AM)SteelCrysis Wrote: Actually, 2 650 Ti BOOSTs beat the 670 for less money.

and the 680

http://www.techpowerup.com/reviews/NVIDI...LI/21.html


Here's another one: the 295X2 4GB vs. 290Xs 8GB in CF

http://www.kitguru.net/components/graphi...-review/9/
#9
The last time I bought flagship cards was with the GTX 280; I also went flagship with the 8800 GTX (both in SLI, and I haven't run a gaming rig without SLI inside since G71).

Most of the time I SLI two lower-tier cards and don't regret it one bit. However, I also usually get cards with larger framebuffers (128MB Ti 4200, 256MB 6600 GT, 512MB 7900 GTs, 2GB GTX 460s), and that is something I don't regret either; none of those cards cost all that much more than their lower-memory variants.
#10
(06-11-2015, 01:16 AM)JackNaylorPE Wrote: The main interest for me personally is that it adds another dimension to card selection. Though it seems nVidia is working to discourage the practice, two lesser cards in SLI have been a more attractive purchase than the top card.

(2) 560 Ti's beat the 580 by 40% while being $100 less
(2) 650 Ti's beat the 680 for less money

The mid-level cards were then slowed down a bit and nVidia made the approach less cost-effective:
(2) 770's were faster than the 780 / 780 Ti
(2) 970s are faster than the 980 / 980 Ti, and were cheaper at least while the game promos were going on and you could sell off the second game coupons.

But today I see peeps arguing against the 980 Ti because 6 GB is not enough at 4K. Personally, I don't think any card or pair of cards is adequate at 4K, and I'm not really interested until the AAA games can all do 60 fps at 144 Hz. For whatever reason, the ability of a game to use more than X GB is somehow taken to mean there is an observable impact if the card has less than X GB.

This was confirmed in the article here with the Max Payne test, and in the Guru3D article, which concluded:

"What is interesting to see is that the 4GB version utilized over 3GB memory here, the 2GB version obviously can only utilize 2GB. That has no effect on FPS or game rendering experience whatsoever though."

However, the GamersNexus piece does introduce a significant new wrinkle, though it's game-specific. The effect seems very similar to RAM speed / CAS latency: in some games it means nothing, in most it's marginal, and in some it can mean 10+% better performance.

Very interesting post.


I don't think you will have a single issue with the 980 Ti and 6GB. You really can't listen to the gibberish people say; most people don't think for themselves and just repeat whatever BS they have been fed. Obviously, you are capable of independent thinking and can put things together for yourself.

I don't know why you brought up the 980 Ti; it doesn't fit your two-lesser-cards pattern. I highly doubt that 6GB on the 980 Ti will ever be a big drawback. That is not to say that the Titan X can never have an advantage; it may in some cases. But it won't be anything deal-breaking, if it is even measurable. Right now, aftermarket 980 Ti models are blasting right past the 12GB Titan X in 4K games at their stock settings, right out of the box. These cards aren't held down by temps, so they boost higher and stay there. The 6GB isn't an issue today, and I don't expect it ever will be.

I am confused by two things in your post. The first I will just address, and you can correct me if I am wrong. When you say that nVidia is working to discourage the practice, are you talking about the GTX 960? It is pretty weak, sure. But the 960 is awfully petite all by itself; two together, on that tiny bus, there isn't much hope there.

I am not so sure whether nVidia is trying to discourage mid-level GPU SLI or whether it is just a byproduct of them capitalizing on the situation. See, the GM206 is the mid-level Maxwell cut in half: it is half of a GM204, and the GM200 is roughly triple a GM206.

I still think 970s in SLI would be a pretty potent setup. Not sure how that does against a 980 Ti.

We need to wait one more generation to see if this is a pattern or just how things turned out this round.

The second thing has me scratching my head. What do you mean by 60 fps at 144 Hz?
#11
Just noting that twin x70s are faster than x80s and x80 Tis; there was no real reference to the memory issue. 6GB is way more than enough, now that the "fake rage" 3.5 GB issue has been conclusively shown to be a non-issue. I don't see anything suffering from the 970's memory at 1440p and below, and if you want 4K, you don't buy a 970.

nVidia is working to discourage peeps from buying two third-tier cards instead of their flagship. No more buying two $200 cards and beating the x80. The performance of 960s in SLI is dismal: not even as fast as a single 970.

(2) 560 Ti's ($400) beat the 580 ($500) by 40% while being $100 less
(2) 960s ($400) are a real dumb buy, as they cost $80 more than the $320 970, which beats them.

It would seem that nVidia had been leaving cash on the table: putting out a product (the 560 Ti) where two of them got you 40% more performance while handing nVidia $100 less discouraged the purchase of the 580. They want you to pay more money for more performance, not the other way around, and they have accomplished that with the 9xx series.
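To put a number on that value gap, here's a quick back-of-the-envelope sketch in Python (the performance ratios are just the rough figures quoted above, not new benchmarks):

```python
# Perf per dollar using the rough ratios above (illustrative, not benchmarked).
setups = {
    "1x GTX 580 ($500)":    (1.0, 500),  # baseline
    "2x GTX 560 Ti ($400)": (1.4, 400),  # ~40% faster, $100 less
}
for name, (perf, price) in setups.items():
    print(f"{name}: {perf / price * 1000:.1f} perf points per $1000")
# 580: 2.0 vs. 560 Ti SLI: 3.5 -- about 75% more performance per dollar,
# the kind of deal the 9xx lineup no longer offers (960 SLI < one 970).
```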

Right now the enthusiast gamer's buzz is about either:

1440p @ 144 Hz
2160p @ 60 Hz

Given the choice, I'll take 1440p @ 144 Hz. I don't see 2160p as being ready for prime time for another 12-18 months.

a) If it ain't 120/144 Hz, I'm not interested.
b) We won't see 2160p at much above 60 Hz until DisplayPort 1.3 arrives.
c) What's the point of having 120/144 Hz if your cards can't do 60+ fps?
d) No pair of cards in SLI / CF can reliably deliver 60+ fps across all new AAA games, nor can they deliver 2160p at those frame rates until they have DP 1.3... though I thought I saw a monitor blurb from Computex that used two DP 1.2 cables.
#12
DP 1.2 can handle 2160p @ 60 Hz.
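For anyone who wants to sanity-check that, here's a rough bandwidth estimate (a sketch; the total-pixel timing figures are approximate CVT reduced-blanking values, and 17.28 Gbit/s is DP 1.2's effective rate after 8b/10b overhead):

```python
# Rough check: does DP 1.2 have the bandwidth for 3840x2160 @ 60 Hz, 24-bit color?
h_total, v_total = 4000, 2222   # approx. total pixels per line / lines per frame
bits_per_pixel = 24             # 8 bits per channel, RGB

def required_gbps(refresh_hz: int) -> float:
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

dp12_effective = 17.28          # HBR2: 4 lanes x 5.4 Gbit/s, minus 8b/10b overhead
for hz in (60, 120):
    print(f"2160p @ {hz} Hz needs ~{required_gbps(hz):.1f} Gbit/s "
          f"(DP 1.2 budget: {dp12_effective} Gbit/s)")
# ~12.8 Gbit/s at 60 Hz fits; ~25.6 at 120 Hz does not -- hence the wait for DP 1.3.
```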
#13
(06-16-2015, 01:38 AM)JackNaylorPE Wrote: Just noting that twin x70s are faster than x80s and x80 Tis; there was no real reference to the memory issue. 6GB is way more than enough, now that the "fake rage" 3.5 GB issue has been conclusively shown to be a non-issue. I don't see anything suffering from the 970's memory at 1440p and below, and if you want 4K, you don't buy a 970.

nVidia is working to discourage peeps from buying two third-tier cards instead of their flagship. No more buying two $200 cards and beating the x80. The performance of 960s in SLI is dismal: not even as fast as a single 970.

(2) 560 Ti's ($400) beat the 580 ($500) by 40% while being $100 less
(2) 960s ($400) are a real dumb buy, as they cost $80 more than the $320 970, which beats them.

It would seem that nVidia had been leaving cash on the table: putting out a product (the 560 Ti) where two of them got you 40% more performance while handing nVidia $100 less discouraged the purchase of the 580. They want you to pay more money for more performance, not the other way around, and they have accomplished that with the 9xx series.

Right now the enthusiast gamer's buzz is about either:

1440p @ 144 Hz
2160p @ 60 Hz

Given the choice, I'll take 1440p @ 144 Hz. I don't see 2160p as being ready for prime time for another 12-18 months.

a) If it ain't 120/144 Hz, I'm not interested.
b) We won't see 2160p at much above 60 Hz until DisplayPort 1.3 arrives.
c) What's the point of having 120/144 Hz if your cards can't do 60+ fps?
d) No pair of cards in SLI / CF can reliably deliver 60+ fps across all new AAA games, nor can they deliver 2160p at those frame rates until they have DP 1.3... though I thought I saw a monitor blurb from Computex that used two DP 1.2 cables.

Oh, okay.

I totally agree with you about the refresh rates. No gamer should be interested in buying 60 Hz panels these days. Is your monitor higher than 60 Hz now?

I also think that 1440p would be ideal. But it is not really the resolution; it's the pixels per inch that matter. Pixel density. That is apparently too complex, though, so we resort to 1080p/1440p/1600p/2160p and ignore the most important factor. It is the pixel density that makes 4K what it is, not the "4K" part.
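Since the thread keeps coming back to this, the density math itself is trivial (a quick sketch; the panel size is just an example):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The same 27" panel at three common resolutions:
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f'{w}x{h} on 27": {ppi(w, h, 27):.0f} PPI')
# ~82, ~109, ~163 PPI -- resolution alone says nothing until you fix the size.
```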

I ended up with a 1080p 144 Hz G-Sync monitor, the XB270H. It is an amazing gaming monitor. The drawback is its size: it's huge for a 1080p screen at 27 inches.
I love it though, and there are many others who do.

But a monitor like this may not be for everyone. I don't have perfect vision, and perhaps that is one of the reasons I am fine with it. I think at 27 inches, a lot of people would want at least 1440p. But me, I would rather have higher frame rates and the highest settings than denser pixels. One of the reasons I have always loved the PC is the choice and options; the range of preferences is part of what makes it awesome, or at least I appreciate that.

Anyway, I agree that the 960 is crap for SLI. But I think it is crap as a mid-range option, period.
I am not sure there is any master plan to prevent mid-range SLI from besting the high end, though; I am leaning towards it being a byproduct. Being stuck at 28nm really had an effect: Maxwell was pulled in, and originally it was planned for 20nm.
I know it may not sound like much, but changing fabs or nodes is bigger than most people think. It is not cheap and can drain a lot of resources; a lot happens before a chip can move to production, heck, a lot before one even tapes out. nVidia went with a smaller, less complex Maxwell first and was very successful with the GM107. The follow-up with the GM204 was inevitable and seemed to go very well. nVidia had confidence, but decided to go with the GM204 rather than trying to push out the biggest 28nm chip ever next. See, the GM204 was a much safer bet and would buy them a lot of time to get the GM200 to market.

I truly believe that the GTX 960 got the GM206 as a product of opportunity, not as a plan to stop mid-range SLI. The GM206 fills the slot; it is just how this generation played out.

Maxwell would have been a different generation had it been on 20nm. It's segmented, with the 750 Ti in one generation and the 960 in the other. It wasn't some organized plan from long ago; it is just how things played out.

Of course, nVidia may never release a powerful mid-range GPU again; it remains to be seen. But with advancing nodes becoming so problematic these days, times are changing. GPUs have never been stuck on a node for so long. The future? Who knows.

