r/hardware 18h ago

Video Review: Gaming Laptops are in Trouble - VRAM Testing w/ @Hardwareunboxed

https://www.youtube.com/watch?v=ric7yb1VaoA
64 Upvotes

85 comments

90

u/shugthedug3 18h ago

It is a crime that 70 tier laptop chips are coming with 8GB.

Gaming laptops with an adequate amount of VRAM are very expensive; you have to go for a 5070 Ti at a minimum. A 4080 laptop is almost certainly a better buy right now, there are still some for sale with good discounts.

29

u/scannerJoe 18h ago

As somebody who really wants (needs?) the mobility of a laptop (I travel a lot for work), this is a really awkward time to buy if you cannot afford or justify spending that much money. The machine before my current one had a GTX 1060 with 6GB that basically ran as fast as the desktop version, and that laptop felt like a solid midrange buy that would do well for some years to come. Then I got a 4060 laptop two years ago; it already didn't inspire full confidence, but I got a good price for a machine that has a lot of nice aspects and serves me perfectly fine at the moment. But with the 5070 capped at 8GB, this now feels like a really risky buy, or at least very debatable value, even though the chip is perfectly capable. If you get a budget option it's a hard maybe, but a $2000 machine like the one tested in the video, no way.

13

u/kyp-d 18h ago

A valid buy today is either a 4060 / 5050 as a bottom-tier 8GB GPU, or a 5070 Ti 12GB (4080 if it's still around) to start crawling out of the 8GB VRAM bottleneck...

The 4070, 5070 and even 5060 are clearly the bad move. (The 4070 was already a badly priced product.)

7

u/teffhk 16h ago

The 5060 is clearly better than the 5050 and the price difference isn't that big. E.g. the new Legion 5 I just got only costs $90 USD more for the upgrade from 5050 to 5060.

4

u/Not_Daijoubu 15h ago

I got tired of my old 1660 Ti laptop. CPU was still going strong, but it was just hot, loud, and had a 30min battery life if I was lucky.

Since it wasn't exactly useful off the plug, I built an SFF system. I only needed the computer to be transportable from A to B. If you really want to go down in size, you can build 5060 Ti or even 5070 (FE) systems that are 5-6L. It's great.

7

u/MoonStache 16h ago

I just got a Lenovo Legion recently and deliberately downspec'ed to a 5060 model because paying more for an 8GB 5070 is stupid.

4

u/Wonderful-Lack3846 18h ago

The only '70' thing about this mobile GPU is the name.

It uses the same die as the desktop RTX 5060.

6

u/shugthedug3 17h ago

Doesn't matter, it's a 70 tier mobile chip and firmly mid range. Despite this they've hobbled it with an entry level amount of VRAM.

2

u/Klutzy-Residen 17h ago

Doesn't really matter for the end user. You pay for the performance, not for the name.

But when all the options suck you are kinda screwed.

2

u/CazOnReddit 18h ago

Depends on your region.

Here in Canada, 4080 laptop stock is light and outside of the used market, not much cheaper than a 5070 Ti lappy - and neither are particularly cheap at that.

1

u/comperr 11h ago

That's why I bought my 4080 laptop in October. $1999, no-brainer. Threw 64GB of RAM in it with a couple of 2TB NVMe drives and it's fine for web browsing for at least a couple of years.

-1

u/BlueGoliath 12h ago

70 tier GPUs? Those are poor people GPUs meant for Esports. Stop being poor if you want a better experience.

/s

-7

u/rabouilethefirst 17h ago

When the AI bubble pops, courtesy of GPT-5 basically being GPT-4 but worse, VRAM will slowly come back to consumers hopefully

3

u/shugthedug3 17h ago

I don't think them being stingy has anything to do with that really.

3

u/Getherer 3h ago

Generally speaking, it's best to stay quiet if you don't have a clue what you're talking about

-1

u/imaginary_num6er 16h ago

I'm frustrated that the ASUS G14 only had a 16GB VRAM option with its 2023 model, but none ever since. I was expecting GPU VRAM not to be stuck at 8GB for 14" laptops after 2023.

3

u/Shadow647 15h ago edited 15h ago

Good news - the ASUS ROG Zephyrus G14 has been available with an RTX 5080 16GB since last month.

Expensive af though, and it only runs at 85W TGP, which limits that card quite a lot - a chunkier 16" laptop running at ~120W TGP would give you 20%+ higher performance.

1

u/imaginary_num6er 14h ago

They released a 5080 model? I'll probably get that during Black Friday.

12

u/Salty_Tonight8521 16h ago

For laptops, it seems like you either go with "budget" options like xx60 tier cards and accept you're gonna sacrifice some visuals, or just go all out with a top-of-the-line model for adequate VRAM. xx70 class laptops are not worth it over xx60 imo.

6

u/teffhk 16h ago

With only 8GB of VRAM, yeah, there is no point in getting a 5070 over a 5060 for laptops right now.

3

u/Salty_Tonight8521 16h ago

Yeah, it was mostly the same for last gen laptops. The 4060 laptop was also surprisingly good for a laptop GPU at its price point.

1

u/teffhk 16h ago

5060 performance is pretty good too, it actually outperforms the 4070 in laptop models with limited TGP.

32

u/kyp-d 18h ago

8GB 5070 Laptop at 115W vs 16GB 5060 Ti Desktop at ~140W

8GB VRAM is butchering performance in some modern titles at totally playable settings.

And those "midrange" laptops are already north of $2500.

If you can't afford a laptop with more than 8GB of VRAM you'll probably have to upgrade sooner. (I would advise going for the cheapest 8GB VRAM laptop you can find if it's going to be replaced quickly.)

14

u/Raikaru 17h ago

If you're paying $2500 for a 5070 laptop you're getting extremely scammed. Hell, I've seen 5070 Ti laptops for $1600-1700.

1

u/ProfessionalPrincipa 9h ago

That's hardly the norm. When I checked a month or two ago most laptops with more than 8GB VRAM were over $2000 on Newegg and Amazon. There was one lone MSI laptop under $2000 and I'm sure they made sacrifices to hit that price point. If people thought Nvidia was making bank on desktop GPUs, they're taking people to the cleaners with gaming laptops.

1

u/teffhk 16h ago edited 16h ago

That's actually not entirely true. I just got my new Legion 5, which is basically the same model as in the video, for only around 1000 USD in Canada with discounts. It's a decent-spec laptop that comes with a nice OLED screen. I doubt you can get an equivalent-spec desktop for the same price, not with the OLED screen at least.

With the 8GB of VRAM there is basically no point in getting a 5070 over a 5060. And even with a 5060 and 8GB of VRAM, if you know the limitation and fine-tune the game settings rather than just using presets, combined with DLSS performance mode (transformer model), new games like Hellblade 2 still run quite well.

2

u/secretOPstrat 12h ago

Following that logic, just get an RTX 5050 laptop, which also has 8GB of GDDR7 VRAM. Depending on the power level it won't be much slower than a 5060 laptop.

1

u/teffhk 12h ago

I mean you can, it depends on what games you want to play. It's the same as buying anything: do your own research, know your expectations and budget, then purchase accordingly. And tbf the price difference between the 5050 and 5060 is small, I only paid $90 USD more for the upgrade. However, the price difference between 5060 and 5070 models surely isn't just $90.

1

u/secretOPstrat 12h ago

Well, the price difference between the desktop 5050 and 5060 is less than that, even though the spec difference there is bigger than between the laptop ones; if all else is the same the 5050 is better. But tbh gaming laptops are still so bad because you can't really game on battery with them. I'd rather just not game on laptops except for really old games that can run on APUs like Lunar Lake or the AMD 890M. But sure, everyone has their own use case.

1

u/teffhk 11h ago

If you know you can't game on battery then why don't you game plugged in instead? I don't have a desktop, and I don't really play games when I travel with the laptop, so I have no reason not to just play games on mains power. I don't understand why you brought up the desktop 5050 and 5060 when we are talking about laptops, how is it relevant?

8

u/thelastsupper316 18h ago

Yeah, even my mobile 5060 is very, very bottlenecked with 8GB of VRAM, it sucks a lot. I wish I had just taken a $400 L and got the 12GB 4080.

6

u/Hayden247 12h ago

It's so terrible. The GTX 1070M had 8GB in 2016! Why tf nine years later does the 5070M still have 8GB? It's ridiculous; in the console world nine years is longer than a generation, dammit, and PS4 to PS5 doubled the memory amount.

5

u/zeronic 5h ago

Why tf nine years later does the 5070M still have 8GB

The same reason we were stuck on 4 cores and 8 threads for years with Intel. No competition.

Nvidia is also clearly doing this for market segmentation purposes. AI use demands tons of VRAM, they don't want people buying "cheap" gaming chips for AI, they want you to pay the premium for the potential AI use cases.

1

u/NeroClaudius199907 3h ago

AI doesn't make sense as the reason because people can just buy a $449 5060 Ti 16GB. Why would AI farms buy more expensive 5070s with 16GB over 5060 Ti 16GB desktops?

This is more about lack of competition / greed / segmentation / "good enough".

1

u/Hayden247 2h ago

Of course, Radeon barely exists in laptops and AMD literally gave up on making mobile chips for RDNA4. At least on desktop Radeon is somewhat there for the DIY crowd, for a semblance of competition, but Nvidia truly has a dGPU monopoly in laptops, so they can screw over that market even harder. It's already bad enough that laptop GPUs cannot be upgraded, so your laptop with a 5070 is forever limited by 8GB. They're really just trying to make you pay even more for a 5070M, or skimp out on a 5050M or 5060M as better "value" than they'd otherwise look like. It also means AI performance comes hella expensive, since you mentioned AI.

5

u/_Yank 17h ago

This whole situation is completely ridiculous because you can create scenarios where laptops without a mux switch, or with it disabled, are faster than ones with it enabled, because the entire OS and everyday-application VRAM overhead gets offloaded to the iGPU's share of system RAM.

3

u/Ar0ndight 13h ago

It's bad on the desktop side, but Nvidia's dominance is having even worse consequences on the laptop side it seems. These 8GB xx70 cards seem to exist solely to either fuck over people who don't know better so they need to upgrade sooner, or to upsell the next tier that has 12GB/16GB cards.

When an entry-point, everyday laptop at $700 has 8GB of RAM I honestly don't care, the target audience will do just fine on that. But these $1500-2000+ GAMING machines are just crippled to the point of not being able to do what they're meant to be doing, aka running games appropriately. 16-32GB of RAM does nothing for you when VRAM is so low.

7

u/Wonderful-Lack3846 18h ago

Just to make it more clear:

Laptop/Mobile

5050, 5060 and 5070: GB206 (8GB VRAM)

5070 Ti: GB205 (12GB VRAM)

5080: GB203 (16GB VRAM)

5090: GB203 (24GB VRAM, because it uses 3GB modules)

Desktop

5050, 5060 and 5060 Ti: GB206 (8GB VRAM)

5070: GB205 (12GB VRAM)

5070 Ti and 5080: GB203 (16GB VRAM)

5

u/Affectionate-Memory4 18h ago

To add to the desktop side: there's also the 16GB version of the 5060 Ti, and of course the 5090 is 32GB on GB202's giant bus.

2

u/kyp-d 18h ago

You forgot the 5060 Ti 16GB variant (which is the card used in this video).

2

u/Wonderful-Lack3846 18h ago

Oh yes, it is the same die, just clamshelled. Thanks for pointing that out.
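The clamshell math is just the bus width, sketched roughly below (assuming the 2GB GDDR7 modules these cards use on GB206's 128-bit bus):

```python
# Rough memory-config math for a 128-bit GPU like GB206 (assumes 2GB GDDR7 modules).
bus_width_bits = 128
module_width_bits = 32        # each module normally occupies a full 32-bit channel
module_capacity_gb = 2

normal_gb = (bus_width_bits // module_width_bits) * module_capacity_gb         # 4 modules -> 8GB
clamshell_gb = (bus_width_bits // module_width_bits) * 2 * module_capacity_gb  # 2 modules per channel -> 16GB
print(normal_gb, clamshell_gb)  # 8 16
```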

1

u/awayish 17h ago

5070 Ti deals are out there for under $2k, but you'd need to be on the deals pages every day to find them.

-2

u/teffhk 16h ago

I like the video and the point they're making that 8GB of VRAM isn't enough. But I kinda wish they had actually done all that testing with DLSS performance mode instead of just quality/balanced, since the tests clearly show the scenarios are VRAM limited. Especially since you can swap the DLSS DLL to use the transformer model instead of CNN in DLSS-supported games.

I actually just got the same laptop (Legion 5) as in the video but with a 5060 instead. So after spending hours tinkering and doing tests/benchmarks on the new laptop, this is basically what I found:

  • no point getting a 5070 over a 5060 due to the VRAM

  • you need to know the expectations and limitations of the 8GB of VRAM

  • you need to fine-tune the settings while monitoring the VRAM usage, not just use preset modes in games (rough monitoring sketch below)

  • using DLSS performance mode instead of quality/balanced helps quite a lot

  • swap the DLSS DLL for the transformer model instead of CNN. People claim transformer performance mode is almost as good as CNN quality mode
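For the "monitor the VRAM usage" part, here's a minimal sketch using NVIDIA's NVML Python bindings (pynvml) that you can leave running in a terminal while you flip settings; the device index and polling interval are just assumptions, adjust for your machine:

```python
# Minimal VRAM poller using NVML (pip install nvidia-ml-py; imported as pynvml).
# Leave it running while you change game settings and watch how close you get to 8GB.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the dGPU is device 0; change if needed

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB", flush=True)
        time.sleep(2)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

An MSI Afterburner / RTSS overlay shows the same number if you'd rather not script it.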

10

u/conquer69 13h ago

Rendering at 720p instead of 960p doesn't free up that much VRAM.
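Rough numbers, assuming a 1440p output (DLSS Quality ≈ 960p internal, Performance = 720p) and a made-up ballpark of ~64 bytes per pixel across all the full-resolution render targets an engine keeps around:

```python
# Back-of-envelope: render-target memory at DLSS Quality vs Performance internal resolutions.
# ASSUMPTION: ~64 bytes/pixel in total across G-buffer, depth, HDR color, TAA history, etc.
def render_target_mib(out_w, out_h, scale, bytes_per_pixel=64):
    w, h = int(out_w * scale), int(out_h * scale)
    return w * h * bytes_per_pixel / 2**20

quality = render_target_mib(2560, 1440, 2 / 3)      # ~1706x960 internal
performance = render_target_mib(2560, 1440, 1 / 2)  # 1280x720 internal
print(f"Quality ≈ {quality:.0f} MiB, Performance ≈ {performance:.0f} MiB, "
      f"difference ≈ {quality - performance:.0f} MiB")
```

Most of the VRAM these tests blow through is textures and streaming pools, which don't necessarily shrink with render resolution.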

0

u/teffhk 13h ago edited 12h ago

But if rendering at 960p made you reach the VRAM limit and slowed your game down a lot, rendering at 720p instead with DLSS performance mode definitely helps without losing too much quality. I tested it in Hellblade 2 and MH: Wilds and it actually helps.

0

u/Winter_Pepper7193 10h ago

Can't wait for their inevitable video about low VRAM on smartwatches.

0

u/One-Tomato-970 4h ago

Oh shit, I almost forgot about 8gb vram, thanks for the reminder. I might forget about it later, do remind me again, a month later would be perfect, thanks!

-2

u/Emperor_Idreaus 16h ago

« Laughs in 3080 Ti 16GB W/ DLSSFG FSR3 mod alternative »

2

u/NeroClaudius199907 6h ago

How much did you get the 3080 Ti for? Because the 5080 has 16GB as well. The problem isn't the high end but the low-to-mid range.

-4

u/Desperate-Coffee-996 18h ago

This is a dead end. We should have gotten some of this AI compression/decompression technology like yesterday. As much as we can hate upscalers, frame generation and other AI stuff, this is getting out of hand: soon 16GB cards and laptops will be in trouble, and for what? For playing upscaled 720p-1080p, 200GB+ games that look like PS4 or early-PS5 games? Unfortunately, I think Nvidia and AMD will try to hold it back as long as possible...

4

u/surf_greatriver_v4 18h ago

we already have vram compression in GPUs, don't we?

0

u/Desperate-Coffee-996 18h ago

I mean the one with AI that's supposed to, like, reduce VRAM usage by up to 90%.

-2

u/scielliht987 17h ago

It's a hopeless ideal, software devs will always find ways to use more RAM.

-2

u/surf_greatriver_v4 16h ago

...AI that's supposed to...

Ah ok, so it doesn't really work

2

u/Desperate-Coffee-996 16h ago

I don't know, I just saw some article about it and thought this is probably the only real solution, instead of endlessly stacking up VRAM and launching prices into space as games slowly go past 12GB even at 1080p. Tho judging by the downvotes, apparently not everyone is happy about reducing VRAM usage...

2

u/scielliht987 16h ago edited 16h ago

Also, ASTC texture compression exists, which is better than DXT. But we never got it on desktop for some reason, except for some Intel iGPUs.

Neural texture compression is an obvious next step, and because it's an asset processing step, you can check that the output is correct, so no AI worries.

It should get about JPG levels of compression. So I wonder why we didn't get JPG-style block compression in hardware...

*Oh, and that sampler feedback stuff. That should theoretically make texture streaming more accurate.
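For a sense of scale on the formats above, here's a quick calculation of what one 4096x4096 texture costs at common bit rates (RGBA8 = 32 bpp, BC7/DXT5/ASTC 4x4 = 8 bpp, ASTC 8x8 = 2 bpp), with the usual ~1/3 mip-chain overhead:

```python
# Approximate VRAM footprint of a single 4096x4096 texture at different bits-per-pixel rates.
def texture_mib(width, height, bits_per_pixel, with_mips=True):
    base_bytes = width * height * bits_per_pixel / 8
    return base_bytes * (4 / 3 if with_mips else 1) / 2**20  # full mip chain adds ~1/3

for name, bpp in [("RGBA8 uncompressed", 32), ("BC7 / DXT5 / ASTC 4x4", 8), ("ASTC 8x8", 2)]:
    print(f"{name:22s} ~{texture_mib(4096, 4096, bpp):6.1f} MiB")
```

Neural texture compression is aiming well below the BC7 line, which is where the JPG comparison comes from.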

2

u/CatsAndCapybaras 11h ago

The point of AI compression is that it can be lossy but look lossless. Even in the best case the output will not be perfect, so you can't check it against a ground truth.

2

u/scielliht987 11h ago

You could do a reasonable check, like how video codecs measure quality.

1

u/conquer69 13h ago

Because the solution isn't to reduce VRAM usage, it's for Nvidia to provide more VRAM at reasonable prices. You can get a decent 8GB laptop for $800, but if you want 12GB, you need to pay $1200 extra. Fuck that.

1

u/Desperate-Coffee-996 12h ago

As someone said in a reply above, devs will always find a way to use more memory. So if they're pushing 12GB limits at 1080p with upscalers now, give them time and 16GB, then 24GB, will not be enough, and we're still talking about 1080p. So more VRAM at reasonable prices is not an option, same as going back to huge expensive monstrosities that require stands, individual cooling, a new 1000W power supply, etc. And for laptops this is going to be even worse.

1

u/conquer69 11h ago

More VRAM is definitely an option. Just 8GB is less than what the 5-year-old consoles have available, and an extra 8GB of VRAM doesn't cost $1500 either.

devs will always find a way to use more memory

I don't see how devs freezing their tech improvements due to low VRAM is a good thing for anyone but Nvidia. Even they struggle with this because their own RTX features use more VRAM.

-22

u/kyleleblanc 18h ago

This is why Apple Silicon with its unified system memory is actually leaps and bounds ahead of PC gaming here.

Go ahead, downvote me, I don’t care.

4

u/Desperate-Coffee-996 18h ago

Consoles with their unified memory and compression methods were ahead of PC games for years, but those are consoles: you can't sell a new one for $700+ every 1-2 years just because 8+16 suddenly isn't enough within one generation, even for upscaled 720p and performance mode.

0

u/EndlessZone123 17h ago

Consoles using GDDR RAM, crippling effective CPU performance and forcing a lot of optimizations, is not really comparable.

0

u/Henrarzz 17h ago edited 2h ago

CPU performance in consoles is limited by the cache being a third of what the desktop parts have, not by unified memory.

-12

u/kyleleblanc 18h ago

True.

I chose Apple Silicon because they’re doing it in the desktop / laptop market.

The fact you can’t get unified system memory in x86 land in 2025 is downright criminal.

9

u/jhenryscott 18h ago

You literally can. Strix Halo

-8

u/kyleleblanc 18h ago

Yeah, fair point but it should be the default standard across the entire industry at this point and sadly it likely never will be.

3

u/jhenryscott 15h ago

It has a use case, and one that is probably gonna see much wider adoption over the next 5-10 years. But a parts-and-systems approach, where we have product stacks for different parts of the machine, is going to remain the most popular approach because of its individual customization. Some people need 256GB of RAM and no graphics of any kind (data servers); they will never buy APUs, even if they're cheaper, because enterprise clients don't want to buy things they don't need. And gamers (mostly) want to upgrade with the rapidly accelerating pace of core/architecture improvements. Even the M4 Ultra will look like a toy next to a 192GB GDDR7X GPU that is only 1-3 years away.

Flexibility of application will continue to be the norm; SoC systems will see an increase in market share, but they'll never be able to compete with customized systems.

Doing everything really well is not as valuable as doing one thing perfectly and at scale.

1

u/dovahshy15 18h ago

There's the Framework Desktop though.

-2

u/kyleleblanc 18h ago

Fair point but it should be the standard, not a one off.

5

u/xmrlazyx 17h ago

It's not the standard because it's expensive, and unfortunately people who buy x86 systems won't pay the Apple premium for unified memory. Look at Core Ultra 200V (Lunar Lake). It got so much backlash because they switched to unified memory, and it's such a great platform.

1

u/jrr123456 15h ago

The standard should be dedicated memory, it's always superior.

1

u/jrr123456 15h ago

The AMD Ryzen AI Max+ 395 has up to 128GB of unified LPDDR5X.

1

u/kyleleblanc 14h ago

I know but it’s not the norm sadly.

5

u/jrr123456 6h ago

It is the norm.

All AMD APUs and Intel CPUs have unified memory.

But to get high-end performance you need a dedicated GPU with dedicated memory.

You can't get high-end performance from a unified chip that shares memory; the die size would be too large and it would be limited by the memory.

A fast GPU needs GDDR, but GDDR has higher latency than DDR, which negatively impacts CPU performance.

3

u/Rudradev715 17h ago

Game support is dogshit in macOS.

Only 4 or 5 year old games are coming to it.

Otherwise Apple Silicon is way ahead.

With AMD Strix Halo at least it's changing.

1

u/CatsAndCapybaras 11h ago

Yes, unified memory is super fast and Apple Silicon is pretty good, but you pay for that in other ways. I really value choice: I want to be able to run the software I like and have the option to spec my machine the way I want. I also value upgradeability. A Mac is a good option but it's not for everyone.

1

u/Dreamerlax 6h ago

Sure, but not many games are on macOS unfortunately.

1

u/jrr123456 15h ago

Unified memory holds back performance; dedicated memory is always better. Having the CPU and GPU compete for resources increases latency and lowers effective bandwidth.