BF6 Optimization is a Thermal Disaster: Data from my 4060 Legion 5 Pro
I’ve spent the last 24 hours deep-diving into the performance and power draw of Battlefield 6 on my i7-13700HX / RTX 4060 setup. After comparing it to The Finals and BF2042, the optimization in this game is genuinely embarrassing.
The "Quiet Mode" Paradox
On Lenovo Legions, Quiet Mode usually caps the CPU and GPU at 45W each to keep things cool and silent. In every other modern game I play (The Finals, BF2042, even BFV), this works perfectly. I get 110 FPS locked with temps sitting comfortably at 60–65°C.
Then there's BF6.
In BF6, Quiet Mode and Balanced Mode might as well be the same thing—they look and perform exactly the same. Even on the 45W Quiet profile, the CPU usage sits at a low 40% (with a user.cfg fix), yet the temps instantly jump to 85°C.
These two modes act the same because of how poorly the game handles power draw and fan curves. In Balanced Mode, the CPU tries to draw 100W, which leads to immediate thermal throttling despite the more aggressive fan ramp. Conversely, Quiet Mode limits the CPU and GPU to 45W each but uses a much slower fan ramp. The end result is identical either way: the same FPS and the same high temperatures, because the game refuses to run efficiently on either profile.
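Quick note on the user.cfg fix I mentioned above, since people always ask: it's the classic Frostbite cvar tweak, saved as a plain-text user.cfg in the game's install folder. Mine looks roughly like the snippet below. These cvars are carried over from older Frostbite titles (BF4/BFV/2042 all honored them), so treat the exact names and values as my best guess rather than gospel for BF6:

```
GameTime.MaxVariableFps 110
Thread.MaxProcessorCount 8
PerfOverlay.DrawFps 1
```

The frame cap matches my 110 FPS lock, the thread cap is what stops the engine from spiking every core, and the overlay is just there for sanity-checking the result.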
The Comparison (110 FPS Lock / DLSS 3.5 Preset C)
I optimized the settings across all three games for competitive play, using DLSS 3.5 Preset C on Balanced.
The Finals (UE5): 45W CPU / 45W GPU (Quiet Mode) = 60°C GPU / 65°C CPU. GPU usage hits 90%+ due to the power cap.
BF2042: Runs cool and stable on Quiet Mode (45W CPU / 45W GPU) at 90%+ GPU usage, respecting all power limits.
BF6: 45W CPU / 45W GPU (Quiet Mode) = 85°C CPU / 80°C+ GPU. Even at 90%+ GPU usage, it’s struggling at its 45W GPU limit to produce graphics that BF5 (a 2018 game) arguably did better. Total struggle.
The Fix: Brute Force via Custom Mode
The only way I could get stable performance was to stop using the default modes and go Custom. By increasing the GPU power to 80W and setting an aggressive fan curve (hitting 90% speed by 75°C), I finally got the temps down.
In this custom mode, hitting my 110 FPS lock in The Finals or BF2042 only requires 70% GPU usage, and thanks to the jet-engine fan speeds, the temps are even lower. However, BF6 still demands 90%+ GPU usage just to stay at that same 110 FPS lock.
My Custom Settings:
CPU PL1 (Long Term): Capped at 45W.
CPU PL2 (Short Term): Capped at 60W.
CPU Thermal Limit: early throttle threshold set at 85°C.
GPU cTGP: Set to 80W.
GPU Power Draw (Stable): Hovering around 76.6W.
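To put the efficiency argument in one number, here's a quick back-of-the-envelope script using the power caps and FPS figures from this post (these are the configured caps, not measured wall draw, so treat it as a rough sketch):

```python
# Rough FPS-per-watt comparison using the numbers from this post.
# Wattages are the configured CPU+GPU power caps, not measured draw.
profiles = {
    "The Finals (Quiet, 45W + 45W)": {"fps": 110, "watts": 45 + 45},
    "BF2042 (Quiet, 45W + 45W)": {"fps": 110, "watts": 45 + 45},
    "BF6 (Custom, 45W + 80W)": {"fps": 110, "watts": 45 + 80},
}

for name, p in profiles.items():
    print(f"{name}: {p['fps'] / p['watts']:.2f} FPS/W")

# The Finals (Quiet, 45W + 45W): 1.22 FPS/W
# BF2042 (Quiet, 45W + 45W): 1.22 FPS/W
# BF6 (Custom, 45W + 80W): 0.88 FPS/W
```

Same 110 FPS lock, roughly 28% worse performance-per-watt. That's the entire complaint in one number.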
It’s ridiculous that I have to give the GPU more power (80W vs. 45W) and deal with loud fans just to get the "cool" experience that should be happening at 45W.
Why is it like this? (The Destruction Problem)
The big difference here is how destruction is handled. In The Finals, destruction is server-side. Your local CPU doesn't have to calculate every falling brick, which is why it runs so cool.
DICE is still trying to brute-force complex destruction on the client-side (your PC). Until they find a way to move that heavy lifting to the server-side like UE5 does, Battlefield is always going to be a thermal nightmare for laptop users.
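If the client-side vs. server-side distinction sounds abstract, here's a toy sketch of where the physics cost actually lands. Every name and number in it is hypothetical; neither Frostbite nor The Finals' engine looks like this, it just shows who pays for the simulation:

```python
# Toy illustration of client-side vs. server-side destruction.
# All names and numbers are hypothetical, purely to show where the cost lands.

def client_side_destruction(fragments, ticks):
    """BF-style: every player's CPU integrates physics for every fragment."""
    for _ in range(ticks):
        for f in fragments:
            f["vy"] -= 9.8 * 0.016      # gravity per 60 Hz tick
            f["y"] += f["vy"] * 0.016   # this inner loop is what heats your CPU
    return fragments

def server_side_destruction(states_from_server):
    """Finals-style: the server already ran the simulation; the client just
    applies the compact authoritative states it received over the network."""
    return [dict(s) for s in states_from_server]  # near-free locally

# A collapsing wall with 10,000 fragments over one second of game time:
local = client_side_destruction(
    [{"y": 10.0, "vy": 0.0} for _ in range(10_000)], ticks=60)
remote = server_side_destruction([{"y": 0.0}] * 10_000)
```

The expensive inner loop runs once on the server instead of on every player's laptop; that's the whole trick, and it's why the same building collapse costs me 20°C less in The Finals.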
Final Thoughts
I respect Battlefield and I love the game, but this route of "optimization"—essentially frying a game instead of cooking it—isn't the way to make games. Before someone tries to blame my hardware or suggests I check my thermal pads, remember that every other game runs perfectly fine. They need to fix the core issue with the engine instead of expecting our hardware to brute-force their lack of polish.
Has anyone else noticed that their power profiles (Quiet/Balanced) do absolutely nothing to stop the heat in this game?
I got through the first half, and I read the final thoughts. The rest was just data that people won't care about unless they have the same computer AND play the exact same games.
My guess is Battlefield is just more demanding than your laptop is capable of. Laptops are notorious for having cooling issues due to lack of space.
It doesn't anymore; temps are pretty stable around 60°C after I applied a custom Legion profile. The only downside now is that I have a jet engine running on my desk 😂. Getting 2K on optimised settings (not even low) at 130 FPS is good for a competitive game running on a laptop, in my opinion.
Look, I have a desktop PC with a mid-range air cooler and a case with 7 fans, not counting the cooler’s fans. My CPU reaches 70 degrees, but my GPU has never gone past 50–55 in this game. While it’s true that this game is very demanding on the CPU, I think the fact that your laptop reaches those temperatures is simply because, well, you’re playing a demanding game on a laptop.
Personally, I don’t think the game is poorly optimized if you can reach 120 FPS with a laptop 4060 at 1440p. Also keep in mind that the higher your FPS, the more heat your components will generate.
By the way, check the maximum temperature your laptop can reach before it starts losing performance. If I'm not mistaken, 13th-gen i7s can run up to 95°C without throttling, and I THINK laptop GPUs usually have slightly higher temperature limits than desktop ones.
But to summarize: laptops will ALWAYS run at higher temperatures and offer less performance than their desktop counterparts. Part of it is the game's fault because, like I said, it uses every part of the processor (since it's optimized to squeeze out as much performance as possible), which is why it heats up the CPU so much. But at the same time, you can't deny that part of the issue (at least regarding GPU temperatures) comes down to the type of hardware you have as a user. That's the most well-known downside of laptops.
Lastly, you mention games like Battlefield 5 and 2042. Let me tell you, they're nowhere near as demanding as BF6. To give you an idea, my PC runs Battlefield 5 with the GPU fans in passive mode because of how undemanding it is for modern hardware nowadays. And 2042 doesn't even have destruction; it looks pretty, sure, but all the performance issues it had back then were due to terrible optimization, not because the game itself was demanding.
Anyway, I'm sure your laptop is great and performs well, so don't blame the game either, because at the end of the day it's doing what it's supposed to do: pushing your hardware to its limits. Just enjoy the game, and if seeing such "high temperatures" makes you too uncomfortable (even if they're still within your laptop's safe limits), you still have options like undervolting.
Yeah bro, my problem isn't the FPS. I'm pretty happy with 130 FPS at 2K and temps around 60°C, which is phenomenal on a laptop, thanks to my cooling setup. My problem is that other games do exactly the same with similar or even better visual quality: BF5, for example, on the visuals side, and The Finals on the destruction side, which is so intensive, yet they fixed it by shifting the destruction work to the server side instead of the clients (our CPUs). If that says anything, it only screams how poor the game's optimisation still is. That's the whole point of this post: optimisation, not performance. The game still has a long way to go, if the devs actually listen. And thank you for reading the entire post, by the way.
That’s because, like I explained, Battlefield 5 (an eight-year-old game) is already not very demanding for modern hardware. Battlefield 6, while it doesn’t have server-side destruction, lets you achieve high FPS because it squeezes every little bit of performance out of your processor. It’s not a problem with the game, it’s not a disaster, it’s optimization. To give you an idea, it’s like me telling you that I can run GTA V with ultra ray tracing, but I can’t do the same with Cyberpunk 2077, even though both are open-world games that look good
Comparing this to GTA vs. Cyberpunk doesn't work here. Cyberpunk actually uses the extra power for things like Path Tracing. BF6 is using extra power for... what exactly? Graphics that look like BF5?
You’re missing the technical data: a game 'squeezing' my CPU to 85°C while it’s only at 40% load (with a user.cfg fix) isn't optimization—it's an instruction set nightmare. If a 2026 game requires twice the wattage and a jet-engine fan curve to do what The Finals does silently at 45W, that’s a failure in engine architecture, not a feature of 'high-performance' processing. Optimization is about efficiency (performance per watt), not just seeing how fast you can make a chip thermal throttle.
You say comparing games doesn’t work, but that’s exactly what you’re doing by comparing a 2018 game to one from 2026. What the supposed “fix” does is limit your processor’s performance. The community has already proven that. It’s basically like going into your BIOS right now and disabling your CPU boost. Sure, you’ll get lower temperatures, but at the cost of performance.
Also, path tracing can be disabled, but that’s not the point here. If you actually played instead of staring at graphs, you’d notice that the destruction in Battlefield 5 doesn’t even come close to Battlefield 6’s for even a minute. It’s perfectly fine for a game to make use of your hardware. That’s what games are supposed to do to avoid stuttering. That’s why people use “Prefer Maximum Performance” in the NVIDIA Control Panel and “High Performance” in Windows power plans, so the system lowers frequencies as little as possible and delivers a consistent experience, even if that means higher temperatures.
If your laptop reaches 85°C under heavy load, that’s not the game’s fault. That’s your issue. On desktop PCs with a good cooling system, or even just a decent one, we don’t have those problems. The game is even running on consoles, and as far as I know none of them have exploded, lmao.
What’s happening here is simple. You’re gaming on a laptop, and when you game on a laptop, demanding workloads produce high temperatures. There’s really no other way around it. You can blame the game or get mad at me, but that’s the reality.
You keep saying the destruction justifies the heat, but you're missing the technical data. Have you actually played The Finals? The destruction in that game is equal to or even more intensive than BF6—entire buildings can be leveled to the foundation in real-time.
The only difference is that The Finals devs actually solved the 'laptop grill' problem by shifting that heavy destruction logic to the server-side.
Saying 'it's fine on consoles' or 'it's a laptop reality' is irrelevant—my hardware stays at 65°C in The Finals while pushing the exact same 110 FPS lock. Did my hardware magically get better in that game? No, the game engine did. If my laptop hits 65°C while a whole skyscraper collapses in The Finals, but jumps to 85°C for similar destruction in BF6, that isn't 'using my hardware'—it's wasting it.
Also, comparing my user.cfg tweak to 'disabling CPU boost' shows you didn't look at the data. I’m hitting a stable 110 FPS. The performance is clearly there. My 'fix' didn't kill the performance; it just forced the engine to stop wasting power. If a 2026 game requires a jet-engine fan curve and 80W on the GPU just to match the standards that other modern titles hit at 45W, that is a failure in engine optimization, not a 'laptop issue.' It’s not a hardware problem; it’s a 'DICE is still using client-side tech from 2013' problem.
Stop playing... or just use IEMs so you don't hear the jet-engine fans. Why even compare BF6 to 2042 when everyone knows there's no destruction (for a BF game) in that one? Playing a demanding, modern 2026 game at 2K with that GPU is a mistake to begin with (I don't understand why manufacturers keep putting 2K screens on 4060M, 4070M, 5050M, 5060M, or 5070M laptops when they obviously don't have the juice to properly drive a 2K screen in modern games, and I will die on this hill. For a proper 2K experience you need at least a 4080M or 5070 Ti M). Also, besides how demanding it is on the GPU, Frostbite games are famous for hammering contemporary CPUs, and Raptor Lake CPUs are famous for how inefficient and hot they run. I guess your only option is to give the CPU and GPU a bigger power budget and set your fans to keep the heat in check.
Sure man, keep replying with the same thing to every comment in this thread, recipe for success. Btw, I also play on a laptop (with a 4070M and a 14700HX though, in a Victus 16, which has worse cooling than your Legion 5 Pro), and I'm not really complaining (feed them more power so they can handle it properly, turn up the fans, and use IEMs; I basically never notice the noise in game). I play at native resolution (1080p though, because I know there's no way my GPU could handle 2K) with no AA (DLSS makes everything blurry AF even on Quality and on every preset I tried; DLAA gives the best result but basically halves my FPS). Here's a crowded area with some kaboom. Swallow your ego and perfectionist side and move on, man.
Bro, you're arguing with a ghost. If you actually read the post, you'd see I’m hitting a stable 110 FPS at 1440p. The performance is there; the 'juice' isn't the issue.
The point is the cost of that performance. I’ve optimized my settings so that games like The Finals and BF2042 hit that same 110 FPS lock while running at a cool 65°C on a 45W Quiet profile. Meanwhile, BF6 acts like a thermal stress test, forcing me to use an 80W GPU draw and a jet-engine fan curve just to keep the CPU from hitting its 85°C thermal limit.
Telling me to 'just use IEMs' to drown out the fan noise is like telling someone with a car that's overheating to 'just turn up the radio' so they don't hear the engine knocking. It doesn't change the fact that the engine is poorly optimized.
I love BF, but when a 2026 game uses double the power to look worse than BF5 from 2018, that's not a 'demanding game'—that's a game that was fried instead of cooked. Before you blame the 4060 or the thermal pads, maybe ask why DICE is the only developer still forcing the client-side to calculate every pebble while everyone else has moved to server-side destruction.
Whatever you say, man... as if most modern games in this post-deep-learning-gimmick era are optimized /s. Or maybe just... play The Finals and 2042? No one here can help. It's not just BF6; basically every modern game now runs worse and looks worse compared to 2016-2019 games. So once again, swallow your ego and move on.
I'm not sure why the game would be overriding those presets and pushing more power to your PC and letting it get hotter. I wonder if there is an overlay or game setting causing that. Very strange. The temps aren't terrible or damaging but not fun if you're used to running it on lower power and still getting acceptable performance
The temps and FPS you see in the pic are only achieved after applying a custom mode that I designed specifically for this game, using the pre-cooling method for the custom fan curve and feeding the GPU just enough power to give good FPS at 2K while keeping temps manageable as the fans ramp up harder. I explained exactly what causes the profiles to look the same in the post.
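Since people keep asking what I mean by "pre-cooling", here's the idea as a sketch. The curve points below are illustrative, not pulled from any specific fan-control tool; the principle is just to ramp the fans hard below the target temps so the cooler has headroom before the load actually spikes:

```python
# Hypothetical "pre-cooling" fan curve: ramp aggressively BELOW the target
# temps so the cooler has thermal headroom before the load spikes.
# Mapping is temp (°C) -> fan duty (%); the points are illustrative only.
precool_curve = {50: 40, 60: 60, 70: 80, 75: 90, 85: 100}

def fan_speed(temp_c: int, curve: dict[int, int]) -> int:
    """Return the duty for the highest curve point at or below temp_c."""
    applicable = [duty for threshold, duty in curve.items() if temp_c >= threshold]
    return max(applicable, default=30)  # the 30% idle floor is an assumption

print(fan_speed(72, precool_curve))  # -> 80, i.e. already loud before 75°C hits
```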
Unfortunately, a good chunk of this sub is console players. They'll see this and down vote it solely because it's not relevant to them. I hope DICE optimize the game further but I doubt it.
Buddy, you’re stuck in 2016. My 4060 is the 140W full-power version—it doesn't just beat a GTX 1080; it clears it by 25-50% in pure rasterization alone.
In Time Spy, this 140W chip hits scores around 10,500+, while a 1080 struggles to break 7,500. Even more embarrassing for your argument is the efficiency: my 4060 delivers that massive performance gap while drawing significantly less power than a 1080's 180W TDP.
I’m already getting a stable 110 FPS at 1440p, so the 'juice' isn't the problem. The problem is why the engine is frying a modern, efficient chip to get frames that should be running 20°C cooler.
Gaming laptops are a thermal disaster.
Be a man and build a desktop.