Ionico 17 review

NBrooke

Member
New benchmark values with this update to the power levels? I'd be interested to see how the Firestrike score stacks up against the previous runs you've submitted!
 

Macco26

Expert
I've published the new Firestrike in the benchmark thread.
However, Firestrike: [benchmark screenshot]

Timespy: [benchmark screenshot]

They are not much higher because of how 3DMark works: it rarely pushes both CPU and GPU at the same time, so Dynamic Boost is enough to raise GPU wattage when required. It's only in the short combined part that it might make a difference.
The new CC can also overclock the GPU core/VRAM, though that's something I haven't tried yet.
 

Macco26

Expert
There's no HDR, and it goes above 300 nits, around 320 or so; that's what I've seen from many reviewers of the same panel (though on different system integrators).
You can get an idea by searching YouTube for 'Jarrod 1440p'; there's a video from last December covering the same 15" and 17" 1440p panels used here.
 

barlew

Godlike
PCS offers 3.1.23 now if you want; you should find it in your account's download portal. You should be fine as long as you're OK with disabled PL limits in Office mode (so people have told me).

Sorry mate I'm not deliberately hounding you.

I just want to point this out to anyone reading, as it is important: just because the version of CC you are referring to comes from the same OEM does not mean it is compatible with the Ionico.

The OEM configures each chassis as per the specification of each system reseller. This means they have to configure CC individually for each system reseller.

The CC v3.9.18 currently out in the wild has not been configured for the Ionico, which is one of the reasons people on this forum will tell users to stay away from it.
 
Last edited by a moderator:

Cenksenci

Bronze Level Poster
[quoting barlew's post above]
 

Macco26

Expert

Another set of benchmarks, this time on Fortnite, on the same machine:
1080p, 1440p, Low, Med, High and Epic settings, Ray Tracing off and on, all 4 DLSS modes tested.

(all done in dGPU-only mode. Too many variables otherwise)
 

LizardSoul

Active member

[quoting Macco26's post above]
Another detailed video, thank you.
I am curious to know why the GPU power draw was not at 140 W in the last test, though, given that it was processing RT at 99-100% usage.
P.S. I hope it's not a stupid question ;) .
 

Macco26

Expert
It's absolutely not a stupid question.
The 140 W is granted only if the CPU is below 25 W, and yes, even though RTSS showed it was often there, maybe it was not there long enough (e.g. the CPU power had fast spikes we can't see), which inhibited the boost.
I could also have overridden this Dynamic Boost behavior (with the latest Control Center); and I did, but only during the overclocked (OC) part. There you might see more moments with the GPU at or above 135 W than anywhere else, because it didn't care how much power the CPU was drawing at the same time.
But D.B. 2.0 is there for a reason: to avoid putting too much heat into the laptop (if the CPU draws a lot of wattage, the GPU can't, and vice versa), so for the longevity of the laptop and lower noise, I stuck with the normal Nvidia 125+15 state for the rest of the video.
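The power-budget rule described above can be sketched in a few lines. This is a hypothetical simplified model, not the actual driver logic (the real algorithm shifts power continuously); the 125 W base, 15 W bonus, and 25 W CPU threshold are the figures from this post:

```python
# Simplified sketch of the Dynamic Boost 2.0 behaviour described above.
# Hypothetical model: the real NVIDIA algorithm is driver-controlled and
# shifts power continuously; these figures are the ones quoted in the post.

GPU_BASE_W = 125        # base TGP of the RTX 3070 in this chassis
DB_BONUS_W = 15         # Dynamic Boost headroom (125 + 15 = 140 W)
CPU_THRESHOLD_W = 25    # full bonus only while the CPU stays below this

def gpu_power_limit(cpu_power_w: float) -> int:
    """Return the GPU power limit for a given CPU package power."""
    if cpu_power_w < CPU_THRESHOLD_W:
        return GPU_BASE_W + DB_BONUS_W  # CPU quiet: GPU may draw 140 W
    return GPU_BASE_W                   # CPU busy: GPU held at 125 W

print(gpu_power_limit(20))  # 140
print(gpu_power_limit(45))  # 125
```

This also shows why fast CPU power spikes matter: a brief excursion above the threshold is enough to pull the limit back to the base value.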

EDIT: to add to that: every time I set a DLSS mode, the actual render resolution was a LOT lower than the final one (then magically upscaled). The lower the resolution, the quicker we hit the CPU bottleneck. It might be a case where the internal resolution was something like 1280x720, then upscaled to 2560x1440 (probably DLSS Performance). The lower resolution puts a lot of stress on the CPU: the 10875H simply can't feed the GPU that many frames per second. The GPU doesn't draw much more power because the CPU is stalling. A Zen3 wouldn't do much better either, btw. That's why the DLSS presets are only really useful with ray tracing, where the bottleneck shifts to the GPU and the CPU breathes again.
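The internal resolutions implied above can be computed from the commonly cited per-axis scale factors of the DLSS 2 presets (treat the exact factors as approximate; games may also use dynamic scaling):

```python
# Internal render resolution for each DLSS 2 preset, using the commonly
# cited per-axis scale factors (approximate; some titles scale dynamically).

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple:
    """Return the (width, height) the GPU actually renders before upscaling."""
    s = DLSS_SCALE[preset]
    return round(out_w * s), round(out_h * s)

# 2560x1440 output with DLSS Performance renders internally at 1280x720,
# matching the figure mentioned in the post.
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
```

The smaller the internal resolution, the higher the frame rate the CPU must sustain, which is exactly where the 10875H becomes the bottleneck.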
 
Last edited:

LizardSoul

Active member
[quoting Macco26's post above]
I see; I was convinced the GPU would receive full wattage as long as the CPU was below 45 W, not 25 W. All clear now, and thanks for the extra clarification about DLSS; I definitely need to inform myself a bit more about such new features before deciding which laptop, and what setup, to go for.

Speaking of which, I am still torn between the AMD and the Intel CPU; any advice you feel comfortable giving?

From what I have been able to gather, the Ryzen 9 5900HX is superior (albeit not greatly so) to the i7-10875H on almost every front (plus it allows 3200 MHz RAM), so the only things holding me back from going AMD after so many years as an Intel user are simple habit (a silly reason, I know) and a slight worry that AMD may be missing useful features or have worse manufacturing quality.
To be specific, my usage would mostly be gaming, plus very light office work.

If this is not the best place to discuss it though, I can open a new post elsewhere, no problem.
 

Macco26

Expert
If I were you and had to buy now, I'd go 5900HX unless it costs hundreds of EUR more. CPU-wise they are in the same ballpark in gaming, but for anything else AMD is better.
I'd cheap out on the GPU though (the 3070 is enough; the 3080's cost per performance is too high).

Please just note that since a lot of die space in Zen3 is occupied by the L3 cache, the 8 cores are all crammed together, so a lot of heat is generated in a single part of the die. It is VERY HARD to pull heat out of there unless you go liquid metal (which is very hard to apply and is not offered by PCS). Intel's cores occupy about 60% of the die, Cezanne's no more than 35-40%:


[die shot attachment]

[die shot attachment]
Intel 10875H

This means you might see 95C while gaming on AMD (i.e. throttling), unless you apply LM yourself with all the precautions (you need to insulate a lot of circuitry around the die so it doesn't touch the LM, else bye bye laptop, etc.).
Still, for productivity tasks it seems to run a lot cooler than that.

Given the above, if going AMD I'd go for a 17-inch model rather than a 15, at the least: bigger side radiators make it easier to remove heat.

Do note though that in a few weeks Tigerlake-H45 arrives. It's the very first 8-core 10nm Intel processor with the new architecture, and Tongfang will update their Intel lineup with it. They will also offer Thunderbolt 4 with it, something the current 10th gen lacks: they already designed the motherboard to be TGL-ready, and since TGL drives Thunderbolt straight from the CPU, our current motherboard can't offer TB (it lacks a TB controller). That's why I'm saying to expect a fast refresh of the Tongfang lineup as soon as TGL arrives in full force: they will reuse the same motherboard and just need Intel to ship the CPU, plus a new BIOS. And this is the first high-performance Intel chip at 10nm. Might be worth a look.

EDIT: Videocardz states that the Tigerlake-H45 embargo ends on the 10th or 11th of May, with the first laptops to be presented on the 11th. So really just a couple of weeks away.
 
Last edited:

Macco26

Expert
A 6-core 11400H Tigerlake seems on par with an 8-core 5800H in Geekbench, holy cow.
LINK

I wonder what an 8-core 11800H, 11900H or 11980H can do (all 8 cores, just binned better and better for higher and higher MHz).

Found one 11800H (the lowest tier of 8-core TGL) vs a 5900HX:
LINK
 

LizardSoul

Active member
[quoting Macco26's post above]
Once again, a more exhaustive answer than I expected; much obliged.
Regarding the CPU, that's all I needed to hear to stick with Intel once again.
Not only have low temperatures become almost as important to me as performance when it comes to a laptop, but I will have to wait until June anyway for the next shipment of 3070s, so I may as well keep an eye out for this new Tigerlake-H45 CPU.
Unless I decide to buy elsewhere, that is.

I do worry, though, that with the new CPU and the related motherboard/BIOS design will also come new bugs, which I will have to work around myself or wait for official fixes, hoping I don't brick the new laptop in the meantime... but that's the price of innovation, I guess.

As for LM, I don't even want to consider it as an option; I have seen and read some horror stories about it.
Also, I suspect it may only move the thermal bottleneck to the heatsink itself, rather than the interface between it and the CPU.

The issue now is how to maintain my sanity while I wait, as I am currently stuck with a 15-year-old Dell laptop (which I have almost entirely upgraded over the years). If it's reasonable for PCS to still repair a >5-year-old laptop bought from them, I may even send them my currently bricked Octane Series laptop.
 

Macco26

Expert
PCS are pretty busy, so I guess you wouldn't get your Octane repaired before TGL ships... ^^
I feel your pain. My Clevo broke in December and I had to wait 3 months, making do with an old Dell with a GeForce 450GT and some old i7 whose process node I don't even know. That's why I picked the first RTX 30 series laptop I could.

About Tigerlake: at least they should have sorted out the early bugs with the RTX cards by now, for example the MUX problems fixed via BIOS, etc.

Consider that the Zen3 + RTX 30 offering got both a new CPU and a new GPU at the same time versus last year, so double the chance of messing something up (for instance, TF seems to have cheaped out a bit on the heat pipes for those). TGL's arrival will give you a new CPU, yes, but the RTX 30 is pretty established now. Just one variable, and probably a simple one (I expect the same PL1 and PL2 parameters apply, etc.).

Regarding LM, I say this because the US system integrator for the same Tongfangs is applying LM to handle that heat. It's a very small minority who do, but it seems to be the only way to get down to 88-90C while gaming.
 

LizardSoul

Active member
[quoting Macco26's post above]
Yeah, chances are I will be patient, then, and wait.
Also, thank you for the reassuring words about possible bugs; I had not seen it from that point of view, so the new Intel CPU configuration should indeed be relatively safe.
Fingers crossed.
 

Macco26

Expert
In all honesty, the only minor question is: TGL being a 10nm die versus a 14nm die, how large is it? How easy is it to remove the same heat produced by the same 45 W TDP as 10th gen?
We actually don't know. You should keep an eye on 3DMark every day until someone publishes 11800H benchmarks, so you can see the temperatures and compare, I guess.
Still, the motherboard being the same, I expect the CPU package and heat spreader to be the same, so as large as 10th gen's; it's just the die inside that's smaller.
 

Macco26

Expert
Just to add some testing on battery operation: I've found that for office work the laptop is very well suited under these particular conditions:
  • iGPU-only mode set via BIOS (it turns off the Nvidia GPU completely; it should be similar to Nvidia Optimus, without the risk of the dGPU being woken up by some nasty program here and there)
  • reduced screen refresh rate in Windows Settings. This panel has only two native refresh rates available: 165 Hz and 40 Hz. While 40 Hz isn't great, it's enough to give plenty of juice for office work. Also, watching YouTube isn't bad at all, even if it doesn't cover the full 60 FPS the majority of videos offer.
The above settings, plus the normal Office mode in Control Center, keyboard lighting turned on but at a low level and timing out after 10 seconds, and the Windows battery slider set to Better Battery (it could go even further in Battery Saver), consumed about 43% of the battery in 3 h 8 min of usage.

A usage that, as usual, is pretty office-oriented: MS Word, Excel, Outlook, MS Teams, Chrome, Edge Chromium, Citrix Receiver, a PDF editor, and so on.

This is from the battery report Windows creates by invoking powercfg /batteryreport (I turned on the laptop at 13:19, having worked on another PC until then; it entered standby for 1 minute while I was AFK during this testing).

2021-06-04 08:00:50   Active             AC        100 %   91.245 mWh
           08:33:00   Suspended                    100 %   91.245 mWh
           13:19:33   Active             Battery   100 %   91.245 mWh
           13:50:52   Suspended                     92 %   83.945 mWh
           13:51:47   Active             Battery    92 %   83.945 mWh
           16:27:27   Report generated   Battery    57 %   52.010 mWh

Also seen as this (another report generated after a while):

START TIME            STATE    DURATION   ENERGY DRAINED
2021-06-04 13:19:33   Active   0:31:19     8 %    7.300 mWh
(1 min standby, note by myself)
13:51:47              Active   2:53:19    39 %   35.585 mWh

The battery life, if extrapolated linearly to full depletion at 0%, would be a whopping 7 h and 17 minutes
(3 h 8 min / 43%).
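The extrapolation above is simple linear arithmetic and can be checked directly:

```python
# Extrapolating full battery runtime from a partial drain:
# 43% of the battery consumed in 3 h 08 min of use.

minutes_used = 3 * 60 + 8        # 188 minutes on battery
fraction_drained = 0.43          # 100% -> 57% over that period

# Assuming a constant drain rate, scale up to a full 100% discharge.
total_minutes = minutes_used / fraction_drained
hours, minutes = divmod(round(total_minutes), 60)
print(f"{hours} h {minutes} min")  # 7 h 17 min
```

The constant-drain assumption is optimistic (Windows reports drain this way too), but it matches the figure quoted in the post.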

Not bad for a 14nm CPU gaming laptop I guess. It sips power.

Oh, and all this with a fully silent laptop: check the 0 RPM fans! (The dGPU fan doesn't appear because I disabled it completely in BIOS.) This is while I'm writing this:
[screenshot: fan monitor showing 0 RPM]
 