Octane VI 17.3" review

Retron

Silver Level Poster
Last week my latest PCS laptop arrived - it's the fourth one I've had over the past decade.

My previous laptop was an Octane III with an i7-6700K and a GeForce GTX 1080. It's served me well, but I fancied the greater performance on offer from today's systems - the i9-9900K (yes, despite the heat) and, the clincher, the mobile RTX 2080, which has just become available. I use my laptop for a variety of tasks, from website development, to dabbling with VMs, to gaming. I chose the i9 because of its higher single-core turbo (useful for gaming) and the larger cache (ditto) compared to the i7 range. The hyperthreading also means it should have a long useful lifespan, in theory. I finished off the system with a 1TB Samsung 970 M.2 SSD and a 2TB WD Blue SSD for bulk storage.

Here's a photo of the old and new Octanes side by side - the Octane III has now been sold to my friend (who's a coder and also a gamer). There's very little external difference between them, but why change something that works? The Octane III is on the right, the new Octane VI is on the left.

P1140271.jpg

The main visible difference is in the screen. Both are 4K and both have AUO panels, but the model is different. The old one is an AUO B173ZAN01.1 IPS panel, without G-Sync. The new one is the AUO B173ZAN01.0, which *does* have G-Sync. It also has a grainier appearance, as the surface coating is rougher - it shows up as a sort of prismatic effect, with whites, for example, not looking as smooth as they do on the 1.1 panel. I've not seen any mention of this anywhere online, so I suspect not many people get to see both models side by side! There are macro photos below:

P1140279.jpg P1140278.jpg

It's a fair trade-off, though, and having G-Sync will doubtless be useful. Incidentally, anyone saying 4K on a laptop isn't worth it hasn't used one for any length of time. Even if you set Windows' desktop scaling to 200%, emulating an HD display, the crispness of text and graphics is amazing. The fact the panel has a really good colour gamut is just icing on the cake - it's way better than a generic 72% NTSC TN panel (such as the one I use on my desktop).

Slightly worryingly, the panel on the new laptop developed a rectangle (running from top to bottom) that was darker than the rest of the screen, very noticeable on areas of plain colour. An experimental squeeze at the bottom-left of the bezel (which is where the cabling goes) solved it for now, but I'll keep an eye on it... I suspect the cable is slightly loose. It's under warranty, but I'd rather not have to send it back!

Anyway, onto performance. How do you get an 8-core, hyperthreaded, 95W chip into a laptop? It turns out Clevo have lowered the TDP via the BIOS, although you can modify it if you want. Out of the box it has a 65W long-term TDP with an 83W short-term power limit. This seems to have no effect on lightly-threaded loads (such as Warcraft), but it does mean that heavily-threaded loads will be throttled. No complaints, though, as it turns out the old i7-6700K was similarly throttled in my old Octane. (Intel XTU will give you the details - it's a handy program!)
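
To illustrate roughly how those two limits interact, here's a simplified sketch - the 28-second turbo window is an assumed typical value rather than something I've read out of the BIOS, and the real firmware uses a rolling power average rather than a simple timer:

```python
# Rough sketch of how Intel's PL1/PL2 power limits behave.
# The 65 W / 83 W figures are the ones XTU reports on this laptop;
# the 28-second turbo window (tau) is an assumed typical value, not read from the BIOS.

PL1 = 65.0   # long-term (sustained) limit, watts
PL2 = 83.0   # short-term (burst) limit, watts
TAU = 28.0   # assumed turbo time window, seconds

def allowed_power(requested_watts: float, seconds_above_pl1: float) -> float:
    """Return the package power this simplified model would allow right now."""
    if requested_watts <= PL1:
        # Lightly threaded loads (e.g. WoW) never hit the cap at all.
        return requested_watts
    if seconds_above_pl1 < TAU:
        # Heavy loads may burst up to PL2 for a short while...
        return min(requested_watts, PL2)
    # ...then get pulled back to PL1, which is where the throttling shows up.
    return PL1

print(allowed_power(50, 0))    # 50.0 - light load, unaffected
print(allowed_power(120, 5))   # 83.0 - burst, capped at PL2
print(allowed_power(120, 60))  # 65.0 - sustained, capped at PL1
```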

4.png

Gaming-wise, no problems whatsoever. Whereas the old laptop would run Warcraft at quality level 7, at 4K, at a steady 60-70fps, the new one runs it at level 10, again at 4K, at a similar framerate. The lowest I've seen so far was 59fps, although with G-Sync available it could go a bit lower with no problems. After an email I received codes for BF V and Anthem, so I look forward to seeing those running in due course.

Here are some GPU-Z screenshots, one taken after a WoW session and the other showing the highest temperatures, clocks etc recorded during the session. One was taken via Print Screen, and the other by the in-built tool - that's why they look a bit different!

5.png

All in all, it's an impressive upgrade from the old Octane. Clevo have squeezed some high-end components in, but inevitably there are compromises. It seems the majority of the power allocation is given to the GPU, which seems fair, but those who want to tweak can do so to give the CPU priority instead. Even with lowered TDPs compared to stock, for my use it ticks all the boxes: good for programming/compiling, good for database work, good for gaming.

The only slight concern for me is the potentially dodgy LCD cable - I'll keep an eye on it. Hopefully that squeeze earlier today will have popped the cable back and it won't be an issue going forward!
 

ubuysa

The BSOD Doctor
Many thanks for the detailed review and photos. The comparison is useful too. This will help others thinking of a similar build. :)
 

SlimCini

KC and the Sunshine BANNED
To clarify, your new Octane VI 17" does have a G-Sync 4K panel? This is not advertised in the configurator...
 

Retron

Silver Level Poster
To clarify, your new Octane VI 17" does have a G-Sync 4K panel? This is not advertised in the configurator...
Yes - unlike the old Octane III (which has the 1.1 panel), the Octane VI has G-Sync (the 1.0 panel - it's an older revision).

This isn't mentioned on the configurator, which still lists the 1.1 panel and no G-Sync... thus if you want 4K G-Sync, it might be best to email PCS to make sure they'll be using the 1.0 rather than 1.1 panel.
 

fejerm

Active member
Can you please tell me how the GPU core clock looks while playing GPU-intensive games like FC5 or BFV?
The latest MSI and Asus laptops run the RTX 2080 at 1840-1890 MHz core clocks while gaming.

However, I saw a video of the P775TM where the GPU clock was jumping between 1650 MHz and 1850 MHz - that's a huge gap, which can result in microstuttering.

What is your experience?
You can use MSI Afterburner to show stats while playing.
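Or, if you want a log file rather than an overlay, a rough sketch like this will poll nvidia-smi (which ships with the Nvidia driver) once a second - just an illustration, assuming nvidia-smi is on your PATH:

```python
# Minimal sketch: log GPU clock, temperature and power once a second via nvidia-smi.
# Assumes nvidia-smi (installed with the Nvidia driver) is on the PATH.
import csv
import subprocess

cmd = [
    "nvidia-smi",
    "--query-gpu=timestamp,clocks.gr,temperature.gpu,power.draw,utilization.gpu",
    "--format=csv",
    "-l", "1",   # repeat every second until interrupted
]

with open("gpu_log.csv", "w", newline="") as log:
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    writer = csv.writer(log)
    try:
        for line in proc.stdout:
            writer.writerow([field.strip() for field in line.split(",")])
            print(line.strip())
    except KeyboardInterrupt:
        proc.terminate()
```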

Also, did you order it with stock paste?
Your GPU temp seems too high. Yes, the CPU is heating up the GPU too, but it still should not be 91C - that's just too high. The RTX 2080 has a lower TDP and a bigger die than the GTX 1080, so it should run cooler and should not reach 91C.

Or did you purposely let it run that hot, using the Auto fan profile so it wouldn't become loud?

If you did not turn down the fan speed, then there is something wrong with your system. 91C for the RTX 2080 is not acceptable and it is thermal throttling.

LXnaF7a.jpg

A shame that PCS sent it to you with a broken heatsink and/or a bad paste job. (Unless you ordered the default bad-quality paste, in which case there are no more questions :) )
 

Retron

Silver Level Poster
Can you please tell me how the GPU core clock looks while playing GPU-intensive games like FC5 or BFV?
The latest MSI and Asus laptops run the RTX 2080 at 1840-1890 MHz core clocks while gaming.

However, I saw a video of the P775TM where the GPU clock was jumping between 1650 MHz and 1850 MHz - that's a huge gap, which can result in microstuttering.
I don't have anything other than WoW installed at the moment, but playing that the core drops below 1400 MHz as the temperature hits 90C. This is with the fans set at whatever the stock BIOS sets them at.

It looks like the GPU fan (on the right) isn't revving up much, hence the GPU is throttling itself. If I manually force the fans to full (using Fn+1), the GPU temperature drops to around 75C and the core settles at 1620-1650 MHz, with the odd (rare) jump to 1850.

I ordered Windows 10 Pro with the laptop. However, there were no Clevo utilities installed, so I suspect that something like the Clevo Control Centre may help. I'll have a more detailed look later when I get back from work.

(FWIW, I also chose the fancy paste option, but that just mentioned the CPU rather than the GPU.)
 

fejerm

Active member
I don't have anything other than WoW installed at the moment, but playing that the core drops below 1400 MHz as the temperature hits 90C. This is with the fans set at whatever the stock BIOS sets them at.

It looks like the GPU fan (on the right) isn't revving up much, hence the GPU is throttling itself. If I manually force the fans to full (using Fn+1), the GPU temperature drops to around 75C and the core settles at 1620-1650 MHz, with the odd (rare) jump to 1850.

I ordered Windows 10 Pro with the laptop. However, there were no Clevo utilities installed, so I suspect that something like the Clevo Control Centre may help. I'll have a more detailed look later when I get back from work.

(FWIW, I also chose the fancy paste option, but that just mentioned the CPU rather than the GPU.)

Thank you for the answer.

So you did not change the fan setting in the Clevo Control Center from Auto to Overclock (or Performance? I forget the name).
On Auto it will let the components overheat. Long term this is bad, because a 90C GPU temperature will shorten the component's life.
I suggest you manually set up a fan curve in the Clevo software so the GPU fan runs at 90% at 80C and 100% at 85C.
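Roughly like this - a sketch only, the points below 80C and the straight-line interpolation are my suggestion, not what the Clevo software actually does internally:

```python
# Sketch of the suggested GPU fan curve: ~90% at 80C, 100% at 85C.
# The points below 80C and the linear interpolation are assumptions,
# not what Clevo's Control Center actually does internally.
FAN_CURVE = [  # (temperature C, fan speed %)
    (50, 40),
    (70, 60),
    (80, 90),
    (85, 100),
]

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate a fan speed (%) for a given GPU temperature."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return 100.0

print(fan_speed(82))  # ~94% - already near full speed well before 90C
```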

The GPU boost clock is strange.

It is too low. MSI and Asus laptops have 200 MHz higher GPU clocks.
I am guessing the 330W adapter is not enough with a 9900K, so the GPU is not 100% utilized.

I personally will use the laptop with a custom-made 600W adapter to rule out any issues with the adapter not providing enough juice.
 

fejerm

Active member
PCS might not like me for posting benchmarks made by a rival company, but a competitor got these max temps for the 9900K and RTX 2080 after 20 minutes of 100% CPU and GPU stress testing.
They used their own software to undervolt the CPU. (Software which is an alternative to the Clevo Control Center and can be purchased separately from Obsidian's website.)

They got a pretty good GPU temp.
 

SpyderTracks

We love you Ukraine
PCS might not like me for posting benchmarks made by a rival company, but Obsidian PC got these max temps for the 9900K and RTX 2080 after 20 minutes of 100% CPU and GPU stress testing.
They used their own software to undervolt the CPU. (Software which is an alternative to the Clevo Control Center and can be purchased separately from Obsidian's website.)

They got a pretty good GPU temp.

You're missing the point that Retron didn't have CC installed, therefore his fan profile wasn't optimised, and neither cooling nor voltages would be optimised.

It's not a valid performance metric until CC is installed.
 

fejerm

Active member
You're missing the point that Retron didn't have CC installed, therefore his fan profile wasn't optimised, and neither cooling nor voltages would be optimised.

It's not a valid performance metric until CC is installed.

Yes, that's why I wrote that he should install it and set up a fan profile because 90C can damage his system.
 

Retron

Silver Level Poster
Yes, that's why I wrote that he should install it and set up a fan profile because 90C can damage his system.

It's interesting - I've had several PCS Clevo laptops before (Vortex II, Defiance, Octane III and now this Octane VI) and all aside from the Octane VI have worked fine, fan-wise, out of the box (as in, they don't need the Control Centre installed, the BIOS defaults work fine).

As it happened, I used the Obsidian software mentioned in an earlier post - it does the job well and, unlike the Control Centre, doesn't install a load of bloatware I don't need. After installing it, the default fan profile kicked in and, as you'd expect, the GPU fan now revs up properly (EDIT: and the CPU fan has a bit more oomph too). Re-running the Warcraft test (which involves setting everything to max, at 4K, then flying from one end of the map to the other over a load of different scenery, models and players, then doing some questing) gave a much better result.

sc5.png
(During play - note the throttling is now "Pwr" rather than "Thrm")

sc4.gif
(After play, showing the highest reached during gameplay)

The card still generally hovers around 1620-1650 MHz, again with the odd very brief spike higher, but the main difference is the temperature - which didn't go above 75C over a 15-minute period. Much better! Thank you for the pointers, folks - it's helped greatly.
 

fejerm

Active member
Holy hell!
No wonder the average GPU core clock is 200 MHz lower than on MSI and Asus laptops... that is an insane amount of power throttling!
Not enough power from a single 330W adapter?
Or did Clevo just set the TDP limit in the vBIOS too low?
Do you see microstuttering while playing because of this?
 

Retron

Silver Level Poster
Holy hell!
No wonder the average GPU core clock is 200 MHz lower than on MSI and Asus laptops... that is an insane amount of power throttling!
Not enough power from a single 330W adapter?
Or did Clevo just set the TDP limit in the vBIOS too low?
Do you see microstuttering while playing because of this?

I've not seen microstuttering, although to be honest I'm usually too engrossed in what I'm doing to notice the odd wayward frame! I daresay there would be the odd microstutter when the clock ramps up to 1800+, but it happens infrequently.

I suspect the locked lower TDP is due to balancing the needs of the 2080 against the potential use of a 9900K - I'm guessing the 2080 MXM card will have the same limits in its BIOS regardless of which CPU it's paired with. Bear in mind the 9900K as installed by Clevo goes up to an 83W TDP, whereas the i9-8950HK as used by MSI and Asus only has a 45W TDP (even though it'll boost up, it won't be allowed anywhere near 83W).

Further - the 2080 MXM module is also available for the 15-inch version of the Octane, which only comes with a 230W PSU by default. Therefore I'd reckon it's safe to say that by default it'll be pegged back such that the CPU and GPU combined will be less than 230W - probably a fair bit less.

That would also mean that with a 330W PSU instead, there should be a fair bit of leeway to boost the power cap of the 2080.
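
As a back-of-the-envelope sum (the overhead figure for the screen, drives, fans and battery charging is a guess on my part, not something I've measured):

```python
# Back-of-the-envelope power budget - the overhead figure is a guess, not measured.
ADAPTER_W  = 330   # PSU rating on the 17" Octane
CPU_PL2_W  = 83    # short-term CPU limit as set by Clevo
OVERHEAD_W = 60    # assumed: screen, drives, fans, RAM, battery charging

gpu_headroom = ADAPTER_W - CPU_PL2_W - OVERHEAD_W
print(f"Rough GPU budget on the 330 W brick: ~{gpu_headroom} W")   # ~187 W

# The same sum for the 15-inch model's 230 W brick shows why the stock
# vBIOS limit has to be conservative enough to cover that case too:
print(f"Rough GPU budget on a 230 W brick: ~{230 - CPU_PL2_W - OVERHEAD_W} W")  # ~87 W
```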
 

Jame

Member
Hi, I like your detailed review. I'm a few weeks away from buying one as well, but with a 2060 and 64GB of RAM alongside the i9, as I'm going to be using it for heavily sampled, instrument-based music production. Is there any way you could run LatencyMon on your build to check the DPC latency, as it's make or break for me purchasing the laptop? If it's too high, I will get audio cut-outs and underruns, despite the great specs of the laptop. If you were to do it and post the results in the forum, it would help so much.
 

Jame

Member
If you are unfamiliar with LatencyMon, it's a program that tests your PC's ability to handle real-time audio. An amazing score would be if all the driver reads are around 0.01, but under 0.5 should be usable. If any drivers are above the threshold, it will cause underruns, especially in sessions with more tracks.
 

Retron

Silver Level Poster
If you are unfamiliar with LatencyMon, it's a program that tests your PC's ability to handle real-time audio. An amazing score would be if all the driver reads are around 0.01, but under 0.5 should be usable. If any drivers are above the threshold, it will cause underruns, especially in sessions with more tracks.
Not sure if it's quite what you're after (as I'm not really versed in the audio side of things), but LatencyMon reports that hdaudiobus.sys has a highest execution time of 0.153. Everything's below 0.5, with the exception of the Nvidia driver, which has had 0.67 as its high.

Restarting it without YouTube playing, but with a fan control utility, lots of tabs in Edge, Task Manager, Battle.net and Explorer open, shows a different set of results: 0.33 is the highest now, for LatencyMon itself, with 0.27 for the graphics driver and 0.06 for the audio driver.
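
For what it's worth, here are those readings laid against the 0.5 threshold you mentioned - just a summary of the numbers above, nothing new measured:

```python
# The readings above against the thresholds Jame gave
# (around 0.01 ms is excellent, under 0.5 ms usable, above that risks underruns).
USABLE_MS = 0.5

readings_ms = {                          # highest execution times from LatencyMon
    "hdaudiobus.sys (run 1)": 0.153,
    "Nvidia driver (run 1)": 0.67,       # the only one over the line
    "LatencyMon itself (run 2)": 0.33,
    "graphics driver (run 2)": 0.27,
    "audio driver (run 2)": 0.06,
}

for name, ms in readings_ms.items():
    verdict = "OK" if ms < USABLE_MS else "may cause underruns"
    print(f"{name:28s} {ms:5.3f} ms  {verdict}")
```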

If you want a specific reading, let me know where it is and I'll post it.
 

SpyderTracks

We love you Ukraine
If you are unfamiliar with LatencyMon, it's a program that tests your PC's ability to handle real-time audio. An amazing score would be if all the driver reads are around 0.01, but under 0.5 should be usable. If any drivers are above the threshold, it will cause underruns, especially in sessions with more tracks.

It’s pointless to take latency tests on a non-optimised Windows install.

You’d need to manually configure Windows for audio to achieve those kinds of figures.
 

debiruman665

Enthusiast
Thank you for the answer.

So you did not change the fan setting in the Clevo Control Center from Auto to Overclock (or Performance? I forget the name).
On Auto it will let the components overheat. Long term this is bad, because a 90C GPU temperature will shorten the component's life.
I suggest you manually set up a fan curve in the Clevo software so the GPU fan runs at 90% at 80C and 100% at 85C.

The GPU boost clock is strange.

It is too low. MSI and Asus laptops have 200 MHz higher GPU clocks.
I am guessing the 330W adapter is not enough with a 9900K, so the GPU is not 100% utilized.

I personally will use the laptop with a custom-made 600W adapter to rule out any issues with the adapter not providing enough juice.

Did you ever get that 600W power adapter?

I've not seen microstuttering, although to be honest I'm usually too engrossed in what I'm doing to notice the odd wayward frame! I daresay there would be the odd microstutter when the clock ramps up to 1800+, but it happens infrequently.

I suspect the locked lower TDP is due to balancing the needs of the 2080 against the potential use of a 9900K - I'm guessing the 2080 MXM card will have the same limits in its BIOS regardless of which CPU it's paired with. Bear in mind the 9900K as installed by Clevo goes up to an 83W TDP, whereas the i9-8950HK as used by MSI and Asus only has a 45W TDP (even though it'll boost up, it won't be allowed anywhere near 83W).

Further - the 2080 MXM module is also available for the 15-inch version of the Octane, which only comes with a 230W PSU by default. Therefore I'd reckon it's safe to say that by default it'll be pegged back such that the CPU and GPU combined will be less than 230W - probably a fair bit less.

That would also mean that with a 330W PSU instead, there should be a fair bit of leeway to boost the power cap of the 2080.

I keep wondering about this.

It's likely the laptop is set for the worst-case scenario (which never happens IRL) of 100%/100% i9 & RTX 2080. What's the theoretical risk of increasing the power draws to maximum and allowing the processor to go full tilt until it thermal throttles? Would this not be advantageous during non-gaming sessions?

As far as I can tell, the remaining wattage in the power supply is needed to charge the battery when in use. So what's the downside of letting the beast run free and setting it for unlimited power draw?
 

Marco1227

Active member
Just to point out, PCSpecialist claims that the 4K screen in the Octane is the 1.1 version (no G-Sync), but already in my Octane V, which I bought in 2017, the monitor is the 1.0 version, which supports G-Sync.
So if your Nvidia control panel gives you the option to enable G-Sync, it is likely that, despite what PCSpecialist claims in the configuration screen, your 4K screen is the AU OPTRONICS B173ZAN01.0 H/W:0A F/W:1, which supports G-Sync.
I am obviously referring to the 17.3" screen.
 