NotebookTalk

*Official Benchmark Thread* - Post it here or it didn't happen :D



Posted
14 hours ago, Mr. Fox said:

Sweet! Nice to see Intel opening a can of whoopass. I wonder if there will be a 290K Plus? That would be the one I would want. It would need to be a 9950X beater in all core workloads.

Unless something changes in a big way with Zen 6, my next build will most likely be a return to Intel. We'll see what happens when it happens and go from there. I think the 9950X is a very good CPU, but I do not like how many hard-coded functional limitations there are with AMD CPUs, and the X3D features don't really offer anything important to me since I am not particularly obsessed with gaming.

 

They opted to skip the slated 290k Plus and just stick with the 250k and 270k. Seeing as the 270k is basically a 285k (and, depending on bins, will be better) and the 290k was literally a 285k pushed up one bin, Intel actually did the right thing this time, coming in at killer price points on both the 250k and 270k.

 

Technically, the 285k has TVB, but it is really a non-starter for us.

 

And just like that, the 9600x drops to ~$182 a few days before the 250k launches.....

 

Mobile-wise, I'm curious to see what the 290hx Plus brings to the table, as it is rumored to have 10-15% more performance than the 275/285hx variants, but that will come down to the supporting rig around the engine as always. The difference between the 275hx in my Acer Neo 16s and the 275hx in my Alienware 18 is night and day.

---

Zen 6 is rumored to have a 24-core variant though, and if we get that, double X3D, AND what is rumored to be significant fabric interconnect latency improvements, it could be a killer chip too.

 

But Nova is going to be a brand-new architecture with its own 3D caching system, and I am expecting absolutely killer performance out of it too. Unlocked with absolutely no boundaries and a ton of cores, it has pulled ~600-700w in testing. Sounds right up your alley @Mr. Fox!

 

 

-------------------------------------------------------------------

 

Apple's new M5 Max chip is a decent little bump CPU-wise, but GPU-wise it is a pretty substantial upgrade....

 

M1->M5 performance progression in just the MacBook 16" Max models. I imagine the slated M5 Ultra is going to be a beast, especially now that with the M5 the CPU and GPU are on their own individual modules, with Apple having solved the high-speed interconnect issues while keeping the same shared memory. This bodes well for future configs too, for really building out Macs with multiple GPUs.

 

AI performance is absolutely through the roof. I suspect a lot of M5 Ultra boxes are going to be snapped up for AI work.

 

CPU:

 

XG3iSGY.png

 

GPU Raster:

 

IEZp4X4.png

 

GPU RT:

 

5sj4NzO.png

 

  • Like 1
  • Bump 1

Electrosoft Alpha:  9800X3D  | Asus X870E Hero Crosshair  | MSI Vanguard RTX 5090 OC | AC LF II 420 | TG 2x24GB 8200 @ 8000 tuned  | Samsung 9100 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Heath: i9-14900KS SP109 | AC LF III 360  | Asus Strix Z690 D4 | Gigabyte 9070XT Gaming OC Edition| 32GB DDR4 2x16GB B-Die 4000  | Samsung 980 1TB Pro |  Antec Flux Pro  | Samsung G7 32" 165hz 32"

Alienware Area-51  18 | 275HX | Nvidia RTX 5070ti  | 32GB DDR5 6400  |Gen 5 2TB | 18" QHD | WiFi 7

 

 


 

Posted

So now I'm looking at one of the following 

 

7800 XT, 7900 XT, 7900 GRE, 9070, 9070 XT, possibly a 7900 XTX

Also looking to pick up an X570-ACE for a dual-GPU scenario in tandem with Lossless Scaling.

 

The only reason for entertaining a second 7900 XTX is the faint hope I can get one with better memory overclocking. That seems to be where people are able to hit gold.

 

At that point I'll pick up a waterblock for the 5800X3D as well; it'll make it easier on my fingers digging into the system time and again.

 

Also tried booting some of the GPUs I have, but they don't seem to be POSTing, lol. Just my luck.

  • Like 2
  • Bump 1

AM4-DD | 5800X3D @ 4.45Ghz | Wet Devil 7900 XTX | 32GB DDR4 3200Mhz | 2TB NVME | EVGA 1300w | Win11 IoT

UNRAID | Xeon Gold 6248 40c/80t | 64GB + 512GB PMEM | RTX 3090Ti | 9300i-16P | 68TB HDD's | BeQuiet DP 13 1000w
HomeLab | E5-2697A 32c/64t | 128GB DDR4 2133Mhz | 2x2x1x1TB NVME Bifurcated | GTX 1060 3GB | X550 10Gbps 

Telegram / TS3Twitter

2700X to 5800X3D upgrade! With a 10850K cameo!

 

Posted
On 3/24/2026 at 11:31 AM, electrosoft said:

Mobile-wise, I'm curious to see what the 290hx Plus brings to the table, as it is rumored to have 10-15% more performance than the 275/285hx variants, but that will come down to the supporting rig around the engine as always. The difference between the 275hx in my Acer Neo 16s and the 275hx in my Alienware 18 is night and day.

 

I think it's supposed to be 30x NGU and D2D, 40x Ring. My 275HX in the Hydroc G2 can do that easily with Premamod, but the real advantage comes from memory tuning. I have seen a few people post sub-80ns AIDA64 scores, but I find heat is a big limitation on combined-load stability, so realistically 85ns is the best mine can do on air with reasonable ambient temperatures.

 

I think the A51 18 could do better temperature-wise with the memory, as the modules sit on the side opposite the vapor chamber. You would have to use Smokeless UMAF to modify timings and test. Sadly, max MT would be somewhat limited compared to even a mediocre desktop board, as you cannot adjust VDD2 beyond 1.1v (except on the Hydroc G2 with Premamod).

  • Thumb Up 2
  • Bump 1

Desktop - Intel 285K, Asus Z890 Apex, 48GB DDR5-8400 C36, 800GB Optane P5800X, Corsair HX1500i, Fractal Define 7 XL, Windows 10 Pro 22H2

Lenovo Legion 9i G10 - 275HX, 2x32GB Kingston Fury DDR5-5200 CL38, 4TB WD 8100, RTX 5090 mobile, 18.0 inch FHD+ 440hz IPS, Windows 10 Pro 22H2

Hydroc G2 / Uniwill IDY X6AR559Y - 275HX, 2x16GB DDR5-6400 CL38, 4TB WD SN850X, RTX 5090 mobile, 16.0 inch QHD+ 300hz MiniLED, Windows 11 Pro 24H2

Posted
2 hours ago, Reciever said:

So now I'm looking at one of the following 

 

7800 XT, 7900 XT, 7900 GRE, 9070, 9070 XT, possibly a 7900 XTX

Also looking to pick up an X570-ACE for a dual-GPU scenario in tandem with Lossless Scaling.

 

The only reason for entertaining a second 7900 XTX is the faint hope I can get one with better memory overclocking. That seems to be where people are able to hit gold.

 

At that point I'll pick up a waterblock for the 5800X3D as well; it'll make it easier on my fingers digging into the system time and again.

 

Also tried booting some of the GPUs I have, but they don't seem to be POSTing, lol. Just my luck.

My vote would be for the 9070 XT. So far it is AMD's best GPU in terms of performance.

  • Like 1
  • Bump 1

WRAITH | X870E Apex | 9950X | RTX 5090 | 32GB DDR5 @ 8200 | O11 XL EVO | HC-500A Chiller

BANSHEE | B850MPOWER | 4585PX | RTX 5080 | 48GB DDR5 @ 8000 | XT M3

Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

Posted
1 hour ago, Mr. Fox said:

My vote would be for the 9070 XT. So far it is AMD's best GPU in terms of performance.

Forgive me, I typed it all out on my phone at break, but this second GPU would strictly be a frame-gen mule under Lossless Scaling; my 7900 XTX would still handle my raster, and at 3Ghz it's pretty good for me 24/7. I don't care about power savings, but I do concern myself with the 12VHPWR connector, which is why I'm dancing around upgrading to something "modern".

This is all just so I have 4K120 comfortably in recent single player titles. I prefer the idea of offloading frame gen to a second GPU over having it done on the same one.

 

Also looking at the IOCrest M.2-to-10Gbps adapter, and lastly the GC-Titan Ridge TB3 card for the last slot.

I think this would make for the best "everything I can think of" daily driver for a while.

 

Welp, scratch the ACE. Looks like Asus never fixed Curve Optimizer in the BIOS, so it doesn't actually work, lol. Back to the drawing board.

 

The Crosshair VIII Dark Hero looks like a good replacement choice.

  • Like 2
  • Bump 1

AM4-DD | 5800X3D @ 4.45Ghz | Wet Devil 7900 XTX | 32GB DDR4 3200Mhz | 2TB NVME | EVGA 1300w | Win11 IoT

UNRAID | Xeon Gold 6248 40c/80t | 64GB + 512GB PMEM | RTX 3090Ti | 9300i-16P | 68TB HDD's | BeQuiet DP 13 1000w
HomeLab | E5-2697A 32c/64t | 128GB DDR4 2133Mhz | 2x2x1x1TB NVME Bifurcated | GTX 1060 3GB | X550 10Gbps 

Telegram / TS3Twitter

2700X to 5800X3D upgrade! With a 10850K cameo!

 

Posted
8 hours ago, Reciever said:

Forgive me, I typed it all out on my phone at break, but this second GPU would strictly be a frame-gen mule under Lossless Scaling; my 7900 XTX would still handle my raster, and at 3Ghz it's pretty good for me 24/7. I don't care about power savings, but I do concern myself with the 12VHPWR connector, which is why I'm dancing around upgrading to something "modern".

This is all just so I have 4K120 comfortably in recent single player titles. I prefer the idea of offloading frame gen to a second GPU over having it done on the same one.

 

Also looking at the IOCrest M.2-to-10Gbps adapter, and lastly the GC-Titan Ridge TB3 card for the last slot.

I think this would make for the best "everything I can think of" daily driver for a while.

 

Welp, scratch the ACE. Looks like Asus never fixed Curve Optimizer in the BIOS, so it doesn't actually work, lol. Back to the drawing board.

 

The Crosshair VIII Dark Hero looks like a good replacement choice.

One of the best things about owning a 9070 XT is that, unless you do something stupid in choosing the wrong card, you do not have to give a second thought to the garbage 12VHPWR connector. Having the legacy 8-pin PCIe connector is, in my mind, the most compelling reason to own one. Thankfully, hardly any of them use 12VHPWR connectors. The few that do deserve to be ostracized for their exceptionally poor judgment and lack of regard for the people who purchase their products.

  • Like 1
  • Bump 1

WRAITH | X870E Apex | 9950X | RTX 5090 | 32GB DDR5 @ 8200 | O11 XL EVO | HC-500A Chiller

BANSHEE | B850MPOWER | 4585PX | RTX 5080 | 48GB DDR5 @ 8000 | XT M3

Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

Posted
15 hours ago, Reciever said:

So now I'm looking at one of the following 

 

7800 XT, 7900 XT, 7900 GRE, 9070, 9070 XT, possibly a 7900 XTX

Also looking to pick up an X570-ACE for a dual-GPU scenario in tandem with Lossless Scaling.

 

The only reason for entertaining a second 7900 XTX is the faint hope I can get one with better memory overclocking. That seems to be where people are able to hit gold.

 

At that point I'll pick up a waterblock for the 5800X3D as well; it'll make it easier on my fingers digging into the system time and again.

 

Also tried booting some of the GPUs I have, but they don't seem to be POSTing, lol. Just my luck.

 

Offloading frame gen to a secondary card via which method?

 

In regards to just generically picking amongst that group, it would be the 9070xt but depending on how you're offloading and executing FG, it would come down to potentially best bang for the buck too.

 

14 hours ago, win32asmguy said:

 

I think it's supposed to be 30x NGU and D2D, 40x Ring. My 275HX in the Hydroc G2 can do that easily with Premamod, but the real advantage comes from memory tuning. I have seen a few people post sub-80ns AIDA64 scores, but I find heat is a big limitation on combined-load stability, so realistically 85ns is the best mine can do on air with reasonable ambient temperatures.

 

I think the A51 18 could do better temperature-wise with the memory, as the modules sit on the side opposite the vapor chamber. You would have to use Smokeless UMAF to modify timings and test. Sadly, max MT would be somewhat limited compared to even a mediocre desktop board, as you cannot adjust VDD2 beyond 1.1v (except on the Hydroc G2 with Premamod).

 

Did you ever give Smokeless a try on your Alienware 18 or Dell Max 18 before sending them back? How much flexibility do you have on your Lenovo? I really wish they had an 18" version of the Hydro. 😞

 

I'm content with my Alienware 18 275hx/5070ti primarily because it was free so bang for buck is undetermined since buck == 0; 🤣 I haven't had much time to play around with it lately though.

 

I've been plowing through Midnight at 4K Ultra settings, and I think I can safely say we've reached endgame with the 5090. Numerous spots can get into the mid-to-high 80s utilization, but nothing so far can cap it out. Given those high-utilization spots on the 5090, a 5080 or 4090 would still hit walls in some places.

 

As I'm sure you've experienced, Silvermoon, like Dorongal, is an instant player-physics frame killer....

 

------------------------------------------------------------------------

 

AMD is officially dropping the 9950X3D2 on April 22nd...... I don't think this is going to do much of anything for gaming overall, given the same fabric limitations, which will be greatly improved in Zen 6, but you never know.....

 

I'll be keeping an eye on this one, as I skipped the 9850X3D since it seemed a silly bin bump that can't even sustain its rated speeds in many scenarios vs the 9800X3D, which is rock solid.

 

 

 

-------------------------------------------------------------------------------

 

Official confirmation of the 290k being scrapped (which we knew already)

 

The 270k is officially available, but showing up for $349.99, not $299.99, shipped and sold by Newegg and Amazon. Microcenter also has it listed for $357.....

 

Shenanigans!

 

Of course it is popping up in combo deals already as Newegg (and soon Micro$enter) tries to bundle and move stagnant hardware....

 

I'm going to hold off and let the air clear a bit as this SP83 265k is doing well enough atm....

 

---

 

Cheapest GPU prices:

 

5060 8GB........$349.99

5060 16GB......$699.99 (?)

 

5060ti 8GB.....$379.99

5060ti 16GB....$539.99

 

5070...............$629.99

 

5070ti.............$913.99

 

5080...............$1289.99

 

5090...............$3799.99 (You have TWO kidneys....you'll be ok!)

 

9060xt 8GB...$329.99

9060xt 16GB.$439.99

9070..............$599.99

9070xt...........$699.99

 

 

 

Electrosoft Alpha:  9800X3D  | Asus X870E Hero Crosshair  | MSI Vanguard RTX 5090 OC | AC LF II 420 | TG 2x24GB 8200 @ 8000 tuned  | Samsung 9100 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Heath: i9-14900KS SP109 | AC LF III 360  | Asus Strix Z690 D4 | Gigabyte 9070XT Gaming OC Edition| 32GB DDR4 2x16GB B-Die 4000  | Samsung 980 1TB Pro |  Antec Flux Pro  | Samsung G7 32" 165hz 32"

Alienware Area-51  18 | 275HX | Nvidia RTX 5070ti  | 32GB DDR5 6400  |Gen 5 2TB | 18" QHD | WiFi 7

 

 


 

Posted
7 minutes ago, electrosoft said:

Did you ever give Smokeless a try on your Alienware 18 or Dell Max 18 before sending them back? How much flexibility do you have on your Lenovo? I really wish they had an 18" version of the Hydro. 😞

 

I'm content with my Alienware 18 275hx/5070ti primarily because it was free so bang for buck is undetermined since buck == 0; 🤣 I haven't had much time to play around with it lately though.

 

Yeah, UMAF did work with both the A51 18 and Dell Pro Max 18 Plus. Sometimes it would freeze or fail to apply settings, though, so you have to keep trying if something did not stick. Just be aware that with any later BIOS update Dell may write-protect the EFI setup variable region, which effectively disables modifying any hidden settings.

 

With the Legion 9i, SREP worked on the shipping BIOS but on none of the updates. NGU and D2D can be adjusted, but on this one memory timing changes always fail to train. It's basically stuck at 5200MT CL38 dual rank (the 64GB Kingston Fury kit). M-die also worked at 5600MT, but timings were much worse than on the Kingston kit.

 

There will be an 18-inch Uniwill chassis, but this year it only goes up to a 5070 mobile. Next year it's supposed to have better GPU options, according to Prema, so Eluktronics should end up carrying it.

  • Bump 1

Desktop - Intel 285K, Asus Z890 Apex, 48GB DDR5-8400 C36, 800GB Optane P5800X, Corsair HX1500i, Fractal Define 7 XL, Windows 10 Pro 22H2

Lenovo Legion 9i G10 - 275HX, 2x32GB Kingston Fury DDR5-5200 CL38, 4TB WD 8100, RTX 5090 mobile, 18.0 inch FHD+ 440hz IPS, Windows 10 Pro 22H2

Hydroc G2 / Uniwill IDY X6AR559Y - 275HX, 2x16GB DDR5-6400 CL38, 4TB WD SN850X, RTX 5090 mobile, 16.0 inch QHD+ 300hz MiniLED, Windows 11 Pro 24H2

Posted
10 hours ago, electrosoft said:

 

Offloading frame gen to a secondary card via which method?

 

In regards to just generically picking amongst that group, it would be the 9070xt but depending on how you're offloading and executing FG, it would come down to potentially best bang for the buck too.

 

Lossless Scaling allows using a second GPU for Frame Generation. 

I tested it initially with the 3090Ti (raster) and the 1080Ti as the frame-gen card. I was able to get stable FPS with Monster Hunter Wilds running natively at 3440x1440, and from a perception standpoint it was smooth.

 

This was running the 3090Ti at PCIe 3 x16 and the 1080Ti at PCIe 3 x4, which is pretty limiting for this type of scenario.

 

I would like this for smoothing out 4K120 until all the new stuff comes out. 

 

Ideally (for 4K120) you run at minimum PCIe 4 x8/x8. You also want to set a framerate cap that doesn't leave the render GPU at full tilt, so if 110 is your max average, set the cap at 90 and let the mule carry it to 120 via the dynamic framerate target. It also has the standard fixed 1/2/3x multipliers.
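The cap-and-multiplier math above can be sketched in a few lines. This is just an illustrative helper, not anything from Lossless Scaling itself; the `render_cap` name and the ~15% headroom figure are my own assumptions:

```python
# Illustrative sketch of the frame-gen cap math described above.
# Assumptions (mine, not Lossless Scaling's): a fixed 2x/3x multiplier mode
# and ~15% headroom so the render GPU never sits at full tilt.

def render_cap(target_fps: int, multiplier: int, sustained_max: float) -> int:
    """Pick a render-side FPS cap for a fixed frame-gen multiplier.

    target_fps    -- what you want on screen (e.g. 120 for 4K120)
    multiplier    -- the fixed 2x or 3x frame-gen mode
    sustained_max -- highest FPS the render GPU holds without pegging
    """
    needed = target_fps // multiplier         # frames the render GPU must supply
    headroom_cap = int(sustained_max * 0.85)  # leave ~15% so it isn't at full tilt
    return min(needed, headroom_cap)

# 4K120 with 2x frame gen on a card that sustains ~110 FPS:
print(render_cap(120, 2, 110))  # 60 rendered FPS is plenty, well under the ~93 cap
```

With the dynamic framerate target, the same headroom logic applies: cap somewhere below your sustained max (roughly the 90-out-of-110 in the example above) and let the mule fill in the rest.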

  • Like 1
  • Bump 1

AM4-DD | 5800X3D @ 4.45Ghz | Wet Devil 7900 XTX | 32GB DDR4 3200Mhz | 2TB NVME | EVGA 1300w | Win11 IoT

UNRAID | Xeon Gold 6248 40c/80t | 64GB + 512GB PMEM | RTX 3090Ti | 9300i-16P | 68TB HDD's | BeQuiet DP 13 1000w
HomeLab | E5-2697A 32c/64t | 128GB DDR4 2133Mhz | 2x2x1x1TB NVME Bifurcated | GTX 1060 3GB | X550 10Gbps 

Telegram / TS3Twitter

2700X to 5800X3D upgrade! With a 10850K cameo!

 

Posted
On 3/23/2026 at 5:14 PM, electrosoft said:

Intel Ultra 5 250k review. Out of the box it is clearly a meaningful step up over the 245k, not just in productivity but in gaming, even beating the 265k in many games and some productivity tests. We'll need a deep dive, and I'm sure some will pick up the 250k to bin. I'm really curious about the 270k now, more than ever....

 

 

FARtjfK.png

 

Some rudimentary memory scaling results (XMP modes across the board):

 

nHMVfVS.png

 

-------------------------------------------------------

 

Micro$slop MAY finally be listening about draconian account requirements for installs and setups. As a reminder, Apple, to this day, does NOT require an account to set up macOS.

 

-------------------------------------------------------

 

der8auer on the 270k price/performance. Cheaper and faster than a 285k.....

 

$299.99 is a killer price...

 

Definitely picking up a few to bin.

 

 

 

Hmm. Have you seen more info about this? Intel should stop making such changes within the same generation of chips (to upsell the refresh as something new). It's almost like Nvidia gating new fake-frame features to new-gen graphics cards. Soon you'll pay extra for software features or slightly tuned firmware/VBIOS versions to milk more money out of the more expensive GPU SKUs.

 

A popular benchmark, Geekbench, says it will issue a warning when Intel’s new “Arrow Lake Refresh” desktop chips enable Intel’s new IBOT feature. Why? Because the benchmark vendor can’t be sure that scores reported with it can be considered trustworthy.

 

https://www.pcworld.com/article/3099125/intel-new-performance-tool-casts-doubt-on-benchmark-scores.html

 

Regarding IBOT. NOT Intel APO. What will Intel brand the Optimizer software feature next time?

 

What buyers of Arrow Lake Refresh should know is that IBOT is not on by default, so if you want this performance boost, you must use Intel's software to activate the feature. It is also limited to 12 games at launch, with no concrete timeline for additional title support.

And yes, for those who’ve abandoned Windows: IBOT will eventually come to Linux. But Intel won’t say when—not just yet.

 

 

  • Thumb Up 1
  • Bump 1

"The Killer"  ASUS  Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 

Posted
1 hour ago, Papusan said:

 

Hmm. Have you seen more info about this? Intel should stop making such changes within the same generation of chips (to upsell the refresh as something new). It's almost like Nvidia gating new fake-frame features to new-gen graphics cards. Soon you'll pay extra for software features or slightly tuned firmware/VBIOS versions to milk more money out of the more expensive GPU SKUs.

 

A popular benchmark, Geekbench, says it will issue a warning when Intel’s new “Arrow Lake Refresh” desktop chips enable Intel’s new IBOT feature. Why? Because the benchmark vendor can’t be sure that scores reported with it can be considered trustworthy.

 

https://www.pcworld.com/article/3099125/intel-new-performance-tool-casts-doubt-on-benchmark-scores.html

 

Regarding IBOT. NOT APO. 

 

What buyers of Arrow Lake Refresh should know is that IBOT is not on by default, so if you want this performance boost, you must use Intel's software to activate the feature. It is also limited to 12 games at launch, with no concrete timeline for additional title support.

And yes, for those who’ve abandoned Windows: IBOT will eventually come to Linux. But Intel won’t say when—not just yet.

 

 

 

https://browser.geekbench.com/v6/cpu/17330779

 

Geekbench doesn't like me being able to wipe the field of other x86 CPUs like this lol. Can't hurt AMD fanboy feelings. 

 

But I absolutely agree Intel needs to stop with the single-gen optimization BS. They need to commit to a timeline, say five years of guaranteed support for SKU optimizations. It's my biggest gripe with this strategy. I do feel Intel is not going to stop here with IBOT, though; I expect multiplayer games will be added soon, especially after listening to Robert at Intel speak recently, when he very quickly said "not yet".

 

IBOT and the large cache next gen are very likely going to be a big game changer for Intel; they will eventually force IBOT on by default, just like APO.

  • Haha 1
  • Bump 1
Spoiler

The Beast Asus Z790 APEX | Intel i9 13900K | ASUS RTX 4090 Strix OC | 64gb DDR5 7466 CL34 Dual Rank A-Dies | Samsung 990 Pro 2TB | Innocn 4K 160Hz Mini LED HDR1000 | LG 27GN950-B 4K 160Hz | Corsair 170i Elite LCD 420mm AIO | Corsair 7000D | EVGA 1600w T2

Little Beast EVGA Z690 DARK | Intel i9 13900K | Nvidia RTX 4090 FE | 32gb DDR5 SK Hynix DDR5 8000 CL36 A-Dies | Samsung 980 Pro 2TB | LG OLED C1 4K 120Hz G-Sync/FreeSync | Alienware AW2721D 1440p 240Hz G-Sync Ultimate | Corsair 115i Elite 280mm AIO | Lian Li 011 Dynamic | EVGA 1000w P6

 

 

Posted

Looks like I might have to RMA sometime soon.

 

The X870E Hero's LCD display is starting to go on the fritz and flash random colors in patches. A Google search shows this is a known problem and eventually it will die flat out. Bummer.

 

----------------------------------------------------------------------

 

On 3/26/2026 at 2:38 PM, win32asmguy said:

 

Yeah, UMAF did work with both the A51 18 and Dell Pro Max 18 Plus. Sometimes it would freeze or fail to apply settings, though, so you have to keep trying if something did not stick. Just be aware that with any later BIOS update Dell may write-protect the EFI setup variable region, which effectively disables modifying any hidden settings.

 

With the Legion 9i, SREP worked on the shipping BIOS but on none of the updates. NGU and D2D can be adjusted, but on this one memory timing changes always fail to train. It's basically stuck at 5200MT CL38 dual rank (the 64GB Kingston Fury kit). M-die also worked at 5600MT, but timings were much worse than on the Kingston kit.

 

There will be an 18-inch Uniwill chassis, but this year it only goes up to a 5070 mobile. Next year it's supposed to have better GPU options, according to Prema, so Eluktronics should end up carrying it.

 

Yeah, I remember when they did that on my 11th-gen XPS laptop from them: after I altered the cfg values to enable TS support, a subsequent BIOS update locked it down even further. It's problematic, so that's why I was asking about your experience with Smokeless.

 

A 5070 is just a bit too underpowered on the laptop scene for me at this point. A 5070ti is the bare-minimum entry level 😞

 

All I want is an 18" laptop with full BIOS control and at least a 5070ti+ with a super beefy cooling system and space to mod.....am I asking too much? 🤣

 

1 hour ago, Papusan said:

 

Hmm. Have you seen more info about this? Intel should stop making such changes within the same generation of chips (to upsell the refresh as something new). It's almost like Nvidia gating new fake-frame features to new-gen graphics cards. Soon you'll pay extra for software features or slightly tuned firmware/VBIOS versions to milk more money out of the more expensive GPU SKUs.

 

A popular benchmark, Geekbench, says it will issue a warning when Intel’s new “Arrow Lake Refresh” desktop chips enable Intel’s new IBOT feature. Why? Because the benchmark vendor can’t be sure that scores reported with it can be considered trustworthy.

 

https://www.pcworld.com/article/3099125/intel-new-performance-tool-casts-doubt-on-benchmark-scores.html

 

Regarding IBOT. NOT Intel APO. What will Intel brand the Optimizer software feature next time?

 

What buyers of Arrow Lake Refresh should know is that IBOT is not on by default, so if you want this performance boost, you must use Intel's software to activate the feature. It is also limited to 12 games at launch, with no concrete timeline for additional title support.

And yes, for those who’ve abandoned Windows: IBOT will eventually come to Linux. But Intel won’t say when—not just yet.

 

 

 

Yeah, it seems to be the new thing: adding new features only to the newest products at the time. Intel is potentially pulling an AMD and Nvidia move. This isn't new, as APO really didn't go anywhere. Luckily it had WoW support, and I still enable it on the wife's system for her SP109 14900KS. I wouldn't expect IBOT to have long, far-reaching support that builds upon Arrow Lake and continues to extend to it and Nova in the future, but you never know.

 

Either way, if Intel is able to dynamically reorder execution instructions to extract better performance, I'm here for it.

 

37 minutes ago, Talon said:

 

https://browser.geekbench.com/v6/cpu/17330779

 

Geekbench doesn't like me being able to wipe the field of other x86 CPUs like this lol. Can't hurt AMD fanboy feelings. 

 

Nice!

 

Still crazy how bonkers Apple silicon ST scores are now, but I'm so glad to see Intel making the right moves lately, from Battlemage to returning to mobile dominance to enhancing and lowering the prices of Arrow Lake, and not doing a cash grab with a 290k.

 

All reports show Nova is going to be a monster.

 

I haven't picked up a 270k individually yet as prices shot up, but I did snag a combo deal on hold from Microcenter for $617.50 to contemplate: a 270k, an MSI Z890 Tomahawk, and 2x16GB of Crucial DDR5-6400 that I would just use in my Jonsbo Z20 mATX case.

 

Still mulling over whether I want to pick that up, suck it up and pay the current $349.99 price tag, or just wait till the first-wave hype dies down and prices settle back a bit.

 

Kinda want to play around with an MSI z890 board though too for comparison....

 

-----------------------------------------------------------

 

In before the "well, max-tuned chip vs chip, I think the 14600k would be better, blah blah blah" crowd pipes up... not realizing 99% of users aren't going to touch their settings at all, and even just enabling XMP is scary to them....🤣

 

 

 

 

  • Like 1

Electrosoft Alpha:  9800X3D  | Asus X870E Hero Crosshair  | MSI Vanguard RTX 5090 OC | AC LF II 420 | TG 2x24GB 8200 @ 8000 tuned  | Samsung 9100 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Heath: i9-14900KS SP109 | AC LF III 360  | Asus Strix Z690 D4 | Gigabyte 9070XT Gaming OC Edition| 32GB DDR4 2x16GB B-Die 4000  | Samsung 980 1TB Pro |  Antec Flux Pro  | Samsung G7 32" 165hz 32"

Alienware Area-51  18 | 275HX | Nvidia RTX 5070ti  | 32GB DDR5 6400  |Gen 5 2TB | 18" QHD | WiFi 7

 

 


 

Posted

As far as I am concerned, any hardware "feature" that requires installing software to function is not actually a feature, but a stupid gimmick that should be regarded as worthless. If the feature only works in Windows it is even more worthless. Having to install software is an unacceptable joke, even if Linux support is provided for it. The PC tech industry is quickly becoming the domain of idiots and profusely littered with trash. Hardware should have full functionality that requires no software and is OS agnostic.

  • Like 1

WRAITH | X870E Apex | 9950X | RTX 5090 | 32GB DDR5 @ 8200 | O11 XL EVO | HC-500A Chiller

BANSHEE | B850MPOWER | 4585PX | RTX 5080 | 48GB DDR5 @ 8000 | XT M3

Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

Posted
10 minutes ago, Mr. Fox said:

As far as I am concerned, any hardware "feature" that requires installing software to function is not actually a feature, but a stupid gimmick that should be regarded as worthless. If the feature only works in Windows it is even more worthless. Having to install software is an unacceptable joke, even if Linux support is provided for it. The PC tech industry is quickly becoming the domain of idiots and profusely littered with trash.

 

Software or huge driver packages to try to fix their hardware so it functions properly is lame. In the old days you didn't have to install software or drivers to make Intel chips work. Even the chipset driver isn't a real driver.

 

**** also feels like a workaround **** rather than a fundamental improvement in game performance. Instead of broad architectural gains, this approach selectively enhances individual titles through predefined profiles. The biggest limitation is support

 

https://www.guru3d.com/review/core-ultra-5-250k-and-7-270k-plus-processor-review/page-30/

 

 

  • Thumb Up 1

"The Killer"  ASUS  Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 

Posted
13 minutes ago, Mr. Fox said:

As far as I am concerned, any hardware "feature" that requires installing software to function is not actually a feature, but a stupid gimmick that should be regarded as worthless. If the feature only works in Windows it is even more worthless. Having to install software is an unacceptable joke, even if Linux support is provided for it. The PC tech industry is quickly becoming the domain of idiots and profusely littered with trash.

 

Being misled by tech tubers telling you that you have to install 'additional' software was your first mistake. The Intel Platform Performance Package is literally a motherboard driver that you install, just like my 9850X3D has chipset drivers that install a 3D V-Cache Performance Optimizer and Platform Power Management drivers. You no longer have to enable anything in the BIOS (it's all on by default), and you no longer download the overlay (never required before anyway) from the Microsoft Store, as it's all been brought in-house: a single Intel PPP driver installs everything. The Intel PPP driver is now on the Intel website as well as your motherboard support page. https://www.intel.com/content/www/us/en/download/869519/intel-platform-performance-package.html

 

Intel simply brought it all under one roof now, a single driver that does it all, which is how it should have been. It's a great fix.

 

So PPP isn't a gimmick, it's literally Platform Power Management and Optimizations that every single vendor does. I'm not sure how this misunderstanding even started. 


The Beast Asus Z790 APEX | Intel i9 13900K | ASUS RTX 4090 Strix OC | 64gb DDR5 7466 CL34 Dual Rank A-Dies | Samsung 990 Pro 2TB | Innocn 4K 160Hz Mini LED HDR1000 | LG 27GN950-B 4K 160Hz | Corsair 170i Elite LCD 420mm AIO | Corsair 7000D | EVGA 1600w T2

Little Beast EVGA Z690 DARK | Intel i9 13900K | Nvidia RTX 4090 FE | 32gb DDR5 SK Hynix DDR5 8000 CL36 A-Dies | Samsung 980 Pro 2TB | LG OLED C1 4K 120Hz G-Sync/FreeSync | Alienware AW2721D 1440p 240Hz G-Sync Ultimate | Corsair 115i Elite 280mm AIO | Lian Li 011 Dynamic | EVGA 1000w P6

 

 

Posted
Just now, Mr. Fox said:

As far as I am concerned, any hardware "feature" that requires installing software to function is not actually a feature, but a stupid gimmick that should be regarded as worthless. If the feature only works in Windows it is even more worthless. Having to install software is an unacceptable joke, even if Linux support is provided for it. The PC tech industry is quickly becoming the domain of idiots and profusely littered with trash.

 

1 minute ago, Papusan said:

 

Software to fix fundamental hardware functions is lame. In the old days you didn't have to install software or drivers to make Intel chips work. Even the chipset driver isn't a real driver. 

 

**** also feels like a workaround **** rather than a fundamental improvement in game performance. Instead of broad architectural gains, this approach selectively enhances individual titles through predefined profiles. The biggest limitation is support

 

https://www.guru3d.com/review/core-ultra-5-250k-and-7-270k-plus-processor-review/page-30/

 

 

 

Intel and even AMD have offered, and are now offering, CPU optimization tools to compensate primarily for scheduler issues with these hybrid chips; the next level beyond that is dynamically re-ordering (aka optimizing) execution for optimal performance.

 

We are basically talking drivers here. 

 

I don't see a problem with these drivers / tools. Instead, I see a problem with Windows and, to a lesser degree, with developers making trash, bloated, unoptimized software.
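The thread-direction idea behind these drivers can be sketched in a few lines. This is a toy illustration with a made-up topology and threshold, not Intel's or AMD's actual algorithm:

```python
# Toy sketch of APO-style thread direction on a hybrid CPU.
# The topology and threshold are invented for illustration only.

def pick_cores(core_types, thread_load, p_threshold=0.5):
    """Pick the set of core ids a thread should be scheduled on.

    core_types: list like ["P", "P", "E", "E"], indexed by core id.
    thread_load: 0.0..1.0 estimate of how demanding the thread is.
    Demanding threads are steered to P-cores, background ones to E-cores.
    """
    wanted = "P" if thread_load >= p_threshold else "E"
    cores = {i for i, t in enumerate(core_types) if t == wanted}
    # If the preferred core type doesn't exist, allow every core.
    return cores or set(range(len(core_types)))
```

On Linux you could apply the result with `os.sched_setaffinity(tid, cores)`; the vendor drivers do the equivalent steering in cooperation with the OS scheduler.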

 

It's like when you run Time Spy or Steel Nomad and you can see a certain performance delta between AMD and Nvidia, so that sets a performance bar.

 

We know the 5090 is head and shoulders more powerful than the 9070 XT / 7900 XTX, yet you will suddenly see titles where the gap narrows to a degree not at all reflective of benchmarks.

 

Then you go into games and the performance varies anywhere between 20 and 90% depending on the title being tested, not only AMD vs Nvidia but even Nvidia vs Nvidia and AMD vs AMD. The variance and bloat are insane.
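To put numbers on that variance, the relative advantage of one part over another in a given title reduces to a simple percentage. The FPS figures below are hypothetical, purely to show how a synthetic-benchmark lead can shrink in a poorly optimized game:

```python
def perf_delta(fps_a, fps_b):
    """Relative advantage of part A over part B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical numbers: a ~60% synthetic lead collapsing to ~20% in-game.
synthetic_lead = perf_delta(160, 100)  # about 60%
in_game_lead = perf_delta(120, 100)    # about 20%
```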

 

This can extend even to CPU performance. Some games have horrific CPU optimization. It was no shock that WoW was on Intel APO's short list of supported games.

 

WoW is guilty of this and needs a major overhaul. Player physics absolutely tanks performance regardless of the CPU and leaves the GPU twiddling its thumbs.

 

Fallout 76 is guilty too and needs a major overhaul. The differential in X3D vs Intel for performance is insane.

 

So many console ports relying on the grunt of the CPU/GPU and not optimizing properly need it too.

 

Watching Vex's take on Crimson Desert shows some truly great scaling and optimization vs so many bloated corpses masquerading as competent game engines.

 


 

 

4 minutes ago, Talon said:

 

[...]

 

Basically this...........

 

I'm not going to look a gift horse in the mouth and NOT accept CPU or GPU makers doing what they can to optimize and present their products in the best light to my benefit no matter the scale or exclusivity.

 

I do get rankled a bit when they have the ability to backport the technologies to previous CPUs/GPUs but don't, trying instead to push current iterations as hard as possible....but that's for another day. 🙂

 

 

 

 


Electrosoft Alpha:  9800X3D  | Asus X870E Hero Crosshair  | MSI Vanguard RTX 5090 OC | AC LF II 420 | TG 2x24GB 8200 @ 8000 tuned  | Samsung 9100 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Heath: i9-14900KS SP109 | AC LF III 360  | Asus Strix Z690 D4 | Gigabyte 9070XT Gaming OC Edition| 32GB DDR4 2x16GB B-Die 4000  | Samsung 980 1TB Pro |  Antec Flux Pro  | Samsung G7 32" 165hz 32"

Alienware Area-51  18 | 275HX | Nvidia RTX 5070ti  | 32GB DDR5 6400  |Gen 5 2TB | 18" QHD | WiFi 7

 

 


 

Posted
2 minutes ago, electrosoft said:

 

 

[...]

 

Remember when they told us that 13th and 12th gen wouldn't work with APO? That is my biggest issue with Intel. They did say IBT has specific silicon "hooks", but I doubt it. They said they're looking into bringing it in some fashion to ARL 1. That tells me they're waiting to see how initial sales look before they make that decision. 


The Beast Asus Z790 APEX | Intel i9 13900K | ASUS RTX 4090 Strix OC | 64gb DDR5 7466 CL34 Dual Rank A-Dies | Samsung 990 Pro 2TB | Innocn 4K 160Hz Mini LED HDR1000 | LG 27GN950-B 4K 160Hz | Corsair 170i Elite LCD 420mm AIO | Corsair 7000D | EVGA 1600w T2

Little Beast EVGA Z690 DARK | Intel i9 13900K | Nvidia RTX 4090 FE | 32gb DDR5 SK Hynix DDR5 8000 CL36 A-Dies | Samsung 980 Pro 2TB | LG OLED C1 4K 120Hz G-Sync/FreeSync | Alienware AW2721D 1440p 240Hz G-Sync Ultimate | Corsair 115i Elite 280mm AIO | Lian Li 011 Dynamic | EVGA 1000w P6

 

 

Posted
1 hour ago, Talon said:

 

[...]

 

1 hour ago, electrosoft said:

 

[...]

 

1 hour ago, Talon said:

 

[...]

Windoze is a broken surveillance tool and a raging dumpster fire of an OS. Literally a digital dung heap.

 

I do not have a problem with drivers. If functionality requires any Windoze service(s) to be running, consuming memory and CPU cycles, for the feature to work, then it is a software gimmick and not a hardware feature. Now, it may be a necessary evil and an unavoidable fix for the raging Micro$lop dumpster fire, but it is still not a hardware feature if it depends on Windoze services to be running.

 

If it is a hardware feature, it will work in a UEFI shell or in a Linux environment, because it is baked into the CPU or GPU and functions at the firmware level, and there is no way for it "not to work" in the wrong OS environment. If it is a hardware feature, there is either no way to turn it on or off, or you do it in the BIOS, not Windoze, and it will not care if you are running an old version of Windows 10 or 11 that doesn't have certain updates installed.
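That distinction is easy to check from a bare Linux environment: genuine ISA-level features show up as CPU flags with no vendor software installed. A minimal sketch (the sample text in the comment is illustrative, not from a real dump):

```python
def has_cpu_flag(cpuinfo_text, flag):
    """Return True if a /proc/cpuinfo-style dump lists the given flag."""
    for line in cpuinfo_text.splitlines():
        # x86 uses a "flags" field; ARM kernels call it "Features".
        if line.split(":")[0].strip() in ("flags", "Features"):
            return flag in line.split(":", 1)[1].split()
    return False

# On a real Linux box:
#   has_cpu_flag(open("/proc/cpuinfo").read(), "avx2")
```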


WRAITH | X870E Apex | 9950X | RTX 5090 | 32GB DDR5 @ 8200 | O11 XL EVO | HC-500A Chiller

BANSHEE | B850MPOWER | 4585PX | RTX 5080 | 48GB DDR5 @ 8000 | XT M3

Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

Posted
2 hours ago, electrosoft said:

All I want is an 18" laptop with full BIOS control and at least a 5070ti+ with a super beefy cooling system and space to mod.....am I asking too much? 🤣

 

To be completely honest, that is a tough one. I also tested the Micro Center Raider 18 with the 285HX, 5090m and a 1600p 240Hz panel. While it does have the advanced BIOS available, I ran into issues with its audio getting distorted under combined loads, very high temperatures (over 110C on the stock system memory) and eventual crashes under load that needed an EC reset to recover. Maybe just a bad sample? Maybe single-rank memory would run cooler, or it's just something about the Micron 32GB modules? Although I found many others had similar issues on MSI's global forums with 2025 Intel Vector/Raider/Titan models...

 

Today's project has been turning the media center back into a gaming machine for my wife on our 4K living room TV. We picked up one of those Midnight Edition 5070 GPUs at Micro Center, which seems to perform fine. With the 285K in 200S Boost mode and the 48GB of memory at 8000 CL40, we are seeing 100fps in WoW running around Silvermoon without RT enabled. I need to find a nicer-looking ATX case that has space for 3x 3.5-inch HDDs and preferably a single 5.25-inch Blu-ray drive. The Fractal Design Define 7 XL is far too big for the space we have available under the TV.

Desktop - Intel 285K, Asus Z890 Apex, 48GB DDR5-8400 C36, 800GB Optane P5800X, Corsair HX1500i, Fractal Define 7 XL, Windows 10 Pro 22H2

Lenovo Legion 9i G10 - 275HX, 2x32GB Kingston Fury DDR5-5200 CL38, 4TB WD 8100, RTX 5090 mobile, 18.0 inch FHD+ 440hz IPS, Windows 10 Pro 22H2

Hydroc G2 / Uniwill IDY X6AR559Y - 275HX, 2x16GB DDR5-6400 CL38, 4TB WD SN850X, RTX 5090 mobile, 16.0 inch QHD+ 300hz MiniLED, Windows 11 Pro 24H2

Posted

The new ASUS BIOS is working well on the Apex. My system did not care for AGESA 1.3.0.0 or 1.3.0.0a but 1.3.0.1 seems good. They added that new "Crosshair Tweak" option in the memory settings. I am using mode 2 (Normal) because I don't give a flying hoot about mitigating the extremely remote possibility of being inconvenienced by a row hammer exploit. 

 

There is also a newer ZenTimings beta in the Discord available for download.

 



WRAITH | X870E Apex | 9950X | RTX 5090 | 32GB DDR5 @ 8200 | O11 XL EVO | HC-500A Chiller

BANSHEE | B850MPOWER | 4585PX | RTX 5080 | 48GB DDR5 @ 8000 | XT M3

Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

Posted

I recently picked up the new version of Painkiller in a Steam sale. I have several of the original Painkiller titles and loved playing them. The new one is no exception: a bit flashier, with better graphics, but just as fun. Your sole objective is to kill endless hordes of demons without getting killed in the process. Very similar to Doom, but with less keyboard effort needed.

 

 

WRAITH | X870E Apex | 9950X | RTX 5090 | 32GB DDR5 @ 8200 | O11 XL EVO | HC-500A Chiller

BANSHEE | B850MPOWER | 4585PX | RTX 5080 | 48GB DDR5 @ 8000 | XT M3

Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

Posted
1 hour ago, Mr. Fox said:

[...]

 

 


This looks pretty sick, I especially like that it's co-op. 


The Beast Asus Z790 APEX | Intel i9 13900K | ASUS RTX 4090 Strix OC | 64gb DDR5 7466 CL34 Dual Rank A-Dies | Samsung 990 Pro 2TB | Innocn 4K 160Hz Mini LED HDR1000 | LG 27GN950-B 4K 160Hz | Corsair 170i Elite LCD 420mm AIO | Corsair 7000D | EVGA 1600w T2

Little Beast EVGA Z690 DARK | Intel i9 13900K | Nvidia RTX 4090 FE | 32gb DDR5 SK Hynix DDR5 8000 CL36 A-Dies | Samsung 980 Pro 2TB | LG OLED C1 4K 120Hz G-Sync/FreeSync | Alienware AW2721D 1440p 240Hz G-Sync Ultimate | Corsair 115i Elite 280mm AIO | Lian Li 011 Dynamic | EVGA 1000w P6

 

 
