
Everything posted by 1610ftw
-
MSI has been limiting themselves for some time now (three generations) by offering almost exclusively UHD screens in their top-tier laptops, despite evidence that this hurts sales of these top-of-the-line units, especially the gaming variants - and even more so given the absence of G-Sync, which would be less of an issue with a QHD+ screen. Agreed on the vapor chamber, provided it is really significantly better, and apparently it is for the CPU. I would prefer to have it in the 4080 and 4090 SKUs and only offer the heatpipe solution for the 4070 SKUs that are available in certain countries. It is also hard to justify four or more different models that, apart from having two different GPUs, only differentiate themselves by their memory or storage, so fully agreed on that. Anything less than 32GB of memory does not make much sense any more, and a 2TB drive is a size that is actually of some use and a good fit for a top-tier system without breaking the bank. It would also help specialized retailers to deck out the units for their customers as they require it.
-
I think there was really also some bad luck involved - this does not always happen. Regarding the black bar, have you checked the post by @win32asmguy about taking his unit apart?
-
Obviously too late now, but the XMG bios will be good enough for 155W GPU and 170W CPU in combined use, 128GB of memory @3200 with relaxed timings, and overall very stable operation. Of course we always like to push things, but taking apart the X170 isn't that much fun, and if you want to continue to try a few things with the bios you may want to think about a cutout in order to be able to reach the bios chip the next time around.
-
Same here, no XTU for me, and CCC is for now gone completely. I only tried the free version of the Obsidian software and it intermittently stopped working when in maximum fan mode and would revert to auto mode. At the time it was not possible to try the full-fledged version, and even then people (not you) reported issues with the X170 series, so I did not want to spend the money - and needless to say it was not much longer until Obsidian went out of business. With Clevo Fan Control things have been very relaxed, and I even ported over my system SSD to another X170 without any issues, except of course for my Windows activation. No issue at all that very important hardware (chipset, CPU and GPU) changed - very nice.
-
Whatever is missing, it seems to be the best of the rest, and what is most important for me is that at least here it runs stably and does not cause any issues - which I cannot say about CCC, Obsidian or RLEC viewer, which all gave me trouble sooner or later. And that is not even taking into account that CCC is also a hideous monstrosity.
-
Just use Clevo Fan Control, it is decent enough when you only have two fans; it is discussed in this thread: I talk about it a little in the last post, here is a screenshot: I have found that with the bottom cover attached I usually gravitate to the "Min 30% on AC" setting, as it is the minimum that still allows good temps. Without the bottom cover less is possible, and this is what I currently use as it lowers my SSD temps by up to 25°C.
-
MSI seems to think that the Titan is worth about $1000 more for its added features when equipped with comparable memory, GPU, CPU and screen. Those features seem to be:
- part-mechanical keyboard
- new touchpad that is higher quality and also easier to clean
- 270W CPU and GPU combined vs 250W for the Raider
- vapor chamber
- 400W GaN charger that is both lighter and more compact
The downsides would be:
- trackpad has to be illuminated to know where its borders are
- no fingerprint sensor
- change of haptic feedback when going back and forth between arrow keys/numpad and the mechanical part of the keyboard
- slightly taller at the front due to the new trackpad
Personally I prefer the cleaner design of the Raider, and it also has an RGB bar at the front. OK, just kidding about the RGB bar - I find that stuff hideous and would check if I could take it out; I did that with both my Clevo X170s.
-
Great to see that you are spending some time with it 😀 And yes, it would be rather cool to be able to put a 13900K or 14900K in there!
-
OK, so the best way to handle keyboard lights for me is to boot from an external drive and set the lights there via CCC. They will then be saved to the bios, but the downside is that nothing can be done with the keyboard lights from then on - I cannot even switch off the keyboard lights. Tested with both the X170KM-G and X170SM-G.
-
Coming back to OpenRGB after unsuccessful previous attempts. Yesterday I swapped my system drive and my Clevo not only reverted back to its nasty blue default, but without CCC I am not even able to switch off the keyboard lights 😞 Has anybody had success with a program like OpenRGB and is currently using it for per-key RGB configuration? I would really like to finally color-code some keys, but at the moment the only way to do that looks to be a multi-boot system where I boot into an installation that I only keep around for setting the RGB, without the nasty CCC polluting my main system.
-
The nice thing about the 3080 is that its power consumption is quite low. Are you running it at 150W? In any case the performance is better than a 200W+ 2080 Super. I have a 3080 coming that will go into my X170SM-G, which should be the best of both worlds - 4 usable SSD slots, 10 cores and a GPU with 16GB of memory 🙂
-
Nice! You might want to try the 180W bios - scored more or less the same for me as 200W and should allow you to go beyond 11K.
-
There are a bunch of changes that you have to make in the bios to enable undervolting / overclocking - have you done that? If not, please check out the videos below. The second one goes through all settings, I believe, and the other one is more geared towards the GT77, but I am not sure if it gives step-by-step instructions. Personally I prefer to do everything in Throttlestop instead of the bios or Intel XTU. It allows much easier tweaking, and as my laptops usually only get rebooted once a week, if at all, I do not mind having to start Throttlestop manually. There are also easily switchable presets and even an automatic switch between battery and AC power if you want to. In any case it is a very handy tool that also gives me nice info in the task bar about power usage and CPU and GPU temps, so it is what I always use. I just hope that things are not so borked that you will not be able to use XTU / Throttlestop, as that would be horrible when on a power budget!
-
1080 SLI performance: it can easily do between 15,000 and 16,000 points in Time Spy without any shunt modding or other excessive measures needed - just get your cooling straight and use MSI Afterburner. Finally surpassed by a single card 3 generations, 5 iterations and more than 6 years later - that is a very long time!
2016: 1080 SLI
2080
2080 Super
3080
3080 Ti
2023: 4080/4090
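Purely for fun, the generation arithmetic above can be sanity-checked in a few lines of Python (the single-GPU card list comes straight from the list above; the architecture names are my own labels):

```python
# Single-GPU steps between the 1080 SLI setup (2016) and the cards
# that finally beat it (2023), as listed in the post.
iterations = ["2080", "2080 Super", "3080", "3080 Ti", "4080/4090"]
generations = {"Turing", "Ampere", "Ada Lovelace"}  # my labels, not from the post

years_elapsed = 2023 - 2016

# 5 iterations, 3 generations, more than 6 years.
print(len(iterations), len(generations), years_elapsed)
```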
-
Always nice to be mocked by level 1 support after the machine was messed up through their firmware updates. From what I have seen, MSI used to be pretty good about allowing CPUs to attain the maximum power the system could sustain. Maybe I could understand them setting the long-term limit to 160 or 150W with these new CPUs that in theory could be set to consume just south of 200W, which is a bit too much, but going down to 120W is a waste of the 13980HX. For now you may want to check how much you can undervolt your GT77 with the new target of 120W - it should still give you quite good performance, and an optimized undervolt should give at least 10% higher performance than stock. So if you did not undervolt before, your performance will probably be quite similar and not suffer that much. I would still try to roll back those recent "improvements", but for now that should help.
-
How would that even be possible when you are using Throttlestop and an unlocked 13900HX?
-
I never game so no gaming benchmarks 🙂 My stock 2080 with 155W is 10.2K. Stock 2080 with 200W would be closer to 11K.
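As a rough illustration of the scaling those two numbers imply (score figures taken from the post and treated as approximate, not measured by me):

```python
# Approximate Time Spy scores for a stock 2080 at two power limits,
# as quoted in the post ("10.2K" at 155W, "closer to 11K" at 200W).
score_155w, score_200w = 10200, 11000

score_gain_pct = (score_200w - score_155w) / score_155w * 100
power_gain_pct = (200 - 155) / 155 * 100

# Roughly 8% more score for roughly 29% more power - diminishing returns.
print(f"{score_gain_pct:.1f}% more score for {power_gain_pct:.1f}% more power")
```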
-
Thanks a lot, a lot of work went into this and it is nice to see it documented!
-
Nah, that would be too easy 😄
-
Yes indeed, for a non-gamer like me who already lost G-Sync with a new panel it would be quite tempting 😄
-
I doubt that dynamic boost will be much of an issue, probably a 5 to 7% reduction in performance but still better than the 2080 Super. The lack of GSync could of course be an issue for people who need it but everybody has to answer that for himself.
-
Obviously it would be a modification, but the 3070 and 3080 are running in P775, P870 and X170SM-G chassis of several members, so it is possible. The question is if you need it - if temps are fine and you are mostly gaming at FHD resolution I would not expect it to be that big of an upgrade, but if you want maximum power savings / a quieter system or could use some added GPU memory, then it may be worth it.
-
I think my best was something like 11365, but that was the best I could get after many tries and with Afterburner activated.
Edit: I could not find the old scores, but here is my best result today with the P775 with MSI Afterburner and at 155W, about 7% more than without: So it is less than a 400 points difference compared to the MSI, or about 3.5% - not that much really.
No delidding here - I would first want to find a CPU that is worth it and in need of a delid. Probably less is gained if you already have a sub-10-degree temp difference between cores, and much more if one or more cores get a lot hotter than the rest.
In my experience it is OK to run a bit higher in Time Spy than in real life, so I regularly use at least 4.8 GHz for Time Spy, and I think it could probably be something like that for games, as games are usually less demanding on the CPU than a multicore benchmark run.
You have to keep in mind that 9900Ks vary quite widely with regard to power consumption and therefore temps and performance. My best 9900K draws at least 30% less power at 4.6 GHz than the worst one that I have seen - would not want that bad one in a laptop! So the short of it is that these can get VERY hot and there are three things to look for:
- binning
- cooling
- undervolting
That combo can easily make the difference between 4.4 and 4.8 up to 5.0 GHz, whereas I am not so sure how much better you can get with delidding when you already have a very good CPU.
As for the 8700K, it is probably pretty good for your use case and it does not need that much power compared to a 9900K, but 4.5 GHz seems a bit low and I would expect more after you have done some optimizations. How much higher depends on all the above factors, and games are a rather tricky load where both CPU and GPU are working hard - I usually only deal with that in benching.
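The percentage quoted above can be reproduced from the figures in the post (11365 for the MSI and the "less than a 400 points" gap; the gap is used as an upper bound):

```python
# Gap between the best MSI GT75 run and the P775 at 155W, per the post.
msi_best = 11365   # best MSI score quoted in the post
gap = 400          # "less than a 400 points difference" - upper bound from the post

gap_pct = gap / msi_best * 100
print(f"{gap_pct:.1f}%")  # matches the "about 3.5%" in the post
```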
-
OK, did a quick run without MSI Afterburner and I got 10.2K with a 155W max, which is the same power consumption as in my MSI GT75 that tops out at 11300+. HWiNFO data: The 9900K in the P775 is not exactly great, so I am currently thinking of putting in another one with better binning, but for this benchmark it made no difference as I limited it to 4.0 GHz all cores. For comparison, on a well set up system of that generation (Coffee Lake) I would try to run the CPU at 4.8 to 5.0 GHz all cores.