NotebookTalk

Everything posted by 1610ftw

  1. I only have 4.0 drives in mine - works perfectly 😃
  2. Separate slot: SATA only
     3 SSD array: the lower slot NVMe only, the other two NVMe/SATA
     I have used SATA SSDs up to 4TB and NVMe SSDs up to 8TB in it.
  3. Completely ridiculous! I got 192GB (4 x 48GB) of DDR5 for something like $550, and I think 128GB (4 x 32GB) was around $400, but I would have to look it up.
  4. 32GB more memory for $1100??? Are you sure that nothing else has been upped?
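     For context, the per-GB math behind those two posts (treating both prices as the approximate figures quoted above, not exact quotes):

     ```python
     # Rough price-per-GB comparison of buying memory at retail vs. the vendor upcharge.
     retail_kit = {"capacity_gb": 192, "price_usd": 550}      # 4 x 48GB DDR5 bought at retail
     vendor_upgrade = {"capacity_gb": 32, "price_usd": 1100}  # 32GB upcharge quoted by the vendor

     for name, kit in (("retail kit", retail_kit), ("vendor upgrade", vendor_upgrade)):
         print(f"{name}: {kit['price_usd'] / kit['capacity_gb']:.2f} USD per GB")

     # retail kit: ~2.86 USD per GB
     # vendor upgrade: ~34.38 USD per GB, i.e. roughly 12x the retail price per GB
     ```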
  5. This is the AMD version of the Raider 18 chassis, but the Chinese like to call it the Titan Pro, which they probably think sounds cooler. It has been online on the international website for some time now, so those "journalists" are not really that great and are merely blurting out stuff without even having a look - who would have thought 😄 You can find it here: https://www.msi.com/Laptop/Raider-A18-HX-A7VX And here in the overview of the Raider series: https://www.msi.com/Laptops/Products#?tag=Raider-Series I need at least 128GB of memory and either TB or USB 4, so it is not for me, but as high-powered BGA books with expandable memory and (possibly) storage go, it looks quite good, and luckily it has the QHD+ display.
  6. You have to look at the Leaderboard view; then everybody is only listed once and you get 9th place: https://www.3dmark.com/search#advanced?test=sw DX&cpuId=&gpuId=1372&gpuCount=0&gpuType=ALL&deviceType=ALL&storageModel=ALL&showRamDisks=false&memoryChannels=0&country=&scoreType=overallScore&hofMode=true&showInvalidResults=false&freeParams=&minGpuCoreClock=&maxGpuCoreClock=&minGpuMemClock=&maxGpuMemClock=&minCpuClock=&maxCpuClock= The CPU does not matter any more as this is a GPU-only score. I still think it is better to just use Time Spy, as they have about 100 times as many scores, which gives a better idea of how well the 3080 really performs: https://www.3dmark.com/search#advanced?test=spy P&cpuId=&gpuId=1372&gpuCount=0&gpuType=ALL&deviceType=ALL&storageModel=ALL&showRamDisks=false&memoryChannels=0&country=&scoreType=graphicsScore&hofMode=true&showInvalidResults=false&freeParams=&minGpuCoreClock=&maxGpuCoreClock=&minGpuMemClock=&maxGpuMemClock=&minCpuClock=&maxCpuClock= With a 150W power limit it will probably be very hard to crack the top 100 in that ranking.
  7. I have very little respect for software that mainly tries to spy on me to show me ads, or worse, not to mention the trend to force upon me their "new and better" way of doing things. It is sad to see how things have developed in that area, to the point where I consider software and apps by these big corporations a lost cause, to be avoided if possible. But I have to say that I still admire a lot of the hardware when it comes to pure performance - but who am I telling that to, as many of us have some more recent higher-end computer hardware. Everything else, like the trend to absolve themselves of any responsibility when things go wrong, incredible price hikes, dubious performance claims, stonewalling right to repair, and the trend to sell us iterations of the same thing as new and different, is not something I appreciate from these hardware manufacturers. The sad thing is that with everybody doing it, there isn't one of those big 5 chip / hardware manufacturers that one could point to and say "do it like they do", as everybody tries hard to be a bad example, even though they are doing it in different ways. Even so, from time to time a company like Framework surfaces, and they have a great concept and do things differently. I wish them all the best, but unfortunately it is not that easy to get excited about their stuff...
  8. Haha, never heard of that one 😄 Intel obviously wants to compete with Qualcomm in certain aspects, like implementing next to zero upgradeability, so Lunar Lake now supposedly comes with integrated memory - all in the name of AI and battery life, I guess. Apart from that, I have tried for a number of weeks to make sense of all the new stuff that is coming out, but have now given up, as it is an utter waste of time to try and make sense of the lunacy that unfolds in the race for the next generation of CPUs and GPUs or combined chips. From what I could gather before I stopped following the big AI love fest, there seems to be a competition between AMD, Apple, Intel, Nvidia and Qualcomm as to who can create the most deceptive and irrelevant charts for their new products, while often not even telling us the most basic specs or release dates. So I'll be back when I can see proper benchmarks that actually allow for comparisons to previous generations of hardware.
  9. You may want to tone down your posts and try to be nice although this obviously will take some effort 😄 I posted the link to the bios I use a few posts ago (XMG) but if you are looking for a different reseller bios that is not posted online it is not that likely that people have it as most users use either the last plain vanilla Clevo bios or the XMG version.
  10. Yep, that Kingston memory is much more likely to work within specs than the Mushkin, which seems to be hit and miss and also needs more voltage. It MAY work with 64GB of memory, but I would be very surprised if it works with 128GB. Looking forward to hearing whether you can sustain that CL20 timing with 128GB!
  11. Nice, and at only 1.2V! After a quick search of my own I was surprised that there is even CL18 memory from Mushkin: https://www.poweredbymushkin.com/Home/index.php/products2/item/130-ddr4/1581-mra4s320gjjm32gx2?ic=1 And while I did not find it on their site, they go as low as CL16: https://www.amazon.com/Mushkin-Redline-Notebook-Dual-Channel-MRA4S320GJJM16GX2/dp/B08QHMYCK3?th=1 I have no experience with them whatsoever, but it looks to me like this memory runs with tighter timings and higher voltage directly from the manufacturer, probably via an XMP profile, so it is a bit of a long shot that this will work. According to Amazon the reviews are mixed - working for some but not for others: https://www.amazon.com/product-reviews/B08QHMYCK3/ref=cm_cr_arp_d_viewopt_sr?ie=UTF8&filterByStar=all_stars&reviewerType=all_reviews&formatType=current_format&pageNumber=1#reviews-filter-bar So CL16 is probably a bit too ambitious...
  12. I would expect, but certainly not guarantee, the following behavior:
      • all these memory sticks will default to the timings I mentioned above if you do not mess with the bios
      • you will be able to go back to your previous memory in case you have issues
      • you MAY run into issues trying to chase tighter timings at 3200, where you will not be able to get everything back up and running
      Certainly I would expect it to be extremely unlikely to get tighter timings at 128GB, but you may find success at 64GB with only two memory sticks. So while I would not think it is worth a try to do any memory overclocking with all 4 sticks, there might be some potential with only two. Personally, if I were after tighter timings, I would first check whether they have been achieved by anybody else, what skill level those people had, and which kind of memory they tried. The next step would be to check how easy it is to use a programmer to bring your Clevo back to life in case something goes wrong and you brick it. Only then make your decision, and keep in mind that faster memory will only get you so far - we are talking about relatively modest improvements in your situation even for a best-case scenario (rough latency math below). Personally I would go with memory that others have running at 3200 for 64 or 128GB and be done with it, but in the end you have to decide if you want to try to go beyond that. Oh, and last but not least, I have used the latest XMG bios on mine and can only speak for that bios when it comes to supporting 128GB at 3200: https://download.schenker-tech.de/package/xmg-ultra-17-id-xul17e21/
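     Rough latency math (a sketch; the CL values are generic examples for DDR4-3200, not measurements from any specific kit):

     ```python
     # First-word latency in nanoseconds: CAS cycles divided by the memory clock.
     # DDR4-3200 transfers at 3200 MT/s, so the I/O clock runs at 1600 MHz.
     def cas_latency_ns(cl: int, transfer_rate_mts: int = 3200) -> float:
         clock_mhz = transfer_rate_mts / 2      # DDR: two transfers per clock
         return cl / clock_mhz * 1000           # cycles / MHz -> nanoseconds

     for cl in (22, 20, 18, 16):
         print(f"DDR4-3200 CL{cl}: {cas_latency_ns(cl):.2f} ns")

     # CL22 -> 13.75 ns, CL20 -> 12.50 ns, CL16 -> 10.00 ns: even the best case
     # only trims a few nanoseconds off the first access, which is why the
     # real-world gains tend to be modest.
     ```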
  13. I have now ordered one myself from here: https://www.ebay.com/itm/326153579810 It's a reasonable price and it has the QHD+ resolution that I prefer. It is a regular ca. 500-nit LCD screen, and if somebody wants to check it out, this seems like the best price by quite some margin for the 4080 option. Currently it is also the only option available with a QHD+ screen, as all other versions seem to have the UHD+ screen option that does not really work for my use case. I am especially interested in checking out how the 14900HX performs, as I already had a Titan for testing and was not that impressed with the one in that machine, and I also need to check if I can squeeze an additional 2230 SSD into the WiFi slot. I have wanted a laptop with 4 memory slots and an 18" 16:10 QHD+ screen for some time now and could not really warm up to the Titan, which looks silly to me with its combination of black, silver/grey and blue, plus the screen is UHD+ for now. The Raider is "only" black and red, plus it also costs a lot less, so it is worth a try for me.
  14. @win32asmguy It might be worth a mention that it is possible he will have to rescue his Clevo with a programmer if the memory testing goes wrong - not everybody is as willing to take apart a laptop as you are 🙂 @NBTUser I just checked and my exact timings per CPU-Z are 22-22-22-52. G.Skill has an offering with the same timings: https://www.gskill.com/product/2/197/1540866339/F4-3200C22D-64GRS That may or may not be overclockable to something faster, especially when you only use 2 of your 4 memory slots, but you should be aware that there are certain risks involved in playing with the memory timings, and going beyond the regular timings with all 4 slots occupied is probably pushing it.
  15. So can you switch off G-Sync and the error disappears?
  16. Just for my understanding: Even with only the HDMI cable connected to the 3080 unit you get this picture when you output 4K 60 RGB 8 bit?
  17. Not sure what you are trying to do - have you connected both outputs to one display or to two different displays?
  18. Your HDMI port is not supposed to support more than 4K 60 - you can't expect 4K 120 from it 🙂 (rough bandwidth math below). As for the DisplayPort output, you should be able to do 4K 120 there. Have you tried whether that works with your other SM-G that has the 2080S?
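     Rough bandwidth math behind the 4K 60 limit (a sketch; it assumes an HDMI 2.0 port with roughly 14.4 Gbps of usable data rate and ignores blanking overhead):

     ```python
     # Uncompressed video data rate: width x height x refresh rate x bits per pixel.
     def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
         return width * height * hz * bits_per_pixel / 1e9

     HDMI20_USABLE_GBPS = 14.4   # 18 Gbps raw minus 8b/10b encoding overhead

     for hz in (60, 120):
         need = video_gbps(3840, 2160, hz)
         verdict = "fits" if need < HDMI20_USABLE_GBPS else "does not fit"
         print(f"4K {hz} Hz RGB 8-bit: ~{need:.1f} Gbps -> {verdict} in HDMI 2.0")

     # ~11.9 Gbps for 4K 60 fits, ~23.9 Gbps for 4K 120 does not, which is why
     # 4K 120 needs the DisplayPort output (or HDMI 2.1 / DSC) on these machines.
     ```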
  19. Now if that isn't a huge increase in CPU score 😄
  20. This is not something that hasn't been discussed widely, have a look here: https://www.reddit.com/r/GamingLaptops/comments/13pkz35/trick_to_force_nvidia_dynamic_boost_for_mobile_gpu/ Maybe you need to uninstall the platform controller manager, but I am not sure about that, as my GPU back then did not have boost (Turing), so the driver did not help except for reducing power draw, and I have since installed another driver. And here is a request to implement an override in NVCleanstall: https://www.techpowerup.com/forums/threads/feature-request-enable-power-limit-slider-for-mobile-gpu-like-driver-528-24.310787/ This also addresses CPU behaviour during boost. I did not take any screenshots back then, as for me the slider working was not particularly exciting since it was already documented elsewhere.
  21. That's too bad. If you re-read, I was not claiming that the slider would work in your case, but it worked for me back then on a different card, so in principle this is possible on mobile cards. Back then no specific instruction was needed; it was just a matter of using the standard driver and MSI Afterburner. This is why I said that I only used it up to Turing, and you can find a couple of posts by me and others if you type in 528.24. Maybe in this specific case you also need the Prema bios? @MaxxD obviously has the Prema bios, so no idea if others will be able to get 165W without it.
  22. I can confirm that was the last driver where the slider worked. I have posted that before, and others have, too. I never used it for a card newer than Turing, though. As @MiRaGe says, there will probably be issues sooner or later, or maybe even now with certain games, which is why I think a shunt mod will be a more flexible and permanent solution, although of course there is a certain risk. Maybe that new Nvidia software that is supposed to also replace Afterburner will help too, but I doubt it, as Nvidia has rarely been known to do the right thing in these matters.
  23. Well, that would be a great idea - why has nobody thought about that before 😄 Actually I think it is a shitty idea by Nvidia, and that it would be much easier to set an honest max of 165W or preferably more, then add a slider that goes from, let's say, 100 to 200W and let the customer decide max power consumption. From the start it has been ridiculous that the 2080 Super, with more limited potential, is able to run at 200W while the 3080 is not allowed to run at 200W too, and the same now goes for the even more powerful 4090.
      • 150W 2080S to 200W -> small increase in performance
      • 150W 3080 to 200W -> pretty big increase in performance
      That is also the reason I would prefer to do a shunt mod and then try to dial in the 3080 at something like 180W max. Of course it is not easy to measure exact power consumption after a shunt mod, but with a watt meter and always running the same benchmarks that max out the GPU it should be possible (rough math below).
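     Rough math for dialing in a target like 180W after a shunt mod (a sketch with made-up resistor values - the 5 mOhm stock shunt and 15 mOhm parallel resistor are purely illustrative, not values taken from this board):

     ```python
     # A shunt mod puts a resistor in parallel with the current-sense shunt, so the
     # controller sees a smaller voltage drop and under-reports power by R_eff / R_stock.
     def parallel(r1_mohm: float, r2_mohm: float) -> float:
         return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

     R_STOCK = 5.0    # hypothetical stock shunt, milliohms
     R_ADDED = 15.0   # hypothetical resistor stacked on top, milliohms

     scale = R_STOCK / parallel(R_STOCK, R_ADDED)   # actual watts per reported watt

     print(f"scale factor: {scale:.3f}")                                  # 1.333
     print(f"reported 150 W -> actual ~{150 * scale:.0f} W")              # ~200 W
     print(f"for ~180 W actual, aim for ~{180 / scale:.0f} W reported")   # ~135 W
     ```
     A wall watt meter plus a repeatable GPU-bound benchmark, as described above, is still the sanity check, since the board sensors scale everything by the same factor.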
  24. These are not getting any younger, and unless you are benching you will gain very little beyond 200W. Here is the 150W 2080 in my GT75: https://www.3dmark.com/3dm/105739683 As you can see it is not far behind the best 2080 at 200W, and it will run a lot cooler at that power level, with a reduction in performance of a bit more than 5%. Here is my 2080S in the SM-G, obviously with some overclocking via MSI Afterburner, but at 200W: https://www.3dmark.com/3dm/91483697 So it is not that hard to surpass 12000 with the stock 200W. The 3080 scales better, so adding 20 to 30W to the 3080 would help more.
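     To put that 5% figure in perspective, a quick efficiency comparison using only the numbers quoted above:

     ```python
     # 25% less power for "a bit more than 5%" less performance on the 150 W 2080.
     power_ratio = 150 / 200   # 150 W limit vs. the 200 W card
     perf_ratio = 0.95         # roughly 5% slower, per the scores linked above

     efficiency_gain = perf_ratio / power_ratio
     print(f"performance per watt: ~{efficiency_gain:.2f}x the 200 W figure")   # ~1.27x
     ```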