NotebookTalk

1610ftw

Member
  • Posts: 1,028
  • Joined
  • Last visited
  • Days Won: 1

1610ftw last won the day on February 9, 2023

1610ftw had the most liked content!

2 Followers


1610ftw's Achievements

Mentor (12/14)

Recent Badges: One Year In · Posting Machine (Rare) · Very Popular (Rare) · One Month Later · Conversation Starter

Reputation: 811

  1. This is the AMD version of the Raider 18 chassis, but the Chinese like to call it the Titan Pro, which they probably think sounds cooler. It has been online on the international website for some time now, so those "journalists" are not really that great and are merely blurting out stuff without even having a look - who would have thought 😄 You can find it here: https://www.msi.com/Laptop/Raider-A18-HX-A7VX And here in the overview of the Raider series: https://www.msi.com/Laptops/Products#?tag=Raider-Series I need at least 128GB of memory and either Thunderbolt or USB 4, so it is not for me, but as high-powered BGA books with added memory and possible storage go, it looks quite good and luckily has the QHD+ display.
  2. You have to look at the Leaderboard view; then everybody will only be there once, and you get 9th place: https://www.3dmark.com/search#advanced?test=sw%20DX&cpuId=&gpuId=1372&gpuCount=0&gpuType=ALL&deviceType=ALL&storageModel=ALL&showRamDisks=false&memoryChannels=0&country=&scoreType=overallScore&hofMode=true&showInvalidResults=false&freeParams=&minGpuCoreClock=&maxGpuCoreClock=&minGpuMemClock=&maxGpuMemClock=&minCpuClock=&maxCpuClock= The CPU does not matter any more, as this is a GPU-only score. I still think it is better to just use Time Spy, as they have about 100 times as many scores, so it gives a better idea of how well the 3080 really performs: https://www.3dmark.com/search#advanced?test=spy%20P&cpuId=&gpuId=1372&gpuCount=0&gpuType=ALL&deviceType=ALL&storageModel=ALL&showRamDisks=false&memoryChannels=0&country=&scoreType=graphicsScore&hofMode=true&showInvalidResults=false&freeParams=&minGpuCoreClock=&maxGpuCoreClock=&minGpuMemClock=&maxGpuMemClock=&minCpuClock=&maxCpuClock= With a 150W power limit it will probably be very hard to crack the top 100 in that ranking.
  3. I have very little respect for software that mainly tries to spy on me to show me ads, or worse, not to mention the trend to force upon me their "new and better" way of doing things. It is sad to see how things have developed in that area, to the point where I consider software and apps by these big corporations a lost cause, to be avoided if possible. But I have to say that I still admire a lot of the hardware when it comes to pure performance - but who am I telling that, as many of us have some more recent higher-end computer hardware. Everything else - the trend to absolve themselves of any responsibility when things go wrong, incredible price hikes, dubious performance claims, stonewalling right to repair, and the trend to sell us iterations of the same as new and different, etc. - is not something I appreciate from these hardware manufacturers. The sad thing is that with everybody doing it, there isn't one of those big 5 chip/hardware manufacturers that one could point to and say "do it like they do", as everybody tries hard to be a bad example, even though they are doing it in different ways. Even so, from time to time a company like Framework surfaces with a great concept, doing things differently. I wish them all the best, but unfortunately it is not that easy to get excited about their stuff...
  4. Haha, never heard of that one 😄 Intel obviously wants to compete with Qualcomm in certain aspects, like implementing next to zero upgradeability, so Lunar Lake now supposedly comes with integrated memory - all in the name of AI and battery life, I guess. Apart from that, I have for a number of weeks tried to make sense of all the new stuff that is coming out, but have now given up, as it is an utter waste of time to try to make sense of the lunacy that unfolds in the race for the next generation of CPUs and GPUs or combined chips. From what I could gather before I stopped following the big AI love fest, there seems to be a competition between AMD, Apple, Intel, Nvidia and Qualcomm as to who can create the most deceptive and irrelevant charts for their new products, while often not even telling us the most basic specs or release dates. So I'll be back when I can see proper benchmarks that actually allow for comparisons to previous generations of hardware.
  5. You may want to tone down your posts and try to be nice, although this obviously will take some effort 😄 I posted the link to the BIOS I use a few posts ago (XMG), but if you are looking for a different reseller BIOS that is not posted online, it is not that likely that people have it, as most users run either the last plain-vanilla Clevo BIOS or the XMG version.
  6. Yep, that Kingston memory is much more likely to work within specs than the Mushkin, which seems to be hit and miss and also needs more voltage. It MAY work with 64GB of memory, but I would be very surprised if it works with 128GB. Looking forward to hearing whether you can sustain that CL20 timing with 128GB!
  7. Nice, and at only 1.2V! After a quick search of my own I was surprised that there is even CL18 memory from Mushkin: https://www.poweredbymushkin.com/Home/index.php/products2/item/130-ddr4/1581-mra4s320gjjm32gx2?ic=1 I did not find it on their site, but they go as low as CL16: https://www.amazon.com/Mushkin-Redline-Notebook-Dual-Channel-MRA4S320GJJM16GX2/dp/B08QHMYCK3?th=1 I have no experience with them whatsoever, but it looks to me like this memory runs with tighter timings and higher voltage directly from the manufacturer, probably via an XMP profile, so it is a bit of a long shot that this will work. According to Amazon the reviews are mixed - working for some but not for others: https://www.amazon.com/product-reviews/B08QHMYCK3/ref=cm_cr_arp_d_viewopt_sr?ie=UTF8&filterByStar=all_stars&reviewerType=all_reviews&formatType=current_format&pageNumber=1#reviews-filter-bar So CL16 is probably a bit too ambitious...
  8. I would expect, but certainly not guarantee, the following behavior:
  • all these memory sticks will default to the timings I mentioned above
  • if you do not mess with the BIOS, you will be able to go back to your previous memory in case you have issues
  You MAY run into issues trying to chase tighter timings at 3200, where you will not be able to get everything back up and running. Certainly I would expect it to be extremely unlikely to get tighter timings at 128GB, but you may find success at 64GB with only two memory sticks. So while I would not think it is worth a try to do any memory overclocking with all 4 sticks, there might be some potential there with only two. Personally, if I were after tighter timings, I would first check whether they have been achieved by anybody else, what skill level those people had, and which kind of memory they tried. The next step would be to check how easy it is to use a programmer to bring your Clevo back to life in case something goes wrong and you brick it. Only then make your decision, and keep in mind that faster memory will only get you so far - we are talking about relatively modest improvements in your situation even for a best-case scenario. Personally, I would go with memory that others have running at 3200 with 64 or 128GB and be done with it, but in the end you have to decide if you want to try to go beyond that. Oh, and last but not least: I have used the latest XMG BIOS on mine and can only speak for that BIOS when it comes to supporting 128GB at 3200: https://download.schenker-tech.de/package/xmg-ultra-17-id-xul17e21/
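To put those "relatively modest improvements" into absolute numbers, here is a quick back-of-the-envelope sketch - plain Python, nothing Clevo-specific, using the CL values that have come up in this thread. DDR4 first-word latency is the CAS cycle count divided by the I/O clock, which is half the transfer rate since DDR moves data on both clock edges:

```python
# First-word latency in nanoseconds for DDR4:
# latency_ns = CL * 2000 / transfer_rate (in MT/s),
# because the I/O clock runs at half the transfer rate.

def cas_latency_ns(cl, transfer_rate_mt_s):
    """Absolute CAS latency in ns for a given CL at a given DDR rate."""
    return cl * 2000 / transfer_rate_mt_s

for cl in (22, 20, 16):
    print(f"DDR4-3200 CL{cl}: {cas_latency_ns(cl, 3200):.2f} ns")
```

At DDR4-3200 the step from CL22 down to CL20 saves about 1.25 ns per access, and even CL16 only gets you to 10 ns from 13.75 ns - which is why a successful tighter-timings run typically buys only a few percent in real workloads.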
  9. I have now ordered one myself from here: https://www.ebay.com/itm/326153579810 It's a reasonable price and has the QHD+ resolution that I prefer. It is a regular ca. 500-nit LCD screen, and if somebody wants to check it out, this seems like the best price by quite some margin for the 4080 option. Currently it is also the only option available with a QHD+ screen, as all other versions seem to have the UHD+ screen option that does not really work for my use case. I am especially interested in checking out how the 14900HX performs, as I already had a Titan for testing and was not that impressed with the one the Titan had, and I also need to check if I can squeeze an additional 2230 SSD into the WiFi slot. I have wanted a laptop with 4 memory slots and an 18" 16:10 QHD+ screen for some time now and could not really warm up to the Titan, which looks silly to me with its combination of black, silver/grey and blue, plus the screen is UHD+ for now. The Raider is "only" black and red, plus it also costs a lot less, so it is worth a try for me.
  10. @win32asmguy It might be worth a mention that he may have to rescue his Clevo with a programmer if the memory testing goes wrong - not everybody is as willing to take apart a laptop as you are 🙂 @NBTUser I just checked, and my exact timings per CPU-Z are 22-22-22-52. G.Skill has an offering with the same timings: https://www.gskill.com/product/2/197/1540866339/F4-3200C22D-64GRS That may or may not be overclockable to be faster, especially when you only use 2 of your 4 memory slots, but you should be aware that there are certain risks involved in playing with the memory timings, and going beyond the regular timings with all 4 slots occupied is probably pushing it.
  11. So can you switch off G-Sync and the error disappears?
  12. Just for my understanding: Even with only the HDMI cable connected to the 3080 unit you get this picture when you output 4K 60 RGB 8 bit?
  13. Not sure what you are trying to do - have you connected both outputs to one display or to two different displays?
  14. Your HDMI port is not supposed to support more than 4K 60 - you can't expect 4K 120 from it 🙂 As for the DisplayPort output, you should be able to do 4K 120 there. Have you checked whether that works with your other SM-G that has the 2080s?
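The 4K 60 ceiling on an HDMI 2.0 port follows directly from link bandwidth, and a rough calculation shows why the DisplayPort output can go further. This is a back-of-the-envelope sketch: the 4400 x 2250 total timing is the standard CTA-861 4K timing including blanking, and the payload figures for HDMI 2.0 TMDS and DP 1.4 HBR3 are approximate:

```python
# Rough link-bandwidth check: why an HDMI 2.0 port tops out at 4K 60
# while a DP 1.4 output can carry 4K 120.

def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s for a given total timing."""
    pixel_clock = h_total * v_total * refresh_hz  # pixels per second
    return pixel_clock * bits_per_pixel / 1e9

# CTA-861 4K timing: 4400 x 2250 total pixels including blanking
hdmi_4k60 = data_rate_gbps(4400, 2250, 60)    # ~14.26 Gbit/s
hdmi_4k120 = data_rate_gbps(4400, 2250, 120)  # ~28.5 Gbit/s

HDMI_2_0_PAYLOAD = 14.4   # Gbit/s usable after 8b/10b TMDS coding (18 Gbit/s raw)
DP_1_4_PAYLOAD = 25.92    # Gbit/s usable over 4 lanes of HBR3

print(f"4K60  RGB 8-bit needs {hdmi_4k60:.2f} Gbit/s, "
      f"fits HDMI 2.0: {hdmi_4k60 <= HDMI_2_0_PAYLOAD}")
print(f"4K120 RGB 8-bit needs {hdmi_4k120:.2f} Gbit/s, "
      f"fits HDMI 2.0: {hdmi_4k120 <= HDMI_2_0_PAYLOAD}")
```

4K 60 RGB 8-bit just squeezes under HDMI 2.0's payload, while 4K 120 needs roughly twice that; DP 1.4's ~25.9 Gbit/s combined with reduced-blanking timings is what makes 4K 120 work on the DisplayPort side.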
