NotebookTalk

Clamibot

Member
  • Posts: 455
  • Joined
  • Last visited
  • Days Won: 4

Clamibot last won the day on December 3, 2024

Clamibot had the most liked content!

4 Followers

  • Member Title
    Slayer Of BGA Garbage, Protector Of LGA Goodness

Recent Profile Visitors

1,848 profile views

Clamibot's Achievements

Proficient (10/14)

Recent Badges
  • Conversation Starter
  • One Year In
  • Very Popular (Rare)
  • Collaborator
  • One Month Later

Reputation: 1k

  1. It makes a big difference now if you have Lossless Scaling. You can use one GPU for rendering the game and a second GPU for frame gen. It's awesome, and it's the best $7 I've ever spent on any software. We're bringing back the days of SLI and Crossfire with this nifty little program!
  2. Yup, I can attest to this. The DDR4 kit in my older desktop (with the super bin 10900K from bro Fox) can do 4400 MHz CL 15 with default XMP subtimings as long as I have my Noctua IPPC fans blowing on the sticks at max RPM (3000 RPM). If I use lesser fans, the sticks aren't stable even with the normal 4000 MHz XMP profile because the RAM can't be kept cool enough to avoid erroring out. No wonder the previous owner waterblocked these sticks. The 4000 MHz CL 14 kit cost double what this one did at the time I bought it, so I went with this kit instead and am still satisfied with having the second best. It looks like it can definitely be pushed further with better cooling. I typically run 4200 MHz as a daily driver speed since that's the best compromise between performance and stability (barring better cooling). A 5% overclock over XMP isn't bad given it was a brute force approach and I'm not experienced with RAM overclocking.
  3. I'd buy one of these if I hadn't already bought a 14900KF from you last November. That thing is awesome. Bro Fox is an awesome seller. You can buy with confidence from him; you will get exactly what you paid for, and he'll make sure of it. As a result, I scavenge used hardware from him like a vulture whenever something is available.
  4. Now that's a real laptop! We need Clevo to make something like that.
  5. I'd have to argue against your assertion that the P870's second GPU slot is not of much use nowadays, as dual GPU configs are having quite the revival right now due to Lossless Scaling being able to utilize 2 GPUs to render any game. The caveat is that this pseudo SLI/Crossfire revival is only amongst technical users, not the general public. Additionally, there are productivity workflows that benefit from multi GPU configs, such as lightmap baking, scientific computing, and video rendering, just to list a few. Admittedly, these tasks are better performed on multi GPU desktops, but a multi GPU laptop is a boon for people like me who want or need all our compute power everywhere we go. For the general public though, your statement does in fact hold true. I'm making an argument on a great degree of technicality here, and it really only applies to a niche set of users. I don't think the general public has ever really been interested in multi GPU configs due to the cost. However, the P870 is definitely a laptop oriented towards more technical users, so I think the multi GPU capability is still very much useful. I believe a dual RTX 3080 MXM laptop would have a little more GPU compute than, or at least similar compute to, the current top dog RTX 5090 laptop. Having said that, I like the X170 a lot as well. It has the best speakers I've ever heard in a laptop and VERY good CPU side cooling capabilities. I'd really like multi GPU laptops to make a return to the market. They're awesome. Apparently it may even be possible to get a 40 series core working on a 30 series MXM board according to Khenglish, so it'll be interesting to see where that goes. Dual RTX 4090 P870? That would be a sight to see.
  6. I love blower fan cards! Although they're louder than axial fans, the noise doesn't bother me as it's not a high pitched whine. Blower fan cards also tend to be the cheaper models (yay!), and they exhaust heat directly out of my case instead of dumping it inside. That's the main reason I prefer them. Also, blower cards are only 2 slots wide, which makes it easier for me to build out multi GPU configurations.
  7. Unfortunately, Doom: The Dark Ages is artificially demanding. It forces ray tracing, and you can't turn it off. If you ask me, it doesn't look any better than Doom Eternal while costing significantly more performance to render. This is a significant regression in their game design as it doesn't make sense to use ray tracing for static environments. Baked shadows would've been much better. The thing is, as a game developer myself, I actually have insight into this kind of stuff unlike regular players, so I know what's really needed and what isn't. I understand the team behind the game wanted to focus resources on making more levels, and that's great, but one of the selling points of Doom games was how easy they were to run despite how good they looked. Optimization went down the toilet with this release. Having said that, I still want to play the game as I'm a big fan of Doom, but now I need more powerful hardware again to reach my framerate target. It doesn't seem necessary for the game to cost so much performance to render given what they achieved with the previous 2 installments. Hopefully we get a mod that allows us to turn all the raytracing crap off. Seriously, it doesn't take that much effort to bake shadows on levels. It's an automated process, and you just have to let it run its course. Games run much faster with baked shadows (there's a toy sketch of the bake-once vs recompute-every-frame cost difference below these posts). Rasterization will always yield superior performance, which is one of the most important things to consider with realtime interactive software like video games. If you have a static environment, use baked shadows. Raytraced shadows only make sense for dynamic environments where you can destroy stuff in the game world. In regard to video game graphics in general, we're at a point of seriously diminishing returns. We need improvements in gameplay a lot more than raytracing pushed into every game.
  8. I saw the posts in that thread. I don't think these cards will work in the X170SM-G, but I'd like to be proven wrong. Looks like we'll need custom heatsinks to start with as the X170 models use the custom bigger Clevo form factor MXM modules rather than the classic MXM type A or type B modules.
  9. How does the Cryofuze compare to Phobya Nanogrease Extreme? Are they about the same? Phobya Nanogrease Extreme is the only paste I've been able to use in laptops without it pumping out.
  10. I went back to read your previous posts and had a thought. The Quadro P5200 is the slave card and the GTX 485M is the master card, correct? If that's the case, no wonder your benchmarks aren't as high as expected. You're running your benchmarks purely on the Quadro P5200, correct? There is a performance and time cost associated with transferring information across the PCIe bus between 2 GPUs. If the master card is tasked with an operation, then sends data to the slave card to perform another operation on that data, and the data is then read back to the master card, that will result in a significant performance decrease due to latency penalties. However, in your case, the slave card does the work directly, right? There will still be a performance decrease in this case since there's still the readback time cost, but at least it only goes one way this time (slave card to master card). Since the slave card doesn't output directly to the screen, the work performed by it has to be routed through the master card and passed to the screen. There is a performance cost associated with this as both GPUs have to use their encoder and decoder units to process this transfer of data. The confusing bit to me though is the massive performance deficit you're getting versus what you would be getting if the Quadro P5200 was connected directly to your laptop's screen. I would expect somewhere between a 10-20% performance decrease from having to pass the output through another graphics card, depending on how good the encoder/decoder units on both cards are (some rough back-of-the-envelope numbers are in the sketch below these posts). However, you're getting around a 40% performance deficit, so something is definitely wrong. Perhaps the encoder/decoder units on the GTX 485M can't keep up with the throughput of the Quadro P5200 and that stalls the rendering pipeline? The Quadro P5200 is much more powerful than the GTX 485M, so I have a feeling the GTX 485M is a bottleneck. Have you tried any other cards as your primary display output card? Typically with dual GPU configs, you want both cards to have the same amount of processing power, or close to it, as one card will bottleneck the other if the disparity between the cards' capabilities becomes significant. What is the most powerful card that will work in the master slot of your laptop? Were you able to identify any particular reason the Quadro P5200 would not work in the master slot?
  11. @SuperMG3 Hey man, sorry for the late reply. I know you've been trying to get input from me for a bit as I've been tagged multiple times across your posts in multiple threads. I kept forgetting to respond. So to solve your performance issues with the Quadro P5200, my first suggestion would be to check your power supply. When I upgraded my Alienware 17 from a GTX 860M to a GTX 1060, I ended up having to get a 240 watt power supply, else the card would never kick into its highest performance mode (its P0 state). It would stay stuck at its P2 state (medium performance) no matter what I tried when I had my stock 180 watt power supply connected to my laptop. If you already have a 330 watt power supply, that should be enough, but using dual 330 watt power supplies or one of those Eurocom 780 watt power supplies wouldn't hurt. We can help diagnose your issue by checking the performance mode the card is going into using Nvidia Inspector (not Nvidia Profile Inspector!). You should see a section in the program that says P-State, along with the performance state readout of the GPU (there's also a small command line monitoring sketch below these posts if you'd rather log it that way). From what I remember diagnosing my GTX 1060, there were 3 possible states: P8 (low power state), P2 (medium performance state), and P0 (maximum performance state). Perhaps your Quadro P5200 is getting stuck in the P2 state for some reason like my GTX 1060 was, thereby limiting the maximum power draw. You can try overclocking the P2 state from Nvidia Inspector to see if that nets you any gains. You should be able to force the P0 state using this guide: https://www.xbitlabs.com/force-gpu-p0-state/ If you cannot force the P0 state and the GPU insists on staying in its P2 state, either the card is power starved or something else is wrong. If you've determined the power supply isn't the issue, then we'll have to do some further investigation.
  12. This looks a lot like that Rev-9 laptop that came out a while back: https://www.notebookcheck.net/Massive-T1000-mobile-PC-supports-AMD-Ryzen-9-9950X3D-RTX-5090-and-other-desktop-CPUs-and-GPUs.977868.0.html So maybe my Slabtop dreams will be realized sometime soon since these niche true desktop replacements keep popping up?
  13. So, interesting development today. I bought my mom a new laptop over the weekend since her old one was dying (it had a good run though, as it's 10 years old), and decided to do some tuning and benchmarking with it while setting it up for her. The Ryzen 9 HX 370 inside it is a powerhouse of a CPU when you max out the power limits. It benches higher than my 14900K, both in single core and multicore (specifically in Cinebench R15, which is the only version of that program I use for benching)! I also got rid of the Windows 11 installation on it in favor of my trusty Windows X-Lite edition of Windows 10, which I really like for absolute maximum performance. In short, after doing some tuning to maximize performance, the system is extremely snappy, and my mom is loving it. She snaps her fingers and the laptop has already done what she wanted. I also tuned the speakers to give a sound quality boost. All in all, the Asus Vivobook S 16 is actually a pretty good laptop for general users (but bleh, BGA🤣). The integrated graphics in this thing are pretty powerful too, I mean you can actually game on this thing! Even though the laptop isn't for me, since I bought it, I might as well have some benching and overclocking adventures while I'm setting the thing up. 🤣
However, this isn't even the most interesting part. This laptop has a really nice, glossy OLED screen, so I decided to do a side by side comparison with the screen installed in my Clevo X170SM-G. The verdict? Holy crap, my X170's screen is almost as good as an OLED. I did not realize just how close to OLED level quality it was, which I was not expecting. So basically, rip off the stupid matte antiglare layer, increase color saturation a bit (from 50% to 70%), and now your IPS display looks like an OLED screen (yes, I modded my X170's screen by removing the matte anti glare layer, so it's a glossy IPS screen now).
The OLED screen on the Vivobook was kind of underwhelming when I tested it out some more. I mean, it's a super sharp 3.2K screen, but it suffers from black smearing? What? I thought OLEDs were supposed to have near instantaneous response times! It doesn't look like that's the case though, as this OLED display gave me flashbacks to my Dell S3422DWG VA panel, which has very heavy black smearing that I absolutely hate. The black smearing on the Vivobook's display isn't as bad, but it's still there, and my X170's IPS display has no black smearing whatsoever. If anything, my X170's IPS display feels much more responsive than the OLED display in the Vivobook. Granted, my X170's display is a 300 Hz display vs the 120 Hz display in the Vivobook, but OLED is supposed to have sub millisecond response times. It doesn't look that way to me at all, as sub millisecond response times should mean no black smearing.
So I guess OLEDs aren't the juggernaut the hype is making them out to be. Couple that with the expiration date on OLEDs, and I no longer want one. I'll just go with glossy IPS, thank you very much. IPS seems superior in every metric except image quality, and even there it can almost match an OLED if the IPS display is glossy and you tune your color saturation, so that's good enough for me. Just goes to show, don't fall for the hype on any technology. Always do your own comparisons and testing, because sometimes the reviewers are just flat out wrong, just like how people keep saying there is no performance difference between Windows 11 and Windows 10. Uhh... yeah there is. I did my own benchmarking and get 20% higher framerates on Windows 10, so I call BS.
I'm now calling BS on the OLED hype too. I'm glad I did not buy one, and I no longer plan to buy an OLED display for my desktop. I'll just have to find an IPS display that has the matte anti glare layer glued on top of the polarizer layer instead of infused into the polarizer, so I don't destroy the screen when removing the matte layer. Either that, or I'll have to find another way to glossify my Asus XG309CM monitor.
  14. Yeah, the guide I wrote is more for maxing out gaming performance than benchmarking. It'll be really useful when I eventually get a 480 Hz monitor. I really like how these new ultra high refresh rate monitors look so lifelike in terms of motion clarity. One of my good buddies has joked on multiple occasions that Icarus keeps flying higher (referring to me) whenever I get a new even higher refresh rate monitor.
  15. I don't have any SLI capable motherboards, but I can offer you an alternative if you can't get SLI working. You can use Lossless Scaling instead to achieve pseudo SLI with much better scaling in the worst case. I posted some instructions on how to set this up a while back in this thread. You can even use a heterogeneous GPU setup for this and it works great!
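
A quick illustration for post 7 above: a toy Python sketch, not engine code, showing why baking wins for static geometry. The grid, light direction, and shading math below are made-up stand-ins for a real shadow/visibility pass; the point is only that the expensive work happens once at build time and runtime just samples the result.

import time
import numpy as np

# Toy "level": a 256x256 grid of surface points lit by one static light.
# Everything here is illustrative, not taken from any real engine.
GRID = 256
rng = np.random.default_rng(0)
heights = rng.random((GRID, GRID))            # stand-in for static geometry
light_dir = np.array([0.5, 0.5, 0.707])       # fixed light, never moves

def compute_lighting():
    # Stand-in for an expensive lighting/visibility computation.
    gx, gy = np.gradient(heights)
    normals = np.dstack([-gx, -gy, np.ones_like(heights)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    return np.clip(normals @ light_dir, 0.0, 1.0)

# Baked approach: pay the cost once at build time, then just sample the map.
t0 = time.perf_counter()
lightmap = compute_lighting()
bake_cost = time.perf_counter() - t0

# "Real-time" approach: recompute every frame because nothing is cached.
frames = 100
t0 = time.perf_counter()
for _ in range(frames):
    _ = compute_lighting()
per_frame_cost = (time.perf_counter() - t0) / frames

print(f"bake once: {bake_cost * 1000:.2f} ms, then sampling is basically free")
print(f"recompute per frame: {per_frame_cost * 1000:.2f} ms, paid every single frame")

A dynamic scene genuinely has to redo that work every frame, which is the cost a forced ray tracing path pays even when the environment never changes.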
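
For the readback discussion in post 10, here is a back-of-the-envelope estimate of the copy cost alone when the slave GPU's frames are routed through the master card. The frame size, usable PCIe bandwidth, and frame rates are assumptions for illustration, not measurements from that laptop.

# Rough readback cost of shipping the slave GPU's finished frames to the
# master card for display. All numbers are illustrative assumptions.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4   # one 8-bit RGBA frame
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL   # about 8.3 MB per frame

# Assume the slave slot ends up with ~6 GB/s of usable PCIe bandwidth.
usable_bw = 6e9                                  # bytes per second
copy_ms = frame_bytes / usable_bw * 1000         # one-way copy time in ms

for fps in (60, 120, 240):
    frame_budget_ms = 1000 / fps
    overhead = copy_ms / frame_budget_ms * 100
    print(f"{fps:>3} fps: {copy_ms:.2f} ms copy vs {frame_budget_ms:.2f} ms budget "
          f"-> ~{overhead:.0f}% of the frame time before any encode/decode work")

Under these assumptions the raw copy sits in roughly the 10-30% range depending on frame rate, so a 40% deficit suggests something else, like the GTX 485M's encode/decode path or a sync stall, is piling on top of it.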
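
And for the P-state check in post 11, a small sketch that logs the same readout from the command line instead of Nvidia Inspector. It assumes an NVIDIA driver with nvidia-smi on the PATH; the sample output in the comment is only an example.

import subprocess
import time

# Poll the GPU's performance state, power draw, and graphics clock once per
# second while a game or benchmark is running.
# P0 = maximum performance, P2 = reduced performance, P8 = idle.
QUERY = ["nvidia-smi",
         "--query-gpu=pstate,power.draw,clocks.current.graphics",
         "--format=csv,noheader"]

for _ in range(10):                   # watch for ~10 seconds under load
    result = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    print(result.stdout.strip())      # e.g. "P2, 95.40 W, 1544 MHz"
    time.sleep(1)

If it never leaves P2 while a game is running, that points at the power starvation scenario described in the post.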