NotebookTalk

Everything posted by electrosoft

  1. P870TM1 w/ Prema was incredible, but I was running mine with a 1060 (like a boss) and when WoW Shadowlands dropped my frames became completely unplayable (30-40s) even with details dropped to 7/10 when previously I was playing at 10/10. Instead of upgrading the GPU I upgraded to an X170SM-G with a 2080 Super and Prema. In the end, I think the P870TM (and the P870 series in general) will end up being my favorite Clevo series of all time unless Clevo suddenly gets their act together.

     For specific games, unless you are talking a meaningful jump (i.e. 3200 -> 3600+), it is better to focus on tight timings to even out 1% and 0.1% lows and increase overall fps. Bandwidth has its limitations inside such a small window. On the other hand, I would most likely take 3600 16-16-16 over 3200 14-14-14, but not 3333 or potentially 3466. For that tighter window of bandwidth, timings > bandwidth.

     I used to swear by AIDA64 until its results had no meaningful bearing on my actual gaming experience and ran counter to the real-world numbers I collected for laptop gaming, so I started to question why. I still use AIDA64 for tweak-by-tweak comparisons, then check it against something like TS and WoW to differentiate what is actually giving me improvements. If you are able to achieve the same (or close enough) latency at 3466 as at 3200 and your in-game results reflect it, you obviously should take 3466, but extensive testing on WoW flight runs has shown time and time again that, for my needs, timings trump bandwidth inside the narrow laptop window, which is limited by PCB quality and trace issues. Bandwidth becomes a distant second in most laptop scenarios when the speeds are within ~300-400MHz of each other and (most importantly) your BIOS gives you access to fine-tune timings. If your .exes favor raw bandwidth, favor that, but for most gaming (and especially WoW) tight timings are king, especially on bandwidth-limited laptops. I took the time on my P870TM1 when I was doing those flight runs to tweak fps and lows and figure out that tighter timings at 3200 > looser timings at 3466. Of course, soon after, a couple of YT videos came out showing tight timings are incredibly important to a point (i.e. 3200 14-14-14 won't trump 3600+ 16-16-16). Hardware Numb3rs - YouTube

     On a desktop these rules basically fly out the window because desktops aren't limited in peak bandwidth the way laptops are; I wouldn't even run anything below 3600 and would push for 4000+ with DDR4 along with tight timings. Again, AIDA64 is not the be-all and end-all. Set some benchmarks, capture fps along with lows, test various memory settings, and see what works for you.
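A quick back-of-the-envelope check on the timings-vs-bandwidth point above (my own sketch, not from the original post): first-word latency in nanoseconds is roughly CL x 2000 / transfer rate (MT/s), which is why 3200 CL14 can feel snappier than 3466 CL16 despite the lower bandwidth.

```python
# Rough first-word latency for the kits discussed above.
# first-word latency (ns) ~= CAS latency * 2000 / transfer rate (MT/s)
kits = {
    "DDR4-3200 CL14": (3200, 14),
    "DDR4-3466 CL16": (3466, 16),
    "DDR4-3600 CL16": (3600, 16),
}

for name, (mts, cl) in kits.items():
    print(f"{name}: ~{cl * 2000 / mts:.2f} ns first-word latency")
```

That works out to roughly 8.75 ns for 3200 CL14, 9.23 ns for 3466 CL16, and 8.89 ns for 3600 CL16, which lines up with the flight-run results above: the looser 3466 kit is actually the slowest of the three on raw latency.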
  2. It was already running nice in the X170SM-G, but this will just bring it down a notch. I wanted to get back in there anyhow since the Kingpin TIM was running a little off, and I put Nanogrease Extreme back on. While I was in there I was like, "eh, let me throw it on the desktop bench to see how it compares with the re-lidded stock-lid LTX," and the LTX relid dusted it in temps, so I decided to just delid it; in the end it pulled ahead, relid vs relid, by ~8W and 2C with an avg of 65.7 vs 64.2. I'll pop it back in my X170SM-G in an hour or so. If money was no object, I'd pick up the frame and a Z690 Classified. Not sure why at that price they didn't include the EVGA CLC 360mm with the new-style fans to match the ones on the KPE 3090ti. But yeah, the KPE 3090ti and the E1 are all super limited, so it transcends being just a "3090ti" on other levels and isn't subject to normal market pricing, hence the KPE 3090ti is already sold out for individual sales. The rest of the cards? Normal market conditions apply. I don't know how the cards can drop any lower since Jayztwocents just told us a week or so ago they weren't going to drop any lower and we need to BUY NOW! Clearly the market is wrong.... 🙄
  3. It did the same with my LTX too when I re-delidded it and gave it a complete shine, cleaning, and relid before selling. Put the same CPUs in an Asus Z590? Scores were right. On this MSI, one run nets 16.5k; the next run nets 15.5k. Zero consistency score-wise but identical pull and thermals. I'm focused on the drop in temps and pull to give my X170SM-G even more breathing room. Even in it, scores are a proper 16.5k. I've had this MB since the 11th gen launch last year, and through numerous 10th gen chips it randomly returned the same 15k-15.5k-ish scores. 11th gen worked fine. I am glad for the pretty substantial temp and watt drops. That's the magic. 🙂 Here is the LTX after a proper relid with the stock IHS (same as the SL relid), same conditions and environment, with a 10-min CB23 run:
  4. EVGA E1 custom builds/frames finally announced (3-4 week delivery time). $1600 for the frame/case itself, $3700 with a KPE 3090ti, and $5k for a barebones build-out where you just add your own RAM, storage, CPU.... whew. https://www.evga.com/products/E1-bare-bones/
  5. Finally got around to delidding and relidding my Silicon Lottery 10900k in my X170SM-G. Test parameters:
     - MSI Z590-A Pro motherboard
     - EVGA 360mm CLC
     - Gelid Extreme, center dollop
     - Fans on max (set from BIOS)
     - Fixed ambient temp of ~70F
     Before delid:
     After delid:
     I'm sure the X170SM-G will appreciate a touch of breathing room.....
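For anyone wanting to reduce the before/after screenshots to a single number, here is a minimal sketch of the comparison, assuming you log package temp and package power at a fixed interval during each 10-minute CB23 run. The sample lists are placeholders, not my actual readings.

```python
# Minimal before/after delid comparison under identical conditions
# (same cooler, paste, fan profile, ambient, 10-min CB23 load).
# The lists below are placeholder values -- substitute your own logged
# per-sample package temps (C) and package power (W).
before_temps, before_power = [84, 85, 85, 84], [250, 252, 251, 250]
after_temps,  after_power  = [74, 75, 74, 74], [247, 248, 247, 248]

def avg(xs):
    return sum(xs) / len(xs)

print(f"Temp:  {avg(before_temps):.1f} C -> {avg(after_temps):.1f} C "
      f"(-{avg(before_temps) - avg(after_temps):.1f} C)")
print(f"Power: {avg(before_power):.1f} W -> {avg(after_power):.1f} W "
      f"(-{avg(before_power) - avg(after_power):.1f} W)")
```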
  6. Do they offer true 32" 10-bit panels? That is priority #1 for me, followed by 4K and >90% color gamut coverage across all three (Adobe RGB, P3, sRGB), with viewing angles last. Then comes HDR600+. Then comes >=120Hz and <=1ms response time.
  7. I've been using a BenQ 32" 4K panel the last two years and I genuinely love it. Full 10-bit 4K with excellent color reproduction, which is always my primary criterion over all else. I decided to give myself a full gaming experience and picked up a ViewSonic 32" Elite 1440p 165Hz Quantum Dot IPS G-Sync gaming display. I figured I could handle stepping down to 1440p. I used it for 3-4 days and boxed it back up for return for a few reasons: desktop 1440p looks seriously chunky after having run 4K for almost 2 years, but more importantly the 8-bit panel is a no-go. The banding was evident in many areas, and its HDR600 was not as good as the HDR on the BenQ with only HDR400. I will say the actual gameplay itself was a bit more fluid and smoother, especially in CP2077 and FO76. A deal maker? No, but enough that I would be willing to take another crack at it as long as my primary criteria aren't compromised. I won't do VA. I had tried a Samsung U59 a few years ago and the viewing angles (along with 8-bit banding) made it a non-starter and back it went. The other thing is that this BenQ, being G-Sync compatible, helps with some of the limitations of a 60Hz panel, but with its ~5ms response time and 60Hz limit it obviously can't compete with a real gaming display. So I'm looking for a true 32" IPS 10-bit 4K display with >90% Adobe and P3 coverage, >=120Hz, low latency, and HDR600+. I've got my eye on the MSI MPG321UR-QD Xbox Edition as it seems to fit all the criteria I need while not breaking the bank, but I'll continue to look for other options too.
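As a rough illustration of why the 8-bit panel's banding stood out against the 10-bit BenQ (my own arithmetic, not from the post): compare the per-channel step count and the brightness jump per step across a full 0-100% gradient.

```python
# Levels per channel and the brightness jump per step on a 0-100% gradient.
for bits in (8, 10):
    levels = 2 ** bits
    step = 100 / (levels - 1)
    print(f"{bits}-bit: {levels} levels/channel, ~{step:.3f}% per step")
```

An 8-bit panel has to cover the same gradient in 256 steps of ~0.39% each versus 1024 steps of ~0.10% on a true 10-bit panel, which is why smooth gradients visibly band without dithering.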
  8. Depends on what your end-game goals are, but since laptops are usually limited for D2D use to well below 3600, the idea is to run the tightest timings possible at >=3000MHz. I'll take 3200 14-14-14 over 3466 16-16-16 every time, regardless of what AIDA64 spits back at me.
  9. I could take my P870TM1 to 3466 stable; 3600 would intermittently boot but was never stable. The X170SM-G is sitting at 3200 14-14-14, as >3200 was a wash last time I worked on memory, but I do have a different CPU in there now, so I may go back and take another crack at it now that everything else is resolved and tuned.
  10. The AliExpress heatsink takes care of the physical mounting. You could check with Eurocom on the vBIOS/BIOS side. There's always Dsanke to further enhance compatibility. Good luck! Looking forward to the results!
  11. Once I get ahold of at least three chips, I start drawing correlations between pull and temps. More chips equal better data due to the increased sample size. A leaky chip set to the same frequency will draw higher current and produce more heat at the same load, and may require higher Vcore. For me, leaky means higher power consumption at the same frequency and load setting, which tends to also correlate to higher temps. For example, take two 10900K chips set to 5.3GHz: one of them draws ~264W and the other 330W+ under the exact same conditions and settings. That is an extremely leaky and hungry chip which is also a furnace. The byproduct in a thermally constrained setting (like a laptop) is an instant disaster. For overclocking, that leakage does sometimes let those chips handle higher voltage to push even further (as long as you have the cooling), but for our purposes leaky chips are not a good thing, as you want less leakage, less draw, and subsequently less voltage and heat, to maximize dissipation via the smaller heatsink as much as possible. When I bin on the desktop I always monitor voltage, package power, and heat under a fixed load (CB23 in this case) to draw baselines, and after 3+ chips you usually can start to differentiate which chips run hotter, which are hungrier, and which are both (usually bottom-of-the-barrel chips for targeted purposes). You've tested enough 12900K/KS chips on your laptop now to see one of them float to the top. On a desktop board you would have much more control to differentiate between them more accurately, but in the end the ultimate test is setting like conditions with your laptop, testing them, and seeing which one(s) are clearly better and give you more thermal headroom to boost higher.
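A minimal sketch of the binning bookkeeping described above: fix the frequency and the load (e.g. a 10-minute CB23 run), record package power and temperature per chip, and rank. The chip labels and temperature readings here are hypothetical placeholders; the wattages are borrowed from the 10900K example in the post.

```python
# Rank chips by package power at a fixed clock and fixed load; lower power
# at the same frequency generally means less leakage and more thermal
# headroom in a constrained chassis like a laptop.
chips = {
    "Chip A": {"watts": 264, "temp_c": 78},   # placeholder temp
    "Chip B": {"watts": 330, "temp_c": 94},   # placeholder temp
    "Chip C": {"watts": 285, "temp_c": 84},   # placeholder temp
}

for name, r in sorted(chips.items(), key=lambda kv: kv[1]["watts"]):
    print(f"{name}: {r['watts']} W, {r['temp_c']} C at fixed clock/load")
```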
  12. Apparently the KPE 3090ti was a very limited run of ~500 cards. Vince said supply line issues were a major problem. Makes sense why they sold out in less than 10 days, and makes sense now why brand new KPE 3090s have shown up (and continue to show up) on eBay (the KPE 3090 just finished its 1.5yr run and is finally gone from the EVGA website) but zero KPE 3090tis. Before the install Sunday. I ended up replacing the 2x new-style EVGA fans with 6x AC P12 TSEs; temps went from ~51.5C to 47.5C with the P12s, ambient temp ~70F. GPU idle temps ~23.5-24.0C. I do like the look and style of the minimal-RGB EVGA fans though; I may switch back in the future for aesthetics. Same conditions and run on my returned FTW3 3090ti: ~65C. Playing WoW, the fans stay at the lowest setting. With the FTW3 3090ti they spun up nice and audible. The P2 1600W smelled like dead fish for a day or so fresh out of the box. Not pleasant. The thing is built like a tank with a solid metal shroud. I hope EVGA continues this build quality of the FTW3 and KPE 3090ti for the 4000 series. @Mr. Fox as predicted, the Mrs. absconded with my Strix 3080
  13. Nice! Hopefully it will pop right in and be better from the jump, especially being delidded and relidded with a copper IHS in place. Here's some data from the 12900KS I sold to a buddy. It was an SP94 (P102 E79) and it was clearly better than my 12900K on the high end, about a bin better overall on an AIO, but when I was going for lowest pull and temps it bottomed out at ~169W and 1.119V under load (10-min CB23 test run) with a -0.050V undervolt. For a single run it bottomed out at 164W with a -0.075V undervolt but crashed 2-3 runs into CB23. If it had been better on the low end I would have held onto it, but it's in his loop now, so all's well that ends well.
  14. Well this listing lasted all of 30 mins.... *** SOLD ***
  15. Price: $$SOLD$$
      Condition: Used
      Warranty: Bulk of MFG warranty
      Reason for sale: Picked up a 3090ti and wife wanted my 3080 to replace her 3070
      Payment: PayPal (non-cc), F2F, trade
      Item location: Southern NJ
      Shipping: FedEx Ground, UPS Ground, USPS Parcel Post
      International shipping: To known former NBR users
      Handling time: 3 days
      Feedback: https://www.ebay.com/fdbk/feedback_profile/electrosoft
      Specification: Recently acquired a KPE 3090ti to replace my 3080 and wife promptly took possession of the 3080 to replace this card. Used only for World of Warcraft (the only game she plays). Comes complete in box.
  16. Price: $$SOLD$$
      Condition: Used
      Warranty: Whatever remains
      Reason for sale: Too many laptops (look at the sig). Trying to get down to two max.
      Payment: PayPal (non-cc), F2F, trade
      Item location: Southern NJ
      Shipping: FedEx Ground, UPS Ground, USPS Parcel Select
      International shipping: To known former NBR users
      Handling time: 3 days
      Feedback: https://www.ebay.com/fdbk/feedback_profile/electrosoft
      Specification: The smallest and coolest running of my laptops. Excellent shape and less than 10hrs of use. Factory reset.
      5700U
      8GB LPDDR4 4266
      512GB SSD
      FHD IPS
      Proof of ownership:
  17. Alright! Now we're talkin'! 🙂 You'll get more out of the @Mr. Fox 10900k on the desktop than in the laptop, that's for sure. Both the LTX and SL run like monsters on the desktop with EVGA CLC 360mm AIOs. And the icing on the cake is getting great pricing on everything. It warms the fiscally responsible soul just right.
  18. Ditto. This is their first real foray into desktop-level graphics. I fully expect a lot of growing pains and learning curves. The volume sales are always in the low to low-mid end anyhow. Now, if they're still lagging by next gen? Well......
  19. Nice, what are the specs of the desktop build-out? Luckily the CPU and cooling are literally "right there" when you pop the bottom off, so you can try to strap it on top somehow for some extra zip.
  20. Well, I think we chopped it up about this a little bit ago, but it comes down to where you place yourself in the upgrade cycle. 🙂 I'm just going to ride the end of each cycle from now on versus the beginning. Same window, same upgrade experience, just a different point at which you upgrade. Taking the best of each generation is my new approach. It seems you are opting to upgrade at the beginning of each new generation. That works too! With the leaps and bounds they have made, I definitely have my eye on the 7000 series (both) from AMD too.