NotebookTalk

Everything posted by electrosoft

  1. It was already running nicely in the X170SM-G, but this will bring it down another notch. I wanted to get back in there anyhow, since the Kingpin TIM was running a little off, and put Nanogrease Extreme back on. While I was in there I figured, "eh, let me throw it on the desktop bench to see how it compares with the re-lidded stock-lid LTX," and the LTX relid dusted it in temps, so I decided to just delid it. In the end it pulled ahead, relid vs. relid, by ~8W and 2C, with an avg of 65.7 vs 64.2. I'll pop it back in my X170SM-G in an hour or so.

     If money were no object, I'd pick up the frame and a Z690 Classified. Not sure why, at that price, they didn't include the EVGA CLC 360mm with the new-style fans to match the ones on the KPE 3090ti. But yeah, the KPE 3090ti and the E1 are both super limited, so it transcends being just a "3090ti" and isn't subject to normal market pricing; hence the KPE 3090ti is already sold out for individual sales. The rest of the cards? Normal market conditions apply.

     I don't know how the cards can drop any lower, since Jayztwocents told us just a week or so ago that they weren't going to drop any lower and we need to BUY NOW! Clearly the market is wrong.... 🙄
  2. It did the same with my LTX when I re-delidded it and gave it a complete shine, cleaning, and relid before selling. Put the same CPUs in an Asus Z590? Scores were right. On this MSI, one run nets 16.5k; the next run nets 15.5k. Zero consistency score-wise, but identical pull and thermals. I'm focused on the drop in temps and pull to give my X170SM-G even more breathing room. In it, even, scores are a proper 16.5k. I've had this MB since the 11th gen launch last year, and through numerous 10th gen chips I've gotten the same random 15k-15.5k-ish scores; 11th gen worked fine. I am glad for the pretty substantial temp and watt drops. That's the magic. 🙂

     Here is the LTX after a proper relid with the stock IHS (same as the SL relid), same conditions and environment, with a 10min CB23 run
  3. EVGA E1 custom builds/frames finally announced (3-4 week delivery time). $1600 for the frame/case itself. $3700 w/ a KPE 3090ti. $5k for a barebones build-out; you just add your own RAM, storage, CPU....whew. https://www.evga.com/products/E1-bare-bones/
  4. Finally got around to delidding and relidding my Silicon Lottery 10900k in my X170SM-G.

     Test parameters:
       • MSI Z590-A Pro motherboard
       • EVGA 360mm CLC
       • Gelid Extreme, center dollop
       • Fans on max (set from BIOS)
       • Fixed ambient temp of ~70F

     Before delid:

     After delid:

     I'm sure the X170SM-G will appreciate a touch of breathing room.....
  5. Do they offer true 32" 10-bit panels? That is priority #1 for me, followed by 4k and >90% color gamut coverage across all three (Adobe RGB, P3, sRGB), along with viewing angles last. Then comes HDR600+. Then comes >=120Hz, <=1ms response time.
  6. I've been using a BenQ 32" 4k panel the last two years and I genuinely love it. Full 10-bit 4k with excellent color reproduction, which is always my primary criterion over all else. I decided to give myself a full gaming experience and picked up a Viewsonic 32" Elite 1440p 165Hz Quantum Dot IPS G-Sync gaming display. I figured I could handle stepping down to 1440p. I used it for 3-4 days and boxed it back up for return, for a few reasons: desktop 1440p looks seriously chunky after having run 4k for almost 2 years, but more importantly the 8-bit panel is a no-go. The banding was evident in many areas, and its HDR600 was not as good as the HDR on the BenQ with only HDR400.

     I will say the actual gameplay itself was a bit more fluid and smoother, especially in CP2077 and FO76. A deal maker? No, but enough that I would be willing to take another crack at it as long as my primary criteria aren't compromised. I won't do VA. I tried a Samsung U59 a few years ago and the viewing angles (along with 8-bit banding) made it a non-starter, and back it went. The other thing is that this BenQ being G-Sync compatible helps with some of the limitations of a 60Hz panel, but it obviously can't compete with a real gaming display given its ~5ms response time and 60Hz limitation.

     So I'm looking for a true 32" IPS 10-bit 4k display with >90% Adobe RGB and P3, >=120Hz, low latency, and HDR600+. I've got my eye on the MSI MPG321UR-QD Xbox Edition, as it seems to fit all the criteria I need while not breaking the bank, but I'll continue to look for other options too.
  7. Depends on what your end-game goals are, but since laptops are usually limited for D2D use well below 3600, the idea is to run the tightest timings possible at >=3000MHz. I'll take 3200 14-14-14 over 3466 16-16-16 every time, regardless of what Aida64 spits back at me.
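The 3200 CL14 vs 3466 CL16 preference above can be sanity-checked with first-word latency math (true latency in ns = CAS latency × 2000 / data rate in MT/s). A minimal sketch, using the kit numbers mentioned above:

```python
# First-word latency comparison for two DDR4 kits.
# true latency (ns) = CAS * 2000 / data rate (MT/s)

def true_latency_ns(cas: int, data_rate_mts: int) -> float:
    """Return the first-word latency in nanoseconds for a DDR kit."""
    return cas * 2000 / data_rate_mts

kit_3200_c14 = true_latency_ns(14, 3200)  # 8.75 ns
kit_3466_c16 = true_latency_ns(16, 3466)  # ~9.23 ns

print(f"3200 CL14: {kit_3200_c14:.2f} ns")
print(f"3466 CL16: {kit_3466_c16:.2f} ns")
```

3200 CL14 works out to 8.75 ns against roughly 9.23 ns for 3466 CL16, so the tighter kit wins on latency despite the lower clock.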
  8. I could take my P870TM1 to 3466 stable; 3600 would intermittently boot, never stable. The X170SM-G is sitting at 3200 14-14-14, as >3200 was a wash last time I worked on memory, but I do have a different CPU in there now, so I may go back and take another crack at it now that everything else is resolved and tuned.
  9. The Aliexpress heatsink takes care of the physical mounting. You could check with Eurocom vBIOS/BIOS-wise. There's always Dsanke to further enhance compatibility. Good luck! Looking forward to the results!
  10. Once I get ahold of at least three chips, I start drawing correlations between pull and temps. More chips equal better data due to the increased sample size. A leaky chip set to the same frequency will draw higher current and produce more heat at the same load, and may require higher Vcore. For me, leaky means higher power consumption at the same frequency and load setting, which tends to also correlate with higher temps.

      For example, take two 10900k chips set to 5.3GHz. One of them draws ~264W and the other 330W+ under the exact same conditions and settings. That is an extremely leaky and hungry chip which is also a furnace. In a thermally constrained setting (like a laptop), the byproduct is an instant disaster. For overclocking, that leakage does sometimes let those chips handle higher voltage to push even further (as long as you have the cooling), but for our purposes leaky chips are not a good thing: you want less leakage, less draw, and subsequently less voltage and heat, to maximize dissipation via the smaller heatsink as much as possible.

      When I bin on the desktop, I always monitor voltage, package power, and heat under a fixed load (CB23 in this case) to draw baselines, and after 3+ chips you can usually start to differentiate which chips run hotter, which are hungrier, and which are both (usually bottom-of-the-barrel chips for our targeted purposes). You've tested enough 12900k/ks chips on your laptop now to see one of them float to the top. On a desktop board you would have much more control to differentiate between them more accurately, but in the end the ultimate test is setting up like conditions on your laptop, testing them, and seeing which one(s) are clearly better and give you more thermal headroom to boost higher.
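The binning bookkeeping described above can be sketched as a small script: log package power and temperature for each chip under the same fixed load (e.g. a 10-minute CB23 run), then rank the samples. All chip labels, wattages, and temperatures here are hypothetical placeholders, with two samples mirroring the ~264W vs 330W+ example:

```python
# Minimal chip-binning sketch: rank CPU samples tested under identical
# fixed-load conditions. Lower power draw at the same frequency/load wins
# for a thermally constrained build; temperature is the tiebreaker.
from dataclasses import dataclass

@dataclass
class BinResult:
    chip: str        # hypothetical label for the sample
    watts: float     # average package power under fixed load
    temp_c: float    # average package temperature

samples = [
    BinResult("10900K #1", 264.0, 82.0),
    BinResult("10900K #2", 330.0, 95.0),  # the "leaky furnace" case
    BinResult("10900K #3", 280.0, 86.0),
]

# Sort by (watts, temp): least draw first, cooler chip breaks ties.
ranked = sorted(samples, key=lambda s: (s.watts, s.temp_c))
best = ranked[0]
print(f"Best laptop candidate: {best.chip} ({best.watts:.0f} W, {best.temp_c:.0f} C)")
```

Sorting on the (watts, temp) tuple encodes the "less draw, less heat" rule from the post; for desktop overclocking with strong cooling you might invert the preference and tolerate the leaky chip.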
  11. Apparently the KPE 3090ti was a very limited run of ~500 cards. Vince said supply line issues were a major problem. Makes sense why they sold out in less than 10 days, and why brand new KPE 3090's have shown up (and continue to show up) on eBay (the KPE 3090 just finished its 1.5yr run and is finally gone from the EVGA website) but zero KPE 3090ti's.

      Before the install Sunday. I ended up replacing the 2x new-style EVGA fans with 6x AC P12s; TSE temps went from ~51.5c to 47.5c with the P12s at ~70F ambient. GPU idle temps are ~23.5-24.0c. I do like the look and style of the minimal-RGB EVGA fans though; I may switch back in the future for aesthetics. The same conditions and run on my returned FTW3 3090ti gave ~65c. Playing WoW, the fans stay at the lowest setting; with the FTW3 3090ti they spun up nice and audible.

      The P2 1600w smelled like dead fish for a day or so fresh out of the box. Not pleasant. Thing is built like a tank with a solid metal shroud. I hope EVGA continues this build quality of the FTW3 and KPE 3090ti for the 4000 series.

      @Mr. Fox as predicted, the Mrs. absconded with my Strix 3080 🤨
  12. Nice! Hopefully it will pop right in and be better from the jump, especially being delidded and relidded with a copper IHS in place. Here's some data from the 12900ks I sold to a buddy. It was an SP94 (P102/E79) and it was clearly better than my 12900k on the high end, about a bin better on an AIO overall, but when I was going for lowest pull and temps it bottomed out at ~169W and 1.119v under load (CB23 10min test run) with a 0.050v undervolt. For a single run it bottomed out at 164W with a 0.075v undervolt, but crashed 2-3 runs into CB23. If it had been better on the low end I would have held onto it, but it's in his loop now, so all's well that ends well.
  13. Well this listing lasted all of 30 mins.... *** SOLD ***
  14. Price: $$SOLD$$
      Condition: Used
      Warranty: Bulk of MFG warranty
      Reason for sale: Picked up a 3090ti and wife wanted my 3080 to replace her 3070
      Payment: PayPal (non-cc), F2F, trade
      Item location: Southern NJ
      Shipping: FedEx Ground, UPS Ground, USPS Parcel Post
      International shipping: To known former NBR users
      Handling time: 3 days
      Feedback: https://www.ebay.com/fdbk/feedback_profile/electrosoft

      Specification: Recently acquired a KPE 3090ti to replace my 3080, and wife promptly took possession of the 3080 to replace this card. Used only for World of Warcraft (the only game she plays). Comes complete in box.
  15. Price: $$SOLD$$
      Condition: Used
      Warranty: Whatever remains
      Reason for sale: Too many laptops (look at the sig). Trying to get down to two max.
      Payment: PayPal (non-cc), F2F, trade
      Item location: Southern NJ
      Shipping: FedEx Ground, UPS Ground, USPS Parcel Select
      International shipping: To known former NBR users
      Handling time: 3 days
      Feedback: https://www.ebay.com/fdbk/feedback_profile/electrosoft

      Specification: The smallest and coolest-running of my laptops. Excellent shape and less than 10hrs of use. Factory reset.
        • 5700u
        • 8GB LPDDR4 4266
        • 512GB SSD
        • FHD IPS

      Proof of ownership:
  16. Alright! Now we're talkin! 🙂 You'll get more out of the @Mr. Fox 10900k on the desktop than the laptop that's for sure. Both LTX and SL run like monsters on the desktop with EVGA CLC 360mm AIOs. And the icing on the cake is getting great pricing on everything. It just warms the fiscally responsible soul just right.
  17. Ditto. This is their first real foray into desktop-level graphics. I fully expect a lot of growing pains and learning curves. The volume sales are always in the low to low-mid end anyhow. Now, if they're still lagging by next gen? Well......
  18. Nice, what are the specs of the desktop build-out? Luckily the CPU and cooling are literally "right there" when you pop the bottom off, so you can try to strap it on top somehow for some extra zip.
  19. Well, I think we chopped it up about this a little bit ago, but it comes down to where you place yourself in the upgrade cycle. 🙂 I'm just going to ride the end of each cycle from now on versus the beginning. Same window, same upgrade experience, just a different point at which you upgrade. My new approach is to take the best of each generation. It seems you are opting to upgrade at the beginning of each new generation. That works too! With the leaps and bounds they have made, I definitely have my eye on the 7000 series (both) from AMD too.
  20. It depends. How much more can Nvidia extract with ballooning TDPs? Will we get the same meaningful upgrade as from 2000 to 3000 or will we get 1000 to 2000 which wasn't nearly as awe inspiring? Then there's AMD pushing both Nvidia and Intel to keep it real and put forth their best silicon. On the other hand, I am more than happy with a ~66% uplift. That 4090 will effectively bury every 3000 series card 6 feet under. I'll wait for the 4090ti fire breathing dragon to upgrade next time. https://videocardz.com/newz/full-nvidia-ada-ad102-gpu-reportedly-twice-as-fast-as-rtx-3090-in-game-control-at-4k
  21. Nowhere did I think the 4090 was going to deliver 2x the performance of the 3090ti, but 66% is a pretty hefty uplift. I'm curious to see the maturation of RT equally, if not more than, the rasterization uplift. And no way do I listen to Jensen or Lisa with their presentation numbers; when reviewers and users get ahold of their products, as always, performance is less than advertised. Someone doing napkin math places the 4070 below the 3080 (especially if it has the reported crippled bus). Pre-launch speculation is always so much fun! 🙂