NotebookTalk

jaybee83

Member
  • Posts: 2,343
  • Days Won: 54

Posts posted by jaybee83

  1. 8 hours ago, 1610ftw said:

    Biggest current generation COMBINED power draw for CPU and GPU is 250W - kind of puts things in perspective 😄

    quite the depressing perspective if you ask me 😅 if this trend continues, our forum will be more about reminiscing over "the good ol' days" and keeping classic, EOL DTRs alive than about discussing current tech (aside from bitchin' about it, of course lol)

    • Sad 2
  2. 3 hours ago, i.bakar said:

    well this guy did a part 2 test 

     

     

    and in terms of performance this shatters every aspect we've always talked about !!

     

    well, naturally, drivers will stop being optimized for older OSes, plus newer hardware wouldn't be supported anymore. i think the aspect we mostly despise is the increased telemetry and junk running in the background, causing additional OS bloat without actually being actively utilized by the user.

    • Thumb Up 2
  3. 2 hours ago, FredSRichardson said:

     

    My understanding is that a laptop with an eGPU does not need another monitor.  I also think the real limitation here ends up being the laptop's internal bus throughput (when compared to a desktop with the same GPU).  I imagine performance varies quite a lot with this approach depending on the laptop and GPU type...

     

    I'd definitely support a future in which we can buy really nice "lapdocks" with whatever level of keyboard and monitor we want.  Then plug that into, say, a monster desktop at home, a cell phone on the road, or a portable desktop while travelling, all via USB-C.  I'm not sure that future will ever happen though...

    yep, no need for an additional monitor, and yep again, the main limitation for egpu performance is bus bandwidth and latency. add to that the sometimes anemic nature of the mobile cpu in question, which cannot keep up with the monster desktop cards...

    • Thumb Up 1
  4. 5 hours ago, Nigh on Noon said:

    Hi all, I was a lurker on NBR for its last few months. It was a great source of information for Dell Precision laptops. It's been fun reading the threads about the new Alder Lake systems.

     

    On another note, is there any interest in an owners thread for the MSI GS66? I just received one.

     

    Cheers.

    yessssssss, let's get all those NBR refugees united again under the new NBT roof. glad you found us, enjoy the new fledgling forum! we can use all the help we can get 🙂

    • Thumb Up 2
  5. On 7/30/2022 at 7:31 PM, Papusan said:

     

    Got the new "old" baby. All well. With documentation, parts and package in good shape.

     

    All have a nice weekend. Yesterday was the 27th wedding anniversary for me and my lovely wife. And we finally got good weather here at home, exactly the same nice weather as on our wedding day in 1995.

     

    About time for posting new pics of my old used crocks to my friend @electrosoft 😆 Yep, I know you love my many pictures, LOOL

    G5WxRca.jpg?2

    5cm0T5R.jpg

    H5kOQE1.jpg

     

     

    https://hwbot.org/submission/5051031_papusan_3dmark_vantage___performance_geforce_gtx_980_ti_97321_marks

    2736627.jpg

     

    https://hwbot.org/submission/5050997_papusan_3dmark___cloud_gate_geforce_gtx_980_ti_78720_marks

    2736609.jpg

     

    https://hwbot.org/submission/5050959_papusan_3dmark06_geforce_gtx_980_ti_75734_marks

    2736582.jpg

     

    https://hwbot.org/submission/5050970_papusan_3dmark05_geforce_gtx_980_ti_99333_marks

    2736589.jpg

    dude, are those 18(!) fans i'm counting?! LOL what kinda case is that? here i thought i was the only one nuts enough to go with 14 fans (17 if the gpu ends up watercooled, or if you count the three gpu shroud fans when aircooled) in the build i'm planning 😂😁👌🏼

    • Thumb Up 3
  6. On 7/30/2022 at 7:34 PM, Ishatix said:

    I've never used Nvidia Inspector, but since I have no way of utilising the integrated graphics on my laptop with a 980m, I was wondering if there are many power-saving benefits that could be extracted from it by undervolting or limiting clocks? Or does the GPU firmware already do a decent job with power efficiency when under lighter loads? I can't really play games with my current health, so not really making much use of it. And my electricity prices have recently gone up by a factor of 3x, so need to save every Watt I can 👾 

    i'm actually using nvidia inspector to generate a batch file that i automatically run at windows startup. the batch file forces the 980M to the lowest possible voltages and clocks no matter the load (completely fine for desktop usage) and thus severely limits its power usage.

    i manually tested what kinda clocks i can get away with; if u go too low, u start seeing graphical glitches and/or stutters.
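
    for anyone curious, here's a minimal sketch of what such a startup batch file can look like. assumptions on my end: nvidiaInspector.exe sits next to the script, the 980M is GPU index 0, and P8 is its lowest performance state; the commented-out clock offset value is just a placeholder you'd have to dial in yourself.

    @echo off
    rem hypothetical startup script: pin the dGPU to its lowest power state
    rem (example values only - go too low and you get glitches/stutter)
    cd /d "%~dp0"

    rem force GPU 0 into P8 (lowest performance state), regardless of load
    nvidiaInspector.exe -forcepstate:0,8

    rem optional: pull clocks down further with a negative offset (untested placeholder)
    rem nvidiaInspector.exe -setBaseClockOffset:0,0,-135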

    • Thumb Up 1
    • Like 1
  7. 1 hour ago, Ishatix said:

    I was thinking of starting a thread on this with all the talk over BGA/socketed and upgradeability, so happy to find someone already did!

     

    I suppose that even if it's not socketed, being able to easily swap out the mainboard for a CPU upgrade is better than most laptops these days. I found a discussion on their forums about this point here:
    https://community.frame.work/t/put-the-cpu-on-a-different-pcb-than-the-rest-of-the-mainboard/20701

     

    Overall, I think it's an interesting approach but still not quite mature enough to be fully tempting yet. My main issue with it is that you only get four expansion card slots to work with at any one time, which seems far too limited to me considering they have to cover all of your interface ports, as well as storage expansion, and you need to dedicate one of those for the USB-C charging:
    https://frame.work/gb/en/marketplace/expansion-cards
     
    So I very much agree with what @Sandy Bridge says above. I might be interested if they ever do a beefier version with more expandability in the future.

    give them some time to expand, they're barely getting off the ground atm. i'm rooting for them and what they stand for, let's hope this is the start of something grand 🙂

    • Like 1
  8. 19 hours ago, RandomIdiot said:

    Which build would be closest to LTSC? As much as I'm loath to have anything to do with Win 11, morbid curiosity still lurks

    they actually have an LTSC build available, if u wanna check that out:

    https://phoenixliteos.com/19044-1566-LTSC-Plus/

     

    as for Win11, the closest to LTSC would be the latest Pro Plus build:

    https://phoenixliteos.com/22621-169Plus/

    • Thumb Up 1
    • Like 1
  9. On 7/30/2022 at 7:55 PM, win32asmguy said:

    Wow thanks for checking all of that data.

     

    It is a great chip for the NH55! I should delid it as there is thermal throttling.

    i absolutely insist on delidding that chip! 😃👌🏼

     

    btw, new upload from GN comparing the TG contact frame at 35 bucks vs. the Thermalright frame at 4.35 bucks from AliExpress *lulz*

    surprisingly, they perform statistically the same, even though the tolerances on the TR frame are way looser. however, installation needs much more finesse and skill to get it just right.

     

    • Thumb Up 1
    • Bump 1
  10. 23 hours ago, electrosoft said:

    Temps progression logs playing WoW aka Operation can I play this as quietly as possible?

     

     

    30-60 min play time in WoW results (my main criteria). Includes flight runs in Ardenweald, Bastion (hardest hitting zones in the entire game) and one raid run.

     

    When I first got the laptop = 92c on CPU / 76c GPU

     

    Cleaned everything up, used nanogrease extreme 30 min = 88c on CPU / 71c GPU

     

    Went back in and fixed a bent side on the GPU retention, re-delidded/relidded the original binned ZtecPC CPU = 85c on CPU / 65c on GPU (GPU ok now)

     

    Picked up an SL 10900k no delid or tampering 82c right out of the box. Switched screws for CPU retention.

     

    Picked up an LTX Golden 10900k already delidded. Relidded w/ BartX Nickel IHS = 83c

     

    Relidded LTX Golden 10900k with stock IHS temp = 77c. Clearly BartX did not like the X170SM but stock IHS is working better.

     

    Used a torque-controlled screwdriver to find the optimal torque without black screens, dropping LTX temps down to 73

     

    LTX delid temps clearly beating the SL even though the SL is the superior chip. Delidded and relidded the SL 10900k with the stock IHS. As soon as I started playing I knew the next level was hit because the fans were barely moving during my entire play sessions.

     

    Final top out temps = 68c. If I forced the fans to run as they were with the >70c temps it would have probably been a smidge lower:

     

     

    594354787_WoWShadowlandsnewdelidSL.PNG.7e7a005171ded18539efc3d8e5640321.PNG

     

     

    I think I've optimized this thing as much as I possibly can. Maybe get back in there and play around with the memory to see how much more I can optimize it. Also pick up a dual rank set of sticks to compare against Jarrod's report of memory gains.

     

    @Clamibot while using a torque-controlled screwdriver and doing a lot of trial and error, I did identify that on my X170SM-G it is the lower-left CPU screw that causes black screens when tension is too high. The single and right screws? I could tighten those just about as much as I wanted (within reason) and the unit still booted fine. I'm wondering if it is a combination of sensitivity and retention lever pressure on that side.

     

     

    impressive progress! just goes to show how much leeway the wide manufacturing tolerances of laptop cooling leave on the table...

    • Thumb Up 1
  11. 4 hours ago, RandomIdiot said:

    The drivers will work fine, it's just you have to side-load the control panel if you want it. If I can keep GPU selection functioning reasonably well on hybrid mode I'd prefer to keep it since the battery life is way better, and I'd be forever forgetting to switch the MUX myself

    the MUX provides more performance though, since routing the signal through the igpu as in hybrid mode robs u of significant fps in games...

    • Thumb Up 1
  12. 2 hours ago, electrosoft said:

     

    Asrock 6900xt and 6950xt down to 850 and 1000....

     

    I told my son-in-law to hold off and wait a bit because he's chomping at the bit to have me build him a desktop. If he plays his cards right, he will save a nice chunk and get a killer 6900xt + 5800X3D system that will last him for years, considering he's still playing on a 4810mq / 970m laptop and doing ok. His budget is right around 1500, and that pricing for a high-end system is becoming a reality. I have parts galore to help round it out for him so he can fit his budget too if needed.

     

     

     

     

     

     

    dude he's gonna be over the moon!

    • Thumb Up 2
  13. 18 hours ago, Papusan said:

    Acceptable doesn't mean anything if the new is worse than the old 🙂 New doesn't always mean better. Acceptable is a watered-down word. It can mean trash, if you know what I mean. Windows as an OS is going the wrong way!

    of course bro papu, i chose "acceptable" as the description of choice because "good" or "awesome" would've been way too positive a connotation 😂

  14. 16 hours ago, cylix said:

    @jaybee83

    Haha, yes, the g8 is great and so is the new Alienware AW3423DW, but i really like the 32:9 ratio and of course the real estate you get with 49 inches. So i got it from a guy who bought it 1.5 months ago but didn't use it because it was too big for his new setup (he has an audio recording studio) and he couldn't return it anymore. i paid 1000 Euros for it with an extra 3 years of insurance, which is a great deal considering the extra insurance alone is around 250 Euros and it covers everything possible, even theft. I put the 1011 firmware on it and the HDR is really improved now.

    The only problem is that i read everywhere that if i want the best HDR experience i need to install Win 11, since its HDR is miles ahead of Win 10's. And i hate W11 😑. I only did the calibration that rtings.com did. Looks alright so far 🙂

    And a photo with the beauty :), sadly i know that i have to say farewell to my Clevo, the 2080 is not enough anymore to get the full power out of the G9 Neo. Just waiting for the next gen GPUs and AMD CPUs to build a desktop.

    PS: need to buy an Ergotron arm because my desk is too small for those legs 🙂

    NEO.thumb.jpg.21a95cb5c0fd157ceae919c936e337ce.jpg

     

    duuuuude that looks gorgeous, it's soooooooooo wiiiiiiiiiiide 😂

    • Haha 1
  15. On 7/26/2022 at 11:34 PM, electrosoft said:

     

    I don't know how the cards can drop any lower since Jayztwocents just told us a week or so ago they weren't going to drop any lower and we need to BUY NOW! Clearly the market is wrong.... 🙄

     

     

    LOL, word! Jay is funny and knowledgeable in certain things, e.g. hands-on experience with watercooling, but sometimes he's not quite on the money imho. i'd rather trust Moore's Law Is Dead on this one, stating that the MAJORITY OF MINING GPUS HASN'T EVEN STARTED TO HIT THE MARKET YET!

    meaning: we're still at the BEGINNING of the gpu price crash LOL

     

    seriously guys, if this continues i might just grab a dirt-cheap 3090 ti as a "holdover" placeholder gpu until the 40 series is out, widely available, and all partners have released their lineups 😂 oh the irony! hahaha

    • Thumb Up 1
    • Haha 4
  16. 2 hours ago, Ishatix said:

    Meh... I rather agree with the sentiments expressed in this article:
    https://www.extremetech.com/computing/337005-researchers-found-an-unpatchable-security-flaw-in-apples-m1-and-you-probably-dont-need-to-care

     

    The amount of energy and drama expended seems out of proportion to the actual risks, and mainly only serves to keep IT professionals and tech writers employed as far as I can see. How many people do you know who have been impacted by any of these things? Over the entirety of the past decade, I know one person who got a minor virus, one person who was targeted by a social hack with no ill-results, and two people who have had their WhatsApp accounts hacked (and well, if you're still using WhatsApp, lol...). Social engineering is by far the most common form of attack these days, and renders most of the underlying technical issues completely moot.

    true, drama sells clicks n views. but still, better to be aware than be caught off guard.

    • Thumb Up 3
  17. 13 hours ago, i.bakar said:

    to be honest, before the release of windows 11 i was pretty optimistic that microsoft might come up with something different this time. once it was released i intentionally didn't watch any reviews of it, because i wanted to form a fair opinion without any outside influence, so i installed it last october for a few days. i found out that i can't drag things to the taskbar and i can't see what's inside folders, so i just went back to win10 and didn't bother. then back in june i saw a video about the win11 insider version that brought those options back, so i thought OK, they might finally have learned their lesson!!! so i upgraded to win11 and did that benchmark in R23, got the results, then installed win10 to compare, and "voilà", i believe you already saw the results in the thermal paste thread, so since that day i decided: no win11 anymore!

    bad performance in 3ds max, bad benchmark performance, and too many curvy edges! however, the laptop battery life is better on windows 11 than on win10

    win11 reminds me of win8! it's kind of the opposite of the latest version they had before that release, it sounds like an experiment that won't last long.

    MS stays true to its crappy/acceptable cycle 😂

     

    win95 - crappy

    win98 - acceptable

    winME - crappy

    XP/2000 - acceptable

    Vista - crappy

    win7 - acceptable

    win8/8.1 - crappy

    win10 - acceptable

    win11 - crappy

     

    so as you can see, we're right on schedule, in the middle of a crappy os cycle! 🤪

    • Thumb Up 1
    • Haha 3
  18. 23 hours ago, 6730b said:

    Their pre-release campaign & design leaks worked very well, created lots of buzz on all social nets and tech sites, the name suddenly popped up 'everywhere'.

    Then when it arrived, it looks like it's not deemed just an interesting gadget; the reviews are seemingly all positive, and it holds its ground vs many of the best of the midrange class. Example: https://www.gsmarena.com/nothing_phone_1-review-2453.php

    Am positively intrigued, something new & fresh among the otherwise stagnant designs of all the other brands.

    gsmarena_041.jpg
     

    hmmm.... i'm willing to be enlightened as to what exactly makes the nothing phone so special, aside from clever marketing and the hype train going choo choo 🤔 admittedly, the design is unusual and stands out in a crowd, but that's not enough for me to pull my wallet out....

    any special software functionality and/or update policy? accessories? support for custom roms?

    • Thumb Up 2
  19. On 7/25/2022 at 11:51 PM, cylix said:

    Just got a Samsung Odyssey G9 Neo, my dream monitor! Got lucky and got it at a very very good price. Installed the latest firmware and everything looks ok, no dead pixels and no problems. 49 inches FTW! 😀

    c'mon u tease, tell us the price 🤪

    the neo g9 has been my target monitor for quite a while now, until the neo g8 got released 😋 i decided to go for a more "standard" 16:9 ratio vs ultrawide, but man, the neo g9 is a beast to be reckoned with! which firmware are u running? also, what settings are u using? calibrated or not? give us da deetz!

  20. 17 hours ago, Tenoroon said:

    @jaybee83 Do these models support a Prema BIOS from a partner? I have a buddy who is looking into DTRs and I'm sure he would be interested in grabbing one of these if Prema has worked with a partner 🙂

    thanks for showing interest mate! the unicorn magic master and i are currently working on a mod for this machine, we'll have to figure out what kinda performance and functionality advantages we can achieve. based on how successful we are, there is indeed a tentative plan to release it to partners. "people" are being kept in the loop, but it's still early days.

    in any case, i'll keep the guys here updated on the progress 🙂

    speaking of which! we might've gotten rid of the bug that kept us from properly signing and flashing modded firmwares, the next test build is ready to be flashed 😁👌🏼 "unfortunately" *cough*, i still have a week of vacay here in croatia ahead of me and "Alfred" is waiting at home 😋

    • Thumb Up 1
    • Thanks 1
  21. 5 hours ago, electrosoft said:

    Finally got around to delidding and relidding my Silicon Lottery 10900k in my X170SM-G.

     

    Test parameters:

     

    MSI Z590-A Pro motherboard

    EVGA 360mm CLC

    Gelid Extreme center dollop

    Fans on max (set from BIOS)

    Fixed ambient temp of ~70c

     

    Before delid:

     

    1487669471_StockCB2310minStress.thumb.PNG.f4b3e4a938a4ad9f6e8e520d0a3e05ab.PNG

     

    After delid:

     

    597865625_DelidRelidCB2310minStress.thumb.PNG.1c3a6db8fdb3835a3cd94dc7637aa610.PNG

     

    I'm sure the X170SM-G will appreciate a touch of breathing room.....

     

     

    nice man, solid 10c shaved off! does that allow any extra multis?

     

    as for the price on those EVGA frames.... holy guacamole. i'm guessing a super limited run and excessive tooling costs to produce the parts for them. sorry, but i'd never even consider spending THAT kind of money on a case, damn!

    btw good news, 3090 ti pricing dropped by another 200€ here in europe, the crash continues mwahahaha! the cheapest available card now sits at 1399€ including tax and shipping. gotta love that 😁👌🏼

    • Thumb Up 3