NotebookTalk

All Activity


  1. Past hour
  2. btw, confirmed that both the Astral and the Suprim cards will sport the phase change thermal pads.
  3. Today
  4. Scummy people that do scummy things. Like this... https://www.malwarebytes.com/blog/news/2025/01/the-great-google-ads-heist-criminals-ransack-advertiser-accounts-via-fake-google-ads
  5. ASRock 8.02 BIOS (January 2025) with the Intel-enforced "limits" is still a flaming ball of sun fire to the face... and this is on a good-sample 14900KS. Stock 8.02 with Intel limits (notice the 253 W limits are in place), and after a little enforced tune-up for gaming at 5.9 GHz all-core:
  6. Hello all, hoping I can get some advice. I have a Precision 7760 i9 that originally came with a T1200. I purchased an RTX 3080 mobile 16 GB off of eBay, along with the heatsink that should go with the RTX card. I have put the card into the laptop, along with the new heatsink, yet when I go into BIOS the dGPU video controller shows blank, and the RTX card is not detected within Windows. When I run a diagnostic I get an error message: "Sensor: The (Video) reading (0c) is lower than expected." I am assuming this means the sensor for the GPU is not registering. The BIOS is on the most current version, and this is a fresh image of Windows 11 24H2. Am I missing a step in swapping out the GPU? To be safe I am ordering a replacement DGFF power cable off of eBay. I am contacting Dell since there is an issue with one of the USB ports on this laptop, and they will be replacing the motherboard; however, they will not touch the GPU. I am also concerned that if I mail the laptop in for repair, they'll keep the GPU and heatsink since they are not the original ones. They will be sending a tech to service the laptop.
  7. 1.100 V and +700 W. A 600 W cap and 33% more cores need really well-binned chips. Using the same old node as for the new and shiny doesn't work magic for power consumption. Somewhere there has to be a trade-off, and the closest I can see is lower max boost clocks. How much voltage Nvidia will allow for Blackwell chips... I don't know. With later revisions of the 4090 (FE) they decreased max voltage from 1.1 V down to 1.070 V. The 12VHPWR meltdown, cross-flashing firmware with custom nvflash software for FE cards, the China restrictions (US ban), and Intel's voltage failure (Vmin Instability Issue)... anything can pop up from a real software company like Nvidia. Even today Nvidia keeps the voltage cap for 900-series cards intact in newer drivers (no more voltage tuning allowed with the different AIB OC software for Maxwell). Nvidia can be pretty disgusting. Hmmm. Exactly like Intel. Always right before a new model is out. Is it a coincidence, @Mr. Fox? Incredible timing... Nvidia's January GPU driver update patches multiple vulnerabilities
  8. Hi, I am new to this and am planning to buy a Dell M6800 for 190 euro, and I have a few questions. I am planning to upgrade the GPU to a T1000 (it is from an HP ZBook 17 G5). Would that be possible? Does it really matter whether I have the eDP or the LVDS model? If I understand correctly, the internal screen would only work with the iGPU if I have the LVDS model?
  9. The X670E Gene is easy to get used to. Other than its mATX size handicap, it is a great AM5 motherboard. Probably the best one made to date. This makes me eager to snag an X870E Apex.
  10. Off to the office/home yesterday and a very good experience, except browsers again: no colored site ID/icon on tabs in Opera, so I'll have to search whether there is an option or else change browsers. Felt no need to use the cover/keyboard. Getting better acclimatised to "Android explorers"; it's good that I can jump directories in the address bar like in Windows and that menus remember the last folders used. I think the 590 g weight is near the limit for a comfortable one-handed experience; maybe 630-640 g would be my limit. So it was okay showing business stuff to other people in a kitchen without a usable table. The large screen is very convenient. Still have to set up my HP all-in-one printer to scan to the tablet and print from it. Haven't done much of that in recent months but want it set up.
  11. Is there any upcoming tech event which Dell could be waiting for before announcing the Dell Max Pro laptops?
  12. The big thing though is that the 4090 really doesn't need that much power in my experience; for gaming it's really efficient compared to my 3090 Kingpin. I agree two Gen 5 power cables and a 1,000-watt BIOS would be really nice and ideal. But 99% of the time my 4090 is barely using 300-450 watts when gaming at 4K. Right now I'm actually playing the Indiana Jones game, my 4090 is overclocked as high as it can go, and I'm sitting at 430 watts. Running the card stock with the power limit maxed, it uses 360 watts. I feel like my 666-watt power limit is more than enough for me; I can't max this out in any real-world gaming scenario. So we can probably run an overclocked 5090 comfortably and stay under 600 watts. And maybe the silicon will be even more efficient and use even less power. I think the best thing to do is put our 5090s on water cooling, and then power usage would go down a little more! And hopefully with a good sample, the clocks could be pushed pretty well and we'd never really need more than 600 watts anyway for 4K gaming. Nvidia is probably going to release a 5080 Ti which bridges the gap, being "almost 5090 performance" but with more extreme power delivery from AIBs. I'm not sure. I think I can fit more CUDA cores and more VRAM inside this 600-watt limit though. And my 4090 is a pretty terrible sample.
  13. I expect you get lower max boost clocks with more cores on the 5090. 33% more cores doesn't necessarily mean 33% faster when maxed OC'd. We'll see how much headroom the extra cores get within the max board power limits, the bin quality this gen, etc. Need to see results first. And there are no more HOF cards whose vBIOS you can grab for cross-flashing. We'll see. Not so sure Nvidia will go the same route once again (a 600 W BIOS like the 4090 FE's), depending on how confident they are about the new thin two-slot cooler design. Remember, those FE cards are also meant to be used in thin and slim PC cases. And the best Zotac card maxes out at 600 W, so... A measly 600 W power limit can in the worst case gimp your maxed (OC) boost clocks if you got an average chip.
  14. I definitely recommend the 4090, more so than a 5080. Of course I want to see reviews but… When I had my 3090 Kingpin it was overclocked and was about 25% faster than a stock 3090 FE in pretty much everything. But when I was gaming in 4K, I definitely felt the need for more power. So when the RTX 4000 series came out. I was like hmmmm. Now that I own a RTX 4090, it feels so fast I really question the need for a 5090 everyday as launch gets closer. True to word, no copium being smoked at all here 😆 the 4090 is just a BRUTE man! Literally can go run MSFS at 8K maxed out “No DLSS” with Frame Gen=On at 52fps avg and it’s just butter smooth with 1% lows also at 50-51fps. It’s the GPU that is going to hold up so much better than the 3090 did I think. Because the RTX3000 series really needed frame gen. So I highly recommend the 4090. I do want to see reviews of 5080 performance at 4K of course though. Will be curious to see how these GPU’s stack up. @Mr. Fox if I can’t get the 5090 models I want at a normal retail price, Suprim/Astral/FE/Maybe a TUF lol. I’m happy keeping my Gigabut 4090 Gaming OC. These cards are so fast that the 5090 which is literally a GODZILLA 512bit monster of a GPU, is only like 33% faster lol. If it doesn’t work out, I’m going to order a waterblock for my 4090 and complete the Lian Li V3000+ Plus and hard line tubing using the 4090. I wanted to do this with a 5090. But if I can’t grab one then the 4090 is fine. I’ve got me a bunch of PETG tubing, and (18) EKWB hard tube 16MM fittings ready to go. Oh and one thing cool, that new 5090 Astral uses a phase change thermal pad which is pretty interesting.
  15. It's probably difficult to get an accurate SP rating because of how much more erratic the behavior is with Ryzen CPUs. Temperature does not affect SP rating on Intel, but the prediction in the lower right corner is 100% tied to temperature. The higher the cooler rating is the lower the voltage is. If you allow the ASUS AI to automatically monitor and adjust the cooler rating it can become a mess. To stop that you need to manually override and assign your own cooler rating. Looks like 5090 prices will be for wealthy fools that like to waste lots of money on grossly overpriced enthusiast parts that suck at overclocking. I bet that 5080 will end up being sold for 4090 scalper prices as well. It rises to the level of extreme dishonesty and demonstrates a lack of integrity, if not outright immorality, in my opinion. This is like getting screwed on steroids. They are cordially invited to dine on my feces. Hopefully they are OK with using a paper plate and disposable plastic spoon. One thing that could help stop the scalping is for AMD and NVIDIA to re-write their warranty terms so that there is no warranty whatsoever if a new GPU is purchased from a seller or retailer that is not pre-approved by AMD and NVIDIA and on their list of authorized distributors. It could be sold used and still have warranty, but only if the original purchase invoice is available to prove it was purchased from an authorized seller when new. Then all of the scalpers would have nothing to sell but parts with no warranty. They would be worth less and nobody would want them. The scalper would have to provide the original invoice showing it was purchased from an authorized seller, which would then draw attention to how badly they are screwing the people dumb enough to purchase their products. It would be instant death to scalping. (Next best thing to instant death of the scalpers.) 
I am glad I have two 4090s and a 3090 Ti I can ride until they die because they are still great GPUs and those new prices are unacceptable. If AMD doesn't jump on the "let's screw our customers" bandwagon, I can see a lot of people shopping for GPUs being fiscally responsible and settling for AMD even if they dislike Radeon cards. Better to settle for something inferior that is priced to value than getting sodomized on the price of a GeFarts card. Yup... it's what I call nuckingfutz. Appropriate imagery in Paul's intro. LOL.
  16. I don't really game or anything, so technically I could just get nothing, LOL. But no fun in that. I may try to get another RTX Titan to do SLI if I can find one for cheap. An RTX 4090 FE would be nice if prices come down after the 5090/5080 release too.
  17. Welcome to the dark side my friend. It's absolutely ridiculous to have to pay that kind of money for a gpu. For that kind of money it should s*** gold bricks, and serve breakfast in bed.
  18. I wonder how much performance Nvidia holds back, aka robs, from older-gen cards to upsell the new and shiny. I don't think Nvidia has squeezed out every drop of performance (be it raw performance or software-based) before the next gen is out. NVIDIA DLSS Boss Doesn't Rule Out Possible Frame Generation Support for RTX 30 Series. NVIDIA has used a supercomputer 24/7 to train DLSS for 6 years straight without a break. From 1:12:40
  19. Yesterday
  20. I'm seriously starting to think I cannot accept a 15% markup (plus tax) on an already $2k card. I can't see paying $2,300 + tax = $2,452 for a video card again. I say again because I initially paid $2,585 for a KPE 3090 Ti, but it at least came with a 1600 Platinum PSU, AND I was able to get them to knock it back down to $2k + tax. I remember feeling near-instant regret within a week or so of having paid that much for a freakin' video card to bench and game on. I was very happy to sell it 3 months later and get back everything I put into it, and I'm still using that PSU to this day in my main rig. 🙂 Even an FE is going to be $2,132.50 out of my pocket after tax. Ugh. I'm going to have to mull this over seriously while I also see how the 5070 Super and 9070 XT perform. I already know a 7900 XTX without RT on is pretty righteous in its own right at the right price. My biggest fear is WoW running into issues again like it did before, dragging on for over 10 months and being basically unplayable.
  21. Hmmm. That will be the real price if the proposed tax takes effect for the US. But close to $800 USD extra for the Suprim over MSRP cards (FE), without tax, is bad enough. Gone are the days you paid $100 extra for the better custom cards. I'm sure EVGA could have done decent deals with Nvidia cards if they had continued with their business, given how prices have become for custom cards. Imagine $3,000+ 5090 K|ngp|n cards. That would be a reality. And don't forget the custom blocks from EVGA. Just add $500-750 on top.
  22. Be sure to subtract the respective tax (listed in the link) before converting to USD to get closer to what you could expect in the US. But yeah, I'll BARELY be able to do the list price with a small buffer on top, but not much more... ugh
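  The adjustment suggested above can be sketched like this: strip the VAT baked into a European list price, then convert to USD. The 19% VAT (Germany's rate) and the EUR/USD rate here are placeholder assumptions, not figures from the thread:

  ```python
  # Placeholder assumptions, not thread figures: German VAT and a rough FX rate.
  VAT_RATE = 0.19
  EUR_TO_USD = 1.05

  def eu_price_to_us_pretax_usd(eu_price_eur: float,
                                vat: float = VAT_RATE,
                                fx: float = EUR_TO_USD) -> float:
      """Remove the VAT included in an EU list price, then convert to USD."""
      pre_tax_eur = eu_price_eur / (1 + vat)  # EU prices are quoted VAT-inclusive
      return round(pre_tax_eur * fx, 2)

  print(eu_price_to_us_pretax_usd(2900))  # e.g. a hypothetical EUR 2,900 card
  ```

  Note the division by (1 + VAT) rather than multiplying by (1 - VAT), since EU sticker prices already include the tax.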
  23. I'm starting to sour a bit on the 5090 if these prices translate to USD in any meaningful capacity. Would if I could, but BB limits you to one per customer. 😞 ------- @Mr. Fox I've been seeing this more and more how on AMD, AM5 SP rating is dictated by cooling which would be a bummer. On X3D, I only really care about the IMC, but still want SP to be at least one valid metric: https://www.overclock.net/posts/29416394/
  24. Looks like I was right to still be worried, even with my considerable budget saved up: https://videocardz.com/newz/custom-geforce-rtx-5080-and-rtx-5090-pricing-emerges-made-for-gamers-with-deep-pockets @Papusan get both your kidneys ready, bud. As for us here in euro-land: holy guacamole, the MSI Suprim will actually be MORE expensive than the Astral?! O.o
  25. What MaxxD said. Or get PTM7950 sheets for maximum longevity.
  26. I am guessing more vram, specifically, but will need to run tests.