Posts posted by electrosoft
2 hours ago, tps3443 said:
Yes confirmed 100% the RTX 5090 FE can be shunted just like any other 5090 and it works. Waiting on my block still. Hoping EK is faster than the 6+ weeks I have been waiting already with Aquatuning. If I would have originally ordered with EK, I’d already have one.
I’m merely after a 17K Steel Nomad results that are stable for gaming with my ambient custom loop with no more than $2,350ish total (GPU+Block) which is great. This was all I wanted from the beginning. 😄
Happy at least that it can be shunted, so we can see what it can do shunted and blocked. Hopefully Nvidia doesn't figure out a way to shut that down on their FEs next go-around, when they'll only be $2,499.99 🤑
I'm so torn between wanting to see FE cards remain sealed and aesthetically pleasing, and wanting to see them torn down and blocked. 🤣
2 hours ago, Papusan said:
Those defective 6000 chips will go into China as their newer AI server chips, paired with 48GB+ of GDDR7. No point making a 5090 Ti if it doesn't cost north of $4500. And that's too much for 5090 Ti/Super-class cards.
Agreed. I've said numerous times earlier in this thread that there will be no 5090 Ti. I said it over on the OCN forums too.
2 hours ago, Papusan said:
The sad part is, if Nvidia simply slapped an "m" after them, this really wouldn't be a problem......
Common sense should tell anyone you can't cram a monster-sized desktop card into a relatively small laptop and expect it to run full tilt without creating a tear in the fabric of space-time.
1 hour ago, tps3443 said:
Yes, Asus, MSI, and PNY are chasing profit hard! The original 4090 Strix was $2,000, and now the Astral is $3,359. It does seem a little overkill. At least the Astral is the best 5090, unlike the Strix 4090, which was pretty lackluster. And Asus is pumping them out too; it seems like everyone is buying an Astral.
But Nvidia is no better lol. 😂 Think about the RTX PRO 6000. That GPU costs roughly the same to make as a 5090 FE, but they can wholesale crates of them for $5-6K each, so Nvidia is making far more money than the 5090 FE could ever dream of, and it's really the same cost to make one: same die, only a slightly different BIOS. I think that's why the 5090 FE is a rare bird; 5090 FE sales are like Nvidia's charity work on the side (we can't give it all away). It's much easier to get an RTX PRO 6000 at MSRP too. No one can scalp these 🤣 We can't exactly scalp an $8,000 GPU. Which is kinda smart, I suppose. Resellers are motivated to move these units in the $7-8K range, which makes me think there's lots of profit and wiggle room on them.
AIB’s need to cool it down though. They are following Nvidia foot steps too hard.
Original MSRP for the Astral: $2,799.
Original Windforce price: $1,999.99.
Original PNY price: $1,999.99.
Original Windforce OC price: $2,199.99.
Original PNY OC price: $2,199.99.
Now look at the stupid prices.....
6000 is a different breed depending on yields.
Full GB202 = 24,576
6000 = 24,064
5090 = 21,760
So the 6000 can only tolerate a ~2% defect rate before a die goes into the pile of defects for the "mid-range" 5090s, or, knowing Nvidia, maybe a 5090 Ti down the road with somewhere between 21,760 and 24,064 cores. The 5090 has a much more generous ~11.5% margin of error on defects, and I suspect much better yields within its parameters than the 6000. You're definitely paying for that much more golden, better-yield chip, that's for sure.
I am sure slowly, but surely, Nvidia is stockpiling those full fat perfect GB202's for some mega expensive card down the road.
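The margin arithmetic above is easy to sanity-check; here's a minimal sketch using the core counts quoted in this post (the helper name is just for illustration):

```python
# Defect-margin arithmetic for GB202 bins (core counts quoted above).
FULL_GB202 = 24576  # full-die CUDA cores
BINS = {"RTX 6000": 24064, "RTX 5090": 21760}

def defect_margin(enabled_cores, full_cores=FULL_GB202):
    """Fraction of the full die's cores that may be defective
    while the chip still qualifies for this bin."""
    return (full_cores - enabled_cores) / full_cores

for name, cores in BINS.items():
    print(f"{name}: {defect_margin(cores):.1%} of cores can be bad")
# RTX 6000: 2.1% of cores can be bad
# RTX 5090: 11.5% of cores can be bad
```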
And yes, they ALL are chasing profits hard. Don't let Nvidia fool you: there's a 25% greed increase hidden behind a decent profit margin on their 5090 FE. AIBs are given absolutely razor-thin margins, so their prices are higher (though you obviously get more in various ways with AIB cards), but the price hikes after that? Market conditions with a touch of greed.
The Astral is the best of the lot IMHO, followed by the Suprim, but for me it's just not worth $3,600. I had an absolutely flawless sample with zero problems that ran D2D OC'd, no problem, and I completely enjoyed those two months of ownership. But again, $3,600? That's insane to me. Whether $2K or $3.6K, the pricing has now reached stupid levels for a consumer-grade GPU; but I applaud and am happy for everyone who picked up the model that tickles their fancy, from top-end OC enthusiasts like @johnksss on down to you and the $2K FE. I'm loving all the posts and results and can't wait for you to block yours.
What is the status of shunting the 5090FE atm? Yay or nay? Working as intended or?
1 hour ago, Papusan said:
And I'm sure they won't reduce prices even if the GPU tariff won't be as high as some think.... And I'm not so sure the US government is so keen on making PCs, laptops, and important computer parts so expensive a few weeks before the school year starts in the United States. We all know 95%+ of needed tech is made in Asia. Even a 10% tariff will be felt very well when the school year opens.
Or maybe the US government wants you all to buy cheap Chromebooks/tablets for the school opening? Maybe go back to pen and paper, even in the higher-level schools? If that happens, you can be sure that education in the US will fall further behind countries like China. It's bad enough as it is. I doubt we will see a massive price hike due to stupidly high tariffs right before school starts up this summer.
But Nvidia and AIBs... they will do whatever they can to max out profits. And you don't need flagship graphics cards for schoolwork... So.
GPU Manufacturers Are Rushing NVIDIA GPU Production To Bring Them To The US Prior To The July 9
Well, companies are fond of leveraging desperate times to increase profit margins, and in the case of AIBs, they would likely wait for the July 9th deadline to pass before releasing market inventory. This would allow them to bump up prices in case a trade agreement isn't reached, and ultimately drive prices up.
See bolded text.
So corporations corporating as expected...... profits are their end game....business as usual.... 😞
But if 5090s are already starting to stay in stock and languish a bit, especially as scalpers wind down because scalping isn't worth it anymore, prices will hit a wall. All the "market conditions" in the world won't circumvent demand that either refuses to pay the price or has already been satisfied.
also....
See the bolded text below...
Now see the even more bolded text below...
🧨 MEGA BOLD ACTIVATE!!!!! 🧨
1 hour ago, Mr. Fox said:
If everyone would be smart and say eff ewe to companies selling $3400 5090s and $1200 9070 XTs, there would be no companies selling $3400 5090s and $1200 9070 XTs. They'd be selling them for what they are actually worth and making a reasonable profit in the process. They have priced them based on how much an idiot is willing to pay for them. Too many people have issues exercising self-control, restraint, and prudence. They are driven by their desires more than intelligence. The companies know this and are exploiting this flaw in their humanity.
We've seen how Nvidia responded to a downturn in market conditions with the 4080 -> 4080 Super. We've seen them bring their A game, with A-game pricing, with the 4090. Rumor now is they are readying the "Supers" to counter AMD's sudden surge in the mid-tier market.
While AI is their focus, not for a second do I think they're abandoning or not keeping an eye on their 4-5 BILLION dollar GPU gaming division.
In addition, those "launch price only" discounts from AMD are shady but the same supply and demand paradigm that applies to Nvidia also applies to AMD just sans the halo tier this time around.
You are right: self-control and restraint are the only answer. Hopefully, six months from now, when consumer demand has largely been satisfied and store inventories start to build up, we'll see how the market looks compared to now.
1 hour ago, tps3443 said:
I think FE pricing was similar to AIB pricing though.
4090FE= $1600
4090 Gaming Trio $1650
I don't know what happened to the AIB pricing scale. The MSI Gaming Trio was +$50 over the FE. And now the MSI Gaming Trio is +$1,350 over the FE.
^ This is all whacked out. I really wonder how they went from +$50 to +$1,350 more. That cannot be tariffs. I think it's BS.
I think AIB’s are really thirsty like Nvidia has been for years..
I know Nvidia is slinging the RTX PRO 6000 Blackwell 96GB GPU at insane profit levels. It's a 5090 die through and through, but it costs $8,200. That is a big profit for Nvidia. They even had an artificial limit on 5090 launch day, and that's where all the dies went (into the RTX PRO 6000 lol).
It depends on the model as always.
4090 = $1600
4090 Strix = $2000
So a 25% markup for Asus.
Of course they then went bonkers with the Astral, but even the original launch price of $2,799 was a 40% markup.
A lot of this has to do with Nvidia reducing margins even more and forcing AIBs to buy the entire GPU/mem package on top of everything.
At launch, there were several $1999 AIB models from Gigabyte and PNY and even the next tier up was $2200-2400.
IMHO, the inflation is more greed + supply/demand than tariffs but tariffs were a convenient cover to quietly raise prices even more.
Supply is really equalizing now. Prices are falling in other countries, and PNY has marked down their stupidly inflated 5090s at Microcenter from $3,300 and $3,500 down to $2,999. I expect card prices to slowly creep back down even in the face of "tariffs."
In other words, tariffs are real and valid, but why has every other computer-parts sector barely been affected by them? SSDs are still cheap and memory prices are falling. Motherboards remain about the same (i.e., still overpriced, but static), and CPU prices are either the same as last year or have fallen. Case prices have gone up, but nothing as insulting as the 5090 markups.
5090 pricing is a pure money grab from AIBs while demand is outlandish. I refuse to play the game anymore at this point. Even the 5090 FE is stupidly overpriced, and the other inflated pricing just gets more insulting from there. A lot of this factored into my returning that $3,600 Astral.
Remember, at launch I had three 5090s queued up at B&H, all $2,200 or below, along with queuing for a 5080 FE and 5090 FE. That quickly went bust.
2 hours ago, tps3443 said:
I like to think about the 4090 compared to the 4080. In my experience it was a substantial upgrade, a very nice boost in games. Everyone with or without a 4090 talked about it and thought the extra $400 was well spent going for the $1,599 4090. Now consider that the 5090 is also +$400 more than the 4090, but an even larger upgrade than that was.
I'm beginning to seriously question if anything is ever good enough for me or most people. When we all had 3090s and the 4090 came out, we complained because the 4090 was too damn good. It was so fast with frame gen that it made the 3090 instantly obsolete ("Nvidia is no friend to us gamers because our 3090s are obsolete now"). But now 4090 owners complain because that didn't happen again. I don't think we can ever truly be happy with high-end hardware, no matter the product they give us.
I really don't think any company or product can fit the bill for what we actually want and make everyone happy and sing praise lol. All we can do is waste our money, convince ourselves into buying the GPU, then sing our praises about the things that are true.
The jump from the 3090 to the 4090 was an anomaly overall. It was just massive. Expecting that again was unrealistic jumping from the 4090 to the 5090. The jump down the entire 4000 stack was pretty substantial but a lot of that was the AI market hadn't fully kicked off AND AMD had seriously threatened Nvidia with their 6000 series.
Now? AMD is zero competition at the top. I mean, not even in the same ballpark. The 9070 XT is a damn good card, but in the end its main competition is the 5070 Ti, with sporadic trips against the 5080. Pricing is a farce even with AMD: the launch "discounts" come and then go away. They never made their own reference card this cycle, so there are zero "real" pricing anchors, and the real price of the 9070 XT is $700+, with the cheapest model being the Steel Legend at $699.99.
Overall, with the 5090, we got a decent little bump over the 4090 and a savage 25% price increase at the least.
1 hour ago, Papusan said:
I wonder how bad the real 5050 will be when the 5060 is in reality a mediocre 5050-class graphics card. So a newly released 5050, aka an xx30-tier card, is for your PC monitor setup and not much more. Then why not instead go with a processor with an iGPU? The real 5050 will be a waste of sand and should never be made. Or a waste of money for the consumers. Put in whatever fits.
The Real Nvidia GPU Lineup: GeForce RTX 5060 is Actually a Mediocre 5050
See bolded text....
This affects other models as well. What used to cost $350 to $400 (inflation-adjusted) now costs $550, and you're not even getting as much as you used to. GPUs that once sold for $500 to $600 prior to 2020 are now $1,000. Everything has gone up by 40% to 50% on top of overall inflation.
So why is the GeForce lineup suffering from such significant shrinkflation? Why are consumers getting less value now than ever? An obvious answer is simple: new unlimited demand for AI GPUs and profit.
You can't reach into the proverbial ether and keep pushing old generation-to-generation uplift percentages as the baseline for classifying cards.....
In the end, are the 5000 variants more powerful than their 4000 variants? Yes.
Are some of them even cheaper than their previous counterparts? Yes again.
Do I like posing questions to answer them online? Once again, yes....
I wholeheartedly agree that we're not getting the same performance jump as before and that Nvidia's priorities are elsewhere, but assuming you're going to get the same generation-over-generation leap as before, then applying that model to reclassify the card tiers, is a digital strawman at its best.
10 minutes ago, Mr. Fox said:
I never think in terms of FE pricing because I have no interest in FE cards, and especially not the current design with the fragmented components connected by ribbon cables. That is totally unacceptable to me. So I do not think it is accurate to say the 5090 is $400 more than the 4090 when you look at the AIB options. It's also not a great comparison because nobody manufactures the 4090 anymore. I think if they did, and it was priced to value, the 5090 would not sell well, because it is not priced to value. Everything else in the Green Goblin's product stack is a piece of trash, but still grossly overpriced and a poor value.
NVIDIA has everything so stinking effed up with the GPU market right now that I don't believe it will ever be capable of returning to normal. They are anti-competitive, deceptive, and manipulative, and they sell products that reflect that, on top of being grossly overpriced or a poor value. Being better than the alternative is not good enough because of their disposition.
You are right to question whether anything is ever good enough. I think that is very much the case and that's where I am, especially with video cards. Nothing is ever good enough for PC enthusiasts and overclockers because of the degree of nonsense with control and manipulation. NVIDIA has created this problem by deliberately screwing us all and being bold about it.
To be clear, I do not like AMD. I don't like how they do things. Most of what they do is a little bit goofy and often not the best approach. However, they have become the lesser of evils. NVIDIA and Intel are capable of being better than what they have become and should be held accountable for bad decisions. AMD is the best they have ever been and seem to be putting forth the effort to do their best. Sometimes doing your best isn't enough, you have to do what is required. NVIDIA isn't doing their best or what is required. They are screwing everyone and are not even ashamed of themselves.
As much as I miss EVGA and wish they hadn't thrown in the towel, I think they were smart to do so. Nobody wins when NVIDIA is involved. They realized that before anyone else did.
I still think there's some value in the lower part of the Green Goblin's stack, as prices stayed the same or went down along with performance bumps, but no one is doing backflips over it. It's very ho-hum. I'm thinking of the average gamer: if you're holding onto a 2000 or even 3000 series card, you can get a very solid uplift in performance for equal to or less than last-gen pricing.
Everything else is spot on in your response and then some.
4 hours ago, Papusan said:
Most people buy prebuilt systems and keep them 4-5 years. Up to $2,000 for a whole system, with 8GB VRAM cards, in 2025 ain't nice. Rather ugly. Even $1,200-1,400 systems with this type of scam ain't pretty. But that's me. I don't like being scammed. I'd rather pay a bit more or buy used than support this form of tech degradation...
And this ain't pretty. Form over function....
Adjust settings, play the game....isn't that how we've always done it?
No one is being scammed unless they don't list the VRAM somewhere in the specs. If they aren't listing it at all, then sure. But if it is listed, isn't that consumer choice? Pick the card that fits your needs. Watching Daniel show the prebuilts, you can clearly see it shows the GPU has 8GB of VRAM.
My nephew uses a 6GB A380 and is loving it. My wife uses a 6GB 3050 in her laptop and is loving it. Both are running 1080p screens/panels. I remember all the doom and gloom when Hogwarts Legacy dropped, but I set up a 12500 + 3070 Ti 8GB system for my daughter and she played it at 4K and loved it. My brother is rocking a 5600X + 1650 4GB, and his wife's nephews rock out on Fortnite every chance they get, and guess what? Loving it....
More is better, obviously, but sheesh. We watch these YTers push settings that will cause the buffers to fill up and then cry wolf. Meanwhile, everybody else is just adjusting their settings, or letting the game pick the settings best suited for their card, and playing.....
I do agree with Daniel Owens though....
-----
As for the 5090 FE, I absolutely agree. Like I said, price, form factor, and aesthetics are clearly its strong points, but if you pop it in and just use it, it works fine. It runs warm, but it works fine. You won't really be competing in upper-level overclocking with it, the fans are never truly silent 24/7, and those fans will definitely go to work under load. But it is also $2K: at least $640 cheaper than the next 5090 available (at least here in the States), which is the Windforce, and $1,359 cheaper than the Astral. For me, that is a $1,450 difference after tax/cash back for both.
But you pay for what you want, and if you want the best, a card that isn't locked down, a larger selection of blocks, or something that can run on air and still OC and perform very well, the FE is not for you. If any of those criteria are important to you, run away from the FE as fast as you can.
-----------------------------------------------
Further anecdotal cost analysis for WoW:
Stock for stock, the 5090 is 77% faster than the 9070 XT in WoW (186fps vs 105fps) when not CPU-bound, but it costs over 3x as much even as the FE, or 5x as much as the Astral. The 5080 FE is 23% faster than the 9070 XT but 52% more expensive. Irony? All three suffer from GPU utilization drops with lots of player data at 4K, but the 9070 XT handles it better overall.
Overclocked, the Astral was ~96% faster than the overclocked 9070 XT (210fps vs 107fps) when unhindered by the CPU gasping for breath at 4K. That's just bonkers, in the best way......
5 hours ago, jaybee83 said:
I can see that 5090 budget burning a hole in your wallet hahaha.
I, for one, am done with hardware purchases for the time being. Back to tuning the installed hardware (need to finish the RAM, and I haven't even started on the CPU yet!) and also more gaming! I just finished Deliver Us Mars and discovered a third installment is already in the making (Deliver Us Home). Now I'm starting Doom Eternal + The Ancient Gods, first time I've tried it. Let's see how it compares to Doom 3 and Doom 2016 🙂
I mean I *DO* have options now lol.
I must reiterate: my room really doesn't turn into a sauna anymore, even with the 9070 XT running at full tilt. It's a world of difference between ~550-600W and the ~330W now pumping into my space, which is what WoW pulls when doing anything not player-heavy. Door open, ceiling fan going, and sporadic AC wasn't enough to overcome the 5090.
Even if I eventually pick up a 5090 FE, I'd still have almost $1,500 of the differential at my disposal to play with. But as each moment passes, I'm zeroing in on that $2,132.48 after-tax price tag more and more, while this $667 9070 XT (total cost of ownership) is holding up well enough in the here and now for Fallout 76 and especially WoW at 4K.
And I am the type who can pick a 5090FE up, look at it lovingly in my hands and go, "I know you are a mediocre card compared to most if not all the other 5090 AIBs out there for overclocking and air cooling, but gosh darn if ya aint the cheapest, smallest and prettiest of all them dang cards! Now get yer keister in this here SFF!" 🤣
2 hours ago, Papusan said:
Why the 5070? Maybe the worst SKU in Nvidia's lineup. I know you won't keep it, but supporting Nvidia this way ain't very productive.
Why? Because....
1. MSRP.
2. I like the look/form factor of the FE series.
3. Price:performance, it's actually one of the best (if not the best, now that the 9070/XT temporary launch pricing is a thing of the past), just like the 4070 was at retail pricing, which I pointed out back during Ada.
Of course, that caveat no longer holds, because the temporary introductory prices of the 9070 and 9070 XT no longer exist and their MSRP is now higher, while the 5070 FE has remained the same.
People seem to get stuck on the whole VRAM issue. I'm one of the few who thinks the whole 8GB fiasco is blown out of proportion and that an 8GB card is OK depending on your use case. I also think 10GB and 12GB are just fine. I watched the reviews of the 5060 and 5060 Ti and went, "Ya know, it's not that bad at all..." Same for the 9060 XT with 8GB of VRAM.
3 hours ago, Mr. Fox said:Yeah, I am seeing higher CPU benchmark scores and better memory performance from a number of 9950X3D owners (like 49K+ in CBR23) so I thought I would investigate. That hasn't been the case in the past, as the X3D CPUs are primarily intended for gaming, which obviously is not a major driving factor or something that motivates me.
Moving that 3D V-Cache underneath the CCD has made a major difference. Once they slap it under both CCDs and improve their interconnects even more, it will be near perfect.
Oh, and go ahead and offer a dual-binned variant and call it the "Extreme Edition" if you want, à la Intel, charge a touch more, and I'm there for it.
11 hours ago, tps3443 said:
Is the 5070 FE working well? I like FEs, always have, since owning the GTX 1080 FE / 2080 Super FE / 2080 Ti FE, etc. They give off that new premium vibe. Does the 5070 overclock really well? I saw a V/F curve for one of these from a guy on overclock.net, and they seem hand-picked to be low-voltage queens and low-power chips, unless they are just naturally like that with only a fraction of the CUDA cores intact. Maybe you can share its V/F curve. The one 5070 FE I saw ran about +1,000MHz more than my 5090 FE does at 0.800-0.875V. I'm guessing it's because fewer CUDA cores changes the curve, but it's still incredible.
I haven't had a chance to install it yet. I've been testing out the items I picked up from GTZ over on the OC forums. Fast and as advertised, as always. Just a good guy all around. I wish the tech forums had 1,000 of him.
I'm going to move that 13400 from him into this Gigabyte B760 Gaming AX board I have with DDR5-6400 and install that before I give the 5070 a whirl in a few days. I've just about finished testing my 9070 XT: +2fps max OC'd vs stock on mins and averages in Deus Ex, WoW, Fallout 76....yay? 🙂
10 hours ago, Mr. Fox said:
Mine is arriving on Tuesday.
Uh oh! Time to test one against your two other 9950x's?
I must admit, I'm still on the fence about ordering a 285K instead and building it out for comparison. I wish a decent chip (SP85+) would pop up for cheap to take the binning out of the process.
1 hour ago, Mr. Fox said:
Agree with pretty much everything you said. 👍
Yeah, these are really good memory modules and incredibly affordable. They were recommended by gupsterg at oc.net. I had never heard of the brand before, but I was watching a YouTube video earlier today where they were used. They look very good, too. There is a little design decal (triangle) on each corner on one side that I believe are covering screws that bolt the halves together, so the heatsinks should be easier to remove than most. The only thing I don't like is the "KingBank" name on the sides of them, but I would prefer that all brands would leave their branding off of them. I don't like that about any brand.
I have a tracking number for the mobo, so I should have it pretty soon.
I went ahead and ordered a pair. I'd like another set of 8000+ sticks to bounce off my TG 8200 sticks for testing and comparisons.
Since I returned the Astral, I have an immense tech budget at my disposal now for other toys like the 5070 FE, this memory and a few other goodies I'm thinking about. One being a 9950X3D to properly take advantage of these 8000+ kits.
21 hours ago, Mr. Fox said:
Update on the X870E AORUS Master with the loose NVMe heatsink damaging things. No replacement from Central Computer yet, but here is what I found out. Mine was not the first one. I encouraged them to open the box and examine the replacement. I told them I had no reason to think it wasn't like that before they shipped it to me. The box had no damage and did not appear to be handled roughly.
The reason I do not have a replacement is all of them (over 20) were unboxed and found to have similar problems. They're going to have fun dealing with Gigabyte on this. Apparently Gigabyte is not latching them securely when they leave the factory, or the latch is coming loose during shipment. I am actually surprised that this does not happen more often. The silly trend with the toolless NVMe heatsinks that all brands are implementing is an invitation for this kind of shipping disaster to happen. God forbid that a gamerboy would ever have to touch a screwdriver to remove a heatsink. That's way too much to ask of the point-and-click kiddos livin' la vida loca in 2025.
I spoke to their RMA rep today and they are going to send me an X870E-E Strix instead. I confirmed before agreeing to it that it DOES NOT have the same design defect as the Apex and putting my Sabrent Quad NVMe card in the bottom X16 slots won't castrate the GPU with X8 bandwidth. Should still work at X16 as long as I only install the one NVMe PCIe 5.0 in the slot above the GPU, ignore the other two and the two PCIe 4.0 chipset NVMe slots at the bottom of the mobo can be used with no effect. Two of the three PCIe 5.0 NVMe slots (those below the GPU slot) cannot be used, which is pretty much standard for all X870E thanks to AMD's obsession with mandating wasted PCIe lanes on utterly worthless USB4 crap that I don't care about. (Not sure where Intel, AMD and NVIDIA are finding idiots for product design engineers, but stupid seems to the the new normal for those employed in that vocation.)
Strix mobos have almost always been decent for me, so I am OK with the compromise. Just send me the dog-gone motherboard so I can move on and be done. My new 011D XL EVO is patiently waiting for it.
https://rog.asus.com/us/motherboards/rog-strix/rog-strix-x870e-e-gaming-wifi/
I like the toolless design of the heatsinks, along with the 90-degree lock-down fasteners for the SSDs, but obviously they need proper QC. I've never received a motherboard with them floating loose, ever (knock on wood).
I agree. The Strix boards have always brought the heat and struck a pricing balance (within the inflated Asus stack, of course). You get a lot of the upper-tier functionality along with the SP rating.
Hopefully this works out.
21 hours ago, Mr. Fox said:
I bought this KingBank kit. https://www.amazon.com/dp/B0DSZPHLCX?th=1 - very inexpensive
It is not advertised to have an EXPO 8400 profile, but lo and behold, it does. @jaybee83
I can't do 8400 with my CPU but it runs stable at 8200 with custom timings. Nice to not have the rainbow puke eyesore. tPHYRDL is matched. I bet it would handle 8400+ on the Gene or Apex. This kit performs pretty much identical to my G.SKILL 8000 TridentZ Neo EXPO 48GB kit. A bit slower than Neo 32GB 8000 kits, but to be expected with 48GB modules due to looser timing requirements.
The heatsinks are A WHOLE LOT BETTER than the G.SKILL heating blankets.
I will continue to do some tuning/testing on the Master and the Strix (once I have it). If it turns out to be a reliable kit then I will probably buy another one and sell the G.SKILL rainbow puke sticks just for the reason to not have to install any kind of bloatware to correct the rainbow puke rubbish.
Those heatsinks look a lot like TG heatsinks, which are excellent, and that price is nice. I sold off all my lower-tier sets (6000, 6400, 7200, 8000 2x16), and I have just one set of 6400 left along with my TG sticks. I might pick up a set; they are less than half what I paid for the TG 8200 sticks last year.
G.SKILL heatsinks are atrocious, but the worst I've ever encountered are still the Patriot heat blankets. TG heatsinks are the best I've come across so far, and they look slick with the circular logo on the front.
I went to two different Best Buys earlier with my daughter to pick up a few things (a Matcha Kindle + dance dance 2025 for her). We had to travel out of our area to a store we don't frequent to find one with them in stock. So odd to see a 5090 and a 5080, along with a couple of 5070s, in stock and on the shelf at a Best Buy:
I snagged this while I was there for some benching comparisons and fun in WoW/FO76 vs the 9070xt:
I'm impressed with the fit and finish of even the 5070 FE. Dense in a good way and, like its big brothers, very aesthetically pleasing to the eye.
1 hour ago, Mr. Fox said:
Yes, it works fine. I have had it enabled by default since the day I received it.
Just checking. My first GB card would lock up with fast mem enabled after heating up unless I blasted the fans at 100%. The open box works fine with it.
Final check with Deus Ex in each state vs stock, plus the total compilation. This is with the newest 25.6.1 drivers. I guess every little bit helps, but I see myself running mine with fast mem + a -50 UV for a little extra punch and no added heat or watts. -75 is a touch unstable and can lock up randomly.
Final tally = +2.3% avg, +1.9% min, +6.4% max.....
Bask and Bow in the glorious light of those uplifts.....so say we all. 🙂
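For anyone tallying their own runs, the percentages are just before/after ratios. A minimal sketch; the fps values here are made-up placeholders, not my actual run data:

```python
# Percent-uplift math behind a tally like the one above.
# The fps values below are hypothetical placeholders, not real runs.
def uplift_pct(stock_fps, tuned_fps):
    """Percent change from the stock run to the tuned run."""
    return (tuned_fps - stock_fps) / stock_fps * 100

runs = {"min": (52.0, 53.0), "avg": (88.0, 90.0), "max": (141.0, 150.0)}
for metric, (stock, tuned) in runs.items():
    print(f"{metric}: {uplift_pct(stock, tuned):+.1f}%")
# min: +1.9%  avg: +2.3%  max: +6.4%
```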
7 hours ago, Mr. Fox said:
That behavior mirrors mine almost exactly in terms of clocks, watts and performance. I often see a few higher FPS using a -75 UV with the power limit left to default rather than maxed out power limit.
It looks like the Alphacool CORE block for the Gigabyte Gaming and Elite GPUs will be available mid to end of July. It is in production but not yet available for purchase.
It will be good to see how a block tames it a bit and where it can go.
I was tempted to pick up a Mercury, since it has the best HSF design this generation so far, even better than the Taichi and its 12VHPWR connector. I just wanted to see where it would go, but I haven't even tested this Gaming OC on max fans yet, so we'll start there.
1 hour ago, Papusan said:
For the Red Team jockeys: what you see isn't what you think it is. Samsung isn't what they were in the old days...
RX 9070 XTs with Samsung memory are almost 2% slower than those with SK Hynix despite having higher GPU frequencies
AMD has not clarified anything, manufacturers do not want to talk
We've been working on a cross-platform comparison of the RX 9070 XT recently, and during testing we noticed that many high-end graphics cards, no matter how high their base clock, how good their quality, or how capable their overclocking is, simply couldn't outperform some models with more basic materials and reference designs in 3DMark.
(Hurriedly checks to see if 9070xt has Hynix modules lol)
Like the article says, if these Samsung chips have inferior timings, that's a hard pass from me.
@Mr. Fox have you tried seeing if your 9070xt can handle adrenalin's fast memory timings settings yet? I suspect when you block it and cool those hot modules down, it may work if not already.
I've been playing around with OC'ing this 9070xt and I'm not seeing much.
GB cards come out of the box OC'd fairly well with boosting to 3ghz. Pushed to a ~-75 UV and PL to max, it's pulling 360w+ with spikes over 400w for a whopping gain of......1-2fps.
The hot spot is toasty though and the gaming OC is actually one of the smaller 9070xt cards out there. It would qualify for an SFF build too.
I actually lost a few fps in performance with the 25.5.1 drivers vs 25.4.1. I did just install 25.6.1 so I'll give them a whirl later.
1 hour ago, tps3443 said:
5090FE is fine man. If I didn't like it, I would not keep it that's for sure lol! I had a 4090 Gaming OC and @johnksss that thing was cheap as hell, flimsy, light weight, and loud as hell with bad coil whine, but not this card, this card feels nice and premium and heavy etc. So, it's okay for $1,999.99 (People do try to convince me that my own GPU sucks though, they do not like it, that I like it I guess 🤣) Last I checked, Astral+Block = $4,000, too much for me to afford. But at least they perform TOP NOTCH once you water block it, flash it, and solder it.. Yes, the FE's have weak coolers; I cannot hold that against it. It's going on water/shunt anyways. I mean, your Astral is probably faster than an RTX Pro 6000, but that doesn't mean it's better than one. FE's have weak coolers but not weak cores. Definitely an acceptable GPU for the price tag and performance.
Card scores perfectly acceptable for a stock cooler, hot, non-shunted 5090 FE. Plenty of other 5090's that cannot hit these numbers out there, and this is the hottest one you can buy. 🥵
NVIDIA GeForce RTX 5090 video card benchmark result - Intel Xeon W-3175X Processor, ASUSTeK COMPUTER INC. ROG DOMINUS EXTREME
35 minutes ago, Mr. Fox said:
I'm glad you don't care what other people think. That is how it should be. It only matters if it affects them and this doesn't. The fact that you like it is all that matters. There are some people that make their decisions based on what their friends think is best or coolest, (without anything else to base it on,) which is really dumb.
lol, you do know we're just teasing ya @tps3443 right? I consider you a friend, and I know with my friends we constantly joke around with each other. 🤣
I agree with @Mr. Fox. Buy what you want and enjoy it for your own personal reasons. I've made plenty of left-field purchases that some questioned. I just let them know that if they wanted to foot the bill, I'd be open to listening to and using their personal recommendation. Remember when I was the only sucker, er, enthusiast who picked up a KPE 3090ti? 🤣🤑
And you clearly know how I feel about the 5090FE. SFF dream card with great aesthetics. You know when my VPA pops and/or BB drops, I'm grabbing one, maybe two, and binning them for the best for an SFF.
1 hour ago, Papusan said:
From what I have seen, AMD cards just run hot on hotspot temp and same with vram (this gen and previous). Not sure why Nvidia removed it but I doubt we would see the same high temps on Nvidia cards.
My best guess... The FE cards run hot so Nvidia removed it from being monitored (even for the best cooled AIB cards out there). In short... none of the customers should be allowed to know how hot it will reach. So this stupid move is only due to the roasting hot running FE cards.
AMD CAN run hot, but Nvidia pushing the TDP to 575W AND removing the hot spot sensor can't just be a coincidence, nor is it limited to the FE edition.
It comes down to cooler design and fan profiles too for the 9070xt:
XFX clocking in at 53°C core, 69°C hot spot and 82°C memory respectively while pulling 358w on their Mercury OC.....
I think we'll really see what happens when cooled properly once @Mr. Fox blocks his Gigabyte.
27 minutes ago, win32asmguy said:I am surprised it is that good after having tried a 7800X3D + 7900GRE combo last generation. Lots of strange driver bugs paired with lots of strange bios bugs with fingers pointing to "its an AGESA issue, so it will be fixed when its fixed". I think after the second time my M.2 boot drive just randomly disappeared I gave up on it.
I was as surprised as you.
Last gen was pure garbage for WoW across every single 7900xtx I tested. From the first XFX I purchased right after launch to the last MSI Gaming I tried last October. So many problems from DX12 and crashes to horrific RT performance and lockups/stuttering. Turning on RT basically turned the 7900xtx into a 3080.
Even when I swapped in the 7900xtx to my Intel rig the problems were still there just that then Fallout 76 performance was at its all time low out of all four configs I tried with the 4090, 7900xtx, 13900ks and 7800X3D.
This generation has gone much better since the 25.4.1 Adrenalin drivers for WoW. Zero problems or issues on this 9800X3D+9070XT. I'm running 25.5.1 (newest).
With my first 9070xt, the launch drivers (25.3.1) had all types of problems but the next revision cleared up most of that and it's just gotten better for WoW.
----
I'll reiterate: when not hindered by CPU exhaustion, the 5090 is an absolute monster and trounces the 9070xt, but when both cards are suffering from CPU exhaustion, I have found the 9070xt plays smoother at those lower fps than the 5090 with its higher fps overall.
During normal game play, I find the 9070xt hitting 99% util more often than not whereas the 5090 would routinely tap out the CPU hitting 30-40% more frames than the 9070xt.
I like the highs, but the lows are of more concern. This was my 5090 on the last Radiant Flame final boss before returning it:
I'm to the point, I want to test another 7900xtx to see if the drivers have gotten this much better or if it is the 9070xt architecture or a bit of both.
You should scoop up a 9070xt to play with and test out to compare it against your 7800X3D+7900GRE sad times.
16 hours ago, Papusan said:
4x 8pin any day. And weird AMD cards run so damn hot compared to nvidia. A 600W Radeon card would be a disaster. It would cook itself to death within a month.
Is this with out any tax? Aka MSRP ? What cost will it be added when you pay? Right below $3000 here home without tax (MSRP).
Well, we have no idea what Nvidia's hotspot is because they took the reading from us....I wonder why that is? 😞
Memory temps are spicy though!
Depending on which card you use, your OC can be much ado about nothing, as all cards tend to top out under load around 3200-3300MHz or lower. The Gigabyte Gaming/Masters already come out of the box hitting 3GHz+ at stock settings, so there's not much room left.
AMD has made massive strides in RT and basically looks like they've caught up to Nvidia.
I'll be real curious to see @Mr. Fox block his and bring those hotspot and mem temps down.
His tops out at ~350w OC'd, while the Gigabytes are topping out ~360-365w OC'd with some transient spikes into the 400w+ range.
24 minutes ago, jaybee83 said:haha i get it though, bro @tps3443 is just happy with his purchase and is screaming it out into the forum void 😄
So basically the same @tps3443 we've grown to love and adore over the last 10+ years..... 💙
3 hours ago, tps3443 said:
If I drove the 2hr 31 minutes to Charlotte, North Carolina at a Microcenter and wanted to buy one, (assuming they had any) this would cost me $3,603.58 USD. This includes the 5090 Astral + 7.25% retail sales tax. No additional warranty or replacement.
For this money I would want to build an entire PC with a custom watercooled 5090FE, or save $1,500 and build an entire PC with a 5080FE. Of course it would take some effort to secure one of these GPU’s. But it could be done.
You get what you pay for.
$2k is already stupid money for a consumer level GPU.
Then you decide: do I want the Toyota or the Lexus of the GPU industry, and pay accordingly.
In the long run, for true OC'ers who are looking to be in the top of the charts, it won't be the Founders Editions that bring home the bacon.
It will be the Astrals, Masters and Suprims out there.
And you know this.
Upside as always is the cost. That is where the FE shines and always will.
2 hours ago, tps3443 said:
Wow, I’m surprised you returned this thing. What 5090 are you buying next? Maybe the more budget friendly RTX Pro 6000? 🤣
RTX Pro 6000 is absolute pure baller class. If price was no object, I'd scoop one up ASAP and pray it doesn't sound like der8auer's.... 🤣 It is the 5090FE Ti Super edition.....
As for the Astral 5090, I had fun for two months playing around with it. Trying several vBIOS, OC'ing it, benching and collecting data in WoW, Fallout and Deus Ex along with 3Dmark stuff. Testing V/F curve offsets vs traditional, idle testing and all the fun stuff that was offered along with the 5080FE from launch moving through all the drivers and such.
I'm good right now with the 9800X3D + 9070xt. $3600 is steep and personally I loved the 5090 overall but the bang:buck just wasn't there especially in WoW. I think I said it when I got it the problem was the 4090 was already being underutilized in PvP/PvE content and the 5090 just extended that even more.
I really gathered a lot of data between the 9070xt and 5090 with WoW and I don't know if it is driver overhead in combination with the WoW engine but when the 5090 is firing on all cylinders, it is firing monstrously. Put the 5090 in an area with zero players and letting the CPU feed it properly and it was almost doubling the fps of the 9070xt (also at 5x the price) with both cards hitting 99% utilization. The problem comes back to player data/physics and when raids/orbs/assaults come into play, both cards tank hard but I found in let's say the new Radiant Flame assault and the final boss fight that the 9070xt actually gets better fps than the 5090 on the bottom end when the CPU is just getting hammered overall. I ran it over and over switching between the 9070xt and 5090 using two independent SSDs with their own W11 installs and the 9070xt just handled the CPU tanking better than the 5090 which results in better game play overall when fps really start to dip.
I think I said it during my first round of ownership with the 9070xt when AMD fixed their drivers (understatement) and while the 9070xt was getting overall lower fps, the gameplay was just "smoother" than with the 5090 or 5080FE in WoW. Having run numerous Tier 11 delves solo, the gameplay, smoothness and responsiveness of the 9070xt is >=5090 even when it is getting ~160-170fps vs the 5090 clocking in at 238fps capped to monitor 240. This is with all the cards running 4k Ultra RT max.
I'm tempted to pick up another 7900xtx (that would be my 4th or 5th) so I can see if it, too, is properly fixed with the newest drivers and runs just as smooth, but its Achilles' heel was RT: turning it on just brutalized the card, and the standing recommendation was to leave it off. The 9070xt doesn't suffer from that problem in WoW.
It seems odd to say it, but I'm finding the 9800X3D + 9070xt providing the best overall WoW experience atm especially during crunch time vs the 5090 when the CPU is gasping for breath and GPU utilization plummets.
On the other hand, the 5090 absolutely wrecked the 9070xt in Fallout 76 and when dealing with outdoor bosses or dailies with plenty of other players, you can definitely feel that difference between the 5090 and the 9070xt. 9070xt is completely playable, but man that 5090 brought the high heat and then some.
In the end, it was the $3600 just like in the end it was the $2665 for the KPE 3090ti that I could not reconcile....
The only cards on my radar now are potentially a 5090FE via VPA/BB because while it too is priced stupidly it isn't AS stupid but I'm in zero rush and that's because I want to dabble in some SFF build outs with it and that's the last area to explore with the 5090 for myself. When I get one, if it is a coil whine mess, off to eBay it goes.
Signal RGB is *almost* perfect for controlling all my devices. For some odd reason, it appears it won't work with AMD GPUs? It controls my SteelSeries, Razer and Asus devices perfectly. Luckily the 9070xt has minimal RGB on it.
So I returned my Astral 5090 today (tomorrow was the last day), and I have to admit it feels like a monster sized weight has been lifted off my chest. No matter how I tried to spin it, I could not accept paying ~$3600USD for a consumer grade GPU to game and bench no matter how perfect it was.
Pros: low to no coil whine, 1.110 ripe for XOC vBIOS, +3000, handled a +375 V/F curve offset no problem, clocked decently, aesthetically gorgeous, ran quiet, zero fans outside of meaningful load at all times.
Cons: expensive, heated up my entire office within an hour turning it into a sauna so I had to keep the ceiling fan going and hoping the AC would kick on.
I collected a lot of WoW data, routinely swapping in the 9070xt for various scenarios including lots of LFR raids, 5-mans and Delves along with ORBs and outdoor assaults. The 9070xt, along with AMD, made insane strides in driver stability and performance after launch, on a level that has left me impressed, but that's for another post.
---
Fun convo with the guy who rang up the return. He immediately said "such a nice card." and once we got to talking, he said he noticed on my account I had picked up a Gigabyte 9070xt OC Gaming. I said I picked up an open box the second time around. He noted he used the same card and picked it up open box too a few weeks back. Yeah, that was the one I returned! It didn't even make it to an open box listing. He scooped it up. He games at 1440p, so you know it is tearing it up.
4 hours ago, Clamibot said:Interesting video. I wonder what causes the drastic differences in 1% lows to flip flop depending on the game. I'm guessing it has something to do with inter-CCD latencies and the fact that not both CCDs have the 3D VCache, which in my opinion is a bad idea. Homogeneous designs yield the highest performance.
So looks like for Intel to reclaim the performance crown, they need to do 2 things:
1. Add their own version of 3D Vcache underneath their CPU cores (I believe they have something similar to that, which they call Adamantine cache)
2. Get rid of the stupid E-cores and make all cores P-cores. I don't get the point of the E-cores. All they do is reduce performance for the most part, so I leave them turned off since leaving the E-cores turned off significantly increases my performance in pretty much everything, with games getting the biggest boost. I only had one case when using my 14900KF where they increased performance, and performance didn't increase by that much. Tripling your core count to get only a 17% increase in performance is an insanely bad ROI in relation to the amount of extra cores (yes, I know the core types don't have the same processing power, but even assuming the E-cores are half the processing power of the P-cores, it's still a very bad performance increase in relation to the number of extra cores). Intel's server class Xeon CPUs have all P-cores and they perform great due to not being shackled by E-cores. No weird scheduling shenanigans.
Since Intel doesn't seem to want to ditch their E-core idea, we may have to move to their server class CPUs to escape the E-core insanity.
In previous reviews of the 9800X3D vs the 9950X3D, gaming comparisons were basically a wash overall but I did not pay close attention to the 1% lows. I'll have to go back and take a look to test your theory. Before 9950X3D launch, the biggest wish was dual 3D-Vcache but alas.....
Intel and AMD are both guilty for abandoning a proper monolithic and/or homogeneous design. At least AMD went back to it for their 9000 series GPUs.
Raptor Lake was too hot and a mess for more P-cores (even though rumors of 10 and 12 core P-core versions had hope springing eternal everywhere), but I thought they could have made an all P-core variant of Arrow Lake. At least 12 P-cores would have been enough for me. For RPL, it is a per-game hit-or-miss fest. For Alder, it was almost universal to turn them off.
In Intel's defense, the E-cores on Arrow Lake are much better than RPL's, but I still want a proper P-core-only CPU. You might be right about the pursuit of Xeons.
10 minutes ago, tps3443 said:
Yes, I sawed it. I could see your FE sucks flag/banner raised up from my bedroom window. 🤣
If only to counter your "I ♥️ 5090FE" tramp stamp... 🤣
But in all seriousness, I like the 5090FE. It is still in my top three list based on price, size and aesthetics.
If one pops on BB or VPA, I'd pick it up and give it the coil whine test. If it had bearable or no coil whine like my 5080FE (showing it is quite possible), I'd keep it. If not, off to eBay it would go.
It is an amazing SFF card. I would then seek out an SFF case as small as possible that supports either a top horizontal inverted mount, so the card would exhaust directly out the top of the case, or a classic vertical mount that could invert to exhaust out the side. You gotta dump that heat somewhere if you don't have a big enough case.
5090FE + 9800X3D air cooled tuned SFF build would be the end game for desktop and portable power.
285k vs 9950X3D both tuned (ish)
Overall 9950X3D has higher fps, but the 285k has higher 1% lows sometimes substantially.
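For context, "1% lows" are usually derived from a frametime log. Here's a minimal sketch of one common approach (averaging the slowest 1% of frames); it isn't tied to any specific capture tool, and the sample log is made up for illustration:

```python
# Illustrative: compute "1% low" FPS from a frametime log (ms per frame).
def one_percent_low(frametimes_ms):
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)      # always consider at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at ~6.9ms (~145fps) plus one 20ms hitch
log = [6.9] * 99 + [20.0]
print(f"1% low: {one_percent_low(log):.0f} fps")  # → 1% low: 50 fps
```

Even though the average FPS of that log is still ~141, the single hitch drags the 1% low down to 50, which is why two cards with similar averages can feel very different.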
4 hours ago, Papusan said:5090 is just an expensive midrange high end gamer cards. The more you pay the more you save. Here's Nvidia's new "gamer "flagship" card😀
Not sure this would please @electrosoft😁 The $10.000 cards almost sounds like a real circular saw. Very special sound from it. Not the worst I have heard but the sound from this is awful and more annoying than this one from my older post
Now THAT coil whine is identical to my 3090 FE. Just throw in a massive amount of variable pitch changes like the Aorus.....
But with that being said, outside of the price, this is what the 5090 should have been from launch with just 32GB for the stupid prices being charged for the 5090.
I LOVE the look and design of the Pro 6000 more than the 5090 FE plus you get the full fat GB202. Icing on the cake is the 96GB to run larger LLMs (well for me, I don't know about you). For AI work, it makes more sense for me to pick up a M4 macbook than a 5090. It makes more sense to pick up an M3 Ultra than the RTX Pro 6000.
"The 5090 is the waste. The garbage from the production of these (6000)" - brutal.
His end rant is spot on. You are getting the die rejects with the 5090, and yields for the full fat are much smaller. I'm sure Nvidia is stockpiling the dies that can't even make it to midrange 5090 status, and we'll get a 5080ti to dump those off.
The fact the 6000 Pro can look and run like a 5090 is all you need to know in regards to Nvidia just pushing and upselling everything to absurd levels.
Everything about the RTX 6000 Pro is the 5090 FE on steroids from the looks to the internals. 🤑
I still want a 6000 Pro though.....just not at $8k+
2 hours ago, Mr. Fox said:
If I had to wager I would bet that it's true because NVIDIA is who they are... control freaks. If they can find a way to screw the people that buy their products that is exactly what they will do. That's how they roll and who they are, and that is how they have been for a long time. Also one of the numerous reasons I dislike FE video cards.
And, on a happier note, I ordered my second Lian Li O11D XL EVO to house the second AORUS Master. Also ordered a D5 distro block from Radikult Customs and about 12 feet of Alphacool EDPM tubing.
Now I hear rumors of a 9080 XT with 32GB of GDDR7. I am so glad I didn't blow $3000+ on a 5090 already. Now it is no longer even tempting. Not caring actually feels really good.
Hopefully it is just user error. I do know the dude who basically ghetto rigged his FE with that crazy pseudo water cooling was able to get some decent scores (I linked it here some time ago) but nothing like some of the AIBs are clocking in like @johnksss and his Astral for starters...
------------------
The benefits of not blowing $3k+ on a 5090 is you can buy so many other toys!
I have ~2 days to return mine and I'm leaning towards just returning it and using my $656.99 9070xt. I've tried every way to justify blowing $2k+ on a 5090 let alone $3600 (ok, ok $3581) and the math just ain't mathing for me when I want to have room for some other toys to pick up and play with.....
I sure have had fun gaming and overclocking it though for the last two months. 🙂
*Official Benchmark Thread* - Post it here or it didn't happen :D
in Desktop Hardware
Posted
So I've been putting my MacBook Pro 16 M4 Max (full fat 16-core CPU, 40-core GPU) through the wringer tonight with World of Warcraft 4k Ultra.
It's......playable. In my benchmark spots it is about as powerful as a 9070xt, with both having RT off since WoW doesn't support RT in macOS: the 9070xt clocks in around 140fps while the M4 clocks in around 130fps.
Another case of theoretical vs actual once that CPU gets sauced....
Problem is, in raids the M4 tanks to the 50s and 40s in many spots while the 9070xt consistently keeps it in the 80+ range, so once you introduce that player data the M4 Max starts to take a pounding.
That now makes the M4, 5080 and 5090 all handling the lows worse than the 9070xt, but I will say the M4 stays pretty smooth even at those low fps...
Don't let anyone ever tell you the Macbook fans don't have some punch to them.... 😁