NotebookTalk

*Official Benchmark Thread* - Post it here or it didn't happen :D


Mr. Fox

Recommended Posts

12 hours ago, Mr. Fox said:

I am surprised it made it that high before the BSOD. I usually see errors start to occur about 8-10°C sooner than that. I have found that a memory overclock that is unstable at 45°C might run error-free at 35°C, and vice versa. It makes sense considering the clock speeds and voltage, though; you can't do that without producing heat.

 

That's why using a fan or putting them on water is recommended. Taking the stock heatsinks (beautifying heating blankets) off also helps, but not doing it the right way can rip the chips off the PCB, because the imbeciles that manufacture RAM use adhesives that are stronger than the solder. (They don't need to use any adhesives, and it is ridiculous that they do.)

 

 


I think it's because it's just the XMP 7200 profile, so it holds out a little better under higher heat. When I run 7600 it's much more sensitive! I've gotta stay under 42°C.

  • Thumb Up 2

13900KF


https://videocardz.com/newz/gigabyte-confirms-geforce-rtx-4070-ti-radeon-rx-7900-graphics-cards

 

Yay, the 4080 12 GB is coming 😁

 

 

@Papusan what do you think Jensen will want for it? $999 for his pocket is my number 🥲 so AIBs will want $1199.

 

  • Haha 3
  • Hands 1

7950X3D| Zotac 4090 AMP Extreme Airo| MSI MPG B650 Edge Wifi| Lian Li Galahad 360 V2| 32GB Kingston Renegade RGBZ 6000|Kingston KC3000 2TB| Fury Renegade 2TB|Samsung 970 Evo 1TB| Lian Li O11 Dynamic Evo| Corsair HX1500i| Samsung Odyssey G9 Neo

Asus Zephyrus G15 (Ryzen 9 6900HS + RTX3080)

 


48 minutes ago, cylix said:

https://videocardz.com/newz/gigabyte-confirms-geforce-rtx-4070-ti-radeon-rx-7900-graphics-cards

 

Yay, the 4080 12 GB is coming 😁

 

 

@Papusan what do you think Jensen will want for it? $999 for his pocket is my number 🥲 so AIBs will want $1199.

 

 

 

With a less than stellar response to the 4080 16GB (never mind during the heaviest buying period of the year), and with AMD on the horizon with a $999 4080 16GB slayer on deck, $799 is realistic pricing... $699.99 if he wants to actually contest price:performance against AMD.

 

 

  • Thumb Up 2
  • Bump 1

Electrosoft Prime: 7950X3D | MSI X670E Carbon | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Enthoo Pro | Alienware AW3225QF 32" OLED

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000 | WD Black SN850 512GB | EVGA DG-77 | Samsung G7 32" 144Hz

MelMel:  (Retrofit currently in progress)

 

 

 


 


1 hour ago, electrosoft said:

 

 

With a less than stellar response to the 4080 16GB (never mind during the heaviest buying period of the year), and with AMD on the horizon with a $999 4080 16GB slayer on deck, $799 is realistic pricing... $699.99 if he wants to actually contest price:performance against AMD.

 

 

That would be realistic, but I don't think Nvidia lives in our reality. Not yet 😄

  • Thumb Up 1
  • Haha 1


 


3 hours ago, electrosoft said:

$799 is realistic pricing... $699.99 if he wants to actually contest price:performance against AMD.

$699 for the 4070 Ti means Nvidia has to do something with the 4080 price point. A $500 gap between the 4070 Ti and the 4080 is too big. Even $400 is, if they price it at the more expected $799.

Only a few will go with the 4080 at $1199; most would rather go with AMD at $899 or $999. And none will go for a 4070 Ti for 4K, even though the card is so much cheaper than the 4080. This means Nvidia has nothing to offer those who want 4K gaming. No way in hell will people pay $1200 for a 4080 in the long run. Once the hype for the 4000 series is over, if Nvidia doesn't change this card's price point, sales will be even worse than today.

 

I expect Nvidia will be forced to change where they have put the 4080 and come down to $1099 - a small premium over the 7900 XTX for the better features. Then add the 4070 Ti price point around $799. This way they can also try to lure people in with the 4070 at sub-$600 (a $100 premium over the 3070). All cards will be more expensive than ever, but not so stupidly high-priced that many gamers will go over to the Red side.

  • Thumb Up 2

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOT | Team PremaMod @ HWBOT | Papusan @ YouTube Channel

 


9 minutes ago, Papusan said:

$699 for the 4070 Ti means Nvidia has to do something with the 4080 price point. A $500 gap between the 4070 Ti and the 4080 is too big. Even $400 is, if they price it at the more expected $799.

Only a few will go with the 4080 at $1199; most would rather go with AMD at $899 or $999. And none will go for a 4070 Ti for 4K, even though the card is so much cheaper than the 4080. This means Nvidia has nothing to offer those who want 4K gaming. No way in hell will people pay $1200 for a 4080 in the long run. Once the hype for the 4000 series is over, if Nvidia doesn't change this card's price point, sales will be even worse than today.

 

Or you leave the 4080 as is to give the perception of incredible value high (4090) or low (4070 Ti) if you go $1599.99 and $799.99. Equal spacing ($1599.99 - $1199.99 - $799.99) also comes into play.

 

Let's accept that the 4090 is going to remain at $1599.99 because it clearly outclasses anything and everything available. Expect the 4090 Ti to clock in at $1999.99.

 

Everything else for numerous reasons is in play for pricing.

 

If the 7900 XT buries the 4070 Ti, Nvidia has a lot of thinking to do, especially if AMD drops it to $799.99.

 

I agree that once the initial hype dies down, Nvidia is in for a much less lucrative year than it experienced during the heights of Ampere.

 

 

  • Thumb Up 1
  • Like 2


 

 

 


 


35 minutes ago, Papusan said:

Once the hype for the 4000 series is over, if Nvidia doesn't change this card's price point, sales will be even worse than today.

 

Sorry to be the bearer of bad news here, but Jensen will manage to laugh this gamer rage off. The crypto boom got replaced by AI. Nvidia's Q3 revenue was $5.9B, of which $3.8B was Data Center, up 31% YoY. Gaming is still significant at $1.5B, but down 51% vs the previous year and down 25% vs the previous quarter.

 

This year Nvidia are no longer primarily a gaming hardware company.

 

He can charge this much for whatever chips he can spare for gamers - and he can clearly spare very little indeed. The workstation cards are a niche for them too, BTW: $200M, also down a whopping 65% YoY.

 

All aboard the AI train - except Conductor Jensen will check the tickets momentarily and get those gamers and other enthusiast and creator plebs off those premium Nvidia carriages.
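As a sanity check on those figures, the prior-period sizes can be backed out from the quoted percentages (a rough sketch using only the numbers stated in this post; real quarterly reports won't divide this cleanly):

```python
# Back-of-envelope check of the revenue figures quoted above
# (Q3 numbers as stated in the post; values in billions of USD).

def prior_value(current: float, pct_change: float) -> float:
    """Given a current value and its percent change vs the prior
    period, return the implied prior-period value."""
    return current / (1 + pct_change / 100)

data_center_now = 3.8   # up 31% YoY
gaming_now = 1.5        # down 51% YoY, down 25% QoQ

print(f"Data Center a year ago: ~${prior_value(data_center_now, 31):.1f}B")
print(f"Gaming a year ago:      ~${prior_value(gaming_now, -51):.1f}B")
print(f"Gaming last quarter:    ~${prior_value(gaming_now, -25):.1f}B")
```

Working backwards, gaming was roughly a $3.1B segment a year earlier, bigger than Data Center's ~$2.9B at the time - which is exactly the flip being described.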

  • Thumb Up 1
  • Sad 1

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


7 minutes ago, electrosoft said:

 

Or you leave the 4080 as is to give the perception of incredible value high (4090) or low (4070 Ti) if you go $1599.99 and $799.99. Equal spacing ($1599.99 - $1199.99 - $799.99) also comes into play.

 

Let's accept that the 4090 is going to remain at $1599.99 because it clearly outclasses anything and everything available. Expect the 4090 Ti to clock in at $1999.99.

 

Everything else for numerous reasons is in play for pricing.

 

If the 7900 XT buries the 4070 Ti, Nvidia has a lot of thinking to do, especially if AMD drops it to $799.99.

 

I agree that once the initial hype dies down, Nvidia is in for a much less lucrative year than it experienced during the heights of Ampere.

 

 

Not so sure Nvidia will keep the same old tradition of using the $x99 price labels they are known for. They could try to change it and make it more appealing with $x49 price tags.

 

I'm sure even Nvidia knows people have less money for fun nowadays and going forward. They have to make some changes to lure people into buying their overpriced products.

  • Thumb Up 1

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


3 minutes ago, Etern4l said:

 

Sorry to be the bearer of bad news here, but Jensen will manage to laugh this gamer rage off. The crypto boom got replaced by AI. Nvidia's Q3 revenue was $5.9B, of which $3.8B was Data Center, up 31% YoY. Gaming is still significant at $1.5B, but down 51% vs the previous year and down 25% vs the previous quarter.

 

This year Nvidia are no longer primarily a gaming hardware company.

 

He can charge this much for whatever chips he can spare for gamers - and he can clearly spare very little indeed. The workstation cards are a niche for them too, BTW: $200M, also down a whopping 65% YoY.

 

Every segment counts. Even if Data Center revenue goes nuclear, it doesn't mean he wants to suffer gaming market contraction or, more importantly, lose market share to AMD or Intel.

 

One thing Apple taught us is that every market segment counts, whether it is a $200M/yr segment or a $20B/yr segment.

 

 

 

#AllProfitsMatter

 

 

  • Thumb Up 1
  • Haha 2


 

 

 


 


13 minutes ago, electrosoft said:

 

Every segment counts. Even if Data Center revenue goes nuclear, it doesn't mean he wants to suffer gaming market contraction or, more importantly, lose market share to AMD or Intel.

 

One thing Apple taught us is that every market segment counts, whether it is a $200M/yr segment or a $20B/yr segment.

 

 

 

#AllProfitsMatter

 

 

 

Does he have an unlimited amount of silicon to meet demand from all those market segments? Which one do you think is more profitable? (Hint: check out the pricing of the data center cards.)

His gaming revenue is collapsing and will fall further before it reaches the bottom, but the profit margins will remain strong, which is what matters to him. As you said, he does have the more premium product. Not everyone will be able to afford it for the time being.

 

I'm sorry, but if the plan is to sit and wait for better pricing... well, it looks like it's going to be a longer sitting than expected.

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


30 minutes ago, Etern4l said:

 

Does he have an unlimited amount of silicon to meet demand from all those market segments? Which one do you think is more profitable? (Hint: check out the pricing of the data center cards.)

His gaming revenue is collapsing and will fall further before it reaches the bottom, but the profit margins will remain strong, which is what matters to him. As you said, he does have the more premium product. Not everyone will be able to afford it for the time being.

 

I'm sorry, but if the plan is to sit and wait for better pricing... well, it looks like it's going to be a longer sitting than expected.

Nvidia saw what happened to Intel as AMD took more and more of the processor market share over the last half decade; Intel was forced to change course. You think Nvidia didn't learn anything from this and will let AMD eat up graphics card market share too? Nope, won't happen. If they change the 4080 pricing they are back in business any day.

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


9 minutes ago, Papusan said:

Nvidia saw what happened to Intel as AMD took more and more of the processor market share over the last half decade, and Intel was forced to change course. You think Nvidia didn't learn anything from this and will let AMD eat up graphics card market share too? Nope, won't happen. If they change the 4080 pricing they are back in business any day.

 

But they are in business. Market share would only matter if they had enough silicon. Yes, it looks like there are quite a few 4080s sitting on shelves, but these would only get discounted if they don't sell at all, or if Nvidia has a supply of 4080 chips it can't control (due to the manufacturing process, for example - if the chip in the 4080 is a flawed version of the one in the 4090; sorry, I forgot the chip designations - is that how it works?).

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


40 minutes ago, Etern4l said:

 

But they are in business. Market share would only matter if they had enough silicon. Yes, it looks like there are quite a few 4080s sitting on shelves, but these would only get discounted if they don't sell at all, or if Nvidia has a supply of 4080 chips it can't control (due to the manufacturing process, for example - if the chip in the 4080 is a flawed version of the one in the 4090; sorry, I forgot the chip designations - is that how it works?).

Flawed silicon from the 4090 (AD102) will become a 4080 Ti before it would ever be cut down to a 4080 (AD103). And the 4080 is still not even a full-fat AD103.

 

NVIDIA details AD102, AD103, AD104 GPU specs: transistors, ROP counts

NVIDIA details AD102, AD103, AD104 GPU specs - TweakTown

 

Back in business? They mostly have to rely on the 4090 now; it is the only 4000-series card that's selling. And of course some Ampere silicon as they clear out older EOL cards. And how much of the older-gen stock is already sold out to their AIC partners? They can't charge them twice to keep up profits. They have to sell something, and the 4090 alone is too little.

 

Imagine Ford could suddenly sell only one model, and that this "only model" were the most expensive of them all. Ford's market share everywhere would sink like a stone in water, and profits would follow the same way: down the drain. You need to offer more than one model nowadays. AMD will offer two models within 2.5 weeks, and Nvidia is stuck with only the 4090 and almost no sales from the 4080. Nvidia couldn't have made it any easier for AMD. But they could spoil things for the Red side if they did something with the 4080 price point beforehand, and I expect they will. I just hope it will be a change to the MSRP of their second-best Ada, and not a dirty trick to destroy the launch of the 7900 cards.

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


I’m testing my 4th 13900K today. Batch X241M888.
 

This chip is a Force2 =143 

 

Nothing spectacular, but maybe average or slightly above. The ring is average, the E-cores are average (x45, maybe x46), and that's it - and it loves to use power if you're clocking it high on all cores, lol!
 

But after all of that, it's still a dang 13900K, and I have it set up at

5.7GHz on (8) cores, 6.0GHz on (4) cores, and 6.2GHz on (2) cores.

 

YES! 6.2GHz on (2) cores, even on this chip. While the VIDs may be slightly through the roof, lol, it's still a very reasonable CPU. I have the LLC maxed out at LLC #8, so load voltages drop drastically. I could definitely rock this every day - anyone could, honestly. I mean, they all perform within 1-3% anyway, regardless of how good the SP rating is, haha.
 

(This chip ain't nothing like my other chip, but it's still a very solid chip, and it surprises me how great just an average 13900K really is.) I'm giving all the average guys out there with regular 13900Ks some love today!! These CPUs are really great regardless of how high or low a number may be.
 


  • Thumb Up 3
  • Like 1
  • Haha 1



1 hour ago, tps3443 said:

I’m testing my 4th 13900K today. Batch X241M888.
 

This chip is a Force2 =143 

 

Nothing spectacular, but maybe average or slightly above. The ring is average, the E-cores are average (x45, maybe x46), and that's it - and it loves to use power if you're clocking it high on all cores, lol!
 

But after all of that, it's still a dang 13900K, and I have it set up at

5.7GHz on (8) cores, 6.0GHz on (4) cores, and 6.2GHz on (2) cores.

 

YES! 6.2GHz on (2) cores, even on this chip. While the VIDs may be slightly through the roof, lol, it's still a very reasonable CPU. I have the LLC maxed out at LLC #8, so load voltages drop drastically. I could definitely rock this every day - anyone could, honestly. I mean, they all perform within 1-3% anyway, regardless of how good the SP rating is, haha.
 

(This chip ain't nothing like my other chip, but it's still a very solid chip, and it surprises me how great just an average 13900K really is.) I'm giving all the average guys out there with regular 13900Ks some love today!! These CPUs are really great regardless of how high or low a number may be.
 


 

Wow, dang, you bought more 13900Ks?

 

The achievable speeds on all your samples speak to how good the 13900K is. While we all want to acquire amazing samples, there is comfort in knowing even the average ones can clock really high as well. Extremely good bins have more relevance when you go into extreme overclocking. I do think I will be partially going the extreme route, though, as I am now hooked on peltier coolers working in tandem with a water cooler.

AlienyHackbook: Alienware M17X R5 | i7-4930MX | GTX 1060 | 32GB DDR3L Kingston HyperX @ 2133 MHz CL 12 | MacOS Sierra 10.12.5 | Windows 10 LTSC | Hackintoshes Rule!

 

Desktop Killer: Clevo X170SM-G | i9-10900K | RTX 2080 Super | 32GB DDR4 Crucial Ballistix @ 3200 MHz CL 16 | Windows 10 LTSC | Slayer Of Desktops

 

Sagattarius A: Custom Built Desktop | i9-10900K | RX 6950 XT | 32GB DDR4 G.Skill Ripjaws @ 4000 MHz CL 15 | Windows 10 LTSC | Ultimate Performance Desktop With Cryo Cooling!


8 hours ago, Papusan said:

No in hell people will pay $1200 for 4080 in the long run.

Oh, trust me... based on what I have seen in the past two years, the limits of human stupidity are beyond measure and incredibly stupefying. There are roughly as many imbeciles as there are people with common sense. Apparently, NVIDIA recognizes this and is willing to take advantage of the mentally handicapped shoppers.


On another note...

Wraith

mUnq5Ki.jpg

Half-Breed

4DsbgcI.jpg

Banshee

This is the system I am struggling to make viable on Linux. I think it may be something bugged in the ACPI implementation by the dumb-dumbs at ASUS on the Strix Z690-E. I cannot get the CPU turbo clocks to display correctly in any of the desirable monitoring tools. CPU-X shows the 12900KS clocking at a fixed 5.4GHz (correct) under load, though still not correct at idle, but all of the "normal" things I use show either a fixed 3.4GHz (C-states disabled) or a fixed 4.1GHz (C-states enabled). I have tried KDE, Pop!_OS and Zorin OS, and all have the same issue. I have tried passing a variety of kernel parameters in GRUB. I have installed different packages intended for monitoring clock speeds and it is hit or miss. The couple that actually work correctly are worthless to me because they are CLI tools I can't use the way I want to. I also wonder if the 12900KS is not being recognized properly, unlike the 12900K and 13900K. At any rate, this is a classic example of the kind of thing that makes noobs believe Linux is not a viable replacement for Windows, and on this system it probably isn't, solely for this reason. I would not embrace Linux if this were an example of normal, but I have used it enough to know it is an exception (albeit a more common problem than desired).

I1tohXr.png
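One way to narrow down whether the kernel or the monitoring apps are at fault: read the clocks straight from sysfs, bypassing the GUI tools entirely. A minimal sketch (assumes a standard Linux cpufreq driver is loaded; exact paths can vary by kernel and distro):

```python
#!/usr/bin/env python3
# Cross-check what the kernel itself reports for per-core CPU clocks,
# bypassing GUI monitoring tools. Reads the standard cpufreq sysfs
# nodes; if a tool like hardinfo disagrees with these values, the bug
# is in the tool, not the kernel.

from pathlib import Path

def cpu_freqs_mhz():
    """Return {cpu_name: current frequency in MHz} from sysfs.
    Cores without a cpufreq node are silently skipped."""
    freqs = {}
    for node in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*")):
        f = node / "cpufreq" / "scaling_cur_freq"
        if f.exists():
            freqs[node.name] = int(f.read_text()) / 1000  # kHz -> MHz
    return freqs

if __name__ == "__main__":
    for cpu, mhz in cpu_freqs_mhz().items():
        print(f"{cpu}: {mhz:.0f} MHz")
```

If these values track the real 5.4GHz boost under load while the GUI tools stay pinned at 3.4/4.1GHz, the kernel side is fine and the problem is in how those apps read or interpret the data.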

  • Thumb Up 1
  • Sad 1

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


6 hours ago, Mr. Fox said:

Oh, trust me... based on what I have seen in the past two years, the limits of human stupidity are beyond measure and incredibly stupefying. There are roughly as many imbeciles as there are people with common sense. Apparently, NVIDIA recognizes this and is willing to take advantage of the mentally handicapped shoppers.


On another note...

Wraith

mUnq5Ki.jpg

Half-Breed

4DsbgcI.jpg

Banshee

This is the system I am struggling to make viable on Linux. I think it may be something bugged in the ACPI implementation by the dumb-dumbs at ASUS on the Strix Z690-E. I cannot get the CPU turbo clocks to display correctly in any of the desirable monitoring tools. CPU-X shows the 12900KS clocking at a fixed 5.4GHz (correct) under load, though still not correct at idle, but all of the "normal" things I use show either a fixed 3.4GHz (C-states disabled) or a fixed 4.1GHz (C-states enabled). I have tried KDE, Pop!_OS and Zorin OS, and all have the same issue. I have tried passing a variety of kernel parameters in GRUB. I have installed different packages intended for monitoring clock speeds and it is hit or miss. The couple that actually work correctly are worthless to me because they are CLI tools I can't use the way I want to. I also wonder if the 12900KS is not being recognized properly, unlike the 12900K and 13900K. At any rate, this is a classic example of the kind of thing that makes noobs believe Linux is not a viable replacement for Windows, and on this system it probably isn't, solely for this reason. I would not embrace Linux if this were an example of normal, but I have used it enough to know it is an exception (albeit a more common problem than desired).

I1tohXr.png

 

I don't recall seeing this issue with the KS on any distro I tried with the MSI board, and it was definitely not there on Clear Linux in htop (running kernel 6). What tends to happen is that there is a bit of a lag in driver/library support for new hardware vs Windows, although Clear Linux seems to be cutting edge in that respect (except for Nvidia support, of course). For instance, temp sensor data is still misaligned for the 13900K in htop - a super minor issue.

 

Your problem should get addressed by a kernel update.

 

I guess what they say is true: there is no such thing as a free lunch. If you want to reap the benefits of Linux, some effort and compromises will be involved. On the upside, @tps3443 still hasn't managed to beat my Blender and IndigoBench CPU scores with his super-OCed and uber-binned CPUs running on Microsoft's fantastic OS ;)

  • Thumb Up 2

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


7 hours ago, Etern4l said:

 

I don't recall seeing this issue with the KS on any distro I tried with the MSI board, and it was definitely not there on Clear Linux in htop (running kernel 6). What tends to happen is that there is a bit of a lag in driver/library support for new hardware vs Windows, although Clear Linux seems to be cutting edge in that respect (except for Nvidia support, of course). For instance, temp sensor data is still misaligned for the 13900K in htop - a super minor issue.

 

Your problem should get addressed with a kernel update.

 

I guess what they say is true: there is no such thing as a free lunch. If you want to reap the benefits of Linux, some effort and compromises will be involved. On the upside, @tps3443 still hasn't managed to beat my Blender and IndigoBench CPU scores with his super-OCed and uber-binned CPUs running on Microsoft's fantastic OS 😉

It is strange that everything seems kosher for me on the 13900K, and when I was running the 12900K on the Strix D4 mobo everything was fine on Linux. It is either the Z690-E or the 12900KS, or both.

I installed openSUSE last night with the acpi=off kernel argument and fewer things are broken than before. I no longer get the long list of errors when Linux is loading, and neofetch reports the correct clock speeds. CPU-X reports clocks close to actual (100MHz below), but hardinfo still shows them way off (4100MHz instead of 5400 on the P-cores and 4300 on the E-cores).

 

It could also be the Linux applications themselves that the developers are not updating. Even if the Linux kernel provides the proper support, apps that are not updated may not interpret things correctly. This kind of thing is what holds Linux back from becoming a dominant force in the PC technology realm. Most people (me included) don't have the knowledge, desire or time to compile code and fix broken Linux code.

Being "free" is both a strength and a major weakness. It is probably pretty safe to assume that most of the experienced Linux developers do not own cutting edge hardware, and they're only going to burn calories on hardware that matters to them.

  • Thumb Up 1



8 hours ago, Mr. Fox said:

Oh, trust me... based on what I have seen in the past two years, the limits of human stupidity are beyond measure and incredibly stupefying. There are roughly as many imbeciles as there are people with common sense. Apparently, NVIDIA recognizes this and is willing to take advantage of the mentally handicapped shoppers.

Here's a graph of how many people fit into your chart.............. Rounded up, I think the number is closer to 10% of the population.

 

5% is happy to spend $1,100. 2% or less feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP.

wTADs69H79wOezA0.jpg

 

image.png.a3989945160103bd9e7bd2bf7ef81089.png

 

$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents

EXCLUSIVE by techpowerup.com
 
The ideal price for the NVIDIA GeForce RTX 4080 "Ada" graphics card is around USD $700 to $800, according to results from a recent TechPowerUp front-page poll surveying our readers. Our poll "How much would you pay for RTX 4080 at most?" received over 11,000 responses. At the number 1 spot with 22% of the vote is $800, closely followed by $700. Together, this range represents 44% of the voters. 14% of our readers think $600 is an ideal price, followed by "less than $400" at 13%. 9% think $500 seems fair, followed by 7% willing to spend as much as $900. 5% are happy to spend $1,100. 2% or less feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP. That's more than a majority finding sanity in the $700 to $800 price range.
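The quoted shares can be tallied up directly (a quick sketch; the percentages are approximate figures taken from the article text, and they don't sum to 100% since not every option is itemized there):

```python
# Rough tally of the TechPowerUp poll shares quoted above:
# "the most I would pay for an RTX 4080" -> approximate share of votes.

poll = {
    400: 13,   # "less than $400"
    500: 9,
    600: 14,
    700: 22,   # 44% combined with $800
    800: 22,
    900: 7,
    1100: 5,
    1200: 2,   # $1,200 MSRP or more
}

at_or_below_800 = sum(pct for price, pct in poll.items() if price <= 800)
at_msrp = poll[1200]

print(f"Would pay $800 or less: {at_or_below_800}%")
print(f"Fine with $1,200 MSRP:  {at_msrp}%")
```

By this tally roughly 80% of respondents capped out at $800 or less, which is the "majority finding sanity" the article refers to.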
  • Thumb Up 2
  • Thanks 1

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


6 minutes ago, Papusan said:

Here's a graph of how many people fit into your chart..............

 

5% is happy to spend $1,100. 2% or less feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP.

wTADs69H79wOezA0.jpg

 

image.png.a3989945160103bd9e7bd2bf7ef81089.png

 

$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents

EXCLUSIVE by techpowerup.com
 
The ideal price for the NVIDIA GeForce RTX 4080 "Ada" graphics card is around USD $700 to $800, according to results from a recent TechPowerUp front-page poll surveying our readers. Our poll "How much would you pay for RTX 4080 at most?" received over 11,000 responses. At the number 1 spot with 22% of the vote is $800, closely followed by $700. Together, this range represents 44% of the voters. 14% of our readers think $600 is an ideal price, followed by "less than $400" at 13%. 9% think $500 seems fair, followed by 7% willing to spend as much as $900. 5% are happy to spend $1,100. 2% or less feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP. That's more than a majority finding sanity in the $700 to $800 price range.

What they are not capturing is what the 5% and 2% represent. They would need to exclude people like me that are not going to purchase a new GPU at any price and take the percentage only from the people actually planning to purchase one. Maybe they are calculating it that way, but it's not clear. If you include people like me in the calculation, the numbers will be misleading. What they are also not capturing is how many will still pay that much to have a new GPU even though they do not feel the price is justified. I did that with the 3090 KPE. Was the price idiotic? Yes, it was ludicrous. Did I spend the money anyway? Yes, I did. NVIDIA knows this. They don't care whether people think the price represents value. They only care if people will spend the money anyway, in spite of their opinion that the price is unreasonable.
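The renormalization point above can be sketched in a few lines of Python. The shares are the TechPowerUp poll figures quoted earlier in the thread; treating the "less than $400" bucket as the non-buyer group is purely an illustrative assumption, since the poll did not break out people who won't buy at any price.

```python
# Sketch: how poll shares shift once an assumed non-buyer bucket is excluded.
# Shares are the TechPowerUp poll results quoted above (percent of ~11,000 votes).
shares = {
    "$800": 22, "$700": 22, "$600": 14, "less than $400": 13,
    "$500": 9, "$900": 7, "$1,100": 5, "$1,200+ (MSRP)": 2,
}

# Assumption for illustration only: the "less than $400" voters stand in for
# people who would not buy at any price, so they get dropped before renormalizing.
non_buyers = shares.pop("less than $400")

total = sum(shares.values())  # remaining respondents' share of the vote
renormalized = {k: round(100 * v / total, 1) for k, v in shares.items()}
print(renormalized)  # the $1,100 bucket grows from 5% to ~6.2%, MSRP from 2% to ~2.5%
```

The absolute percentages shift upward once the excluded group is removed, which is exactly why a headline "only 2% accept MSRP" figure can understate willingness to pay among actual buyers.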

  • Like 1
  • Bump 1

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


7 hours ago, Etern4l said:

 

Don't recall seeing this issue with the KS on any distro I tried with the MSI board, and it was def not there on Clear Linux in htop (running kernel 6). What tends to happen is there is a bit of a lag in driver/library support etc. of new hardware vs Windows, although Clear Linux seems to be cutting edge with respect to that (ex Nvidia support of course). For instance, temp sensor data is still misaligned on the 13900K in htop - a super minor issue.

 

Your problem should get addressed with a kernel update.

 

I guess what they say is true: there is no such thing as a free lunch. If you want to reap the benefits of Linux, some effort and compromises will be involved. On the upside, @tps3443 still hasn't managed to beat my Blender and IndigoBench CPU scores with his super-OCed and uber-binned CPUs running on Microsoft's fantastic OS 😉


I’m sorry, I’ve been extremely busy with work stress and testing CPUs in general. I’ve sold the processor X241M860 and it’s going out today. It was an incredible sample, and it has really set the bar for trying to find one even better. I don’t even have those benchmarks you mentioned, but I was planning on downloading them when I had time.

I was planning on keeping Batch X241M860; I had found my chip, lol. But people message me with stuff like “Name your price/Not kidding” etc., and I end up falling for it hook, line and sinker.

Anyways, my newest sample isn’t as good, but it’s still an okay performer, just slightly above average.
 

 

  • Thumb Up 2
  • Haha 2

13900KF


18 minutes ago, Mr. Fox said:

What they are not capturing is what the 5% and 2% represent. They would need to exclude people like me that are not going to purchase a new GPU at any price and take the percentage only from the people actually planning to purchase one. Maybe they are calculating it that way, but it's not clear. If you include people like me in the calculation, the numbers will be misleading. What they are also not capturing is how many will still pay that much to have a new GPU even though they do not feel the price is justified. I did that with the 3090 KPE. Was the price idiotic? Yes, it was ludicrous. Did I spend the money anyway? Yes, I did. NVIDIA knows this. They don't care whether people think the price represents value. They only care if people will spend the money anyway, in spite of their opinion that the price is unreasonable.

The numbers just show that Nvidia will lose gamers to the Red side if the 7900 series comes close to or beats the 4080. Nvidia has nothing to offer gamers at or below the $1000 mark for 4K gaming. Nvidia will have to target the 10% of the population that lack common sense, have too much money, or are just plain stupid. I can't see it another way. It could also be that Nvidia doesn't have enough 4080 cards to sell; the few they have will go to the groups already mentioned. Hence they still keep the 4080 at $1199 MSRP.

Once the 7900 is launched, I expect Nvidia will need to do something.

Yep, they can continue to throw out 4090s, but how many cards do they have on shelves to sell? Binning has to happen, and not all silicon can be used for the top SKU. And only a few buy the 4080, so...

  • Thumb Up 2



33 minutes ago, Mr. Fox said:

It is strange that everything seems kosher for me on the 13900K and when I was running the 12900K on the Strix D4 mobo everything was fine on Linux. It is either the Z690-E or 12900KS, or both.

I installed openSUSE last night with the acpi=off kernel argument, and fewer things are broken than before. I no longer get the long list of errors when Linux is loading, and neofetch reports the correct clock speeds. CPU-X reports clocks close to actual (100MHz below), but hardinfo still shows them way off (4100MHz instead of 5400MHz on the P-cores and 4300MHz on the E-cores).

 

It could also be the Linux applications themselves that the developers are not updating. Even if the Linux kernel provides the proper support, apps that are not updated may not interpret things correctly. This kind of thing is what holds Linux back from becoming a dominant force in the PC technology realm. Most people (me included) don't have the knowledge, desire, or time to compile code and fix broken Linux code.

Being "free" is both a strength and a major weakness. It is probably safe to assume that most experienced Linux developers do not own cutting-edge hardware, and they're only going to burn calories on hardware that matters to them.

 

Prior to making assumptions about the Linux developer base (remember, Linux basically runs the world in the datacentres, where systems often sport $10k+ CPUs), I would check what kernel the imperfect Linux install was running. If it's even 12 months old, the 12900KS wasn't out yet at the time.
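The kernel-version check suggested above takes only a few lines; a minimal sketch, assuming a standard Linux procfs/sysfs layout (the cpufreq paths are kernel-standard but may be absent on some systems, hence the tolerance for an empty glob):

```python
# Cross-check before blaming a tool: report the running kernel, then read the
# per-core frequencies the kernel itself exposes via sysfs.
import glob
import platform

# If this kernel predates the CPU's launch, misreported clocks/sensors are expected.
print("kernel:", platform.release())

# scaling_cur_freq is reported in kHz. This is the kernel's own view of the
# clocks, independent of how htop/CPU-X/hardinfo parse or scale it.
for path in sorted(glob.glob(
        "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq"))[:8]:
    core = path.split("/")[5]  # e.g. "cpu0"
    with open(path) as f:
        print(core, int(f.read()) // 1000, "MHz")
```

If the sysfs values look right while a monitoring app shows 4100MHz on a 5400MHz P-core, the app (not the kernel) is the likely culprit.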

 

14 minutes ago, Papusan said:

Here's a graph of how many people fit into your chart… Round it up and I think the number is closer to 10% of the population.

 

5% is happy to spend $1,100. 2% or less feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP.

[attached images]

 

$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents

EXCLUSIVE by techpowerup.com
 
Today, 12:22
The ideal price for the NVIDIA GeForce RTX 4080 "Ada" graphics card is around USD $700 to $800, according to results from a recent TechPowerUp Front-page poll surveying our readers. Our poll "How much would you pay for RTX 4080 at most?" received over 11,000 responses. At the number 1 spot with 22% of the vote is $800, closely followed by $700. Together, this range represents 44% of the voters. 14% of our readers think $600 is an ideal price, followed by "less than $400" at 13%. 9% think $500 seems fair, followed by 7% willing to spend as much as $900. 5% is happy to spend $1,100. 2% or less feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP. There's more to a majority finding sanity with the $700 to $800 price-range.

 

Well, if you poll the customers, they will certainly point to a discounted price as ideal. I'm not trying to excuse Nvidia's pricing here, just pointing out the reality of a situation where gaming will soon represent only around 25% of NVIDIA's business. Hopefully they will respond to pressure from AMD...

 

 

  • Thumb Up 1

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


18 minutes ago, tps3443 said:


I’m sorry, I’ve been extremely busy with work stress and testing CPUs in general. I’ve sold the processor X241M860 and it’s going out today. It was an incredible sample, and it has really set the bar for trying to find one even better. I don’t even have those benchmarks you mentioned, but I was planning on downloading them when I had time.

I was planning on keeping Batch X241M860; I had found my chip, lol. But people message me with stuff like “Name your price/Not kidding” etc., and I end up falling for it hook, line and sinker.

Anyways, my newest sample isn’t as good, but it’s still an okay performer, just slightly above average.
 

 

 

No worries. On my system Clear Linux is about 33% faster than Windows 11 in these CPU benchmarks. That's basically a major generational leap for "free". I was wondering how much of that gap can be bridged with extreme OC, including much faster RAM, and whether that would justify investment of effort in a custom WC loop. 
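A "33% faster" claim depends on which score is divided by which, so a tiny helper pins the definition down. The scores below are hypothetical placeholders, not the actual Blender/IndigoBench results being discussed:

```python
# Convert two throughput-style benchmark scores (higher = faster) into a
# percentage speedup, so "X% faster" has one unambiguous meaning.
def speedup_pct(score_new, score_old):
    """Percent improvement of score_new over score_old (higher score = faster)."""
    return 100 * (score_new - score_old) / score_old

# Hypothetical example: a score of 266 vs 200 is a 33% speedup.
print(speedup_pct(266, 200))  # -> 33.0
```

Framed this way, the question in the post is whether extreme OC and faster RAM can add a comparable percentage on Windows to close the gap.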



2 hours ago, tps3443 said:


I’m sorry, I’ve been extremely busy with work stress and testing CPUs in general. I’ve sold the processor X241M860 and it’s going out today. It was an incredible sample, and it has really set the bar for trying to find one even better. I don’t even have those benchmarks you mentioned, but I was planning on downloading them when I had time.

I was planning on keeping Batch X241M860; I had found my chip, lol. But people message me with stuff like “Name your price/Not kidding” etc., and I end up falling for it hook, line and sinker.

Anyways, my newest sample isn’t as good, but it’s still an okay performer, just slightly above average.
 

 

I was extremely envious of the silicon quality of both CPUs, but let's be honest: unless you are getting paid for setting world records by a sponsor that covers the costs, having a phenomenal chip is a stroke of luck that doesn't do a darned thing for you in the grand scheme of things. If someone was offering me 2 to 3 times what I paid for something I was not directly and measurably benefiting from, I would sell it as well. At the end of the day, I (we) realize no tangible benefit from having an average sample versus a superior sample. Ranking on a leaderboard means nothing tangible and contributes only to personal gratification and ego. There is some value in the personal satisfaction, but it's a hobby, not an occupation.

2 hours ago, Etern4l said:

Prior to making assumptions about the Linux developer base (remember, Linux basically runs the world in the datacentres, where systems often sport $10k+ CPUs), I would check what kernel the imperfect Linux install was running. If it's even 12 months old, the 12900KS wasn't out yet at the time.

Latest and older kernels were tested. That was also one of the reasons for trying different distros. The kernel that works correctly on my Z690 Dark and 13900K is older. 

 

I am not making any assumptions about Linux, only making comments in the context of consumer adoption. What happens in the business realm is relevant to Linux and the businesses that rely on it, but not to me and other consumers looking for a replacement for Windows. I was only speaking in terms of Linux being viewed as a viable replacement for Windows by consumers. As much as I loathe Windows 10 and 11, bugs aside, they generally work right, and I have enough experience with Windows to fend for myself. I don't have to compile source code to install software, or recompile OS code to fix issues, and I don't usually have to do strange things to make applications work correctly with Windows. I know that some people enjoy that. If I knew how, I might enjoy it. But I don't, and I don't want to burn any calories on it. To the best of my knowledge, that is how most PC owners, including enthusiasts, feel about it. I don't see that as a bad thing, just a difference in personal priorities.

 

When Linux works correctly for me, it is mostly a better option than Windows, and I really love Linux in general. When it doesn't work as intended, it really sucks. It's truly a love/hate relationship. I think it is better than Windows, except when it isn't, LOL.

  • Thumb Up 3

