NotebookTalk

Everything posted by MyPC8MyBrain

  1. most of the above will likely also lack the education to understand they can do that!
  2. somewhere here I saw a CB23 score around 32k, if I recall correctly; the best I could get with the 12950HX on CB23 was just below 26k (25,997).
  3. maybe Intel set a new, lower thermal threshold? (usually it's 100°C). How much wattage is his CPU actually drawing during the bench? (he can confirm with HWiNFO32). Do you know the wattage of the power brick being used? If you follow the link above and sort by CPU score, notice they all seem to show relatively low scores for 13th-gen CPUs.
  4. I was wondering if this is your unit. I just looked it up: it's 157W for that CPU SKU, same as the 12950HX, only with 8 more efficiency cores and higher performance-core clocks in the same power envelope. From the graph, it seems that CPU is struggling with heat dissipation; it plummets to below 3GHz, and in the CPU-specific test it completely takes a dive at the most critical stage (indicative of heat buildup). The PassMark CPU benchmark is showing the new mobile 13xxx parts with a 50% performance increase over the 12950HX.
  5. hence my initial confusion when I saw I was in first place for CPU on that list; I thought it had to be a bug. With that sorted out, I am indeed surprised; I was expecting a much higher score range from the newer 13xxx CPU line. Even my highest 12950HX score is not close to the highest in its class (though I believe this bin can get there). My mobile system scored that with a 240W power supply, which doesn't cover both TDP and TGP running at full throttle together; there are variants out there with a 330W power supply that can push further. At some point I removed my CPU PL4 limit, and on another occasion set it to 330W. I also bought a new 330W Dell power supply and had a power meter on the wall outlet; I saw 330W being pulled, but bench results remained the same with both TDP and TGP satisfied.
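The power-budget point above can be sketched with back-of-envelope arithmetic. The 157W and 150W figures are the CPU and GPU limits mentioned in these posts; the ~20W platform overhead (display, SSD, VRM losses) is an assumed illustrative value, not a measured one.

```python
# Rough check of whether a power brick can cover CPU power limit + GPU TGP
# at full throttle. The 20W platform overhead is an assumed value.
CPU_PL_W = 157   # 12950HX power limit (figure from the posts above)
GPU_TGP_W = 150  # mobile 3080 Ti at its 150W configuration

def brick_covers_full_load(brick_w, overhead_w=20):
    """True if the brick can feed CPU + GPU + platform overhead at once."""
    return brick_w >= CPU_PL_W + GPU_TGP_W + overhead_w

print(brick_covers_full_load(240))  # False: 240W falls short of the ~327W demand
print(brick_covers_full_load(330))  # True: 330W just covers it
```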
  6. not to harp on what I was trying to explain earlier about what these benches represent, but from my confused example above we all learn together that, despite my CPU scoring first place on that 4080 bench list, it doesn't make the 12950HX a better/faster CPU than a 13900HX based on my bench results. This highlights clearly the point I was trying to make earlier with @ssj92.
  7. so it's pulling my highest-ever CPU score for the comparison, not my highest overall score? Edit: I guess it does, huh. Found it here: https://www.3dmark.com/spy/33634086 (didn't save a screenshot; I was after a higher dGPU score I couldn't get at the time). The closest bench I saved a screenshot from is this: https://www.3dmark.com/spy/33539302
  8. my score number actually changes too; look at my overall score going from 13,489 to 12,177 when sorted by CPU. It's not just my rank that changes, which is also incorrect btw; my CPU score is not 16,555 (on this bin I believe I can get close to that with LM TIM; on previous bins I saw 15,800+ with LM and even with just a good-quality paste replacement). The GPU score is low because at the time I didn't know how to get the GPU to use its full 150W yet 😊
  9. @1610ftw interesting... from your link, if you sort by CPU score, my 12950HX score is the highest. This has to be a bug or something, as my score changes completely; sorting by overall score shows the right score for my unit (74th out of 34 total 😄).
  10. technically speaking, DLSS 3 is a driver implementation NVIDIA is intentionally not adding to the 30xx line atm. If I had to guess, it's an executive and marketing decision designed to promote the 40xx line; there is nothing besides that preventing the 30xx line from getting the same DLSS 3 treatment. From what I read, with the 40xx line NVIDIA cut more corners, resulting in a smaller die than the 30xx gen had, and a smaller die translates to less cooling efficiency. I am curious to learn how efficient the 40xx line is compared to the 30xx line. I believe this is an Intel-imposed CPU lane limitation, with the exception of the HX line.
  11. I was looking for that link the other day; how did you find it? Did you have to construct the query string yourself to load these results?
  12. yeah, I am not looking to upgrade anytime soon; I am very happy with the 3080 atm. In the future, if the 40xx shows significant gains or efficiency, I do have the option to just swap the 3080 out for a 40xx, as it is a standalone component in the chassis.
  13. from my view, these are benches designed to test gaming-feature performance, not overall system performance. My objection is when an end user takes a single statistic and ignores everything else. The 4080 is not 40% faster overall than a 3080 Ti; it scored thirty-some percent higher in one specific gaming bench. That's an indication that in some games there will be fps gains, not that the 4080 is better across the board by whatever percentage one end user decided on. If I had to guess, I'd say the 40xx series is somewhere around 15-20% faster than the previous gen overall. I agree, and I wasn't arguing that it's not, or trying to make it sound like the 3080 is king; just putting things in proportion. It matters to me because other people read these conversations later on and get the wrong idea unless it's clarified.
  14. regardless of whether the CPU is being benched directly, it is always active; no matter what happens, it is the heart of your system. Though the CPU is not benched directly, it does play a role through its core count, bus lanes, speed, etc.; everything the GPU calculates has to arrive through a pipe from the CPU at some point. To claim that the CPU is not part of the bench is incorrect; it is always part of the bench, just not heavily leaned on during a GPU bench, yet still very much active 100% of the time. If you watched one of the posts from last week about being "CPU bound," you'd understand that a statement like that doesn't have a leg to stand on. And again, ssj's claim is that the 4080 is 38% faster overall than a 3080; that may be true for one specific bench improvement, not for the entirety of the GPU. Side note: for me, DLSS is just trickery and has no merit being benched as a performance indicator.
  15. thanks, you have a keen eye; you're right, I don't have 30 years of hands-on professional experience 😊 You can do that only if the CPU and the rest of the hardware are the same! You are the one spreading false info, not me! Start with the fact that the system compared is not a Dell system; Dell has not released any 40xx models to the public yet. Next, it is just one specific bench; that doesn't make the GPU 38% faster, or whatever percent you deem hype-worthy enough to type, true or even close to reality. In many games the 4090 is showing twice the frames with DLSS 3; does that make it 200% better performance than the 3080 Ti? Btw, my current high score ranks 43rd place out of all 4080 laptops benched on 3DMark 😄 (and some of these "laptop" scores are with a full external 4080)
  16. your math is still wrong; you are isolating the GPU sub-score while that GPU is also running with a newer CPU, so you can't base the comparison on that sub-score alone. You should look at the whole bench score as one; comparing a newer component running alongside other newer components is not a performance indicator for that newer component on its own.
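One way to see why an overall result can't be credited to the GPU alone: UL's 3DMark technical guide describes the Time Spy overall score as a weighted harmonic mean of the Graphics and CPU sub-scores. The 0.85/0.15 weights below are taken from that guide; treat them as an assumption here, not a verified constant.

```python
# Sketch: Time Spy-style overall score as a weighted harmonic mean of
# sub-scores (weights assumed from UL's technical guide).
def time_spy_overall(graphics_score, cpu_score):
    """Combine Graphics and CPU sub-scores into one overall number."""
    return 1.0 / (0.85 / graphics_score + 0.15 / cpu_score)

# Same GPU sub-score, two different CPUs: the overall score still moves,
# so the overall delta is never a pure GPU delta.
print(round(time_spy_overall(12000, 10000)))
print(round(time_spy_overall(12000, 16000)))
```

The point of the sketch: swapping in a faster CPU raises the combined number even when the graphics sub-score is held fixed, which is exactly why sub-scores and overall scores can't be used interchangeably.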
  17. check your math; not even close to 39% slower! At best, the GPU results shown are 25% slower than a 3080 Ti @150W, while my previous-gen CPU is 25% faster. Note that the same 3080 Ti @175W's highest score is 14,832, which shrinks these perceived percentages further, to only about a 15% overall performance gap versus the newer, higher-core-count CPU and a mobile 4080. (I tested while still on stock Dell paste, mind you, with no fan control besides the Windows power plan; I can easily nudge the top score for my hardware spec way further, I just choose not to, I'm good where I am atm.) You should ask yourself: is that really all we get for our money from a newer-gen CPU and a newer-gen GPU? Only 15% together? Also ask: why wouldn't anyone want to save a few bucks in a few weeks, buying previous-gen hardware that performs only 15% slower at dirt-cheap prices?
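A lot of these percentage disputes come down to which score is used as the baseline: "25% slower" and "25% faster" are not mirror images. A quick sketch with round illustrative numbers (not anyone's actual bench results):

```python
def percent_slower(baseline, other):
    """How much slower `other` is than `baseline`, as a % of baseline."""
    return (baseline - other) / baseline * 100

def percent_faster(baseline, other):
    """How much faster `other` is than `baseline`, as a % of baseline."""
    return (other - baseline) / baseline * 100

# 12000 is 25% slower than 16000, yet 16000 is ~33% faster than 12000,
# because the baseline of the percentage changed.
print(percent_slower(16000, 12000))   # 25.0
print(percent_faster(12000, 16000))   # ~33.3
```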
  18. a 9900KS paired with a 2080 Ti is not ancient, it's just not the newest; it is still a powerful desktop that can run for hours at 5GHz with no issues, and yet it is way behind a mobile CPU in my test. My desktop system can still be upgraded today; when it was first built, maybe 3 years ago, I didn't spare a dime, and it was packed with top-of-the-line components of the time.
  19. I am using an HX CPU in a mobile chassis. I ran a test against my desktop 9900KS and was floored by the results: completion took 25 min on the 9900KS, while the same test on the 12950HX in the 7770 completed in 15 min. Granted, there are 3 CPU generations between them, but it's a powerful desktop CPU with better cooling, etc.
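The two run times quoted above work out to the following speedup (times taken straight from the post; the conversion is just arithmetic):

```python
# 9900KS desktop vs 12950HX in the 7770, same workload, times from the post.
desktop_min = 25
mobile_min = 15

speedup = desktop_min / mobile_min                     # how many times faster
time_saved_pct = (1 - mobile_min / desktop_min) * 100  # wall-time reduction

print(f"{speedup:.2f}x faster, {time_saved_pct:.0f}% less wall time")
```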
  20. I saw the benchmark result for the 4090 with the new-gen CPU architecture; it's pretty impressive, but I'm still not convinced, for the simple reason that it is not showing us the real picture or the actual gains; it is showing an entire generation as a whole performing better. We don't know if it's the CPU or the GPU producing these high scores, and we don't know if these benchmarks are aware of DLSS 3's artificially injected frames; that is not real performance but trickery performance. NVIDIA didn't pair the 4090 with a last-gen CPU that's already been benched with a 3090 and show us the delta of just the 4090, with and without DLSS 3. Why is that, if they are so proud of their achievements?
  21. food prices will eventually go back down; GPU prices never go down, it's a one-way NVIDIA street. They are just testing how much they can get away with.
  22. you can't, and you also can't build your own CPU, but you don't see Intel charging an arm and a leg for it over the years. As for not buying into the hype: it is so simple for these companies to raise hype on demand with their capital in our social-media world; if they knew they had to deliver value instead of hype, we wouldn't see these greedy, eye-gouging prices. I didn't imply or call anyone ignorant! There is a huge difference between being ignorant and not being educated or knowledgeable in certain areas or fields; I am not an accountant, but that doesn't imply I am ignorant or don't know how to do math.
  23. it's the greedy sales tactics and shady marketing behind it. When exactly did it happen that a GPU is 4x more expensive than a CPU? Today it really doesn't matter what we think; it's the young kids with zero understanding or technical knowledge that they are aiming at and marketing this for. Those in the know will be fine navigating these technical waters; those who don't are their bread and butter.
  24. I completely agree; I also feel the prices are outrageous and out of proportion. For example, at a time when desktop RTX 4090s are available and selling at a ballpark MSRP of $1,500, I had to cough up over $2,200 for a mobile 3080 Ti, when I could have bought a 4090 with an external enclosure and had change left over for the price I paid for a 3080 Ti in a mobile chassis (though in due time it should be upgradeable, for a hefty fee!). I just wanted base components that would be sufficient for the next few years, and I finally went for it; I was on the fence the past few years and eventually had to make the move, almost 6 months ago, as my old faithful M3800 was on its last legs and I had to upgrade my 10-year-old laptop. It ended up that I got one bad unit after another, for a total of 7 units swapped across almost 6-7 months; it was just last month that my 30-day evaluation of this final unit completed. I love the unit and need a mobile station; I also have a strong desktop (though not as powerful as the mobile 7770 is atm). To clarify, I posted my price and spec for comparison; though outrageous, it is in the same ballpark for top-end mobile components. The latest gen might be 10% more expensive initially, but it's still not cheap either way. Edit: a lot of the fuss with the new 40xx series seems to be related to DLSS 3 and based on frames gained in specific games. I don't game, though I might occasionally play a bit; I use the dGPU for code processing and other workloads that are neither games nor DLSS 3 related. I am not yet convinced the gap is that big outside the promoted DLSS 3 to warrant these new or old high prices for these components. Also, we should note that these gains are shown with a new-gen mobile CPU, so the true dGPU gains are obscured and somewhat artificial.