NotebookTalk

Papusan

Member · Posts: 3,600 · Days Won: 162

Everything posted by Papusan

  1. Same up again.... Better silicon and higher board power than its AIC partners' cards, for around the same MSRP. Yep, the AIC partners will launch their so-called better-binned OC variants of the 4070, but those will come with a nice price premium on top. As for the VRAM overclock, it's pure luck of the draw. I just can't see the AIC partners making many of those entry-level cards at MSRP. What's the point of making cards when you struggle to compete and don't make a profit on top of that? Better to put the assembly line to work on the more expensive custom cards. https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-founders-edition/41.html The 4070 is a 1440p card. Even Nvidia points that out: https://www.nvidia.com/en-us/geforce/news/geforce-rtx-4070/ But it won't be a real 1440p card in 2024/25. Their real 1440p card is the 4080 @ 16GB at $1,199.
  2. The 3080 @ 10GB was a 4K card according to what Nvidia said. And where will it sit now? At 1440p, with the need to lower the bling settings 😎 The card is barely 2 years old. You just don't jump on new cards every 2nd year. So the 4070 will be a 1080p card 3 years down the road, maybe even before. My son's friends keep their cards nearly 4 years before they swap, or upgrade every 2nd generation. Not sure how many gamers upgrade every 2nd year, aka with every GPU launch. The awful price hikes for newer-gen cards don't make it easier for gamers. Btw, if you also enjoy older games, then you'll be happy with Nvidia's changes for the new-gen low/mid-end cards.... Is less more? 😎 https://www.pcmag.com/reviews/nvidia-geforce-rtx-4070-founders-edition Probably why Nvidia couldn't charge more than $599 for the 4070 to match the 4070 Ti and 4080 price points. They know 12GB of VRAM will be too little for future games. So now you get a brand-new card that will perform worse with your older (favorite) games, and you may see newer games behave the same way, but for another reason 😅

Now, the question becomes: why is NVIDIA pricing the RTX 4070 lower than what it had considered at the start? Tom's insider claimed that NVIDIA is realizing that the RTX 4070's VRAM may be insufficient. VRAM capacity has become a hot topic in recent days following the launch of The Last of Us Part 1 on PC. The game reportedly consumes more than 12 GB of VRAM at 1080p/Ultra, making GPUs like the RTX 3070 with 8GB of VRAM, which are otherwise sufficiently powerful for 1080p, perform horribly in some instances.

Yep, a measly $820 for a custom air-cooled 4070. Nice. I wonder what price Asus will charge for their best 4070 OC card. https://videocardz.com/press-release/colorful-launches-geforce-rtx-4070-igame-and-battleax-series Edit. I'm sure EVGA saw this coming... "The manufacturers will hardly make big profits, at least with the MSRP cards; that much I can reveal at least.
I’m not allowed to do more than that because the calculations are all under NDA. Unfortunately, this also diminishes the hope that the upcoming AMD cards will be significantly cheaper." (igorslab) They are cutting costs down to the bone. See, for example, here: <because especially the cheap voltage converters are disappointing> <measly 12GB vram>. And yet they still have problems making any profit at $599. Who is to blame for this? Where do all the profits go? TSMC? Or Nvidia, who don't tell the whole truth about the costs and calculations? Does this mean the new GPUs in reality should have cost more? 😁 @electrosoft
  3. Not very active on ItalianExtremeModders' videos either. The last one was 3 months ago. Yep, no one should buy the 4070 Ti or the 4080... But 12GB of VRAM stops you from using your 4070 at 4K. Nvidia GeForce RTX 4070 review: Highly efficient 1440p gaming https://www.pcworld.com/article/1781139/nvidia-geforce-rtx-4070-review.html "As I said right up top, the $599 GeForce RTX 4070 is the only current Nvidia 40-series graphics card potentially worth your money aside from the flagship RTX 4090." Spoiler: the RTX 4070 is overpriced and boring. And Intel gets payback for working hard to make phone cores work in their desktop chips. Yep, the flood of baby cores was very important for Intel. Intel will likely make your next phone's chip https://www.pcworld.com/article/1784223/intel-will-likely-make-your-next-phones-chip.html
  4. Nvidia is stingy and/or greedy, LOOL. But the 4070 won't be the worst card out of Nvidia's factory. Nvidia still thinks 8GB of VRAM is more than enough, until you are forced to buy a new card at the end of next year. But huhh... You don't need to rush out and buy the latest and greatest games coming next summer/fall. Just play your old games 😎 And you save money on top 👍 Can't beat being stingy, HaHa. AMD Goads Nvidia Over Stingy VRAM Ahead of RTX 4070 Launch: "Games are using more VRAM, and AMD suggests Nvidia isn't offering enough." The real joke is still not released: the 4060 @ 8GB VRAM. I wonder what Hardware Unboxed will say about a brand-new 8GB low/mid mainstream Nvidia graphics card in 2023. Nvidia GeForce RTX 4060 release date speculation
  5. Yep, inflation is to blame for the 10% price hike. But what about the 30% performance increase over the 2-year-old 3070? Isn't that within what to expect from a new, smaller node and architecture?
  6. 30% more performance for $100 more. Great deal, @electrosoft @Talon? And fake frames are the golden future and feature from Nvidia 🙂 Maybe Nvidia could sell a feature upgrade pack (call it a "Moment", like Microsoft does) for older graphics cards at half the price? Let's say DLSS 4 for $100 for 4000-series graphics cards, or $50 for the 3000 series? Wouldn't that be nice? 🙂 NVIDIA claims GeForce RTX 4070 and RTX 3080 offer equal DLSS performance without Frame Generation videocardz.com "Officially, the card is targeting a gaming experience at 1440p with 100 FPS. What is important is that this number assumes ray tracing and DLSS 3 are enabled. NVIDIA is now going big on using DLSS 3 in their marketing. There is hardly any mention of performance claims without DLSS or ray tracing."
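The value math above can be sketched quickly: price up roughly 20%, performance up roughly 30%, so performance per dollar barely moves. A minimal sketch, assuming the commonly cited $499 previous-gen MSRP against the $599 4070 and the ~30% uplift mentioned in the post (illustrative numbers, not benchmarks):

```python
def perf_per_dollar_gain(old_price, new_price, perf_uplift):
    """Fractional change in performance-per-dollar between two GPU generations.

    perf_uplift is the fractional performance gain of the new card over the
    old one (e.g. 0.30 for +30%). The old card's performance is normalized
    to 1.0, so only the ratio matters.
    """
    old_ppd = 1.0 / old_price
    new_ppd = (1.0 + perf_uplift) / new_price
    return new_ppd / old_ppd - 1.0

# ~30% more performance for ~20% more money ($499 -> $599)
gain = perf_per_dollar_gain(499, 599, 0.30)
print(f"perf/$ change: {gain:+.1%}")  # -> perf/$ change: +8.3%
```

At roughly +8%, the perf-per-dollar improvement is small by generational standards, which is exactly the complaint being made here.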
  7. More from the old stuff for bro @electrosoft 😎
GTX 750 Ti
https://hwbot.org/submission/5248032_papusan_3dmark2001_se_geforce_gtx_750_ti_164463_marks
https://hwbot.org/submission/5248042_papusan_aquamark_geforce_gtx_750_ti_567644_marks
https://hwbot.org/submission/5248058_papusan_3dmark11___performance_geforce_gtx_750_ti_8111_marks?recalculate=true
https://hwbot.org/submission/5248047_papusan_3dmark03_geforce_gtx_750_ti_96675_marks
RTX 3070
https://hwbot.org/submission/5248040_papusan_aquamark_geforce_rtx_3070_581528_marks
  8. Here's another one. Method 5: Disable Automatic Startup Repair
  9. Thanks bro Fox. I will keep that in mind 🙂 I already talked about this in my older post here. If you see that most of the cores have higher temps than a couple that run a lot colder on 13th gen, then you may be able to fix it with a re-paste. But this time, Dell also relies on their fantastic TCC Offset feature. Give a huge thanks to Dell's main thermal engineer Travis North for making trash to cut costs. Thermal and System Design with Travis North & Mark Gallina. Conclusion: yet another stupid move from Dell (3:15).... Me, I have the answers for all this....
  10. @Mr. Fox I found a similar external PCIe GPU fan bracket, but from Silverstone.... SilverStone FDP02 External Fan Adapter. If you decide to order one, it would be nice if you ordered two; I'm not sure I'd be able to get one shipped here. https://www.silverstonetek.com.tw/en/product/info/fans/fdp02/ Maybe test with Win 10. Would be nice to see the new and shiny OS from Redmond being beaten by the older, touch-friendly smartphone OS.
  11. Bro @Mr. Fox @Ashtrix Why bother with Win 11? And if you want to tweak your OS, why bother tweaking Win 11 for gaming when you can tweak Win 10 for the same task? On top of that: fewer bugs. Can't be better than this for gamers. Just forget Win 11 and move on.
  12. Yep, my whole point... Totally wasted if you don't intend to overclock or just want a gaming chip. They won't max out the cooling capacity of a decent AIO in gaming. No game will eat 300W at stock clocks. Neither will it hit the PL2 limit (or PL1, if gamers follow Intel's current guidelines: PL1 and PL2 at the same value). But if I had to choose... I would go with the KS for my custom loop, because I mostly overclock and use it for my hobby. And if I got an SP117 over an SP104 chip, I'm quite sure I could squeeze out a bit more, if both chips had about the same leakage but a different bin.
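On the PL1/PL2 point: on Intel platforms these limits live in MSR_PKG_POWER_LIMIT (0x610), scaled by the power unit from MSR_RAPL_POWER_UNIT (0x606). A minimal decoding sketch, assuming the bit layout documented in Intel's SDM for recent client parts; the raw value below is constructed for illustration, not read from real hardware:

```python
def decode_power_limits(pkg_power_limit, rapl_power_unit):
    """Decode PL1/PL2 in watts from raw MSR values.

    MSR_PKG_POWER_LIMIT (0x610): bits 14:0 = PL1, bits 46:32 = PL2,
    both expressed in multiples of the power unit.
    MSR_RAPL_POWER_UNIT (0x606): bits 3:0 = n, power unit = 1/2**n watts.
    """
    unit = 1.0 / (1 << (rapl_power_unit & 0xF))
    pl1 = (pkg_power_limit & 0x7FFF) * unit
    pl2 = ((pkg_power_limit >> 32) & 0x7FFF) * unit
    return pl1, pl2

# Example: power unit = 1/8 W (n = 3), PL1 = PL2 = 253 W, i.e. the
# "PL1 equals PL2" style of configuration mentioned above: 253 * 8 = 2024
raw = (2024 << 32) | 2024
print(decode_power_limits(raw, 0x3))  # (253.0, 253.0)
```

Reading the real registers requires root access and the `msr` kernel module on Linux (or a tool like ThrottleStop on Windows); the decoding itself is just the fixed-point math shown here.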
  13. I see this, thanks. And why buy the KS if you only want to run it stock? There's no point paying a premium to go from 5.5 to 5.6 GHz. Just buy the cheaper 13700K for gaming with the cheapest Z690 board you can find (gamers). Or do the same with the 13900K. In that case, as you point out, the KS is a waste. Regarding the two chips... I'm not talking about the price difference, or which one to pick. My point is more that you can get more out of the best-binned chips with cooling below chiller level (custom loop or AIO), for the same price. RTX 4070 & GA103 Truth Leak: AIBs are MAD, but they should be FURIOUS… Who is next to follow EVGA? Nvidia is playing with fire (if they still want their AIC partners to sell their GPUs). But if they intend to take over all GPU sales themselves, they are headed in the right direction.
  14. Did you test this with custom cooling/an AIO, or just your chiller? If it was the chiller, I can see this being about correct, but not with an AIO that will push the chips above +90C and into stability problems. Going with an SP117 will clearly help if you are on weaker cooling (if leakage is about the same for both chips).
  15. https://hwbot.org/submission/5245045_papusan_3dmark03_geforce_rtx_3070_342249_marks
https://hwbot.org/submission/5245047_papusan_3dmark05_geforce_rtx_3070_101462_marks?recalculate=true
Edit. Doubled up in 3DMark Vantage with the RTX 3070 🙂
https://hwbot.org/submission/5245056_papusan_3dmark_vantage___performance_geforce_rtx_3070_142981_marks?recalculate=true
https://hwbot.org/submission/5245058_papusan_3dmark_vantage___extreme_geforce_rtx_3070_94356_marks?recalculate=true
  16. You're welcome, my friend. It's just sad that you need to hunt down info when the manufacturer could offer a fix. Or it could be that they weren't aware it was a major problem when you returned the block. They should have made a press release about this on their own web page, and posted a short video on their own YouTube channel. But I expect they will try to avoid bad press and save some costs. Yep, today's tech and QC suck.
  17. @Mr. Fox Here is the culprit for your bad experience with the EK Direct Die kit. Their QC is shrinking, or rather stinking, for both GPU and CPU blocks. And I expect this is why some reviewers (paid shills) got OK results from the Direct Die kit while others didn't (variable quality). But do they post or make a press release about it? Nope. That's bad from a big company. And it's bad that no big tech YouTube channel talks about it. Date of the post: April 3, 2023. I wonder how many people have bought the EK kit, accepted the thermal results, and moved on. They will never know they were screwed by EK's awful QC.
  18. Asus boards' SP rating is prone to failure if the socket mounting pressure is wrong. If the pressure is correct, the SP rating works as intended; i.e. you get the memory speed you'd expect from the board. The uneven motherboard memory trace quality is awful on the Z690 Asus Apex boards. This improved with the Z790, but even there you aren't guaranteed a golden "Easter" egg in their expensive hardware lottery.
  19. The only manufacturer I know of that's honest about this is Gigabyte. They always list new revision numbers. And bet on it, this will always happen: the first batch of hardware from Gigabyte is always flawed, LOOL. I doubt Asus changed it physically; they just delivered better batches of boards later in the cycle. The sad part... All the firmware and feature optimization done by the engineers/in-house overclockers is done on good boards. Here is one more that is honest. Maybe even on the high side, but 3 stars is more correct than the inflated 4.5/5 stars many of the reviewers out there have given this half-baked gaming CPU from AMD. Yep, some can complain about which GPU this CPU is paired with in the review, but the 3 stars is still valid. And the fanboys in the comment section are damn angry, LOOL. $450 for an 8-core CPU is so "Intel" of a few years ago, when they had no competition. But charging these prices in 2023, when 8 cores is the bare minimum, is disgusting. There is a reason the engineers crippled the CPU and set PROCHOT to a max temp of 92C. Try to see if you can change it back to the native 100C with the offset in TS Options. You may smell some electronic odor, but you already paid for the warranty when you bought the laptop 🙂 Nasty Asus. They crippled Intel's specs to use cheaper cooling and power delivery. Yep, very nasty. Or better said....
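The TCC Offset trick mentioned above (in TS Options, or in Dell's firmware) is simple arithmetic: the CPU starts throttling at Tjmax minus the programmed offset. A minimal sketch, assuming a 100C-Tjmax part, so the 92C PROCHOT cap described above corresponds to an offset of 8; the 6-bit field size matches MSR_TEMPERATURE_TARGET on most recent Intel parts:

```python
def throttle_temp(tjmax_c, tcc_offset_c):
    """Effective PROCHOT/throttle temperature with a TCC Offset applied.

    The CPU begins throttling at (Tjmax - offset), so an offset of 8 on a
    100C part caps it at 92C. An offset of 0 restores the native limit.
    """
    if not 0 <= tcc_offset_c <= 63:
        raise ValueError("TCC offset is a 6-bit field (0-63)")
    return tjmax_c - tcc_offset_c

print(throttle_temp(100, 8))   # 92, the vendor-capped limit described above
print(throttle_temp(100, 0))   # 100, the native limit
```

Which is why zeroing the offset in ThrottleStop "restores" the native 100C limit: nothing about the silicon changes, only the guard band the vendor programmed in.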
  20. Don't forget to add the links in your sig, bro Ryan. Same as me, just add the links 😎 Doubled up in HWBOT Unigine Heaven with the 3070 😎 https://hwbot.org/submission/5244260_papusan_unigine_heaven___basic_geforce_rtx_3070_14108.6_dx9_marks?recalculate=true https://hwbot.org/submission/5244267_papusan_unigine_heaven___xtreme_geforce_rtx_3070_11732.29_dx11_marks?recalculate=true And after nearly 5 months, I finally opened the package with the new monitor (ASUS ROG Strix XG27AQ 27" monitor), LOOL. PS: this is a weird combination, an i5-11600KF paired with a 4090. This just has to be a real CPU bottleneck, even at 4K. Why not just buy the cheapest AMD board and a 5800X3D? Or an equally cheap 13600K and Z690? https://pc-builds.com/bottleneck-calculator/result/16J1ci/2g/cyberpunk-2077/3840x2160/ https://hwbot.org/submission/5243890_mcflipplenipps_3dmark___time_spy_extreme_geforce_rtx_4090_13779_marks
  21. Congrats. And have fun 🙂 Please post results. Put those two links below in your sig so we, and you, can compare and see if it performs as intended. Intel Core i3-1005G1 Laptop Processor (Ice Lake) https://hwbot.org/hardware/processor/core_i3_1005g1/ Happy benching, bro Ryan 🙂 Edit. Forgot.... To you all 🙂
  22. Not so sure about that. Look at this mess. Reviewers tried to overclock the 8-core gaming chip via BCLK. 90°C despite a 480 mm radiator and the max allowed stock voltage. Imagine this kid's toy with a 300W load and the unlocked voltage needed to make that possible. These X3D chips would boil to steam within a couple of seconds, and you would smell a nasty electronic odor, if the chip hadn't already died outright when you dialed in the needed voltage. We're talking about 120-150W of power, and even a 480mm radiator isn't enough to cool it. This is not the tech I want.
  23. The $290 13600K offers 6-10% more application performance. And the 7800X3D is locked. You can still get +5% more performance from the 13600K with undervolting/overclocking. Undervolting a 7800X3D won't gain you more clock speed. You are stuck with what you already have, at $100 USD more. For me, that's very bad value. And not a fun chip to play with (I'm not talking about tinkering for gaming).
  24. I'm not very well versed in the Ryzen platform. But both Intel and AMD chips are meant to max out boost close to their thermal limits. You can use Dell's way of crippling the chips: use TCC Offset. You don't lose much at stock clocks if you cap it at 90C. An 8-core Ryzen chip with a 360mm AIO and max fans.
  25. The 13600K for $300 USD and the cheapest platform you can find. All the way. The X3D chips can't be both bird and fish, just one of them (an expensive gaming chip). In short... the 7800X3D should be priced between a 13600K and a 13700K, not a cent above. You can undervolt the Ryzen chips, but they will try to reach max boost all the time before you get a significant power reduction. Edit... Then you have this.... The usual from AMD (buggy software, drivers and firmware). It just works, LOOL:

Specifically with X3D I ran into another issue. After swapping from the 7950X3D to the 7800X3D I assumed that it would just work after installing the new chipset drivers. Nope.. while benchmarks like Cinebench showed the proper numbers, games were running slower than expected. Not "stutter" slow, still very fast, but "only" at levels comparable to the 7950X3D; the 7800X3D must be faster though. After some digging I found out that several CPU cores get parked during gaming, for no apparent reason. Uninstalling the AMD chipset drivers, reinstalling the newest ones, resetting power settings, copying the power plan from another PC, all made no difference; I was missing around 10% in gaming performance. In their reviewer's guide AMD recommends starting with a fresh Windows installation when switching from the 7950X3D to the 7800X3D: ".. may encounter low scores when switching directly from the 7950X3D to the 7800X3D without reinstalling a fresh version of Windows OS. This is likely a result of the AMD PPM provisioning file driver still being applied to the 7800X3D processor, which was not its intended use. This performance issue is not a typical end user scenario and is only a result of switching CPUs without installing a fresh version of Windows OS." So I invested a few hours to reinstall Windows and set up my benchmarks, and oh surprise, performance numbers were in line with expectations.
While I can understand that such things might happen, especially with a new release, it's completely unacceptable to ship a driver package that can't be fully uninstalled, or that changes the OS in a permanent way so that a fresh installation is required. I hope AMD can figure out what's going on, and that they will provide updated drivers and proper guidance on how to detect and fix the problem.