Everything posted by Mr. Fox
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
I am going to have to fire up the chiller tomorrow and play with the power slider to see if I can rein in some of that, because pulling 1700W from the wall is not sustainable. It will trip circuit breakers (and I have this on a 20A circuit) and it exhausts the capacity of my 1500W UPS. If that does not work I will just go back to the Matrix vBIOS and use the EVC2. I'm not gonna lie though... I just LOVED seeing that 1.150V default. For now I just flipped my GPU over to using the stock Zotac 600W vBIOS. I think letting it run wild and free on the XOC vBIOS probably isn't a smart thing to do. It might be safe, but I don't want to just assume that and hope it turns out OK. Tempting fate to such an extreme degree (1700W from the wall, probably 1300W from the GPU) with the petite arson PCIe power cable should put the fear of God in all of us who possess an ordinary amount of common sense. Here is something interesting. I use a GPU overclocking program in Linux called LACT. It shows both core and hotspot temperature readings on the 5090. So, maybe the register is actually there and the NVIDIOTS gave Windoze software developers marching orders not to expose it. -?- I think they rank right up there with bad actors like crApple when it comes to being among the kings of command and control dictatorships. I received my Thermal Grizzly GPU die gaskets for liquid metal, so I really need to make time for that. The package comes with 3 for the 4090 and 3 for the 5090, so I can do both GPUs. I am expecting to get an offer on the job I want most on Monday or Tuesday, and I am sure that I am initially not going to have much discretionary time to do things until I get past my orientation, which usually takes a bit more time in c-suite positions. -
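For what it's worth, a quick back-of-the-envelope check on those wall numbers. The PSU efficiency and the 120V mains figure are my assumptions for illustration, not measurements:

```python
# Rough sanity check on the 1700W-from-the-wall figure quoted above.
wall_w = 1700                    # measured at the wall
psu_efficiency = 0.90            # assumed ~90% for a good PSU at this load
dc_w = wall_w * psu_efficiency   # power actually delivered to the components
gpu_w = 1300                     # rough GPU draw estimated above
rest_w = dc_w - gpu_w            # what's left for CPU, board, pump, fans

amps = wall_w / 120              # current draw on a 120V circuit
```

That works out to roughly 1530W on the DC side (so ~230W headroom past the GPU) and a bit over 14A of steady draw. 14A is under a 20A breaker's rating, but transient spikes plus the usual 80% continuous-load guideline (16A on a 20A circuit) explain why the breaker and the 1500W UPS both complain.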
Yup, we can certainly hope they will bleed a lot. It would be great to see NVIDIA, the DDR foundries, and all of the manufacturers of end-user components that use those parts literally lose their butts and end up with surplus stock they need to dump as fast and cheap as possible just to keep the lights on. AI is like crypto. Value is assigned to something that nobody can see, feel, smell or touch because it does not actually exist. It is an idea, and it is based on beliefs and imaginations, not an asset. Only stupid people invest capital into popular ideas. That's not to say that ideas are bad and worth nothing. But overvalued ideas don't hold the interest of smart investors, only the stupid ones.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Process Lasso is definitely the way to go if you have a dual CCD processor with 3D V-Cache. No need for Micro$lop XBOX feces bloatware and the extra chipset driver crap. Just select the EXE and choose the affinity and BOOM, every time you launch that game it uses the CCD with the cache. When you close the game it is back to using all cores and threads. Easy, peasy... no hokey pokey nonsense with flakey rubbish from the Redmond Reprobates. -
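Process Lasso does this by setting the process affinity mask per EXE. On Linux the same trick can be sketched with `os.sched_setaffinity`; note that mapping "CCD0" to the first half of the logical CPUs is an assumption baked into this sketch (true on a stock 9950X3D with SMT, where CCD0 is logical CPUs 0-15), not something the code detects:

```python
import os

def pin_to_first_ccd(pid=0):
    """Pin a process to the first half of its visible logical CPUs,
    approximating "CCD0 only" on a dual-CCD part. pid=0 means the
    calling process. Hypothetical helper for illustration (Linux only)."""
    cpus = sorted(os.sched_getaffinity(pid))
    target = set(cpus[: max(1, len(cpus) // 2)])
    os.sched_setaffinity(pid, target)  # scheduler now keeps the process here
    return target
```

Launching a game through a wrapper that calls this (or simply `taskset -c 0-15 ./game`) has the same effect as Process Lasso's per-EXE affinity rule; the affinity dies with the process, so everything else goes back to using all cores and threads.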
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
OK. New problem. My CyberPower 1650W UPS does NOT like this XOC vBIOS, LOL. SCREAMING BLOODY MURDER. I will be using the Matrix vBIOS on my "quiet" vBIOS position for normal daily driver usage and gaming. I launched Cyberpunk 2077 and all the sirens began blaring on my shunted 5090. I do not even need the EVC2 now because the XOC vBIOS gives me the voltage I was setting manually with the EVC2, LOL. I will test with benchmarks to confirm, but I may move the EVC2 over to the 4090 Suprim if I no longer need it on the 5090. Check the voltage and power now. Nice. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Now that the CPU is sorted, I got the RAM dialed in stable as well. Since I have more spare time on my hands temporarily, I decided to dismantle the Thermal Grizzly Mycro direct die block and clean the jet plate fins. It wasn't too terrible, just normal accumulation from the loop during roughly a year of use. I use two filters on each PC so that helps. I've had to clean the filters a couple of times. I ordered a set of Thermal Grizzly TG GPU Guard gaskets and will put liquid metal on the 4090 and 5090 when I can muster enough motivation to do it. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Yes, very nice results. The downside to that motherboard is the unfortunate fact that it is single-chipset (X870) and lacks the extra PCIe lanes of the dual-chipset (X870E) boards. It is sad that Gigabyte chose not to make the Tachyon an X870E part. It may as well have been made as a B850 motherboard. The single-chipset configuration basically ruins the ability to use any add-in cards without sacrificing GPU bandwidth. It never ceases to amaze me how idiotic the motherboard engineers are at AMD, Intel and all of the AIB partners. They frequently implement the dumbest things imaginable. They are all guilty of it and none of them consistently demonstrate intelligent design. It almost feels like a deliberate nasty finger salute to the people that buy their products, but I suspect it is just ignorance more than anything else... maybe combined with a touch of apathy. On X870E, the USB4 is a flagrant waste of resources that could have been put to better use. It is unfortunate that AMD was so shortsighted and mandated the waste of PCIe lanes on something that hardly anyone I know who buys an enthusiast board gives a rat's butt about. The second chipset gives the motherboard manufacturers the potential to allocate resources that can be put to good use. More often than not, however, they do not. They waste it on something stupid. Deleting old screenshots is extra work. When I ran out of space for attachments I just started using embedded links to photos posted at overclock.net and hwbot.org and imgur. Much faster and easier for me. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
https://hwbot.org/benchmarks/cinebench_-_r23_multi_core_with_benchmate/submissions/5931059 -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Wow, nice. I am glad to see that board released. I am going to have to check out the owner's manual to see if they effed up the bifurcation on the PCIe slots like A$$zeus did with the X870E Apex abortion. Do you know if you can install anything X1 or X4 in the bottom PCIe slot without castrating the GPU slot to X8? How do you connect the two 8-pin CPU power cables? Does the rear I/O have a decorative cover that unsnaps? In the photos it looks like they are covered by the rear I/O. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Happy Thanksgiving, brother. And, to all the same. Life has its ups and downs, sometimes significant challenges, but there are always more reasons to give thanks than we can count. Blessings to all of you and your loved ones. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
You know things are circling the drain when Steve is kicking Winduhz to the curb. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
A few of you might know I was put on unpaid furlough on Friday, along with about 8 or 10 other people where I worked, including my daughter and the company owner's nephew, who has worked there for 15 years. I had my second job interview for an opportunity that looks really promising, and I expect it to be an even better job than the one I had (which was great in every respect until Friday). I have never been unemployed before, but I am hoping to get an offer on this next week. So, please keep me in your thoughts and prayers. I hope all of you in the US celebrate a great Thanksgiving with your loved ones. We all have so very much to be thankful for, even when there are days it doesn't seem like it. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
This may not mean much, but it seems like all of the melted connections I have seen have been on air-cooled 4090 and 5090 GPUs. I wonder if that is somehow contributing to the melting plastics. It may be mostly due to liquid cooling being less common. My liquid cooled cards have a thermal pad to help wick away heat from the connector to the backplate, and it definitely works. The backplate is always warmer around the connector. It also looks as if both of those examples above are using the octopus adapters that ship with the GPUs and not aftermarket cables/adapters. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Sure. Thank you. Of course, when you try to plan something it never turns out exactly as you thought. The gap is normally wider than this, but the Strix still wins in Cinebench. I have all BIOS settings the same. The Windows installation is optimized exactly the same. I have set a static overclock with 55x on the favored CCD and 54x on the other CCD. Both are running DDR5 at 8000 MT/s, although that does not influence Cinebench much. Where I most frequently see the very wide variance (500-1000 Cinebench points) is when I am trying to beat my previous highest score using chilled water and a max stable overclock, or when I run Cinebench as a random stability check. Even with chilled water at 8-10°C the Apex cannot surpass the Strix. It is very frustrating (as you can tell by how often I mention it). I settled on the Apex for benching, forgiving its poorly appointed PCIe slot placement, even though it doesn't bench as well. The Strix has a more usable motherboard layout in terms of PCIe bifurcation and slot placement. The only things I like better about the Apex are the 2-DIMM slots and dual BIOS. I like everything else more about the Strix. The first screenshot is today's best of three runs on each system with the static overclock mentioned above, per your request. The second screenshot shows my highest scores with each system posted on HWBOT, achieved with chilled water. That is after literally hundreds of Cinebench runs with both systems, so the variance I normally see is minimized because these are the cherry-picked best CBR23 runs ever produced by both systems. (As a side note, the 5090 was moved from the Apex to the Strix since it is the better platform for benching. Not what I hoped for from a $700 motherboard, but it is what it is.) -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
I suspect that it is something to do with the firmware/AGESA version. Try going back to an older BIOS version that allowed you to run stable with your 8000 memory settings. If you already flashed one of those firmware versions that tells you you cannot go back once flashed, you probably still can if you do the BIOS flashback from USB. That almost always works for me. Just remember to rename the file correctly before putting it on a USB thumb drive all by itself. You could even try it with the same BIOS version first, because it could be something that got corrupted in the firmware, and the best way of fixing that is doing the flashback manually instead of trying to reflash the normal way. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
One thing that I find difficult with overclocking Ryzen versus Intel is how fussy and sensitive they are in comparison. There are lots of settings that do not play nice together, and if one thing is just slightly off it is like a house of cards. Performance and stability can go from great to garbage if all of the stars are not aligned. CPU overclocking on Ryzen is about as fickle and finicky as RAM tuning is on Intel, and RAM tuning is a tedious process on either platform. In a way it is fun, but it can be the opposite of fun (a really huge pain in the butt) when finding the right lever to pull is elusive. When you factor in the wildcards of the silicon lottery (trashy CPU and memory quality samples are common) it becomes even harder to be pleased with results. The crazy memory overclocking results posted in places like overclock.net are fringe examples from the small number of silicon lottery winners and they are not representative of the norm. Over the past week or so I have moved CPUs and memory around between the Strix and Apex. The Strix CONSISTENTLY posts higher Cinebench scores than the Apex regardless of the CPU sample (9950X SP118, 9950X SP119 and 4585PX SP120). The memory overclock limits are exactly the same on all three CPUs no matter which motherboard they are installed in. Nothing I have tested in terms of settings allows the Apex to beat the Strix X870E-E in Cinebench. EVER. The same was true comparing Cinebench scores against the X870E AORUS Master. The Apex loses 100% of the time in Cinebench. Why? I wish I knew. Is it the idiotic PCIe lane allocation on the Apex? The 2-DIMM memory topology ASUS used on the Apex? Is it the DIMM.2 somehow dragging down the CPU? I don't have a clue. Whatever it is, Cinebench doesn't like it. As far as memory overclocking is concerned, either the Apex is hardly any better or both of my memory samples have reached their functional threshold. Same threshold with four different CPUs. 
The Apex maxes out at 8200 stable with my G.SKILL 6000 C26 1.450V 32GB kit, and the Strix maxes out at 8000 stable with that kit. The Apex maxes out at 8400 with sloppy timings that give the same performance as 8000 with tight timings using the Kingbank 8400 C40 1.450V 48GB kit (which has both XMP and EXPO 8400 profiles). With that kit the Strix maxes out at 8000 with the same tight timings that work on the Apex. The Strix cannot boot 8400 stable, but the end result is the same memory performance on both systems. The latency and speeds follow the CPU and RAM kit, producing identical results in each motherboard. This essentially makes the Strix the winner overall. The fact that it is roughly $300 cheaper, depending on where and when you buy it, makes it an absolute no-brainer. The Apex is a great motherboard, but it doesn't deliver based on price, or based on the fact that Cinebench scores are CONSISTENTLY 500-1000 points lower for any given CPU installed in it. I captured this comparison of the small latency penalty of 3D V-Cache. I did not bother capturing screenshots of different motherboard, RAM and CPU combos because it always ends the same; the screenshots would not show anything different overall. The latency results follow the X3D or non-X3D CPU and memory combo regardless of which motherboard they are installed in. The G.SKILL 8000 C40 kit that I picked up with the Supercool Direct Touch RAM block is an XMP kit. It is virtually impossible to tune it stable at 8000 with all-manual tuning in either of the AM5 motherboards. If I use timings that are stable with either of my other two memory kits mentioned above, it refuses to boot and I have to clear CMOS. I cannot figure out why that is the case and it does not make any sense. Maybe the kit is rubbish, but I don't have an Intel motherboard to test it in to find out. 
However, I have experienced the exact same problem trying to use XMP memory that was stable on Z790 motherboards but unusable on X670E and X870E motherboards, so I think it is something peculiar to XMP memory, even though it makes no sense that manual tuning doesn't work. I have been trying to get the memory latency on the 4585PX below 60ns and it is tough to do. This CPU is nothing more than a better-binned 9950X3D, and I think this one might be slightly better than average. While the memory and L1, L2 and L3 read/write/copy and latency values are a direct result of clock speed, I wondered if the 3D V-Cache was making the latency higher. It seems it is, but not by enough to be statistically relevant. I asked Google Gemini about it, and this is what the AI regurgitated for me. The "chef" analogy (below) is pretty good. I thought @Raiderman and @jaybee83 might find this particularly interesting.

Per Google Gemini - for whatever that is worth...

Question: Do AMD X3D CPUs have higher memory and L3 cache latency than the non-X3D processors?

Yes, technically they do. It is a paradox of the architecture: AMD X3D CPUs generally demonstrate higher latency for both L3 cache (in cycles) and system memory (in nanoseconds) compared to their non-X3D counterparts. However, they are faster in gaming because their massive cache capacity prevents the CPU from needing to access system memory in the first place. Here is the technical breakdown of why this happens and why it doesn't hurt gaming performance.

1. L3 Cache Latency: The "Stacking Tax"
The primary difference between an X3D chip (like the 7800X3D) and a non-X3D chip (like the 7700X) is the 3D V-Cache technology. AMD physically stacks a slice of SRAM on top of the processor die. This introduces a slight delay. The Penalty: Accessing the 3D-stacked cache typically adds a ~4-cycle penalty. The Clock Speed Factor: Latency is a product of cycles and clock speed (Time = Cycles / Frequency). Since X3D chips run at lower clock frequencies to manage heat, the absolute latency in nanoseconds is slightly higher than on non-X3D chips that boost 400-500MHz higher.

Feature          | Non-X3D (e.g., 7700X)  | X3D (e.g., 7800X3D)
L3 cache cycles  | Lower (faster)         | Higher (~4-cycle penalty)
Core frequency   | Higher (~5.4+ GHz)     | Lower (~5.0 GHz)
Result           | Lowest raw L3 latency  | Slightly higher L3 latency

2. Memory Latency: The Benchmark Illusion
If you run a synthetic benchmark like AIDA64, an X3D CPU will almost always show higher memory latency (worse performance) than a non-X3D chip. Reason 1: Clock Speed. Memory latency is sensitive to core frequency. Because X3D chips have lower peak clocks, the time it takes to request data from RAM and process the return is technically longer. Reason 2: Testing Methodology. Synthetic benchmarks deliberately force the CPU to fetch data from system RAM (DDR5) to test the worst-case scenario. They bypass the cache to measure the "wire delay" to the memory sticks. In this specific worst-case scenario, the non-X3D chip wins because it has faster clock speeds and no cache-stacking overhead.

3. The "Effective Latency" Paradox
If the latency is higher, why are X3D chips faster for gaming? The answer lies in hit rate. Non-X3D: Fast access to L3, but the cache is small (32MB). The CPU frequently "misses" (can't find data) and must go to system RAM. Penalty: a trip to RAM takes ~60-70ns (an eternity in CPU time). X3D: Slightly slower access to L3, but the cache is huge (96MB). The CPU almost never misses. Benefit: It stays within the L3 cache, which takes ~10-12ns. The math of "effective" latency: even if the X3D cache is 10% slower to access, avoiding one trip to system RAM (which is 600% slower) saves massive amounts of time. Analogy: Imagine you are a chef. Non-X3D: You have a small fridge under your counter. You can grab ingredients instantly, but you frequently have to run to the grocery store (RAM) for missing items. X3D: You have a giant walk-in pantry. It takes you 2 seconds longer to walk into the pantry than to the small fridge, but you almost never have to leave the building to go to the grocery store. You cook faster overall.

Summary: Raw latency is higher on X3D (due to lower clocks plus the stacking penalty). Effective latency is lower on X3D (because you rarely hit the slow system RAM). If your workload fits entirely inside the cache (like most games), X3D wins. If your workload is random and too large for even the X3D cache (like some scientific rendering), the non-X3D chip might win due to higher clock speeds. The video below is relevant because it directly compares the older 7800X3D and 7700X, illustrating how the theoretical latency differences translate into actual gaming performance.

Ryzen 7 7700X vs Ryzen 7 7800X3D - Any Difference Or Not?

TL;DR - See below:
Lesson #1: If you want to go with AM5 and care about Cinebench scores, buy an X870E-E Strix or X870E AORUS Master and save yourself $200-$300. If you are a lottery winner with an extremely rare golden Ryzen silicon sample and an extraordinarily superior memory kit, and you only care about synthetic memory benchmarks and nothing else matters, cough up the money for an Apex, then have fun watching paint dry overnight with Karhu speed tests that have no measurable bearing on normal, real-world performance in anything other than Karhu.
Lesson #2: Unless you want to waste a lot of time and drive yourself insane, don't try to get an Intel XMP memory kit to work on AM5. Unless you get lucky somehow, you are going to hate yourself for not just buying an EXPO memory kit. -
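The arithmetic behind that breakdown is easy to sanity-check yourself. The latencies below are the ones quoted (10-12ns L3, 60-70ns RAM, 4-cycle stacking penalty); the hit rates are made-up illustrative numbers of mine, just to show how a bigger-but-slightly-slower cache wins on average access time:

```python
def effective_latency_ns(hit_rate, l3_ns, ram_ns):
    """Average memory access time: hits served by L3, misses go to RAM."""
    return hit_rate * l3_ns + (1.0 - hit_rate) * ram_ns

# Stacking penalty in wall-clock terms: Time = Cycles / Frequency
penalty_ns = 4 / 5.0  # 4 extra cycles at 5.0 GHz = 0.8 ns

# Non-X3D: faster L3 (10 ns) but a small 32MB cache -> more misses (hit rate assumed)
non_x3d = effective_latency_ns(hit_rate=0.80, l3_ns=10.0, ram_ns=65.0)  # ~21 ns
# X3D: slightly slower L3 (11 ns) but a 96MB cache -> very few misses (hit rate assumed)
x3d = effective_latency_ns(hit_rate=0.97, l3_ns=11.0, ram_ns=65.0)      # ~12.6 ns
```

So the raw penalty is under a nanosecond, yet the effective latency in this toy example drops by roughly 40% on the X3D side. That is the whole paradox in two lines of arithmetic.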
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
There should be no yellow visible even when it is still in the PSU box. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Ooops, I sure did. I updated the post with this image. Pretty huge difference in high and low FPS, and those runs are 4K with no frame generation. CCD0 only (3DvCache only) - I used Process Lasso for this instead of Windoze trash -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Awesome. I am looking forward to seeing how it runs, brother. I got up early to delid the CPU. It always amazes me how much better that makes things. I increased the core clock 100MHz across all 16 cores and my max temperature is still over 20°C cooler. I hate owning CPUs that are not delidded. The experience sucks with the IHS. So, I gave up on trying to use Windoze worthless trash software bloat to determine whether or not 3DvCache actually worked. Micro$lop butchers or botches up everything they get their hands on. I used Process Lasso to set parameters on the EXE file to force 3DvCache CCD0 use or all-core use with both CCDs. As you can see below, it DOES make a difference. All Core (no 3DvCache) CCD0 only (3DvCache only) - I used Process Lasso for this instead of Windoze trash -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Have you identified anyone who owns a Zotac 5090 AIO version? I haven't gone looking, but I don't recall seeing anyone in the forum at oc.net that has one. In general, I do not want an AIO GPU (or a CPU AIO), but I definitely think it is better than air. I do not care for the appearance of most of the AIO-cooled GPUs. I think most of them are pretty garish-looking and I do not like the short and stubby look. At least the water-blocked GPUs are not nearly as short and stubby as the OEM AIO models. -
When you know most of the users of your product are painted into a corner and need to use your product no matter how lousy it becomes, cleaning up your messes and keeping users happy becomes a very low priority. Switching to Mac OS or Linux is not an option for 99% of Windoze users and they know it. Most either cannot make such a switch due to a lack of equivalent software (which could make it virtually impossible from a business continuity perspective) or due to a lack of technical knowledge that would totally incapacitate the user base (consumer and business).
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
The normie sheeple and ignoramus population has become so vast and prevalent the OEMs no longer give a rat's butt about enthusiast money because they don't need to. We no longer matter. There are no customers, only victims. Clueless morons pay asinine prices for things they know nothing about, and they do it with a smile. They believe all the ads they read, all the lies their "friends" on Facepoot and Reddtit tell them, and all of the disinformation they watch on the "news" so is it any wonder that clowns like A$$zeus behave as if they have a license for sodomy? We have both to blame: the stupid sheeple and the shifty thugs peddling grossly overpriced, misrepresented and overhyped crap. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
Works fine on the Zotac Solid OC. So I am guessing with the shunt resistors I've got like 1600W (which I am pretty sure it cannot reach). -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
That Matrix GPU is only good so others can use the vBIOS. The price is extremely stupid and I think it looks stupid, too. 🤣 I have gone pee in my own backyard before. When you have a family of 7 and 3 bathrooms, there are times when you just can't hold it any longer. Now that we are down to 2 people at home (empty nesters) and 2 bathrooms I have not found myself needing to do that for a long time. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
You hate cold and I hate heat. My body has adjusted to the hot weather here, but too hot is a health risk just as hazardous as freezing to death. Extreme heat and extreme cold are both an issue. Temperature aside, I love living in the desert because I do not like precipitation in any form. I hate rain and I hate snow. I like it to be very dry, with clear skies and low/no humidity, whether it is hot or cold. I find it easier to get warm if I am too cold. I can bundle up and add more blankets to the bed. If I am too hot, it is harder to fix that problem. You can be wet and naked and still be too hot, LOL. Both can kill you. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Mr. Fox replied to Mr. Fox's topic in Desktop Hardware
I delid my own with the der8auer tool. Works fantastic. You just have to spend about 30 minutes moving it back and forth with the tool until it is loose. Tedious but foolproof, as long as you keep going until it falls off and don't try to lift the IHS before it is loose. Usually it is in the range of 500-1000 Cinebench R23 points lower with the Apex. AMD CPUs are inconsistent between Cinebench runs and can have a wide variance between runs with the same CPU. There are too many artificial algorithms in play with Ryzen. But I can't get within 500 to 1000 points of my best Strix or AORUS Master scores using the Apex with the same CPU installed. It almost has to be something in the firmware. It would not seem logical for something in the motherboard hardware to affect it. It is also worth mentioning that this is with the same Windows installation. To rule out an OS issue I create an image from one machine, restore it to the other using Macrium Reflect, and reactivate. So it is all of the same software, tweaks, drivers and OS tuning on both machines. This makes no sense to me. There is probably something I need to do differently that I don't know about. Maybe an obscure BIOS setting I need to turn on or off on the Apex. This should be an almost entirely mathematical outcome based on core count, IPC and clock speed, as long as something is not interfering with the Cinebench run. This EPYC CPU seems like a really good sample and I am pretty sure I am going to keep it. It seems better than average looking at the V/F curve and the voltage requirements in practice. Getting it to function like an X3D in a scenario where it should has not worked for me yet. When I try to configure it to use the 3DvCache CCD, it defaults to the non-V-Cache CCD no matter what I have tried so far. 
Configured to function like an ordinary CPU it seems fine, so I suspect this is a firmware configuration problem, an OS/game/chipset driver bug, or even an ignorant noob issue on my part (relating to firmware configuration). I did a clean OS install with the latest chipset drivers and set everything I know of the way it is supposed to be set, and it still parks the wrong CCD and activates the one without V-Cache, LOL. I don't care about the V-Cache necessarily, but if I can use it as leverage for 3DMark scores I certainly want to take advantage of it. I don't expect it to matter much at all with something like Cinebench, wPrime, PiFast, or Y-Cruncher.