NotebookTalk

Leaderboard

Popular Content

Showing content with the highest reputation since 09/17/2025 in all areas

  1. Just scored a 980 Ti for 50 bucks on eBay. Not really anything to write home about, except that it appears the owner was not aware it came equipped with the Morpheus II aftermarket heatsink, which can't be sourced for anything less than 250-500. Obviously not going to spend that much on just a heatsink, so I'm quite happy with the purchase. It's going to be a busy winter this year :)
    7 points
  2. 7 points
  3. So I finally have a formal staycation lined up for Christmas time, first time in maybe 6 years... So naturally I have some work lined up lol:
2x GTX 690 w/ Accelero Xtreme
1x ATi HD 5870 w/ Accelero Xtreme
2x GTX 280 (1x w/ Accelero Xtreme)
2x 9800 GTX w/ Accelero Xtreme (having to modify, doesn't line up perfectly)
2x ATi HD 5970 (1x Accelero Xtreme)
1x AMD R9 280X TRI-X
1x AMD R9 390X TRI-X
1x ATi 4870X2 Reference
1x GTX 295 Dual PCB
1x GTX 970 MSI Gaming
1x RTX 3090 Ti
1x 7900 XTX Red Devil (never finished)
1x GTX 1060 3GB (Reference)
I might pick up a few more GPUs if the price is right. Thanks to @Mr. Fox's writeup I'll likely start to pick up some of the water cooling equipment so I can put the 7900 XTX and 10850K (also courtesy of The Fox!) under water and get a bench built around that next to my daily driver. Luckily I am still working more than I sleep, so picking these parts up should be more than plausible... If you guys have any 3rd party heatsinks for older cards feel free to reach out; I seem to have a fascination with Accelero Xtremes :)
    6 points
  4. Love Steve's opening comment... 🀣 ...and who doesn't, that isn't a Kool-Aid drinker? But it is always nice to hear it spoken out loud by an "influencer" on the biggest technology shill platform known to man.
    5 points
  5. well the scam part goes without saying, that's just the new normal nowadays. but the fact that crossflashing to different SKUs is BACK is absolutely awesome πŸ˜„ btw, first boot in the new case + new fans + new CPU cooler last night, w00t w00t! all fans are spinning and the wifi 7 module works beautifully after driver install, 5.8 Gbps wifi speed for the win! 😁 only got two minor gripes i'll need to iron out in the coming days: three of the fans aren't illuminated, so need to check the cabling. and i'll also need an additional y-adapter with 4-pin PWM connectors to clean up some of the cabling up front / move it to the back. installing and managing a total of 15(!) 140mm fans takes its toll πŸ˜„ will be sure to post some pics once everything is fully functional and tidy πŸ™‚ haven't properly set up my fan curves yet but i did some quick testing with my 9950X3D in CB24 at stock, it maxed out in the mid 60s with 200W peak LOL. absolutely ridiculous πŸ™‚ in any CASE (pun intended!), the Phanteks NV9 is BIG and it's finally got enough space to cleanly manage ALL the cables on the backside, me likes.
    5 points
  6. That's beautiful and...AND plenty of space for more GPUs to go up on the wall.... 😍
    5 points
  7. Seeing as AMD offers a complete vertical offering with CPUs and GPUs, it makes sense even if it makes me feel more than a little icky. I suspect over time, based on how things go, Nvidia will continue to purchase more of Intel and could actually acquire a majority stake, if not buy them outright, at some point. This is the logical conclusion, and AMD vs Nvidia is headed to a final boss mode showdown. Heck, maybe Nvidia can right their CPU division down the road. Let's not forget Nvidia is an American company too, as is AMD. A lot of this is on Intel.... And a lot of this is on AMD....
    5 points
  8. got the new case in, old case almost empty of hardware now. disassembled the motherboard last night for a nice thorough cleaning and also replaced all the thermal pads on the SSDs, as well as the VRMs and chipsets while i was at it πŸ™‚ of course, i also swapped out the intel WiFi 6E card for the new Qualcomm WiFi 7 model. jeez, what a crazy amount of steps and screws and heatsinks it takes to actually GET to the wifi card! whereas in laptops i just remove the back cover and bam, there it is... they could learn a thing or two from that on desktops πŸ˜„ next step: get the PSU and all the cabling out of the old case and prep the Noctua fans for the new AIO. that and the new fans have also arrived, so now just waiting on the extra bling light strips for the case, but that ain't time sensitive for the transplant πŸ™‚ it's fun to be tinkering again, but can't afford more than 2h every night before i basically drop dead from total exhaustion πŸ˜…
    5 points
  9. Hello everyone, long time no speak. Been super busy with stuff. Today I was pondering Windows 7 and was wondering if this was possible? I'm not too familiar with the ins and outs of hardware documentation. I asked Grok AI this question.
    4 points
  10. All I'm saying is the math isn't mathing... and luck is not at play in engineering fault tolerance testing. Plus, with the user reports all we have is their subjective takes ("I swear I checked everything right before going to church and feeding the homeless!"), usually rife with oversights and human error we do not know or see. All we see is the end product, which is a burned cable and/or fused connection. Then people run with it.....

If the connector were that flawed, those who routinely push their cards should suffer a noticeably higher rate of failure, yet amongst the enthusiasts on the forums the rate of failure is minuscule, if even that. I've long thought this since the 4000 series and it continues to prove itself out over and over. I frequent the various enthusiast forums, enthusiast FB tech message groups, and tech-focused Discords, along with clans and guilds chock full of 4090 and 5090 owners gaming 8-12 hrs/day and more, waiting for this influx of burned connectors amongst the various types used, considering how hard the hardware is being pushed even on non-shunted cards running max OCs constantly hitting 600-650W.... yet...... much ado about nothing.

I ran my 4090 OC'd from start to finish. I am doing the same with my 5090 too. One thing I do for my cards since Ampere is I run them max underclocked when not gaming/benching, then max OC when gaming.

Look, it's a cheap part to replace for the cable. I can swap in numerous types or models if I want. That isn't the issue. Zero loyalty to a damn connector cable 🀣 The one and only time I had issues was with the CM adapter cable I was going to use: it kept black screening and crashing, and it turned out to be severely problematic, a legitimate issue even outside of human error or potential inherent problems. The replacement one they sent me is still sitting sealed in its pouch on my shelf. I am really curious about the Ampinel and am almost 100% sure I'll be picking one up just out of curiosity.

And you're correct, many on OC.net DO spend more money on quality products than normies. That just adds even more weight to my argument.

With all that being said, I still wish they had stuck with 8-pin PCIe connectors, but I guess with the way power consumption is going, we would end up with cards with 4-5 of them, if not more, at some point on certain models. Imagine a HOF with 8 8-pin connectors..... 🀣
    4 points
  11. From the beta thread in their Discord. https://discord.com/channels/750797327874129930/752301932558942302 Alternatively https://www.overclock.net/posts/29518003/
    4 points
  12. Dang. One set of dark sunglasses ain't enough for looking at all that RGB 😁
    4 points
  13. aight boiiiiz, new setup is done, pretty happy with how it turned out πŸ™‚
    4 points
  14. I did! Just finished cleaning it up. It's about 5 slots now; these fans are loud, but they push a lot of air when running at 100%. Either way, it's ready for benching this winter. Also cleaned up 2x 650 Ti Boost 2GB cards
    4 points
  15. holy crap one of the most iconic songs of all time, blasting it through my headphones right now yessssss πŸ˜„ ladies are asleep, finally some alone time for daddy to continue tinkering πŸ˜„
    4 points
  16. Dang 🀩🀩🀩 And all black. You have good taste for quality, brother 😍
    4 points
  17. For the lovers of Crocs it's not about looks. It's about how it feels. But I agree with Fernando in that it is better to look good than to feel good.
    4 points
  18. @electrosoft Starting to run the Pro Max 18 Plus through its paces. https://imgur.com/a/32ivRNC https://www.3dmark.com/spy/58934149 It has full W10 driver support, which is great. Surprisingly enough, WoW runs properly on the P-cores without any micromanaging via Process Lasso, just like it does on W11. SuperPI also selects the best P-core when run, so CPPC / Thread Director must be working better now with Arrow Lake. I have heard the GPU can sustain 150W, and that leaves around 50-60W for the CPU in combined loads. However, when running OCCT while I was working on another project, I noticed it was down to 180W combined load after a while. It was also draining the battery (12% over the 1-hour test), so the 200W advertised really is not sustainable or good for battery health. I am hoping the "Optimized" performance mode is one where it keeps limits out of the battery-drain range.
    4 points
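A quick back-of-envelope on the battery numbers in #18, as a minimal sketch in Python. The pack capacity is an assumption (it isn't stated above), and system power beyond the CPU+GPU package is ignored, so treat the result as rough:

```python
# Sanity check on the OCCT battery-drain observation in #18.
# ASSUMPTION: ~99 Wh pack, typical for large 18" DTR laptops; the machine's
# actual capacity may differ. Overhead outside the CPU+GPU package is ignored.
PACK_WH = 99.0
DRAIN_FRACTION = 0.12     # 12% drained over the test
TEST_HOURS = 1.0
COMBINED_LOAD_W = 180.0   # observed sustained CPU+GPU package power

battery_w = PACK_WH * DRAIN_FRACTION / TEST_HOURS   # ~11.9 W came from the battery
adapter_w = COMBINED_LOAD_W - battery_w             # ~168 W sustained from the wall

print(f"Battery supplied ~{battery_w:.1f} W; the adapter covered ~{adapter_w:.1f} W.")
```

In other words, even the observed 180 W combined load was only possible with the battery chipping in, which supports the post's conclusion that the advertised 200 W isn't wall-sustainable.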
  19. Where are the equally cheaper 5090s? Sub-$1800 would be much easier to swallow. GeForce RTX 5080 now 10% below MSRP, RTX 5070 and 5060 Ti down 13–14% at MicroCenter See also my latest post here..... Everyone should be offered something better from the Redmond R***ds.
    4 points
  20. Fortunately, gaming is an expendable and totally unnecessary pleasure to me. The solution for me is a simple one: don't purchase any games that mandate the enablement of Secure Boot or TPM. There is not, and never will be, any game important or enticing enough for me to make an exception or compromise my personal choice of disabling or enabling anything that is against my personal preference. The part that angers me is the carte blanche requirement for those that do not care about online multiplayer gaming and find online gaming cheating to be totally irrelevant. If they are going to make it a requirement on that basis, it should only apply to online multiplayer and not the single-player campaign. I know they are stupid and I know I can't fix that. There's no point in trying, or in allowing them and their hopelessly idiotic nonsense to waste my time. At the end of the day I view it as a first-world problem for the game devs. I am more than happy to ignore their products and not give them my money. When all is said and done, "not my circus, not my monkeys" and "frankly, Scarlett, I don't give a damn" is my position on the matter. They can join the Redmond Reprobates in having a standing invitation to kiss my butt. It sucks to be them more than it sucks to be me. I only care about me, and what they want doesn't matter.
    4 points
  21. 8K gaming beast right there. I do miss the old days though. I wish it would go back to when the majority of people didn't know what PC gaming was lol. GPUs sat in stock on Newegg for low prices. I know, it really sucks. Intel was always the best. I think they need to release a TRUE 60K+ R23 BEAST-zilla mainstream chip with better gaming performance than the 14900KS, and they will win back the crowd's love right away. I feel like everyone would want one. I'm waiting on Intel to turn it around. I still run an Intel rig daily myself. Not really sure what their plans are for HT.
    4 points
  22. I fixed my cracked 5090 FE block! RTX 5090 FE Block Plexi repair, and re-mounting!
    4 points
  23. When you feel scammed by AMD and our good old friend F. Azor. XT = Xtreme tax 🀐
Flashing XT BIOS on AMD Radeon RX 9070 Yields up to 25% Performance Boost
If you have a Radeon RX 9070 you can make it up to 20% faster: VBIOS flashing is back
If you own or were planning to purchase a non-XT AMD Radeon RX 9070, you're in luck thanks to BIOS flashing. For context, BIOS flashing was common about 15 years ago.
And more nastiness from the tech world. Here you can see the real reason Microsoft needed all the telemetry implemented in modern OSes, from Windows 8 to 11.
    4 points
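For anyone tempted by the crossflash in #23, the classic workflow uses AMD's amdvbflash tool. A minimal sketch, assuming amdvbflash is on PATH and that the long-standing -i / -s / -p flags apply to your build (verify against its help output first); the ROM filenames are hypothetical, and crossflashing to a different SKU usually also needs a version-specific force flag:

```python
# Hedged sketch of the usual Radeon VBIOS backup-then-flash workflow.
# WARNING: flashing a mismatched image can brick the card; back up first
# and proceed at your own risk.
import subprocess

ADAPTER = "0"                    # adapter index reported by `amdvbflash -i`
BACKUP  = "rx9070_stock.rom"     # hypothetical filenames
NEW_ROM = "rx9070xt_donor.rom"

subprocess.run(["amdvbflash", "-i"], check=True)                    # list adapters
subprocess.run(["amdvbflash", "-s", ADAPTER, BACKUP], check=True)   # save stock VBIOS
subprocess.run(["amdvbflash", "-p", ADAPTER, NEW_ROM], check=True)  # program donor VBIOS
print("Reboot for the new VBIOS to take effect.")
```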
  24. I've thought every release in the Borderlands game franchise sucked due to crappy-looking watercolored cartoon graphics. It feels like an animated comic book to me. I greatly prefer games that strive to achieve photorealism; the less realistic the graphics are, the less I like it and the much greater the likelihood I will never play it more than 1 or 2 hours tops.

Too bad we live so far apart or we could have swapped WiFi cards from either of my motherboards. This is an area where cheap motherboards sometimes have a design edge over the more expensive models. The cheap models often have the WiFi/BT modules mounted in the same area that the M.2 SSDs are installed. It sucks having them under the rear I/O cover. Both of mine would be in the motherboard box if they were not installed in such an inconvenient spot.

The one and only thing I dislike about the AORUS Master is that Gigabutt provides no option in the BIOS to disable the onboard WiFi/BT module. It is the only brand I am aware of that omits that option in the BIOS. I contacted them almost immediately when I first installed it and discovered that firmware defect. I alerted them to their engineering mistake, asked them to provide me with a custom BIOS that wasn't missing that feature, and encouraged them to include it in all BIOS releases going forward. They replied quickly but refused, saying I was the only "customer" that has ever complained and requested that option, and they won't consider changing anything unless numerous customers begin to complain about not having it. I contrast that sort of disregard for customer experience with EVGA, who made several BIOS changes at my request and made them standard options in firmware releases going forward.
    4 points
  25. This game is a joke. The graphics are lame. The physics are lame. This is a good meme though lol. It's definitely possible to overwhelm any GPU and make it run slow though; I can go run FurMark too lol. But so many games have far better graphics, consume even more VRAM, and run way faster than Borderlands 4. I can run KCD II twice on my overclocked and shunt-modded 5090 FE and still get better performance than Borderlands 4. I don't know why some game developers do this. I watched this video @Papusan. And I'm actually playing Kingdom Come: Deliverance II for the 2nd time. The 5090 smashes it at 4K full native with all settings on experimental graphics, and the 4090 could do the job at 4K native all experimental lol. Funny he used that in his video. Because the game is HUGE, and the graphics are great, the game is GREAT.
    4 points
  26. Yup. "The game is pretty damn optimal - which means that the software is doing what we want without wasteful cycles on bad processes," Randy Pitchford says in one post in a long thread. And for those experiencing low FPS or wanting to hit higher than 60 FPS on PC, he adds, "Use DLSS. It's great. The game was built to take advantage of it." Borderlands 4 becomes the new Crysis as Gearbox says it's already running 'optimal' on PC https://www.tweaktown.com/news/107746/borderlands-4-becomes-the-new-crysis-as-gearbox-says-its-already-running-optimal-on-pc/index.html
    4 points
  27. I missed all of you too. Outside of the people here, I really don't have anyone to talk shop (tech) with. Some nights I get the itch to overclock something, but my 14900KS desktop lies dormant. I am sure if I had a big boy GPU, I would have an excuse to do it. And here I thought 3D printers would be a laid-back adventure, haha; far from it. I have my eye on this Carvera Desktop CNC machine. It's a little pricey, but it is basically three 5090s 🀣 I want to design and make my own CPU/GPU blocks. Anyway, I'll be around.
    4 points
  28. Hello everyone! Do you want to upgrade your old laptop MXM GPU, but you can't since the newest RTX Turing, Ampere and Ada cards don't work with your eDP display? I have the solution for you: the backlight mod to enable the eDP display for your MXM card!

The mystery of why these cards don't display on our eDP laptops is finally resolved. Every card from the Turing to Ada generations, from brands like Aetina, Adlink, PNY, ZRT and X-Vsion, will work with this mod!

What was the issue with these cards then? They were all lacking 3 essential pins. These pins are used to power on the panel: the backlight pins! Without them, your screen stays black, even if you have an eDP or DP signal on the right video port.

The solution: a solder-less mod, a flex FPC cable that powers the 3 backlight pins with ease. Version 1 manufactured. The 3 backlight pins sit at pins 23, 25 and 27 of the MXM slot and the MXM card. What we're doing is powering the backlight pins from the 3.3V on pins 278 and 280, that's it! The flex cable is insulated and can handle high temperatures. I'll use double-sided insulated tape for the back side of the backlight pins to secure the installation.

Where can I get one, you might ask? You can purchase them online, and I'll even provide a PDF file with instructions. Price (without shipping): 16.5€ on PayPal Friendly (DM me), 18€ on eBay (listing available, just search for it).

We can finally have an RTX 4080 or an RTX 4090 in the master MXM slot and output through the main eDP display! Clevo P870 with dual RTX 4090? P570WM with an RTX 4090?! P775 with an RTX 3080 Ti?!

Will work with (for MXM slots that allow eDP or DP on DP_D and cards that lack the backlight pins):
RTX A3000 (X-Vsion), RTX 3060, RTX 3070, RTX 3070 Ti, RTX 3080, RTX 3080 Ti (ZRT, X-Vsion)
RTX 4050, RTX 4060, RTX 4070, RTX 4080, RTX 4090 (ZRT, X-Vsion)
RTX 3500 Ada (Aetina), RTX 5000 Ada (Aetina), RTX A4500, RTX 3000, RTX 4000, RTX 5000 (Aetina, PNY, Adlink)
(Tell me if I forgot some cards.)

Brightness control works on my P570WM with the 4080 and 4090 if the vBIOS is set to eDP out. Sleep mode and power-saving modes work. RTX 3000 PNY and RTX 4070 worked on the Clevo P870; need to test more cards such as the 4080 and 4090.

Disclaimer, requirements:
- This mod works with laptops that have the eDP set to DP_D and the backlight pins assigned to pins 23, 25 and 27 in the slot
- The BIOS of the laptop shouldn't have any whitelist
- The BIOS should be in pure UEFI mode
- You need an eDP display that supports DP 1.1-1.2 (2011-2013 laptops) at least for RTX Ampere GPUs
- You need an eDP display that supports DP 1.3-1.4 (2014-2019 laptops) at least for the new RTX Ada (or install/edit EDID data in the OS with old DP 1.1-1.2 screens)

After a lot of research and some testing with soldering, that's the conclusion to the solution! Thank you!
    3 points
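To keep the wiring in #28 straight, here is the pin mapping from that post condensed into a tiny reference sketch. The pin numbers are exactly as stated above; the names and structure are mine, for illustration only, not an excerpt from the MXM specification:

```python
# Pin map for the MXM backlight mod described in #28.
BACKLIGHT_PINS = (23, 25, 27)   # the 3 backlight pins the affected cards lack
SOURCE_3V3_PINS = (278, 280)    # 3.3V pins used as the power source

# The flex FPC cable simply feeds each backlight pin from the 3.3V pins:
for pin in BACKLIGHT_PINS:
    print(f"MXM pin {pin} (backlight) <- 3.3V from pins {SOURCE_3V3_PINS}")
```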
  29. I probably would not even try to refrain. Sleep is optional when there are new parts to be installed. 🀣 UPS tried to deliver and he was not home, so he will be picking it up from a UPS location, probably later today.
    3 points
  30. Microsoft chose to fight their customers and destroy themselves. No empathy for them. I knew it wouldn't be long until there was another way to make a local account. I bet they are going to patch these methods too eventually, but then there will be another way to counter their fix! In other news, Windows 7 is slowly coming back! https://gs.statcounter.com/os-version-market-share/windows/desktop/worldwide @Mr. Fox will be happy 😁
    3 points
  31. I do not want to jinx any of us by saying too much. Is it because we do not use janky cables? Is it because we are extra careful about making sure cables are fully seated? Is it because the number that burn is a very tiny percentage, which the media blows grossly out of proportion because they live or die based on clickbait? Or, is it because we are lucky? Let me see what the Magic 8 Ball says...
    3 points
  32. Me.... I love all black. Every day the same. Even my house is painted black πŸ™‚ And me supporting Microsoft with my money? Nope, won't happen. Hell will freeze over first!
    3 points
  33. I am trying to figure out how to run Cinebench in real time under Linux. The renice and chrt commands only seem to work on one thread and skip all of the child threads. Maybe it is because of Wine being used to run it. Scores are about what I would expect for no tweaks.
    3 points
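On the chrt/renice problem in #33: those tools act on a single TID, but every thread of a process appears under /proc/<pid>/task/, so the policy can be applied to each thread individually. A minimal sketch of that workaround (my own, untested against Cinebench-under-Wine; needs root or CAP_SYS_NICE):

```python
# Apply SCHED_FIFO to every existing thread of a process, not just the main TID.
import os
import sys

def set_realtime_all_threads(pid: int, priority: int = 10) -> None:
    param = os.sched_param(priority)
    for tid in os.listdir(f"/proc/{pid}/task"):          # one entry per thread
        try:
            os.sched_setscheduler(int(tid), os.SCHED_FIFO, param)
        except (PermissionError, ProcessLookupError) as err:
            print(f"tid {tid}: {err}", file=sys.stderr)  # thread may have exited

if __name__ == "__main__":
    set_realtime_all_threads(int(sys.argv[1]))  # usage: sudo python3 rt_all.py <pid>
```

Threads Wine spawns after the script runs may not be covered, so it might need re-running mid-benchmark.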
  34. Well, the new 9950X is close to an identical match for the better of my existing two. So, definitely not a lottery winner, but good enough to keep. Look how close they are. Crazy. (#1 is the older CPU and #2 the one that arrived today.) The newer one might be just a hair better, but not enough difference to matter. So now it's time to delid, go bare die, and then test the 6000C26 32GB kit (which is already installed and running the same custom profile as the old 48GB kit).
    3 points
  35. Yes, that is Red Fox. He is a smart boy, but has a sassy mouth. @electrosoft, @Raiderman and @jaybee83, there is a new version of ZenTimings available that shows all of the voltage sensors now. I will test to confirm that is true on the Strix as well, but it shows all sensors on the Master. It also shows tPHYRDL as matched, whereas the last beta showed them mismatched (even though they were not). It also shows the Nitro settings in the lower right corner.
    3 points
  36. Nice to see something decent going down in price. Total opposite of what we normally see. Thermalright products are generally respectable quality.
    3 points
  37. What the hell man!!!! Et Tu @Raiderman?!?! 🀒 On the other hand, the system looks great as long as you remove that abomination from the foreground.... 🀣
    3 points
  38. Nice! I'll have to give these a once-over and see what they do on my Hero X870E / 9800X3D w/ T-Force 8200 sticks, and my B650i Aorus Ultra (1DPC) / 9600X. I've just been running 2200/6400 (Hero) and 2000/6000 (Aorus).
    3 points
  39. The latest Strix X870E-E BIOS improved the memory tuning slightly. Finally able to get the latency down from about 58ns to below 56ns. Nothing spectacular, but a small improvement is better than a small regression. I'll play with retightening tRCDWR to see if I can eke a little more mojo out of it without losing any stability. It's interesting that it's not better, considering the overhyped "NitroPath" gimmick ASUS chirped about and played up to be such a great invention. It has that feature, but I've yet to identify any real or tangible value it adds. It is still not as good as the AORUS Master, but I have been too lazy to move the 5090 over to it and install the 4090 in the Strix. I haven't felt like messing with the cooling system, and the tubing/fitting orientation is different. When I get around to it I am going to try to normalize the two configurations so I can freely move the GPUs back and forth without any kind of rigmarole. The Strix has the better of the two 9950X CPUs in it, so I should probably swap those while I am at it. Maybe I will get ambitious when I install the EVC2 mod on the 5090 and do all of that at once.
    3 points
  40. I want to see the new Intel Nova Lake with 52 cores. It's going to have 16 P-cores and 32 E-cores (plus 4 low-power E-cores in the rumored configuration, to reach 52). This would be brutally fast! Even if it had Raptor Lake IPC and was locked down, it would be scoring ~57,750 in R23 if the P-cores ran at 5.5GHz and the E-cores at 4.3GHz. (I'd still want one lol) I can only assume it would probably operate at similar or maybe lower speeds to save on power and run cool-ish. So it would be an absolute BEAST either way. Such a chip would be amazing.
    3 points
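For what it's worth, the 57,750 figure in #40 is consistent with simple linear napkin math under assumed per-core contributions. The two constants below are my calibration guesses, purely illustrative:

```python
# Napkin estimate of an R23 multi-core score from core counts alone.
# ASSUMED per-core contributions (Raptor-Lake-like IPC, no HT):
P_PTS = 1805   # one P-core at ~5.5 GHz
E_PTS = 900    # one E-core at ~4.3 GHz (~half a P-core)

def estimate_r23(p_cores: int, e_cores: int) -> int:
    """Naive linear scaling: the score is the sum of per-core contributions."""
    return p_cores * P_PTS + e_cores * E_PTS

print(estimate_r23(16, 32))   # 57680 -- right in the ballpark of the quoted 57,750
```

Real silicon would scale sub-linearly (shared power budget, memory bandwidth), so an actual chip would likely land below a linear estimate like this.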
  41. I'm playing Dying Light: The Beast, and this game is very well optimized! Better graphics than Borderlands 4.
Game: Dying Light: The Beast
Release Date: 9/18/2025
Graphics: Maxed out
Resolution: 4K
Technology: DLAA (native 4K)
Frame Gen: x4
Frame Rate: 450-500 fps
Thoughts: Ridiculous, because my CPU is completely stock by accident from turning on my PC this morning, so it's at 3.8GHz and it's old as hell. 🀣 I believe this company learned well, because hardly anyone could play Dying Light 2 even with a 3090. My son is getting almost 200fps at 1440p, maxed-out graphics, with FSR Frame Gen x2.
    3 points
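A small sanity check on the frame-gen numbers in #41, assuming "x4" means four output frames per rendered frame (one rendered plus three generated):

```python
# Convert frame-generated output fps back to the underlying render rate.
FG_FACTOR = 4   # assumed: 1 rendered + 3 generated frames per output group
for output_fps in (450, 500):
    print(f"{output_fps} fps shown -> ~{output_fps / FG_FACTOR:.0f} fps rendered")
```

That works out to roughly 112-125 natively rendered fps at 4K/DLAA, which is the rate input latency will actually track.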
  42. Thanks for your answer, but you are still mixing two different things into one. That is the problem. My question was only about the LCD backlighting, not about the video signal going through the muxes, whichever they are (iGFX or SG). These things are in absolutely different places on the mobo, and they only meet at a single point at the end, which is the LCD connector. Of course I know about:
- the non-dimming issue (dimming may be solvable by a third-party app through the AUX lanes, but that is not ideal)
- the LCD not turning off when the lid is closed
- the LCD power-saving settings not working
and maybe other problems with your jumper-wire solution. Sorry, but it honestly isn't enough for me, or for my 12-year-old notebook. It was clear to me that when you presented this solution of yours, it would need to be improved. That's why I mentioned that a version 2 of this solution needs to be developed, because it is still unfinished. Maybe it would be better to tell the people here that it is still in progress and needs more testing before they spend money on it. But as I wrote, I like that flex board idea very much. Good luck. As @ssj92 wrote: Never give up 😁
    3 points
  43. As predicted by mid-September at the latest, price drops bottomed out on the 5090 (don't say I didn't tell ya a month or so ago), and now stock is slowly drying up and prices are slowly trending back up on several models, especially MSI cards, which are out of stock across many models everywhere, including the Vanguard, which has gone back up from $2499 to $2599/$2669. Even the Ventus is back up to $2399.99. On the plus side, the Asus Astral has dropped to $3199.99 (yay?) at Da Egg and Amazon atm.... 9070 XTs continue to come down in price and a few models are back to their MSRP, including the Taichi 9070 XT at Microcenter for $729.99. With the MCC discount, $693.50. I might snag one of these as they have the most robust build yet again this time around for my Jonsbo Z20 fun times, and my SFF testing is rocking a dirt cheap, open-box Cooler Master 850 SFX w/ 12VHPWR connector.

-------------------------------------------------

Glad to see you check in @Rage Set! Hope everything is going well in life, and yeah, 5090s and even 4090s are still commanding a pretty penny. As stated above, prices are slowly trending back up on 5090s in the retail channel along with stock dropping. I'm seeing a lot of out-of-stock on many 5090 models at Newegg and Microcenter now, where a month ago almost all of the models were in stock everywhere.

-------------------------------------------------------

Funny to see China sour on Nvidia a bit lately, especially with all the improper channels to get cards into the country at better prices. As long as they can memory mod the 4090s and 5090s, in tandem with the cost for a 6000D, Nvidia will win out in the end either way. They can just slow production and redirect dies elsewhere as needed. Two points here. #1: The Japanese Yen is very weak atm, so I would expect higher costs. #2: Making a better / more robust version of something doesn't mean the original isn't still competent. Look at the tiers of PSUs, where Gold is more than enough in the face of Platinum and Titanium. Look at their tiers of GPUs, all with the same die in there, and the price variance based on the surrounding components. This is no different. On the other hand, this is Asus...... 🀣 All you have to do is use DLSS and FG..... (massive eye roll here) but in all seriousness, this is how it goes (sans poor coding/optimizations): devs push the boundaries in upper-end settings; modern cards struggle to run it as-is at high-end settings (4K, High/Ultra, etc...); as time passes, what was once nearly impossible to run will eventually run with ease as hardware advances over X amount of years. I like to call this the 'Crysis Paradox' πŸ€‘

-----------------------

At this rate, you're going to be in competition with @Papusan for collecting and benching older cards! Do either of you have a dedicated wall for your collections, kind of like JayZ2C has for his EVGA card collection?
    3 points
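Minor aside on the Taichi pricing in #43: the quoted numbers line up with a flat 5% card discount (my inference; the post doesn't state the percentage):

```python
# Check the Microcenter MCC discount arithmetic (assumed 5%).
msrp = 729.99
print(f"${msrp} - 5% = ${msrp * 0.95:.2f}")   # $693.49, matching the ~$693.50 quoted
```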
  44. We missed you, brother. Hope all is well with your family, too.
    3 points