
Clevo with Framework GPU interconnect standard?


KabyZen


1 hour ago, jaybee83 said:

welp, one can dream... i like the way the OP is thinking. however, in the end, Clevo is now following the money trail, thin n light BGA crap FTW! 😢

Or go to an Apple ARM M-series chip: 60 W consumption and the same GPU performance as a stock MXM 2080. The CPU power is greater than the i9's... If they manage to release DirectX support, the gaming business will be completely upended.


Clevo P751TM1-G 8700K/16GB/GTX 1080@120W/1.5TB/FHD 144 G-Sync/ Asus PG279Q


5 hours ago, Csupati said:

With TB4 they managed to bypass the PCH and other interconnects; it's directly wired to the CPU.

Only on certain implementations like 45W H series 11th Gen and up, but not necessarily on any HX chips.

 

Also, TB4 has increased latency due to PAM signal encoding, so it always feels slower than PCIe even at the same bandwidth.
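To illustrate with a toy model (Python; the latency figures are assumptions for illustration, not measured TB4 or PCIe numbers): small transfers are dominated by per-operation latency, so two links with identical bandwidth can still feel very different.

```python
# Toy latency model: total time = fixed per-op latency + serialization time.
# Both links get the same 32 Gbit/s bandwidth; only the latency differs.
def transfer_time_us(size_bytes: int, gbit_per_s: float, latency_us: float) -> float:
    serialize_us = size_bytes * 8 / (gbit_per_s * 1e9) * 1e6
    return latency_us + serialize_us

PAYLOAD = 4096  # one 4 KiB read, typical of small random I/O
for name, latency_us in [("direct PCIe-style link", 1.0),
                         ("tunneled link with extra encode/decode", 5.0)]:
    t = transfer_time_us(PAYLOAD, 32.0, latency_us)
    print(f"{name}: {t:.2f} us per 4 KiB operation")
# Same bandwidth, but the higher-latency link is ~3x slower for small ops.
```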

Desktop - 12900KS, 32GB DDR5-6400 C32, 2TB WD SN850, Windows 10 Pro 22H2

Clevo X170SM - 10900K LTX SP106, 32GB DDR4-2933 CL17, 4TB WD SN850X, RTX 3080 mobile, 17.3 inch FHD 144hz, System76 open source firmware, Windows 10 Pro 22H2


41 minutes ago, Csupati said:

If they manage to release DirectX support, the gaming business will be completely upended.

 

Ha ha... (As a guy who recently dumped Windows laptops for a MacBook Pro, and has been happily using it for gaming...)

  • As you may be aware, in June, Apple released beta software "Apple Game Porting Toolkit" (GPTK), which is basically a custom packaging of Wine with an additional library called "D3DMetal" which translates DirectX 11/12 calls to Metal calls.  It was presented as an evaluation tool for game devs considering a possible macOS port, but folks on the Internet had full Windows titles like Elden Ring up and running in a matter of hours.  GPTK was released with a tight license, preventing it from being used with any third-party products or game ports.
    • The guy at Apple driving GPTK is Nat Brown, who used to be a graphics & VR guy at Valve.
  • Since then, Apple has been releasing updates to GPTK every few weeks, and performance has gone up a bit with each release.
  • Tooling has built up around GPTK, like CXPatcher and Whisky, which makes it a bit easier for non-technical folks to try it out.
  • In August, Apple changed the license to allow redistribution of the D3DMetal library.  (Still can't be used to ship a game port, though.)
  • In mid-September, CodeWeavers released a beta version of CrossOver (23.5) which integrates D3DMetal, making it much easier for "anyone" to use this to run Windows DirectX 11/12 games.  Install CrossOver, install the Windows version of Steam, toggle D3DMetal on, install a game, and run it.
  • Later today, macOS 14 "Sonoma" is launching and all of this stuff should be heading out of beta.

Definitely an interesting time to be a Mac gamer.  (There's also really interesting stuff going on in MoltenVK.)  There's still a notable hole, though; there is currently no way to run games that use AVX instructions on Apple's ARM CPUs, because neither Rosetta 2 nor Windows 11's x86-to-ARM emulator supports AVX instructions.  (Ratchet & Clank: Rift Apart, The Last of Us: Part 1, et al.)

 

Not to derail the thread (this will be my only post on the matter), but there has been some chatter over here.


Apple MacBook Pro 16-inch, 2023 (personal) • Dell Precision 7560 (work) • Full specs in spoiler block below
Info posts (Windows) — Turbo boost toggle • The problem with Windows 11 • About Windows 10/11 LTSC

Spoiler

Apple MacBook Pro 16-inch, 2023 (personal)

  • M2 Max
    • 4 efficiency cores
    • 8 performance cores
    • 38-core Apple GPU
  • 96GB LPDDR5-6400
  • 8TB SSD
  • macOS 15 "Sequoia"
  • 16.2" 3456×2234 120 Hz mini-LED ProMotion display
  • Wi-Fi 6E + Bluetooth 5.3
  • 99.6Wh battery
  • 1080p webcam
  • Fingerprint reader

Also — iPhone 12 Pro 512GB, Apple Watch Series 8

 

Dell Precision 7560 (work)

  • Intel Xeon W-11955M ("Tiger Lake")
    • 8×2.6 GHz base, 5.0 GHz turbo, hyperthreading ("Willow Cove")
  • 64GB DDR4-3200 ECC
  • NVIDIA RTX A2000 4GB
  • Storage:
    • 512GB system drive (Micron 2300)
    • 4TB additional storage (Sabrent Rocket Q4)
  • Windows 10 Enterprise LTSC 2021
  • 15.6" 3940×2160 IPS display
  • Intel Wi-Fi AX210 (Wi-Fi 6E + Bluetooth 5.3)
  • 95Wh battery
  • 720p IR webcam
  • Fingerprint reader

 

Previous

  • Dell Precision 7770, 7530, 7510, M4800, M6700
  • Dell Latitude E6520
  • Dell Inspiron 1720, 5150
  • Dell Latitude CPi

19 hours ago, Csupati said:

Or go to an Apple ARM M-series chip: 60 W consumption and the same GPU performance as a stock MXM 2080. The CPU power is greater than the i9's... If they manage to release DirectX support, the gaming business will be completely upended.

 

welp that opens up a completely different can of worms: walled garden, even MORE BGA, and add to that software locks (for which apple is infamous). plus apple still has to prove themselves as a "friend to gamers". sure, they started doing at least SOMEthing, but theyll need to stand the test of time first. and yea, lets not even start talking about price to performance here.... 😅

 

18 hours ago, Aaron44126 said:

 

Ha ha... (As a guy who recently dumped Windows laptops for a MacBook Pro, and has been happily using it for gaming...) [...] Definitely an interesting time to be a Mac gamer. [...]

blasphemy! how dare you jump over to the apple side of life! 😮 GAIZ, GET HIM! /s 😄 seriously tho, what made u switch?

 

jokes aside, interesting to see that Apple has at least made some strides in the right direction. in the end, it's still going through translation layers and losing performance vs. native support. plus, as mentioned above, apple will need to prove they mean "gaming business" in the long term, not just a quick n dirty project that falls off the wagon come the next hardware / macOS gen (after Sonoma)... so i remain a bit skeptical in that regard.

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022-24)
AMD Ryzen 9 7950X (TG High Perf. IHS) / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / Teamgroup T-Force Delta RGB DDR5-8200 2x24 GB / Seagate Firecuda 530 4 TB / 5x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 (Push/Pull 6x Noctua NF-A14 IndustrialPPC-3000 intake) / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG (3x Arctic P12 A-RGB intake / 4x Arctic P14 A-RGB exhaust / 1x Arctic P14 A-RGB RAM cooling) / Samsung Odyssey Neo G8 32" 4K 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB / PDP Afterglow Wave Black / Beyerdynamic DT 770 Pro X Limited Edition

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022-24)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel Core i5 12600 / Nvidia Geforce RTX 3070 Ti / Mushkin Redline DDR4-3200 2x32 GB / Samsung 970 Pro 1 TB / Samsung 870 QVO 8 TB / Intel AX201 WIFI 6+BT 5.2 / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


1 hour ago, jaybee83 said:

 

welp that opens up a completely different can of worms: walled garden, even MORE BGA, and add to that software locks (for which apple is infamous). [...] seriously tho, what made u switch? [...] jokes aside, interesting to see that Apple has at least made some strides in the right direction. [...]

The Apple ARM chip and OS are just way better, way simpler than Windows; it's kind of ludicrous how a single 60 W chip can outperform a 135 W i9 and a 165 W 2080.


1 hour ago, Csupati said:

The Apple ARM chip and OS are just way better, way simpler than Windows; it's kind of ludicrous how a single 60 W chip can outperform a 135 W i9 and a 165 W 2080.

its always about trade-offs. sure, apple can achieve way higher efficiency in their walled garden approach cuz they got complete vertical integration from silicon design up to software and os. but id never want to give up on flexibility and modding capability, thats why ill also never switch from android to crapple 😄


3 hours ago, jaybee83 said:

its always about trade-offs.

 

Exactly right.  There are tons of choices with various pros and cons.  Only you can decide what is the best option for you.

 

6 hours ago, jaybee83 said:

seriously tho, what made u switch?

 

What can I say .....

 

I got fed up!  Fed up with Windows.  Fed up with the state of high-end laptop offerings (specifically from Dell, but many of the issues are "global"), the compromises involved with them, thermal issues, power management issues (specifically random GPU power drops caused by the system trying to balance CPU/GPU power and not doing it right), and a general lack of attention to detail all around.  I've posted about these issues elsewhere on the forum.  (Never mind the issues I have with the direction that Microsoft is going with Windows 11. I tried switching to Linux for about two months and posted in that thread about some frustrations I was having with Windows, and later my rationale for switching from Linux to macOS; I've also written about my frustrations with the Dell Precision 7770, which I was initially so excited about.)

 

So I went to a Mac because it seemed like the only place to go that made any sense.  (Apple's recent attention to the gaming space did help push me over the edge.)

 

What I have found that I didn't fully expect going in is how %@#$ good of a laptop a MacBook Pro is.  I'm not talking about "as a PC", but "as a laptop" specifically.  It offers a balance between high performance when needed and cool/quiet/long battery life the rest of the time that you simply can't get from anyone else; elsewhere you have to pick one or the other when deciding which model to get.  And it offers other things that you'd want a laptop to have: quick battery charging, a really good display panel, good speakers, and a solid mostly-metal build.

 

And yes, there is definitely a compromise when it comes to both "tweakability"/"upgradeability" and "absolute top performance".  For the latter, I am settling for "good enough" performance (which is pretty darn good) because, to me at least, the benefits of this system more than make up for it.  And, I mean, I played through Shadow of the Tomb Raider at 1510p resolution with the "highest" graphics settings preset; it had no trouble maintaining a stable 60 FPS, and that was running through the Rosetta 2 x64-to-ARM CPU emulator to boot, so I have nothing to complain about with regards to gaming performance.  I might not be able to run at 4K/120 or use the latest ray tracing glitz, but I can play games fine and they look nice enough for me.

 

With regards to the former, it's sort of something that I've just accepted that I have to deal with.  But laptops have become less upgradeable all around, so if I'm going to be pushed down that road no matter which manufacturer I go with, I might as well get something that works as a good laptop rather than something that tries to be a mini desktop and, as a result, is on the bulky side and runs noisy and hot.  (I'm really at a point where I can't have my computer stuck in one place, so actual desktops are not under consideration for me right now.)  Still, while I don't mind having to pay up front to max out the CPU/GPU/RAM, the lack of at least replaceable/upgradeable storage is indeed ridiculous.

 

4 hours ago, Csupati said:

The Apple ARM chip and OS are just way better, way simpler than Windows; it's kind of ludicrous how a single 60 W chip can outperform a 135 W i9 and a 165 W 2080.

 

To be clear, I have found multi-threaded performance to be not quite as good as a 12th-gen Core i9 laptop CPU (never mind a 13th-gen, which is notably faster).  The M2 Max doesn't top the chart in CPU performance, but it is up there near the top, and it can certainly trounce anything more than a couple of years old.  (Intel sure does have to throw a lot of power at the CPU to beat the M2 Max, though; the M2 Max probably wins in terms of power efficiency.)

 

6 hours ago, jaybee83 said:

in the end, it's still going through translation layers and losing performance vs. native support.

 

Something I've been mulling over lately.  What's the difference between a translation layer and "native support", anyway?  Modern games are largely just bits of computer code that target a set of APIs, and if you swap out those API implementations with something else that works the same... is it a "translation layer" or just an "alternate execution environment"?  Could a "translation layer" still offer "full performance"?

 

...Leaving Mac aside for a moment — obviously it is running a whole different CPU architecture so it must be translated if you want to run x86/x64 games.  Let's look at trying to run Windows games on Linux, on a PC with a typical x64 CPU.

 

Modern Windows games are generally just x64 binaries (with supporting x64 libraries & data files) that call various Windows APIs to handle the things they need (opening files, playing sounds, network chatter, etc.) and use the DirectX 11 or 12 APIs for graphics.

 

On Linux, running on an x64 CPU, executing the code from the game binary should not be any slower than it is on Windows.  What you have to worry about are the external API calls that the game makes, which flat out don't exist on stock Linux.

 

Valve, CodeWeavers, and others have basically cooked up replacements for the APIs (Wine, plus DXVK & VKD3D for DirectX/graphics).  The game doesn't care if it is making calls to "real" Windows+DirectX or to Wine+DXVK, as long as the implementation is good and the right stuff happens, it just executes the same code and the game chugs along.  And, if the "alternate API implementations" are good enough, there is no real reason why it should be "slower" than running a game on Windows directly.  After all, it's still just the same sort of CPU churning through the same x64 game code, in the end, just the parts where it calls out to the OS to do something have been swapped out.  And, you'll find that under Proton, many games run just as well as they do on Windows ... and sometimes, even better.
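To make the "same game code, different API backend" idea concrete, here is a minimal Python sketch (purely illustrative; these classes are invented stand-ins, not real Wine or DXVK code):

```python
# Illustrative only: the "game" is written against an API surface and runs
# the exact same code whether the backend is the "real" one or a reimplementation.

class D3D11Device:                       # stand-in for the API the game targets
    def draw(self, vertices): ...

class WindowsD3D11(D3D11Device):
    def draw(self, vertices):
        print(f"[real D3D11] drawing {len(vertices)} vertices")

class DXVKShim(D3D11Device):
    """Same interface, but forwards to a Vulkan-style backend instead."""
    def draw(self, vertices):
        print(f"[DXVK -> Vulkan] vkCmdDraw({len(vertices)})")

def game_frame(device: D3D11Device):
    # The game's own logic is unchanged; only what sits behind the call differs.
    device.draw([(0, 0), (1, 0), (0, 1)])

game_frame(WindowsD3D11())   # running "natively"
game_frame(DXVKShim())       # running through the "translation layer"
```

If the replacement implementation is as good as the original, there is no inherent reason the second path has to be slower.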

 

There are obviously issues when the behavior of the alternate implementation doesn't match Windows's behavior, which is why some games are broken (DirectX 12 titles in particular, since that API is newer), but if you compare Proton today to just three years ago, you will see that they have made incredibly rapid progress in fixing that up.

 

On the Mac side, to pull this off you need both a replacement backend for the OS APIs (something like CrossOver with D3DMetal) and translation of the CPU instructions from x64 to ARM (Rosetta 2).  The biggest performance hit right now seems to come from translating graphics API calls (D3DMetal or MoltenVK), but that has been doing nothing but getting better over time.  D3DMetal performance has gone up something like 20%-80% (depending on the game) over just the past three months or so.  On the CPU translation side, Rosetta 2 performance actually seems to be really good, again going by my experience playing Shadow of the Tomb Raider, which has not been compiled for ARM.  It does have a one-time performance hit when you first launch the game, as it basically trawls through the executable and translates all of the x64 code that it can find to ARM code.  It might happen again when you hit new bits of code (i.e., when getting from the menus to the game proper for the first time).  The results are cached, so you don't have to wait for the same translation a second time (unless the game is updated).  This is probably why I originally complained that loading times seemed longer than they should be early on; that problem quickly went away.
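The translate-once-and-cache behavior can be sketched in a few lines of Python (a loose analogy for how Rosetta 2 is described as working; the cache key and the "translation" string here are invented for illustration):

```python
import hashlib

# digest of an x64 code block -> its (pretend) ARM translation
translation_cache: dict[str, str] = {}

def translate_block(x64_bytes: bytes) -> str:
    key = hashlib.sha256(x64_bytes).hexdigest()
    if key in translation_cache:
        return translation_cache[key]        # later launches: instant cache hit
    # Expensive one-time step: scan the block, emit equivalent ARM code.
    translated = f"arm64<{len(x64_bytes)} bytes translated>"
    translation_cache[key] = translated
    return translated

block = bytes.fromhex("4889e5")              # pretend x64 machine code
print(translate_block(block))                # slow path: first launch
print(translate_block(block))                # fast path: cached
# A game update changes the bytes, hence the digest, so it gets retranslated.
```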

 

So, is there performance overhead? ......... In some cases, yes, though it is shrinking as these technologies mature, and there are groups of people that are working really hard on "unlocking" games from the environment that they were written for.

 

Honestly, the whole approach of making alternate implementations of backends that games need (and ideally, open source ones) is critical for long-term preservation.  There's been chatter recently following this study that, unlike other media like music and films where translating from one format to another is relatively easy, the vast majority of games ever made are currently unavailable to general users.  If you are like me and like playing older games as well as new ones and don't want to have to maintain a collection of old hardware, then you've gotta get used to making your own backups and playing in an emulator or translation environment of some sort.

 

 

Heh.  That's all of my off-topic ranting for now!


5 hours ago, Aaron44126 said:

 

Exactly right.  There are tons of choices with various pros and cons. [...] Heh.  That's all of my off-topic ranting for now!

 

bro @Aaron44126 be like

[Obama mic-drop GIF]


On 9/26/2023 at 1:58 PM, jaybee83 said:

welp, one can dream...i like the way that OP it thinking. however, in the end, Clevo is now following the money trail, thin n light BGA crap FTW! 😢

I can only confirm that!! Unfortunately Clevo, like the other manufacturers, depends on interest-based money... The current monetary system forces the manufacturers into such measures... Only when this monetary system is cleaned up again could you see something different... The future will also show how customers react to it all; fewer sales will be made as a result. The question is whether people will want to keep affording that. In the past, a P870xx with immense equipment cost 4000 euros; today you get a 2 cm book for 5000 euros that lasts two years and is not upgradeable!!!


On 9/26/2023 at 3:29 PM, win32asmguy said:

Only on certain implementations like 45W H series 11th Gen and up, but not necessarily on any HX chips.

 

Also, TB4 has increased latency due to PAM signal encoding, so it always feels slower than PCIe even at the same bandwidth.

That's what I'm saying: many only look at the bandwidth and forget that every chip coupled in between adds latency. From a purely physical point of view, this causes real-time delays. Another example is PCIe 3.0 vs. 4.0, where there is little difference in games even though the data rate should be much faster. Nothing has changed physically on the data line, so the transmission is approximately the same and the CPU works just as fast. There is no huge difference between 9th gen and 13th/14th gen...
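A quick back-of-envelope sketch in Python makes the PCIe 3.0 vs. 4.0 point concrete (the per-frame traffic figure is an assumption for illustration, not a measurement):

```python
# Why PCIe 3.0 -> 4.0 rarely shows up in games: per-frame bus traffic is tiny
# compared to the frame time, so halving the transfer time saves almost nothing.
PCIE3_X16_GBS = 15.75              # ~usable GB/s, PCIe 3.0 x16
PCIE4_X16_GBS = 31.5               # ~usable GB/s, PCIe 4.0 x16
FRAME_TIME_MS = 1000 / 60          # 60 FPS target
TRAFFIC_MB_PER_FRAME = 8.0         # assumed CPU->GPU traffic per frame

for name, gbs in [("PCIe 3.0 x16", PCIE3_X16_GBS), ("PCIe 4.0 x16", PCIE4_X16_GBS)]:
    transfer_ms = TRAFFIC_MB_PER_FRAME / (gbs * 1000) * 1000
    share = transfer_ms / FRAME_TIME_MS
    print(f"{name}: {transfer_ms:.2f} ms of a {FRAME_TIME_MS:.1f} ms frame ({share:.1%})")
# ~0.51 ms vs ~0.25 ms: a quarter of a millisecond back out of a 16.7 ms frame.
```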


On 9/26/2023 at 5:58 AM, jaybee83 said:

welp, one can dream...i like the way that OP it thinking. however, in the end, Clevo is now following the money trail, thin n light BGA crap FTW! 😢

 

It is disappointing that they do not have a modular GPU interconnect. Hopefully it is already in development and can be released next year. I have noticed that the BGA motherboards are much more expensive than the old LGA + MXM models. For instance, the X170KM motherboard is 420 USD, but the NP50PNJ with a 12700H and 3050 is 875 USD.

 

Things are getting better on the software side, though. We can get a Coreboot + open source EC supported Clevo X370 from System76, which is very flexible to tune and tweak with the right expertise and desire. It is the first 13th-Gen laptop I have seen that lets you configure every aspect of power limits and fan control.


On 10/1/2023 at 6:38 PM, win32asmguy said:

It is disappointing that they do not have a modular GPU interconnect. Hopefully it is already in development and can be released next year. I have noticed that the BGA motherboards are much more expensive than the old LGA + MXM models. For instance, the X170KM motherboard is 420 USD, but the NP50PNJ with a 12700H and 3050 is 875 USD.

The monetary system with interest is decisively involved in that! It will get even worse, I think...


On 10/1/2023 at 6:38 PM, win32asmguy said:

 

It is disappointing that they do not have a modular GPU interconnect. Hopefully it is already in development and can be released next year. I have noticed that the BGA motherboards are much more expensive than the old LGA + MXM models. For instance, the X170KM motherboard is 420 USD, but the NP50PNJ with a 12700H and 3050 is 875 USD.

 

Things are getting better on the software side, though. We can get a Coreboot + open source EC supported Clevo X370 from System76, which is very flexible to tune and tweak with the right expertise and desire. It is the first 13th-Gen laptop I have seen that lets you configure every aspect of power limits and fan control.

Yes, the X170 motherboard is cheaper because it doesn't have a GPU and CPU; if you buy an i9-10900K and a 3080 for it, it will probably cost about 1.5k USD.


On 3/28/2023 at 7:45 PM, KabyZen said:

My question is: since MXM is dead, could Clevo jump on this and start using this standard to bring back modular GPUs? Could we see a revival of the P870? 😃😮

That would be my solution for keeping the MXM laptop (P870xx) supplied with new graphics card generations! :-) And another picture shows my latest system spec :-)

MXM_2.jpg

P870TM_1.jpg


If I understand right, your 4070 Ti is in an eGPU enclosure, but instead of connecting it to a TB3 port it's connected directly to the MXM interface? Can you post a picture?

7950X3D| Zotac 4090 AMP Extreme Airo| MSI MPG B650 Edge Wifi| Lian Li Galahad 360 V2| 48GB GSkillTrident Z RGB 7600|Kingston KC3000 2TB| Fury Renegade 2TB| Lian Li O11 Dynamic Evo| Corsair HX1500i| Samsung Odyssey G9 Neo

Asus Zephyrus G15 (Ryzen 9 6900HS + RTX3080)

 


24 minutes ago, cylix said:

If I understand right, your 4070 Ti is in an eGPU enclosure, but instead of connecting it to a TB3 port it's connected directly to the MXM interface? Can you post a picture?

The RTX 4070 Ti is connected directly to the MXM port with x16 lanes, yes! Nothing with TBT or other things. It's still at an experimental stage, and an RTX 4080 should be installed in the P870xx soon :-) I have indicated in the picture where the card sits in the notebook. I don't want to take pictures of my ideas for now, but I will in due time.


I see, so the 4070 Ti PCB is inside the laptop? Crazy stuff, but what about the power and the cooling? The 40xx cards have a massive cooling system. Does it make the laptop thicker than it is? Yeah, I understand, take your time, and when you're ready post some pics; I am very curious about the looks. Now you just need to get some new-gen CPUs in there 😄

 


51 minutes ago, cylix said:

I see, so the 4070 Ti PCB is inside the laptop? Crazy stuff, but what about the power and the cooling? The 40xx cards have a massive cooling system. Does it make the laptop thicker than it is? Yeah, I understand, take your time, and when you're ready post some pics; I am very curious about the looks. Now you just need to get some new-gen CPUs in there 😄

 

The P870xx then has real desktop GPU power and real desktop CPU power. I think this is a novelty 🙂


3 hours ago, Developer79 said:

That would be my solution for keeping the MXM laptop (P870xx) supplied with new graphics card generations! :-) And another picture shows my latest system spec :-)

MXM_2.jpg

P870TM_1.jpg

There was a full desktop GTX 980 MXM card (it was called the GTX 990M). That was very good when it was released. I hope Clevo reconsiders that line.


23 hours ago, Csupati said:

There was a full desktop GTX 980 MXM card (it was called the GTX 990M). That was very good when it was released. I hope Clevo reconsiders that line.

I rather think not! They are forced by the shareholders, the pressure of contracts, and the monetary system... It would be nice of course, but at the moment it does not look like that, as I said... The GTX 990M was still MXM form factor, though. I'm talking about a complete desktop card :-)


48 minutes ago, Developer79 said:

I rather think not! They are forced by the shareholders, the pressure of contracts, and the monetary system... It would be nice of course, but at the moment it does not look like that, as I said... The GTX 990M was still MXM form factor, though. I'm talking about a complete desktop card :-)

It's stronger than a desktop 980 because the desktop variant has 4 GB of VRAM and the Clevo 980 has 8 GB; all other specs are the same. So it was a beast in its time, and power hungry with a 100-200 W TDP. gtx 980 laptop and a review


35 minutes ago, Developer79 said:

I know what you mean. My development allows running cards with 300-400 W in the P870xx... :-)

yes, i know, im following it 😄. in those days the 980 MXM was a good card for the P870DM/-G, but by the KM, and later the TM, it was weak compared to the 1080 and 20xx series.


On 10/6/2023 at 10:25 AM, cylix said:

I see, so the 4070 Ti PCB is inside the laptop? Crazy stuff, but what about the power and the cooling? The 40xx cards have a massive cooling system. Does it make the laptop thicker than it is? Yeah, I understand, take your time, and when you're ready post some pics; I am very curious about the looks. Now you just need to get some new-gen CPUs in there 😄

 

Yes, the electrical power comes directly from the two graphics card cables! I have developed a complete internal power supply for the P870xx... Yes, the cooling will be as thick as the P870xx's vapour chamber... The CPU will be difficult because the newer generations have a different power protocol... 10th Gen still works; the others would involve a lot of effort...


21 hours ago, Developer79 said:

Yes, the electrical power comes directly from the two graphics card cables! I have developed a complete internal power supply for the P870xx... Yes, the cooling will be as thick as the P870xx's vapour chamber... The CPU will be difficult because the newer generations have a different power protocol... 10th Gen still works; the others would involve a lot of effort...

dude that sounds amazing. keep us updated on your progress, im sure the community would be VERY interested in this 🙂 

