As a person who wants to play games on a Mac, sometimes I feel like Charlie Brown trying to kick the football. But Apple's custom silicon has me hoping again. I keep seeing more and more stories like this where ported games don't just work "acceptably", but actually work better on M1 and M2 chips.
Apple's hardware is unquestionably very good now, and their graphics APIs are actually seeing some uptake. The recent stories about Resident Evil Village especially sound positive.
Be careful; the only performance comparison they made was between the x86_64 builds and the arm64 builds. Both were done on the same hardware, with the x86_64 build necessarily running in emulation. This only proves that the ported game runs better than the non-ported game run in emulation, not that it runs better on an M1 Mac than on other computers.
The post actually mentions what map they used for benchmarking, and linked to a bunch of other benchmarks on the same map[0].
They quote around 200 UPS average. It's hard to compare to the linked benchmarks because those quote p75 numbers instead of average, but it seems like the results are in the same general ballpark as the Ryzen 9 5950x.
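Side note on why those figures are only a ballpark comparison: a mean and a p75 can diverge quite a bit when performance is uneven. A quick Python sketch with made-up UPS samples (invented for illustration, not taken from the linked benchmark page):

```python
import statistics

# Hypothetical UPS samples from two machines (invented numbers, not
# from the linked benchmarks). Machine A is steady; machine B has the
# same mean but alternates between fast and slow stretches.
machine_a = [200, 201, 199, 200, 200, 200, 201, 199]
machine_b = [230, 230, 230, 230, 170, 170, 170, 170]

def p75(samples):
    # 75th percentile: the value that 75% of samples fall at or below.
    ordered = sorted(samples)
    return ordered[int(0.75 * (len(ordered) - 1))]

for name, s in (("A", machine_a), ("B", machine_b)):
    print(name, round(statistics.mean(s)), p75(s))
```

Both toy machines average 200 UPS, yet their p75 figures differ, which is why an average from one writeup and a p75 from another can't be compared head-to-head.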
Wow thanks for this.
I noticed the 13700k results are surprisingly lackluster. The highest speed memory kit there is 5600. I just got a 13700k with Hynix A-die (now clocked to 7200). I have an AIO arriving in the mail in a few days to replace my NH-D14. The 13700k has quite a bit of headroom when not thermally constrained.
I think that the 13700k and 13900k with the same turbo ratio should perform almost the same in gaming workloads. The only difference should be the 36 MB vs. 30 MB of last-level cache. It's a modest difference, but Factorio is sensitive to memory-subsystem performance.
I'll add a benchmark to that page in a few days with a 5.8 GHz clocked 13700k to test the theory.
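On the point that Factorio is memory-subsystem sensitive: the underlying effect is easy to demonstrate with a pointer-chasing sketch (illustrative only, nothing to do with Factorio's actual code). Hopping randomly through a working set much larger than the last-level cache misses on nearly every access, while a sequential walk mostly hits, and that per-access latency is exactly what bigger caches and faster memory buy down:

```python
import random
import time

N = 1 << 21  # ~2M entries, comfortably larger than a typical L3 cache

# Sequential chain: each slot points to the next, wrapping at the end.
seq = list(range(1, N)) + [0]

# Random chain: the same slots linked into one big shuffled cycle.
perm = list(range(N))
random.shuffle(perm)
rand = [0] * N
for a, b in zip(perm, perm[1:] + perm[:1]):
    rand[a] = b

def chase(chain, steps):
    # Follow the chain; each hop depends on the previous load, so
    # runtime is dominated by how often a hop misses cache.
    i = 0
    for _ in range(steps):
        i = chain[i]
    return i

for name, chain in (("sequential", seq), ("random", rand)):
    start = time.perf_counter()
    chase(chain, 1_000_000)
    print(f"{name}: {time.perf_counter() - start:.3f}s")
```

(Python's interpreter overhead mutes the gap; the same experiment in C shows the cache-miss penalty far more starkly.)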
Update for posterity: the ALF II 280 actually performed the same as my NH-D14 in thermal stress testing. I ran y-cruncher as an all-core workload, which reliably thermal throttled at 100 C @ 220 W. The achievable clock frequency depends on the voltage supplied, which has been a pain to tune. I don't think this chip can go beyond 5.6 GHz stable without adding so much voltage that it actually performs worse in most workloads. 5.7 GHz can be made borderline stable for most workloads, 5.8 GHz is unstable, and 5.9 GHz does not boot. I know adaptive voltage mode should be able to address this, but something about the BIOS is misbehaving. These results refute my theory that the 13700k could match the 13900k; the 13900k is in fact well-binned.
That doesn't mean the 13700k can't match (or exceed) the out-of-box Factorio performance of the 13900k when given better memory, hence the score of 304 UPS.
Bonus: the E-cores have thermal headroom at stock and can be stable at 4.5 GHz if given +0.1 V, but this cuts into the P-cores' thermal headroom in all-core workloads and lowers overall performance. Bumping from 4.2 GHz to 4.3 GHz with no voltage increase is stable.
Not sure about the NH-D14 with a 13700k, but my NH-D15 (upgraded from an NH-U12A) is running the 13700k fine with Intel's stock PL1/PL2 settings, hovering around 60-70 C during full load at 25 C ambient. Unlimited PL2 is a different story and can go up to 100 C during full load. My previous NH-U12A build ran about 2-4 C hotter under load.
I've been experimenting with different settings and found that unlimited PL2 plus a -150 mV undervolt gives the best temperature-to-performance tradeoff, at ~80 C during full load. It has been running stable for a few days, and I'm pretty happy with the result so far.
For stability testing I like y-cruncher. It teases out edge cases that many XMP profiles are unstable under. I'd rather have a slightly less efficient system that I'm confident won't error.
Also, those numbers aren't far off the out-of-box behavior I had, but I like to tinker. I throttle with PL set to 190 but not at 180.
I'd avoid Intel entirely if heat/power efficiency matter to you at all. AMD has had acceptable performance with far better heat/power use across the board for a few years now.
> with the x86_64 build necessarily running in emulation
Rosetta 2 is not emulation at all; it's AOT static binary translation, backed by hardware that implements Intel-specific behavior, from the latest chips down to the oldest 8080 or something. It's eerily fast.
In fact, x86_64 code translated to arm64 and run on Apple Silicon can often be faster than the same x86_64 code running on the latest Intel Macs.
So you really have to ask two questions here:
- does the x86_64 Factorio build run faster on Apple Silicon than on a comparable† Intel?
- on Apple Silicon, does the arm64 Factorio build run faster than the x86_64 Factorio?
>Rosetta 2 is not emulation, at all, it's AOT, static binary translation, backed by hardware that implements Intel specific behaviour from the latest chips down to the oldest 8080 or something.
Imagine you only speak English and you want to read a novel in French.
Emulation: you hire a translator to read the novel to you. They translate each word while reading.
Static translation: you hire a translator to transcribe the book from French to English. They give you a printed book purely in English. But simple French words like flâner and râler are expanded into lengthy passages because there is no simple English translation.
Rosetta 2: you hire the translator to transcribe the book to English, but they leave in unique French words and teach you what they mean so you can understand them in an English phrase without even noticing that the word isn’t “real” English.
Rosetta 2 isn’t emulation because no instruction is translated on the fly to a different ISA. It’s static translation plus ISA extensions. There is no lower level emulating anything.
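The difference is easy to see with a toy example. Here's a sketch in Python with an invented two-instruction "guest ISA" (nothing to do with how Rosetta 2 is actually implemented): the emulator decodes every instruction each time it runs, while the AOT translator decodes once up front and emits host code that then runs directly.

```python
# Toy "guest ISA": a list of (op, operand) tuples. Invented for illustration.
program = [("add", 5), ("mul", 3), ("add", 1)]

def emulate(program, acc=0):
    # Emulation: decode and dispatch every instruction on each run.
    for op, n in program:
        if op == "add":
            acc += n
        elif op == "mul":
            acc *= n
    return acc

def translate(program):
    # AOT translation: decode once, emit "native" code (here: Python
    # source), then run the translated output directly, no decode step.
    body = "".join(
        f"    acc {'+=' if op == 'add' else '*='} {n}\n" for op, n in program
    )
    src = f"def translated(acc=0):\n{body}    return acc\n"
    namespace = {}
    exec(src, namespace)
    return namespace["translated"]

translated = translate(program)
assert emulate(program) == translated() == 16  # (0 + 5) * 3 + 1
```

Both paths compute the same result; the translated function simply no longer pays the per-instruction decode cost.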
As a slight correction, I believe Rosetta 2 also has a JIT mode, which is a bit more like conventional emulators. But it's used infrequently, eg when dealing with x86_64 apps that themselves use a JIT.
It does have JIT translation (though not a JIT "mode" as such: it always uses AOT translation, relying on JIT translation at runtime only for the parts that need it)
> which is a bit more like conventional emulators
Not at all†, Rosetta 2 does the same†† translation step on dynamic Intel code, whose arm64 output can be reused afterwards
> But it's used infrequently, eg when dealing with x86_64 apps that themselves use a JIT
Yes, although it's more like "exceedingly rarely" in practice since usually those interpreters are up to date enough to have a native arm64 release.
Thanks, that is definitely correct. Rosetta 2 can do JIT, and it gets exercised for native JIT / dynamic code.
I could probably extend the metaphor to an avant-garde French novel that asks the reader to look up and include today's headlines from Le Monde, but it was already stretched.
Factorio runs quite well on the M1. The graphics system (FPS) is partially decoupled from the factory simulation side (updates per second, or UPS), so there are two components to performance. UPS mostly depends on how big and complex your entire factory is, and on what mods you're running, and FPS mainly depends on how many sprites are on screen. FPS is limited to be <= UPS, since there's no point in redrawing until the game state changes, but UPS can be greater.
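That cap can be sketched in a few lines (my own simplification, not Factorio's actual main loop): the renderer only draws when the simulation has produced a new tick, so the frame count can never exceed the update count.

```python
def run_one_second(ups, gpu_fps_capacity):
    # Toy model of one second of game time (a simplification, not
    # Factorio's real loop). The simulation produces `ups` ticks; the
    # renderer draws at most `gpu_fps_capacity` frames, and only when
    # the game state has changed since the last draw.
    frames = 0
    last_drawn_tick = None
    for tick in range(ups):
        can_afford_frame = frames < gpu_fps_capacity
        state_changed = tick != last_drawn_tick
        if can_afford_frame and state_changed:
            frames += 1
            last_drawn_tick = tick
    return ups, frames

# Slow sim, fast GPU: FPS is clamped to UPS.
assert run_one_second(45, 60) == (45, 45)
# Fast sim, slow GPU: UPS can exceed FPS.
assert run_one_second(240, 60) == (240, 60)
```

The two asserts correspond to the two regimes described above: a simulation-bound game redraws only as fast as it updates, while a render-bound game keeps simulating past what it can draw.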
FPS: Despite being a 2D sprite game, sometimes it has trouble keeping FPS at 60, at least when running at max graphics and max zoom level with a graphically intensive mod. I would guess it's using OpenGL, and Apple's OpenGL stack isn't great. You can see the article mentioning the M1 Max only hitting 45 FPS in one of the tests, and this is without mods (but with a huge base and presumably a wide zoom level). In my experience, if you adjust the graphics settings appropriately (eg max sprite atlas size and max vram usage, since integrated graphics use unified memory), you can usually keep it at a smooth 60 FPS 99% of the time even in graphically-intensive setups with max or almost-max quality settings.
UPS: Scoring 199 UPS on the flame_sla 10k base puts the M1 Max above any other laptop processor for that benchmark. This matches my experience: the simulation part of the game almost never lags, except for unavoidably heavy operations (eg generating new worlds when playing with mods that do that). See a comparison at:
Yeah, Factorio is multithreaded, but in practice it usually only runs a few threads in parallel. Instead, its performance is determined in large part by the memory subsystem, which is why the X3D processors do so well. That's probably also part of the M1's great performance: with a large cache and on-package DRAM, it has very competitive bandwidth and latency.
Some anecdata: I've played Factorio on both an Intel MBP (2018) and an M1 MBP (2021), and even under Rosetta the performance blew away the Intel machine. Running the M1-native build means even faster performance with a lower power impact.
The current Steam problem on macOS is more about 32-bit games that never received 64-bit builds; those haven't been playable since Catalina, independent of the Intel vs. Apple Silicon question. Notable examples are the first-party GoldSrc and Source engine games and all of their third-party derivatives.
I wouldn't really care if my game library ran through Rosetta 2; I'd rather take a theoretical performance hit (vs. a native arm64 build) than be outright unable to play.
At one point even Intel Macs broke compatibility with every game that was 3 years old or something stupid like that. I remember that my working game simply refused to launch after a macOS version upgrade: the exact same machine, refusing to run my software overnight.
This kind of attitude just isn't conducive to gaming, where people like to build libraries on Steam and expect everything to keep working for a long time.
On my PC, I can fire up games from 20 years ago and they work perfectly. Witcher 3, a 7 year old game, is getting an overhaul. I expect no problems in downloading it on steam from my library and playing it seamlessly on my relatively new PC.
Yep, they killed 32-bit compatibility and it's annoying.
IIRC 64-bit Windows finally killed Win16 support, but that was rarely used for games, and those games you can run in DOSBox (which, amusingly enough, works fine on Mac in many cases).
Meanwhile I've been happy playing games like Against the Storm in Parallels (virtualized arm64 Windows). Crusader Kings 3 performs better in Parallels than the native macOS build does.
Does Proton work for Mac titles, or is it only for Windows games? I've been able to play basically every game in my Steam library on Linux using Proton. I have hundreds of games.
An alternative would be using CrossOver (which pulls from Wine and adds stuff like MoltenVK). Proton does the same (pulling from Wine and adding stuff, but not MoltenVK) and vendors it internally; Valve "just"† doesn't pull from the CrossOver changes, nor does it offer Proton on macOS.
† scare quotes because it may not be as easy as it seems
Proton helps you run Windows games on Linux. A few years ago, the problem with Mac games on Steam was running 32-bit Mac games on 64-bit-only macOS. Now, it's running x86 Mac games on arm64 Macs. Often, we're talking about those same 32-bit games that never got updated to 64-bit x86, let alone arm64.
IIRC from the benchmark, the M1 has some truly great single-core performance. Then again, Apple stuff is so fucking expensive that you can easily get faster for cheaper...
I get the impression that the lack of releases on the Mac is generally down to non-technical reasons.
Building for ARM is probably not a big challenge if you’re already building for Apple’s x86 toolset.
Metal would be more of a challenge I imagine, but a bridge probably worth crossing all else being equal (or one that you don’t need to cross at all if you’re using something like unreal or unity).
The Mac isn't a games platform because Apple hasn't shown much interest in the mainstream gaming market, and I can't imagine major publishers are eager to fork over a third of their revenue on the App Store for sales they'll probably pick up elsewhere without more work and cost. Sure, there's Epic and Steam on Mac, but they're ghost towns, and publishers are likely waiting to see how the EU Digital Markets Act shakes out globally anyway (as other governments are pressured to provide the same freedoms).
There was talk at one point of Apple working on a game console (a more powerful Apple TV) but who’s the market for that?
They’ll not be cost-competitive with Xbox or content-competitive with PlayStation and Nintendo.
At best they’d be likely to produce a similarly powered box with little content and a high price tag in a market already retailing hardware below cost price.
The most important "goalpost" is: every major cross-platform AAA game gets released simultaneously on Windows, Xbox, and PlayStation. When macOS gets added to that list in a consistent manner, people will consider it a "major" gaming platform.
Having access to a lot of mobile games doesn't matter for hardcore gamers.
Gaming on a Mac is like gaming on a PC, just worse. There are fewer games, and every game on Steam that supports Mac also supports Windows. Performance is usually worse too. For gaming it's just a worse PC. And it's usually more expensive. And there hasn't been an exclusive worth playing since like 1997. With consoles there is at least stuff you can only play there, plus an extra ease-of-use layer, plus they're cheaper. Mac is just all downside for gaming.
The goalposts haven't moved. Where are the games? Macs will game when macs can game. The gaming industry has chosen the APIs they are willing to support. When Apple meets them there then macs will be able to game.
> The gaming industry has chosen the APIs they are willing to support.
The gaming industry will go where the money is.
The majority of the industry's revenue is now on mobile, and the lion's share of mobile revenue is on iOS and Metal.
I think Apple Silicon Macs will end up benefiting from studios having experience with Metal on iOS in the same way that Xbox benefited from studios having experience with Direct3D on Windows.
Disagree on the last point: I'm a university student who spent $1k on a MacBook Air, and I don't have a console or gaming PC because I can't afford one.
I study Computer Engineering, so having a good laptop was important. Prior to this I owned a refurbished ThinkPad running Linux, which cost me around $500 but had multiple performance and speed issues, to the point where buying an M1-class machine was almost necessary for me.
From what I understand, Apple decided to force Metal upon everyone instead of using Vulkan like the rest of the industry. This causes friction with game development.
If you want to claim that Direct3D is king, I'll agree with you. I'm merely pointing out that all of the DirectX libraries are fairly irrelevant when they run just fine without DirectX.
Ooof, I've seen some bad takes but this one is a really bad one.
DXVK is currently only used as a drop-in replacement for DX9 games. DX10 has been forgotten (thank god), DXVK's D3D11 implementation is not good, and more and more games are moving to D3D12, which affords a hell of a lot of control.
Additionally, DirectX is not just a graphics API: it's also sound (XACT and XAudio2), ray tracing (DXR), input handling (XInput & DirectInput), CUDA-like compute (DirectCompute), storage handling (DirectStorage), etc. Most of these have either an alright equivalent (DXR has an equivalent in Vulkan with extensions, and that's about it; Valve's input implementation is _really good_, but it's not a usable API as far as I know) or a wildly inferior alternative (at least for the PC space that is Windows/Linux/macOS). D3D12 is also one of the drivers of new GPU programming features in the PC space (once again, the console side of things is a bit weirder, although MS does bring some lessons in from the Xbox side), while Vulkan is kind of stuck doing everything as extensions that may or may not be available, and Metal is still a piece of shit.
Remember, to start, Windows only officially supports DirectX. OpenGL and Vulkan comes from your GPU vendor and Microsoft waives all responsibility for them. Vulkan is, quite literally, a 3rd-party API that can run on Windows - not something Windows supports or endorses.
Xbox does not support Vulkan. DirectX or get rejected.
Only 60% of Android devices support Vulkan. Guess you’ll also need ANGLE or OpenGL for backwards compatibility.
PlayStation does not support Vulkan. Better learn gnm, gnmx, and PSSL.
Nintendo Switch has Vulkan but it is almost unusably slow, on a console that is already not known for speed. Better use NVN if you want anything decent.
iOS does not support Vulkan. Better use Metal.
So… what does Vulkan support, exactly? Windows, Linux, and not enough of Android. If your game only runs on desktop, it’s a good option - but why not target DirectX? Windows, Linux with Proton, and most of the Xbox support all in one. For this reason, I have yet to see a Vulkan game that does not have a DirectX mode.
Blaming macOS for being proprietary is disingenuous in an industry full of Proprietary APIs.
You're doing a bit of sleight of hand on Windows support.
Windows doesn't "support" the DirectX version shipped by your GPU vendor either. The drivers shipped by your GPU vendor, and all the APIs provided by them, are supported by your GPU vendor.
So the real thing we're talking about is hardware vendors. Nvidia and AMD support Vulkan, OpenGL, and DirectX where applicable. Apple only supports Metal. The console vendors have always had weird variant APIs based on the open standards but not identical, except MS, where the console API is very close to desktop DirectX.
On mobile hardware vendors ubiquitously support OpenGL ES and there's widespread support for Vulkan.
So it's complicated. In the desktop space, as a percentage of market share, Vulkan is extremely widely supported. Same with mobile. Consoles have always been an odd man out.
So Apple, which doesn't sell a console, is absolutely breaking from the pack in the markets they target.
If you consider 60% of Android users, and 0% of iOS users, "widely supported," sure. That's less than half of mobile phones in use right now, making Vulkan the odd-one-out on mobile as well. You certainly can't build a mobile app right now that only uses Vulkan without cutting out huge parts of your audience.
> So Apple, which doesn't sell a console, is absolutely breaking from the pack in the markets they target.
Apple wants the same API on all of their devices, and I can't blame them. They are the odd-ones-out in Desktop only.
But does that really matter? If you are making a game only for desktop, namely Windows, and weren't going to just use DirectX for some reason, it does matter (which I think, nowadays, is a rare situation). But if you are targeting any game consoles, or any mobile phones, you're adding multiple graphics APIs anyway and Metal is just another one.
> If you consider 60% of Android users, and 0% of iOS users, "widely supported," sure.
But that's what we're talking about, if it weren't for Apple Vulkan would be a near ubiquitous desktop and mobile API. Apple is the one standing in the way of that.
If Vulkan were a near ubiquitous mobile and desktop API, _maybe_ the console vendors would be more willing to tolerate it.
Microsoft can't be expected to just give up the gatekeeping that is DirectX on their own (well, maaaybe if antitrust was more effective ??).
Sony only cares about their console(s), so they prefer to focus on optimization rather than compatibility. (While Nintendo cares more about gameplay than graphics.)
Isn't that 40% of Android devices old or very cheap?
Apple is one of the biggest (and especially, most profitable) companies in the world, and with great power comes...
So "Microsoft can't be expected", "Sony doesn't care" but we should only blame Apple. Gotcha.
Because only Apple has the power and hence the responsibility, not the tiny helpless companies Microsoft and Sony (~99% of consoles, 76% of desktop).
Edit: don't forget, Apple absolutely must support Vulkan (and others don't) because Metal is proprietary and non-cross-platform (just like any other graphics API on all major platforms) even though Vulkan appeared two years later than Metal (but neither Microsoft nor Sony are expected to drop their APIs which also appeared earlier than Vulkan).
If you're _really_ aiming for cross platform support, an RHI is only the beginning of your problems though. Replacing the rendering code may not be _trivial_ but if you're stuck at that point you're likely going to struggle with networking, filesystem, permissions, user accounts, store requirements. Modern rendering APIs are _similar enough_ in feature sets and abstraction layers that it's not an insurmountable task.
Total addressable market matters here. 100% of Android-based VR headsets support Vulkan. Granted, that's mostly the Quest 2, but it's not the only HMD in town anymore.
Also, a lot lot lot of Android devices are garbage-tier <$100 that you wouldn't want to target anyways because you won't get any sales on them. So the % of Android devices supporting Vulkan may be misleading in the sense that you might be aiming for a segment of devices with much, much higher, if not complete, support for Vulkan.
This seems disingenuous. The previous claim was that Apple ignored an existing emerging standard, so it is relevant. Your new claim is that Metal is a bad API, which I haven't heard anyone else say. What makes it bad?
My larger point is that Apple can obviously support both APIs. It doesn't matter if Metal is good, bad or even awful, Vulkan is what people are using and Vulkan can translate from DirectX. Apple is shutting themselves off from the rest of the industry with this move, which I would argue (judging by how many Mac users wish they could game) is a bad thing.
> It doesn't matter if Metal is good, bad or even awful
I'm confused. Your previous statement was saying the main point is that Apple have kept a bad API.
> Vulkan is what people are using and Vulkan can translate from DirectX. Apple is shutting themselves off from the rest of the industry with this move, which I would argue (judging by how many Mac users wish they could game) is a bad thing.
Possibly, FSVO translate, but I don't think anyone was commenting on this new point. More with the previous points.
It could cook my breakfast for me, and it would still be useless if it exclusively targets Apple products. People use Vulkan because it targets multiple platforms. There's nothing stopping Apple from providing both APIs, and if Apple's API is truly nicer to use than Vulkan, then Apple has nothing to worry about.
Vulkan is counter to Apple's vertical integration approach. It would be a foundational API, sitting between their graphics hardware and their OS, yet one they don't control.
Better for someone else to build a Vulkan API on top of Metal, which is what has happened. It's not perfect, but it's the only thing that can work. The pressure on Apple should be for Metal to better support Vulkan by providing APIs it needs to work optimally.
Beyond that, Apple might want to contribute to the Vulkan-on-Metal implementation... though they're only going to do that if it makes strategic sense, which I don't see. For cross-platform, what makes more sense is to encourage games to use a higher-level engine that supports metal among its platforms, like Unity and Unreal.
Well, thus far it has failed. We have Factorio, Tomb Raider and one of the Resident Evil games running on macOS, most of which had to implement Metal by hand. If you're right, Apple's strategy needs to change.
The thing is, you need a whole game to work on macos, not just the graphics. Vulkan is just a cross-platform graphics API, not a cross-platform framework.
Typically, if a game maker wants to make cross-platform a priority, they wouldn't target just Vulkan, they'd target a cross-platform framework. That would be true whether Vulkan was supported by Apple or not. And if they don't make cross-platform a priority, the chances of a mac port go down regardless.
So...
We're looking at the incremental gain of Apple providing first-party support for Vulkan vs the existing third-party support. Looks like a lot of work for Apple for little gain. Just doesn't seem worth it to me. Also, the Vulkan version would always be out-of-date since Apple would pin the supported version to an OS release, and would need to be conservative about it, since they aren't going to hold an OS release for Vulkan.
Really, Vulkan on macOS is much better done by the interested third parties, and the focus on Apple should be to get them to better support a Vulkan API on top of Metal.
And a ton of other games because things like Unity make that easy. If you aren’t focused on AAA, you should have no problems finding more games than most of us have time for.
That's part of the problem. macOS actually had the same Proton compatibility layer as Linux a few years ago, but with Catalina Apple changed what you're allowed to run on your Mac and it broke completely. Neither Valve nor CodeWeavers has gotten DXVK or Proton to run on macOS since, IIRC.
That doesn't sound right. CodeWeavers has been shipping a fairly recent version of DXVK alongside MoltenVK since CrossOver 20. On the community side there's Gcenx/DXVK-macOS[1], which patches DXVK to work around some Apple GPU quirks in some games and closely tracks upstream.
Apple deprecated 32-bit support a year before Apple Silicon came out, which allowed them to target 64-bit processor speed more precisely, without legacy baggage. Apple's chips had already dropped 32-bit on iOS years before, and they didn't want to bring it back for macOS.
Android is following with 32-bit deprecation this year, actually. Pixel 7 doesn’t support 32-bit apps.
Except for brief blips here and there Apple seems to not put any priority on gaming. It's a bit disappointing because they could have a really strong showing if they cared.
FWIW, addition of MetalFX upscaling (comparable to DLSS/FSR 2) in Ventura has been a nice surprise and one of the biggest development on Mac gaming in recent memory.
Digital Foundry did some review of MetalFX in Resident Evil Village[1] and was pretty positive about it. (From DF's findings, MetalFX has some problems with transparent texture, but details preservation/restoration are pretty good.)
It's not so much technical as business focus. There's a big issue with their App Store cut on in-app purchases and whether they want to set a precedent by lowering/changing the rules to accommodate games like Fortnite and the third-party stores (like Steam and whatever Epic's is called). If they found a solution to that, they'd then need high-quality and timely ports: that's a whole organization of marketing, bizdev, devrel, etc. people.
At a technical level it seems like they could get more console-like levels of tuning for their platforms. Very few chips to support, only a handful of thermal targets, all chips have a common CPU/GPU architecture, all devices have very fast storage. Conceivably they could field a winning platform for competitive gaming.
My guess is they look at Sony & Microsoft and don't see much value in reshuffling priorities to likely just be #3.
Edit: hajile above convinces me that rather than concerns about spending money to be #3 they probably already are in the top few by gaming revenue and could have lots of reasons for not being more aggressive about taking more share.
With Apple being the largest game company by revenue, they do take it seriously, but for every dollar made from an AAA game, there's probably 20-50 to be made on less "hardcore" games. Thus far, Apple is serious about money rather than capability.
That seems to be changing though as there are pretty strong supply chain rumors that they are working on AR/VR headsets (allegedly delayed due to terrible market conditions and some supply chain issues). They've also made pretty big investments into their subscription game service.
I think Apple TV is very overlooked by developers too.
There's a lot of processing power in those things. The weak 2021 model has about 15% more GPU power than a docked Switch (about 50% more than an undocked Switch). The older 2017 model uses an A10X processor, which should give it even more GPU power (almost double an undocked Switch). The latest A15 model (with a major price drop vs. the previous generation) has more GPU power than the Xbox One S and isn't so far off from the PS4.
Nintendo Switch: 500 GFLOPS (docked), 390 GFLOPS (portable)
Apple TV (A10X, 2017): 770 GFLOPS
Apple TV (A12, 2021): 580 GFLOPS
Apple TV (A15, 2022): 1500 GFLOPS
PlayStation 4: 1850 GFLOPS
Xbox One S: 1400 GFLOPS
Apple TV shipments since 2017 seem to be in the 50-80M units range. Compared to 25M PS5, 17M Series X/S, 111M Switch, and 117M PS4, that's a pretty significant number.
I looked for a quantitative comparison. If the numbers in [1] are accurate Apple App store gaming revenue was around $11B in 2021. Meanwhile Microsoft was around $15B [2]. That's way closer than I thought, big enough that they could reasonably be concerned about avoiding any appearance of strength that could trigger anti-trust action. Interesting.
Don't use FLOPS as a comparison, please. They're meaningless numbers, especially across platforms, and especially across vendors for the devices pushing out said FLOPS. Same thing with texels per second or gigapixels per second or any comparison like that between 3D graphics accelerators. If those numbers meant anything, then my Intel A750 would be one of the fastest graphics cards in my house right now.
> I think Apple TV is very overlooked by developers too.
Because no one knows if the platform will be there or not. Apple's commitment to it has been lackluster. And since there are no dedicated controllers, you'll need controllers from a system... that you probably already own, so why play on Apple TV?
Apple is in an active lawsuit with one of the biggest gaming companies in the world. Fortnite hasn't been available on Apple devices in 2 years. If they really cared, they wouldn't have shut one of the most popular games of today out of their devices.
This comment doesn't make sense to me. Epic broke one of the major tenets of their contract with Apple. Apple faced losing a significant revenue stream if Epic's efforts were allowed to continue. In particular, since they intend to use the same model for future devices in other verticals, they would have also lost out on a simply colossal amount of future revenue.
The business case on this was a slam dunk. Other companies running virtual product stores with similar terms would have done the same irrespective of how popular Epic's products were.
False. iOS doesn’t support Vulkan, 40% of Android doesn’t support Vulkan, Windows does not officially support Vulkan, Nintendo Switch has a Vulkan implementation that is basically unusable, PlayStation doesn’t support Vulkan, Xbox doesn’t support Vulkan, shall I go on?
Contrary to widespread opinion, Vulkan is not an industry standard. It’s a 3rd-party DirectX alternative for Windows, the best API for Linux, and a curiosity on Android. And that’s literally it, nothing else supports it (except Switch, but it is so slow, almost no games use it, opting for the proprietary NVN).
Saying "Windows does not officially support Vulkan" is completely blatant cope by maclovers; Nvidia and AMD, the people that make the cards, support it, and Windows "supports" their cards, so let's stop being dishonest.
"Vulkan is not an industry standard" I mean yeah, in the same way Microsoft Word is not a standard.
"except Switch, but it is so slow" again, that seems to be a lie; DOOM Eternal works way, way better on it than similar software would on a different API.
> Saying "Windows does not officially support Vulkan" is completely blatant cope by maclovers; Nvidia and AMD, the people that make the cards, support it, and Windows "supports" their cards, so let's stop being dishonest.
It is completely honest. On a fresh install of Windows, if you don't have graphics drivers, you can't run Vulkan or OpenGL. Windows washes their hands of any responsibility. You can at least run DirectX with software rendering regardless of hardware support. It is also for this reason that the locked-down Xbox where Microsoft can assert more control has zero tolerance for OpenGL or Vulkan.
> "Vulkan is not an industry standard" I mean yeah, in the same way Microsoft Word is not a standard.
Microsoft Word, and the DOCX format by extension, has >90% market share. Vulkan has almost no presence on consoles, presence on less than half of smartphones in use, and mixed presence on Desktop because MacOS doesn't have it. Word is more of a standard than Vulkan.
> "except Switch, but it is so slow" again, that seems to be a lie; DOOM Eternal works way, way better on it than similar software would on a different API.
DOOM Eternal is one of the few games that uses Vulkan. >90% of Switch games do not use Vulkan, and found it preferable to use the proprietary API. That developers would overwhelmingly opt not to use Vulkan on Switch tells you all you need to know about the state of it. If adding another graphics API (such as Metal) was such a big deal, why in the world would they do it if Vulkan was cross-platform and worked fine? It doesn't work as well as it needs to - and adding another graphics API isn't as much of a blocker as we like to think.
>It is completely honest. On a fresh install of Windows, if you don't have graphics drivers, you can't run Vulkan or OpenGL. Windows washes their hands of any responsibility. You can at least run DirectX with software rendering regardless of hardware support.
DirectX with software rendering doesn't result in games actually being playable, unless they are 2D games that barely touch the GPU to begin with. So the software-rendering fallback is completely irrelevant here, and what matters is which APIs work when you do have the GPU drivers correctly installed. And at that point, it doesn't matter what degree of support Microsoft provides for Vulkan, only the degree to which the GPU vendor provides it. (And the software-rendering fallback actually makes it less straightforward to diagnose why a game isn't running as expected when GPU drivers aren't installed. Plus, what game developer cares about the software-rendering fallback enough to even test their game against it?)
So no, it's not completely honest. It's a disingenuous red herring.
For me one of the best things about PCs for gaming is access to lots of old games and long-tail stuff... in fact I rarely play new graphically-demanding titles. That said, OpenEmu on the Mac is the slickest emulation manager / front-end I've used on any platform, and DOSBox-X runs fine on my iMac. But I still keep a gaming PC around.