Graphical fidelity has not materially improved since the days of Crysis 1, 16 years ago. The only two meaningful changes to how demanding games should be to run in that time are that 1440p & 2160p displays have become more common, and raytracing. But consoles being content to run at dynamic resolutions and 30fps, combined with the tools developed to make raytracing palatable (DLSS), have made developers complacent about letting their games run like absolute garbage even on mid-spec hardware that should have no trouble hitting 1080p/60fps.
Destiny 2 was famously well optimized at launch. I was getting an easy 1440p/120fps maxed out in pretty much every scenario on a 1080 Ti. Every new zone seems to perform worse than the last, even though I've since upgraded to a 3090.
I am loving BG3, but the entire city in Act 3 can barely hold 40fps on a 3090, and it is not an especially gorgeous-looking game. The only thing I can say in its favor is that, maxed out, the character and armor models do look quite nice. But a lot of the environment art is extremely low-poly. I should not have to turn on DLSS to get playable framerates in a game like this on a Titan-class card.
Nvidia and AMD just keep cranking the power on the cards; they're now 3+ slot behemoths to deal with all the heat, which also means cranking the price. They also seem to think 30fps is acceptable, which it just… is not. Especially not in first-person games.
Also, if a quick Google result is anything to go on, Apple sells hundreds of millions of iPhones a year. 3% of that is still a fuckload of people, and IMO it proves there is a market for it. Just maybe not a market that needs yearly attention. You also have to remember that share is split across tons of SKUs, so you would expect all of them to hover in the single digits to low teens.
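For a rough sense of scale (assuming something like 200 million iPhones sold a year, which is in the ballpark of recent reported figures): 3% of 200 million is about 6 million Minis a year, which would be a perfectly healthy business for most phone makers on its own.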
I got my wife a 12 Mini - she loves it. The battery life is absolutely the worst thing about it, but it sounds like the 13 Mini was a huge upgrade in that regard, and I had hoped it would continue to get better with future versions.
Something else that may not be taken into account: the kind of person buying a Mini is, I would wager, on a longer upgrade cycle than the kind who buys a base iPhone or a Pro model. Mini buyers are probably closer to the buyers who have historically gone for the SE - they only upgrade every 3 or 4 years rather than the more stereotypical 2. Pro numbers are also skewed by the hyper-fans who upgrade yearly and therefore show up in the stats a lot more, even though both groups are equally firm Apple customers.
There is also this interesting note at the end of the article:
I think the Mini should become the new SE. Keep it on a 2+ year old chip, keep it at 60Hz; at least the form factor and design language would match the rest of the lineup, unlike now, where the SE has a design from 2016. That would be perfect for people like my wife, who want the smallest, cheapest phone that's technically an iPhone and are only going to upgrade every few years.