What's crazy is that the M4 Pro is available in the Mac mini: something so tiny can handle that chip. The Mac Studio with the M4 Max will be awesome, but the Mini is remarkable.
Back in the late 1980s, when the Mac was still a young product line, Apple came out with a big honking powerhouse: the Macintosh IIx. It blew everything out of the water. Then they released their next budget all-in-one machine. But computing was improving so fast, with component prices dropping so quickly, that the Macintosh SE/30 ended up as impressive as the Macintosh IIx at a much lower price. That's how the legend of the SE/30 was born, turning it into the best Mac ever for most people.
With how fast and impressive the improvements to the M-series processors are coming, it often feels like we're back in that era. I thought the M1 MacBook Air would be the epitome of Apple's processor renaissance, but it sure feels like that was only the beginning. When we look back at these machines in 20 years, we'll think of a specific machine as the best early Apple Silicon Mac. I don't think that machine is even out yet.
In the 90s, you probably wouldn't want to be using a desktop from 4 years ago, but the M1 is already 4 years old and will probably be fine for most people for years yet.
No kidding. The M1 MacBook Pro I got from work is the first time I've ever subjectively considered a computer to be just as fast as it was the day I got it.
I think by the time my work-provided M1 MacBook Pro arrived, the M2s were already out, but of course I simply didn't care. I actually wonder when it will be worth the hassle of transferring all my stuff over to a new machine. Could easily be another 4 years.
Maybe the desktops, but the laptops were always nigh-unusable for my workloads (nothing special, just iOS dev in Xcode). The fans would spin up to jet takeoff status, it would thermal throttle, and performance would nosedive.
There was a really annoying issue with a lot of the Intel MacBooks where, due to the board design, charging through the ports on one side would cause them to run quite a bit hotter.
Yeah I remember that, I posted a YouTube video complaining about it 6 years ago, before I could find any other references to the issue online. https://www.youtube.com/watch?v=Rox2IfViJLg
That would cause it to throttle even when idle! But even on battery or using the right-hand ports, under continuous load (edit-build-test cycles) it would quickly throttle.
Or your lap gets hot. Or the fans drive you mad. Good luck with the available ports. Oh, it’s slow AF too, but if you get the right model you can use that stupid Touch Bar.
Apple's marketing compares this season's M4s to the M1 and even to Intel chips from two generations ago. The 2x or 4x numbers suggest they are targeting and catering to this longer cycle, where the implied upgrade is remarkably better, rather than suggesting an annual treadmill, even though each release is "our best ever".
I mean, most people don't buy a new phone every year, let alone something as expensive as a laptop. They are probably still targeting Intel Mac or M1 users for the most part.
So long as Apple is willing to keep operating system updates available for the platform. This is by far the most frustrating thing. Apple hardware, amazing and can last for years and even decades. Supported operating system updates, only a couple of years.
I'm typing this from my mid-2012 Retina MacBook Pro. I'm on Mojave and well out of support for operating system patches, but the hardware keeps running like a champ.
> Apple hardware, amazing and can last for years and even decades. Supported operating system updates, only a couple of years.
That’s not accurate.
Just yesterday, my 2017 Retina 4k iMac got a security update to macOS Ventura 13.7.1 and Safari even though it’s listed as “vintage.”
Now that Apple makes their own processors and GPUs, there’s really no reason in the foreseeable future that Apple would need to stop supporting any Mac with an M-series chip.
The first M1 Macs shipped in November 2020, four years ago, yet they can run the latest macOS Sequoia with Apple Intelligence.
Unless Apple makes some major changes to the Mac’s architecture, I don’t expect Apple to stop supporting any M series Mac anytime soon.
To be fair, MOST computers are like that nowadays, regardless of brand. I'm using an Intel desktop that is ~8 years old and runs fine with an upgraded GPU.
Sure, Apple isn't the only one making good laptops, though they do make some of the best. My point was just that we definitely aren't back at 90s-level progress. Frequency has barely scaled since node shrinks stopped helping power density much, and the node shrinks are fewer and farther between.
I bought an M1 MacBook Pro just to use it for net and watching movies when in bed or traveling. I got the Mac because of its 20 hours battery life.
Since Snapdragon X laptops caught up to Apple on battery life, I might as well buy one of those when I need to replace this one. I don't need the fastest mobile CPU for watching movies and browsing the internet, but I do like having a decent amount of memory to keep a hundred tabs open.
Agreed. It might share the title with the M1 Air, which was incredible for an ultraportable, but the M1 MBP was just incredible, period. Three generations later it's still more machine than most people need. The M2/3/4 sped things up, but the M1 set the bar.
It's not a server, so it's not a crime for it not to be fully utilized all the time, and it's not upgradable, so it needs to be right the first time. I should have gotten 32GB just to be sure.
Apple's sky-high RAM prices and strong resale values make this a tough call, though. It might just about be better to buy only the RAM you need and upgrade earlier, considering you can often get 50% or more of the price of a new one back by selling your old one.
Thankfully, Apple recently made 16GB the base RAM in all Macs (including the M2/M3 MacBook Airs) anyway. 8GB was becoming a bad joke, and upgrading it could add 40% to the price of some models!
Yep, that's definitely a thing I'm proud of correctly foreseeing. I was upgrading from an old machine with 8GB, but I figured that especially with memory being non-upgradable it was better to be safe than sorry, and if I kept the machine a decade it would come out to sandwich money in the end.
I was the same with the M1 Air until a couple of months ago, when I decided I wanted more screen real estate. That, plus the 120Hz miniLED display, better battery, and better sound, makes the 16" a great upgrade as long as the size and weight aren't an issue. I just use it at home so it's fine, but the Air really is remarkable for portability.
I have the M1 Air, too. I just plug in to a nice big Thunderbolt display when I need more screen!
I'll likely upgrade to the M4 Air when it comes out. The M4 MacBook Pro is tempting, but I value portability and they're just so chunky and heavy compared to the Air.
I owned an SE/30. I watched my first computer video on that thing, marveling that it was able to rasterize (not the right word) the color video real-time. I wish I had hung onto that computer.
>the Macintosh IIx. It blew everything out of the water.
Naa... the Amiga had the A2500 around the same time, and the Mac IIx wasn't better spec-wise in most regards. At about $4,500 more expensive (the Amiga 2500 was around $3,300, the Mac IIx $7,769), it was vastly overpriced, as is typical for Apple products.
Worth remembering that Commodore went out of business just a few years later, while Apple today is the largest company in the world by market capitalisation. It doesn't matter how good the product is: if you're not selling it for a profit, you don't have a sustainable business. Apple products aren't overpriced as long as consumers are still buying them and coming back for more.
The $150 million investment in Apple ended up not being needed. Jobs put the hatchet to enough projects to reverse the trend himself. That investment from Microsoft was valuable because they promised to keep releasing Office for Mac.
Nonetheless, Apple was almost out of business at one point. Microsoft invested in Apple instead of Commodore. If the opposite happened, we may be having a discussion about Commodore now, and not Apple.
That doesn't even make sense. Microsoft hadn't released any software for the Amiga AFAIK, while the Mac market for Word/Excel/Powerpoint was still a decent chunk of revenue for Microsoft at the time (obviously still much less than the Windows/PC market).
I've got one and it's really not that impressive. I use it as a "desktop" though, and not as a laptop (as in: it's on my desk hooked to a monitor, never on my lap).
I'm probably gonna replace it with a Mini with that M4 chip anyway but...
My AMD 7700X running Linux is simply a much better machine/OS than that MacBook M1 Air. I don't know if it's the RAM on the 7700X or the WD-SN850X SSD or Linux but everything is simply quicker, snappier, faster on the 7700X than on the M1.
I hope the M4 Mini doesn't disappoint me as much as the M1 Air.
There are tons of affordable and upgradable NUC form factor machines for running Linux, so why a Mac Mini (if you're not running LLMs locally, then the good support and fast integrated RAM might be a reason)?
Yes, but I suspect the 64GB of memory in the Studio, compared to 24GB in the mini, is going to make the Studio a lot faster in many real-world scenarios.
It would be $2,199 for the highest-end CPU and the 64GB of memory, but I think your point remains: the Studio is not a great buy until it receives the M4 upgrades.
But it was a great buy for the customers who needed it when it was released. I presided over IT at an architecture firm that bought a bunch of Studios when they were new. Just because it's no longer a good buy two and a half years later, when compared to the thing that ships next week, doesn't mean it wasn't a great machine.
Personally, the Mac Mini will be my reentry into desktop computers after more than 1.5 decades [1]. Your comment got me thinking: could this be another perfectly calculated move by Apple? After all, I’ve only bought mobile devices from them until now. I’m eager to see Apple’s financial results for Q4 2024 and Q1 2025 to understand how this strategy plays out.
I’m already planning on swapping mine for an M4 Ultra.
I love my M1 Studio. It's the Mac I always wanted: a desktop Mac with no integrated peripherals and a ton of ports, although I still use a high-end hub to plug in a lot more. Two big external SSDs, my input peripherals (I'm a wired mouse and keyboard kind of guy), then a bunch of audio and USB MIDI devices.
It’s even a surprisingly capable gaming machine for what it is. Crossover is pretty darn good these days, and there are ARM native Factorio and World of Warcraft ports that run super well.
I haven’t dug too much into gaming since I have a Linux PC that supports all my steam games but what’s the experience of running Crossover on Apple Silicon like? Can you run x86 Windows games using Rosetta (or is it some other method)?
I mean, you actually probably COULD play Dark Souls on it, but you'd be turning settings down a bit I bet (I don't actually know how optimized that game is). I'm pushing a 1440p display which certainly doesn't make things easier on the Mac.
The biggest annoyance I've hit, actually, is that game controller support is pretty bad. Don't expect generic USB HID game controllers to work; support for that isn't baked into macOS the way it is in Windows (via DirectInput, etc.).
The happy path is pretty much specifically a Bluetooth Xbox controller.
I like it in every way except price. It just works, comes back online after a power outage, etc. I don't recall any unscheduled disconnects.
--
Additional thoughts:
I think there were complaints about the fan being loud, so I swapped it out when I first got it. I also have it in my basement, so I don't hear anything anyway. HDDs are loud, especially the Gold ones.
I am still astounded the huge change moving from an Intel Mac to an Apple Silicon Mac (M1) has had in terms of battery performance and heat. I don't think i've heard the fans a single time I've had this machine and it's been several years.
I never thought I'd see a processor that was 50% faster single-core and 80% faster multi-core and just shrug. My M1 Pro still feels so magically fast.
I'm really happy that Apple keeps pushing things and I'll be grateful when I do decide to upgrade, but my M1 Pro has just been such a magical machine. Every other laptop I've ever bought (Mac or PC) has run its fan regularly. I did finally get fan noise on my M1 Pro when pegging the CPU at 800% for a while (doing batch conversion of tons of PDFs to images) - and to be fair, it was sitting on a blanket which was insulating it. Still, it didn't get hot, unlike every other laptop I've ever owned did even under normal usage.
It's just been such a joyful machine.
I do look forward to an OLED MacBook Pro, and I have no doubt future Apple Silicon processors will be great.
My best Apple purchases in 20 years of being their customer: the 16-inch M1 MacBook Pro and the Pro Display XDR. When Steve Jobs died I really thought Apple was done, but their most flawless products (imho) came much later.
Yeah, don't forget the six dark years from the butterfly keyboard MacBook Pro of 2016, through the emoji Touch Bar, until the restoration of the USB ports around 2022.
That's the common hipster take on it... but I kinda liked the way the butterfly keys felt, and my impression of the Touch Bar was chaotic neutral. By the time they let up a little on the ports, I'd lived with 4 USB-C ports so long that it really wasn't that big a deal.
What got me, however, was that this was the time when their trade-in program was really kicking in. I think I got $800 for my Touch Bar Mac, which made the jump to an M1 Pro 14 a little less painful. Now you don't seem to so much pay for hardware as lease the Apple experience, so long as the hardware is still good.
I had the 2015 MBP and I held onto it until the M1 came out…I still have it and tbh it’s still kind of a great laptop. The two best laptops of the past decade for sure.
With 6K I can have both VS Code and the website I'm working on in one display, while still having lots of vertical space and no noticeable pixels. I tried using two 4K screens next to each other, among other things, but nothing works as well ergonomically as this.
Yeah, I have an M1 Max 64GB and don't feel any need to upgrade. I think I'll hit the need for more ram before a processor increase with my current workload.
I've got a coworker who still has an Intel MacBook Pro, 8-core i9 and all that, and I've been on M chips since they launched. The other day he was building some Docker images while we were screensharing and I was dumbfounded at how long it took. I don't think I can even remember a recent time when building images, even ones pushed to CDK etc., takes more than a minute or so. We waited and waited and finally after 8 minutes it was done.
He told me his fans were going crazy and his entire desk was hot after that. Apple silicon is just a game changer.
For sure. I had one of those for work when I got my personal M1 Air and I couldn't believe how much faster it was. A fanless ultraportable faster than an 8-core i9!
I was so happy when I finally got an M1 MBP for work because as you say Docker is so much faster on it. I feel like I don't wait for anything anymore. Can't even imagine these new chips.
Sounds like they were building an aarch64 image; building an x86_64 image on Apple Silicon will also be slow, unless you are saying the M* builds x86_64 faster than an i9?
Emulating the target architecture's SDK can be much easier and simpler to set up, and it avoids mistakes. You do not need to make any changes or configuration to make it compile.
For example, that is generally the way you cross-compile Flatpaks from a CLI or IDE. In GNOME Builder, for instance, you can just select the device you want to build for, like your smartphone, and it uses QEMU to emulate the entire SDK on the target architecture; you can also seamlessly run the Flatpak on your host through QEMU user-mode emulation.
It doesn't need to (except when it does, in some cases), but the entire build process needs to be aware of what needs to be native and what doesn't. Most of the issues would come from the linker. Yes, it's technically not part of compilation, but aside from helloworld.c, everything you build will probably require the linker at one point or another.
Then there are some gcc quirks:
- for gcc, the compilation target is defined when gcc itself is compiled, so the only way to cross-compile with gcc that I know of is emulating the target arch and running a gcc built for that arch
However, we're talking about docker containers here, emulation would be the default way and path of least resistance.
Again, I will reiterate: every cross-compilation strategy falls into one of these two buckets. In some cases what I've described in #1 is possible (WASM, Java bytecode, or really (almost) anything that targets a VM); in some cases it isn't, and then you have to go with #2 (Docker, gcc).
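For the Docker bucket, the native-vs-emulated choice comes down to a single flag; a minimal sketch, assuming Docker Desktop with buildx on an Apple Silicon host (image and tag names are made up for illustration):

```shell
# Native arm64 build on an M-series Mac: the fast path.
docker buildx build --platform linux/arm64 -t myapp:arm64 .

# x86_64 build on the same machine: the build steps run under QEMU
# (or Rosetta 2, if enabled), so compile-heavy images get much slower.
docker buildx build --platform linux/amd64 -t myapp:amd64 .
```

Running a pulled amd64 image works the same way: `docker run --platform linux/amd64 …` transparently uses emulation, which is the path-of-least-resistance behaviour described above.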
That was what finally got me to spend the cash and go with Apple Silicon - we switched to a Docker workflow and it was just doooooog slow on the Intel Macs.
But this M1 Max MBP is just insane. I'm nearly 50 and it's the best machine I've ever owned; nothing is even close.
> I am still astounded the huge change moving from an Intel Mac to an Apple Silicon Mac (M1) has had in terms of battery performance and heat
The battery life improvements are great. Apple really did a terrible job with the thermal management on their last few Intel laptops. My M1 Max can consume (and therefore dissipate) more power than my Intel MBP did, but the M1 thermal solution handles it quietly.
The thermal solution on those Intel MacBooks was really bad.
Those MacBooks were designed when Intel was promising new, more efficient chips and they didn’t materialize. Apple was forced to use the older and hotter chips. It was not a good combination.
Another factor might be that Intel MacBook Pros got thinner and thinner. The M1 MBP was quite a bit thicker than its Intel predecessors, and I think the form factor has remained the same since then.
The 2012-2015 MBPs were slimmer than the previous unibodies, but a lot of that was due to dropping the optical drive and spinning hard drives. Thermals were not a particular problem there. Where they did become a problem was with the 2016 redesign and its much thinner case. That design would have been locked in 2-3 years earlier. From what I have read, when the case designs were being worked on, Intel was promising Apple that their next generation of chips would be on a smaller process node and would run cooler, so that defined the thermal envelope of the new MBPs. Unfortunately, as we all saw, Intel's production stalled for 5-6 years and the only chips they could produce were power-hungry and hot. That caused problems for those thinner MBPs.
Apple seems to have taken that to heart when they designed the cases for the Apple Silicon MBPs and those have excellent cooling (and more ports).
> My M1 Max can consume (and therefore dissipate) more power than my Intel MBP did, but the M1 thermal solution handles it quietly.
You have to really, REALLY put in effort to make it operate at rated power. My M2 MBA idles at around 5 watts, my work 2019 16-inch i9 is around 30 watts in idle.
On extremely heavy workloads the fans do engage on my M1 Max, but I need to get my ear close to the machine to hear them.
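If you want actual numbers instead of listening for fans, macOS ships a built-in sampler for this; a quick sketch (needs sudo, and the exact fields reported vary by chip generation):

```shell
# One 1-second sample of CPU/GPU package power on Apple Silicon.
sudo powermetrics --samplers cpu_power -n 1 -i 1000

# Add the thermal sampler to watch thermal pressure under load.
sudo powermetrics --samplers cpu_power,thermal -n 1 -i 1000
```

Running the first command at idle versus under an all-core build makes the 5 W vs 30 W kind of gap mentioned above directly visible.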
Recently a friend of mine bought a laptop with an Intel Ultra 9 185H. Its fans roared even when opening Word. That was extraordinary, and if it had been my purchase I would have sent it back straight away.
My friend fiddled a lot with settings and had to update the BIOS, and eventually the fan situation was somewhat contained, but man, I am never going to buy an Intel/AMD laptop. You don't know how annoying fan noise is until you get a laptop that is fast and doesn't make any noise. With Intel it's like having a drill pointed at your head that can go off at any moment, and let's not mention phantom fan noise, where the sound gets so imprinted in your head that your brain makes you think the fans are on when they are not.
Apple has achieved something extraordinary. I don't like macOS, but I am getting used to it. I hope one day the Asahi effort will let us replace it.
When I play Baldur's Gate 3 on my M2 Max, the fans get loud. You need a workload that is both CPU-heavy and GPU-heavy for that. When you are stressing only the CPU or the GPU but not both, the fans stay quiet.
I have an i7 12th gen thinkpad and the fans would often be audible when I first got it with Windows 11. Then I installed linux. Now the fan is only audible when something has gone wrong and a process (usually chrome) is pinning a core.
If it's a M1 Macbook Air there's a very good reason you've never heard a fan!
Blows my mind how it doesn't even have a fan and still rarely gets above body temperature. My 2015 MBP was still going strong for work when I bailed on it late last year, but the difference purely in heat and sound emitted has been colossal.
Factorio: Space Age is the first piece of software that my M1 shows performance issues with. I'm not building Xcode projects or anything, but it is a great Mac. Maybe even the greatest.
There's a known issue on ARM Macs with external monitors that messes with frame rates. Hopefully it gets fixed soon, because pre-Space Age Factorio was nearly flawless in performance on my M2.
There's a few threads on the technical help forums that are tracking this -- I've subscribed to them because I'm super excited about them getting around to fixing it! :)
It's not just that; at times I pushed all CPU cores to 100% on the M1 Mini and even after 30+ minutes I couldn't hear the fan. Too bad the MacBook Airs got nothing but a literal aluminium sheet as a cooling solution.
Lol, you were not around for the PPC-to-Intel change. The same thing happened then: a remarkable performance uplift over the last instruction set, and we had Rosetta for compatibility. The M1 and ARM took power efficiency to another level. But yeah, what has happened before will happen again.
The thing then was that it was just Apple catching up with Windows computers, which had had a considerable performance lead for a while. It didn't really seem magical to see that finally matched. (Yes, Intel Macs got better than Windows computers, but that was later. At launch it was just matching.)
It's very different this time because you can't match the performance/battery tradeoff in any way.
Intel chips had better integer performance and PowerPC chips had better floating point performance, which is why Apple always used Photoshop performance tests to compare the two platforms.
Apple adopted Intel chips only after Intel replaced the Pentium 4 with the much cooler running Core Solo and Core Duo chips, which were more suitable for laptops.
Apple dropped Intel for ARM for the exact same reason. The Intel chips ran too hot for laptops, and the promised improvements never shipped.
The G5 in desktops was more competitive but laptops were stuck on G4s that were pretty easy to beat by lots of things in the Windows world by the time of the Intel switch. And Photoshop was largely about vectorized instructions, as I recall, not just general purpose floating point.
Yes, and when it became clear that laptop sales would one day outpace desktop sales, Apple made the Intel switch, despite it meaning they had to downgrade from 64 bit CPUs to 32 bit CPUs until Core2 launched.
The Apple ecosystem was most popular in the publishing industry at the time, and most publishing software used floating point math on tower computers with huge cooling systems.
Since IBM originally designed the POWER architecture for scientific computing, it makes sense that floating point performance would be what they optimized for.
I do wonder if PC desktops will eventually move to a similar design. I have a 7800x3d on my desktop, and the thing is a beast but between it and the 3090 I basically have a space heater in my room
I sincerely believe that the market for desktop PCs has been completely co-opted by gaming machines. Those buyers do not care one whit about machine size or energy efficiency, with only one concern in mind: raw performance. This means they buy ginormous machines and incredibly inefficient CPUs and GPUs, with cavernous internals to chuck heat out, with no care for decibels.
But they spend voraciously. And so the desktop PC market is theirs and theirs alone.
Desktop PCs have become the Big Block V8 Muscle Cars of the computing world. Inefficient dinosaur technology that you pour gasoline through and the output is heat and massive raw power.
Desktops are actually pickup trucks. Very powerful, capable of everyday tasks but less efficient at them. Unbeatable at their specialty, though.
Yeah. It's been the case for a while now that if someone just wants a general computer, they buy a laptop (even commonly a mac).
That's why the default advice if you're looking for 'value' is to buy a gaming console to complement your laptop. Both will excel at their separate roles for a decade without requiring much in the way of upgrades.
The desktop pc market these days is a luxury 'prosumer' market that doesn't really care about value as much. It feels like we're going back to the late 90's, early 2000's.
The price of a high-end gaming PC (7800X3D and 4080) is around $2k USD. That's comparable to the MacBook Pro.
Yeah sure, if you start buying unnecessary luxury cases, fans, and custom water loops it can jump up high, but that's more for clueless rich kids or enthusiasts. So I wouldn't call PC gaming an expensive hobby today, especially considering Nvidia's money-grubbing practices won't last forever.
A game I play with friends introduced a Mac version. I thought it would be great to use my Apple Silicon MacBook Pro for some quiet, low-power gaming.
The frame rate wasn’t even close to my desktop (which is less powerful than yours). I switched back to the PC.
Last time I looked, the energy efficiency of Nvidia GPUs in the lower TDP regions wasn't actually that different from Apple's hardware. The main difference is that Apple's hardware isn't scaled up to the level of the big Nvidia GPUs.
It would make sense, but it depends heavily on Windows/Linux support, compatibility with Nvidia/AMD graphics cards, and exclusivity contracts with Intel/AMD. Apple is not likely to make their chips available to OEMs at any rate, and I haven't heard of any third party working on a powerful desktop ARM-based CPU in recent years.
I just bought a Beelink SER9 mini pc, about the same size as the Mac Mini. It's got the ridiculously named AMD Ryzen AI 9 HX 370 processor, a laptop CPU that is decently fast for an X64 chip (2634/12927 Geekbench 6 scores) but isn't really competition for the M4. The GPU isn't up to desktop performance levels either but it does have a USB4 port capable of running eGPUs.
It would be nice. Similarly have a 5950X/3080Ti tower and it’s a great machine, but if it were an option for it to be as small and low-noise as the new Mini (or even the previous mini or Studio), I’d happily take it.
For what it's worth, I'm running that with open-loop water cooling. If your chassis has the space for it, my rig won't even need to turn on its fans for large parts of the day. (The loop was sized for a Threadripper, which wasn't really around for home builders yet.) Size is an issue, however :)
ARM processors have always been good at handling heat and low power (like AWS Graviton), but what laptop did you have before that would overheat that much during normal usage? That seems like a very poor design.
My experience is the same. I only owned one Intel MacBook Pro.
It was the only one I needed, thankfully. Missed the whole port reduction and Touch Bar mess.
I love my M1 Air. It is the first general purpose computing hardware that felt like a real advance. I measured that two ways:
How much closer to my mobile is it?
How much faster is it?
The Air feels like a Mobile Computer if that makes any sense. One USB port expander to serve as a dock of sorts later and it makes for a great desktop experience.
When using it on the go, it has that light, powerful feel much like running my phone does.
Great machine. It is easily my favorite computer Apple has ever made, 8 bit greatness and an older age aside.
Mine is sticky. As in, when others get hold of it, the next thing I hear is usually "oooh", and then it takes some time for it to come back!
I got mine for a song. Sweet deal, but it is the 8GB 256GB configuration. Not too big of a deal, but more internal storage would be nice. Maybe I will send it out somewhere to get a boost.
Would have already, but I worry a little about those services.
I had a 2019 MBP which, with default fan settings, got really hot from an hour-long video call in Bluejeans, or from five minutes navigating Google Maps and Street View in Chrome.
The only solution was to set the fan speed profile to max RPM.
On my 2019, if a single process hits 100% of one core, the fan becomes quite noticeable (not hairdryer-level though) and the top of the keyboard area, where the CPU is, gets rather toasty.
Anything that pegged the CPU for extended periods of time caused many Apple laptop models to overheat. There is a design tradeoff between power specs, cooling, "typical workloads", and other things. A common and not-new example of a heat-death workload was video editing.
Not for everyone. It turns out that by following standard ergonomic guidelines I was doing more damage. I have to actually look way down at monitors, even on my desk. They have to be well below eye height, basically slammed all the way down.
Sometimes even not opening any apps is not enough if Spotlight decides that now is the time to update its index or something similar. Honestly nuts looking back at it.
I remember when macOS switched to an evented way of handling input and for some reason decided that dropping keyboard events was okay... anyway, if Spotlight was updating its index, unlocking your laptop with a password was impossible.
Last year I bought a used M1 Pro, but the last MBP I had was an early 2015. I just didn't bother upgrading; in fact the later Intel models were markedly worse (keyboard, battery life, quality control). The Apple Silicon era is going to be the PowerPC prime all over again.
> I don't think i've heard the fans a single time I've had this machine and it's been several years.
Yes I agree. I sometimes compile LLVM just to check whether it all still works. (And of course to have the latest LLVM from main ready in case I need it. Obviously.)
The 16” M1 is still a perfectly good machine for my work (web dev). Got a battery replacement, which also replaced the top cover and keyboard; it's basically new. Extended AppleCare for another year, which will turn it into a fully covered 4-year device.
Is it crazy? The chip itself is small. I'm not up on the subject but is it unusual? Are we talking power draw and cooling adding greatly to the size? I guess the M4 Pro must have great specs when it comes to running cool.
Does anyone know if this Mac Mini can survive longer than a year? Apple's approach to hardware design hasn't always prioritized thermals.
I've had an M1 Mac Mini inside a hot dresser drawer with a TV on top since 2020.
It doesn't do much other than act as a media server. But it's jammed pretty tight in there with an eero wifi router, an OTA ATSC DVR, a box that records HDMI, a 4K AppleTV, a couple of external drives, and a full power strip. That's why it's hot.
So far, no problems. Except for once when I moved, it's been completely hands-off. Software updates are done over VNC.
Yes, but you only really encounter that when pushing the CPU to 100% for more than a few minutes. The cooling is objectively terrible, but still easily enough for most users, that's the crazy thing.
Maybe? As local LLM/SD etc. get more common, pushing the CPU that hard might become routine. I've been getting my fans to come on, and the machine burning hot, quite often lately because of new tech. I get that I'm a geek, but with Apple, Google and everyone else trying to run local ML it's only a matter of time.
After posting this I thought of a few possible use cases. They might never come to pass, but... Some tech similar to DLSS might come along that lets streaming services like YouTube and Netflix send 1/10th the data and get twice as good an image, but requires extreme processing on the client. It would certainly be in their interest (less storage, less bandwidth, decompression-upscaling costs pushed to the client). Whether that will ever happen I have no idea. I was just trying to think of an example of something that might need lots of compute power at home for the masses.
Another could be realtime video modification. People like to stream and facetime. They might like it even more if they could change their appearance more than they already can using realtime ML based image processing. We already have some of that in the various video conferencing / facetime apps but it's possible it could jump up in usage and needed compute power with the right application.
Apple's chips already have AI accelerators for things like content-based image search. They would never retroactively worsen battery life and performance just for a few more AI features when they could instead use it as selling point for the next hardware generation.
And if you regularly use local generative AI models the Pro model is the more reasonable choice. At that point you can forget battery life either way.
No, I'm saying if you have 640k you can't just download more RAM, and Apple wouldn't ever try to because that's a free new feature they can market in the next model.
Every Apple Silicon MacBook Air throttles after 5-10 minutes of sustained load because it has passive cooling - but the load needed to trigger it, and the speed once throttled, aren't something casual users will notice.
You only notice throttling on the MacBook Air when doing things like video renders that use max power for an extended period of time.
Hopefully not? I honestly don't know. It's been around three years (whichever year it was they replaced Target Disk Mode) since I followed Apple news very closely.
It might be different post-Intel? I'm too lazy to dig up sources for Apple's past lost class action lawsuits, etc.
That Rossmann guy, the internet-famous repairman, built his YouTube channel on videos about Apple's inadequate thermal management. They're probably still archived on his channel.
Hell, I haven't owned a Mac since the year 2000 that didn't regularly hit temperatures above 90 Celsius.
Why would you, or anyone, ever compare a line of Intel machines with a line of machines that have a vastly different architecture and power usage? It'd be like comparing Lamborghini's tractors and cars and asking if the tractors will scrape on steep driveways because you know the cars do.
On the other hand, it is comparing Apples to Apples.
The Gods didn't deliver specs to Apple for Intel machines locking the company to placement/grades/design/brands/sizes of chassis, fans, logic board, paste etc. Apple, in the Intel years, just prioritized small form factor, at the expense of longevity.
And Apple's priorities are likely still the same.
My concern is that, given cooler-running chips, Apple will decrease form factor until even the cooler-running chips overheat. The question, in my mind, is only whether the team at Apple who design chips can improve them to a point where the chips run so coolly that the rest of Apple can't screw it up (ie: with inadequate thermal design).
If that has happened, then... fantastic, that's good for consumers.
Jony Ive left and Apple decided thinness wasn’t the only value.
100%. Apple Silicon is that for computers. Very rarely do my fans whizz up. It’s noticeable when you’re working with someone on an x64 machine, because you can hear their computer’s fans running.
The work Apple has done to create a computer with good thermals is outrageous - minimising the distances over which charges have to be induced.
I run Linux on my box. It’s great for what it does but these laptops are just the slickest computers I have ever used.
Never gets hot. Fans only come on during heavy compilation tasks or graphic intensive workloads.
That is encouraging to read, and hopefully it truly is the case that Apple has weaned itself off its obsession with thinness.
Some of the choices Apple made after SJ's death left such an unpleasant taste in my mouth that I now have knee-jerk reactions to certain Apple announcements. One of those is that I experience nausea when Apple shrinks the form factor of a product. Hopefully that has clouded my judgement here, and in fact these Mac Minis have sufficient airflow to survive several years.
110 Celsius heat... not good for lead-free solder... not good for computer.
This whole thread is starting to feel surreal to me. Pretty soon everyone will have me believing I dreamt up Apple's reputation for bad thermal management.
Well, when you don’t appear to know or care about the actual issues stemming from poor thermals (Intel relying too much on turbo clocks, toasty crotches, low battery life, noisy fans) and instead complain about made-up issues, yeah.
My frustration was with the totality of comments in the thread, not yours exclusively. I'd have no problem with any one reply in this thread, on its own. Apologies if I came across as rude.
There's nothing in a comment thread so cringeworthy and boring as a person trumpeting their own expertise, so I'll refrain, and leave off here.