When the Touch Bar launched, I felt like I was touching the future. It was an OLED touch strip that replaced the row of function keys on the 2016 MacBook Pro. You could easily scrub through video and audio files in QuickTime, Safari offered tidy bookmark buttons, and Preview had a whole host of useful controls. The Touch Bar itself was a beautiful set of buttons that changed depending on context, and it felt like iOS’s contextual menus had been brought into the physical world.
The problem was that the Touch Bar never quite made it out of the proof-of-concept stage, despite being in most MacBook Pros made between late 2016 and 2019. And if it was going to replace the entire function row, with its easily accessible physical mute and volume keys, it had to be more than a proof of concept. It had to actually be valuable.
Adobe and a handful of other companies like Pixelmator made valiant efforts. But even the most polished attempts were rarely as customizable as you’d like, and the vast majority were rudimentary at best. One of my favorite examples was how Google Chrome handled the Touch Bar versus Safari. Safari had nice bookmark buttons. Chrome just replicated its normal toolbar on the Touch Bar. And while someone somewhere probably found it useful to have a search bar and a back button there, most people didn’t.
Yet Apple itself was by far the worst offender. The Touch Bar in Apple’s apps was often a marvel, but only in some of them. I could scrub through an entire audio file in QuickTime, but I couldn’t do the same in the native Voice Memos app, where I had actually created that audio file. And if I wanted to remove the Siri button, which my dumb fingers had a habit of hitting every hour, I had to hunt through System Preferences in macOS.
Once I got there, I found that removing the Siri button was one of the few things you could actually customize on the Touch Bar. If I wanted an easily accessible row of my favorite emoji, an app selector, or even a widget that consistently showed me my upcoming meetings, I’d struggle to do it natively. Instead, I dropped money on BetterTouchTool. And look, BetterTouchTool is a great application that gives you control not only of the Touch Bar but of all your macOS inputs. Still, I really don’t think I should have had to shell out $10 for a standard license when many of BetterTouchTool’s features should have been built into macOS by default.
That wasn’t the case, because the population of Touch Bar users was a lot smaller than the population of macOS users. The Touch Bar only came on MacBook Pros, and only on the most expensive ones. If you wanted the cheapest MacBook Pro, you had to make do without it, which meant most people went without and most developers ignored the Touch Bar.
I like to think that we, as a people, knew the Touch Bar was dead when it never made its way to another Mac. There was no full-size keyboard with an OLED strip of contextual buttons. No optional Touch Bar for the MacBook Air. Even to an avowed Touch Bar fan like me, it was clear almost from the start that Apple was never interested in taking the feature or its potential seriously, or in encouraging other developers to take it seriously either. Instead, it was a selling point: it rolled out in 2016, when Mac fans were in their darkest days, begging for a Mac laptop that looked as nice as a Dell XPS 13 and ideally had a processor that wasn’t at least three years old.
The Touch Bar was essentially Apple replanting its design flag and declaring that it cared about laptops. And now, with the M3 and a line of fast computers with some of the best advertised battery life around, it’s clear that Apple cares about the laptop again. It doesn’t need the flashy Touch Bar to convince people it cares about the future of computing.