Title embellishment is my own.
This is especially timely with Apple supposedly polling MacBook Pro owners about whether they use their headphone jacks.
Apple seems increasingly out-of-touch with the professionals who helped make them the brand they are. I guess that doesn’t matter if you can hook the masses of casual users.
Apple was always about incompatible standards and limited extensibility:
It wouldn’t be hard to find a handful of other examples. (I am still upset about removing IR remote support ;).) Apple’s history is full of incompatibilities and oddball interfaces. The professionals made them the brand they are despite this (me included).
The 2011 MacBooks only had Thunderbolt as a faster connector, even though it was already clear that USB 3.0 would become the standard. 2011 buyers were stuck choosing between slow USB 2.0 devices and extremely expensive Thunderbolt devices.
It’s worth noting Thunderbolt IS a standard: it can carry DisplayPort signals and, most interestingly, PCI Express. Unfortunately, only Apple adopted it widely. Now USB Type-C connectors are actually carrying Thunderbolt signals.
While it may now seem normal, the iPhone had a proprietary 30-pin connector, while the rest of the industry used standard USB connectors.
A few years earlier, most PDAs had their own proprietary connectors, as did most flip phones. It was only around that time that others started switching to USB, and even then it was mini-USB at first, which was quickly replaced.
Apple does use proprietary connectors, but they’re good about sticking with them for years. The 30-pin connector lasted around a decade, but I hope Apple switches to Type-C across the whole line soon. The Macs, at least, are getting there.
The original Mac was basically a reaction to the more complicated buyer experience of PCs and other micros of that era, taking simplification of the UX to an extreme. Maybe too far. If you look at the rest of those choices in the context of when they were made, though, they tend to make sense.
Mini-B was the closest connector USB had, but it didn’t support audio. Remember the horrible quasi-USB connectors on WinCE smartphones of that era? Same reason.
My Motorola Razr had a standard USB connection a couple of years before the iPhone became available. I didn’t use it for anything other than charging, though ;).
2011 MacBook USB 3: did many people really care? I never noticed.
I did. It sucked.
Ah, the memories:
Who else was shipping USB3 laptops at the beginning of 2011? I don’t remember what the rollout was like.
It took until Haswell in 2013 to make USB 3 a standard part of the chipset. Until then, USB 3 was a separate chip.
I didn’t get a USB 3 laptop until late 2012. The laptop I bought at the end of 2011 was USB 2 only.
If I’m not mistaken, the i7 X220 (released April 2011) had a single USB 3.0 port - oddly, I thought they were more mainstream by then. According to Wikipedia:
Intel released its first chipset with integrated USB 3.0 ports in 2012 with the release of the Panther Point chipset.
Yeah, only on the i7 model. Not the base i5 models. So that’s like a $200 USB port. :) Default USB 3 didn’t arrive until the 230 series more than a year later.
Also the W520 had two, and the T520 didn’t.
The article also mentions “floppy drives, serial, or PS/2 ports”, and the 12″ MacBook “removing MagSafe, USB, and Thunderbolt”.
IMHO, a single USB port on a desktop or laptop is the single most annoying thing Apple does.
They don’t even support ZIP drives anymore!
I always wondered why that sort of information isn’t collected from the people who opt in to share usage data. Knowing how often ports are used and how many devices are attached seems like a non-privacy-invading thing to monitor when people opt in.
Maybe I’ll eventually get a laptop with the 8000 USB ports I need.
Apple seems increasingly out-of-touch with the professionals who helped make them the brand they are.
I think you got it the other way around. Professionals use Macs because Apple is the brand that it is, which is because they make the right products.
Of course professionals will always believe they ‘made the company’, since that sounds nicer to them, but if Apple didn’t exist, would these professionals have ‘made’ Dell instead?
I’m betting Lightning won’t gain wide adoption outside of Apple, but Apple doesn’t care; this is an opportunity to force an upsell of audio gear. There is no discernible difference in audio quality with the change. It’s a revenue play.
But headphone companies care. Until now they’ve had SKU variance only for colors, sizes, and maybe component quality. Now they are forced into working with Apple as well, or losing out on a ton of potential sales because their products don’t work with iPhones… Smart move by Apple, buying Beats, eh?
I’m saving this article to study and hopefully improve my writing style. The transition from the analogy to the main topic is so darn seamless.
There’s a video of Steve Jobs expanding upon the word “courage” with respect to the courage to not use Flash on the iPhone, back in 2010.
It’s hard to say this without sounding like a fanboy (which I’m not), but Apple’s first generation of products after making a “courageous” decision (ugh) is often quite poor. The first-generation MacBook Air was widely derided (and rightly so), but after they’d perfected it, it was a laptop model the whole industry adopted. Similarly with the plugless iPhone - it will probably take a few generations for it to get perfected (likely through improved Bluetooth).
An image I saw a few days ago made an interesting point: many third-party iPhone-compatible credit card readers use the headphone socket to connect. They won’t work anymore (well, they’ll be unwieldy with an adapter). Make of that what you will…
As @mattgreenrocks said, Apple is increasingly out-of-touch with professionals - I am very curious to see what the next round of laptop upgrades bring, as well as what will happen with the Mac Pro. It used to be the case that Apple received preferential treatment from Intel (heck, look at the custom CPU in the aforementioned 1st generation Air), but today they’re still shipping a two-generations-behind Haswell-powered 15" MacBook Pro (I’m being kind here and ignoring Kaby Lake, which was only just announced).
Many third-party iPhone-compatible credit card readers use the headphone socket to connect. They won’t work anymore (well, they’ll be unwieldy with an adapter). Make of that what you will…
This is largely incorrect: Square themselves say things are fine, see http://www.macworld.com/article/3117649/hardware/dont-worry-squares-card-reader-works-with-apples-iphone-7-headphone-jack-adapter.html for details from the horse’s mouth.
It’s a bit depressing how so many people writing paranoid articles about how “Apple is trying to kill $foo” clearly didn’t even talk to the people who made $foo. (There are enough valid reasons to complain about the lack of a headphone port without making them up, IMO.)
Yep, I know about Square not requiring it, but there are others (uDynamo, Shopify Swipe, etc).
I guess I should’ve expressed myself a bit more clearly - there’s always so much FUD when technology companies make a “courageous” decision. Cue the ulterior-motive comments.
I think as the GP said, the reader will work, but it’ll be unwieldy. A nice thing about the Square reader is that it’s (more or less) flush with the phone, which provides a good “base” to swipe the card on (imagine holding the Square reader up while swiping when it’s plugged into the Lightning adapter).
On the other hand, at least 90% of the Square readers I encounter day to day (Los Angeles) are plugged into iPads, not iPhones. And with chip and pin, the reader is a larger, separate device anyway.
The single-USB move on Apple’s part was also stupid. I have a MacBook Pro with 2 USB ports, and that isn’t nearly enough either. I end up using both ports, with one running a 4-port hub, and every last port is used.
Lightning is a strange proprietary standard that’s unlikely to gain wide adoption,
No value judgement there, right? :)
This blog post is worse than removing the 3.5mm audio jack. It says nothing new, all of this has been said about a thousand times in the past months. This has been discussed to death leading up to the iPhone 7 announcement, and is now beating a dead horse.
If you don’t like the removal of the jack, don’t buy an iPhone 7. But if you truly think it was a mistake, you’re wrong. The sales of the iPhone 7 are solid and probably will remain so. The reality is most people don’t need it. Apple is right. They have the usage statistics to make this decision based on data, not based on lame editorials.
Lightning is definitely not the future of getting audio out of an iPhone; wireless is. Maybe Bluetooth will get replaced by something else, but right now lots of people use Bluetooth. I use it to play audio in my car every day, and it works fine. It could be better, but it’s good enough. It’s a lot more convenient than plugging my phone into an audio jack - I don’t even have to take it out of my pocket.
Heh. People do love their schadenfreude. So the day after the iPhone 7 announcement, people were cackling that the stock price was tanking. “See, the market hates the new iPhone.” It’s now up more than 10% since then…
Wireless is not the future. The future is not a second device to power. The future is not EMI bound. The future is not an inferior DAC on an external device.
There will always be demand for a wired connection, in the same way that you still have computers talking to each other over ethernet.
Wireless is a compromise, and always will be.
Every technology choice you make represents a compromise. There is no one absolute best choice.
I chose to compromise speed and reliability of my home network in exchange for the flexibility to use it anywhere in my house and super easy installation - no fish tape for me!
Wired speaker systems offer the best sound output but have the same drawbacks as above.
Most people who use headphones have a relatively crappy set of earbuds that they use to talk on the phone and listen to music as they do other things. Apple decided the benefits of removing the cord outweigh the drawbacks. And I wager they’ll be proven correct.
Yeah, I mean that whole WiFi thing never took off. Don’t even get me started on the failed technology behind Bluetooth mice/keyboards and wireless video game controllers.
I mean, history is just filled with technology transitioning from wireless back to wired connections, like… um… like uh… hmm. Well I’m sure there’s something. No one uses wireless devices anymore.
WiFi? Urgh. It’s 2016 and I still can’t get decent performance at home, even sitting in the same room as the AP. Admittedly, I am running OpenWRT on 802.11ac hardware (not the best supported combination, but I refuse to run vendor firmware), but still.
Perhaps I should bite the bullet and buy some enterprise hardware (or even “prosumer” stuff like Ubiquiti hardware) - most places I’ve consulted at over the last ~10 years have been wireless-only for laptops and, with the right setup and hardware, it works pretty well. Admittedly, open plan offices are pretty easy to support…
eero is apparently amazing. Never tried it.
To each their own, but I’m a little puzzled that people insist on their own OS for an access point, but run stock firmware on their switches. If anything, my Ethernet backbone is far more critical than its wifi extension.
For me, and I assume for most individuals even at the “running their own firmware” level, the AP, switch and router are all the same piece of hardware (and/or the switches are dumb).
Yes, you’re right. I use enterprise switches at home but have never thought of looking for models with open source firmware (probably because I tend to buy second hand on eBay).
behold the wireless accessory future (http://www.geek.com/wp-content/uploads/2015/10/magic_mouse_2_charging.jpg)
Further, let’s do a speed test with 10Gb Ethernet and WiFi, then.
Or regular gigabit. Or 100Mb, if somebody’s using the microwave. Wireless connections are really bad, but it turns out the general public mostly uses the sum total of technological innovation since the 70s or so to share their political views with people who don’t care and watch cat videos, and so most of the sum total of technological innovation since the 90s or so has been gobbled up by Wirth’s Law and not really used for anything. So people don’t care, and choose what to buy based on what’s shiniest.
I think that’s an overly pessimistic view. I use wireless at work and it’s perfectly adequate. I download files from servers on our intranet at 50-80 megabytes a second with no problem. While technically slower than the theoretical maximum of ~125 megabytes a second on gigabit ethernet, I still can’t be assed to plug in the ethernet cable that sits unused on my desk 2 feet from my laptop every single day.
I stream Netflix over LTE on my train ride to work and the quality is perfectly adequate, no complaints. The only things I ever need 10 gigabit for usually involve a cluster of servers, not my laptop.
Oh please, post the 10Gb vs. WiFi speed test benchmark results. I have been wondering if I need to get an external NIC attached via PCIe over Thunderbolt in order to fully utilize my 25Mbps home internet connection. I have a wireless AC router - do you think that’s enough for 25Mbps internet?
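Sarcasm aside, the back-of-envelope math answers itself. A quick sketch in Python (the nominal PHY rates come from the published 802.11 specs; the 50% usable-throughput factor is a rough rule of thumb I’m assuming, not a measurement):

```python
# Back-of-envelope check: is 802.11ac overkill for a 25 Mbps WAN link?
# PHY rates are nominal maximums; real throughput is much lower, so apply
# a crude 50% efficiency factor for protocol and airtime overhead.

NOMINAL_PHY_MBPS = {
    "802.11g": 54,
    "802.11n (1 stream)": 150,
    "802.11ac (1 stream, 80 MHz)": 433,
}

WAN_MBPS = 25
EFFICIENCY = 0.5  # assumed rule of thumb, not a benchmark

for standard, phy in NOMINAL_PHY_MBPS.items():
    usable = phy * EFFICIENCY
    verdict = "plenty" if usable >= WAN_MBPS else "marginal"
    print(f"{standard}: ~{usable:.0f} Mbps usable -> {verdict} for a {WAN_MBPS} Mbps line")
```

Even single-stream 802.11g clears 25 Mbps with overhead accounted for, so no, you don’t need the Thunderbolt NIC.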
I’d get two NICs and bond the ports just to be sure.
Print this out and hold onto it for 10 years and you might learn some humility.
And yet here I am, using wired speakers and using wired internet.
The point is that for every leap in wifi, there is an equal if not larger leap in wired throughput. The physics involved favor wires.
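That claim is easy to eyeball with the nominal peak rates from the published standards (the year pairings below are approximate, and these are spec-sheet maximums, which flatter Wi-Fi especially):

```python
# Nominal peak rates (Mbps) by rough era, from the published IEEE specs.
# Illustrative only: real-world throughput, especially for Wi-Fi, is far lower.
ethernet = {1995: 100, 1999: 1_000, 2002: 10_000, 2010: 100_000}   # Fast, GigE, 10G, 100G
wifi     = {1999: 11,  2003: 54,    2009: 600,    2013: 6_933}     # 11b, 11g, 11n, 11ac wave 2

for (eth_year, eth), (wifi_year, wf) in zip(sorted(ethernet.items()),
                                            sorted(wifi.items())):
    print(f"~{eth_year}/{wifi_year}: Ethernet {eth:>7} Mbps vs Wi-Fi {wf:>5} Mbps "
          f"(wired lead: {eth / wf:.0f}x)")
```

In every era the contemporary wired spec leads by an order of magnitude, which is the point: wireless keeps improving, but the wired ceiling moves up at least as fast.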
At some point, yes wireless will be good enough for most people, but absolute performance will always reside with wires.
First you say:
Wireless is not the future.
Then you say:
wireless will be good enough for most people
So, wireless is, and is not, the future?