The fundamental problem with USB-C is also, seemingly, its selling point: USB-C is a connector shape, not a bus. It’s impossible to communicate that intelligibly to the average consumer, so now people are expecting external GPUs (which run on Intel’s Thunderbolt bus) to work with their Nintendo Switch (which supports only USB 3 and DisplayPort as external buses), because hey, the Switch has USB-C and the eGPU connects with USB-C, so it must work, right? And hey, why can I charge with this port but not that port, when they’re “exactly the same”?
This “one connector to rule them all, with opaque and hard to explain incompatibilities hidden behind them” movement seems like a very foolish consistency.
It’s not even a particularly good connector. This is anecdotal, of course, but I have been using USB Type-A connectors since around the year 2000. In that time not a single connector has physically failed for me. In the year that I’ve had a device with Type-C ports (the current MacBook Pro), both ports have become loose enough that simply bumping the cable will cause the charging state to flap. The Type-A connector may only connect in one orientation, but damn if it isn’t resilient.
Might be crappy hardware. My phone and ThinkPad have been holding up just fine. USB-C seems a lot more robust than Micro-B.
It is much better, but it’s still quite delicate with the “tongue” in the device port and all. It’s also very easy to bend the metal sheeting around the USB-C plug by stepping on it etc.
The perfect connector has already been invented, and it’s the 3.5mm audio jack.
Every time someone announces a new connector and it’s not a cylindrical plug, I give up a little more on ever seeing a new connector introduced that’s not a fragile and/or obnoxious piece of crap.
Audio jacks are horrible from a durability perspective. I have had many plugs become bent and jacks damaged over the years, resulting in crossover or nothing playing at all. I have never had a USB cable fail on me because I stood up with it plugged in.
Not been my experience. I’ve never had either USB-A or 3.5mm audio fail. (Even if they are in practice fragile, it’s totally possible to reinforce the connection basically as much as you want, which is not true of micro USB or USB-C.) Micro USB, on the other hand, is quite fragile, and USB-C perpetuates its most fragile feature (the contact-loaded “tongue”—also, both of them unforgivably put the fragile feature on the device—i.e., expensive—side of the connection).
You can’t feasibly fit enough pins for high-bandwidth data into a TR(RRRR…)S plug.
You could potentially go optical with a cylindrical plug, I suppose.
Until the cable breaks because it gets squished in your bag.
3.5mm connectors are not durable and are absolutely unfit for any sort of high-speed data.
They easily get bent and any sort of imperfection translates to small interruptions in the connection when the connector turns. If I – after my hearing’s been demolished by recurring ear infections, loud eurobeat, and gunshots – can notice those tiny interruptions while listening to music, a multigigabit SerDes PHY absolutely will too.
This. USB-A is the only type of USB connector that has never failed for me. All the B types (standard, Mini, Micro) and now C have failed for me in some situation (breaking off, getting wobbly, loose connections, etc.)
That said, Apple displays iPhones in its stores resting solely on their plugs. That alone speaks to some solid reliability engineering on those ports. Plus, the holes in devices don’t need a “tongue” that might break off at some point - the Lightning plug itself doesn’t have any intricate holes or similar and is made (mostly) of a solid piece of metal.
As much as I despise Apple, I really love the feeling and robustness of the Lightning plug.
I’m having the same problem: the slightest bump will knock it out of charging mode. I’ve been listening to music a lot recently and it gets really annoying.
Have you tried to clean the port you are using for charging?
I have noticed that Type-C seems to suffer a lot more from lint in the ports than Type-A does.
It’s impossible to communicate that intelligibly to the average consumer,
That’s an optimistic view of things. It’s not just “average consumer[s]” who’ll be affected by this; there will almost certainly be security issues originating from the Alternate Mode thing – because different protocols (like thunderbolt / displayport / PCIe / USB 3) have extremely different semantics and attack surfaces.
It’s an understandable thing to do, given how “every data link standard converges to serial point-to-point links connected in a tiered-star topology and transporting packets”, and there’s indeed lots in common between all these standards and their PHYs and cable preferences; but melding them all into one connector is a bit dangerous.
I don’t want a USB device of unknown provenance to be able to talk with my GPU and I certainly don’t want it to even think of speaking PCIe to me! It speaking USB is frankly, scary enough. What if it lies about its PCIe Requester ID and my PCIe switch is fooled? How scary and uncouth!
Another complication: making every port do everything is expensive, so you end up with fewer ports total - Thunderbolt in particular. Laptops with 4 USB-A ports, HDMI, DisplayPort, Ethernet, and power are easy to find. I doubt you’ll ever see a laptop with 8 full-featured USB-C ports.
This is a great overview of things to bring up when someone on their high horse is trying to denigrate your profession using knowledge culled from a brief interaction with Wordpress or something else equally terrible.
Frankly, from my perspective, I wonder if all this attention to external aesthetics has compromised attention to the OS and the hardware.
It’s rather unlikely that the folks drafting curves in AutoCAD are doing double duty as OS engineers.
yes, but here “attention” means organizational priority (manifesting as team management quality, team size, team experience, etc.), not individual engineers' attention. In smaller companies such priorities are often reflected in individual engineers' time prioritization, yes; in larger companies it’s different.
Apple has enough resources that it can’t be a penny-pinching tradeoff. If there is a deficiency in the OS and hardware, it is just that, a deficiency.
Sure, but it’s a deficiency driven by the internal prioritization - and, in my enterprise experience, is typically unrelated to “not having enough dollars” but “not having enough focus”. Solving the problem means increasing the organizational visibility of the teams responsible for OS/HW, and that’s not something extra money can solve.
To be fair, the OS and hardware have never been a priority for Apple. They’ve always prioritized the product. That’s why macOS only runs on Apple hardware (without some jiggering): they’re selling you a product, and the software exists because the product demands software exist, not because Apple is interested in making good software.
I like to think that macOS doesn’t run on other hardware (legally) so that they avoid legal issues around monopolies. Back in the 90s Apple clones were prevalent, and they ran Mac OS. I was young, but I don’t recall clone companies being sued out of existence or anything (would love more context). But MSFT had legal problems with their strategy of Windows on every computer. By controlling both the hardware and the OS, it’s easy to make your own rules, methinks.
The clones you’re referring to from the 90s were all officially sanctioned by Apple (see Wikipedia). When Steve Jobs returned in 1997 he brought an end to the program and since then Apple have been pretty quick to chase anyone making clones.
There’s nothing about being vertically integrated that changes the rules about monopolies. If anything, it makes it easier for competitors to say “you’re a monopoly!” if you’ve gone that route. Apple isn’t concerned about anti-trust actions, for the most part, because they simply don’t have the market-share for that to matter.
Microsoft’s problems had less to do with the popularity of Windows and more to do with how Microsoft used that popularity - mostly, by bundling a browser with their OS in an era when browsers weren’t considered free (as in beer) software.
If anything, it makes it easier for competitors to say “you’re a monopoly!”
That seems counter intuitive to me, but IANAL. It seems like it would / should be perfectly legal for Apple to say, “you cannot develop software to target macOS, or iOS,” and then sell you only their own software. Or, allow you to write whatever you want, but bundle their own apps (like they do). But! Only because they control the whole experience. I can’t have macOS without buying a computer from Apple. I can simply boycott Apple if I don’t like this.
But, consider Windows. In the 90s and early 2000s (and maybe now too? I don’t know), you could buy a shrink-wrapped copy of Windows off the shelf. But you didn’t have to unless you were upgrading. That’s because MSFT used anti-competitive practices. If Dell wanted to offer pre-installed copies of Windows on their machines, they could buy an OEM license and save tons of money (making it possible to charge less for PCs), or they could buy full licenses and charge more for their machines. Naturally consumers wanted the cheaper option, and so companies like Dell were boxed into OEM deals, and consumers into Windows. Suddenly, the only difference between Gateway and Dell was branding and stock RAM configuration.
Microsoft was marketing Windows as an enabling experience. It’s just an OS. You can buy Microsoft Office, or WordPerfect, and Photoshop… All this software from these other vendors is compatible with Windows! Build your software for Windows! It’s designed for Windows! To run on ANY PC! And they marketed that. Constantly. So, when they bundled IE, suddenly that claim, and that sort of “promise” became even more anti-competitive. It was an unfair advantage, since everyone was forced into having Windows already to begin with.
I don’t know if that matters, but that’s always been my take on it.
That seems counter intuitive to me, but IANAL.
Nor am I, but anti-trust laws treat monopolies like decency laws treat pornography: you know it when you see it. Apple, by running its own walled-garden app store that explicitly prohibits apps which compete with its own applications, is clearly being anti-competitive. But that’s not enough to run afoul of anti-trust laws, alone. There are a lot of other factors, like whether they have a dominant market position, whether there is other competition, and whether there’s any sign of collusion between them and other actors in the market - which is why they did get in trouble for price fixing with the iBooks store (which, for the record, I think was a dumb decision, as was the Windows/IE decision).
Anti-trust laws are designed to limit vertical integration, but not prevent it. What is generally not allowed, for example, is owning the mine, owning the refinery and foundries, and then selling steel to the car factory (which you also own) at below the market rate.
Solving the problem means increasing the organizational visibility of the teams responsible for OS/HW, and that’s not something extra money can solve.
Right, exactly. That’s why a deficiency in the OS is probably wholly unrelated to dollars being thrown at the industrial engineering at the expense of OS engineering.
It’s not an either-or situation, it could easily be both-and if Apple wanted to, which is why the initial assertion of “external aesthetics [compromising] attention to the OS and the hardware” is a rather silly false dichotomy.
Agreed. It’s most likely related to all the company’s eyeballs being thrown at industrial engineering at the expense of OS engineering - which is, I believe, the point kghose is advancing :)
It’s easy to spend money in an enterprise, and I wager Apple’s OS engineering team isn’t suffering financially relative to the industrial design team. It is very hard, however, to end up on your four-over-manager’s yearly goals unless your team directly aligns with your five-over-manager’s edict to make beautiful products.
I wonder if all this attention to external aesthetics has compromised attention to the OS and the hardware.
Or the sense of care given to industrial design could have created an environment where taking the time for excellence was known to be allowed, and perhaps motivated the OS folks to do the same. Who knows, if the design team didn’t try so hard, maybe the OS would be shittier too?
My guess is OS work just has shorter lifecycles and is generally more “rushed” than the industrial design and hardware. I mean, hardware can be years in the making. Industrial design has many prototype phases too. They still have to churn out OS updates in the meantime.
I think the findings make sense if you consider he used a speechwriter, like every President before him. Improvisational Trump is a wildly different orator than this Trump, even if his delivery is similar.
I like the idea but I take some issue with this line since some of the sites just haven’t changed in over a decade:
Brutalism can be seen as a reaction by a younger generation to the lightness, optimism, and frivolity of today’s web design.
Yes, the author is essentially ignorant of the web and its history. (Also, brutalist architecture is teeth-hurtingly ugly and unpleasant; these websites aren’t).
“designer wank” is my classification of this sort of writeup: ignorance, faux-intellectual thought, and attention purely on the superficial aspects of a web site.
80% accuracy is really bad for this. People are really offended if you get their gender wrong and they notice, especially if, like most people, they haven’t been worn down by that being an everyday occurrence.
It’s an interesting description of a machine learning technique, I just know that there are companies using things like it, and am deeply saddened by that.
Agreed. If you think you need gender information, you probably don’t.
Well, you have to pick a pronoun at some point. You can use ‘they’, but there are also people who get wildly upset at that for no good reason. Pick your poison.
Why? The only reason I’ve run into for software to generate text referring to users in the third-person is when it’s some sort of social networking or communication tool. And when it’s that, users should be specifying their pronouns (as Facebook and Google+ both allow; Twitter doesn’t, but avoids pronouns entirely).
There are languages where second-person pronouns have to be gendered, and that must indeed be awkward wrt addressing the user. I tend to favor messaging that avoids trying to greet the user familiarly, which would avoid this issue… but I am told it has great appeal from a marketing standpoint, and I’m not equipped to argue with that.
Oh, I wasn’t thinking of just software. Missed the context.
The thing is, sometimes ‘they’ is the correct pronoun, and yet when someone says “you have to pick a pronoun” they rarely mean to include that as one of the options.
Better to ask for pronouns if you need pronouns, and not gender information.
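To make the “ask for pronouns, not gender” point concrete, here’s a minimal sketch of how that might look in application code. This is purely illustrative - the preset names, field names, and functions are my own assumptions, not from any particular product - but it shows the shape of the idea: store whatever the user told you, and fall back to singular “they” rather than guessing.

```python
# Hypothetical sketch: use the user's self-specified pronouns instead of
# inferring gender. All names here (PRESETS, profile fields, functions)
# are made up for illustration.

DEFAULT_PRONOUNS = {"subject": "they", "object": "them", "possessive": "their"}

# A few common presets; a real app might also let users enter free-form sets.
PRESETS = {
    "she/her": {"subject": "she", "object": "her", "possessive": "her"},
    "he/him": {"subject": "he", "object": "him", "possessive": "his"},
    "they/them": DEFAULT_PRONOUNS,
}

def pronouns_for(profile: dict) -> dict:
    """Return the user's chosen pronoun set, defaulting to singular 'they'."""
    return PRESETS.get(profile.get("pronouns", ""), DEFAULT_PRONOUNS)

def activity_message(profile: dict) -> str:
    """Build a third-person activity-feed message for this user."""
    p = pronouns_for(profile)
    return f"{profile['name']} updated {p['possessive']} profile."

print(activity_message({"name": "Sam", "pronouns": "she/her"}))
# Sam updated her profile.
print(activity_message({"name": "Alex"}))  # no pronouns specified
# Alex updated their profile.
```

The key design choice is that the default is “they”, which is grammatical for an unknown referent, rather than a classifier’s 80%-confident guess.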
I never used to use “they” because I was taught to use “he or she” but the singular “they” is becoming accepted by style guides and the like, so I feel less weird using it.
If it makes you feel any better, singular they was common usage until 20th-century prescriptive grammarians decided it shouldn’t be. Most people still use it, but only in relaxed speech - if you point out they’ve been doing it, it’s common for people to switch to “he or she”. What we’re told at age ten has this way of sticking with us…
this is one place where us southerners do it right - when in doubt, just say y'all. It works for second person plural, singular, and I’ve even heard enterprising folks stretch it into the third person.
I hate to be this guy, but those margins must be razor-thin. Change-your-profession-thin.
I’ve said this before, but I guess I’ll say it again, Laravel really makes PHP pretty bearable.
Somewhat confused what the problem is. Somebody created a domain. If I understand correctly, that somebody is owner of that domain, since they appear able to host content with it. That somebody also got a certificate. That sounds a lot like what is supposed to happen.
Yes, Let’s Encrypt is not what’s being abused here; the victims' DNS hosting is. It’s absurd to suggest that Let’s Encrypt should have a role in preventing this, but I don’t want to address that in detail because there’s actually something more important on the table.
Let’s Encrypt was founded to work towards a future where it’s realistic to expect SSL for every connection. This is a goal I profoundly agree with, as I’ve posted before. SSL for everything is fundamentally incompatible with SSL only for nice people, and that’s okay - that was always the goal.
Unfortunately, from the perspective of many players, SSL is a marketing tool (“See, you can trust us!”), and anything it actually achieves or fails to achieve is incidental. What Let’s Encrypt is doing weakens the marketing value of serving via SSL, because it means users are going to have to adapt to the idea that only explicitly illegitimate sites don’t have it, and there will no longer be extra trust for the sites that do. It also can’t help but bring the price of certificates down in general, in the long run, which upsets the for-profit CA business model, even though it probably won’t demolish it.
It isn’t necessary for critics of Let’s Encrypt to disclose their financial incentives explicitly, because that’s clear just from looking at what types of business they are. There’s nothing shadowy happening here, but there is a large-scale disagreement going on that not everybody has noticed, and this sort of thing needs to be read with some cynicism.
Unfortunately, from the perspective of many players, SSL is a marketing tool (“See, you can trust us!”), and anything it actually achieves or fails to achieve is incidental.
Well stated. The “lock” icon has become synonymous with “it must be OK to enter my credit card number or bank account password here.” The lock icon and those idiotic “Norton Secured” badges.
which upsets the for-profit CA business model, even though it probably won’t demolish it.
One can dream, though, right?
The problem is, as stated in the article, Trend Micro itself is a CA, and is missing out on some SSL certificate cash because of Let’s Encrypt. This is a minor hit piece.
At least now when malware steals your data it will be encrypted!
That is a security improvement. It means an MITM can’t steal your data a second time. And state actors have been doing that routinely as a hard-to-trace exfiltration strategy, so…
Yes! I can respect malware authors who have enough respect for their victims to ensure that no one else can steal it. It’s certainly a big step up from the wild west that existed before.
Thanks, Let’s Encrypt!
Though, I’d respect them a lot more if they went with an EV cert. That’d show some real initiative.
“Somebody created a domain … [and] is owner of that domain”
Nope, the problem was that attackers were able to control a third-party’s DNS, create a subdomain, and host (malicious) content on that subdomain, protected by a fresh LetsEncrypt cert.
Put in my two weeks at a full-stack position, trading in my PHPs for an all-front-end Angular role.
Moving my crummy MUD-like game from a TCP client/server model to a Flask app using Flask-SocketIO. Pretty easy going so far since the core of the server was very loosely coupled with the TCP server. The curtsies Python library is good – but I just don’t have the patience for curses, even if it’s abstracted via a wrapper. Excited because as a non-CS guy originally I managed to create a scripting language for use within the game of which maybe only 95% is a tire fire.