That is not a good idea. Many sites might be affected without ever seeing that notice. Bing has more usage than you might think.
Plenty of time for people to yell. How many Discourse sites pull the latest code on a regular basis? :)
I do. Luckily, I saw this post in my RSS/Atom reader. You can write an opinion here.
Crawl-delay isn't part of the original robots.txt spec, but most bots support it as an extension.
Rather than blocking Bing outright, they should add Crawl-delay: 10
Bing documents this:
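For reference, a minimal robots.txt sketch using the non-standard Crawl-delay extension (bingbot is Bing's documented crawler user-agent; Bing interprets the value roughly as seconds between requests):

```
# Slow Bing's crawler down instead of blocking it entirely
User-agent: bingbot
Crawl-delay: 10
```

Other crawlers that honor the extension would need their own User-agent section, or a catch-all `User-agent: *` group.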
Maybe there’s a punitive angle to this action?
The only ones who will feel real pain from that action are the webmasters who didn’t see the announcement and lose traffic.
As noted in the very thread you linked, that’s only done on crash, and can trivially be disabled if you wish with a toggle in settings. This isn’t meaningfully different from Firefox and Chrome sending memory dumps when they crash. I can see your argument it should be opt-in, and I think I’d agree, but it’s not nefarious.
Well, the meaningful difference is that Firefox prompts before sending crash reports.
Yup. What @gecko said. Our InfoSec group has blessed it for internal use provided said feature is disabled, and they're pretty damn hardcore about such things. To me, if this is your only blocker, you should look again.
Considering MS’s history on that topic, I don’t trust them and wouldn’t install it myself.
Don’t trust. Verify. It’s an open source project. Go read the source code yourself, build it yourself, etc.
git clone https://github.com/Microsoft/vscode.git
cd vscode && find . -name '*.ts' | xargs wc -l
Be my guest.
Not even counting dependencies here. You can’t realistically verify that kind of software by yourself.
Even if you manage to read and carefully inspect this codebase (rogue commits tend to be quite hard to spot; you won't catch one with a quick, distracted read), you'd still have to keep up with the ~100 commits landing every day.
My point is: nowadays, ultimately, you'll always rely on trust to some extent when it comes to your security, even when using open source software.
There are similar problems with AMP. If you don’t use AMP (which is bad for Web publishers), you won’t appear at the top of the search rankings on mobile.
I think that the trend is going to change due to the recent PATENTS file decision. Many companies can't use React because of it, and that number is probably going to increase.
A problem with that PATENTS file is that it appears to give Facebook a license to violate your patents. If you don't defend your patents, you risk losing them; but if you challenge Facebook for violating them, you risk losing your license to React. Many companies don't use Facebook's libraries for that reason.
It isn’t wise to have that kind of thing hanging over “Free software” projects. Because React, Immutable, and other Facebook libraries are becoming dependencies of other projects, it’s poisoning the open source ecosystem.
Easy solution: delete the PATENTS file from all Facebook repos and keep them under a simple, well-known BSD license.
Is the site offline? I get an error, even when switching networks and browsers: “This page isn’t working - voice.mozilla.org is currently unable to handle this request.”
Works for me
Those messages already appear on Chrome and Firefox.
I believe (in Chrome, at least) that currently, that’s only for forms with password or CC fields. Non-HTTPS sites with just generic fields don’t display “Not secure” in the address bar - at least for me.
Oh, I see. All text inputs now.
The article explains how this is changing in new versions of Chrome. I believe the Chrome team have a target to move away from telling people a site is “secure” and instead notify them for any site that is not secured with TLS. This is just one more step along that road.
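As a sketch of what triggers the warning (if I recall correctly, starting around Chrome 62): any page served over plain http:// flips the address bar to "Not secure" as soon as the user types into any input field, not just password or credit-card fields, and Incognito mode shows it on all HTTP pages. A hypothetical example:

```
<!-- Served over plain http:// -->
<!-- Typing into this ordinary text field is enough to trigger
     the "Not secure" label in newer Chrome versions -->
<form action="/search">
  <input type="text" name="q">
</form>
```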
Github says: “Sorry, this file is invalid so it cannot be displayed.”
Excitingly, it works for me… I love computers. Is it corrupted if you download it?
No, the download link delivers a working PDF.
It works now. There must have been a glitch with Github.
An experienced keyboard user with a keyboard is much faster than an experienced mouse user with a mouse. A non-trained user might be faster with a mouse than a keyboard, because fast keyboard usage requires more training.
Like the article says too, this depends largely on what you wish to accomplish. There are tasks which will almost always be faster with a mouse. Aiming in FPS games comes to mind as a spectacular example.
FPS gaming is just one specialized example. For general computer use (web surfing, programming, email, etc.), experienced keyboard users are much faster than experienced mouse users, especially when using tools that are made for keyboard enthusiasts:
This sounds a lot like the point the post was making. People who like keyboards think they’re faster with keyboards even when they’re slower.
Well, how many professional gamers use any of those? Or, to give another example: a lot of things in Photoshop or the GIMP are going to be much faster - and much easier - with a mouse, than with a keyboard. Some other things in both are going to be faster with a keyboard.
Point still is: the keyboard is not always faster. For a lot of things, it is. For a lot of other things, it is not. And how much of these each person uses varies from person to person.
I’m not a gamer or designer, but some things in those programs are probably faster with a mouse. GIMP and Inkscape are faster with knowledge of the keyboard, though. It’s much faster to hit ‘o’ than to grab the mouse and find the eyedropper tool in GIMP, or to press Ctrl+Shift+F in Inkscape to open the fill settings.
Web surfing, text editing, window management, and other common tasks are unambiguously faster with a keyboard (by a trained keyboard user). It’s worth the time investment.
Shortcuts are faster with the keyboard, yes. But once you’ve selected the eyedropper tool, will you use the mouse or the keyboard to do something with it?
It is worth investing in using the keyboard efficiently - I never said otherwise. But it will never be unambiguously faster for everything. It will always be “it depends”.
Sorry, I might not have explained my point well. I don’t mean literally for every single hand movement – only that someone who is experienced with advanced keyboard control will be much faster than an advanced mouse user who doesn’t use many keyboard commands, all other things being equal.
It isn’t just that new editors don’t know the rules. The real problem is that admins tolerate horrible behavior from long-time editors which drives away other editors who are following the rules.
Long-time editors know exactly what behaviour they can get away with, whilst driving newbies into paroxysms of frustration that end with them breaking rules they didn’t even know about.
The combination of the two effects is incredibly toxic. It’s like a breeding ground for bad faith behaviour.