This story is totally a viral marketing stunt and probably has nothing to do with “AI” except an algo that finds an exit from a maze. But hey, maybe it lands them some business.
This, together with the recent massive phishing attack through a Google Docs app, really shows that clever social engineering is as effective as it was in the Mitnick era. No need to be a prodigy programmer, just connect the dots ;)
I think Let’s Encrypt is an awesome service, and providing certs for free is really great for admins, but… I can imagine a scenario where, after LE grows substantially and, for example, renews 100,000+ certs per day, serious havoc spreads through the web when their renewal service goes offline for a good portion of a day and 100K+ websites are left serving expired certs. What will their millions of visitors do? Add exceptions? Browse elsewhere? I have no good solution in mind. Maybe it’s just the hidden cost of LE. Not that using other cert authorities is any better (it’s worse).
I don’t believe that’s an issue. LE certificates are valid for 90 days, but most clients are set up to renew them after only 60 days. Consequently, the LE servers would have to be down for a full month before certificates start expiring.
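The arithmetic above can be sketched in a few lines. This is a minimal illustration, assuming a certbot-style policy of first attempting renewal 60 days after issuance; the function name and dates are made up for the example:

```python
from datetime import date, timedelta

VALIDITY_DAYS = 90   # LE certificates are valid for 90 days
RENEW_AT_DAYS = 60   # typical clients start renewing after 60 days

def outage_budget(issued: date, today: date) -> int:
    """Days the renewal service could stay down from `today` onward
    before this certificate expires, assuming renewal was first
    attempted RENEW_AT_DAYS after issuance."""
    expiry = issued + timedelta(days=VALIDITY_DAYS)
    first_attempt = issued + timedelta(days=RENEW_AT_DAYS)
    return (expiry - max(first_attempt, today)).days

# On the day of the first renewal attempt, the buffer is a full 30 days:
print(outage_budget(issued=date(2017, 5, 1), today=date(2017, 6, 30)))  # → 30
```

So even in the worst case, an outage would have to last about a month before on-schedule certificates start expiring.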
Ah, forgot about that. You are correct; 30 days would be enough to sort things out.
Aside from the clickbait title, the article is nice. Microsoft’s decision is correct. Looking at how locked-down iOS compares to the looser Android, I can see the point. It makes sense to give regular users a restricted Win S that may finally put an end to the malware-infested debacle called Windows. And for the savvy ones, Win Pro is a click away.
At least WikiNews could be an additional source.
In general, you only provide links with the headlines as text. That should be fine for all the other sources as well.
Yes, it should. But I had a problem with correctly filtering out crappy news and clickbait headlines. With your weighting approach it would be easier. Additionally, for context cross-referencing I’m relying on Wikipedia keywording for now, but with additional sources I would have to implement keywording based on, for example, parts-of-speech dissection (verbs).
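To make the keywording idea concrete, here is a rough sketch. A real implementation would run a proper part-of-speech tagger (e.g. NLTK or spaCy) and keep only verbs and nouns; the stop-word filter below is just a crude stand-in for that step, and the word list is illustrative:

```python
import re

# Crude stand-in for a real POS tagger: drop common function words and
# treat the remaining tokens as candidate keywords.
STOPWORDS = {
    "a", "an", "the", "is", "are", "was", "were", "to", "of", "in", "on",
    "for", "and", "or", "with", "this", "that", "it", "as", "at", "by",
}

def keywords(headline: str) -> list[str]:
    tokens = re.findall(r"[a-z0-9']+", headline.lower())
    return [t for t in tokens if t not in STOPWORDS]

print(keywords("Microsoft restricts Windows S to the Store"))
# → ['microsoft', 'restricts', 'windows', 's', 'store']
```

The extracted keywords could then be matched against Wikipedia topics for cross-referencing.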
The general idea seems to be like Google News, but without the tracking, the Google, or the images. I like that, and I made a similar site in German.
This looks cool! How do you aggregate news links from different sources? A combined feed?
It is essentially a feed reader with manual weights for ordering.
Nice idea with weighting by tag and source quality, I didn’t think of it like that.
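A minimal sketch of what such weighting could look like. Everything here is hypothetical: the source names, tags, and weight values are invented for illustration, not taken from the actual site:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    title: str
    source: str
    tags: list[str] = field(default_factory=list)

# Manual weights: how much a source and a tag are trusted/valued.
SOURCE_WEIGHT = {"reuters": 1.0, "random-blog": 0.3}   # source quality
TAG_WEIGHT = {"politics": 0.8, "celebrity": 0.1}       # topic interest

def score(item: Item) -> float:
    """Combine source quality with the best-matching tag weight;
    unknown sources and tags fall back to a neutral 0.5."""
    tag_part = max((TAG_WEIGHT.get(t, 0.5) for t in item.tags), default=0.5)
    return SOURCE_WEIGHT.get(item.source, 0.5) * tag_part

items = [
    Item("Election results", "reuters", ["politics"]),
    Item("Star spotted at cafe", "random-blog", ["celebrity"]),
]
for it in sorted(items, key=score, reverse=True):
    print(f"{score(it):.2f}  {it.title}")
```

Ordering the merged feed is then just a sort by this score, with clickbait-heavy sources and tags sinking to the bottom.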
Thanks for the comments: