My blog runs on Pelican. So far it’s been a good experience. The greatest selling point for me was support for per-category (though not per-tag) Atom feeds. It’s a regrettably rare feature, but if you plan to ever add yourself to blog aggregators, or simply give readers an option to filter out the kind of stuff they are not interested in, it’s really nice to have.
Yes. A good thing about Pelican, among others, is that it doesn’t require a gazillion dependencies and just works.
I also recommend https://github.com/spanezz/staticsite because it doesn’t force a filesystem layout, HTML contents, or Markdown format on you. Unlike other generators, you can use it to improve an existing “handmade” website without having to start from scratch.
My blog has been on hiatus since my most recent child was born, but another benefit of Pelican is that it is also one of the last homes for reStructuredText holdouts.
I also found the hooks for adding logic/post-processing to be painless.
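For what it’s worth, Pelican’s hooks are signals you connect plain callbacks to. The sketch below uses a hand-rolled stand-in so it runs on its own; the `Signal` class and the `article_generated` hook name are simplified illustrations, not Pelican’s actual API (the real hooks are blinker signals in `pelican.signals`, such as `content_object_init`):

```python
# Minimal stand-in for the signal/hook pattern Pelican exposes.
# Signal and article_generated are illustrative, not Pelican's real API.

class Signal:
    def __init__(self):
        self._receivers = []

    def connect(self, fn):
        """Register fn to be called whenever the signal fires."""
        self._receivers.append(fn)
        return fn  # returning fn lets connect be used as a decorator

    def send(self, obj):
        for fn in self._receivers:
            fn(obj)

article_generated = Signal()  # hypothetical hook point

@article_generated.connect
def append_footer(article):
    # Post-process the generated content, e.g. stamp a footer.
    article["content"] += "\n<!-- built by my plugin -->"

article = {"content": "<p>Hello</p>"}
article_generated.send(article)
print(article["content"])
```

The painless part is exactly this shape: a Pelican plugin is just a function registered against one of the generator’s signals, so adding post-processing logic never means touching Pelican’s own code.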
TypeMatrix 2020 (dvorak). An excellent key arrangement.
1900: people going around on horses, public lighting using gas.
1960: cars, jet and nuclear powered airplanes, satellites, semiconductors, computers with LISP and COBOL compilers, antibiotics, fiber optics, nuclear fusion experiments (tokamak)
2020: another 60 years on, and what do we really have to show?
Compared to “commonplace” things like cars and antibiotics? Internet, GPS, maglevs, a vast array of surgical techniques, the absence of smallpox…
Compared to “works but government and academia only” things like satellites and compilers? Hololens, quantum computers, drones, railguns, graphene, carbon nanotubes, metamaterials…
Compared to “wildly experimental and probably won’t ever happen” things like tokamaks and nuclear airplanes? Probably a lot of classified shit. Antimatter experiments at the LHC. Arguably a lot of work with AI.
Maglevs were invented in the 1950s and first operated in the 1970s. I also don’t have anything made from graphene, or know anyone who knows anyone owning a graphene artefact.
More importantly, none of that is imagination shattering from 1960s point of view. We do not have things mid-century people couldn’t come up with.
Antibiotics, heavier-than-air flight, cars, and computers (if you count Jacquard looms) were all demonstrated before the 1900s. They weren’t imagination shattering from an 1890s point of view.
Even the internet isn’t imagination shattering from an 1890s point of view.
Antibiotics, heavier-than-air flight, and a programmable computer were not demonstrated before the 1900s.
Do any of these look close to what our modern conceptions of these things are? Not really. But it shows that the evolution from the first demonstration of an idea to widespread use of a polished version takes time.
There’s a huge difference between observation of mold and a concept of antibiotics, no matter how trivial that sounds with hindsight.
The “uncontrolled hop” does not qualify as a flight, except in the most trivial sense.
The loom is not a computer, but I’d love to see a fizzbuzz with Jacquard patterns to prove me wrong.
It still means that all of the “imagination shattering” stuff in the 1960s had precedents more than half a century old. We do not have things mid-century people could not have come up with. They did not have things 1800s people could not have come up with, so we shouldn’t be thinking that our era is particularly barren.
I think it is reasonable to say that the reworking of daily life has slowed.
The stove, the refrigerator and the car changed the routine of life tremendously.
The computer might be more impressive by any number of measures but it didn’t rework daily life so much as add another layer on top of ordinary life. We still must cook meals and drive around.
The linear extension of the car and the stove would be the auto-chef and the flying/auto-driving car.
Both things are still further off than the press sometimes claims, but they seem a bit closer than they did in 2012. However, the automation offered by externally available power, which began in the 1800s, has definitely reached a point of diminishing returns.
We may experience further progress through computers, AI and such. But this seems to be hampered by a “complexity barrier”: an amount of daily-life automation equivalent to what various technologies offered earlier through power now requires systems that are much more computationally complex. Folding towels really does turn out to be the hard part of doing laundry, and even with vast advances in computational ability, we may still be at diminishing returns.
There have been significant advances since then (for instance, in medical treatments like cancer therapies and surgery; life expectancy in the US has risen from 70 to 79 since 1960), but nothing revolutionary that would seem remotely as magical as the developments across the first half of the century.
The feature selection for the detailed comparison looks quite odd. All features get the same weight e.g. calling assembly code is as important as speed (!).
There’s no mention of compile times, memory efficiency, readability and expressiveness, macros / programmability, target platforms, memory safety, binary size, community, documentation…
I’m not very familiar with Pascal and Ada, but perhaps in some (most?) of those cases they didn’t want to flood the chart with ties? For readability’s sake they only included what they felt was most relevant for them.
Good on you. It’s worth mentioning here that Microsoft is going in the other direction. https://www.mercurynews.com/2018/06/19/microsoft-defends-ties-with-ice-amid-separation-outcry/amp/
In response to questions we want to be clear: Microsoft is not working with U.S. Immigration and Customs Enforcement or U.S. Customs and Border Protection on any projects related to separating children from their families at the border, and contrary to some speculation, we are not aware of Azure or Azure services being used for this purpose. As a company, Microsoft is dismayed by the forcible separation of children from their families at the border.
Maybe I’m missing something, but it seems they are going in the exact same direction…
It’s a very confusing article; my best guess is that they are working with ICE, but not on “projects related to separating children from their families at the border”.
And even if Microsoft isn’t directly helping, they are still helping. That nuance is discussed in OP’s article: any support to a morally corrupt institution is unacceptable, even if it is indirect support.
But that perspective is very un-nuanced. Is everything ICE does wrong? It’s a large organization. What if the company that @danielcompton denied service to is actually just trying to track down violent offenders who made it across the border? Or drug trafficking?
To go even further, by your statement, Americans should stop paying their taxes. Are you advocating that?
ICE is a special case, and deserves to be disbanded. It’s a fairly new agency, and its primary mission is to be a Gestapo. So yes, very explicitly, everything ICE does is wrong.
On what grounds and with which arguments can you prove your statement? I mean, there is probably an issue with how it’s run, but the whole concept of ICE doesn’t sound that wrong to me.
From https://splinternews.com/tear-it-all-down-1826939873 :
The thing that is so striking about all three items is not merely the horror they symbolize. It is how easy it was to get all of these people to play their fascistic roles. The Trump administration’s family separation rule has not even been official policy for two months, and yet look at where we are already. The Border Patrol agent is totally unperturbed by the wrenching scenes playing out around him. The officers have sprung to action with a useful lie to ward off desperate parents. Nielsen, whom the New Yorker described in March as “more of an opportunist than an ideologue” and who has been looking to get back into Donald Trump’s good graces, is playing her part—the white supremacist bureaucrat more concerned with office politics than basic morality—with seeming relish. They were all ready.
I’m going to just delegate all arguments to that link, basically, with a comment that if it’s not exceedingly obvious, then I probably can’t say anything that would persuade you. Also, this is all extremely off-topic for this forum, but, whatevs.
There’s always a nuance, sure. Every police force ever subverted for political purposes was still continuing to fight petty crime, prevent murders and help old ladies cross the street. This has always presented regimes with a great way to divert criticism, paint critics as crime sympathisers and provide moral leeway to the people working there and with them.
America, though, with all its lip service to small government and self-reliance, was the last place I expected to see that happening. Little did I know!
Is everything ICE does wrong? It’s a large organization.
Just like people, organizations should be praised for their best behaviors and held responsible for their worst behaviors. Also, some organizations wield an incredible amount of power over people and can easily hide wrongdoing and therefore should be held responsible to the strictest standard.
It’s worth pointing out that ICE didn’t exist 20 years ago. Neither, for that matter, did the DHS (I was 22 when that monster was born). “Violent offenders” who “cross the border” will be tracked down by the same people who track down citizen “violent offenders”, i.e. the cops (what does “violent offender” even mean? How do we know who these people are? How will we know if they’re sneaking in?). Drug trafficking isn’t part of ICE’s institutional prerogative in any large, real sense, so it’s not for them to worry about. Plenty of Americans have, for decades, advocated tax resistance precisely as a means to combat things like this. We can debate its utility, but it is absolutely a tactic that has seen use since, as far as I know, at least the Vietnam war. Not sure how much nuance is necessary when discussing things like this. That doesn’t mean it’s open season to start dropping outrageous nonsense, but institutions which support or facilitate this in any way should be grounds for, at the very least, boycotts.
Why is it worth pointing out it didn’t exist 20 years ago? Smart phones didn’t either. Everything starts at some time.
To separate out arguments, this particular subthread is in response to MSFT helping ICE, but the comment I responded to was referring to the original post, which only refers to “border security”. My comment was really about the broader aspect but I phrased it poorly. In particular, I think the comment I replied to which states that you should not support anything like this indirectly basically means you can’t do anything.
It’s worth pointing out when it was founded for a lot of reasons: what were the conditions that led to its creation? Were they good? Reasonable? Who created it? What was the mission originally? The date is important because all of these questions are easily accessible to anyone with a web browser and an internet connection, unlike, say, the formation of the FBI or the origins of Jim Crow, which, while definitely researchable on the net, are more the domain of historical research. Smart phones and ethnic cleansing, however, are not in the same category.
If you believe the circumstances around the formation of ICE are worth considering, I don’t think pointing out the age of the institution is a great way to make that point. It sounds more like you’re saying “new things are inherently bad” rather than “20 years ago was a time with a lot of politically questionable activity” (or something along those lines).
Dude, read it however you want, but pointing out that ICE is less than 20 years old, when securing a border is a foundational issue, seems like a perfect way to intimate that this is an agency uninterested in actual security, formed expressly to fulfill a hyper-partisan, actually racist agenda. Like, did we not have border security or immigration services or customs enforcement prior to 2002/3? Why then? What was it? Also, given that it was formed so recently, it can be unformed; it can be dismantled that much more easily.
I don’t understand your strong reaction here. I was pointing out that if your goal was to communicate something, just saying it’s around 20 years old didn’t seem to communicate what you wanted to me. Feel free to use that feedback or not use it.
any support to a morally corrupt institution is unacceptable, even if it is indirect support
A very interesting position. It just requires you to stop using any currency. ;-)
No, it requires you to acknowledge that using any currency is unacceptable.
Of course not using any currency is also unacceptable. When faced with two unacceptable options, one has to choose one. Using the excuse “If I follow my ethics I can never do anything” is just a lazy way to never think about ethics. In reality everything has to be carefully considered and weighed on a case by case basis.
Of course not using any currency is also unacceptable.
Why? Currency is just a tool.
Using the excuse “If I follow my ethics I can never do anything” is just a lazy way to never think about ethics.
I completely agree.
Indeed I think that we can always be ethical, but we should look beyond the current “public enemy”, be it Cambridge Analytica or ICE. These are just symptoms. We need to cure the disease.
Appreciate the honesty here. My take: GitHub stars aren’t real. Twitter followers aren’t real. Likes aren’t real. It’s all a video game. If you want to assess the quality of the code, you have to read it. You can’t rely on metrics except as a weak indicator. I predict there will be services to let you buy GitHub stars if the current trend of overvaluing them continues.
The endless self-promotion and programmers-masquerading-as-brands on Twitter and Medium generate a huge amount of noise for an even larger amount of BS. The only winning move is not to engage.
This is more true than one might think. There are a couple of projects on GitHub with thousands of stars, some with more stars than all the BSDs’ source repositories combined, promising something amazing while not even having a working proof of concept, and completely abandoned.
However, since it is true (to some degree) that a larger user base historically means you won’t end up maintaining a project on your own, it’s easy to be fooled by anything that appears to indicate a large user base, like GitHub stars.
Many people use GitHub more like a “might be interesting, let’s bookmark it” or “Wow, so many buzzwords”, etc.
On the other hand, there are quite a few projects that do one thing and do it well, written to solve a problem, with 0-10 stars.
One might think these are extreme cases. They are extreme only in the sense that 0 stars is the extreme of not being able to have fewer; they are not rare cases.
Another thing to consider is that GitHub is built a lot like a social network, so you have network effects: people follow other people, and one person liking something puts it in timelines, causing others to like it to remember to look at it later, or “in case I need this some day”, and so one ends up with these explosions. Hacker News, Lobsters, Reddit, etc., and in general having someone mention it to a bigger audience, can help a lot too, even if it’s just “I have heard about this, but not looked at it yet”. It’s similar to how the same story gets zero upvotes on one day, and hundreds or thousands on another.
The rest is probably rooted in human psychology.
Spot on. On top of the detrimental “programmers masquerading as brands”, many GitHub repos are heavily marketed by the companies behind the projects. Covert marketing might be more common than people think.
Corporate OSS is winning the mindshare war. Plenty of devs would rather use a massive framework by $MEGACORP instead of something simple that doesn’t box them in. Pragmatism, they say.
(Of course, they don’t think twice about pulling in a community-sourced standard library (JS).)
Favorite example of this was a CTO talking about how they used Sinatra instead of Rails for their API endpoint and the flood of surprised replies, “but what if you need to change feature X?”, to which he said, “well, we understand all of the code, so it’s no big deal. Can you say the same about Rails?”
But why does everyone blame npm and “micro-libraries” as the main problem in JS? Don’t all other languages (except C/C++) have the same way of dealing with dependencies? Even in conservative Java, installing hundreds of packages from Maven is the norm.
Something to consider is that JavaScript has an extreme audience. People who barely consider themselves programmers, because they mostly do design, use it, as do people just making tiny modifications. And nearly everyone building a web application, in any kind of language or framework, uses it.
I think the reason there is so much bad stuff in JavaScript is not rooted only in language design. JavaScript isn’t so much worse than other popular bad languages; it just has a larger base with even more horrible programmers, and a lot of them also build some form of framework.
Don’t get me wrong, JavaScript is not a great language by any stretch, but the ecosystem of a language that certainly has at least a few bright minds designing and implementing it (people working at/with Google, Mozilla and Joyent, for example) should not turn out so much more unstable.
Of course this doesn’t mean that it’s not about the language at all either. It’s just that I have yet to see a language where there isn’t a group writing micro-libraries, building bad infrastructure, following worst practices, and finding ways to work around protections meant to stop you from shooting yourself in the foot. Yes, that exists even in Python, Rust, Go, Haskell and LISP.
Maybe it’s just that JavaScript has been around for ages. Many learned it to do some animated text and wrote up how they did it, so there is a ton of bad resources from people who never really learned the language, and there are a lot of users/developers who also don’t care enough; after all, it’s just front-end. Validation happens on the server, and one only wants to send off some form, load something with a button, and update some semi-global state anyway.
JavaScript is used by everyone from people programming services and systems with it (Joyent, et al.) to hobby web designers. I think these different approaches also lead to very different views on what is right and what isn’t. Given how it started, reacting to its move into backend, application and even systems programming is probably a hard task for the standards committee, and it’s probably a great example of how things get (even) worse when something tries to be the perfect tool for everything, resulting in the worst.
On a related note: I think the community, if you can even call it that (there are more communities around frameworks than around the language itself, which is different from many other scripting languages), doesn’t seem to look at its own history much, resulting in mistakes being repeated, often “fixing” one thing by breaking another, sometimes even across different framework layers. For example, some things people learned were bad in plain JavaScript and HTML get repeated, and are later found to be bad in some framework too. So one starts over and builds a new framework working around exactly that problem, overlooking others, or intentionally leaving them out because they weren’t part of the use case.
there are more communities around frameworks rather than the language itself, which is different from many other scripting languages
In general I tend to agree, but at least some time ago, I am pretty sure, the Rails community was larger than the Ruby community. The Django community in Python also seems to be quite big and vocal, but probably not larger than its language community, given that the Python community is overall far more diversified and less focused on one particular use of the language.
A lot of Java frameworks predate maven - e.g. Spring was distributed as a single enormous jar up until version 3 or so, partly because they didn’t expect everyone to be using maven. I think there’s still a cultural hangover from that today, with Java libraries ending up much bigger than those in newer languages that have had good package management from early on (e.g. Rust).
Even including all transitive libraries, my (quite large) Android app Quasseldroid has 21 real dependencies. That’s for a ~65kLOC project.
In JS land, even my smallest projects have over 300 transitive dependencies.
It’s absolutely not the same.
In technical terms, npm does not differ much from how Python does package management. Culturally, however, there is a big difference in how package development is approached. JavaScript has the famous left-pad package (story). It provided a single function to left-pad a string with spaces or zeroes. Lots of JavaScript libraries are like it, providing a single use case.
Python packages, on the other hand, usually handle a whole series of cases or a technical area: HTTP requests, cryptography or, in the case of left-pad, string manipulation in general. Python also has PEP 8 and other community standards that mean code is (likely to be) more homogeneous. I am using Python here as that is what I know best.
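To make the contrast concrete, here is essentially everything left-pad offered, expressed as one call to Python’s built-in `str.rjust` (the `left_pad` wrapper name is mine, for illustration; it is not the npm package’s actual code):

```python
# The entire behaviour of npm's left-pad, in one stdlib call.
# str.rjust pads on the left with a fill character up to the given width.
def left_pad(value, width, fill=" "):
    """Left-pad value with fill characters until it is width long."""
    return str(value).rjust(width, fill)

print(left_pad(5, 3, "0"))   # 005
print(left_pad("abc", 5))    #   abc
```

That an ecosystem shipped (and mass-depended on) a package for this, rather than reaching for a built-in, is exactly the cultural difference being described.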
Doing packaging for Debian
thanks!
Who said packaging is a thankless job? You are welcome!