It looks similar to TSL from David Simmons.
Unfortunately the only information I can find is in a proprietary format.
What kind of information are you looking for? Here’s the github repo if you are interested.
The Learn page of the website has lots of info including the language specification.
Is it possible to provide a homebrew installation method on the Mac? Ideally installing ballerina should be as easy as:
$ brew cask install ballerina
Edit: it appears I spoke too early. It is available on homebrew, but the docs don’t reflect that
$ brew info ballerina
ballerina: stable 0.970.1
The flexible, powerful and beautiful programming language
https://ballerina.io/
Not installed
From: https://github.com/Homebrew/homebrew-core/blob/master/Formula/ballerina.rb
==> Requirements
Required: java = 1.8 ✔
I have just found this: http://brewformulas.org/Ballerina
Have not tried it myself yet (I am on a Mac, but I just used a zip and set up the paths manually; I know there is also an official Mac installer .pkg download from https://ballerina.io/downloads/)
And another one: https://formulae.brew.sh/formula/ballerina
Thank you. I spoke too soon. It is available in homebrew (brew info ballerina). The docs should be amended to include this information.
It is a very fast-moving project with lots of community involvement. By the way, the whole site and all the docs are on GitHub too (not just the code itself) - so feel free to send a pull request whenever you see anything incorrect or missing.
Sent a PR
It’s not listed in the features and I can’t load the demo… Does this support receiving all forum messages by email and replying by email? I consider those the killer features that set Discourse apart from other forums.
You know the drill: gather sample links that would benefit from having the tag, then make a meta post.
Duly submitted.
Thank you (and the admins) for doing this. D now has a tag: https://lobste.rs/t/d
I’m not sure why D isn’t more popular. My guess is some teething issues back in the day limited its growth:
From what I understand, these have all since been resolved. It will be interesting to see if D usage picks up, or if those early speed-bumps made too much room for other (newer) languages to pass it by (Go, Rust, Swift).
For me, D is known and perceived just as a “better C++” (Alexandrescu is one of the brightest evangelists of D, and his C++ past does not help with the language image) and I do not want a better C++. I do not want another deeply imperative programming language that can do some functional programming accidentally. I do not want another language born from the culture of obsessive focus on performance at the expense of everything else.
What I want is a ML-ish language for any level of systems programming (sorry, D and Go: having GC is a non-starter) with safety and correctness without excessive rituals and bondage (like in Ada). Rust fits the bill: it’s explicitly not functional, but has strong safety/correctness culture.
Precisely because of the lack of GC and the focus on lifetimes, Rust is much more similar to (modern) C++ than D will ever be. Writing Rust is like writing correctly written C++.
D, having a GC, leads to different programs than C++ or Rust, because the GC acts as a global owner for resources that are only memory. E.g., slices carry no ownership information in the type system. This makes scripting very friction-less, at the cost of some more problems with non-memory resources. But not at the cost of speed.
D has @safe which is machine-checked, opt-in memory safety.
Thanks for the clarification, indeed I had a slightly wrong impression about the D programming style and ignored the profound influence of garbage collection on the programming style.
Still, everything that I learn about D irks me (native code with GC by default? metaprogramming with native code, without dynamic eval? opt-in @safe?!) and feels too much like the old C++ culture with its insensible defaults.
native code with GC by default?
This is why Go appealed to so many people, isn’t it? This is the “new normal”. (Of course, OCaml etc. had this before.)
dynamic eval
This is probably a bridge too far if you are appealing to people from a C++ background (targeting C++ programmers as the audience is a bad marketing strategy for D, IMHO).
Opt-in @safe.
I agree with you on this.
feels too much like the old C++ culture
Well, the two main D architects are old C++ hands after all!
I think I addressed at the very top the current biggest obstacles to D adoption. I encounter them often when I try to excitedly discuss D with anyone.
I agree with all those points.
I was more thinking about folks skipping over D in the past, and how that potentially limited its uptake, than from the perspective of a D programmer looking at the current popular trends. Certainly an interesting perspective though. Thanks for sharing!
Indeed all those points have since been resolved. What hasn’t been resolved is that there isn’t a simple message to give as a marketing motto, since D tends to check all the boxes in many areas.
I think it should go back to its roots, which is why I happen to like it: “D: the C++ you always wanted.”
Very true!
Also, I must be one of the few people who learned C++ and never learned C. “C/C++” has always irked me, as if the two were easily exchangeable, or worse, the same language.
I’m really curious about the background of Nim users. I imagine it’s more folks coming from a Python/Ruby/Smalltalk background as compared to a C/C++ background, but I have no actual basis for that assumption.
If you are an active Nim user, I’d love to know your background and why Nim appeals to you.
I’m a Python user, slowly trying to replace Python scripts with Nim. Nim’s standard library is pretty decent. Unless I’m doing something highly specialized (e.g. I’ve not tried to do any AWS automation in Nim), Nim is often quick enough to write with the same amount of effort I would have spent writing Python.
I have 3 primary questions:
FWIW, I’m a polyglot programmer and I really like Nim but the only thing I’ve used it for is the Advent of Code puzzle solutions last December. It’s a lot of fun to program with and I like that it compiles to a single binary with no deps like Go.
Many of the links are broken. Example: Exploring Math
edit: they are working now. Someone from jsoftware is watching. Thanks, whoever that is!
“View history” shows Roger Hui is on it.
It seems that the link to vector.org.uk is dead: Iverson, K.E., APL in the New Millennium, Vector, Volume 22, Number 3, 2006-08.
This is a great post. It posits an idea “Let’s treat Bash development as software engineering, with all the discipline of any other programming language”. It presents the meat of this idea, and that’s all.
Definitely. I hope the Oilshell guy sees this and thinks about this as part of his improvements to shell.
I knew about (and regularly use) shellcheck.net. It is a wonderful tool to learn about bash and fix potential bugs. Oilshell integrating (at least some form of) shellcheck’s linting capability and shfmt (automatic formatter) would be great UX wins.
FWIW I wrote a little bit about ShellCheck in this comment: https://www.reddit.com/r/oilshell/comments/7fjl5t/any_idea_on_the_completeness_of_shellchecks_parser/
Skip down to where I say Oil was “negatively inspired” by ShellCheck. In summary, I’ve used ShellCheck, but I don’t like how many false positives it gives. Oil’s approach is to instead design a language where the obvious thing is not wrong, and where a nicely written program doesn’t produce lint errors on every other line.
Also, Oil should be more statically analyzable, so diagnostics should be more accurate. For example, statically resolvable imports would make a lot of errors more accurate.
Although keep in mind this is the goal, not something I’ve done yet. I still need to reach feature/performance parity with the bash clone before really tackling the Oil language. As for shfmt, I definitely want to have an Oil language formatter, but that’s also future work.
“But we want your cake now!!” :) Seriously though. I really appreciate your work with OilShell. I hope your experiment is very successful, and once it stabilizes a bit more I hope to use it as a replacement for Bash. I’m excited about your work in fixing up our vegetables, so we can have nicer cake. And I’ve probably run that analogy into the ground now, so I’ll shut up :)
Another popular tool for this kind of conversion is pandoc .
Install: brew install pandoc or apt install pandoc
Use: pandoc https://daringfireball.net/projects/markdown/ -o markdown.md
Recently I discovered yasha for similar uses. But, your render library appears to have more features than yasha!
You could try installing it using cabal or stack (I have not tried either on OpenBSD).
You may also want to check out commonmark, by the same developer as pandoc - jgm.
I like the simplicity of the cmark tool, but I wish it supported page metadata, such as linking the output to a stylesheet and setting the page title, which is one reason I’ve settled on pandoc.
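For what it’s worth, pandoc’s standalone mode covers both of those wishes. A sketch (the filenames here are made up; the flags are pandoc’s standard `-s`/`--standalone`, `--metadata`, and `-c`/`--css` options):

```shell
# -s wraps the fragment in a complete HTML document,
# --metadata title=... sets the page <title>,
# and -c links the given stylesheet in the <head>.
pandoc -s --metadata title="My Page" -c style.css input.md -o output.html
```
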
D: familiar, yet far more expressive and reusable. auto foo = [1,2,3] - light on type annotations in some ways.
If one wants a “High Level C”, they should probably look at D with the “-betterC” flag, which allows you to use high-level language features without the “penalty” of a runtime and GC.
See: http://www.active-analytics.com/blog/interface-d-with-c-fortran/
“however at the time of writing this article, the flag is partially implemented and removes only module information.”
So does it actually turn off the runtime and GC?
I really want to give pijul a try, but unfortunately I cannot install it using cargo ): (I get an error when it tries to compile sanakirja)
You may have to use the nightly builds of rust. I was able to install pijul with cargo.
$ rustc --version
rustc 1.17.0-nightly (b1e31766d 2017-03-03)
It appears I had previously built and installed rust manually, so… my mistake, sorry. And thanks for the hint!
I mostly agree that plain HTML is a great way to go, and letting the browser display semantic markup however it likes (ideally well, and according to user preferences) is right.
But setting max-width to 800px? That jumps out as a very bad idea.
It’s pretty subjective, but I just picked it as a number that’s roughly in the range of what most text-oriented sites seem to use. Medium uses 700px, for example. My linked blog above uses a slightly wider 900px. BBC News uses a bit narrower 600px (not counting the right sidebar).
I like limiting it to 52em^H 40 em. That way the printed page looks exactly like the web page. See for example: https://www.btbytes.com/notebooks/nim.html
The BBC News article body width has had a surprising amount of thought and effort put in. The intention is to strike a balance between readability (~80 characters max per line) and aesthetics (too much whitespace on either side can look strange).
The choice of 645px has worked well for the last several years but is too narrow on high resolution displays. We’re going to be rebuilding the article pages later this year and I hope we will take the approach of using rems instead of pixels, as mentioned by @btbytes.
I think it’s better to specify a max-width in ems, because that naturally accommodates people with vision issues who’ve increased their font size to cope (and some older people apparently make fairly drastic font enlargements). How many ems it should be, I don’t know. I’m currently using 45 ems and the result seems okay, but I picked it more or less out of the air.
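Concretely, the difference is just the unit on max-width; a minimal sketch (45em being the value I mentioned, the centering a common companion choice):

```css
/* Scales with the user's font size: enlarged text gets proportionally wider lines */
body {
  max-width: 45em;
  margin: 0 auto; /* center the column on wide screens */
}

/* By contrast, a pixel value stays fixed, so enlarged text
   yields fewer and fewer characters per line:
   body { max-width: 800px; } */
```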
Hmm, that’s a good point. I didn’t realize it’s common for people to set the font size explicitly. (I know browsers support user-specified CSS, but thought of it as basically a “strictly for programmers” feature.) I personally like a larger font size than most pages have by default, but I use the zoom function for that instead of specifying fonts in user CSS, which is also what the elderly people I know do. From some testing, specifying max-width in ems or pixels behaves the same w.r.t. zoom. But there’s no real reason not to use ems afaict, so I might switch to that if there’s an advantage.
This is interesting. I just did a test in (desktop) Firefox and Chrome, and they behave differently. In Firefox as I have it set (which is to not zoom images, just text), zooming does not change a max-width set in px, so the text on your page doesn’t get wider. In Chrome, zooming does apparently increase the max-width even when it’s set in px, so the text on your page gets wider, eventually going up to a full screen width. Given the difference in behavior it seems worthwhile to set the size in em.
Odd. I was also basing my comment on testing desktop Firefox and Chrome (on OSX), which with the default settings for me both do zoom width on my blog post, and with seemingly identical behavior whether you specify px or em. I wonder if it’s your don’t-zoom-images setting for Firefox that turns off resizing of all pixel-specified areas' sizes, not just images per se? I don’t see an option for that in the prefs; is it one of the ones behind about:config?
I’ve been persuaded that some max-width is a good idea. Some number of people do browse with their browsers in full-screen mode on wide (or absurdly wide) screens, and if you don’t limit your site’s width you get really long lines of text that are hard to read. It annoys me to have huge amounts of empty space that could be filled with something useful, but so it goes.
(I prefer to center the text in an over-wide screen rather than leave it at the left side, but that’s a taste issue. I think it looks better in the full-screen browser scenario, and it puts the text hopefully straight in front of the person, if they’ve centered themselves in front of their screen.)
I like the long lines of text. I would ask site makers to please please find a way to let me have the long lines when I want them. If motherfuckingwebsite can manage to make it work then so can you.
In typography a traditional line length is 2-3 alphabets long, i.e. around 60-70 characters. This usually falls well below 700px, so web designers are already more generous than, say, magazine or newspaper layout designers - but having a line length that’s too long results in reader fatigue from too much left-right motion and not enough vertical.
The effect of the fatigue is testable and measurable - so no doubt news websites that profit from people clicking through multiple stories will try not to strain the viewer’s eyes after they read their first article.
Repeating my comment from HN: I continue to be astonished that D gets left out of these discussions. It’s more capable than Go and easier to use than Rust, which would seem to make it the sweet spot for a lot of the people inclined to compare the two. (That said, every time I’ve tried to use the language, the tooling around it has been a miserable experience. That might have changed in the last couple of years.)
D was (maybe it fixed all these by now, haven’t looked in a few years) a complete mess. Multiple standard libraries, GC was in this weird sort of “being made optional, but not working yet”, and it tried to be universal and never found a niche. Basically, the issues around GC and the standard libraries made it an untrustworthy community in the eyes of many developers.
The standard library split, at least, was resolved about 5 years ago, with the outcome that there’s one standard library (the one bundled with the compilers), with appropriate changes made so that the former “alternate” standard library could be reworked as a third-party extension library on top of it, rather than an alternative. So they now have roughly the relationship that, say, the C++ standard library and boost have.
This is not useful. “few years” is a long time. If I were to talk about 2010 Go, I’d be saying the same things about “the issues around GC and the standard libraries made it an untrustworthy community in the eyes of many developers.” and not be wrong.
D was released in 2001; when I explored it, it was close to a decade old and still a huge mess. In 2010, Go was about a year old, and the GC and stdlib were already well fleshed out, and while both have been improved since, neither has broken or changed. Go has a core value of stability and backwards compatibility, which I believe has been a cornerstone of its growth (see: https://golang.org/doc/go1compat).
So, I disagree with your points on two fronts, (1) Go had decided on GC and the stdlib by 2010 [actually by 2009] and (2) D was 10x as old at the point I looked at it and STILL lacked stability on EITHER of those fronts.
Hard to shake a bad reputation. I have barely looked at D, but the general vibe I get is that people don’t like it.
In addition to the library issues mentioned elsewhere and unsearchable name, wasn’t there some funky licensing history? I imagine that’s probably resolved by now, but at the time I remember seeing it wasn’t properly open-source and so figured it was best ignored.
Yeah, from what I recall the reference compiler (dmd) was free-as-in-beer, but not free-as-in-speech, but there was an open source front-end to which people hooked up a GCC backend. It was still a bit funky, but not unviable. (As opposed to, e.g., Shen, which is a great language whose community was essentially stillborn due to bad licensing.)
To use actual C as a scripting language, check out TCC!
TCC can also be used to make C scripts, i.e. pieces of C source that you run as a Perl or Python script. Compilation is so fast that your script will be as fast as if it was an executable. You just need to add
#!/usr/local/bin/tcc -run at the start of your C source:
#!/usr/local/bin/tcc -run
#include <stdio.h>
int main()
{
printf("Hello World\n");
return 0;
}
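Assuming the script above is saved as hello.c (a hypothetical filename) and tcc is installed, running it is just:

```shell
chmod +x hello.c   # make the C source executable like any script
./hello.c          # the shebang hands it to tcc, which compiles and runs it on the fly
```
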
The output of compiling Little is Tcl code. See http://www.little-lang.org/why.html
I’m late, but GCC works fine too, works for CGI nicely, here are two one-liners:
» cat demo.sh
#!/opt/misc/chax
void main(void){printf("Content-Type: text/plain\n\n");printf("Hello World!\n");}
» cat /opt/misc/chax
#!/bin/sh
(TMPO=`mktemp`;sed -n '2,$p' "$@"|gcc 2>/dev/null -std=gnu99 -pipe -O2 -x c -o $TMPO - &&$TMPO 2>/dev/null;rm -f $TMPO)
It looks a lot like Python at first glance. Are there any significant differences between the two?
Statically typed, unlike Python. Compiles down to a binary (via C code generation), unlike CPython - and can even compile to JavaScript.
Gotcha, for some reason didn’t notice you could compile binaries with it. I’ll be looking into this.
Really nice work! I’m definitely gonna look into these posts next weekend. Also, thanks for not focusing on the REPL.
What kind of package manager does OCaml have / how do you manage development dependencies?
Opam is OCaml’s package manager. You can “switch” between different OCaml versions and install packages under each version.
Project dependencies are managed using the Dune build system.
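A typical first session looks something like this (the compiler version and package names are just examples; the commands are standard opam/dune usage):

```shell
opam switch create 4.14.1   # install an OCaml compiler version and switch to it
eval $(opam env)            # point the shell environment at the new switch
opam install dune utop      # install packages into the current switch
dune build                  # build a project whose deps are declared in its dune files
```
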