1. 27

Is there anything that …

  • creators should have done from day 1, but is still not there yet?
  • got gradually worse over the years?
  • frustrates you, because a fix would be easy, but nobody seems interested in fixing it?

Please, share your stories and experiences!

  1. 13

    JavaScript. It’s not “mine” in that I don’t really have any affection for it, but it’s most of what I do for now.

    The problems? Less and less of the real meaning of any program is visible to me as time goes on. There’s so much implicit in the runtime namespace and semantic environment that has no visible connection to the code you’re editing. We keep adding tooling that allows us to more easily express our intentions for a given volume of code. But when the result is not what we expected, it’s becoming harder and harder to know where to make adjustments.

    The constant drive towards interchangeability, compartmentalisation, and composability currently (and I think maybe always) causes all the interface points between the parts to trend towards the lowest common denominator. This is a Darwinian process driven by a positive feedback loop, and it does have some benefits: it allows us to evolve the individual parts.

    I can change from webpack to browserify to rollup with relative ease, because they do a specific job with a dumb interface of plain text source code. This competition leads to them being better than they otherwise would be.

    But the problem with this approach is that there’s no feedback from your “bundler” to let Babel know the rules about module resolution, so that it can in turn make better decisions, nor is there a way for it to let Flowtype know about what you’ll actually get from an attempted import, should the bundler be doing anything remotely out of the ordinary, like Facebook’s globally-unique-filename-modules, or webpack’s stupid “import css” chicanery. There’s no way for rollup to get access to all the information Flowtype has collected about the entire call graph to let it make better decisions about what can be thrown away.

    Don’t get me started on god-damned testing “libraries” that require special launchers and fill the global namespace with garbage so that your tests can “read kind of like really poor English”.

    And when something goes wrong? You might need to simply locate and fix a typo in an import, that for some reason wasn’t brought to your attention by literally 3 different tools which each parsed and analyzed your entire program during build. Or, you might need to make a change in any one of 6 different .config files in any number of locations, all of which are poorly documented, none of which will tell you when you’ve put a typo in there – they simply carry on as if that part of the configuration does not exist.

    And when you’re looking at a 60-thousand line file of transpiled code, glommed together with everything else dragged in via 700 transitive dependencies, and you need to diagnose and fix the fact there are two slightly different versions of React in the one file with no information about how they got there

    …you might wish you’d been a carpenter. As I keep saying, I’m not angry… just disappointed.

    1. 4

      The TL;DR of what you said in my own opinions:

      A good lang lets developers write good software, not novice developers write any software.

      Compiling JS seems like a hilariously ridiculous exercise that gets in the way of debugging, deploying, and compatibility. Note: Bundlers/etc like browserify are reasonable.

      Writing tests with “English” is nonsensical, English is horribly contextual and nuanced.

      People seem to use so many tools that they can’t just write code. If you don’t know explicitly what your tool is doing, don’t use it. I know what a hammer does, so I’ll use it. I don’t know what entomology forceps are for, so I don’t use them. Don’t buy a hardware store if all you need is a screwdriver.

      1. 5

        People seem to use so many tools that they can’t just write code.

        Correct. Many developers seem to not want to be programmers, but rather technicians, who assemble pre-made parts and then (endlessly) diagnose the contraption they’ve made: “but, this is the fastest HTML templater right now, it’s worth all this trouble!”

        There’s a real element of consumerism here, too. Part of me suspects that lots of business problems are boring and ill-specified, which causes people to seek out intellectual stimulation elsewhere, even if it’s in problems of their own making.

        1. 4

          A good lang lets developers write good software, not novice developers write any software.

          That has to be one of the more insightful lines I’ve ever read about programming languages.

      2. 13

        Haskell: the barrier to entry is becoming higher and higher, as GHC becomes more and more complicated. A lot of modern Haskell code requires familiarity with 10 different LANGUAGE extension pragmas. Real World Haskell is now 8 years old, so there isn’t a good authority on what packages we should use for what. Often, the best solution is to ask on #haskell. A lot of the material to learn advanced Haskell is buried in Stack Overflow answers and blog posts, but this is getting better. The tooling isn’t great either, and we don’t really have any usable IDEs (doesn’t bother me, but many would see this as important). Another thing I see as problematic is how Haskellers love to be as terse and general as possible: if you can’t resolve the types yourself, you’re screwed even if you just want to use the library and not care about how it works. This is made even more difficult with a huge monad transformer stack, and the compiler output is basically unreadable.

        1. 8

          Scala: Leadership doesn’t care at all about documentation for beginners and basic getting-started tutorials.

          1. 2

            Tried to learn Scala just to throw together a PoC web app and this is the issue I encountered. Lots is written about Scala, little of it helps you hit the ground running with a toy app. I might return someday. Is it worth it? I can’t tell if Scala is trending up or down.

            1. 3

              It is certainly trending upward, heavily, and is absolutely worth learning even if you never use it for anything. The biggest thing Scala teaches is that all the terribleness in languages that we have grown to accept is not some god-given fact: not all languages are equal or “just have issues in different parts”.

              Which makes it even more infuriating that people in charge can’t get their shoes together.

              Example: Scala.js is one of the best, if not the best compile-to-JS languages out there. Yet after all the years of success stories, there is not a single mention of Scala.js on scala-lang.org.

              If I had to guess what could be Scala’s downfall, it certainly would be complete disregard of beginner issues, the disinterest of communicating coherently with developers and the complete disaster that is anything related to PR and marketing the language.

          2. 7

            SML: Its biggest problem is that it is a dead language.

            Which is frustrating, because it’s really quite an excellent language, and it is unique in being a language which has been formally proven.

            If you’re not familiar with it, I like to describe it broadly as SML is to Haskell what C is to Java. That is, there are lots of syntactic similarities, but where Haskell is sprawling and featureful, SML is small enough to fit inside your head (and also a bit more primitive). They’re both typeful functional languages, but SML is strict and impure, as opposed to lazy and pure. But it’s a compact, beautiful language.

            Now, you might say “SML sounds great, but can’t you just use ocaml?” Well, sure. But it’s not the same. ocaml isn’t proven, isn’t nearly as clean, doesn’t handle packaging as well, has some ill-conceived extra functionality (looking at you, weird OO subsystem), and has only a single compiler that isn’t as good as SML compilers. But, you know, if that’s your thing …

            But these are all ramblings about a dead language. Could it be non-dead some day? Unlikely. I don’t think anything fills its niches as well, but there are plenty lively languages which fill them well enough.

            How did it get dead? Personally, I think it has to do with Robin Milner, the man who wrote the original ML (you may recognize the name from the Hindley–Milner type system). Milner was also strongly involved in the SML project, and accorded a great deal of deserved respect. He felt that SML was complete after the first revision (finished in 1997), and refused to authorize any further revisions. Once he passed away in 2010, it was already too late. The internet boom had produced a programming boom, and a language boom, and SML had not taken part at all. It was dead.

            But, you know, I still love it. And since it’s frozen in time, and mostly unused, I’ll never have a reason to not love it. :)

            1. 2

              ocaml isn’t proven


              isn’t nearly as clean, doesn’t handle packaging as well


              has some ill-conceived extra functionality (looking at you, weird OO subsystem),

              Everyone I know who uses OCaml has a fine time just pretending it doesn’t exist.

              and has only a single compiler that isn’t as good as SML compilers.

              It has two compilers in the mainline distribution if you don’t count experimental stuff like jocaml.

              Anyway, I sympathize because I really like the notion of “small enough to fit in your head”.

              1. 3

                Yeah, as /u/jfb suggests, I was using “isn’t proven” in a formal sense. It’s certainly been successfully used in production.

                As for compilers, I was thinking of ocamlc and ocamlopt as the same compiler, emitting to different targets. Jocaml doesn’t count, as it’s extended ocaml enough to become its own language. If there are other compilers for ocaml, I haven’t heard of them.

                SML, on the other hand, had a dozen or more compilers in its heyday, and still has a bunch that are relatively active: Poly/ML, SML/NJ, MLton, mosml, off the top of my head. MLton is notable for being a whole-program compiler.

                Anyway, clearly I can go on all day about SML.

                1. 1

                  SML has a formal specification.

              2. 6

                Vala. My favorite pet project is in Vala so I’ll start with it.

                1. My biggest complaint about Vala is that I think it’s dying, and as a single person there’s nothing I can do to stop it, no matter how much code I turn out. One of its best contributors (Luca Bruno) recently left the project, and as far as I can tell the creator never really contributes beyond issuing releases.
                2. There’s also a complete lack of tooling. No good IDEs. It’s a second class language even in Gnome-centric IDEs and Gnome is supposed to be the primary use of the language. Debugging of any kind, IMO, is a PITA mainly due to the fact it compiles to C before going to GCC.
                3. I’ve never honestly been a huge fan of languages transpiled to another language before being actually compiled. I think this causes more headaches than it’s worth, and Vala absolutely demonstrates that with the insane number of compilation errors and warnings that GCC spits out.

                Python. At work I use Python.

                1. The fight between 2 and 3 drives me nuts. 99% of Stack Overflow examples are written in 2, which isn’t a huge problem in that I can almost always translate them, but it’s still a PITA.
                2. The GIL naturally.
                3. I don’t see a lot of complaints anymore floating around the net, but I think python packaging is a disaster. I have no idea what the difference between pip and pip3 is?? I have to use some packages that supposedly work with 2 and 3 but end up only working with one or the other.
                1. 2

                  Vala is awesome

                2. 6

                  Python: The language is being packed full of crap that happens to be fashionable without thought to whether it really fits well into the language. Type hints are the worst example of this (see the now-infamous thread on python-dev), but the new syntax for async/await is also a bad one: changing the very core of the language for a foreign programming style that is likely to be useful to perhaps 5% or 10% of programs, when there was already (imho) acceptable library-level support for it.
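                  For what it’s worth, the library-level style being alluded to can be sketched with plain generators and a toy round-robin scheduler. This is an illustrative sketch, not asyncio’s actual machinery:

```python
# Toy cooperative scheduler driving plain generators -- the library-level
# style of concurrency Python had before async/await syntax was added.
def task(name, n):
    for i in range(n):
        print(f"{name} step {i}")
        yield  # hand control back to the scheduler

def run(tasks):
    tasks = list(tasks)
    while tasks:
        t = tasks.pop(0)
        try:
            next(t)          # resume the task until its next yield
            tasks.append(t)  # still alive: requeue it
        except StopIteration:
            pass             # task finished

run([task("a", 2), task("b", 2)])
# Interleaves: a step 0, b step 0, a step 1, b step 1
```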

                  Meanwhile the far more common problem of decent concurrency remains unsolved in Python. I’d much rather see a decent threading implementation that used CoW for an efficient actor model without the GIL (having no shared mutable state should make the existing objections to removing it obsolete) — though PyPy’s work on STM does look very interesting, but I doubt it will make it into CPython.

                  Also, still, fragmentation. The sooner we see the back of Python 2 entirely the better.

                  1. 2

                    You may find http://pyparallel.org/ interesting, which gets multicore scaling without removing GIL. It is currently Windows-only though.

                  2. 5

                    Scheme: I wish there were more batteries out of the box. Every time I get started on a new Scheme project, I end up writing a lot of dumb util functions (for-i, vector manipulation functions, etc). I really like Ruby’s standard library and want something like that put in the base stdlib (having to remember all the srfi libraries isn’t particularly pleasant either). I’m not going to use Common Lisp, because it has its own weird set of issues, and seems a bit too bloated to really be “lispy”. I really wish Arc had taken off.

                    I’m currently working on a non-standard-compliant Scheme thing (and I’m planning on making good interoperability with C, so bindings will be easy). If it ever gets finished, it’ll hopefully solve this problem.

                    1. [Comment removed by author]

                      1. 1

                        I’ve been meaning to check out racket a bit more. It’s still a bit large for what I want, but I’ll probably draw some inspiration from it.

                    2. 5

                      I love Golang, not least because of the standardization via the likes of gofmt; it makes any codebase very accessible.

                      However, to contradict myself regarding standardization, what I dislike is the GOPATH hassle; it’s frankly a PITA to have all my Go code structured in a place away from all my other repos. For years I had a directory with all my repos underneath it which made pathing easy, but with Go I need to maintain the whole Go source tree hassle, e.g.

                      $ tree Code/Go/src/ -L 1
                      ├── 9fans.net
                      ├── bitbucket.org
                      ├── code.google.com
                      ├── github.com
                      ├── golang.org
                      ├── gopkg.in
                      └── honnef.co

                      This makes navigating around the filesystem to work on files super cumbersome. Surely there must be a way of using GOPATH to find all the libraries Go needs for compilation, without making the programmer adhere to this structure.

                      It’s about the only thing I can think of that I dislike; unfortunately it’s jarring since it’s something a new Gopher discovers right at the beginning of trying out the language. Anecdotally, I’ve seen more than one programmer walk away from the language right at this stage.

                      1. 2

                        I gotta say, my happiness with the GOPATH crap shot way up as soon as I configured my .bashrc to always set GOPATH to PWD. Now, my go incantations always assume the current directory is the root of the GOPATH and oh man, so good.

                        ~ $ cd $(mktemp -d)
                        /tmp/tmp.65pcSFeA1s $ go get -v github.com/whatever


                        ~ $ cd whatever/project
                        ~/whatever/project $ ls
                        ~/whatever/project $ go install .../name
                        ~/whatever/project $ ls
                        bin pkg src
                        1. 1

                          For the curious:

                          prompt_setup() {
                            export GOPATH=$PWD
                          }
                          1. 1

                            Interesting; thank you for sharing that. I’ll play with this setup and see what you’re getting at.

                            Much appreciated! :-)

                        2. 5

                          There’s a lot of friction in using F# with WPF. It’s easier to just write a C# GUI which calls an F# library.

                          1. 1

                            How do you like F# as a language? I haven’t tried it yet but would like to. Is it like Haskell?

                            1. 2

                              It’s basically OCaml.Net. Defaults to immutability, but mutability is available. Has algebraic data types and pattern matching, but there is also an object system. I enjoy this because I can write code in a functional style, but there’s an imperative safety hatch.

                              OCaml and F# have slightly different object systems, because the latter was designed for C#/.Net interop. Also, it doesn’t have OCaml’s “functors”, which are generic/parameterized modules. F# does have “type providers”, but I haven’t looked into them very thoroughly.

                              Overall I’m a big fan of the language. It feels very pythonic to me. The code is readable and inferred static types make for nice error messages.

                          2. 7

                            Well, something annoying is all the new languages that are missing old tricks:

                            • Arrays are great in array languages, but dogshit everywhere else. Most languages implement some kind of hashmap or linked list, or tree-vector instead of a regular old array. What is so hard about defining a+b to be something useful instead of an error?
                            • Memory management looks finally solved by rust (and the new C++ finally has && move semantics hooray), in almost exactly the same way Array languages have been doing it for over fifty years.
                            • Messaging. Erlang and KDB have actual messages, which are better than generators and promises, while being almost as expressive as callable continuations (and, with linking, none of the leaks). “Stringifying” a data structure is important, but too few languages work on it. When Erlang did this for messaging, they also got logging cheap (KDB gets it with a command-line switch, wow!)

                            However these aren’t the things that really bother me: These are things that are all solved in my favourite languages already. The things that frustrate me, are things where a fix won’t be so easy as to just look around:

                            • IO is still done painfully in most languages. “nodejs” (if you consider it a separate language) at least thought about it, and came up with the wrong answer, but languages like Python and rust are just stupid for not dealing with this despite being new (or at least, recently refreshed). The Windows IO model is better, but the Unix one won – even in Microsoft’s own languages…
                            • Error handling is done wrong in every language except common lisp, where it’s simply done-wrong in practice (which isn’t much better).
                            • Modules are bust. The biggest improvement in the last thirty years of libraries wasn’t namespaces, but relative paths to the module tree (nodejs). Unfortunately, we also got npm.
                            • Numbers! Can we really not accept that the number literal 9999999999999998.0 as typed into a program source code is a different number than the bytestring 0x4341C37937E07FFF? Because clearly 9999999999999999.0 is different than the bytestring 0x4341C37937E08000! Perl6 might be the only thing “widespread” getting this right, although go also comes up with the correct answer.
                            • Release management. If you implement the universal server, you can get hot upgrades for free, but you can’t necessarily get hot downgrades, and you might still have an outage if you make a mistake. What I want probably looks more like virtualisation.
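                            The number-literal point above is easy to demonstrate in Python, where the literal is silently rounded to the nearest representable double:

```python
import struct

# At this magnitude a double's ULP is 2, so odd integers can't be
# represented: the literal below is silently rounded up to 1e16.
x = 9999999999999999.0
print(x == 1e16)  # True -- the "9999999999999999.0" we typed is gone
print(struct.pack(">d", x).hex())  # 4341c37937e08000, the bits for 1e16

y = 9999999999999998.0  # even, so exactly representable
print(struct.pack(">d", y).hex())  # 4341c37937e07fff
```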

                            To fix these things, not only do you first need to appreciate that there’s a problem, but that the language is the right place to fix it. In all of these cases, I’m not sure there’s consensus there.

                            1. 3

                              The Windows IO model is better, but the Unix one won – even in Microsoft’s own languages…

                              Can you explain the Windows model? I’m familiar with Unix’s approach and thought Windows worked the same way.

                              1. 4

                                The gist is that Unix has you ask the OS, who is ready to perform IO? then you tell the OS, perform this IO, whilst the Windows method has you tell the OS, when you’re done performing this IO, let me know.

                                The Windows way means fewer system calls, and, if you tag your memory regions correctly, fewer copies (so it’s faster).

                                Unfortunately, that let me know in practice is actually quite complicated. Signalling involves interruption, or stack swapping, which means more memory, and IO wait looks weirdly similar to the UNIX method (just backwards, and makes you think this is about whether userspace or kernelspace is committing the memory). What we really need is actual IO messaging (one of my other wants), but this is hard work.
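                                For the curious, the Unix “who is ready?” model looks like this in Python’s selectors module. A minimal sketch, Unix-only since it registers a pipe:

```python
import os
import selectors

r, w = os.pipe()
sel = selectors.DefaultSelector()
sel.register(r, selectors.EVENT_READ)

os.write(w, b"hello")

# Step 1: ask the OS "who is ready to perform IO?"
for key, _events in sel.select(timeout=1):
    # Step 2: now *we* perform the IO ourselves.
    print(os.read(key.fd, 1024))  # b'hello'
```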

                                1. 1

                                  What do you mean by “actual IO messaging”?

                                  1. 2

                                    What I was referring to in my original comment as “actual messages” – a major problem is designing the language to be interrupted.

                                    Interrupting C is hard, not because of anything in the language, but because of how people program C. You could write C in an exclusively coroutine style, or restrict yourself to a very small stack, but in practice people don’t do this very often. So what actually happens is that people make “wait points” using select or poll, or WaitForMultipleObjects. However, there’s no system “message” you can wait for on Unix that means “I’m done sending this file”; all you can wait for is “you can call write() on this fd”, which isn’t the same thing at all. You also can’t make your own messages using this mechanism.

                                    Once you have those wait points, you actually have to use them. Most C programmers have a single “wait point” in their program, because they find this kind of messaging special; after all, it involves magic function calls that interact with the operating system. However, in other languages that do “actual messaging”, there are separate words for “send message” and “receive message”: in Erlang they’re ! and receive, and in KDB they’re the application of integers and read0. Having these built into the language means you can use them all over the place without surprising people.

                                    A better way to interrupt C is with signals, but very few programmers use these. First of all, they’re crude, and slower than a function call, and everyone knows they’re a minefield for interacting with the system. But I think this might be worth getting past because signals also have some really nice properties like the fact that the realtime signals form a kernel-side queue, and the fact that you can wait for a signal with an alternate stack which means you really do have coroutines.

                                2. 2

                                  I googled for a link and this is the first one that came up. The question isn’t your question, but the description contains a really succinct answer to yours: http://programmers.stackexchange.com/questions/293908/readiness-vs-completion-async-io-memory-usage

                                  The Windows model is named IOCP, for “I/O Completion Ports”; the “completion” in the name is how I remember which model is which, personally.

                                  1. 2

                                    Oh neat, apparently Solaris implements it. I wonder if illumos does too.

                                3. 2

                                  “What is so hard about defining a+b to be something useful instead of an error?”

                                  Oh my god yes. What is the point of having types if this doesn’t work for arrays out of the box?

                                  EDIT: I appropriately got a ‘me-too’ downvote for the above comment made in haste, so I should elaborate and better answer the original question…

                                  If I am using a strongly (or even, in some cases, dynamically) typed language, then I expect core operations to be available for those types when it makes sense. Forcing me to use, for the above array+array example, a concat method is a poor language design, because it removes the benefit of the type and hoists the burden onto the programmer. This has several detriments. Importantly, if I now change the type, I likely need to change my code, as there are dependencies where there should be none. It also adds complexity to the language, forces the programmer to learn trivial idiosyncrasies, impacts implementation in upstream generics, and generally makes for less readable code.

                                  1. 4

                                    What is the point of having types if this doesn’t work for arrays out of the box?

                                    The point is that for an operation like x+y, given the expression "foo"-32, the result depends on whether "foo" is an atom or not:

                                    • If "foo" is an atom (a string), then the result is an error
                                    • If "foo" is an array of characters, the result could be the valid string "FOO" or the array [70,79,79]

                                    One of the values that people are getting out of programming languages is a copilot: someone to tell them they might be about to make a mistake. So a lot of people like having "foo"-32 be an error, because they like parentheses and could always write uc("foo") or similar, if that is what the programmer actually meant.

                                    Another example: What about [0,1,2]+1 – did the programmer want [1,2,3]? Or did they want [0,1,2,1]? Or were they thinking this was a set and meant [0,1,2]?

                                    Another example: What about [0,1,2]+[2,1,0] – do I want [0,1,2,2,1,0] or [2,2,2] or [[0,1,2],[2,1,0]] – all of these interpretations are useful, but which one does + mean?

                                    No, I understand the reason why, I just wholeheartedly disagree with it. Iverson figured out decades ago what + should actually mean, but because word hasn’t gotten around yet, new languages are still making the old mistakes.
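                                    Python happens to ship two of these readings side by side, which makes the ambiguity concrete:

```python
x = [0, 1, 2]
y = [2, 1, 0]

# Python's built-in + on lists means concatenation...
print(x + y)  # [0, 1, 2, 2, 1, 0]

# ...while the elementwise (Iverson/APL) reading must be spelled out.
print([a + b for a, b in zip(x, y)])  # [2, 2, 2]
```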

                                4. 4

                                  LuaJIT refusing to add features from 5.2+ that introduce incompatibilities with 5.1 (which was released in 2006).

                                  In Lua, metamethods allow you to override the behavior of regular tables with your own functions. However, in version 5.1, the only things you can override are table lookup and table insertion. Since you can’t override iteration, any abstraction that you use metamethods for will be very leaky: your tables behave mostly like regular tables, but only if you use your own iterators, which means you can’t pass them to functions that you didn’t write. Since tables are the only compound data type, this basically leaves you very limited in terms of the abstractions you can employ.
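                                  By way of contrast, Python does let a wrapper type override iteration, which is exactly the hole in Lua 5.1’s metamethods. A sketch (the Wrapper class is purely illustrative):

```python
# A wrapper that overrides lookup, insertion, AND iteration.
# Lua 5.1 metamethods cover the first two (__index/__newindex)
# but have no equivalent of __iter__, so wrappers leak there
# (Lua only gained __pairs in 5.2).
class Wrapper:
    def __init__(self, data):
        self._data = dict(data)

    def __getitem__(self, key):         # like Lua's __index
        return self._data[key]

    def __setitem__(self, key, value):  # like Lua's __newindex
        self._data[key] = value

    def __iter__(self):                 # no Lua 5.1 analogue
        return iter(self._data)

w = Wrapper({"x": 1, "y": 2})
print(sorted(w))  # ['x', 'y'] -- generic code can iterate the wrapper
```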

                                  This was fixed in 5.2, but LuaJIT is based on 5.1. More recently a compile-time flag was introduced that allows certain 5.2 features into LuaJIT, but unfortunately it’s not on by default, so you can’t rely on it if you distribute code to anyone who is using a LuaJIT that they didn’t compile themselves.

                                  1. 1

                                    NB: There are several other obvious major problems with Lua, but most of them can be easily caught with Luacheck.

                                  2. 4

                                    Haskell: String should be replaced by Text and strict ByteString. Also, OverloadedStrings should be default.

                                    1. 4

                                      PHP: Native UTF. Granted, ext-mbstring exists and I utilize it, but I still wish it were native to the language itself. My understanding is that PHP 6 was supposed to be fully UTF and then something happened. Along with that, I think people need to get over their “ew, php?” phase, because I get it: we have all read a fractal of bad design (and if you haven’t and you’re dabbling in PHP, I highly recommend it, great read). I also think applications such as WordPress, which is usually the first example of PHP a lot of people see, and which I use a lot professionally, show how bad the language can get.

                                      So to sum it all up, the problems I see with PHP is UTF, the outside community looking in, and WordPress.

                                      1. 4

                                        C++: the standard library is a performance minefield. Sometimes it’s fast, sometimes it’s terrible. For example, last I checked, std::regex is actually slower than Python’s regex engine. Of course you can use re2, but it’d be nice if you didn’t have to.

                                        Many languages have this problem, but for a performance oriented language like C++ it’s a pain in the ass.

                                        1. 1

                                          Also boost

                                        2. 3

                                          Ruby: I think a multithreaded application model has been sorely missing. With all the talk of the proposed “Guild” model, I think this might (initially, anyway) be solved!

                                          1. 3

                                            This might not be a generalized problem, but anyway.

                                            Python: Slow regex matching. If only Guido and co had implemented the Thompson NFA algorithm for it (used in Unix tools like grep, awk, etc).

                                            So, as an NLP engineer, I seek refuge in Go.
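                                            The worst case of a backtracking engine is easy to reproduce; the timings below are machine-dependent, but the growth is roughly exponential in n:

```python
import re
import time

pat = re.compile(r"(a+)+$")  # classic catastrophic-backtracking pattern

for n in (14, 17, 20):
    s = "a" * n + "b"  # the trailing 'b' forces every partition to be tried
    t0 = time.perf_counter()
    pat.search(s)      # fails, after roughly 2**n backtracking attempts
    print(n, f"{time.perf_counter() - t0:.3f}s")
```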

                                            1. 2

                                              You can install re2 from PyPI to get Thompson NFA regex.

                                              1. 1

                                                Yeah, that’s there. I was just ranting :)

                                              2. 2

                                                There is a drop-in replacement for re called regex which will hopefully replace it in future versions, with more features and faster performance. It doesn’t use NFAs but caching of some kind (I don’t fully understand it myself) to speed things up and reduce the chance of you accidentally creating pathological behaviour.

                                              3. 3

                                                Javascript: Promises and async/await. I’ve only just recently posted here about it so I won’t go into detail: https://lobste.rs/s/ym5ke7/i_promise_this_was_bad_idea

                                                1. 2


                                                  • Not yet packaged in Debian stable or backports
                                                  • An annoying tendency for libraries to depend on cutting-edge compiler features, which will hopefully mellow as the language ages


                                                  • The GIL, ugh
                                                  • Dynamic typing is a trap that wastes unimaginable thousands of programmer hours when they make minor typos that can’t be caught prior to runtime
                                                  • The transition from 2 to 3 was horribly mismanaged
                                                    • 3 gets Unicode precisely wrong
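The dynamic-typing point deserves a concrete illustration; the class and the typo below are invented for the example:

```python
class Order:
    def __init__(self, total: float) -> None:
        self.total = total

def apply_discount(order: Order) -> float:
    # Typo: 'totl' instead of 'total'.  CPython accepts this at import
    # time; the mistake only surfaces as an AttributeError when this
    # line actually executes, perhaps deep in production.  A static
    # checker such as mypy catches it before the program ever runs.
    return order.totl * 0.9
```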

                                                  Java, the language which is certainly not my favorite but pays my bills:

                                                  • Logging, affectionately^W referred to as “the logging clusterfuck”. In brief, the language didn’t introduce logging in the standard library until 1.4, six years after the initial release, and java.util.logging is very restrictive and underfeatured. Thus, there’s a hellish mishmash of j.u.l and third-party logging libraries with an unimaginably complicated compatibility graph. It’s virtually guaranteed you’ll struggle with it the instant you introduce any third-party code to your project.
                                                  • Enterprise bullshit infests the ecosystem
                                                    • Annotations were a mistake
                                                    • Reflection was a mistake
                                                  • Third-party documentation is somewhere between “misleading” and “nonexistent”, and first-party documentation is only good for reference
                                                  • Prior to Java 8, a severe lack of ergonomic features in the language syntax and standard library. 8 didn’t fix everything, but is a massive improvement.
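The facade idea that grew up to paper over the logging mess is simple enough to sketch; the interface and class names below are invented toys, not slf4j’s actual API:

```java
import java.util.logging.Logger;

// Application code logs against one tiny interface; a single binding
// chosen at deployment time decides whether java.util.logging, log4j,
// or anything else actually writes the record.
interface Log {
    void info(String msg);
    String name();
}

// One possible binding, delegating to java.util.logging.
class JulBinding implements Log {
    private final Logger delegate;

    JulBinding(String loggerName) {
        delegate = Logger.getLogger(loggerName);
    }

    @Override
    public void info(String msg) {
        delegate.info(msg);
    }

    @Override
    public String name() {
        return delegate.getName();
    }
}
```

The “clusterfuck” comes from every library picking a different facade (or none), so real projects end up routing j.u.l, log4j, and commons-logging through one another with bridge jars.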
                                                  1. 2

                                                    C#: tuples need to be made less clunky. One extremely disturbing thing about the C# 7 tuple proposals is that while the syntax is much nicer, they also propose to make tuples mutable. I would rather have clunky syntax than mutable tuples. Other than that I have no complaints about the language (aside from null, but even that’s handled better than in a lot of languages).

                                                    1. 2

                                                      they also propose to make Tuple mutable


                                                      1. 2

                                                        My reaction as well. I can’t believe they would even think of doing that. The whole point of tuples is to be immutable. Otherwise use a list.

                                                        I’m seriously perplexed. Currently tuples in C# are immutable, so the proposal to make them mutable was a conscious decision.

                                                    2. 2

                                                      Elm: Higher Kinded Types. I appreciate that Evan wants to keep them out of the language during the early days, so that people see them as an “extra feature” rather than a “core necessity.” I even agree with him, but I’m still wistful.

                                                      1. 2

                                                        C#: inclusion of reflection, period. It shouldn’t be there. And relatedly, attributes shouldn’t require you to reflect on a type to get to them. These things should be “properties” that you can access on the class, instead of the Type, and not be introspectable at runtime at all.

                                                        After that? Nullability should be optional for all the class-based/reference types (while, yes, the “reference” to the instance is the actual value you’re looking at, this could still be compile-time enforced), and deeply immutable structures should be less verbose to declare (features contributing to fixing the latter are a work in progress in the new C#!). Lower on my list, but still something I’d love to see, is the removal of inheritance and abstract classes, since everybody just uses interfaces nowadays anyway.

                                                        EDIT: Also, I remembered a huge gripe of mine: how many .NET Core libraries don’t provide testable interfaces, so you have to write adapter interfaces to keep your code testable where dummy structures are needed. It’d be a lot nicer if the interfaces were standard, so the code would be easier for colleagues to parse.

                                                        1. 1

                                                          Most pressing for me in Haskell is the lack of Hystrix/Zipkin-type libs. Circuit breakers, distributed tracing, etc. As much as the string issues are annoying, they really don’t hold me back. Having best-in-class “architectural” libs would make my life significantly better, I think. That said, it’s the most pressing from my personal perspective, not that of a beginner, etc.

                                                          JavaScript… perhaps Promises. I think generators need a boost in awareness, along with redux-saga-type libs, so that code isn’t littered with Promises/async-await (which tend to cause errors to “disappear”).