1. 1

    I’m being a good son and took my mum to London with me, as I was going over to attend the Friday night Taylor Swift concert.

    So no programming this weekend.

    1. 2

      Do I like to code? I suppose I do. If I wasn’t getting paid, I probably wouldn’t code much, if at all.

      I have transitioned from viewing code as a goal, to viewing code as a tool.

      To me this means that I’ve started to think a lot more about design: how to improve code purely through design, and how to make it hard or impossible to do things wrong by design. Previously I would have skipped ahead, started coding, and figured out a design based on the prototype.

      I guess it goes back to the fact that I’m much more of an engineer than most of the people I know in real life - I just happen to do ICT instead of Mech or Civil - and that I’ve done a lot of different things to make money, which is really the end goal to me.

      I work to live and not live to work, and as such I tend to optimize towards spending the least amount of time while earning as much as possible.

      This of course means I need to provide a certain level of work, so I do keep myself up to date and try out new techniques, but I do that strictly on paid time.

      To end this: I like to think I have a professional approach - I take pride in my work, but I don’t let work define me.

      1. 1

        I am somewhat successfully using them at my current work.

        We are a small team though - we’re 5 in my team - and ideally we like to finish in under 10 minutes: two minutes per person, one minute for what you did yesterday and one minute for what you’re going to do today and any challenges you foresee.

        The one minute for yesterday doubles as a small retrospective, to reflect on whether what was said yesterday was somewhat correct - in the sense: was the prediction of what was going to be hard correct or not?

        I think there’s some value in it, but it’s nothing that wasn’t already available by a short daily status meeting prior to SCRUMification.

        1. 3

          More Scala finding its way into Java. Now they just need some more immutability, deprecate the null keyword and remove checked exceptions. :-P

          1. 2

            Deprecating null… Well, C# is attempting it, so Java will probably get it in 5 years time ;)

          1. 3

            I’m playing around with porting the RxSwing library (https://github.com/ReactiveX/RxSwing) to work on RxJava 2.

            Work-wise I’m making a custom sort routine which is locale-aware for C strings… The wonders of working on an embedded device without a full C standard library.
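Without strcoll from a full libc, one common approach is a byte-level collation table plus a qsort comparator. This is a rough sketch under assumptions of my own - the rank table, its initialization, and all names here are hypothetical illustration, not the actual routine - and it assumes a single-byte encoding:

```c
#include <limits.h>
#include <stdlib.h>

/* Hypothetical collation table: rank_of[c] is the sort weight of byte c.
 * A real embedded build would fill it from the target locale's data. */
static unsigned char rank_of[UCHAR_MAX + 1];

static void init_ranks(void)
{
    /* Default: plain byte order. Locale-specific entries (e.g. letters
     * that should sort after 'z') would be patched in here. */
    for (int c = 0; c <= UCHAR_MAX; c++)
        rank_of[c] = (unsigned char)c;
}

/* Compare two C strings by collation weight instead of raw byte value. */
static int locale_cmp(const char *a, const char *b)
{
    while (*a && rank_of[(unsigned char)*a] == rank_of[(unsigned char)*b]) {
        a++;
        b++;
    }
    return rank_of[(unsigned char)*a] - rank_of[(unsigned char)*b];
}

/* qsort adapter: the array elements are themselves char pointers. */
static int by_locale(const void *pa, const void *pb)
{
    return locale_cmp(*(const char *const *)pa, *(const char *const *)pb);
}
```

Sorting an array of strings is then `qsort(arr, n, sizeof(char *), by_locale);`. Multi-byte encodings or multi-level collation (case, accents) would need a more elaborate weight lookup than a single table.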

            At least I’ve gotten to brush up on my C++ for the test code - and props to CppUTest for being by far the best C++ unit testing framework I’ve tried so far.

            1. 1

              Still working on rewriting a Java Swing app from an old-school “button handlers do it all” style into an MVVM approach.

              For fun, I saw an article here about building your own kernel for the Raspberry Pi - as I have a few lying around, I figure I’ll try that out.

              1. 4

                Can someone elaborate on why there’s seemingly a need to be able to block headless browsers from accessing sites?

                1. 4

                  I’m speculating, but I suspect it’s to do with verifying that the client is driven by a “real human” for advertising and tracking purposes.

                  Edit: I followed some links and found this article:


                  Quoting from the second section:

                  Why detect headless browser?

                  Beyond the two harmless use cases given previously [doing tests or taking screenshots of webpages], a headless browser can also be used to automate malicious tasks. The most common cases are web scraping, increase advertisement impressions or look for vulnerabilities on a website.

                  1. 2

                    Thank you for the elaboration, gerikson.

                    So it’s basically a few attempts at making it slightly harder to use a headless Chrome to do bad stuff. It just seems like the attempt is being made at the wrong level.

                1. 2

                  Looking into JavaFX, to see if it’s usable for a refresh of a UI in a Java program. Are there other frameworks out there for Java that are more MVVM-centered? (… possibly something like WPF, but for Java?)

                  Also working on more automatic testing for the handsets - it’s growing into a nice little system by now which can simulate a user.

                  1. 2

                    I’m reading a book about PostgreSQL - “Mastering PostgreSQL in Application Development” by Dimitri Fontaine. It’s a refresher for my SQL, as I haven’t been using it for a while, and I hope to learn some of the newer features of SQL - I’ve been quite inspired by reading Markus Winand’s Modern-SQL.com site.

                    In the fiction department, I’m about to read Dan Brown’s latest Robert Langdon book, Origin.

                    1. 21

                      The fundamental problem with USB-C is also seemingly its selling point: USB-C is a connector shape, not a bus. It’s impossible to communicate that intelligibly to the average consumer, so now people are expecting external GPUs (which run on Intel’s Thunderbolt bus) for their Nintendo Switch (which supports only USB 3 and DisplayPort external busses) because hey, the Switch has USB-C and the eGPU connects with USB-C, so it must work, right? And hey why can I charge with this port but not that port, they’re “exactly the same”?

                      This “one connector to rule them all, with opaque and hard to explain incompatibilities hidden behind them” movement seems like a very foolish consistency.

                      1. 7

                        It’s not even a particularly good connector. This is anecdotal, of course, but I have been using USB Type-A connectors since around the year 2000. In that time not a single connector has physically failed for me. In the year that I’ve had a device with Type-C ports (current Macbook Pro), both ports have become loose enough that simply bumping the cable will cause the charging state to flap. The Type-A connector may only connect in one orientation but damn if it isn’t resilient.

                        1. 9

                          Might be crappy hardware. My phone and Thinkpad have been holding up just fine. USB-C seems a lot more robust than micro-B.

                          1. 3

                            It is much better, but it’s still quite delicate with the “tongue” in the device port and all. It’s also very easy to bend the metal sheeting around the USB-C plug by stepping on it etc.

                          2. 6

                            The perfect connector has already been invented, and it’s the 3.5mm audio jack. It is:

                            • Orientation-free
                            • Positively-locking (not just friction-fit)
                            • Sturdy
                            • Durable

                            Every time someone announces a new connector and it’s not a cylindrical plug, I give up a little more on ever seeing a new connector introduced that’s not a fragile and/or obnoxious piece of crap.

                            1. 6

                              Audio jacks are horrible from a durability perspective. I have had many plugs become bent and jacks damaged over the years, resulting in crossover or nothing playing at all. I have never had a USB cable fail on me because I stood up with it plugged in.

                              1. 1

                                Not been my experience. I’ve never had either USB-A or 3.5mm audio fail. (Even if they are in practice fragile, it’s totally possible to reinforce the connection basically as much as you want, which is not true of micro USB or USB-C.) Micro USB, on the other hand, is quite fragile, and USB-C perpetuates its most fragile feature (the contact-loaded “tongue”—also, both of them unforgivably put the fragile feature on the device—i.e., expensive—side of the connection).

                              2. 4

                                You can’t feasibly fit enough pins for high-bandwidth data into a TR(RRRR…)S plug.

                                1. 1

                                  You could potentially go optical with a cylindrical plug, I suppose.

                                  1. 3

                                    Until the cable breaks because it gets squished in your bag.

                                2. 3

                                  3.5mm connectors are not durable and are absolutely unfit for any sort of high-speed data.

                                  They easily get bent and any sort of imperfection translates to small interruptions in the connection when the connector turns. If I – after my hearing’s been demolished by recurring ear infections, loud eurobeat, and gunshots – can notice those tiny interruptions while listening to music, a multigigabit SerDes PHY absolutely will too.

                                3. 3

                                  This. USB-A is the only type of USB connector that has never failed for me. All B types (normal, Mini, Micro) and now C have failed for me in some situation (breaking off, getting wobbly, loose connections, etc.).

                                  That said, Apple displays their iPhones in Apple Stores resting solely on their plug. That alone speaks to some good reliability engineering in their ports. Plus the holes in devices don’t need a “tongue” that might break off at some point - the Lightning plug itself doesn’t have any intricate holes or similar and is made (mostly) of a solid piece of metal.

                                  As much as I despise Apple, I really love the feeling and robustness of the Lightning plug.

                                  1. 1

                                    I’m having the same problem - the slightest bump will knock it out of charging mode. I’ve been listening to music a lot recently and it gets really annoying.

                                    1. 2

                                      Have you tried to clean the port you are using for charging?

                                      I have noticed that Type-C seems to suffer a lot more from lint in the ports than Type-A.

                                  2. 6

                                    It’s impossible to communicate that intelligibly to the average consumer,

                                    That’s an optimistic view of things. It’s not just “average consumer[s]” who’ll be affected by this; there will almost certainly be security issues originating from the Alternate Mode thing – because different protocols (like thunderbolt / displayport / PCIe / USB 3) have extremely different semantics and attack surfaces.

                                    It’s an understandable thing to do, given how “every data link standard converges to serial point-to-point links connected in a tiered-star topology and transporting packets”, and there’s indeed lots in common between all these standards and their PHYs and cable preferences; but melding them all into one connector is a bit dangerous.

                                    I don’t want a USB device of unknown provenance to be able to talk with my GPU, and I certainly don’t want it to even think of speaking PCIe to me! It speaking USB is, frankly, scary enough. What if it lies about its PCIe Requester ID and my PCIe switch is fooled? How scary and uncouth!

                                    1. 3

                                      Another complication is that making every port do everything is expensive, so you end up with fewer ports total - Thunderbolt in particular. Laptops with 4 USB-A, HDMI, DisplayPort, Ethernet, and power are easy to find. I doubt you’ll ever see a laptop with 8 full-featured USB-C ports.

                                    1. 2

                                      I just finished reading What If?: Serious Scientific Answers to Absurd Hypothetical Questions by Randall Munroe, which is a delightfully absurd book.

                                      I also just placed a book order and I’ll get the following books to read soon:

                                      • Sikorski, Michael - “Practical Malware Analysis: The Hands-On Guide to Dissecting Malicious Software”
                                      • Abelson, Harold - “Structure and Interpretation of Computer Programs, 2nd Edition (MIT Electrical Engineering and Computer Science)”
                                      • Zalewski, Michal - “The Tangled Web: A Guide to Securing Modern Web Applications”
                                      • Seitz, Justin - “Gray Hat Python: Python Programming for Hackers and Reverse Engineers”
                                      • Perry, Brandon - “Gray Hat C#”

                                      So a pile of different technical books to play along with and a single fiction book:

                                      Brown, Dan “Origin: (Robert Langdon Book 5)”

                                      I’ve loved the previous 4 installments in the series, so I had to pick up the newest one as well :)

                                      1. 5

                                        As echoed by the others, if you set up some way to donate a few dollars for the server maintenance and to give a round of drinks for the moderator team every now and then, I’ll be happy to chip in.

                                        1. 1

                                          It’s a bit short on advice on how to avoid these pitfalls.

                                          Are there any good books that use this reverse approach?

                                          1. -1

                                            Calling that an “optimization” is hilarious. The standard says that it does not specify what happens on a null call and the LLVM compiler writers have made the nutty determination that they can then assume there is no UB in the code.

                                            1. 1

                                              Actually, given all the extra work they have put into static warnings and UBSan… they are doing the right thing.

                                              Admittedly, in several places I believe the standards committee should just have had the balls to define a behaviour - which is one of the things I like about D.

                                              1. 1

                                                One of the weird things about the standard is that the committee says that non-portable code is a core part of C:

                                                1. C code can be non-portable. Although it strove to give programmers the opportunity to write truly portable programs, the Committee did not want to force programmers into writing portably, to preclude the use of C as a “high-level assembler”; the ability to write machine-specific code is one of the strengths of C. It is this principle which largely motivates drawing the distinction between strictly conforming program and conforming program. (http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1250.pdf)

                                                And then it goes and tosses non-portable code into UB.

                                                1. 1

                                                  Even though I work with C in my day job, I always thought it was nothing short of fucking bonkers that compiler writers are so focused on synthetic benchmarks that we have somehow come to accept that of course this makes sense, when it so clearly doesn’t.

                                              1. 5

                                                 Wouldn’t it make sense to do a general BSD tag instead, and merge {Net,Open,Free}BSD into that? It’s usually easy to deduce which BSD it’s from, based on who’s posting it ;)

                                                1. 13

                                                  I disagree. See this discussion.

                                                  1. 12

                                                     I strongly disagree with a common BSD tag and expressed that when the NetBSD tag was suggested here: https://lobste.rs/s/n5vowd/new_tag_suggestion_netbsd

                                                    I am all for adding a DragonFlyBSD tag.

                                                    1. 3

                                                      m-o o-n, that’s how you spell unix.

                                                    1. 4

                                                       Writing a whole lotta test cases at work and reworking a lot of my material for a computer networking course I’m teaching. So if any of you know of great YouTube videos about networking theory, nice graphics, or other things I might be able to use, I’d appreciate a link - and I’ll buy you a beverage of choice if we ever meet ;)

                                                      1. 1

                                                        All I could think of while reading this was “wow, Java is or used to be really problematic.”

                                                        1. 3

                                                          Can you be more specific?

                                                          At least the optimization itself would be applicable to many programming languages: Setting a good initial size for a container.
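Right - the cost being avoided is the grow-and-copy cycle, which exists in any language with growable containers. A minimal sketch in C (hypothetical `int_buf` type, doubling growth policy assumed, error handling omitted for brevity):

```c
#include <stdlib.h>

/* Minimal growable int buffer. Starting with a capacity close to the
 * expected final size avoids repeated realloc-and-copy cycles - the
 * same win as giving a Java ArrayList or HashMap a good initial size. */
struct int_buf {
    int   *data;
    size_t len;
    size_t cap;
    size_t reallocs;   /* instrumentation: how many times we had to grow */
};

static void buf_init(struct int_buf *b, size_t initial_cap)
{
    b->data = malloc(initial_cap * sizeof *b->data);  /* NULL check omitted */
    b->len = 0;
    b->cap = initial_cap;
    b->reallocs = 0;
}

static void buf_push(struct int_buf *b, int v)
{
    if (b->len == b->cap) {            /* full: double the capacity */
        b->cap *= 2;
        b->data = realloc(b->data, b->cap * sizeof *b->data);
        b->reallocs++;
    }
    b->data[b->len++] = v;
}
```

Pushing 1000 elements into a buffer that started at capacity 1 triggers 10 grow-and-copy steps; starting at capacity 1000 triggers none - the same reasoning as `new ArrayList<>(expected)` in Java.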

                                                          1. 0

                                                            Amen. Makes me happy millenials killed it (along with its bastard companion XML).

                                                            1. 15

                                                              Java is alive and well. I have no idea how you come to the conclusion it was killed.

                                                              1. 0

                                                                 Java is dead in the sense C++ is dead. Once dominant, now one of the languages used by an increasingly old guard. Of course there are still projects in Java, and even likely some people coding applets for old times’ sake.

                                                                But you can ignore Java at this point without handicapping your career.

                                                                1. 6

                                                                   I am working for start-ups in the Bay Area and I can tell you that Java is very much alive and well and used for new things every day. Nobody writes GUI apps in it anymore, but in the back-end it is widely popular.

                                                                  1. 3

                                                                    People do tons of new projects in C++ too. Still nothing like its heyday mid-90s.

                                                                  2. 3

                                                                    But you can ignore Java at this point without handicapping your career.

                                                                    I agree with you, but I can’t think of a language that’s not true of. There are a lot of language ecosystems that don’t overlap much if at all - Java, Ruby, Python, .NET, Rust, Erlang…

                                                                    1. 3

                                                                       I think if you don’t have some understanding of the level of reasoning that C works at, that can be a bit of a handicap, at least from a performance standpoint. Though that’s less of a language thing than it is about being able to reason about bytes, pointers and allocations when needed.

                                                                      1. 0

                                                                        That wasn’t true say 15 years ago. Back then if you wanted to have professional mobility outside certain niches, you had to know Java.

                                                                        1. 2

                                                                          I’m going to respectfully disagree. 15 years ago, you had Java, and you had LAMP (where the “P” could be Perl, PHP, or Python), and you had the MS stack, and you still had a great deal of non-MS C. After all that, you had all the other stuff.

                                                                          Yes, Java may have been the biggest of those, but relegating “the MS stack” to “certain niches” perhaps forgets how dominant Windows was at the time. Yes, OSX was present, but it had just come out, and hadn’t made any significant inroads with developers yet. Yes, Linux was present, but “this is the year of Linux on the desktop” has been a decades-long running gag for a reason.

                                                                          1. 1

                                                                            MS stack was in practice still C++/MFC at the time, and past its heyday. The dotcom boom dethroned desktop, Windows and C++ and brought Java to prominence. By 2000, everyone and their dog were counting enterprise beans: C++ was still massive on Monster, but Java had a huge lead.

                                                                             Then Microsoft jumped ship to .NET and C++ has not recovered ever since. In the mid-90s you were so much more likely to land a job doing C++ vs plain C; now it’s the opposite.

                                                                            My karma shows I hurt a lot of feelings with my point, but sorry guys Java is in visible decline.

                                                                            1. 1

                                                                              Oh, my feelings weren’t hurt, and I don’t disagree that Java is in decline. I merely disagree with the assertion that, 15 years ago, you had to know Java or relegate yourself to niche work. I was in the industry at the time. My recollection is that the dotcom boom brought perl and php to prominence, rather than java.

                                                                              Remember that java’s big promise at the time was “run anywhere”. Yes, there were applets, and technically servlets, but the former were used mostly for toys, and the latter were barely used at all for a few years. Java was used to write desktop applications as much as anywhere else. And, you probably recall, it wasn’t very good at desktop applications.

                                                                              I worked in a “dotcom boom” company that used both perl and java (for different projects). It was part of a larger company that used primarily C++ (to write a custom webserver), and ColdFusion. The java work was almost universally considered a failed project due to performance and maintenance problems (it eventually recovered, but it took a long time). The perl side ended up getting more and more of the projects moving forward, particularly the ones with aggressive deadlines.

                                                                              Now, it may be that, by 15 years ago, perl was already in decline. And, yes, java took some of that market share. But python and ruby took more of it. A couple years later, Django and Rails both appeared, and new adoption of perl dropped drastically.

                                                                              Meanwhile, java soldiered along and became more and more prominent. But it was never able to shake those dynamic languages out of the web, and it was never able to make real inroads onto the desktop. It never became the lingua franca that it wanted to be.

                                                                              And now it’s in decline. A decline that’s been slowed by the appearance of other JVM languages, notably scala (and, to a lesser degree, clojure).

                                                                  3. 6

                                                                     Incidents of Java in my life have only increased as my career has progressed; I’m quite certain Java is far from dead, and we’re all the worse for it. I’ve even worked for “hip” millennial companies that have decided they needed to switch to Java.

                                                                    1. 5

                                                                       Java is still alive and kicking - a language that has proven itself to be good enough, with a rich ecosystem and different vendors having implemented their own JVMs. We’re all the worse for that because?

                                                                1. 2

                                                                  I’ve been using Dia but it’s just a free version of Visio + upnp mappers. Sadly still manual work.

                                                                  1. 4

                                                                     Playing around with Nancy, which, while a bit too magical for my tastes (like, a lot of dynamic in C# smells of someone wishing it was Ruby), is probably the best experience I’ve had for C# webdev. ASP.NET Web Forms is a bit strange, but since that point it’s been seemingly non-stop churn, especially on the MVC front. It doesn’t seem like a stable place to build an application, which is a shame.

                                                                    1. 3

                                                                       Unless you want to be hosting an ASP.NET application on non-Windows, I would say ASP.NET MVC 6 is still the way to go. There’s too much churn in .NET Core to base anything critical on it.

                                                                    1. 3

                                                                       For work I’m in the process of breaking up a Java application into more manageable pieces - it’s written as C with Java syntax, so I’m slowly reworking it piece by piece to make it easier to maintain, as it’s riddled with globals and logic in button handlers.

                                                                      For fun times, I’m working on a minimal FTP client in Java to help me teach a course in the autumn semester.