1. 5

    Whether it is Material Design, Bootstrap, or some other pseudo-standard there is a lot of advantage in starting from a point of things “working/looking the same” across products. If “it just works” for the end user because they have encountered the same design choices on multiple other apps/sites they use then it is a convenience to the end user.

    There are definitely cases where completely unique designs are good, but the argument that using Material Design is a political statement is weak, and the article goes beyond weak to downright offensive by comparing Material Design to the Nazi swastika.

    1. 3

      I think you misinterpret that comparison. In my reading it wasn’t comparing the swastika with Material Design. It was explaining how “nothing is neutral”, by noting that although a swastika is in principle just an abstract shape, no one in the Western world can view it like that any more: it has become a very meaningful shape, in whatever form it appears.

      I think it’s not a very good explanation, because it takes one of the strongest possible examples to illustrate a principle that applies to Material Design to at most a moderate extent.

      1. 2

        I picked up this part of the analogy as well, and if the only intent was to show that symbols take on meaning over time then something like a dove (symbol of peace & religion) could be used, or the caduceus (universal symbol for a medical alert), or many many others.

        It is my opinion based on the rest of the content in the article that the author chose the swastika intentionally to evoke a strong negative emotion from readers that they would then transfer onto Google and Material Design whether consciously or unconsciously.

        1. 2

          I didn’t get that impression. I think the author was just looking for a strong example of a symbol evoking an idea due to past context, rather than anything inherent about its shape.

    1. 3

      This looks like a great book. Really scratches my recent itch for writing compilers! Kind of an odd question, but is there a way to pay you for your efforts? And/or get a paper edition when you’re finished?

      1. 6

        > Kind of an odd question, but is there a way to pay you for your efforts?

        Strangely enough, this is one of the most common questions I get. It’s a nice feeling when the biggest concern most readers have is how to give me money. :)

        For right now, no, there isn’t. I suppose you could buy a copy of my other book (“Game Programming Patterns”) to put some cash in my pocket.

        > And/or get a paper edition when you’re finished?

        Right. Once the book is done, I’ll put together a print edition. At that point, that will be the canonical way to support me. Until then, your safest bet is to join the mailing list so I can let you know when it’s available. (I only post about once a month when a new chapter is done, so don’t worry about spam overload.)

        1. 4

          > For right now, no, there isn’t. I suppose you could buy a copy of my other book (“Game Programming Patterns”) to put some cash in my pocket.

          Let’s say you don’t set anything else up to make it more direct. Readers might wonder along the lines of, “I’d like to buy the book to get him paid, but I don’t need a Game Programming Patterns book.”

          (to other readers)

          In that case, consider buying the book for his sake and then donating it to your local library to inspire future programmers. Alternatively, make it a gift for a programmer you know who’s shown some interest in game programming. You get to do two good things, with who knows what ripple effects down the line.

          1. 2

            Is there any particular reason you don’t add a Patreon or a similar service?

        1. 4

          The single USB port move on Apple’s part was also stupid. I have a MacBook Pro with 2 USB ports, and that isn’t nearly enough either. I end up using both ports, with one running a 4-port expansion, and every last port is used.

          1. 18

            I wonder how many people lock their laptops because they’re worried about “hackers” versus how many do it because they have an obnoxious friend. Don’t want to get hacked? Don’t be friends with this guy.

            Or don’t install bash.

            1. 11

              One company I worked at deliberately cultivated a culture of “if someone leaves their laptop unlocked you send a silly email to the company-wide list”, similar to Google’s “tailgate someone and they buy you lunch”. It was much more effective at getting people to lock their computers than sending monthly emails about it, and people tended to be creative/funny about it rather than obnoxious (or maybe that’s just a difference in perception).

              1. 11

                Not sure if we worked at the same company, but I was in the same situation at a past company. I took a screenshot of my desktop and set it as my locked “screen saver” to trick people into thinking I had left it unlocked when I got up to leave my desk. It was a real laugh.

                1. 2

                  Where I work, you’ll almost definitely end up with some crazy wallpapers or other inconvenience (like accessibility tools on, or rotated displays, or disabled mouse/keyboard) if you leave your computer unlocked. It’s mostly just a fun way of making sure people don’t leave their computers unlocked, though afaik it isn’t officially condoned by the company.

                  1. 2

                    If the article had said coworker instead of friend I probably wouldn’t have thought twice about it. I think it depends on the context and the result as to whether doing this sort of thing is obnoxious or not.

                    I have xscreensaver set to start/lock after 10 minutes, and I don’t immediately lock my computer every time I stand up, because my threat model doesn’t include a crack team of hacker-ninja paratroopers crashing through the roof the second I leave the room. And I’m not going to lock my computer when I leave it unattended with a friend (who I presumably trust) around, just on the off chance they one day get exposed to red kryptonite and decide to steal my ssh keys. But if my friend Bob decides I need to see the error of my ways and start taking the threat of flying computer ninjas and mind-altering comic book rocks seriously, and changes my terminal font to comic sans every time I run outside to catch a pokemon, then, assuming I couldn’t just change the locks or fake my own death to get rid of Bob, I’d start locking my computer – but only when Bob is around. So in this case there is no real threat; Bob has become the only threat by trying to demonstrate that there is one, and the result isn’t useful. Bob is being obnoxious.

                    In a workplace, that coworker who is sending that company-wide email professing your undying love for Taylor Swift could just as easily be harbouring a grudge from that time you forgot their birthday 5 years ago, and might instead send corporate secrets from your machine in an attempt to get you fired. In this case, after putting up with people breaking into Shake It Off every time you enter the room for a week, you’re likely to start locking your computer at work. And others might learn from your mistake. So there is a real potential threat, and the shenanigans produce a real, useful result.

                    The author doesn’t state the context, but I have a feeling that any situation where a friend leaves their laptop unattended and unlocked and another friend is able to run this script is going to be closer to Bob than Taylor Swift.
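                    For reference, the xscreensaver setup described above (idle for 10 minutes, then lock) boils down to a few resource lines in ~/.xscreensaver; the exact values here are just an illustrative sketch, not a recommendation:

                    ```
                    timeout:     0:10:00
                    lock:        True
                    lockTimeout: 0:00:00
                    ```

                    timeout blanks the screen after 10 idle minutes, lock requires a password to get back in, and a lockTimeout of zero locks as soon as the saver kicks in.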

                    1. 3

                      > I don’t immediately lock my computer every time I stand up

                      Why? You’re being sarcastic and I get your perspective, but consider this: it takes a single key combination to lock your computer, so what’s stopping you?

                      The largest threat to any company is malicious insiders, whether they’re disgruntled employees or “agents”. Why take the risk?

                      Additionally, if you were being targeted, it would be as simple as following you around to a coffee shop, distracting you/waiting for you to get up and get your latte, and running a single command like in the post. If you always lock your computer, you’ve thwarted that vector.

                      With full disk encryption on by default, a locked device is hard to break into. Look at Apple v. FBI.

                      Up your OPSEC!

                      1. 6

                        I think some people feel psychologically more relaxed when they aren’t thinking constantly about such threats. I think that’s valid.

                        I’m in the opposite camp; I can’t stand up without reflexively locking my screen, because I’d be too stressed if I forgot to.

                        Neither of these is an actual security posture. :)

                        1. 3

                          > It takes a key combination to lock your computer, so what’s stopping you?

                          The key combination to unlock it is a lot more complicated :-)

                          I’m not saying I leave my laptop unlocked and unattended in coffee shops, where I’d probably be more worried about it being stolen than someone messing with it anyway, just that I’m not sure locking my computer whenever I go to make a cup of tea is going to change anything.

                        2. 4

                          Because unlocking a computer requires a password which is easily spied upon, and then used for more mischief later, when I really do want my laptop locked.

                          1. 4

                            Yubikey has a (poorly written) Windows app to allow you to login with a tap.

                            It’s a nice flow, they don’t advertise it at all though. If anyone has a Yubikey I recommend checking it out.

                            EDIT: https://www.yubico.com/why-yubico/for-individuals/computer-login/windows-login/

                          2. 1

                            > With full disk encryption on by default, a locked device is hard to break into.

                            Not if the device is a computer currently holding the decryption key in RAM external to the CPU, which is typically the case when a screensaver is running.

                      2. 5

                        I lock it because I’m that guy. I once changed a coworker’s keyboard layout to Turkish while he was away, and in turn he switched my system region (not language, not keyboard) to Japanese… I only noticed weeks later because the weather widget in my start menu was in Japanese.

                        OK, that’s stupid. The real reason I do it is that I don’t want people peeping at my IM logs; we tend to be talkative between coworkers and some topics are private, that’s all.

                      1. 4

                        What I get so tired of is how this is always misquoted.

                        “We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.” -Donald Knuth

                        Knuth is not saying that optimization is wholesale bad, but rather that we shouldn’t waste time optimizing the pieces that don’t matter and should instead focus on the big wins where things really count.

                        1. 1

                          And use a profiler. There is no point in optimizing a function for 3 days only to find that it accomplished absolutely nothing because the function is hardly ever called. And add unit tests before refactoring anything.
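                          As a hedged sketch of that point (the function names here are made up, not from any real codebase), even crude call-count instrumentation shows why you measure first: the function that looks like a tempting optimization target may barely run, while a boring one dominates:

                          ```c
                          #include <assert.h>
                          #include <stdio.h>

                          static long rare_calls = 0;
                          static long hot_calls = 0;

                          /* Looks expensive, but is almost never called. */
                          static long rare_helper(long x) {
                              rare_calls++;
                              return x * x;
                          }

                          /* Trivial-looking, but this is where the time goes. */
                          static long hot_inner(long x) {
                              hot_calls++;
                              return x + 1;
                          }

                          int main(void) {
                              long acc = 0;
                              for (long i = 0; i < 1000000; i++)
                                  acc = hot_inner(acc);   /* called a million times */
                              acc += rare_helper(3);      /* called exactly once */

                              printf("hot_inner: %ld calls, rare_helper: %ld calls\n",
                                     hot_calls, rare_calls);
                              assert(acc == 1000009);
                              return 0;
                          }
                          ```

                          A real profiler (gprof, perf, and friends) gives you this breakdown for free, per function, without hand-instrumenting anything.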

                        1. 18

                          There’s something important that’s being missed here, due in no small part to the FUD of people who are brainwashed into thinking that C is hard. The problem space and culture are just very different from what a lot of these folks are used to.

                          Working on a game is about as different from normal application development as you might ever get. You will not need to maintain it. You (if the survival rates of studios are any indication) will not need to support it into perpetuity. You are (in the case of consoles) guaranteed certain hardware behaviors.

                          You will spend a lot of time debugging and tweaking. You will spend a lot of time making lots of sweeping changes for whatever reason. You will spend a lot of time doing finicky little optimizations because the world is cruel and unjust. You may find yourself having to scale (on the PC) from machines that are better than most EC2 instances down to a measly Raspberry Pi.

                          The thing that people don’t seem to get is that, in a more featureful language (cough C++ cough), the language starts to really fight you once you start making these changes. C++ has this elaborate template and class system, and it’s really great and wonderful if you don’t find yourself having to make big refactors. If you do, though, life gets annoying fast. Abstractions you thought were correct turn out rubbish, and can even (in the case of, say, writing a graphics pipeline) create permanent performance deficits because you are working in a thoughtspace totally divorced from the reality of hardware. And in games, guess what: the hardware matters.

                          C code is also trivial to patch up when somebody else on the team has done something utterly bonkers. What’s going on is all right there, so it’s super easy to follow the logic. C++, especially with function and operator overloading, multiple inheritance, and weird tricks with references and whatnot, has no such cap on complexity.

                          The thing is, I guess, that C is easy to maintain in the field and easy to add last-minute hacks to without things blowing up everywhere. C++ and other languages can get really wonky as you pile on hacks, and following their best practices can outright prevent those hacks from happening.

                          EDIT:

                          Also, the boilerplate isn’t bad at all if you choose your abstractions and modules carefully. And pointers are not some scary eldritch trap waiting to byte you. :|

                          1. 4

                            A big part of this is choice of abstractions.

                            C++ provides more expressive power than C, which in turn demands more from the developer: do I use this language feature to represent this? Do I know all the ways in which it will be used, and what the perf implications are? Can it be used incorrectly? How does it interact with the architecture as a whole? What is the equivalent C code for this abstraction? C++ demands that you ask these questions about every abstraction you create. Problem is, C++ presents a big bag of tools, which tempts developers to use them all and then have to think about how they all interact. This is not a language to fart around in and “try things out.” You need to know what the right tool is for the situations you run into ahead of time. It won’t guide you to them! I think it gets a fairly bad rap from the Internet partially because of this.

                            I like to say that C++ requires a lot of taste to use well, but I don’t want to make it sound like I’m excusing some of the choices it has made. You have to be critical about just about every abstraction you introduce in your program, verifying that it is what you want. You’re going to have a bad time if you don’t do that.

                            I don’t use C++ much anymore (thank goodness), but I learned a lot from it:

                            • absolute necessity of attention to detail when programming (indexing off-by-one error: possible memory corruption causing crash much later)
                            • value of simplicity in solving problems
                            • learning that production code is the worst place to expand
                            • possibilities of generic programming
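                            The off-by-one bullet above is the classic C footgun; here’s a minimal sketch of the bug and its fix (the buffer size and names are arbitrary):

                            ```c
                            #include <assert.h>

                            #define N 8

                            int main(void) {
                                char buf[N];

                                /* WRONG: `i <= N` also writes buf[N], one byte past
                                   the end. That byte may belong to another variable
                                   or to stack metadata, so the crash (if any) can
                                   happen much later, far from this line. */
                                /* for (int i = 0; i <= N; i++) buf[i] = 'x'; */

                                /* Correct bound: valid indices are 0 .. N-1 only. */
                                for (int i = 0; i < N; i++)
                                    buf[i] = 'x';

                                assert(buf[0] == 'x' && buf[N - 1] == 'x');
                                return 0;
                            }
                            ```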

                            One last point: C++ used to be hot shit in industry. Like, Javascript-level hot. Now look at how it’s regarded. Don’t think for a second that it won’t happen to JS, and whatever comes next. Today’s coolest, hottest thing ever is tomorrow’s despised, ignored language. Fundamental programming concepts, like abstraction, modularization, and solid knowledge of paradigms age really well. Language-level minutiae? Not so much.

                            1. 5

                              My small pet theory: when writing C, one writes a lot of boilerplate code, and this can be a soothing activity for developers. It’s a bit like knitting on the couch.

                              1. 2

                                I felt this when writing Go. It’s kind of satisfying to just be typing for a while.

                                1. 1

                                  This rings true to me. When I wrote a lot of C, there was this ladder from boilerplate up to almost-correct abstraction, usually through terrible pre-processor hijinks, which would then crash back down to lots and lots and lots and lots more boilerplate.

                                2. 2

                                  Regarding the maintenance aspect: this was true a couple of console generations back, and it holds true for a subset of today’s games, but a larger and larger subset do require longer-term maintenance.

                                  Since consoles have all gained online connectivity, it has become a continuous cycle: 1) launch, 2) patch, 3) release expansion; repeat steps 2 and 3 until the checks quit coming in.

                                  Popular multiplayer PC games follow this as well, especially as there is a larger movement toward freemium. League of Legends was released more than 5 years ago and still gets updates roughly every two weeks.

                                  The majority of successful mobile games follow the freemium model of continuously updating.

                                  1. 1

                                    Freemium is a really terrible development and business practice for everyone, devs and players alike.

                                    I appreciate that it’s been proven to have efficacy in business, but it’s ruining the games industry.

                                1. 6

                                  On the whole I agree with the article. I have worked for 3 startups and been burned 3 times. I’m much happier with the mega-corps now. That said, I keep seeing these types of articles talking about $250K packages for a senior programmer and how you can work from anywhere on that kind of salary.

                                  This has not been my experience, and the Bureau of Labor Statistics data seems more apt, from what I have seen:

                                  http://www.bls.gov/ooh/computer-and-information-technology/computer-programmers.htm

                                  I’m well above the BLS median, but not even half of $250K. Is this really achievable as “the norm” from anywhere in the USA, and if so where are all of these opportunities that I’m so obviously missing?

                                  How much of that $250K is actual salary, and how much of it is “value of benefits”? This is the other piece that is a head-scratcher: since the $250K always includes “value of benefits”, I feel like it is a bit of sleight of hand that hides the lower actual salary.

                                  Here is the BLS breakdown by state, which is also interesting: http://www.bls.gov/oes/current/oes151131.htm#st The California median is around $89K, and Washington seems to have the highest median, at still only $115K.

                                  1. 3

                                    Note that the BLS definition of “Computer programmer” appears to be very low level:

                                    “…They turn the program designs created by software developers and engineers into instructions that a computer can follow.”

                                    It’s likely that many of us here who would call ourselves programmers might fit into another statistical bucket for the BLS, like “software developers” - median $97k, or even “Computer and Information Research Scientists”, median $108k. Honestly, even those seem low and so I assume they’re wrapping together some jobs that I wouldn’t consider equivalent.

                                    In general it seems really hard to tell how much of these stories of high compensation to believe without just asking your peers, something I’m always reluctant to do. Certainly having some idea of what sort of field these offers are being made for is useful - Dan’s article was helpful in pointing out that people in “hot fields” get gobs more money.

                                    It’s also not always totally clear what being “senior” means. I think I had “senior” in my title once, but I think it means different things at different places. :)

                                    1. 1

                                      > How much of that $250K is actual salary and then how much of it is “value of benefits”?

                                      Zero. A mediocre compensation package for a senior engineer today is $150k salary, $100k/yr of equity that’s not quite as good as cash (but pretty close), and bonuses.

                                      1. 7

                                        It’s certainly not anywhere near that in New York or Boston. Perhaps some SV outliers.

                                        1. 2

                                          That’s what people I know at Google make in Madison, WI. I’ve heard that numbers in places with a similar cost of living (like Austin, TX) are similar. Numbers are often much higher in SV, of course.

                                        2. 1

                                          Roughly how many years of industry experience does a senior engineer at Google correspond to? (I know that years of experience is a horridly imperfect metric, but it can be useful for HR-type stuff.)

                                          1. 1

                                            A decade, give or take a few.

                                            1. 1

                                              Three or four, or more.

                                        1. 1

                                          Obviously lobste.rs, since I’m posting here… but also the programming subreddit, as well as some language-specific subreddits.