Threads for bowyakka

    1. 10

      How can tickets be saved offline if they can’t also be transferred outside of TicketMaster?

      This ticket is digital. Saving data offline is the same as copying it to your hard drive. If data can be copied, it can be transmitted. If it can be transmitted, it can be shared. If it can be shared, it can be sold.

      This thinking is incorrect. It assumes a textbook von Neumann architecture without any security measures implemented in silicon. With the power of hardware restrictions, it is indeed possible to create systems that are end-to-end secure without the user being able to interfere with or extract information. All modern computers and smartphones already have TPMs and support things like remote attestation. It is only a matter of time until such secure computing schemes gain widespread adoption.

      1. 24

        Because heaven forbid that the user is actually in control of their tickets or their device.

        1. 15

I mean, this is Ticketmaster, perhaps one of the most reviled corporations on the planet. Not respecting users’ rights is their entire business model.

          1. 3

            Not heaven, but the powers that be. I merely explained the technological reality without giving a value judgement. I care deeply about the users being in control of their devices.

            1. 2

              Yet, if they had applied your technically correct logic, they would have arrived at an incorrect inference. Instead, their simplified understanding led to a correct inference about what was likely, not what was technically possible.

              1. 1

                Yet, if they had applied your technically correct logic, they would have arrived at an incorrect inference.

At the risk of sounding like Spock… That is incorrect. They arrived at the correct solution by following an incorrect chain of reasoning. This does not imply the converse, i.e. that rejecting the incorrect chain of reasoning would have kept them from arriving at the correct solution.

Or, put succinctly: even knowing the above, using the Chrome Developer Tools to inspect the website is a rational and productive way to approach the problem.

                1. 2

                  If they had rejected their “incorrect” conclusion that the described behavior implied (but, yes, did not assure) a trivially exploitable implementation, you think they would nonetheless have pursued its exploitation as a free time project?

                  The subtleties of human motivation have evaded you yet again, Spock. ;)

            2. 1

To be fair, secure hardware tooling can support things like password managers that aren’t just “a bunch of usernames on disk”, and a bunch of features that harden the device against nasty threat actors (like, say, govt officials demanding you hand over your phone so they can inspect the internals).

Obviously this can coexist with users still having control of their devices, but perhaps the user wants their device to be more secure along these axes! I, for one, appreciate that a good chunk of the population now uses smartphones with this kind of security, making it harder to run blanket inspections of what’s on them.

              1. 2

Corporations tend to collaborate with governments and state agencies (otherwise problems may occur, e.g. with their taxes). If trusted computing is to work for the user, it must be fully under that user’s control.

However, back to the technology. It is very difficult to create a device that can be trusted even after someone else “borrows” it. You may notice if you get back a different notebook or phone (same model, different unit), but you will not notice whether the motherboard, HDD, or some chips were replaced by the attacker. You will probably enter your password or fingerprint as usual. The device may even work normally, but your credentials will be sent to the attacker, who has a copy of your disk, etc. Maybe the device could prove to you that it is the original by showing an OTP code that you check against a token (which you should never leave unattended, because it too can be replaced or modified by the attacker to match the modified device). But even this does not protect you against e.g. a keylogger placed between the keyboard and the motherboard.
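
A minimal sketch of that mutual-check idea, assuming the device and the pocket token share a secret (this is just standard RFC 6238 TOTP in Python; the names and secret are made up for illustration):

    import hashlib
    import hmac
    import struct
    import time

    def totp(secret: bytes, now: float | None = None, step: int = 30, digits: int = 6) -> str:
        """RFC 6238 time-based one-time password (HMAC-SHA1, truncated)."""
        counter = int((time.time() if now is None else now) // step)
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
        return str(code).zfill(digits)

    shared_secret = b"device-and-token-shared-secret"
    # The device and the token each display their code; if the two codes differ,
    # assume the device was tampered with while out of your hands.
    print(totp(shared_secret) == totp(shared_secret))  # True within the same time step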

Technologies like Secure Boot or a TPM may help protect the average user against a random thief or cracker. But in those cases, even open technologies that are fully controlled by the user will work (like free-software full-disk encryption).

            3. 2

Yes, it is called treacherous computing. And the fact that you cannot resell your tickets is just the tip of the iceberg.

              1. 1

                All modern computers and smartphones already have TPMs and support things like remote attestation. It is only a matter of time until such secure computing schemes gain widespread adoption.

Yeah, I’m honestly surprised they didn’t do that here. TOTP is easy, but I expect they could just as easily store an attested public key (Android, iOS) for the device, and have the barcode show a signature over the timestamp from that key. The work is certainly more involved, but we’re talking a difference of weeks, not months, of dev time for a single team.

Of course, I’m assuming a relatively flexible setup there. Maybe their ticket readers can do TOTP but don’t support the more complex cryptography of public keys. The data management is also harder: dealing with multiple devices for the same account becomes much harder, not to mention revocations or device migrations.
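
A rough sketch of that flow (Python with the cryptography package; the names are hypothetical, and on a real phone the private key would live in the secure element behind an attestation chain rather than in ordinary memory):

    import time

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    # Enrollment: the device generates a key pair and the server stores the
    # (attested) public key for that device.
    device_key = ec.generate_private_key(ec.SECP256R1())
    server_pubkey = device_key.public_key()

    # Showing the ticket: sign the current timestamp; the barcode would encode
    # the timestamp together with this signature.
    payload = str(int(time.time())).encode()
    signature = device_key.sign(payload, ec.ECDSA(hashes.SHA256()))

    # At the gate: verify the signature with the stored key and check freshness.
    def verify_barcode(payload: bytes, sig: bytes, max_age_s: int = 60) -> bool:
        try:
            server_pubkey.verify(sig, payload, ec.ECDSA(hashes.SHA256()))
        except InvalidSignature:
            return False
        return 0 <= time.time() - int(payload) <= max_age_s

    print(verify_barcode(payload, signature))  # True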

I hear the concerns of others about user control, but that ship has, I think, firmly sailed. If you’re running an unrooted Google Android build or iOS, your phone is already doing this. And maybe that’s mostly OK? I like to think of a TPM not as a computing device I own, but as a general-purpose copy-protection dongle that comes built into my hardware. I don’t think this should be an essential requirement for using your device, but I think it’s reasonable to have hardware-bound keys that are remotely verifiable as such.

                1. 4

Np, I like the approach and kinda want it to work in Firefox; maybe I will fiddle with a pull request at some point.

                  1. 3

It more or less works on Firefox right now. It seems to crash occasionally, but I’ve been able to use it. The only thing is I haven’t looked into packaging it.

                  2. 2

                    Hey, could you describe the format of the annotations? People have tried to do stuff like this before (in a more centralized way) and the hard — I would almost say “insoluble” — problem is how you refer to the text being annotated, when the page it’s on can change arbitrarily.

                    (IIRC, much of the complexity of Ted Nelson’s (in)famous Xanadu had to do with this. In Xanadu, links could point to any range within a document.)

                    1. 2

                      (IIRC, much of the complexity of Ted Nelson’s (in)famous Xanadu had to do with this. In Xanadu, links could point to any range within a document.)

This is kind of reminding me of Chrome’s fragment anchors: a worse-is-better version of Xanadu’s.

                      1. 2

Right now, the annotations contain a pair of (path, offset) tuples. The first (path, offset) indicates where an annotation starts, and the second one where it ends. The path is for selecting a DOM node, and the offset is for the textual offset within that node. I’m aware that this is not how it’s always done in other systems. For instance, I’ve seen a pattern-matching approach (“text that starts with A, and ends with B”); it seems like this would malfunction on pages where similar text is often repeated. On the other hand, that approach would prevent structural changes (wrapping the content in a div, say) from mangling the annotation.
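
A toy version of that model (Python, with an XML tree standing in for the DOM; Matrix Highlight’s actual types surely differ, and the names here are made up):

    import xml.etree.ElementTree as ET
    from dataclasses import dataclass

    @dataclass
    class Anchor:
        path: list[int]  # child indices from the root down to the target node
        offset: int      # character offset within that node's text

    @dataclass
    class Annotation:
        start: Anchor
        end: Anchor
        body: str        # the annotation's own text

    def resolve(root: ET.Element, anchor: Anchor) -> str:
        """Walk the child-index path to a node and return its text from offset on."""
        node = root
        for index in anchor.path:
            node = list(node)[index]
        return (node.text or "")[anchor.offset:]

    doc = ET.fromstring("<body><p>first</p><p>some annotated text</p></body>")
    note = Annotation(start=Anchor([1], 5), end=Anchor([1], 14), body="example note")
    print(resolve(doc, note.start))  # -> "annotated text"

Wrapping that second paragraph in a div would shift the path and break the anchor, which is exactly the failure mode the pattern-matching approach avoids.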

                        You’re completely right that pages changing presents a serious problem. There was a writeup by hypothes.is (I’m struggling to find it now) about the sort of annotation model they use, which involves several different schemas (e.g., a combination of both formats I’ve described above). I am hoping to apply a similar strategy to Matrix Highlight. However, it’s currently in its early stages, so I’ve had to worry about other parts of the software.

                        Ultimately, no annotation format is resilient to a page being completely replaced or fundamentally altered. Another hope of mine is to integrate with a web archiving technology (e.g. WARC) and store an archive of the page being annotated at the time it was accessed. This way, even if the original changes, the old, annotated version will remain intact.

                    2. 7

In other words, vi -> vim -> neovim would be a reasonable learning path, but beginners don’t do that, and the NeoVim team actively recruits people to their cause without any consideration for the importance of a progressive learning approach.

Wait, wha? I can’t use a tool because I didn’t learn its ancestors? I didn’t learn vim from vi, just like I am sure people forgot to take the journey of ed -> ex -> vi -> vim. Are we expected to learn emacs from TECO emacs -> gosling -> GNU -> XEmacs -> GNU?

NeoVim looks in $XDG_CONFIG_HOME for its configuration files, which means it follows the ~/.config/… location convention that is now the Linux standard. I love this! I love their concern for this standard. Unfortunately, after more than two decades, no one cares, because you are already maintaining your Vim configuration in a dotfiles repo and providing symbolic links.

I errr version ~/.config too, so ¯\_(ツ)_/¯. Also, is it that terrible to have

cat ~/.config/nvim/init.vim
                      " I am lazy and have lots of ~/.vimrc stuff, pretend for my old self
                      set runtimepath^=~/.vim runtimepath+=~/.vim/after
                      let &packpath = &runtimepath
                      source ~/.vimrc
                      

The second thing listed on NeoVim’s comparison page is the 42 different defaults from Vim. These are completely and totally irrelevant because anyone using Vim should always disable all the defaults and begin with a clean slate in their vimrc file.

You mean the defaults that all the distributions and every vimrc file in existence set? I guess I must force new users to learn why backspace is a bit weird and how that still relates to ex?

… the biggest being full shell integration for extensibility, not supporting Lua and NodeJS plugins. NeoVim has made itself into a serious joke among those who know and use Vi/m as has been done for decades for all the right reasons. … json_decode are just silly when commands like jq exist. They even renamed viminfo to shada for nothing but vain not-invented-here reasons. And Lua and Python support? Pffff. Please. You can be glad you learned to use Vi/m correctly and without a bunch of unnecessary bloat that would directly affect your performance on every other system with Vi while diminishing your ability to actually use your most powerful tool, the shell in which Vi/m is running.

json_decode is in vim too?

                      The if_perl has been dropped. Nothing screams “we are all morons” more than dropping Perl support from something that has had it for 2 decades just because you buy into the trendy Perl-hate.

I guess Python or Lua is bloat, but Perl is not? In some ways we should pour one out for Perl, but it is on the wane. Do you care that mzscheme is also gone now?

NeoVim removed several core tools used regularly by Vim users for seriously important use cases:

                      Maybe I am missing something?

“ex - binary not installed (vim does)”: nvim -e? I bet a symlink would work too (you know how vim does this, right?)

“:ex - not accessible from vim command line”: you got me, but I guess Q is a bad key?

“view - cannot run vi in read-only mode”: nvim -R?

                      … etc etc

                      Again, incredibly inexperienced decisions from people who never actually learned to use Vim for anything significant in the first place. The fact that they removed :shell completely confirms they don’t value shell integration which is the basis of all of Vim’s magical power. The fact that they removed vimdiff shows none of them have ever worked on any cybersecurity project of any significance.

Ok, so :shell is gone (I had to look it up; didn’t miss it since tmux and/or CTRL-Z and/or :!), but vimdiff is still there. I would be surprised if I suddenly could not do a git merge.

I kinda gave up at this point; it feels like “old man yells at cloud”.

                      1. 4

                        I use the ones I made :p

                        • broot for exploring files, navigating, searching content, etc.
• lfs, which more or less adds du and df plus parts of lsblk and some disk tagging
                        • rhit to query nginx logs

and some I didn’t make

                        • scmbreeze which has interesting shortcuts for git
                        • neovim, if it counts
                        1. 2

I like the idea of broot, but it glitched out between the UI and rm, and I wound up blowing away a lot of files by accident.

                          1. 2

I never got any such report. Would it be possible to have more details (in a GH issue or in the chat if needed)?

                            1. 1

Chat probably works; it’s hard to reproduce (and I don’t want to :))

I was using the previous sort-by-size magics (which seem to now be --sort-by-size), finding dirs, and then asking broot to delete.

That calls rm under the hood, which at the time freaked broot out (it tried to refresh and put me at the top level); because of lag I would end up hitting the key combination again and then … toasting the parent (a very large dir).

                              I might give it another spin and see if it lags out on me again

                        2. 1

Hmm, is this wise? The urban legend has always been that ASan has big security holes.

                          http://seclists.org/oss-sec/2016/q1/363

                          1. 4

The major vulnerability described there seems to be specific to setuid binaries, which Firefox is not.

                          2. 2

By the same measure, wifi speeds are getting faster too. Also, power consumption is lower on wifi.

                            1. 14

JavaScript, Haskell, and even Rust also have a bunch of these ‘features’ that need to be learnt. It’s just the nature of the beast, nothing specific to C.

                              1. 5

                                What are some examples from Haskell and Rust?

                                1. 8

Using Haskell for even moderately complex systems usually requires you to use (and learn) several language extensions that are GHC-specific and can add complexity to the language. It’s not uncommon to see a file with 6-10 language extensions.

                                  This isn’t necessarily a bad thing. The core language has had a conservative evolution and most of the extensions that you’ll actually use are safe and well-understood. It gives the programmer the ability to customize the language, which is neat. It’s not beginner-friendly, though. This isn’t a major problem for intermediate or advanced Haskell programmers, but it puts people off to the language, especially if no one tells them that they can use :set -XLanguageExtension in ghci to bring the language extension in and explore its effects.

                                  Rust, like C++, is going to seem impossibly baroque if you’re learning it because you have to (i.e. because you were put on a project that uses it) and don’t understand the reasons why certain decisions were made. It makes explicit a lot of the rules that are implicit in sound C++, and those just take time to learn. If you get into it because you heard that it was like Haskell, you’re going to be disappointed, because it’s designed to be much more low level.

                              2. 5

                                Yep. C’s just from a different era, where there was much less of a gap between the designer and the user. Stuff like this was par for the course – software and computers in general were more arcane and it was just sort of an accepted fact of life.

                                1. 6

I dug into C’s history in detail. The design specifics of C were done the way they were mostly because (a) the authors liked BCPL, which forced the programmer to micro-manage everything, and (b) they didn’t think their weak hardware could do anything better, and it occasionally dictated things. BCPL was actually shaped by (b), too. It wasn’t about design or arcana so much as (a) and (b). It then got too popular and fast-moving to redo everything, as people wanted to add stuff instead of rewriting and fixing stuff.

Revisionist history followed, to make nicer justifications for continuing to use an approach that was really chosen for personal and economic reasons on ancient hardware. That simple.

                                  1. 7

I would argue, however, that if C had not been such a strong fit for a certain (rather low, compared to what most of us do) level of abstraction, it wouldn’t have been successful. If C had been less micromanage-y, then the lingua franca for low/mid-level systems programming would be some other language from the thousands that we’ve never heard of. Maybe it would be better than C, and maybe not; it’s hard to say.

                                    1. 6

Modula-2 and Edison were both safer, and both were done on the same sort of machine. Easier to parse and easy enough to compile. Just two examples from people who cared about safety in language design.

                                      http://modula-2.info/m2r10/pmwiki.php/Spec/DesignPrinciples

                                      Modula-2 was designed for their custom Lilith machine:

                                      https://en.wikipedia.org/wiki/Lilith_(computer)

                                      These developments led to the Oberon language family and operating system:

                                      https://en.wikipedia.org/wiki/Oberon_(programming_language)

Also note that LISP 1.5 and Smalltalk-80 were ultra-powerful languages operating on weak hardware. I’m not saying they had to go with them, so much as that the design space for safety vs. efficiency vs. power tradeoffs was huge, with everyone finding something better than BCPL except the pair that preferred it. ;)

EDIT: Check your Lobsters messages, as I put something better in the inbox.

                                2. 4

C was less designed than organically grown over the past 40 years. Even if it were removed, you’re going to need to learn it to be able to read C.

Once you learn this, it’s not that big of a deal.

                                  1. 2

I think that not having a better macro syntax built into the language is just a byproduct of the fact that C fills a niche between usability and control. Speaking generally, if one were to standardize too many of these ‘shortcuts’, C might become more usable, but it might also become more bloated and infringe on access to low-level control. I think people want access to some low-level features without being forced to use assembly.

I’m not necessarily saying that this applies to do {...} while (0) (because IMO C should offer a better way to do this), but I think there’s a need to recognize the slippery slope of making higher-level/black-box things part of a language geared towards granular control.

                                    1. 4

                                      I think that not having a better macro syntax built into the language is just a byproduct of the fact that C fills a niche between usability and control.

The designers had a PDP-11 with a tiny memory/CPU, optimized for space/performance, preferred tedious BCPL, and didn’t believe in high-level languages or features like code as data. That combination, plus maybe backward compatibility, led to the preprocessor hack. It was really that simple. What you’re posting is revisionist, although probably unintentionally so.

                                      1. 2

I noticed all the people doing the more secure stuff intentionally went with a PDP-11/45. The difference must have been significant. In any case, they could’ve still done basic type-checking and such on the other one. My main counterpoint was that they could have had Modula-2-style safety by default, with checks turned off where necessary on a per-module, per-function, or per-app basis. All sorts of competing language designers did this. It’s hard to tell what would’ve been obvious in the past, but it seems to me they could’ve seen it and just didn’t care. Personal preference.

                                        1. 2

Thanks for the details! I think you’re right about using the earlier model to boost credibility. Due to my broken memory, I can’t remember specifically, but I know I read something along those lines in one of the historical papers.

                                  2. 2

Not really; it was more bolted on from some things that were floating around Bell Labs at the time. The original language designers had little to do with it.

To quote dmr:

Many other changes occurred around 1972-3, but the most important was the introduction of the preprocessor, partly at the urging of Alan Snyder [Snyder 74], but also in recognition of the utility of the file-inclusion mechanisms available in BCPL and PL/I. Its original version was exceedingly simple, and provided only included files and simple string replacements: #include and #define of parameterless macros. Soon thereafter, it was extended, mostly by Mike Lesk and then by John Reiser, to incorporate macros with arguments and conditional compilation. The preprocessor was originally considered an optional adjunct to the language itself. Indeed, for some years, it was not even invoked unless the source program contained a special signal at its beginning. This attitude persisted, and explains both the incomplete integration of the syntax of the preprocessor with the rest of the language and the imprecision of its description in early reference manuals.

                                3. 17

Levine’s classic. Probably the first online book I have referenced in a “publication”; my high-school (A-Levels) project was an x86 disassembler. Actually, it was a database course-management project, but I changed it a month before graduation, and my teacher refused to grade it; so she let me do disassembly with the tacit understanding that I wasn’t gonna get any help from her, and I was at the mercy of the outside graders. (I also changed the implementation language from Pascal to C and x86 assembly.)

It was my first real program. And I bled. The x86 binary format is not for the faint of heart, at least not for a non-programming teenager.

                                  And I didn’t do well ;-)

Trivia time: John Levine was (is?) the moderator of comp.compilers in the 90s, which I read religiously. He would edit posts with his own addenda (“[I think the Dragon book has this algorithm. -John]”), etc. I didn’t realize it was an edit, so I took to asking questions on Usenet and other online fora, and if I ever had a doubt about my question, I would add a “-John” addendum at the end with my own alternate theories. To this day, my teenage alias is archived with that embarrassing signature in perpetuity.

Another bit of trivia. One night I refused to go hang out with my teenage friends because I wanted to read up on SML/NJ and work through some of Norman Ramsey’s “Hacker Challenges”. Ramsey was then at Harvard, IIRC, and had a page of challenges for “elite” “hackers”. Me being a “blackhat” then, I totally misunderstood the label; I spent close to a month studying SML/NJ, because one of Ramsey’s challenges was an optimizing linker for SML/NJ. I pored over Levine’s book and all sorts of publications trying to live up to this challenge. I thought writing a linker for SML/NJ would label me “elite” and gain me admission into exclusive IRC channels for top criminals ;-)

That month was the last time I had casual friends for the next decade. Every one of those boys moved on, and I never noticed us growing apart. The next time I looked up from this “research”, it was two years later and I was by now a Unix programmer (up, or down, from a Win 9x script kiddie). I missed prom, homecoming, graduation, New Year’s Eves … the entire millennium, and I didn’t even care. I had better things to do.

I found a new, different kind of pride. I was no longer just another immigrant Somali kid “hustling” in America; I now had role models. I was better than bad-ass; I was curious. And I am grateful to this day that I did!

                                  1. 1

That is one hell of an interesting story. I am shocked at how far a few books took you in life.