1. 8

via AppSec Ezine - 172nd Edition

  1. 13

    Given how every paragraph contains the name of the product they have been using, I have to assume this is an advertisement.

    1. 6

      Also explains why they call it “potential vulnerability” instead of “bug”.

      1. 1

        If you look at the history of Medium posts by “Harry Lee”, they’re clearly connected to the PVS-Studio developers in some way, although that name doesn’t appear on the PVS-Studio team page. There’s a post authored by the company’s CEO in there, for a start.

      2. 12

        Most of these don’t look like vulnerabilities to me.

        1. [Comment removed by author]

          1. 13

            Downvoted for not telling us how easy it is to configure PVS-Studio.

          2. 3

            The unicorn logo is cool, though!

            1. 2

              Or you could say: “Thanks for bothering.”

            2. 6

              I checked all the bugs listed for wireless drivers, and all of them were added after the drivers were ported from OpenBSD to FreeBSD (in some cases the drivers listed exist only in FreeBSD).

              1. 5

                Well yeah… this is just another example of the difficulties involved in bringing software back from OpenBSD. You have to add your own bugs again!

              2. 5

                It’s sad, because the last time I tried to get a copy of PVS-Studio, it was the sort of “Call us and talk about pricing” thing (email in my case).

                Like, just give me a button to push and give you money. Why are you making this weird?!

                1. 3

                  Somewhere online are the slides from an early Coverity talk, which are darkly hilarious.

                  Selling code-checker tools turns out to be hard to start with, and it’s even more difficult to successfully make your sales process ‘low-touch’ (in marketing speak).

                  Edit: here’s an ACM article by the Coverity people: https://cacm.acm.org/magazines/2010/2/69354-a-few-billion-lines-of-code-later/fulltext Recommended reading for anyone who wants to ship a software tool, I think.

                  1. 2

                    That’s an excellent read, worthy of a submission of its own (hint, hint!). Thanks!

                    1. 1

                      Done!

                    2. 3

                      I would say supply and demand. I believe there is not enough demand for such tools, so if they let the market set the prices, prices would plummet. Having the ‘call us’ gatekeeper narrows the demand down to corporations, who are more willing to eat larger bills without telling anyone else on the market what most people are willing to pay for this.

                      1. 3

                        What mulander said is true but maybe understated. These tools are really hard to build. They usually start out with brilliant people in CompSci coming up with some new methods for statically analyzing things. Then they have to get the false positives down or handle weirder things the method isn’t suited for. That means piles of heuristics and/or separate techniques that need debugging, often by hand, since that’s not what their existing tooling was built for (the toy sketch at the end of this comment shows the kind of case that forces those heuristics). The only exceptions I’ve seen were those smart enough to do it in OCaml or something like that. Also, they need IDE integration, graphical tools, and helpful error messages. All of that is collectively piles of labor from a mix of brilliant people (expensive) and others (still five digits). All that for a small market where a startup’s QA tools are a hard sell, which drives up the prices charged per company, because they want profit more than testimonials. :)

                        Same reason the high-assurance OSes certified under the TCSEC were so expensive. Design of GEMSOS, with all the security-enhancing activities, cost $15 million. NSA said the evaluation cost them $50 million, but they’re also crazy inefficient (who knows). STOP OS in the XTS-400, due to high costs and low volume, was basically $100k a unit, with $250k for high availability with commercial support, since they spent the money on confidentiality & integrity, not availability. OpenVMS suddenly looks cheaper than the competition. The DO-178B stuff starts at $50,000. The cheapest model I’ve seen outside Cleanroom is Altran/Praxis’s Correct by Construction, using Z specs and SPARK Ada to get jobs done at a 50% premium. You still have to convince businesses to pay 50% more for a minimalist product (think suckless) than for a feature-packed one. “A Hard Sell” barely begins to describe that conversation.

                        So, to fix it, we probably need to recreate the model used with proof assistants and some languages, where academics keep FOSSing the critical components that were hardest to build. Then a non-profit startup, mostly staffed by paid academics to keep the cost of dedicated personnel down, sells basic tooling at a price that scales with the size of the customer: free for individuals with no support, cheap for small businesses, competitive for mid-sized, and so on. Consulting, especially training and support, is offered to keep dedicated personnel paid. As commercial income increases & more grants come in, they keep expanding the capabilities. If it’s designed to integrate well, then various universities or private contributors keep expanding what it can do, each working on their own components handling various things. This already happened with ACL2, Coq, Isabelle/HOL, OCaml, Racket, etc. Except for ACL2, those were mostly done without the commercial angle: just paid CompSci folks who kept adding capabilities to shared tools. There’s also open-source tooling from NASA in C++ to build… static analyzers or abstract interpreters, can’t recall which… that I’ve seen at least one project use.

                        Much potential for improvement here if CompSci teams do the heavy lifting and keep throwing their undergrads at it. We all benefit in the long term when something like Astree Analyzer becomes FOSS.
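
                        As a toy illustration of that false-positive problem (my own C sketch, not something taken from this thread or from any particular tool): a purely syntactic “pointer is dereferenced and only later compared against NULL” rule fires on both functions below, yet only the first one is a real bug. Telling them apart already takes flow-sensitive reasoning or extra heuristics.

                            #include <stddef.h>

                            struct pkt { int len; };

                            /* Real bug: p is dereferenced before the NULL check,
                               so the check comes too late to help. */
                            int pkt_len_buggy(struct pkt *p)
                            {
                                int len = p->len;      /* dereference */
                                if (p == NULL)         /* checked only afterwards */
                                    return -1;
                                return len;
                            }

                            /* Same syntactic pattern (a dereference followed by a later
                               NULL check), but harmless: p was already validated before
                               the dereference, and the second check is merely redundant. */
                            int pkt_len_ok(struct pkt *p)
                            {
                                if (p == NULL)
                                    return -1;
                                int len = p->len;      /* safe: guarded above */
                                if (p == NULL)         /* redundant, but not a bug */
                                    return -1;
                                return len;
                            }

                        Every trick for suppressing the second warning without losing the first is one of those hand-debugged heuristics.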

                      2. 3

                        I can find any number of “potential vulnerabilities” you want, as long as you don’t ask me to provide evidence to back up that term. I just believe that every line of code is brimming with potential!

                        1. 1

                          This is a really cool error check (search for it in the article):

                          PVS-Studio warning: V778
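
                          If I remember right, V778 is the “two similar code fragments found, probably a copy-paste typo” diagnostic. Here is a minimal sketch of the kind of slip it targets; the names and code are invented for illustration and are not taken from the article:

                              struct wifi_stats {
                                  unsigned int rx_errors;
                                  unsigned int tx_errors;
                              };

                              /* Hypothetical example: the tx block was copy-pasted from the
                                 rx block, but one "rx" was left behind, exactly the kind of
                                 duplicated-fragment typo a V778-style check looks for. */
                              static void sync_stats(struct wifi_stats *old,
                                                     const struct wifi_stats *cur)
                              {
                                  if (old->rx_errors != cur->rx_errors)
                                      old->rx_errors = cur->rx_errors;

                                  if (old->tx_errors != cur->tx_errors)
                                      old->tx_errors = cur->rx_errors;  /* bug: should be cur->tx_errors */
                              }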