  2.

    I’ll admit to only skimming, but I wasn’t inspired by their choice of examples: Heartbleed and the Sony hack. It’s like an article on human mortality that starts by discussing cancer and car crashes. Anything the two have in common is going to be pretty damn high-level.

    1.

      They should’ve said the PlayStation Network hack. It wouldn’t be that high-level then. Both came down to total negligence on vulnerabilities so common that a basic checklist or decent tooling would have prevented the problems. However, if you’re talking about the incentives of those in control, OpenSSL and Sony Pictures have a lot in common:

      http://www.cio.com/article/2439324/risk-management/your-guide-to-good-enough-compliance.html

      OpenSSL was doing something similar with its design strategies and implementation assurance. The same goes for the certification bodies, albeit at the lower, mostly bullshit, assurance levels. Your pals called them out nicely with LibreSSL, much like other people who knew secure coding or networks called out Sony Pictures’ management. ;)
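      To make the “basic checklist would’ve caught it” point concrete, here’s a minimal sketch of the Heartbleed-style flaw: an attacker-supplied length field is trusted without being checked against the bytes actually received. The function names and layout are illustrative, not the real OpenSSL code; the one-line length check is exactly the kind of thing a “validate all length fields” checklist item demands.

      ```c
      #include <stdio.h>
      #include <string.h>

      /* Vulnerable pattern (sketch): copies 'claimed_len' bytes even if
       * the real payload is shorter, leaking whatever memory sits past
       * the payload. No bounds check at all. */
      size_t echo_vulnerable(const unsigned char *payload, size_t claimed_len,
                             unsigned char *out) {
          memcpy(out, payload, claimed_len);   /* trusts attacker's length */
          return claimed_len;
      }

      /* Fixed pattern (sketch): reject any request whose claimed length
       * exceeds the bytes actually received, then echo only real data. */
      size_t echo_fixed(const unsigned char *payload, size_t actual_len,
                        size_t claimed_len, unsigned char *out) {
          if (claimed_len > actual_len)
              return 0;                        /* drop malformed request */
          memcpy(out, payload, claimed_len);
          return claimed_len;
      }

      int main(void) {
          unsigned char buf[64] = "hat";       /* only 3 real bytes sent */
          unsigned char out[64] = {0};
          /* Attacker claims 16 bytes but supplied 3. */
          size_t n = echo_fixed(buf, 3, 16, out);
          printf("%zu\n", n);                  /* prints 0: rejected */
          return 0;
      }
      ```

      Any fuzzer or static checker that flags attacker-controlled lengths flowing into `memcpy` finds this class of bug, which is the sense in which decent tools would have prevented it.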

    2.

      Without buy-in from managers, secure coding usually goes nowhere. The additional time needed to train developers and to ensure secure coding practices are followed has to be justified up front. Managers need to understand that development will slow down, at least in the short term (I think secure coding benefits development pace in the long run, but taking the long view is difficult when you have a release schedule to meet), and that the positives still outweigh the negatives.