1. 8
    1. 15

      It’s always hard to tell in cases like this where the truth lies, but for reference here’s a document from the Australian Department of Home Affairs which is meant to explain the bill:

      https://www.homeaffairs.gov.au/how-to-engage-us-subsite/files/assistance-access-bill-2018/explanatory-document.pdf

      Key text:

      Specifically, the Bill inserts a new Part 15 into the Telecommunications Act. This Part will: …

      • Allow the Director-General of Security, or the head of an interception agency, to issue a technical assistance notice requiring a designated communications provider to give assistance they are already capable of providing that is reasonable, proportionate, practicable and technically feasible. This will give agencies the flexibility to seek decryption in appropriate circumstances where providers have existing means to decrypt. This may be the case where a provider holds the encryption key to communications themselves (i.e. where communications are not end-to-end encrypted).

      If this is correct, they are not banning end-to-end encryption, nor insisting on back doors being implemented in any existing encryption. What they are requiring is that the communications provider assist the intelligence agencies in exploiting any pre-existing means of access.

      In fact:

      Systemic weaknesses or vulnerabilities cannot be implemented or built into products or services. The Bill expressly prohibits technical assistance notices or technical capability notices from requiring a provider to build or implement a systemic weakness or systemic vulnerability into a form of electronic protection

      If this is all correct, it doesn’t seem so bad and some of the noise against it is hyperbole.

      1. 2

        I’m reading it right now and yeah, Division 7—Limitations:

        A technical assistance notice or technical capability notice has no effect to the extent it requires a designated communications provider to implement or build a systemic weakness, or a systemic vulnerability, into a form of electronic protection. Electronic protection includes forms of encryption or passcode authentication, such as rate limits on a device.

        This limitation ensures that providers cannot be asked to implement or build so-called ‘backdoors’ into their products or services

        1. 2

          Right - but you could require Apple to sign a custom firmware (provided by police) that unlocked the Secure Enclave.

          Said firmware would then be accepted by any device, not just the target of the investigation; all it takes is one copy of the signed bundle to leak and you have a massive decrease in the security of everything using a T2 chip.
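
          To make that concrete, here’s a toy Python sketch. An HMAC stands in for the vendor’s signature, and the whole model is a deliberate simplification (real firmware signing also involves per-device nonces and much more); the point is just that a signature covering only the blob validates on every device, while binding the device’s unique ID into the signed message confines a leak to one device:

            import hashlib
            import hmac

            VENDOR_KEY = b"stand-in for the vendor's private signing key"

            def sign(message: bytes) -> bytes:
                # HMAC used purely as a stand-in for a real firmware signature.
                return hmac.new(VENDOR_KEY, message, hashlib.sha256).digest()

            def accepts(sig: bytes, blob: bytes) -> bool:
                # Non-personalised check: the signature covers only the blob,
                # so every device accepts the same signed bundle.
                return hmac.compare_digest(sig, sign(blob))

            def accepts_personalised(sig: bytes, blob: bytes, device_id: bytes) -> bool:
                # Personalised check: the device's unique ID is part of the
                # signed message, so a leaked bundle is useless elsewhere.
                return hmac.compare_digest(sig, sign(blob + device_id))

            firmware = b"custom unlock firmware"
            leaked = sign(firmware)  # signed once for one investigation, then leaks

            print(accepts(leaked, firmware))  # True - and on every other device too

            personalised = sign(firmware + b"target-device-id")
            print(accepts_personalised(personalised, firmware, b"target-device-id"))  # True
            print(accepts_personalised(personalised, firmware, b"someone-else-id"))   # False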

          1. 1

            I highly recommend reading the Limitations division if you’re interested in this. I think you could definitely argue that custom firmware:

            … would render systemic methods of authentication or encryption less effective.

            So the notice would therefore not have to be complied with.

            1. 1

              If you only plan on installing it once (or claim as much), there’s nothing systemic about it.

              1. 1

                Given:

                Said firmware would then be accepted by any device…

                I think you are claiming that:

                … would render systemic methods of authentication or encryption less effective.

                Are you not claiming that?

                1. 3

                  So, if the firmware blob is only intended to be installed on a single device, you would have a very hard time convincing a judge it represents a ‘systemic’ problem.

                  However, the existence of such a blob is, in fact, a systemic problem - because it only has to leak once to endanger every single iOS user.

      2. 2

        I believe the fear is that asking a client to send a copy of the key to the provider would not be held to constitute a systemic weakness.

        For example, does the law prohibit requiring Apple to keep a “backup” of your passcode? It says they can’t disable or bypass or remove rate limits, but is a backup a weakness?
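
        Here’s a toy Python sketch of why that question matters (the passcode, the escrow table and the bare PBKDF2 are all made up for illustration; real devices entangle the passcode with hardware keys). Under the rate limit, guessing is hopeless; with an escrowed “backup”, the rate limit is never even exercised, and nothing was disabled, bypassed or removed:

          import hashlib
          import secrets

          def derive_key(passcode: str, salt: bytes) -> bytes:
              # Toy model: the data-protection key is derived from the passcode.
              return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

          salt = secrets.token_bytes(16)
          real_key = derive_key("492817", salt)  # some 6-digit passcode

          # An attacker constrained by a 10-attempt rate limit:
          allowed_guesses = (f"{i:06d}" for i in range(10))
          print(any(derive_key(g, salt) == real_key for g in allowed_guesses))  # almost surely False

          # The same attacker reading a hypothetical passcode "backup":
          escrow = {"alice@example.com": "492817"}
          print(derive_key(escrow["alice@example.com"], salt) == real_key)  # True - zero guesses needed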

        1. 2

          I agree that the wording of the bill isn’t good (and it’s been widely criticised, including by the Law Council of Australia), and that the way it’s been rushed through is terrible. But I do not believe it was intended to allow such actions, and outside of the bill the government documentation (such as the explanatory memorandum I linked above) has made it reasonably clear (in my opinion) that it was not.

          But, yes, it’s bad if the wording of such a bill is ambiguous enough that such an argument could even be made. It’s just that I’m hearing people say “government is mandating all encryption must have a back door implemented”, which is a long way from the truth. I saw people on Twitter (https://twitter.com/benhutchingsuk/status/1070640169747918849) who seemed to seriously believe that, for example, Jira, the bug tracking system, could be forced to have a back door. The actual text of the Bill, as duplicated in the memorandum, seems to say:

          Although agencies may specify removing electronic protection in a technical assistance request and technical assistance notice, agencies may not require providers to build a capability to remove electronic protection under a technical capability notice

          That’s critical - they can ask for decrypted data if the provider can already decrypt it, but cannot ask for the functionality to allow decryption (i.e. removal of electronic protection) to be implemented otherwise. In my eyes that text would also cover asking for, e.g., a client program to be modified so that it would send end-to-end encryption keys to the provider.
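
          As a toy Python sketch of what such a client modification would amount to (the names and the escrow store are hypothetical; a real E2E client negotiates keys with a ratchet rather than a single random draw): the messages stay end-to-end encrypted on the wire, yet one extra line gives the provider the keys:

            import os

            provider_escrow = {}  # hypothetical: what the provider would quietly accumulate

            def start_session(conversation_id: str, backdoored: bool = False) -> bytes:
                # The end-to-end key, normally known only to the two endpoints.
                session_key = os.urandom(32)
                if backdoored:
                    # The entire "capability" is one extra line: hand the provider
                    # a copy of the key. No encryption is removed; every message
                    # still travels end-to-end encrypted.
                    provider_escrow[conversation_id] = session_key
                return session_key

            key = start_session("chat-with-bob", backdoored=True)
            print(provider_escrow["chat-with-bob"] == key)  # True - the provider can now decrypt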

          I’m not saying I like this bill - I think it sucks - just that I think its effect is being overstated.

        2. 2

          For example, does the law prohibit requiring Apple to keep a “backup” of your passcode? It says they can’t disable or bypass or remove rate limits, but is a backup a weakness?

          As I understand it, if Apple have your passcode then the law would definitely require them to provide it to law enforcement on receipt of an appropriate request, which law enforcement can issue if they have a suitable warrant. Part of me wants to argue that it is problem enough if Apple have your passcode at all, even if they aren’t legally obliged to provide it to law enforcement - but I’m certainly not a fan of this legislation.

      3. 1

        A lot hinges on the definition of “systemic weaknesses or vulnerabilities”.

        The original draft bill (August) didn’t define these, which meant they were left to the interpretation of the intelligence agencies themselves, and to the appeal process for any “designated communications provider” who challenged a request. The problem with that is that the whole process is secret (unlike the normal legal process, where more precise definitions can be established via common law in public courts).

        The parliamentary joint committee who reviewed the draft legislation recommended that “systemic weaknesses or vulnerabilities” be defined:

        Recommendation 9 (paragraph 2.10):

        The Committee notes the evidence of the Director-General of the Australian Signals Directorate that a “systemic weakness” is a weakness that “might actually jeopardise the information of other people as a result of that action being taken”. The Committee also notes the evidence of the Director-General of Security, that the powers in Schedule 1 will not be used to require a designated communications provider to do anything that jeopardises the security of the personal information of innocent Australians. Having regard to those assurances, the Committee recommends that the Bill be amended to clarify the meaning of the term ‘systemic weakness’, and to further clarify that Technical Capability Notices (TCNs) cannot be used to create a systemic weakness.

        This proposed definition from the Signals Directorate seems fairly reasonable to me. Under that definition, I think an organisation could appeal against anything like “just give us the key” or “just put this backdoor into everyone’s product and we’ll only use it for this really bad person”.

        When the law was passed last night, a last minute amendment was made to define “systemic weakness or vulnerability”. The legislative definition is:

        systemic vulnerability means a vulnerability that affects a whole class of technology, but does not include a vulnerability that is selectively introduced to one or more target technologies that are connected with a particular person. For this purpose, it is immaterial whether the person can be identified.

        systemic weakness means a weakness that affects a whole class of technology, but does not include a weakness that is selectively introduced to one or more target technologies that are connected with a particular person. For this purpose, it is immaterial whether the person can be identified.

        They also added a definition for “target technology” to the legislation (it includes “a particular carriage service”, “a particular electronic service”, “particular software” or “a particular update of the software”).

        However, there is no legislative definition of “class of technology”; that term seems to be entirely open to interpretation.

        It seems to me that it makes the law really broad in scope:

        • Is WhatsApp a whole class of technology? Are “all messaging apps” a whole class of technology? Are all phone apps a whole class of technology?
        • Is a company’s cloud service a whole class of technology? Is one server operated by a cloud service a whole class of technology? Are all servers operated by a cloud service a whole class of technology? Are all servers a whole class of technology?

        A definition of “class of technology” will probably be hammered out by the parties involved (ASIO, the ASD, the Attorney-General, anyone who appeals a request). And, hopefully, that definition will be sensible. But the definition won’t be made public, because of national security… that seems pretty bad.

        (Maybe the promised amendments coming next year will add this definition. But the text above has passed as law in Australia…)

    2. 2

      I’m an Australian, and I’m in something close to a state of despair between this legislation passing with the help of our main and utterly useless opposition party, and several unrelated non-technology political issues. Against seemingly all expert advice, both legal and technical (excluding, of course, the law-enforcement agencies lobbying for it), the legislation was passed without most of the politicians voting on it even having time to read it, spurred by a government scare campaign claiming these laws are necessary to prevent terrorism over the Christmas period.

      And this all comes shortly after Huawei was blocked from participating in the build of the national 5G network on the grounds that their technology might be backdoored by the Chinese government …

    3. 2

      For a lighter-hearted look at this, see the following parody (“honest” government ad). Be warned: contains some language that some may consider offensive / not safe for work.

      https://www.youtube.com/watch?v=eW-OMR-iWOE

    4. 2

      This is worrying. Do I have to start looking at Fastmail alternatives? Any recommendations?

      1. 2

        How do you think this will affect Fastmail, specifically? They do not provide encrypted email, and they already comply with Australian government search orders.

        1. 2

          I’m not an expert, but I don’t see how it would change anything for Fastmail; they already give the government anything it legally asks for.

      2. 1

        There is kolabnow.com (hosted Kolab) and also Migadu.com, both hosted in European countries that tend to care about privacy. There are hosting providers for Zimbra, Horde, etc. out there as well.

      3. 1

        I stumbled upon ProtonMail the other day. I haven’t tried it though: https://en.wikipedia.org/wiki/ProtonMail

        If you like pain, you can host your own mail server.

        1. 1

          I stumbled upon ProtonMail the other day. I haven’t tried it though: https://en.wikipedia.org/wiki/ProtonMail

          Caveat emptor regarding ProtonMail.

          1. 1

            Who or what is Duke-Cohan?

            I’m a happy customer of ProtonMail, but I don’t like the pain of self-hosting unless it’s solely my own shit, and Google is a no-go zone.

          2. 1

            Oopsy!

    5. 0

      The encryption law in question was passed.

      If you’ve got any kind of political sway in your organisation, you might want to encourage them to not buy software from Australian companies (hi, JIRA!) or at least not security-relevant software.

      1. 8

        I think it’s irresponsible to make this claim, because I don’t believe the legislation in any way forces or even encourages Australian companies to produce software that is any less secure. I think some parts of the legislation are questionable, but this particular claim (that it forces the creation of back doors) is FUD.

        1. 1

          Page 8 of the Explanatory Memorandum, emphasis as per source:

          Allow the Attorney-General to issue a technical capability notice, requiring a designated communications provider to build a new capability that will enable them to give assistance as specified in the legislation to ASIO and interception agencies.

          With the following “safeguard”, whatever that means:

          A technical capability notice cannot require a provider to build or implement a capability to remove electronic protection, such as encryption.

          1. 5

            A technical capability notice cannot require a provider to build or implement a capability to remove electronic protection, such as encryption.

            Right; i.e. it can’t force the creation of a back door. That’s what I’m saying.

            1. 2

              You can (e.g.) add a hidden participant to a WhatsApp chat whilst keeping it E2E encrypted (not hypothetical; it has happened).

              That leaks the content to whoever holds the extra participant’s key. I’d call it a backdoor even if the encryption hasn’t been removed.

              You can’t make them remove crypto, but you can make them build tech to give you the keys.
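
              Here’s a toy Python sketch of that hidden-participant idea (the XOR wrapping is a stand-in for encrypting the message key to each participant’s public key; real group messaging is far more elaborate): the server only needs to add one more recipient to the fan-out, and the ciphertext remains end-to-end encrypted throughout:

                import hashlib
                import os

                def wrap(recipient_key: bytes, msg_key: bytes) -> bytes:
                    # Stand-in for encrypting msg_key to one participant's public key.
                    pad = hashlib.sha256(recipient_key).digest()
                    return bytes(a ^ b for a, b in zip(pad, msg_key))

                def fan_out(participant_keys: dict) -> dict:
                    # One wrapped copy of a fresh message key per participant.
                    msg_key = os.urandom(32)
                    return {name: wrap(key, msg_key) for name, key in participant_keys.items()}

                members = {"alice": os.urandom(32), "bob": os.urandom(32)}
                members["ghost"] = os.urandom(32)  # hidden participant added server-side

                wrapped = fan_out(members)
                # "ghost" receives a wrapped copy of every message key, so all content
                # is readable - yet nothing about the E2E encryption itself changed.
                print(sorted(wrapped))  # ['alice', 'bob', 'ghost']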

              1. 1

                The wording is that it can only be specific, not “systemic” - if an agency asks you to systemically weaken security, you can’t be compelled to comply. This doesn’t apply to the “voluntary” Technical Assistance Requests.

                1. 1

                  Per my other reply - getting yourself inserted into one conversation is unlikely to be called a “systemic” weakening in the eyes of the law (even if the practical outcome is that security is weaker across the board).