If you’re not familiar with the conversation around Huawei, it goes like this:
Huawei makes infrastructure components for 5G networks.
They’re cheap and good enough that most providers would want to use them in building their 5G networks.
They’re a Chinese company, which means there’s a risk they could be compelled into helping their government spy (China even has a law to this effect).
The US doesn’t want Huawei equipment used in building 5G networks here, and doesn’t want its allies to, either.
The British government recently did a thorough security audit, and while it didn’t find any intentional back-doors, it didn’t exactly give them a glowing review on code quality or security best practices.
I’ll point you to an article from The Economist for that last one; as you might suspect, the debate continues on whether and how to use their technology. https://www.economist.com/leaders/2019/04/27/britain-strikes-an-artful-compromise-on-huawei-and-5g
I’m choosing that particular article because I submitted a version of the below as a letter to the editor, an excerpt from which was published in this week’s edition. https://www.economist.com/letters/2019/05/11/letters-to-the-editor
There are a few observations here which are broadly applicable:
It’s unlikely that Huawei will be willing or able to fix all of these issues in a timely fashion - and most companies are in a similar position. The ideal security posture is to fix not just the issues themselves, but also the business practices that allowed them to appear in the first place. In practice, though, an issue - especially one that isn’t clearly exploitable - can languish for a long time; you’d be hard-pressed to find a code base without some sloppy code or to-dos.
A huge chunk of security is just being well-organized. If you don’t know where all of your doors are (or all of your copies of OpenSSL), you are definitely going to miss one when you change the locks.
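To make the “know where your doors are” point concrete, here’s a minimal sketch of an OpenSSL inventory in Python. The scan root and file-name patterns are illustrative assumptions, not a complete audit procedure; the idea is just that every bundled copy is a lock you’ll need to change when the next CVE lands.

```python
# Sketch: inventory the copies of OpenSSL an environment depends on.
import ssl
from pathlib import Path

def linked_openssl_version() -> str:
    """Report the OpenSSL version this Python interpreter was built against."""
    return ssl.OPENSSL_VERSION

def find_bundled_libssl(root: str) -> list[str]:
    """Walk a directory tree and flag shared libraries that look like
    OpenSSL - each hit is a 'door' to account for when patching."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.name.startswith(("libssl", "libcrypto")):
            hits.append(str(path))
    return sorted(hits)

if __name__ == "__main__":
    print(linked_openssl_version())
    for hit in find_bundled_libssl("/usr/lib"):  # example scan root
        print(hit)
```

In a real environment you’d extend this to container images, vendored source trees, and statically linked binaries - the copies you forgot about are exactly the ones that stay unpatched.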
In the hands of an attacker, a vulnerability can be just as useful as an intentional back-door. An inventory of where vulnerabilities may lie is similarly useful. In the cyber-realm, there’s no such thing as “your” weapons and “my” weapons. A weapon belongs to anyone and everyone who knows it exists.
People who know more about security than I do use the term “security by obscurity” with disdain. The most secure system is one that many people, with different perspectives and motivations, are looking at. That reduces the chance that any given hole is found only by a malicious party who keeps it to themselves. This is why banks don’t write proprietary encryption algorithms; they use ones that you can read about on Wikipedia.
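As a small illustration of relying on public, heavily analyzed primitives rather than a homemade scheme, here’s a sketch of message authentication using only Python’s standard library. The construction (HMAC with SHA-256) is openly specified in RFC 2104 - nothing here is invented, which is the point.

```python
# Sketch: authenticate messages with open, well-reviewed primitives.
import hashlib
import hmac
import secrets

def sign(key: bytes, message: bytes) -> bytes:
    # HMAC-SHA256: a public, heavily analyzed construction.
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    # Constant-time comparison avoids leaking where the tags differ.
    return hmac.compare_digest(sign(key, message), tag)

key = secrets.token_bytes(32)            # 256-bit random key
tag = sign(key, b"wire transfer: $100")
assert verify(key, b"wire transfer: $100", tag)
assert not verify(key, b"wire transfer: $999", tag)
```

The security rests entirely on keeping the key secret, not the algorithm - anyone can read how HMAC works, and that public scrutiny is what makes it trustworthy.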
In this particular case there’s a geopolitical dimension that we generally don’t need to consider at our day jobs. Britain’s spies now have capabilities to compromise Huawei’s gear that could rival those of China’s own spies. That’s a powerful deterrent. (There is, however, still a big asymmetry in the willingness to use such capabilities - which I’d argue should be factored in as a large discount whenever democracies weigh offensive capabilities, and which should also keep us from inserting our own back-doors into companies’ software.)
Little of this is unique to Chinese companies, or even foreign ones. Security audits like this should be common for more technical vendors. The costs should be borne by the vendors, much as banks bear the cost of deposit insurance.
Putting all of that together: if you restrict imports of a product but not exports of a competing one, adversaries can still study your equipment for weapons to use against you. And if you fail to audit your domestic vendors, you’re effectively giving adversaries an exclusive back-door.