The government’s new rules for device manufacturers to ensure digital security would seem well-intended against the backdrop of the increasing frequency and intensity of digital attacks—CERT-In recently warned of a phishing wave. But the rules create their own monster. Business Standard reports that mobile-makers are seeing red over a government proposal that makes it mandatory for them to share source code for security testing, though the draft Indian Telecom Security Assurance Requirements do not have any such provision. Source code is the fundamental software of a phone, powering its apps and user interface. As per the proposal, the source code is to be tested by a third party, which may take 12-16 weeks. If such a provision is indeed approved, an Apple would not be able to launch its phones in India at the same time it does across the rest of the world, and would be handing over trade secrets and proprietary software to the government and third-party private labs without any assurance that these would not be leaked to other manufacturers. The absurdity, however, does not end there—all future updates will also need to be tested in a similar fashion before release. So, a phone update in India may come months after it is released elsewhere. It isn’t hard to imagine what this will do for piracy and the expansion of illegal trade.
Data breaches can have serious costs, given most users freely upload financial and health information on to their phones. But even the strictest regimes in the world do not require companies to share source code. In China, copyright registration requires that only the first 10/30 and last 10/30 pages of code be shared with the government. Indeed, sharing source code to detect vulnerabilities could spawn its own industry of information attacks if data on vulnerabilities gets leaked to those who benefit from it.
Moreover, given the government already requires companies to submit reports on vulnerabilities and other security aspects as assessed by third parties, it makes little sense to repeat this exercise. A better approach would be to have a government team of testers red-flag security concerns to mobile phone manufacturers. In such a scenario, the government would be able to highlight security lapses at the time of beta-testing—most companies do release beta versions—and at the final release. More important, this way, it would be able to create standards and guidelines and a network of cybersecurity professionals within the country. Security concerns are better dealt with by multiple stakeholders—professionals, industry representatives and academia—studying the security architecture than by having a company sign away its trade secrets to the government and a third party. This would also translate into India cultivating a top-class cadre of cybersecurity professionals.