I recently learned that my company prefers closed-source tools for privacy and security.
I don’t know whether the person who said that was simply confused, but I am trying to come up with reasons one would opt for closed source in the name of privacy.
Security through obscurity isn’t security.
The classic example:
I have a website with no authentication which displays data that really should be locked down. But it’s OK because I never told anyone else the URL so no one will find it.
I never told anyone else the URL so no one will find it.
Who wants to tell them about DNS records and web crawlers?
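The crawler half of that is easy to demonstrate: any page anywhere that links to the “secret” URL hands it to a crawler. A minimal sketch using Python’s stdlib HTML parser — the page content and URLs here are made up for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href a crawler would follow on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page that accidentally links to the "hidden" dashboard.
page = """
<html><body>
  <a href="/about">About</a>
  <a href="https://internal.example.com/secret-dashboard">debug</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)
# → ['/about', 'https://internal.example.com/secret-dashboard']
```

One stray link in a footer, a changelog, or a pasted support ticket is all it takes; once a crawler has it, the URL is indexed for everyone.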
My past employers have said the same, until I showed them they were already using Apache, nginx, PostgreSQL, MariaDB, and OpenWrt, among other things.
A lot of shops think that using proprietary tools means they can demand fixes for critical vulnerabilities, but in my experience, even proprietary dev teams just reply that the code maintainers are aware and working on a fix.
Apache vuln? Here’s the link to their acknowledgment of that CVE and exactly what modules are affected.
That may show that the flaw is in an unused module, like node.js, but even when it is applicable, they just wait for the upstream maintainers to address it. They take no responsibility themselves.
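That “is the affected module even loaded?” check can be scripted: intersect the modules an advisory names with what the server actually loads. A sketch with made-up module names; the loaded set is hard-coded here, where in practice you would capture it from something like `apachectl -M`:

```python
# Modules named in a (hypothetical) CVE advisory.
affected = {"mod_lua", "mod_proxy_ajp"}

# In practice, capture this from `apachectl -M` output;
# hard-coded here so the sketch is self-contained.
loaded = {"mod_ssl", "mod_rewrite", "mod_proxy", "mod_headers"}

exposed = affected & loaded
if exposed:
    print(f"Vulnerable modules loaded: {sorted(exposed)}")
else:
    print("Advisory applies to modules this server does not load.")
```

In this made-up case the intersection is empty, which is exactly the “flaw is in an unused module” outcome described above.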
Anti-libre software bans us from fixing it, bans us from control.
In my experience the “privacy and security” argument is a smokescreen.
The real reason is that it makes someone else responsible for zero-days occurring, for the security of the tool, and for fixing security problems in the tool’s code. With open-source tools the responsibility shifts to your own cybersecurity team, which at minimum has to audit the code.
I don’t know about your workplace, but there’s no one qualified for that at my workplace.
A good analogy: If you build your house yourself, you’re responsible for it meeting local building codes. If you pay someone else to build it, you can still have the same problems, but it’s the builder’s responsibility.
That smokescreen argument makes a lot of sense. Both the company and our clients tend to opt for ready, out-of-the-box proprietary solutions instead of taking responsibility for the maintenance.
It doesn’t matter how bad or limiting that proprietary option is. As long as it somewhat fits our scenario and requires less code, it’s fine.
It doesn’t matter whether the code is open here. Depending on what your company does, it might be cheaper to buy ready-to-use products from a vendor than to pay software/sysadmin people to review, deploy, and maintain things. It can even be required by law. Needless to say, there are many software vendors selling support contracts for open software, either hosted or fully deployed and supported. Still, in many fields, like medical, vendor lock-in means there isn’t much feature-complete open software, and you need the programs to be reliable, usable by non-technical people, and virtually unchanged over a long time. Providing those guarantees without depending on proprietary vendors means starting your own software company (and perhaps opening up your work so it doesn’t become just another piece of closed software), and nobody does that.
Security works kinda the same. But in these contexts, if someone uses “privacy and security” together like this, it’s probably just BS.
instead of taking responsibility
This is why they prefer to shift the blame in case it hits the fan. That’s all, that’s it.
They don’t care about code quality, maintainability, or whatever. When you get right down to it, it’s all risk management.
I recently learned that my company prefers closed-source tools for privacy and security.
I will suggest that same logic to my banker too: a vault whose key they won’t own, but I will. Don’t worry, all your money will be safe with me, it’s a promise 😇
Pinky promise
There is some logic here: having a business relationship with a party that now has a contractual duty to you is a stronger guarantee than an open-source project.
For instance, Windows is source-available to many businesses, so in one sense it’s open source and in another it’s closed source. From a business perspective, that’s a reasonable trade-off sometimes.
Tin-foil hat on. So, with CCP/GSP, secret agencies are free to find backdoors in the system.
I didn’t know about those programs. I thought the Windows source code is kept secret from everyone.
We are banned from fixing backdoors. Conspiracy? Derailment strategy.
Closed source does for privacy and security what sweeping problems under the rug does: it mitigates them a bit, but when they inevitably do hit, they hit hard.
Best reason: nobody sees how bad your code is 🤷‍♂️
A very common strategy to divert blame away from yourself is using fake security as a cover story: infecting yourself with anti-libre software, so you are banned from fixing its source code. Also, saying “open source” is a strategy to derail libre software.
You can make an argument for confidentiality making it harder to find exploits in your code. If nobody cares enough to report them to you, or if you don’t have the resources to fix them, open-sourcing your code just exposes them.
This is pretty much only an argument if you use stuff that would be irresponsible to use in the first place tho
If nobody cares enough to report them to you, or if you don’t have the resources to fix them
To be fair, this scenario does feel worryingly like it might be common.