Proprietary Insecurity
This page lists clearly established cases of insecurity in proprietary software that have grave consequences or are otherwise noteworthy.
It would be incorrect to compare proprietary software with a fictitious idea of free software as perfect. Every nontrivial program has bugs, and any system, free or proprietary, may have security holes. That in itself is not culpable. But proprietary software developers frequently disregard gaping holes, or even introduce them deliberately, and the users are helpless to fix them.
-
FitBit fitness trackers have a Bluetooth vulnerability that allows attackers to send malware to the devices, which can subsequently spread to computers and other FitBit trackers that interact with them.
-
“Self-encrypting” disk drives do the encryption with proprietary firmware so you can't trust it. Western Digital's “My Passport” drives have a back door.
-
Mac OS X had an intentional local back door for 4 years, which could be exploited by attackers to gain root privileges.
-
Security researchers discovered a vulnerability in diagnostic dongles used for vehicle tracking and insurance that let them take remote control of a car or lorry using an SMS.
-
Crackers were able to take remote control of the Jeep “connected car”.
They could track the car, start or stop the engine, activate or deactivate the brakes, and more. I expect that Chrysler and the NSA can do this too.
If I ever own a car, and it contains a portable phone, I will deactivate that.
-
Hospira infusion pumps, which are used to administer drugs to a patient, were rated “least secure IP device I've ever seen” by a security researcher.
Depending on what drug is being infused, the insecurity could open the door to murder.
-
Due to bad security in a drug pump, crackers could use it to kill patients.
-
The NSA can tap data in smart phones, including iPhones, Android, and BlackBerry. While there is not much detail here, it seems that this does not operate via the universal back door that we know nearly all portable phones have. It may involve exploiting various bugs. There are lots of bugs in the phones' radio software.
-
“Smart homes” turn out to be stupidly vulnerable to intrusion.
-
The insecurity of WhatsApp makes eavesdropping a snap.
-
It is possible to take control of some car computers through malware in music files. Also by radio. Here is more information.
-
It is possible to kill people by taking control of medical implants by radio. Here is more information. And here.
-
Lots of hospital equipment has lousy security, and it can be fatal.
-
An app to prevent “identity theft” (access to personal data) by storing users' data on a special server was deactivated by its developer after it discovered a security flaw.
That developer seems to be conscientious about protecting personal data from third parties in general, but it can't protect that data from the state. Quite the contrary: confiding your data to someone else's server, if not first encrypted by you with free software, undermines your rights.
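As a concrete illustration of encrypting your data yourself before it reaches anyone else's server, you can use GnuPG, which is free software. This is a minimal sketch; the file names and passphrase are hypothetical, and in real use you would let gpg prompt for the passphrase rather than putting it on the command line.

```shell
# Hypothetical file; the inline passphrase is for this sketch only --
# in real use, omit --batch/--passphrase and let gpg prompt for it.
echo "private notes" > notes.txt

# Encrypt locally with GnuPG (free software) before anything is uploaded;
# only the ciphertext notes.txt.gpg would ever leave your machine.
gpg --batch --yes --pinentry-mode loopback --passphrase "example-passphrase" \
    --symmetric --cipher-algo AES256 --output notes.txt.gpg notes.txt

# Later, decrypt the downloaded copy locally with the same passphrase.
gpg --batch --yes --pinentry-mode loopback --passphrase "example-passphrase" \
    --decrypt --output notes-restored.txt notes.txt.gpg
```

Because the encryption and decryption both happen on your own machine with software you control, the server operator (and anyone who compromises or subpoenas it) sees only ciphertext.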
-
Some flash memories have modifiable software, which makes them vulnerable to viruses.
We don't call this a “back door” because it is normal that you can install a new system in a computer given physical access to it. However, memory sticks and cards should not be modifiable in this way.
-
Replaceable nonfree software in disk drives can be written by a nonfree program. This makes any system vulnerable to persistent attacks that normal forensics won't detect.
-
Many smartphone apps use insecure authentication methods when storing your personal data on remote servers. This leaves personal information such as email addresses, passwords, and health data vulnerable. Because many of these apps are proprietary, it is hard or impossible to know which apps are at risk.