
I support free and open-source software (FOSS) like VLC, qBittorrent, LibreOffice, GIMP...

But why do people say that it's as secure as, or more secure than, closed-source software?

From what I understand, closed-source software doesn't disclose its code.

If you want to see the source code of Photoshop, you actually need to work for Adobe. Otherwise, you need to be some kind of freaking reverse-engineering expert.

But open-source projects have their code available to the entire world on websites like GitHub or GitLab.

Isn't that actually also helping hackers?

(page 2) 18 comments
[–] theunknownmuncher@lemmy.world 5 points 6 days ago* (last edited 6 days ago)

If you want to see the source code of Photoshop, you actually need to work for Adobe. Otherwise, you need to be some kind of freaking reverse-engineering expert.

What you're describing is known as "security through obscurity", the practice of attempting to increase security of a system by hiding the way the system works. This practice is highly discouraged, as it is known to not actually be effective at increasing the security of a system.

Security by obscurity alone is discouraged and not recommended by standards bodies. The National Institute of Standards and Technology (NIST) in the United States recommends against this practice: "System security should not depend on the secrecy of the implementation or its components."

https://en.wikipedia.org/wiki/Security_through_obscurity#Criticism
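
To make that concrete, here's a toy sketch (the "secret" cipher below is invented purely for illustration): its only protection is that attackers don't know the scheme, so the moment the design leaks or gets reverse-engineered, it collapses instantly. A published algorithm like AES survives exactly this scenario, because the secrecy lives in the key rather than the design (Kerckhoffs's principle).

```python
# Toy "security through obscurity" scheme: a single-byte XOR whose only
# protection is that nobody knows it's a single-byte XOR.
def secret_encrypt(plaintext: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in plaintext)

ciphertext = secret_encrypt(b"launch code: 1234", key=0x5A)

# Once the scheme is known, "breaking" it is a 256-try brute force;
# there is no real key space protecting the message.
for guess in range(256):
    candidate = bytes(b ^ guess for b in ciphertext)
    if b"launch" in candidate:
        print(guess, candidate)  # prints: 90 b'launch code: 1234'
        break
```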

Isn't that actually also helping hackers?

No. Sharing the implementation details of the system helps those trying to keep it secure, by allowing anyone to inspect the code, discover security flaws, and contribute fixes.

Open-source software is not perfect and is susceptible to security flaws and vulnerabilities, but it is better and more secure than closed-source software in every way. Every risk that applies to open-source software also applies to closed-source software, but worse.

[–] Max_P@lemmy.max-p.me 4 points 6 days ago

It helps hackers, sure, but it also helps the community vet the overall quality of the software and warn others not to use it. When it's closed source, you have no choice but to trust the company behind it.

There are several FOSS apps I've encountered where I looked at the code and passed because it was horrible. Someone will inevitably write a blog post about how bad the code is, warning people not to use the project.

That said, the code being public for everyone to see also inherently puts a bit of pressure on you to write good code, because the community will roast you if it's bad. And FOSS projects are usually backed either by a company or by individuals with a passion: for the former, there's the incentive of keeping a good image, because no company wants to be publicly exposed cutting corners; the latter is, well, passion-driven, so it's usually written reasonably well too.

But the key point really is, as a user you have the option to look at it and make your own judgement, and take measures to protect yourself if you must run it.

Most closed-source projects are vulnerable because of pressure to deliver fast, and nobody will know until it gets exploited. This leads to really bad code that piles up over time. Try to sneak some bullshit into the Linux kernel and there will be dozens of news articles and YouTube videos about Linus' latest rant at the guilty party. That doesn't happen in private projects; you get an LGTM because the sprint is ending and sales already sold the feature to a customer for next week.

It's relatively easy. First of all, if someone implemented a backdoor, it would be much easier to find, since you can look at the code directly. Second, a lot of people actually do this: look at the code of projects and search for security holes.

So even if it isn't that much more secure than closed source, it's much easier to trust, simply because people can search for vulnerabilities much more easily.

One great example of how open-source code makes it easier to notice backdoors is the xz security breach.

[–] octopus_ink@slrpnk.net 4 points 6 days ago* (last edited 6 days ago)

In addition to the other good points -

Open-source software allows you, in theory, to create reproducible builds: you can take the source code provided by the vendor (which you can theoretically audit to ensure it takes no unexpected action, or whatever your concerns are), compile it, and get something that can be validated as 100% identical to the compiled version you'd get from the vendor directly.
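
As a minimal sketch of what that validation looks like (the paths below are hypothetical), the final check really is just a byte-for-byte comparison, typically done by hashing both binaries:

```python
# Compare a binary you compiled yourself from the published source
# against the binary the vendor ships. Paths are hypothetical.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

local = sha256_of("./my_build/app")   # built locally from source
vendor = sha256_of("./vendor/app")    # downloaded from the vendor
print("reproducible: binaries match" if local == vendor
      else "MISMATCH: the shipped binary differs from the source build")
```

Of course, getting the two builds bit-identical in the first place (same compiler, flags, timestamps, and so on) is the hard part, which is what the Reproducible Builds project (reproducible-builds.org) works on.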

Without this assurance, the vendor can tell you that the thing they sold you does XYZ, but unless you're looking for it in your network traffic, for example, you might not know it's uploading webcam pictures of you in the background to ihackedyourwebcam.com, or collecting and transmitting telemetry you didn't agree to. Or you don't realize that the software you installed on your server has a hardcoded hidden administrator user with password 123456, etc.
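
To illustrate how easy that kind of check becomes once the code is public, here's a toy audit script (the path, extensions, and pattern are hypothetical; real tools like gitleaks do this far more thoroughly):

```python
# Naive scan of a source tree for hardcoded credentials, the sort of
# trivial audit anyone can run when the source is available.
import os
import re

PATTERN = re.compile(r"(password|passwd|secret)\s*=\s*['\"][^'\"]+['\"]", re.I)

for root, _dirs, files in os.walk("./some_project"):  # hypothetical path
    for name in files:
        if not name.endswith((".py", ".c", ".go", ".js")):
            continue
        path = os.path.join(root, name)
        with open(path, errors="ignore") as f:
            for lineno, line in enumerate(f, start=1):
                if PATTERN.search(line):
                    print(f"{path}:{lineno}: {line.strip()}")
```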

Also, while Linux in particular is by no means perfect, as a Linux user I know I'm using software that is much more likely to also be used by people who WILL take the time to inspect the code themselves, or to audit what it does on their network, or any of a bunch of other hard-to-quantify things that add layers of security to the ecosystem around FLOSS.

[–] brucethemoose@lemmy.world 3 points 6 days ago* (last edited 6 days ago)

Exploits in a lot of closed-source software come from really stupid/simple things they'd get ridiculed for if the code were open.

In other words, I think being open creates “pressure” for code to be presentable and auditable. That, and there’s tons of opportunity and incentive for dysfunction with closed source stuff, like sitting on known exploits.

…That being said, it isn’t universal. Is a lone hero dev maintaining some open library going to be more effective at security coverage than a huge commercial team? Probably not.

Does the software for nuclear bomb security need to be public? Probably not.

[–] thatcrow@ttrpg.network 2 points 6 days ago

Assumed by who?

[–] Deestan@lemmy.world 2 points 6 days ago* (last edited 6 days ago)

If Adobe-or-Whatever has an undisclosed vulnerability, a few hundred people could easily already know about it due to working there. It can be due to bugs, or intentional backdoors required by corporate HQ or government.

They will leak this information. Either by accident or for financial gain. Those people will re-sell it to other shady people.

Now you sit on software where an unknown number of third parties can hack your shit. And you don't know about the vulnerability, what is at risk, how to protect yourself, or who from.

You can mostly trust corpos to protect against general hackers to some extent, but backdoors, whether demanded by a government or created for their own purposes, they will simply keep secret.

Sony's rootkit fuckery is probably the biggest example I can give, but there are tons more. Anti-piracy software is a historically frequent offender.

[–] snek_boi@lemmy.ml 2 points 6 days ago

Your post is similar to one I saw some time ago. That old post has a reply of mine, and I’ll paste it here:

The problem you’re describing (open sourcing critical software) could both increase the capabilities of adversaries and also make it easier for adversaries to search for exploits. Open sourcing defeats security by obscurity.

Leaving security by obscurity aside could be seen as a loss, but it’s important to note what is gained in the process. Most security researchers today advocate against relying on security by obscurity, and instead focus on security by design and open security. Why?

Security by obscurity in the digital world is very easily defeated. It’s easy to copy and paste supposedly secure codes. It’s easy to smuggle supposedly secret code. “Today’s NSA secrets become tomorrow’s PhD theses and the next day’s hacker tools.”

What's the alternative for the military? If you rely on security by design and open security for military equipment, it’s possible that adversaries will get a hold of the software, but they will get a hold of software that is more secure. A way to look at it is that all the doors are locked. On the other hand, insecure software leaves supposedly secret doors open. Those doors can be easily bashed by adversaries. So much for trying to get the upper hand.

The choice between (1) security by obscurity and (2) security by design and open security is ultimately the choice between (1) insecurity for all and (2) security for all. Security for all would be my choice, every time. I want my transit infrastructure to be safe. I want my phone to be safe. I want my election-related software to be safe. I want safe and reliable software. If someone is waging a war, they’re going to have to use methods that can actually create a technical asymmetry of power, and insecure software is not the way to gain the upper hand.

[–] neons@lemmy.dbzer0.com -1 points 4 days ago

Because people assume someone is auditing it. Which is wrong: most of the time, nobody is auditing.

[–] chunes@lemmy.world 1 points 6 days ago

Helping hackers is the whole point. They can read the source code and report problems with the software.
