xz-utils backdoored

the possibility exists that we'd never find out about it.
That's trivially true (the same goes for open-source affairs), as is the possibility that the companies involved (or their clients) would have found out at some point, or would only have found out far too late. The possibility always exists; it's just strange for someone to say otherwise (and then, a couple of lines later, concede that maybe not).

That's a bit of an assumption. The reality is: We really have no idea how proprietary software development progresses behind the scenes, or how efficient it is for others to view commit history.
I imagine a lot of people here, like me, have worked in proprietary software development, and it is pretty much exactly what you'd imagine: basically all of them use git (or something similar), so viewing commit history is just as easy.

it's harder for malicious actors to infiltrate the proprietary software development chain is a bit of a stretch
It is easy to list reasons it would be harder: a lot of the time you physically need to be somewhere, some positions don't accept foreign nationals, lie-detector tests and background checks are used during the interview process, and the legal consequences if you get caught are much more straightforward. It would be hard to name a single aspect for which that's not the case.
 
"But open source is more secure because people spend all their waking hours reading over lines of code to make sure nothing malicious gets in"

Reality is, no they do not. Open source also allows the bad guys to find ways in, just as much as people can find holes and get them patched. But at least with open source, more eyes can dig into it when needed, vs. waiting for MS or someone else to decide whether to patch something. I would have expected far more scrutiny of a repo being tied into distros by said teams...

This is why things are so insecure: people just tie into a GitHub repo, assume it is safe and clean because it is on GitHub, copy-paste and off they go...

But, in the end, SSH should not be accessible on the internet, and any critical or prod server should have outbound internet blocked altogether, with specific ACLs for anything that does need to hit the internet (web servers).
It's funny, because a Microsoft employee was the one who found the backdoor, because Microsoft uses Linux. The disclosure was coordinated so that fixes could be implemented as quickly as possible to stop something bad from happening. The person involved was named Jia Tan. Take a guess what country they're from.

View: https://youtu.be/H_XNSDneR5g?si=XN3oBnwhHKDxzDxk
 
With so many eyes poring over the code, there's no chance a backdoor or malicious actor could be kept secret in the open source community.
I am not sure there are that many people looking at, and understanding, the code for many of them. Long-time maintainers can get tired, and a backdoor can be built up over many small, innocuous-looking changes.

https://arstechnica.com/information...y-used-open-source-software-to-steal-bitcoin/
According to the Github discussion that exposed the backdoor, the longtime event-stream developer no longer had time to provide updates. So several months ago, he accepted the help of an unknown developer. The new developer took care to keep the backdoor from being discovered

Changes in the Linux kernel will have many eyes on them, but there are millions of projects out there. Even in important parts of a Linux distro, an issue can stay unnoticed for a long time (2-3 years), and that one wasn't even someone being paid to hide it:
https://freedom-to-tinker.com/2013/09/20/software-transparency-debian-openssl-bug/
 
With so many eyes poring over the code, there's no chance a backdoor or malicious actor could be kept secret in the open source community.
Mostly, people aren't looking over open source code. It's great that you can, but it's clear that very few do.

And even if you do see something that looks weird, most of the time you just quietly use something else. Or hold your nose and use it, because OpenSSL may be bad, but GnuTLS is worse, and NSS is weird, and libressl didn't exist at the time, and BoringSSL wasn't public.
 
Mostly, people aren't looking over open source code. It's great that you can, but it's clear that very few do.

And even if you do see something that looks weird, most of the time you just quietly use something else. Or hold your nose and use it, because OpenSSL may be bad, but GnuTLS is worse, and NSS is weird, and libressl didn't exist at the time, and BoringSSL wasn't public.

This wasn't some seat-of-your-pants attack; it was well orchestrated by someone quite knowledgeable about SSH and systemd dependencies - certain technical defenses were even disabled by the insertion of a single dot in an autoconf file. As stated before, you could have audited the source code and chances are you wouldn't have found anything, as the payload was injected during the build process, and only if certain conditions were met - one of those conditions was that the software was built from the release tarball rather than directly from git, with the build scripts injecting the malicious code. Furthermore, there's every chance this attack had been in planning for a good two years.
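For anyone curious, checking for this class of problem is conceptually just a diff between the release tarball and the git tree. A toy sketch, with all paths made up for illustration (in the real attack the extra payload rode in a doctored `build-to-host.m4` shipped only in the tarball):

```shell
# Toy demo (hypothetical paths): a "git checkout" and an "unpacked
# tarball" that contains one extra build file the repo never had.
mkdir -p demo/git-src demo/unpacked
echo 'int main(void){return 0;}' > demo/git-src/main.c
cp demo/git-src/main.c demo/unpacked/
echo 'dnl injected build logic' > demo/unpacked/build-to-host.m4

# diff -rq lists files that exist on only one side or differ in content.
diff -rq demo/git-src demo/unpacked || echo "release tarball differs from git"
```

In practice you'd diff `git archive` output for the release tag against the published tarball; generated autotools files always differ somewhat between the two, which is exactly the noise the attacker hid in.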

Even the repo's security.md was edited only two days ago, telling users to report any oddities, including suspicious behavior, privately to the author. Furthermore, the author had been contacting distro maintainers and software developers, pushing them to incorporate the latest build into their projects, citing amazing new features.

Chances are, based on the editing of the security.md file, the malicious actor was testing the code to evaluate its effectiveness - he danced a jig in a minefield and immediately stepped on a mine.

As stated, one positive is the fact that it's virtually impossible to hide in git; all commits by this individual will be audited, and commits can be rolled back.
 
^^ This. This was a long-play, planned-out attack. Now you wonder how many other repos could be compromised by the same team that did this (not likely a single person at all, but a nation-state group).
 
Mostly, people aren't looking over open source code. It's great that you can, but it's clear that very few do.

And even if you do see something that looks weird, most of the time you just quietly use something else. Or hold your nose and use it, because OpenSSL may be bad, but GnuTLS is worse, and NSS is weird, and libressl didn't exist at the time, and BoringSSL wasn't public.
Exactly... people seem to think others sit there in their free time reviewing random open source code because they have nothing better to do. With how many libraries and open source projects are out there? Yeah, right... it took what, 10 years to discover the OpenSSL exploit?

https://www.linkedin.com/feed/update/urn:li:activity:7180279006552899584/
 
Just an off-topic note: apparently the bystander effect did not survive surveillance cameras becoming widely available. If you get attacked, it is much better to have a group of people or a crowd around than a single person passing by. The idea was largely based on the falsely reported story of the murder of Kitty Genovese, about no one calling the cops or helping: https://en.wikipedia.org/wiki/Murder_of_Kitty_Genovese
 
Exactly... people seem to think others sit there in their free time reviewing random open source code because they have nothing better to do. With how many libraries and open source projects are out there? Yeah, right... it took what, 10 years to discover the OpenSSL exploit?

https://www.linkedin.com/feed/update/urn:li:activity:7180279006552899584/
The open source nature does help in this case. It didn't take long to figure out when, and by whom, the malicious code was inserted into xz. Nobody needed to look through a lot of code to figure out what happened: they figured it out by back-tracing to when the problem didn't occur and then removing the change. This happens all the time with closed software too, and those are called trojans. You can bet that the xz-utils maintainers are now taking the time to look through all their code to make sure nothing is there that shouldn't be.
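That kind of back-tracing is exactly what `git bisect` automates: mark a known-good and a known-bad version, and binary-search the commits in between. A toy repo to show the mechanics (names and messages are made up):

```shell
# Build a tiny repo: two good commits, then one that breaks things.
git init -q bisect-demo && cd bisect-demo
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "baseline"
echo ok > state && git add state
git -c user.name=demo -c user.email=demo@example.com commit -q -m "still fine"
echo broken > state && git add state
git -c user.name=demo -c user.email=demo@example.com commit -q -m "introduces the problem"

# Mark HEAD bad and the baseline good; let a test command decide the rest.
git bisect start HEAD HEAD~2
git bisect run sh -c 'grep -q ok state'   # exit 0 = good, nonzero = bad
git bisect reset
cd ..
```

`git bisect run` checks out the midpoints itself and ends by naming the first bad commit; with a real performance regression the test command would be a benchmark instead of a `grep`.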
 
Exactly... people seem to think others sit there in their free time reviewing random open source code because they have nothing better to do. With how many libraries and open source projects are out there? Yeah, right... it took what, 10 years to discover the OpenSSL exploit?
I think most people who use and evangelize OSS, but don't actually develop it, don't realize how it actually goes on the back end. They think there are hundreds, or thousands, of eager beavers working on the project. Likewise they think people spend lots of time poring over the code, analyzing every line looking for issues.

In reality? Other than a few really popular projects, it is very few people, sometimes just one. Sometimes they are paid by a company to work on it, but sometimes that is only part of their job, or a hobby, so it doesn't get a lot of time.

Likewise, code reviews often... aren't. My girlfriend is an OSS developer, one of those paid by a company to work on an OSS project, and man, the things she has to say about code reviews. Nobody wants to review other people's code; they want to write their own code, it's more fun. So to try to get people to do it, projects will often require a "reviewed by" signoff for code to go upstream. But then reviews turn into a popularity contest/bargaining chip, where people will withhold a signoff over trivial shit from someone they don't like, or to try to get a signoff in return. They don't actually do a thorough code review to give it, just kinda give it a once-over and say "LGTM!"

OSS is not a magic panacea that prevents bugs and malicious actors. Rather, just like closed source, it is all about the people who work on a project, how good they are, how much time they have, how motivated they are, how skilled they are, etc. For some projects, it is one burned-out maintainer doing it all, and that is ripe for abuse.
 
I think most people who use and evangelize OSS proprietary code, but don't actually develop it, don't realize how it actually goes on the back end. They think there are hundreds, or thousands, of eager beavers working on the project. Likewise they think people spend lots of time poring over the code, analyzing every line looking for issues.
 
Likewise, code reviews often... aren't. My girlfriend is an OSS developer, one of those paid by a company to work on an OSS project, and man, the things she has to say about code reviews.

Sounds like the same things when I've worked in proprietary code with mandatory code review. Although less likely to get away with Reviewed-by: Self.
 
Sounds like the same things when I've worked in proprietary code with mandatory code review. Although less likely to get away with Reviewed-by: Self.
Depends on the project. Some it is sort of a RVB: Self particularly if the maintainer doesn't give a shit. Others development slows to a crawl because everyone hates on everyone else and nobody signs off on shit. Basically it is all normal human drama and as you say: The same kind of shit that happens in closed source.

The projects that are the best tend to be the ones that have a number of paid devs working on them and a company or foundation that backs them so there's good governance, like Mozilla. The ones that are the worst are the ones that nobody gives a shit about - not in the sense of not being used, but in the sense that nobody is interested in doing work on them. xz-utils is an excellent example of that.

A lot of OSS projects are also suffering in that while companies love the idea of open source, the part they love is getting shit for free. So they'll use it, but contribute very little. There's been more of that going on where companies maybe used to have devs for a project, but slowly they went away and they just didn't replace them. They take the code, but they don't give back because, well, why bother?

Another issue is that because of the OSS politics, and because a company can't just get rid of assholes that stop up development if said asshole doesn't work for them, sometimes they'll just fork it and toss code over the wall. They'll decide to do things their own way and hire devs to do that and use that version in their product. While they will release the code, as required, they won't do anything to try to get it in upstream, and usually the community is very hostile about that, so little, if any, of their work gets incorporated. As time goes on their version sort of becomes "open proprietary" where the code is technically out there, but nobody except them uses it or looks at it and the upstream code is stagnant.

In other words: human bullshit. Same politics, drama, resource allocation, and all that shit you see in closed source. It isn't magic and it sure as shit isn't immune to bugs and compromises. I mean, Ken Thompson demonstrated back in 1984 how you could do a binary-only attack on the C compiler, getting an evil version to recognize its own source code and compile in the evil bits, even though they are never seen in source.

I'm certainly not a guy who says don't trust or use OSS. I do and I do. But I think people need to understand that all the issues that you can see in closed source that lead to bugs, exploits and backdoors exist in open source too and that it is about the people involved, not if the code is open to the world.
 
In other words: human bullshit. Same politics, drama, resource allocation, and all that shit you see in closed source. It isn't magic and it sure as shit isn't immune to bugs and compromises. I mean, Ken Thompson demonstrated back in 1984 how you could do a binary-only attack on the C compiler, getting an evil version to recognize its own source code and compile in the evil bits, even though they are never seen in source.

I'm certainly not a guy who says don't trust or use OSS. I do and I do. But I think people need to understand that all the issues that you can see in closed source that lead to bugs, exploits and backdoors exist in open source too and that it is about the people involved, not if the code is open to the world.
The point of open source is that if there's a problem, it can be quickly found and fixed. In this case, the problem was quickly found and fixed. Nobody is combing through thousands of lines of code because they're bored; but when there was a bit more CPU usage than someone at Microsoft would like, it didn't take long to track down the problem and fix it.

One example is Apple's Triangulation attack. People knew there was a problem and didn't know what it was. It took years before a bunch of Russians figured out that Apple had undocumented hardware functions that were being exploited. Another example is the blog from the Dolphin emulator authors, who had some things to say about drivers from different GPU manufacturers. Mesa, for example: "all of the bugs we found in Mesa were promptly fixed after we reported them through the correct channel". Whereas with AMD's proprietary drivers, "A thread was started on the AMD developer forums, only to be ignored and deleted when AMD moved to a new developer forums system a year later". ARM/Mali had communication issues, as did Qualcomm/Adreno. Nvidia was the only one who never gave Dolphin issues, though that also means there was never a need to contact Nvidia over bugs.

Another example is when Yuzu was sued by Nintendo and shut down - but their emulator was open source, so of course it was forked and is now called Suyu. You see the same thing with LibreOffice when OpenOffice shut down, and Jellyfin is a fork of Emby from when Emby decided to go closed source.

Open source doesn't mean bug-free, exploit-free software; it just makes these problems less likely to occur compared to closed software. Who here wouldn't like to see Apple's iOS and Windows telemetry data collection and potential government back doors? What's behind the black box?
 
The point of open source is that if there's a problem then it can be quickly found and fixed. In this case, the problem was quickly found and fixed.
But the problem is that doesn't mean it WILL BE. This problem wasn't found because it was open, it was found because a guy at MS said "Why is my SSH server being slightly slow?" and started doing analysis on the binaries running on the system. Likewise, in the things linked here, you can see how other bugs were going unaddressed and people were hating on the poor maintainer for it because he didn't have enough time and wasn't prioritizing their particular bug.

I'm not hating on OSS, but people need to be realistic. Just because something is open doesn't mean it is bug-free or secure. If you ever want to experience that, try out Ardour (DAW) or, even worse, Cinelerra (NLE). Commercial NLEs aren't known for being bug-free, but Cinelerra makes them look like shining jewels by comparison. Now sure, the code to those is open, unlike Cubase or Vegas. You COULD just go fix those bugs... but people haven't and don't.

It all comes down to how good the people working on the project are, how much time they have, how motivated they are, the culture, etc.
 
This is a good pictorial analysis of the attack. Once again, inspecting the source code wouldn't have highlighted any striking problems regarding this attack, as the malicious code was injected at compile time via the makefile and only if you downloaded the tarball - The fact that multiple devs were able to go through the commit history so quickly, locate just where things took a turn, and take necessary action - highlights a strength of OSS.

As stated earlier, had this been proprietary code, the possibility exists that anyone outside of very tight circles would have never known such a long term and well planned out attack ever took place. No one's stating OSS is bug free or perfectly secure, in the same way no one can claim that proprietary software development is in any way bug free or perfectly secure.

You can inspect the source code, and it could be 100% legit - but what about the many dependencies the software relies on?

https://lemmy.zip/pictrs/image/63fd7c2a-e76f-4ad7-9d85-45bec6871d26.webp
 
In this case, the problem was quickly found and fixed. Nobody is combing through thousands of lines of code because they're bored. If there's a problem, and in this case there was a bit more CPU usage than someone at Microsoft would like, then it didn't take long to track down the problem and fix it.
Not sure what difference closed source would have made here? Nothing about the finding of the problem, or the solution, seems to have anything to do with being open source or not, no?

Had the bug been inside closed Microsoft code, hundreds of devs could have gone through the commit history... and taken the necessary action. Maybe we'd never know about it - they don't press charges and just fire the rogue employee, or a three-letter federal agency takes care of it - but that is a different subject.
 
But the problem is that doesn't mean it WILL BE. This problem wasn't found because it was open, it was found because a guy at MS said "Why is my SSH server being slightly slow?" and started doing analysis on the binaries running on the system. Likewise, in the things linked here, you can see how other bugs were going unaddressed and people were hating on the poor maintainer for it because he didn't have enough time and wasn't prioritizing their particular bug.
If this were closed-source software, you would write to the company you bought it from, and they'd probably send you a copypasta reply that this is to be expected with ongoing maintenance. How many of us have gotten updates that slowed things down? You think Apple is going to explain why iOS 18 has slower response time when browsing the web? You don't have closed software on a GitHub where you can actually pinpoint when and what occurred. Even if the project is abandoned, which does happen, it doesn't take long for someone to fork it.
I'm not hating on OSS, but people need to be realistic. Just because something is open doesn't mean it is bug free or secure.
Nobody ever said it was. It is less likely to have bugs and be insecure. If there wasn't some benefit to this, then why are most servers running Linux? You think people at Google are going to trust Microsoft Windows and macOS? Those OSes are black boxes with no way of figuring out if these companies are working against your own interests. How would you know if Windows had a backdoor for China because some employee working for Microsoft was paid to add one? Are you going to send an email to Microsoft letting them know that their compression tool is somehow using 3% more CPU than before the update?
It all comes down to how good the people working on the project are, how much time they have, how motivated they are, the culture, etc.
This is true for anything. In this case it was bad actors who spent years trying to gain trust, and got away with it. Keep in mind that people did notice a problem and did start looking into it. The GitHub repo is down, but here is a screenshot of it.
[screenshot: the xz-utils GitHub repository]
 
There's a lot of information about this backdoor that is missing in this thread, and it adds a lot to the discussion. Before I get into that, though:

I don’t understand how anybody sees this backdoor and thinks that open source is insecure. This level of social engineering could easily have worked on a company as well, but this was found because it was open source. All other protections failed, and some guy poking around found it. That kind of scrutiny doesn’t happen on closed source systems.

Now back to the exploit.

This backdoor was specifically targeting distros that use .deb and .rpm. This also needs a link from sshd to liblzma to be applied which is something not all distros do. Arch for example would not be vulnerable as they don't link sshd against liblzma. This also requires systemd and as it happens systemd was already in the process of being changed so this exploit would no longer work at all. So there is a theory that the bad actors had to accelerate their plans before systemd would render their attack vector worthless and made a mistake which is what led to the overall discovery.
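A rough way to check whether a given box was even in scope (paths differ by distro, and this only shows dynamic linkage, not whether a backdoored build was actually installed):

```shell
# Does sshd pull in liblzma (via the distro's libsystemd patch)?
# The sshd path is distro-dependent; /usr/sbin/sshd is a common default.
sshd_bin=$(command -v sshd || echo /usr/sbin/sshd)
if [ -x "$sshd_bin" ]; then
  ldd "$sshd_bin" | grep -E 'liblzma|libsystemd' \
    || echo "sshd not linked against liblzma/libsystemd here"
else
  echo "sshd not found on this system"
fi

# And which xz version is installed (5.6.0 and 5.6.1 were the bad releases):
xz --version 2>/dev/null || echo "xz not installed"
```

On Arch the first check comes up empty for the reason stated above: their sshd isn't linked against liblzma at all.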

One of the major things this backdoor has shown is how badly the culture needs to change with the open source community. I love Linux. I love open source, but the culture of FOSS is honestly pretty shitty at times.

These bad actors abused the already burnt-out maintainer of xz, which allowed Jia Tan to get into his good graces. These maintainers deserve far more respect, so that burnout isn't as bad. Then there are the non-systemd distros that started posting bullshit about how systemd is so insecure and whatnot after xz was discovered.

The abuse of maintainers, abuse of new users who ask questions looking for help, and in-fighting are simply big detriments to the FOSS community and people need to realize it.

EDIT: Great link for the xz timeline - https://research.swtch.com/xz-timeline
 
There's a lot of information about this backdoor that is missing in this thread and it adds a lot to the discussion. Before i get into that though:

I don’t understand how anybody sees this backdoor and thinks that open source is insecure. This level of social engineering could easily have worked on a company as well, but this was found because it was open source. All other protections failed, and some guy poking around found it. That kind of scrutiny doesn’t happen on closed source systems.

Now back to the exploit.

This backdoor was specifically targeting distros that use .deb and .rpm. This also needs a link from sshd to liblzma to be applied which is something not all distros do. Arch for example would not be vulnerable as they don't link sshd against liblzma. This also requires systemd and as it happens systemd was already in the process of being changed so this exploit would no longer work at all. So there is a theory that the bad actors had to accelerate their plans before systemd would render their attack vector worthless and made a mistake which is what led to the overall discovery.

One of the major things this backdoor has shown is how badly the culture needs to change with the open source community. I love Linux. I love open source, but the culture of FOSS is honestly pretty shitty at times.

These bad actors abused the already burnt out maintainer of xz which allowed Jia Tan to get into good graces with the maintainer. These maintainers deserve far more respect so that burn out isn't as bad. Then there's non-systemd distros that start posting bullshit about how systemd is so insecure and what not after xz was discovered.

The abuse of maintainers, abuse of new users who ask questions looking for help, and in-fighting are simply big detriments to the FOSS community and people need to realize it.

EDIT: Great link for the xz timeline - https://research.swtch.com/xz-timeline

Obligatory https://xkcd.com/2347/

I feel for this dude, shit situation.

I imagine unlocking the "nation state tried to infiltrate my project and wreak untold havoc" achievement is quite a lot to take in.
 
"But open source is more secure because people spend all their waking hours reading over lines of code to make sure nothing malicious gets in"

Reality is, no they do not. Open source also allows the bad guys to find ways in, just as much as people can find holes and get them patched. But at least with open source, more eyes can dig into it when needed, vs. waiting for MS or someone else to decide whether to patch something. I would have expected far more scrutiny of a repo being tied into distros by said teams...

This is why things are so insecure: people just tie into a GitHub repo, assume it is safe and clean because it is on GitHub, copy-paste and off they go...

But, in the end, SSH should not be accessible on the internet, and any critical or prod server should have outbound internet blocked all together with specific ACL's for anything that does need to hit the internet (webserver)
Nothing should be accessible on the internet except dedicated web servers and services which also should not have access back internally to the network.
If you are inside the building and need to SSH to something, then that SSH device would need to be in an approved SSH permissions list and you need to be in an approved SSH users list.
If you are outside the building and need to SSH to something, then you connect to the VPN first then proceed as normal.

Personally speaking, any unsolicited incoming connection that is not directed at a dedicated external-facing device is denied and dropped.
There are a few exceptions for some specific maintained hardware, but those have so many rules and access controls around them that it makes everybody involved cry. I hate how bad the security is, and how janky the implementation is, on most HVAC hardware: if you don't have fully open communication on TCP and UDP across an ungodly port range, often with extra ports needed for check-ins from monitoring facilities, then things just don't work half the time. And it is a tooth-pulling experience trying to identify, update, and remove all that hardware... mini rant over, still hate BACnet...
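For what it's worth, the posture described above comes down to a handful of default-deny rules in most firewalls. An iptables sketch, with placeholder subnets and addresses rather than a drop-in config:

```shell
# Default-deny in every direction.
iptables -P INPUT   DROP
iptables -P OUTPUT  DROP
iptables -P FORWARD DROP

# Allow replies to connections we initiated (or accepted).
iptables -A INPUT  -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
iptables -A OUTPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT

# SSH only from the internal/VPN subnet (placeholder range).
iptables -A INPUT -s 10.0.0.0/8 -p tcp --dport 22 -j ACCEPT

# Specific outbound ACLs for the few things that must reach out,
# e.g. the web tier talking to one upstream API (placeholder address).
iptables -A OUTPUT -p tcp -d 203.0.113.10 --dport 443 -j ACCEPT
```

With outbound default-deny, even a compromised daemon has a hard time phoning home; the allow list is the audit trail.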
 
All other protections failed, and some guy poking around found it. That kind of scrutiny doesn’t happen on closed source systems.
The finding had nothing to do with the source being open here either, right? It came from running the executable, not looking at the source.

If this were a closed source software, you would write to the company who you bought it from and probably send you a copypasta that this is to be expected with ongoing maintenance.
That's not at all my experience from when I worked on closed-source software involving security; the collaboration with expert programming clients is much more serious than that. What are you basing that on?

The "you" here is another company/a client (a bank, Microsoft, etc.) with a direct channel to enter tickets or call you; we never sent auto-answers.

This level of social engineering could easily have worked on a company as well, but this was found because it was open source.
It could have, but the steps and engagement involved in getting a job are significantly more than this. Working for years until a commit of yours on something security-related gets past attentive code review and deployed is a bigger deal than working an open source repo and message boards, especially if you are overseas, and the consequences are obviously different when people know who you are. And if you are NSA personnel (or a foreign equivalent, or a crypto-ransom group), it is much easier to build a good reputation on 10-30 projects until you see an opportunity than to get hired at a lot of companies.
 
I think the issue here was xz and lzma were seen more as just standard compression libraries and not really associated with their cryptographic relationship.

Then, as well, they were dynamically linked with security software (sshd -> libsystemd -> liblzma) -- which is probably not a good practice, because any version could be loaded in at runtime, even unaudited or completely fabricated libraries.

Finally, the icing on the cake was the rogue maintainer who slipped a malicious binary and script into a release archive for mass adoption.

If any of the above were untrue this would have been a much smaller ordeal.
 
The finding had nothing to do with the source being open here either, right? It came from running the executable, not looking at the source.

It could have, but the steps and engagement involved in getting a job are significantly more than this. Working for years until a commit of yours on something security-related gets past attentive code review and deployed is a bigger deal than working an open source repo and message boards, especially if you are overseas, and the consequences are obviously different when people know who you are. And if you are NSA personnel (or a foreign equivalent, or a crypto-ransom group), it is much easier to build a good reputation on 10-30 projects until you see an opportunity than to get hired at a lot of companies.

He ran the binary, saw the lag and then investigated by looking at source code and tarballs. With closed source code he most likely would have shrugged his shoulders about the lag and moved on.

You have a very high opinion of code review. If you think code review for closed source is any better, you have never done code review.
 
Then, as well, they were dynamically linked with security software (sshd) -- which is probably not a good practice, because any version could be loaded in at runtime, even unaudited or completely fabricated libraries.
Which is exactly why systemd was changing it....

lol
 
He ran the binary, saw the lag and then investigated by looking at source code and tarballs. With closed source code he most likely would have shrugged his shoulders about the lag and moved on.
Or not - that's only speculative. The company that made it could have investigated in their nightly tests why it was slower before shipping it. It is pure speculation here. A sudden performance or memory change in a build will tend to be flagged and investigated.

You have a very high opinion of code review. If you think code review for closed source is any better, you have never done code review.
You must know that just about everyone has to do it. I personally had really strong reviewers, especially when new. When you are paid and required to do it, and face possible professional consequences when you sign off, it will tend to be better than the average open source review, but it will be case by case.
 
Then, as well, they were dynamically linked with security software (sshd -> libsystemd -> liblzma) -- which is probably not a good practice, because any version could be loaded in at runtime, even unaudited or completely fabricated libraries.

If you trust your distribution's sshd binary, in general, you should also trust the rest of the software the distribution ships... It's not exactly just any version... it's the version that's installed. OTOH, linking fewer libraries is better, especially when libsystemd pulls in a lot of stuff and all that was wanted was to open a socket and write a simple message. It's probably less work to do that directly than with a library.
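For context, the "simple message" is a single datagram saying READY=1, sent to the socket systemd hands a Type=notify service in NOTIFY_SOCKET. A sketch using socat purely for illustration (abstract-namespace sockets, whose paths start with `@`, would need `ABSTRACT-SENDTO` instead):

```shell
# systemd sets NOTIFY_SOCKET for Type=notify services; readiness is just
# one unix datagram - no library required.
if [ -n "${NOTIFY_SOCKET:-}" ]; then
  printf 'READY=1' | socat - "UNIX-SENDTO:${NOTIFY_SOCKET}"
else
  echo "not running under systemd (NOTIFY_SOCKET unset)"
fi
```

In C it's a dozen lines around `socket(AF_UNIX, SOCK_DGRAM, 0)` and `sendto()`, which is why pulling in all of libsystemd (and, transitively, liblzma) for this looked so disproportionate after the fact.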
 
If you trust your distribution's sshd binary, in general, you should also trust the rest of the software the distribution ships... It's not exactly just any version... It's the version that's installed, otoh, linking fewer libraries is better, especially when libsystemd pulls in a lot of stuff and all that was wanted was to open a socket and write a simple message. It's probably less work to do that directly than with a library.
It could link any version; you just need to change LD_PRELOAD, or replace the expected library with one which pulls in your own binary code (like was done here). Because of that, the trusted sshd was backdoored by a seemingly trusted compression library that was linked by the trusted system daemon.

Edit: I don't mean any "version" (as in versioning), but any version (as in state, time, etc).
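The preload mechanics are simple enough to show in a few lines. This toy interposer has nothing to do with the actual xz payload (which reportedly used IFUNC resolvers to hook RSA_public_decrypt inside sshd); it just shows how readily the dynamic loader substitutes code:

```shell
# shim.c overrides two libc calls for any dynamically linked program.
cat > shim.c <<'EOF'
#include <sys/types.h>
uid_t getuid(void)  { return 0; }  /* real uid, as seen by callers  */
uid_t geteuid(void) { return 0; }  /* effective uid, ditto          */
EOF
cc -shared -fPIC shim.c -o shim.so

# The loader resolves getuid/geteuid to the shim before libc:
LD_PRELOAD=$PWD/shim.so id -u
```

On a typical glibc system that last command prints 0 for an unprivileged user. (The loader refuses LD_PRELOAD for setuid binaries, which is why the xz attack had to get its library loaded legitimately, via the linked dependency.)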
 
I mean normally I wouldn't care about what libraries a piece of software is linked against, but imo a core system program should not be linked against software that is maintained by one person when it can be statically built or else implemented in the program itself.

It's an extra attack vector that you just don't need, and an obvious one (at least in hindsight).
 
It could link any version; you just need to change LD_PRELOAD, or replace the expected library with one which pulls in your own binary code (like was done here). Because of that, the trusted sshd was backdoored by a seemingly trusted compression library that was linked by the trusted system daemon.

Edit: I don't mean any "version" (as in versioning), but any version (as in state, time, etc).

Sure, but if you can set LD_PRELOAD for the main sshd or otherwise mess with libraries that sshd loads, you almost certainly have root and can replace the sshd binary, or force inject whatever you want.

But I agree, it's a silly dependency to add. But distributions do silly things all the time. Upstream OpenSSH didn't add the dependency, and from the mailing list thread I saw, never really understood the need - but OpenBSD doesn't have systemd, and that probably makes it hard to understand the weirdness it pushes people into. I abandoned Linux on my home systems because systemd was making everything worse, and here it was again.
 
Sure, but if you can set LD_PRELOAD for the main sshd or otherwise mess with libraries that sshd loads, you almost certainly have root and can replace the sshd binary, or force inject whatever you want.
Which is exactly what happened here
 