Author Topic: University of Minnesota Linux code security issues; banned and to be removed  (Read 11162 times)



Online ataradov

  • Super Contributor
  • ***
  • Posts: 11238
  • Country: us
    • Personal site
It means that their research ethics committee is pretty loose on the "ethics" part.

I also don't get this "research". There are bugs in the kernel that were non-maliciously contributed. So obviously it is possible to do that maliciously. What did they want to prove with that research?

The ban is fully justified, IMO. No need to take the code from people that acted maliciously in the past.
Alex
 
The following users thanked this post: hans, daqq, Omega Glory, newbrain, BU508A, Jacon

Offline JohnnyMalaria

  • Super Contributor
  • ***
  • Posts: 1154
  • Country: us
    • Enlighten Scientific LLC
This is a classic example of the scourge rampant in modern academia.

A thorough search of the scientific literature reveals that the paper isn't a peer-reviewed article. It's just a PDF that looks like a research paper stored on Github.

Of the two authors, Wu is a PhD student and Lu is an assistant professor who has 37 peer-reviewed (edit: see below) articles published since 2013. He got his PhD in 2017.

Looking at the titles of some of the articles, it appears Prof Lu is just a dressed-up version of a teen hacker getting kicks from academic masturbation.

https://scholar.google.com/citations?user=1F9N6icAAAAJ&hl=en&oi=sra

EDIT: after closer inspection, many of Lu's papers aren't peer-reviewed at all. They are just conference proceedings, i.e., he presented a paper at a conference. Generally, there's no peer review - you could make up anything and, as long as it seems plausible, get accepted to speak.
« Last Edit: April 23, 2021, 05:39:00 pm by JohnnyMalaria »
 
The following users thanked this post: ebastler

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11238
  • Country: us
    • Personal site
Yep. That's pretty much it.

When I was working with wireless networks, we got so many academic security researchers telling us that the network does not work in the presence of strong interference. Well, no shit. What do you want us to do about it? Change how physics works?
Alex
 

Offline DrG

  • Super Contributor
  • ***
  • !
  • Posts: 1199
  • Country: us
/-//
A thorough search of the scientific literature reveals that the paper isn't a peer-reviewed article. It's just a PDF that looks like a research paper stored on Github.
/--/

I agree, but am not sure that it is accidental that it looks like a research paper...that is, it may have been intended to be a research paper for submission.

According to this (published a few days ago https://www.theregister.com/2021/04/21/minnesota_linux_kernel_flaws_update/) "....Vulnerabilities in Open-Source Software via Hypocrite Commits" [PDF], which is slated to be presented at the Proceedings of the 42nd IEEE Symposium on Security and Privacy next month..."

https://www.ieee-security.org/TC/SP2021/

I also agree with your point about the difference between a presentation and a peer-reviewed publication (having considerable experience with both, but never IEEE 'stuff').

This issue seems to me to be more provocative than anything else. While I have not and likely will not spend a lot of time on it, I think that it will garner a lot of attention and a lot of scrutiny as well.

Also, from the cite above: "It further states that the experiment was vetted by the university's Institutional Review Board (IRB), which determined that the project did not constitute human research and thus granted an ethical review waiver." My guess is that the IRB is going to modify some SOPs as a result of this - not sure, but that is where I would put my money, because, while this is not within a strict IRB mandate, the University is going to be fielding a lot of questions about why the project got a waiver.
- Invest in science - it pays big dividends. -
 

Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 6126
  • Country: 00
Yep. That's pretty much it.

When I was working with wireless networks, we got so many academic security researchers telling us that the network does not work in the presence of strong interference. Well, no shit. What do you want us to do about it? Change how physics works?

Engineers are asked to do that every day!  :D
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
As someone who has to gargle the shit of a couple of hundred software developers every day I think this is a good thing. It sows distrust which in turn causes greater testing and validation to take place.

It's also hilarious.
 
The following users thanked this post: duckduck

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11238
  • Country: us
    • Personal site
It would not cause better validation. There is no humanly possible way to make sure that an individual patch does not open some sort of a hole in a bigger system. You can do your best to review the patch to make sure that it is not obviously wrong, but not much more. Or you would have to invest a lot more resources, which neither free nor commercial software typically has.

In fact, I bet there is a much higher probability of intentionally bad code being accepted into a closed-source code base, since the code is assumed to be written in a trusted environment. How hard would it be to blackmail, or just buy, some programmer at Microsoft to introduce an innocent-looking hole? Or even a rogue employee introducing some code to take advantage of later, in case employment does not go so well. This is not something that can be done as part of this hack job of a "research", of course.

And unlike here, where all contributions by that university are removed from the code base, there is no chance Microsoft would remove and rewrite code after firing an employee.

So the whole premise of this effort is wrong.

Not so long ago easter eggs were commonplace, and I'm not sure how much knowledge management had of them. This practice was stopped at Microsoft, and for a good reason, but the same principle applies.

This research is of the same nature as "Hey, look, I can defecate in the middle of the street and not get caught and fined. The system is exploitable."
« Last Edit: April 23, 2021, 08:18:32 pm by ataradov »
Alex
 
The following users thanked this post: duckduck

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
Yeah I work in fintech. We have background checks, multiple layers of review, static analysis, vulnerability analysis (DAST/SAST), least privilege implementations everywhere.

Linux doesn't, yet everything we do is based on it.

YMMV, but the issue is that the whole industry is a complete amateur shit show and will stay that way, like the engineering industry was, until the bodies were stacked high enough to mean something...
 

Offline DrG

  • Super Contributor
  • ***
  • !
  • Posts: 1199
  • Country: us

It's also hilarious.

Bad administrator, no doughnut for you. :)
- Invest in science - it pays big dividends. -
 
The following users thanked this post: bd139

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11238
  • Country: us
    • Personal site
Yeah I work in fintech. We have background checks, multiple layers of review, static analysis, vulnerability analysis (DAST/SAST), least privilege implementations everywhere.
So can you 100% say that you never found a bug in the released code base?

I'm not saying that Linux people do as much as theoretically possible. They do as much as they realistically can.

Some fields require more scrutiny than others. People dealing with life-sustaining applications or even automotive stuff do more testing, since there the resources are justified.

I think people who suggest that one project or another needs to do more review should go and do that review. It is easy to tell others what they should and should not do.
Alex
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
Oh no, my point (and the hilarity) is that we do all that to increase our quality (100% is impossible) and then immediately, without thought, stick it on top of a Linux kernel which is full of bugs introduced by untrusted university researchers.
 

Offline DrG

  • Super Contributor
  • ***
  • !
  • Posts: 1199
  • Country: us
Yeah I work in fintech. We have background checks, multiple layers of review, static analysis, vulnerability analysis (DAST/SAST), least privilege implementations everywhere.
/---/
I think people that suggest that one project or the other needs to do more review - go and do that review. It is easy to tell what others should and should not do.

Yes, the difference between a 'checker' and a 'doer', but let me ask you something specific... in the "paper" and in the "walk back" https://www-users.cs.umn.edu/~kjlu/papers/clarifications-hc.pdf ... and I have seen enough walk-backs in my time to recognize them and distinguish them from clarifications, so it's only my opinion, but I am sticking with it.

The claim is that suggestions were made for improvement. That is, they went beyond simply showing/saying "this is screwed up, do a better job." Do you think that they did that reasonably? (I don't know, but I think their suggestions for improvement might be considered underwhelming or unrealistic.)


In the paper, we provide our suggestions to improve the patching process.
- OSS projects would be suggested to update the code of conduct, something like “By submitting
the patch, I agree to not intend to introduce bugs”.
- We need automated tools for testing and verifying patches. The relevant techniques include
directed fuzzing, (under-constrained) symbolic execution, formal methods, etc. More details are
in the paper.
- OSS maintaining is understaffed. We should very much appreciate and honor maintainer efforts,
and increase potential incentives if possible to encourage more people to join the maintaining.
- We hope both reporters and maintainers are aware of the potential bug-introducing patches. Also,
tools can be developed to check “immature vulnerabilities”.


(from the 'clarification' cited above)
- Invest in science - it pays big dividends. -
 
The following users thanked this post: ebastler, duckduck

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
To be fair that clause is in our employment contracts under "malicious intent".
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8637
  • Country: gb
To be fair that clause is in our employment contracts under "malicious intent".
Isn't most of an employment contract there with malicious intent?
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
Yep  :(
 

Offline magic

  • Super Contributor
  • ***
  • Posts: 6761
  • Country: pl
Has anyone seen links to the bad patches and how the review process went for them? :popcorn:

A quick glance at responses to Greg's mass-revert shows that most stuff from that university was legit fixes.
 

Offline DrG

  • Super Contributor
  • ***
  • !
  • Posts: 1199
  • Country: us
I'm probably thinking about this more than I should, but I was reading this thread: https://github.com/QiushiWu/qiushiwu.github.io/issues/1
(That is the PhD candidate - right?)

First off, let this be a lesson to the kids out there- don't piss off the people above you unless you are really, really good (ok, I am saying that with a :)

One big issue that comes to mind, and I don't know enough to even speculate, does this constitute penetration testing without authorization? If so, that would have some consequences, I would think - anybody have a more educated opinion than mine?

The other issue concerns the IRB, when did they get approval and the waiver, and whether that is going to be 'changed', the process or otherwise, in the near future. I already weighed in on that and I am sticking with that opinion.

From that thread, this fellow is gearing up to make an apology (not a clarification) and, as could be anticipated, is getting a lot of heat. Some justifiable, some maybe not so much.

I wonder how the presentation goes (if it is not retracted) - a lengthy apology to begin? Concentration on this idea of “immature vulnerability”, which does not seem all that original, but I don't know.

Right now, I think there is a lot of scrambling going on - what do we have to change and how do we do that?....I once had to take a mandatory Jeep Safety Training and Motorcycle Safety Training...as far as I know, I have never been in a Jeep and except for a brief period of time in my youth (EZ-Rider period :) ) I don't own or use a motorcycle. It took me a while to understand how those mandates came to be and it was quite a lesson in bureaucratic processes.
- Invest in science - it pays big dividends. -
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
There’s no contributor contract as such so there’s no legal agreement in place. Usually your only agreement would be to surrender copyright to the project in some way. Thus you could roll up a steaming turd and drop the patch, and if they accept it then it’s the maintainer’s funeral, which in this case it should be.

There is a lot of distraction here by the community from the fact that there is a massive vulnerability in the process. Imagine how many times that has been exploited potentially. There are a hell of a lot of contributors.

Both sides of this process are to blame.

The whole OpenBSD IPSEC and DARPA thing is another example of the plausible scenarios.
 

Offline DrG

  • Super Contributor
  • ***
  • !
  • Posts: 1199
  • Country: us
There’s no contributor contract as such so there’s no legal agreement in place. Usually your only agreement would be to surrender copyright to the project in some way. Thus you could roll up a steaming turd and drop the patch, and if they accept it then it’s the maintainer’s funeral, which in this case it should be.

There is a lot of distraction here by the community from the fact that there is a massive vulnerability in the process. Imagine how many times that has been exploited potentially. There are a hell of a lot of contributors.

Both sides of this process are to blame.

The whole OpenBSD IPSEC and DARPA thing is another example of the plausible scenarios.

Do you think that there will be some substantial process changes as a result of this or a lot of saber rattling until the smoke clears and then business as usual...or something in between?
- Invest in science - it pays big dividends. -
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11238
  • Country: us
    • Personal site
A quick glance at responses to Greg's mass-revert shows that most stuff from that university was legit fixes.
The bad patches were also masqueraded as legit fixes. They just had unintended side effects.

And the removal is more of a deterrent. I'm sure most of that code is fully legit, but you don't want to encourage this type of childish and stupid behavior. It might turn into a stupid trend of trying to push bad code into the kernel in more and more escalating ways.

In fact, I personally would cancel the conference talk too, just to signal to others that there is absolutely nothing to be gained from that.
Alex
 
The following users thanked this post: Omega Glory, newbrain

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
There’s no contributor contract as such so there’s no legal agreement in place. Usually your only agreement would be to surrender copyright to the project in some way. Thus you could roll up a steaming turd and drop the patch, and if they accept it then it’s the maintainer’s funeral, which in this case it should be.

There is a lot of distraction here by the community from the fact that there is a massive vulnerability in the process. Imagine how many times that has been exploited potentially. There are a hell of a lot of contributors.

Both sides of this process are to blame.

The whole OpenBSD IPSEC and DARPA thing is another example of the plausible scenarios.

Do you think that there will be some substantial process changes as a result of this or a lot of saber rattling until the smoke clears and then business as usual...or something in between?

Loudest ego wins in 2021 and that appears to be the kernel team.

But adversaries know it’s a process vulnerability now so not acting would be socially irresponsible. Also this is now on everyone’s risk assessment and if it happened again the paid up vendors will be on the hook.

Could go either way but this is a massive “oh shit” moment that will affect the industry for years. Not immediately though!
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11238
  • Country: us
    • Personal site
Do you think that there will be some substantial process changes as a result of this or a lot of saber rattling until the smoke clears and then business as usual...or something in between?
I don't think anything substantial will change. The process as it is works fine for what it is. There is no way to improve it unless you are willing to invest more resources. And those resources should come from the parties benefiting from the improvements. If the companies using Linux in their products think that the Linux community is not doing enough - they should invest more money, or even hire third-party reviewers.

I use Linux on my PCs and I'm personally fine with Linux as it is. I don't think their model of development is flawed in any way here.

I would not ascribe to Linux developers some grand social responsibility. They are just writing software. The responsibility lies with the users. If someone truly believes that a malicious-intent clause in a contract would prevent proprietary software from having bugs, then they are free to switch to that proprietary software.

And if as a result Linux sees a significant drop in use, then they will have a problem that they will have to address.
« Last Edit: April 23, 2021, 11:27:49 pm by ataradov »
Alex
 

Offline magic

  • Super Contributor
  • ***
  • Posts: 6761
  • Country: pl
Things clearly don't work well if anyone signed off on their shit patches. And those who did have just been caught with their pants down. It's the whole point of review to catch errors, intentional or not.

If no one did, then the students have indeed wasted everybody's time for no gain and should never have published their paper, out of embarrassment.

By the way, a few years ago many RAID0 SSD arrays lost data because of a buggy fix to a hypothetical bug that affected nobody, which ended up destroying every array every time trim was used on it. It's painfully obvious no one had even tested the alleged fix, and it got backported to all stable releases, thank you very much.

So it's not even the first time shit happens. I'm inclined to agree that they are understaffed, probably overambitious, and as a result - overworked. Oh well.

edit
From limited personal experience (n=1) I can say that pushing fixes for obvious bugs is indeed not very hard, and some maintainers seem to treat it as a nuisance and just want to be done with it. In particular, I was a bit :wtf: when somebody from the stable team just emailed me asking which releases the patch is compatible with, with little indication that my answer would receive any scrutiny.

So if you want to introduce a very stealthy vulnerability into Linux, come up with a commit which fixes some current problem while subtly breaking whatever -stable release is currently used by Ubuntu :P
« Last Edit: April 24, 2021, 07:45:50 am by magic »
 
The following users thanked this post: wraper, SilverSolder, bd139

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11238
  • Country: us
    • Personal site
Things clearly don't work well if anyone signed off on their shit patches. And those who did have just been caught with their pants down. It's the whole point of review to catch errors, intentional or not.
Code reviews are never 100% effective. Otherwise we would not have bugs in software, since the majority of big companies have code reviews.

Code reviews make sure that the coding style is fine and the code is generally not complete garbage. Reviews would take forever if reviewers had to check how a specific patch affects the rest of the system.

I don't see why it is an issue that someone may intentionally introduce a hole. That sounds like a lot of work; it is much easier to find existing holes if you want them. There are plenty, for sure. And not only in the kernel, but in the rest of userland too.
Alex
 

