@corbet
> König is correct, in that these problems do not (hopefully) happen with existing, in-tree graphics drivers.
No, he isn't. I think you missed the email where I pointed out that the codepaths in the first in-tree C driver I looked at (panfrost) suffer from the exact same bugs. The only reason this isn't actively blowing up for users every day is that in-tree drivers mostly use a global scheduler and only tear it down on device removal, so any potential problems can only happen when you unbind or unplug the GPU, which in practice is only relevant to eGPU users outside of testing scenarios.
In other words, the existing C code is just as broken as Rust would be without the fixups I sent; it just happens to hit the breakage less often due to other driver design differences. The problem is not just that the interface cannot reasonably be used safely from Rust; it can't reasonably be used safely from C either, because users neither understand nor uphold the undocumented requirements needed to make it safe. C developers just get away with being blissfully unaware of the mistakes until the code actually crashes, because the language doesn't force them to consider these things up front as part of its syntax.
(Apparently I can't comment without a LWN subscription...)