Wow, I didn't realize Pascal came out first.
Of course, both find their roots in Algol. C is just the more dangerous child: its weak typing lets you take advantage of the machine (and shoot yourself in the foot).
It's so easy to take down an entire C application: "segfault", "bus error", or something like that; it's been years since I touched C.
There are applications where an error in presenting a tooltip over a UI element can kill everything: if the tooltip data is wrong (say, because some function mangles a pointer or reads past the end of a char array), the entire application crashes instantly. A lousy tooltip bug should not be able to take down your app.
Even now there are applications on my system that are broken: Kalendar from KDE has some little glitch somewhere, and the entire app just crashes when it tries to start.
I know there are workarounds for a lot of this, and very senior C devs won't make these kinds of mistakes, but the language still leaves too much room to shoot yourself in the foot.
With Java you just get an exception, and it propagates up the call stack until you catch it. So much safer and saner.
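To make the contrast concrete, here's a minimal sketch of the tooltip scenario in Java (the class and method names are invented for illustration): the same out-of-bounds read that would be undefined behavior in C throws a catchable exception instead, so the bug stays contained and the app keeps running.

```java
public class TooltipDemo {
    // Hypothetical tooltip lookup with the same bug as the C case:
    // a bad index. In Java this throws instead of scribbling on memory.
    static String tooltipFor(String[] labels, int index) {
        return labels[index];
    }

    static String safeTooltip(String[] labels, int index) {
        try {
            return tooltipFor(labels, index);
        } catch (ArrayIndexOutOfBoundsException e) {
            // The bug is contained right here: log it, show a fallback,
            // and the rest of the application carries on.
            return "(tooltip unavailable)";
        }
    }

    public static void main(String[] args) {
        String[] labels = {"Save", "Open"};
        System.out.println(safeTooltip(labels, 5)); // bad index, but no crash
        System.out.println(safeTooltip(labels, 0)); // normal lookup still works
    }
}
```

In C the equivalent out-of-bounds read is undefined behavior: it might segfault, or silently return garbage that crashes you later.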
Writing a kernel or device driver? C is a good candidate (Rust even better).
Writing a desktop/server application, Java is 100 times better.