I see it a lot in visual novels, older PC games, and PC ports of older non-PC games. It sounds so trivial on paper, like… just play the video? But I know it's not. Why, though? Can we ever expect the problem to be fully solved? Right now it seems like an uphill struggle: fixing cutscene playback in one game doesn't automatically fix it for other games, so it's not a situation where a convenient one-size-fits-all solution works.
And I don't really get it, because if it's related to video codecs, there are only so many codecs out there, right? And you'd expect that a handful of popular ones cover 99% of all cases, with a few odd outliers here and there.
As I wrote, the codecs were usually included with the game. No sane developer assumed they were already installed on the system.
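For a concrete picture of what "included with the game" means in practice, here's a minimal Win32 sketch, assuming a bundled codec DLL sitting next to the game executable. The DLL name (binkw32.dll, as shipped by the Bink codec) and the BinkOpen export are used purely as illustration; any bundled codec works the same way:

```c
/* Minimal sketch: load a codec DLL shipped alongside the game
   executable instead of relying on a system-wide install.
   Assumes Windows; "binkw32.dll" / "BinkOpen" are illustrative. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* A relative DLL name makes the loader search the application's
       own directory before the system directories, so the copy
       shipped with the game wins. */
    HMODULE codec = LoadLibraryA("binkw32.dll");
    if (!codec) {
        fprintf(stderr, "bundled codec missing: error %lu\n", GetLastError());
        return 1;
    }

    /* Resolve whatever entry point the codec exports and hand it to
       the video player; here we only check that it exists. */
    FARPROC open_fn = GetProcAddress(codec, "BinkOpen");
    printf("codec loaded, BinkOpen %s\n", open_fn ? "found" : "not exported");

    FreeLibrary(codec);
    return 0;
}
```

This is also why playback breaks per game rather than per codec: each game hard-wires its own bundled copy, so a fix has to be applied game by game.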
Now, how do you get a game to not use the codec it's shipped with? By cracking it.
If the codec was included in the game, you wouldn't need to pay to use it. It was already paid for by the developers.
Edit: not to mention that modding games is legal in most cases.