I see it a lot in visual novels, older PC games, and PC ports of older non-PC games. It sounds so trivial on paper, like… just play the video? But I know it’s not. Why, though? Can we ever expect the problem to be fully solved? Right now it seems like an uphill struggle: fixing cutscene playback in one game doesn’t automatically fix it for other games, so it’s not a situation where a convenient one-size-fits-all solution works.

And I don’t really get it, because if it’s related to video codecs, there are only so many codecs out there, right? You’d also expect that a few popular ones cover 99% of all cases, with a few odd outliers here and there.

  • @Cirk2
    1 year ago

    Having the codec’s runtime installed in a Wine prefix is not the same as having it work. Wine still has to get the data to the codec and the output back to the game, in a way that the game, the codec, and Wine’s d3d implementation can all deal with. This is made more difficult by some codecs doing the output themselves and some handing buffers back to the game for display (roughly the two models in the sketch below).
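    A minimal sketch of those two output models, purely illustrative: the types and function names (`Surface`, `decode_to_surface`, `decode_to_buffer`) are hypothetical and not Wine’s real API, but they show why a translation layer has to support both a codec that renders into a surface itself and one that hands a frame buffer back to the game.

    ```c
    /* Hypothetical sketch -- invented types and names, not Wine's API.
     * Shows the two codec output models a translation layer must handle. */
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    typedef struct { uint32_t width, height; uint8_t *pixels; } Surface;

    /* Model 1: the codec draws into a surface itself. The layer must
     * hand it a surface its d3d implementation actually backs. */
    static void decode_to_surface(const uint8_t *bitstream, size_t len,
                                  Surface *target)
    {
        (void)bitstream; (void)len;
        /* pretend-decode: fill the surface with a solid gray frame */
        memset(target->pixels, 0x80,
               (size_t)target->width * target->height * 4);
    }

    /* Model 2: the codec returns a buffer and the game displays it.
     * The layer must ensure the game can upload that buffer through
     * the translated d3d path. The caller frees the result. */
    static uint8_t *decode_to_buffer(const uint8_t *bitstream, size_t len,
                                     uint32_t w, uint32_t h)
    {
        (void)bitstream; (void)len;
        uint8_t *frame = malloc((size_t)w * h * 4);
        if (frame)
            memset(frame, 0x80, (size_t)w * h * 4);
        return frame;
    }

    int main(void)
    {
        const uint8_t fake_stream[16] = {0};
        Surface s = {640, 480, malloc((size_t)640 * 480 * 4)};
        if (!s.pixels)
            return 1;

        decode_to_surface(fake_stream, sizeof fake_stream, &s);

        uint8_t *frame = decode_to_buffer(fake_stream, sizeof fake_stream,
                                          640, 480);
        if (!frame) { free(s.pixels); return 1; }

        printf("both output models produced a 640x480 frame\n");
        free(frame);
        free(s.pixels);
        return 0;
    }
    ```

    A layer like Wine has to get both paths right for every codec a game might ship, which is part of why fixing one game doesn’t automatically fix the next.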

    This is hard, tedious work: painstakingly observing and duplicating behavior, since a lot of it is not documented, and for a clean-room implementation the person implementing it cannot look at disassembled binaries or (hypothetical) code leaks.

    “Now how do you get a game to not use the codec it’s shipped with? By cracking.”

    That is just wrong. “Cracking” is the circumvention of copy protection. Before Denuvo, afaik no copy protection had data-integrity checks, so modifying game behavior (aka modding) did not require tampering with the copy protection. The best example is SKSE for Skyrim, a tool that adds a lot of additional functions to the internal scripting language while keeping the copy protection intact.