I wish I could access this study. From the summary alone I couldn’t tell how one would compare Denuvo vs non-Denuvo, when only large publishers use Denuvo, whose games are more likely to be mainstream and well known, and therefore pirated more in total than an indie title. How would one measure that a DRM is preventing this revenue loss? We barely ever have two nearly identical games released at the same time. The gaming market fluctuates, review scores fluctuate, and with them the sales numbers; some players are brand loyal while for others that matters less, etc. How has anyone found a way to calculate that? Some publishers are much more likely to use Denuvo, but their games are also more likely to be pirated because their releases are much more likely to be buggy, boring, or microtransaction-infested.
Edit: After reading the study, some issues that strike me as major popped up immediately:
1. The sales numbers are calculated from review counts.
2. The number of games cracked within two weeks is low: only 18 in total.
3. Those 18 games (8% of the data) cracked within two weeks are used to fit and extrapolate a hypothetical logarithmic curve down to less than two weeks.
4. The study acknowledges the possibility of zero revenue loss, but not in which week those zero-revenue-loss games sit, even though zero revenue loss occurs more frequently than the 20% cases (page 10, figure 6).
5. The numbers lump third-world countries with very high piracy rates together with first-world countries in all calculations. A third-world country very likely contributes a much smaller share of revenue per copy sold, yet gains outsized influence when every review is weighted the same. Approximating that via cinema-visit numbers is also rather far-fetched, since in my eyes no good cinema-rating platforms exist that aren’t as manipulable as IMDb. Which again means one has to read the other study and its conclusions on how that number emerged before using it as the basis for calculating the number of games sold.
6. Page 11 itself states that using new reviews as a proxy overstates the fall in revenue.
7. More expensive games are more likely to be bought via key stores or on sale, which cannot be factored in, since reviews again lag by an amount that depends on the type of game and its average playtime.
8. Going by 18 games makes points 1 and 2 significant: these could very well be online single-player games with long playtimes, etc., making it even more relevant to differentiate by type of game.
9. We had, in my eyes, big shifts in game quality and in games being patched later on. More often than not there is a big disparity between reviewer score and player score, affecting sales both within and beyond the first two weeks, over the years; and the study wasn’t short.
10. Fair enough, games released early on Epic and then on Steam were removed from the study, but what about simultaneous releases, or releases with or without Game Pass? Later games especially are affected by this a lot: Game Pass is cheaper than a key or a sale, and some of these games now release there on day one.
11. Points 1 and 2 combined: in my eyes, picking different games in a different period might lead to a completely different conclusion, since we can’t tell whether any of these games saw significant hype or disdain. A review score does affect the probability of a game being cracked, but does that hold for shitstorms or streamer hype? I think the latter especially would significantly alter the calculations when they rest on such a low number of games.
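The regional weighting issue above can be illustrated with a toy calculation. All figures here are invented for illustration, including the sales-per-review multiplier; none come from the study:

```python
# Hypothetical illustration (all numbers invented): why weighting every
# review the same across regions can distort a revenue estimate when
# low-price markets contribute a large share of the reviews.

# Assumed region data: (review count, average price paid in USD)
regions = {
    "high_price_market": {"reviews": 800, "avg_price": 50.0},
    "low_price_market": {"reviews": 600, "avg_price": 12.0},
}
SALES_PER_REVIEW = 50  # rough rule-of-thumb multiplier, not from the study

# Naive estimate: pool all reviews and price every sale at the high price.
total_reviews = sum(r["reviews"] for r in regions.values())
naive_revenue = total_reviews * SALES_PER_REVIEW * 50.0

# Region-aware estimate: apply each region's own average price.
weighted_revenue = sum(
    r["reviews"] * SALES_PER_REVIEW * r["avg_price"]
    for r in regions.values()
)

print(f"naive:    ${naive_revenue:,.0f}")
print(f"weighted: ${weighted_revenue:,.0f}")
print(f"overstatement: {naive_revenue / weighted_revenue:.2f}x")
```

With these made-up inputs, the pooled estimate overstates revenue by roughly half, which is the kind of skew that equal review weighting can introduce.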
The 20% revenue loss is the highest figure, an estimate extrapolated down to a day-one crack within the less-than-two-weeks window, which no game actually was. Realistic numbers start at cracks later than two weeks, and even then it’s difficult to rely on just 18 games; the figure then comes down to around 7%, rapidly dropping to 5% or less as the number of games becomes significant enough to support a conclusion. If you then keep in mind that Denuvo isn’t free for a publisher, we get close to insignificant territory.
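As a sketch of how fragile that kind of extrapolation is, here is a minimal least-squares fit of a logarithmic loss curve on invented data points. This is not the study’s actual model or data; every number below is made up. The point is that with only a handful of games, removing any single one visibly moves the extrapolated day-one figure:

```python
# Hypothetical sketch (all data invented): extrapolating a day-one
# revenue loss from a few late-cracked games via a logarithmic fit.
import math

def fit_log(points):
    """Ordinary least squares for loss = a + b * ln(day)."""
    xs = [math.log(day) for day, _ in points]
    ys = [loss for _, loss in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Invented sample: (days until crack, observed % revenue loss)
sample = [(14, 8.0), (20, 6.5), (30, 5.0), (45, 4.0), (60, 3.0), (90, 2.5)]

a, b = fit_log(sample)
# At day 1, ln(1) = 0, so the extrapolated loss is just the intercept a.
print(f"extrapolated day-1 loss: {a:.1f}%")

# Dropping a single game shifts the day-one prediction noticeably.
for i in range(len(sample)):
    a2, _ = fit_log(sample[:i] + sample[i + 1:])
    print(f"without game {i}: day-1 loss = {a2:.1f}%")
```

Even though no invented game in the sample loses more than 8%, the fitted curve predicts roughly double that at day one, which mirrors how a headline day-one number can be much larger than anything actually observed.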
My most interesting takeaway is that we haven’t had any cracked games in a while, and the author makes a side jab: “are there any groups left?” I can’t find the quote anymore; it was somewhere early on.
I want to add that I in no way see myself as superior or infallible. If anything, this showed me once again how difficult it is to read a study, and it raised my respect for people who can do this better than me. My thesis was easier than this study.
One can’t comment on your thread. My app says “post not found”. Was it deleted? Even though I can click the URL and read the comments etc. Some Fediverse thing going on.
Credit to @aldalire@lemmy.dbzer0.com for sharing.
Thanks, it’s a complicated read because of the mathematical approximations, but it’s not a bad study per se. I noted my critiques in the original comment.
Huh. Did they delete my post? Or is there a weird Lemmy reason why I can’t see it in this thread? Weird.
Edit: Nevermind. The thread I posted in was a crosspost. See here: https://lemmy.dbzer0.com/post/29567097