Well, the problem is that if its rating changes, the game will be rated based on online content that is not actually part of the content of the game. AFAIK, it would be the first instance ever of this happening. Like if Animal Crossing became rated M because of user generated content like shirt patterns showing something inappropriate.
It’s not really the game’s “fault” that user-generated content is causing a problem, so changing the rating of the game wouldn’t really change anything.
Plus, who even follows ratings anymore? We used to in the 90s, but children have been playing M-rated games for a long time. I don’t see how this is going to do literally anything, unless you are going to demand age verification to get the game, which I think is a horrendous trade-off. Change the rating of a game which is known to have a problem with grooming in DMs, in exchange for being forced to present identification to buy or play video games?
It’s quite different considering Roblox is a platform for hosting user-generated content, not a game itself. When 99.99% of the content on your website is user-generated, you can’t use “it’s just user content” as a shield anymore.
If you allow user generated content in your game, you should be responsible for it.
As for changing the rating not achieving anything, you’re wrong again.
I imagine there are a lot of adults with kids who are unaware of the chat elements of Roblox and how they expose their children to sex offenders. Changing the rating would likely cause some people to pause for thought when seeing it.
It would probably limit its visibility on some storefronts, and possibly get it removed from others. And if nothing else, I imagine the news cycle around having its rating changed would reach a lot of people who wouldn’t usually be aware of this.
100% this. The fuckers like to profit from what is essentially free labour (user gen content) while having no responsibility for it. Fuck them and Bethesda while we’re on it!
The only way to protect children on the internet is to not allow children on the internet. There is no other way to solve this problem. Parents these days treat the internet like a daycare, and when a child is allowed unmonitored access to the internet, bad things are likely to happen. I don’t want government regulation and business to take over the duties that a parent has in raising their children. It is the parents’ responsibility, not the business’s and not the government’s.
You might convince a few parents not to let their kids play Roblox (or any online game, actually) with an M rating, but most parents just don’t care. Look at how many parents buy their children M-rated games like Grand Theft Auto or Dead by Daylight without ever considering the rating or whether the content is suitable for children. This has been happening ever since video games began, whether due to ignorance or negligence. Changing the rating wouldn’t be nearly as big as the media presence the game has already had due to literal accusations/lawsuits about child abuse. If that media coverage isn’t enough to make any meaningful change to the number of children on the platform, then I have no idea what you think will.
An M rating isn’t going to change any visibility on any platform either, unless that platform has data confirming the age of the user who created the account, which is horrendously bad. Unless every online game with user-generated content or online messaging is instantly rated AO, which is a ridiculously unrealistic ask, store visibility isn’t going to change.
No, software platforms should not be held accountable for the content their users generate. If this were the case, internet service providers could be prosecuted just because nefarious actors used their networks to plan or commit crimes. And then of course entire platforms like Discord, WhatsApp, Reddit, Lemmy, Skype, Facebook, email providers, etc. would also be included in that: a ridiculous conga line of scapegoats, when all of the fault should be on the user who generated the nefarious content. Platforms should certainly do what they can to mitigate criminal activity, of course, but they are not to blame when someone misuses an aspect of their software that isn’t there specifically for nefarious purposes. This is like saying you are party/accessory to a crime just because the criminals who committed it stepped onto your property while running away from the scene/police.
It sounds like you are saying making this change will help good parents be good parents while bad parents will still be bad parents. I agree.
Your argument about not being responsible for the content your platform hosts is old. If a platform can’t moderate the content it hosts, it shouldn’t host it. Most people would even be happy with a reasonable attempt at moderation. I’m not one of those people; I don’t believe companies should profit from an unsafe product, which then prevents safe products from gaining traction because they are more expensive to run.