The BBC's dangerous disinformation about Roblox
With the broadcast of a dishonest radio program, the BBC attempts to sway the public in favour of a bill that is impractical and does little to improve the safety of children online
Preamble
The BBC has a history of inaccurate technology reporting. Recently, the BBC published an article regarding online harm to Members of Parliament (MPs), utilising an AI text filter intended to judge the “toxicity” of messages sent to them. Yet the filter was skewed to flag anti-right-wing content as “toxic”, along with a few colloquialisms.
All this sets the stage for the BBC’s punching bag, Roblox. Launched in 2006 as an online playground, it initially attracted an audience of children but has since amassed a significant proportion of teenagers and adults, with 54% of daily users being over the age of 13. In the past year, the platform has drawn ire from YouTubers such as People Make Games’ Quintin Smith. Whilst the videos raise a few valid points, the factual inaccuracies throughout both of them, in addition to the relentless harassment that their critics have faced, have resulted in supporters of the video series being regarded as personae non gratae in the Roblox development community.
Seeing the immense impact of such disinformation, the BBC realised that this fear, uncertainty and doubt could be used to drive public support for the Online Safety Bill, a bill criticised for its restrictions on speech, including a clause requiring platforms to identify and mitigate content which is “legal but harmful to adults”, not to mention clobbering end-to-end encryption by law, essentially prohibiting its use. Hence, the BBC set out to create a radio program repeating most of the claims from the People Make Games video series, along with a rota of freshly anonymised anecdotes, plus interviews with a Member of Parliament (MP) and the director of global communications at Roblox.
File on 4
The program in question is from the series File on 4, titled Roblox: A Dangerous Game? which creates an unbalanced picture of Roblox through the careful weaving of fabrication and interviews with figures such as Quintin Smith.
After the introduction, the program starts out surprisingly positively, featuring Wonuf (Josh Monk-Dalton), a Roblox developer, playing his popular game, Berry Avenue. As promising as this facade may look, facts start to become distorted the more one listens. Hayley Hassall, the reporter, calls Wonuf a “23-year-old gamer”. Whilst Roblox developers can, and in most cases do, play games, whether for pleasure or to study the mechanics and designs of other games, it is incorrect to describe someone whose occupation is developing Roblox games as a “gamer”. The only actions playing Roblox games and developing them have in common are most likely the use of the keyboard and mouse to manipulate the camera; there is quite literally nothing else shared between them. The program often refers to Wonuf in this manner, later describing him as a “millionaire gamer”, referring to the fact that Berry Avenue earns over a million dollars annually.
The program further mischaracterises life on Roblox by attempting to normalise dating on the platform, with Josh adding that he “would have our dates playing Roblox games” or “just chat over the internet and stuff”, intending to stoke panic over children chatting with strangers on Roblox, even though the two are not the same thing. This isn’t helped by Wonuf indicating that “it’s a lot more common now than previously”.
Even then, Wonuf may not have been aware that his experience would be used in such a critical radio program. Jim Booth, the producer of this program, tweeted an invitation to Abracadabra, an English Roblox development studio best known for their releases Sharkbite and its sequel, Sharkbite 2.
Yet this request is highly misleading. For one, it mentions that they want to “feature Abracadabra’s success story”, even though the testimonial would be featured in a radio program which casts a negative light on their profession (a similar instance occurred with People Make Games as well). The use of a public tweet as a communication medium is also concerning, as Abracadabra lists a business email address on their Facebook page.
The first of People Make Games’ many lies starts with a refrain from the second video (made after a Roblox public response representative thought their first video was misleading and politely asked, in one singular email, for it to be taken down and corrected, prompting them to “dig deeper”), that is, Smith’s claim that
HASSALL: Soon after it was launched in 2006, the Roblox website trumpeted the slogan, ‘Make Anything. Reach Millions. Earn Serious Cash.’
Here is a web capture of Roblox from October 25th, 2006.
As you can see, Roblox was advertised as a place to “play online with others” or to “work on your own personal creation”. The title of David Baszucki’s post on a Lego fan forum was “Feedback on Brick physics simulation - ROBLOX”, which also did not mention monetisation. That is because monetisation did not exist: Developer Exchange (the system through which Robux, the in-game currency, is exchanged for real-world currency) was not introduced until 2013, and initially had a maximum payout of US$500 per month.
Further disingenuity is sprinkled in as Hassall mentions that “Robux are a bit like casino chips”, implying that Roblox developers are gambling their money away, when this is simply not the case.
Hassall further demonstrates a lack of knowledge of the subject, suggesting that “bad actors are making money stealing from children in online games hosted by Roblox.” Smith, however, was referring to developers paying other developers, highlighting that “Roblox is offering a lot of young people […] to have a job”. Smith also heavily implied that developers under the age of 18 are skipping school to develop on Roblox, using the phrase “full-time” to describe “kids tak[ing] on full time work when they’re 12 or 13”. The only issue here is that not only are those who develop on Roblox in a more professional manner older (Roblox estimates its top 1,000 developers to be approximately 25 years of age), but those under the age of 18 are not working the 40-plus hours Smith implies. They often work much less than even part-time workers generally do, adopting schedules similar to flexible workers who are able to choose when they work, as long as they get the job done.
If this wasn’t enough, the speakers in the program confuse themselves over how to refer to the platform’s experiences, with Hassall initially asking children to give the “name of the mini game in Roblox”. Keep in mind that Roblox is a platform hosting experiences, which can include games, which can in turn contain minigames, but it is not a game itself. Later, John Staines, a former police inspector now speaking about the harms one may find online, describes entering experiences as “go[ing] into a room” whilst talking about potentially discovering sexual activity on the platform. Roblox doesn’t have rooms unless developers create them themselves, whether physical rooms or server instances which the developers have decided to name rooms. Or maybe this is just “More Roblox speak again”, an actual remark from the presenter of the program who, mind you, is supposed to be impartial.
To add to the confusion, the kids chortle out confusing timelines of unfortunate events on their Roblox accounts. Whilst it is important to keep in mind that children may lack the skills to articulate or remember long-drawn-out events, the question of accuracy comes into play as important details are often omitted in these recounts. Take, for example, one child talking about “add[ing] a friend and he basically nicked all my Robux and everything I had”. You cannot get compromised through a friend request on Roblox. There are more events in this story which culminated in this ending, not just a simple “I pressed a button and suddenly all my items were gone”.
Smith attempts to back up this claim that a “misplaced keystroke or click of a mouse can come at a cost” by pretending to buy an egg in the game Adopt Me and somehow losing all his Robux. To demonstrate how difficult this actually is, I decided to play it.
With multiple popups before actually purchasing any product, in addition to a standard three-second delay before the button in the purchase window activates, it is questionable how Smith somehow “literally just paid 45 of my actual Robux”.
One other aspect of the “We dug deeper” video that is also present in this program is the use of anonymised anecdotes, in this case, “Hunter - not his real name”. Yet these anecdotes can easily be manipulated, as discussed in a previous article regarding Smith’s second video, where inaccuracies were discovered in how he reported the situation behind one of his testimonies. Supposedly, in this testimonial,
HASSALL: Another developer claimed the pair had stolen his intellectual property to build the game, and so reported them to Roblox, and as a result Hunter had his game shut down
This refers to Roblox’s system for Digital Millennium Copyright Act (DMCA) claims, which require legal standing to file and successfully execute, suggesting that perhaps there is a side to this story that hasn’t been told. With the copyright holders seemingly following the correct legal procedures, it appears that Hunter was rather upset by this, calling it “child exploitation, because they’ve earned the money”, which, mind you, never actually changed hands as stipulated in the Roblox Terms of Use.
To add on to this hysteria of monetary madness, Quintin Smith talks about limiteds, a category of Roblox item which can be resold, mentioning that
SMITH: We have got a fedora here that costs £20,000. You saw how easy it was earlier for me to accidentally spend 70 Robux, I didn’t even rig that.
First of all, you would have to have 7.5 million Robux in your Roblox account in the first place, which is not exactly an amount that any kid is likely to have. Second of all, to purchase this item, you first have to go to its page, which can involve a search on Google or on Roblox itself, then click the Buy button, followed by another dialogue box asking if you really want to make a poor financial decision, before you are in possession of one. Or you can trade up for one, as literally all other Roblox traders do. Yet despite this litany of dialogue boxes, Smith argues that Roblox “is a platform that’s deeply unsafe”. Hayley Hassall agrees with this notion, asking James Kay, Roblox’s Senior Director of International Communications,
HASSALL: if you’re buying an item in the hope that it will increase in value, that’s still gambling?
which is an unusual interpretation of reselling non-perishable goods, as Kay points out:
KAY: The trading of limited items is something very specific and actually any Robux earned through that process isn’t eligible to be converted into real world currency
Something to which Hassall objects, as she indicates:
HASSALL: I think most psychologists would say that gambling is a behaviour.
Strangely, there appears to be no scientific research to back up this claim of Limited items being considered gambling. A 2019 study by the University of Bergen on the link between video games and gambling utilised the Canadian Problem Gambling Index, which uses phrases such as “did you go back another day to try to win back the money you lost” and “have you needed to gamble with larger amounts of money”, behaviours which are not synonymous with Roblox’s trading systems, which rely on the valuation of the items themselves (which can be easily manipulated) and the consent of both parties, who are able to view the valuations of the items being traded. Notably, the study also found a significant increase in the number of “video game problem players” and “video game addicts”, attributing this to the increase in video game players, as well as these players playing for longer periods of time. The statistically significant increase in gambling problems was attributed by the researchers to increased gambling advertising and increased gambling accessibility, with more aggressive gambling games.
Somehow, Hassall manages to find games “recommended to me by Roblox algorithms” whilst nobody else has actually been able to easily find them. The first game, titled “No Kids Allowed”, teleports players to another game, which teleports to a game which is either private or (more likely) deleted from Roblox. For reference, this is what the game page looks like:
And this is the icon:
The account which uploaded this game was last on Roblox on February 25th, 2021. Combined with the fact that the game was last updated in 2020, well before the release of the Age Restriction feature, and that the “age restriction” Hassall speaks of is actually just part of the game’s title, it is interesting that this game was picked on in this radio show. The other two games are “Taliban Operation”, a game that is nowhere to be found, and “a strip club called X, Y, Z”, which is in fact several games sharing this phrase, including a public bathroom simulator, which technically do not include any inappropriate activities. Supposedly these games “listed there were recommended to me by Roblox algorithms”; yet given that they barely break tens of players, if they even exist at all, they absolutely do not meet the threshold required to appear in Roblox game recommendations, a fact that many indie developers on the platform face every single day. Yet it appears that Hassall plays a game of absolutism, proclaiming that
HASSALL: We have found evidence that children are being harmed and upset on this platform, so I would say it’s not safe until you prevent that harmful content being on the site in the first place.
Should society be thrown out if some people turn out to be bad?
Of course, children are a natural part of any sort of society and with their knowledge (or lack thereof) it can be tempting to use them as a battering ram to further your own ideologies. So comes the classic, paladinian cliché of
HASSALL: I would say it’s not safe until you prevent that harmful content being on the site in the first place. And if you can’t do that, then you shouldn’t be allowing children on Roblox.
The MP interviewed in this program is Julian Knight, a supporter of the Online Safety Bill. In this interview, he assures the listener that to ensure safety on social media, you “need to have very, very firm age assurance and age verification”.
The only trouble is that Roblox already has age verification. However, as many found out, age verification doesn’t scale, with long downtimes during the first weeks of the release of Spatial Voice (a feature which required age verification to use).
The other issue is that not everyone has an identification document. Sure, everyone has a birth certificate, but Roblox’s provider, Veriff, requires ID cards to have a photo and to be of a type it recognises, an issue that many faced when trying to get verified. It is a situation that the UK government is all too familiar with, its voter ID legislation discriminating against holders of photo Oyster (transport payment) cards unless they were over the age of 65.
Another issue well known to Knight is the use of computer algorithms to automate parts of moderation. During the second reading of the Online Safety Bill, Knight stated that
Content moderation is decided by algorithms, based on terms and conditions drawn up by the social media companies without any real public input. That is an inadequate state of affairs. Furthermore, where platforms have decided to act, there has been little accountability, and there can be unnecessary takedowns, as well as harmful content being carried. Is that democratic? Is it transparent? Is it right?
However, in this interview, Knight seems to advocate for more algorithmic moderation, not less, pointing to
KNIGHT: There are ways in which, through algorithms that companies can fairly quickly tell whether or not someone is the age that they say they are. Language modification moderators, for example, can do that through algorithms, so you can see if someone is really a child or is an adult. And also, you can ask people to put their face in a frame, and then that frame then will detect whether or not that person has an adult or child’s bone structure. At the moment, it seems to be that it’s still the Wild West when it comes to age verification and age assurance.
The natural question is, of course: who is going to develop these algorithms? After all, Knight does not believe that companies are doing an adequate job with their own algorithmic moderation. Should the Government step up to do it, and if so, which entity? Should it be in the hands of a third party, and if so, who should that third party be? What are the mechanisms to regulate these algorithms? The next question is what “language modification moderators” means, as a quick Google search pulls up little beyond a meta-study on different teaching practices with regards to English as a Second Language (ESL) speakers.
What Knight could potentially be referring to is using natural language processing to determine age. Yet this poses some problems. One example is ESL speakers, who may possess the English-language abilities of a much younger person and thus be incorrectly flagged as younger than they actually are. Another issue is the source material: Roblox text chat messages are usually rather short, often a line long, and utilise telegraphic phrasing (a key factor in determining a writer’s age), which can skew language models into thinking that users are much younger than they actually are. The final issue to be discussed here is that whilst child language acquisition focuses on how caregivers teach children language, the reverse can also be true once the child is older: linguistic change often originates in circles involving younger segments of the population, who are more involved in cultural changes, before spreading out to older segments, a phenomenon known as the wave model. Statistical models are trained on datasets from the past, which is useful only if events are generally predictable; the trouble is that this is contested in linguistics, as some linguistic changes can be easily explained whilst others simply occur randomly.
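To make the skew concrete, here is a toy sketch (entirely hypothetical; the features, weights and threshold are invented for illustration, and any real age-estimation model would be far more sophisticated) of how a naive classifier keyed on telegraphic features would flag terse adult game chat as child-like:

```python
# Hypothetical sketch: a naive "age from writing style" heuristic.
# Short, telegraphic messages score as "young" regardless of the
# writer's real age, illustrating the skew described above.

def telegraphic_score(message: str) -> float:
    """Sum of 'telegraphic' signals: short words, no punctuation,
    few function words. Higher = more child-like under this toy model."""
    words = message.split()
    if not words:
        return 1.0
    avg_len = sum(len(w) for w in words) / len(words)
    has_punct = any(c in ".,;:" for c in message)
    function_words = {"the", "a", "an", "of", "to", "is", "are"}
    fn_ratio = sum(w.lower() in function_words for w in words) / len(words)
    score = 0.0
    if avg_len < 4:       # short words read as child-like
        score += 0.4
    if not has_punct:     # unpunctuated chat reads as child-like
        score += 0.3
    if fn_ratio < 0.1:    # telegraphic phrasing drops function words
        score += 0.3
    return score

def guess_is_child(message: str, threshold: float = 0.6) -> bool:
    return telegraphic_score(message) >= threshold

# An adult typing typical terse game chat trips every feature:
print(guess_is_child("gg wp lets go again"))  # True under this toy model
# whilst the same adult's longer, punctuated prose does not:
print(guess_is_child("That was a good match; shall we queue for another one?"))  # False
```

The point is precisely the confound discussed above: the register of in-game chat, not the age of the writer, dominates the prediction.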
Anecdotes are further sourced from children, ranging from general taunting such as “I know [a specific personal detail]” to more extreme examples such as “he asked me to send pictures of my feet, so I sent him some pics.” The police inspector from earlier returns to discuss the various social networking features such as “voice chat, large volumes of text chat and then [sic] links with other platforms”. As discussed before, voice chat was initially launched with a requirement for identity verification but is now rolling out to users who claim to be over 13 and have a phone number associated with their Roblox account. Links to other social media sites likewise depend on claiming a birth date placing the user over the age of 13. Text chat can easily be disabled in Roblox’s account settings page.
One of the following anecdotes involves Henry, a 13-year-old boy who was unfortunately a victim of catfishing and self-harm. Even then, this compelling story has holes in it. For instance, Nicola, the parent, mentioned that the measures she took included “deleting his account”. Later, however, Nicola stated that “[Roblox] chose to ignore that. They chose to pretend that it wasn’t happening”, even though account deletion can only be performed by Roblox support. Another issue raised is that “somehow the Roblox app cannot be blocked on a computer by parental controls because it’s in an app”, which is very vague, given that, like any application, Roblox can be blocked on any platform it runs on. There appears to be no mention of Nicola using measures such as blocking Roblox on the router, which would stop Henry from accessing Roblox entirely.
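As a minimal sketch of what device-level blocking looks like (the domain list here is illustrative, not exhaustive; Roblox uses many more domains in practice), it can be as simple as hosts-file entries pointing Roblox’s domains at a non-routable address:

```
# /etc/hosts (Linux/macOS) or C:\Windows\System32\drivers\etc\hosts (Windows)
# Illustrative entries only; a complete block needs Roblox's full domain list.
0.0.0.0 roblox.com
0.0.0.0 www.roblox.com
0.0.0.0 web.roblox.com
```

A DNS block configured on the router achieves the same effect for every device on the home network at once, no app-specific parental controls required.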
The program concludes there. No interviews with anybody against the Online Safety Bill, no commentary about the other areas which the Bill affects, no fact-checking of any claims. The only semblance of “unbiased” reporting is the small snippet at the beginning highlighting Berry Avenue (whose inclusion in this program Wonuf might not even be aware of) and James Kay’s interview.
The Online Safety Bill
More concerning still are the contents of the bill that the BBC is pushing in the File on 4 program. At the end of the program, Hayley Hassall mentions that
HASSALL: The Online Safety Bill is due to be published in May next year. Nicola hopes it will make a difference.
Yet what actual difference will the Bill make? On the child safety front, not much. On other fronts though, the impact is dangerously authoritarian.
Regarding Roblox, though, the provisions required of it are rather interesting. It is likely that Roblox will be classified as a Part 3, Category 1 service, meaning it is a user-to-user service (Part 3), with Category 1 services designated by Ofcom as
the highest reach user-to-user services with the highest risk functionalities, with transparency requirements, a duty to assess risks to adults of legal but harmful content, requirements relating to fraudulent advertising and a variety of other duties
Roblox is also a search service, given that a search service is defined as “an internet service that is, or includes, a search engine” and Roblox does not fall within the exceptions later stipulated.
One of these unworkable provisions forced on Roblox and companies in similar classifications is in Part 3, Chapter 2: if Roblox identifies “non designated content that is harmful for children”, it has a duty to notify Ofcom of these incidents. Given the scale at which platforms such as Roblox operate, this provision seems infeasible given the number of notifications that could be generated on a platform where new experiences are frequently added.
Another provision assigned to Roblox, given the proportion of children on the platform, is to create a child risk assessment. So, what is a child risk assessment? According to the Bill, it is “an assessment of the following matters, taking into account the risk profile that relates to services of that kind”, with these matters including the user demographic, the level of risk of harm in content, how different people may react to said content, how this harmful content can be created and, most importantly,
the level of risk of functionalities of the service facilitating the presence or dissemination of content that is harmful to children, identifying and assessing those functionalities that present higher levels of risk
The examples given here are “enabling adults to search for other users of the service (including children)” and “enabling adults to contact other users (including children) by means of the service”. However, Ofcom has the power under Part 7, Chapter 6, Section 115 to punish those it deems unsatisfactory, encouraging companies to over-compensate. This could lead to results such as children being unable to access experiences created by adults, which comprise every single professionally-made Roblox experience (i.e. all popular Roblox experiences), since Roblox states the creators of an experience on its page and developers could potentially contact children in servers in-game.
Search services are also obligated to provide child risk assessments under Part 3, Chapter 3, so services such as Google and DuckDuckGo will have to restrict search results for users in the UK, or give users the option to present ID, as demonstrated by Part 3, Chapter 2, Section 11, item 3a:
prevent children of any age from encountering, by means of the service, primary priority content that is harmful to children (for example, by using age verification, or another means of age assurance)
Age verification is pervasive in the Online Safety Bill, as the bill also forces Roblox to (Part 3, Chapter 2, Section 14, item 7)
(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and
(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.
This would mean that developers would have to verify themselves through Roblox or else see their maximum reach be drastically reduced.
As to what age verification entails, this is set out in Part 4 Chapter 1:
57 User identity verification
(1) A provider of a Category 1 service must offer all adult users of the service the option to verify their identity (if identity verification is not required for access to the service).
(2) The verification process may be of any kind (and in particular, it need not require documentation to be provided).
Whilst this does stipulate that documentation does not necessarily have to be provided, the majority of verification providers do rely on it. Notably however, this is the only section of the bill which includes the phrase “identity verification”, therefore it is unknown whether age verification means identity verification, or some other means such as issuing a small charge to a credit card.
One other issue regards “content of democratic importance”. Historically, Roblox has not allowed political content of any kind on the platform, barring its opposition to the Stop Online Piracy Act (SOPA). However, provisions in Part 3, Chapter 2, item 6 state that Roblox must allow
content is or appears to be specifically intended to contribute to democratic political debate in the United Kingdom or a part or area of the United Kingdom
if it is from a news organisation or is regulated user-generated content in relation to that service. This means that with the passage of this bill, content that is currently part of political debate (such as whether those who identify as transgender are valid, or whether abortion rights should be restricted), which has never been allowed on Roblox, must be allowed on Roblox. This also extends beyond experiences, with user-generated content in this bill covering everything from text chat in those experiences to the direct messages users receive as well.
More recently, there has been a litany of amendments made to the bill, ranging from clarifications of parts of the dense 230-page bill to more concerning changes such as replacing “functionalities” with “characteristics”. Amendment 1 defines characteristics as
In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”
This is meant to be appended to the end of Schedule 11, page 198 (with an additional amendment, applied at the beginning of this quotation, switching the power from the Secretary of State to Ofcom):
OFCOM must make regulations specifying conditions (“Category 1 threshold conditions”) for the user-to-user part of regulated user-to-user services relating to each of the following—
(a) number of users of the user-to-user part of the service, and
(b) functionalities of that part of the service
Expanding Ofcom’s purview from merely the objective functionalities of the platforms to factors over which a platform may not have absolute control, such as its userbase, in addition to the generic catch-all of “other systems and processes”, is deeply concerning. Ofcom’s regulatory power essentially ends up being “did you hurt our feelings” rather than a careful cross-examination of the platform and how it works.
Yet one of the biggest losers in this bill is not Roblox but secure, end-to-end encryption. Part 7, Chapter 4 concerns Ofcom’s power to issue “information notices” to those who operate a user-to-user service or a search service, services which supply those services, services which can provide access to those services, as well as people who are required to provide the information in the notice and people who happen to possess it. Section 93, item 4 of this chapter states that
A person commits an offence if, in response to an information notice, the person— (a) provides information which is encrypted such that it is not possible for OFCOM to understand it, or produces a document which is encrypted such that it is not possible for OFCOM to understand the information it contains, and
(b) the person’s intention was to prevent OFCOM from understanding such information
This means that if the provider of an end-to-end encrypted messaging service were served a notice which they are unable to comply with, they would face punishment as a result.
Part 7 Chapter 5 Section 104 item 4 cements this, requiring providers to use an Ofcom-accredited scanning provider, presumably on the client.
(4) A notice under subsection (1) that relates to a combined service is a notice requiring the provider of the service to do any of the following—
(a) use accredited technology as described [to take down CSEA and terrorist materials], or both, in relation to the user-to-user part of the service;
(b) use accredited technology as described [to ensure that CSEA and terrorist materials are unsearchable], or both, in relation to the search engine of the service;
(c) use accredited technology as described in subsection (2)(a) or (b), or both, in relation to the user-to-user part of the service, and use accredited technology as described in subsection (3)(a) or (b), or both, in relation to the search engine
Obviously, this has not gone down well with the privacy community, leading to several blog posts from concerned individuals, an open letter and the Electronic Frontier Foundation commenting on the issue.
Making a stand
If you’re in the United Kingdom
Contact your local MP. Send them an email, give them a phone call, perhaps turn up in person to one of their events and chat to them in person about the impact that the Bill will have on you.
The House of Commons will discuss the bill on December 5th, 2022. It is vital to speak up in these few days to ensure your voice will be heard. Sure, you can tweet about it, but tweeting alone won’t do anything to change or stop the bill.
For everyone (else)
There are two avenues to complain about BBC programming which are accessible to everyone across the world, either complaining to the BBC directly or to Ofcom.
To complain to the BBC, go to the BBC complaints page. Submit a complaint about a radio program first broadcast on Radio 4 called File on 4, broadcast on Tuesday, 25th October 2022 at 20:00. State that you listened to the program on-demand. You can use points from this article in several categories (i.e. Smith’s inclusion and the lack of opposition to the Bill under bias; factual errors and inaccuracies such as the “Earn Serious Cash” claim). Require a response to your complaint. Fill out your details, review and submit your complaint. Keep the complaint number that you are sent; it will be useful later.
Whilst the Ofcom route is usually blocked off until you have received a response from the BBC’s Executive Complaints Unit (ECU), because the File on 4 program interviews those under the age of 18, you can also directly send a complaint to Ofcom through their complaints page under exceptional circumstances. As before, the program title is File on 4, broadcast on the 25th of October at 20:00. Fill out the form as it requests.
Ironically, in this fight for the truth, the task of convincing the vast swathes of people who fell into the lull of this wave of disinformation has been made easier. Rather than educating what seems to be every single internet user under the sun, all that need to be convinced are the small group of people who are the arbiters of these news agencies. We have spent a long while defending and blocking, but now we can finally fight back.
(A meta-study is a study of various studies on a subject, combining their results to form a more definitive conclusion.)
(If the birth date set on the account makes the resulting age under 13, the user is unable to modify their own birth date.)