Third Xbox Transparency Report Shows Our Evolving Approach to Creating Safer Gaming Experiences

We are continuously evolving, as is the gaming community. As the gaming industry grows, safety systems are becoming more sophisticated, faster, and more agile to keep players safe from toxicity. Today we are pleased to release our latest Xbox Transparency Report, which contains data and insights about our efforts to make gaming safer, more inclusive, and more enjoyable for all players.

Xbox is constantly adapting its technology and approach to meet the changing needs of our industry, including exploring and applying artificial intelligence. AI plays an increasingly significant role in accelerating content moderation around the world, and our team actively innovates with responsible applications of AI to create safer player experiences that can benefit the entire gaming industry. Building on our existing safety practices, we aim to balance the need for human oversight with the ever-evolving capabilities of AI.

We already use several AI models to detect toxic content. Community Sift, a content moderation platform powered by AI and human insights, filters and classifies billions of interactions each year and underpins many of our Xbox safety systems. Turing Bletchley, a multilingual foundation model that scans user-generated imagery, helps ensure that only appropriate content is shown. We're actively exploring ways to further enhance our systems with AI and Community Sift so we can better understand the context of interactions, operate at greater scale for our players, elevate and augment the capabilities of our human moderators, and reduce moderator exposure to sensitive content.

Our systems are designed to help players feel included, respected, and safe. Our proactive approach and technology allow us to stop inappropriate content before it is ever shared with players on our platform, and proactive enforcements curb unwanted conduct and content. As noted in the Transparency Report, 87% of all enforcements during this period (17.09 million) resulted from our proactive moderation efforts.

Among the key takeaways from the report:

  • New insights into blocked content volumes – Preventing toxicity before it reaches our players is a crucial component of our proactive moderation efforts to provide a welcoming and inclusive experience for all. We combine responsible AI with human oversight to review harmful content and stop it from appearing on our platform. To better measure our success, we're now including a new dataset covering our work in this space called ‘Toxicity Prevented’. Over this period, more than 4.7M pieces of content were blocked before reaching players, including a 39% increase from the last period in blocked imagery, thanks to investments in utilizing the new Turing Bletchley v3 foundation model. 
  • Increased emphasis on addressing harassment – We are committed to creating a welcoming and safe environment for everyone, and our goal is to identify and address abusive behavior such as hate speech, harassment, and bullying. With that goal in mind, we've improved our internal processes to increase our proactive enforcement efforts, issuing 84k proactive enforcements for harassment and bullying this period (+95% from the last period). In addition, we launched our voice reporting feature to help us capture and report in-game voice abuse, and our safety teams continue to work to educate players that abusive behavior is unacceptable.  
  • Understanding player behavior after enforcement – We are always looking for opportunities to help players better understand the Community Standards. To that end, we've been analyzing how players behave after receiving an enforcement. Early insights indicate that the majority of players do not violate the Community Standards again after receiving an enforcement and go on to engage positively with the community. To further support players in understanding what is and is not acceptable behavior, we recently launched our Enforcement Strike System, which is designed to help players better understand enforcement severity, the cumulative effect of multiple enforcements, and the total impact on their record.  

In addition to the Transparency Report, we continue to collaborate with our partners around the world to improve safety within the gaming sector: 

  • Minecraft and GamerSafer join forces to encourage servers committed to safety. Mojang Studios and GamerSafer have teamed up with the Minecraft community to curate the Official Minecraft Server List, so players can easily identify third-party server providers that adhere to safety standards. Mojang Studios works with GamerSafer to regularly update the policies and standards for listing servers. All featured servers adhere to the Minecraft Usage Guidelines and meet additional requirements, such as stating the server's purpose and intended audience. Servers can also earn badges that demonstrate their dedication to safety and security best practices. Server community managers can register to list their servers, and players can contact a server with questions or to report an issue. The list helps server community managers build fun and safe communities, while making it easy for players and parents to find positive experiences. 
  • Xbox Gaming Safety Toolkits launched in Japan and Singapore. These local resources empower parents and caregivers to better understand online safety in gaming and manage their children's experience on Xbox. The toolkits provide information on common safety concerns, guidance tailored to children of different ages, and Xbox features that can make gaming safer for the whole family. Xbox Gaming Safety Toolkits were previously released for Australia, New Zealand, and other countries.  

Together we can build a fun community where everyone can play within their own boundaries, free from intimidation and fear. To achieve this, we continue to provide tools and systems that enable players to interact respectfully. Our voice reporting feature allows players to capture and report inappropriate voice activity in any multiplayer game with in-game voice chat. We also offer the Enforcement Strike System, which gives players more information about how their actions affect their overall experience on the platform. 

We look forward to bringing everyone along with us on this safety journey.  
