Katie Zigelman (Head of Sales, Spectrum Labs)
Stephen Peacock (Head of ML/AI for Games, Amazon Web Services)
Joel Silk (Senior Director of Trust & Safety Operations, Roblox)
Josh Newman (CTO & Co-Founder, Spectrum Labs)
Aoife McGuinness (Trust & Safety Manager, Wildlife Studios)
Location: Room 3009, West Hall
Date: Thursday, March 23
Time: 11:30 am - 12:30 pm
Pass Type: All Access Pass, Core Pass, Summits Pass, Expo Pass, Audio Pass, Independent Games Summit Pass
Topic: Design, Programming
Format: Sponsored Session
Vault Recording: Not Recorded
Audience Level: All
Whether you work in Product and Engineering, User Experience, or Trust & Safety, content moderation has become critically important to protecting game companies and players from hate speech, grooming, radicalization, and more. Whoever your players are, whether children, diverse ethnic communities, or any other audience, content moderation that scales with your player base is essential for player experience and user retention, as well as for regulatory compliance around the world.
In this panel, AWS, Spectrum Labs, Wildlife Studios, and Roblox discuss best practices for game developers seeking the right mix of humans and technology for their stage of growth and regulatory environment. From the key criteria for setting up a trust and safety team to the engineering infrastructure that lets you scale across languages, countries with different regulations, and games with different age ranges or player profiles, all without hiring an army of human moderators, you'll learn road-tested practices that work as well as the newest technologies available for scaling.
From transparency reporting for DSA compliance to the key roles on a moderation team, you'll hear how content moderation is evolving and what you can do to protect your players and your own business.
Takeaway
AI detects toxic and positive behaviors across multiple languages.
AI reduces manual moderation of content with real-time actioning.
The architecture and implementation of Spectrum Labs AI as an AWS customer.
Using a GDPR- and SOC2-compliant AI vendor mitigates your own compliance risk.
Intended Audience
Team members working on the trust and safety, user experience, product, or engineering team of an online platform with user-generated content.