Our Lab has been focused on how to get more major tech platforms — especially search engines, video platforms, and social media — to improve their policies, algorithms, and interfaces for justice information online. What do people find on Google, YouTube, Facebook, Reddit, Instagram, TikTok, Siri, and elsewhere when they seek out information about legal problems like evictions, domestic violence, debt collection, divorce, child custody, traffic tickets, or expungement?
One of our colleagues at the Stanford Internet Observatory told us about a program that lets domain experts help a tech platform improve its information policies.
YouTube runs a “Trusted Flagger” program for people who work in government agencies or nongovernmental organizations (NGOs). These experts can apply to become Trusted Flaggers by demonstrating their credentials and expertise in a policy area. Once Google and YouTube approve them for the role, Trusted Flaggers can report instances of misinformation, harmful content, or other policy concerns on YouTube.
YouTube keeps a shortlist of policy areas where it’s looking for Trusted Flaggers, including violent content, illegal content, hate speech, vaccine and Covid-19 misinformation, and election misinformation.
From our perspective in the justice community, this program is exciting as a model for getting domain experts’ knowledge about problems, harms, and opportunities to tech platform teams. But it’s also concerning that justice issues and other public services are not among the listed policy areas.
We could imagine similar programs for domain experts in civil and criminal justice, taxes, immigration, government services, and other public policy topics, who could raise flags about harmful content or algorithmic dynamics. Or, in reverse, domain experts could have a clear channel to alert platform teams about which websites, services, rules, and programs are authoritative, free, and reliable in different jurisdictions.
This could help tech platform teams understand more about the harms and opportunities with content in these policy domains, and improve the likelihood that people get local, actionable information about public services to help them with life problems.