This page contains resources for academics, practitioners, and other researchers who want to understand the status quo of the legal Internet, and who want to create and evaluate new interventions to improve it. Below is a growing bibliography of articles and reports.
Our Research on Better Legal Internet
Our team at Stanford Legal Design Lab has been researching how people use the Internet to deal with their legal problems, what search engines and social media show them, and what interventions can improve the provision of legal help online.
Here are some of our recent publications, followed by a bibliography of others’ research on access to justice online.
Does Googling Justice Work? Auditing Search Engines’ Effectiveness as Intermediaries of Legal Information
Margaret D. Hagan and Nora al-Haider, “Does Googling Justice Work? Auditing Search Engines’ Effectiveness as Intermediaries of Legal Information,” UCLA Journal of Law and Technology, Forthcoming.
Online search engines are key providers of legal information. Their responses to people’s search queries can influence whether and how people make use of the legal system to deal with problems like evictions, domestic violence, debt collection, and natural disasters. This article presents a new research protocol to understand and evaluate what search engines are showing to people who are seeking out legal help. Using this novel search audit protocol, the article identifies concerning trends in search engines’ responses to people’s legal queries, including low-quality misinformation, information for the wrong jurisdiction, and an absence of governmental or legal aid links. The article then proposes technical and policy strategies that may improve search engines’ role in people’s attempts to access the justice system online.
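One signal such an audit can measure is how many of a query’s top results come from governmental or legal aid sources. The sketch below is purely illustrative, not the article’s actual protocol; the allowlist, domains, and example results are all hypothetical.

```python
# Illustrative sketch (not the article's actual audit protocol): given the
# top results a search engine returned for a legal query, measure what share
# come from governmental or legal aid domains.
from urllib.parse import urlparse

# Hypothetical allowlist of trusted source patterns.
TRUSTED_SUFFIXES = (".gov",)
TRUSTED_HOSTS = {"lawhelp.org", "legalaid.org"}

def trusted_share(result_urls):
    """Return the fraction of results from governmental/legal-aid sources."""
    if not result_urls:
        return 0.0
    trusted = 0
    for url in result_urls:
        host = urlparse(url).netloc.lower()
        if host.endswith(TRUSTED_SUFFIXES) or host in TRUSTED_HOSTS:
            trusted += 1
    return trusted / len(result_urls)

# Hypothetical results for an eviction-help query.
results = [
    "https://www.courts.ca.gov/selfhelp-eviction.htm",
    "https://example-settlement-mill.com/free-eviction-guide",
    "https://lawhelp.org/issues/housing",
]
print(trusted_share(results))  # 2 of 3 results are from trusted sources
```

Repeating a computation like this across many queries, jurisdictions, and dates is what turns a spot check into an audit.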
The Supply and Demand of Legal Help on the Internet
Margaret D. Hagan, “The Supply and Demand of Legal Help on the Internet,” Legal Tech and the Future of Civil Justice, edited by David Freeman Engstrom. Cambridge University Press, Forthcoming.
Faith in technology as a way to narrow the civil justice gap has steadily grown alongside an expanding menu of websites offering legal guides, document assembly tools, and case management systems. Yet little is known about the supply and demand of legal help on the internet. This chapter mounts a first-of-its-kind effort to fill that gap by measuring website traffic across the mix of commercial, court-linked, and public interest websites that vie for eyeballs online. Commercial sites, it turns out, dominate over the more limited ecosystem of court-linked and public interest online resources, and yet commercial sites often engage in questionable practices, including the baiting of users with incomplete information and then charging for more. Search engine algorithms likely bolster that dominance. Policy implications abound for a new generation of A2J technologies focused on making people’s legal journeys less burdensome and more effective. What role should search engines play to promote access to quality legal information? Could they, or should they, privilege trustworthy sources? Might there be scope for public-private partnerships, or even a regulatory role, to ensure that online searches return trustworthy and actionable legal information?
Digital Inequalities and Access to Justice: Dialing into Zoom Court Unrepresented
Victor D. Quintanilla, Kurt Hugenberg, Margaret Hagan, Amy Gonzales, Ryan Hutchings, and Nedim Yel. “Digital Inequalities and Access to Justice: Dialing into Zoom Court Unrepresented,” Legal Tech and the Future of Civil Justice, edited by David Freeman Engstrom. Cambridge University Press, Forthcoming.
This chapter explores how virtual proceedings actually unfold for low-income persons in the everyday and serve to construct their status as pro se litigants. To date, much of the conversation has lauded Zoom court proceedings as the future of access to justice, often centering this praise on idealized and optimistic forms of online proceedings, despite persistent and pressing digital divides. In a marked departure, this chapter will examine how these new technologies actually affect the experiences of low-income unrepresented persons. We do so by presenting worrying findings from an ongoing empirical project examining the experience of pro se litigants in Indiana’s courts. We then link those findings to new theories of psychology about how pro se litigants perform their pro se status. The chapter closes by crafting a path forward for virtual court proceedings that can capture some of the efficiency and other benefits of the online migration without harming the very demographic remote proceedings purport to serve.
The User Experience of the Internet as a Legal Help Service
Hagan, Margaret. “The User Experience of the Internet as a Legal Help Service: Defining Standards for the Next Generation of User-Friendly Online Legal Services.” Virginia Journal of Law and Technology, Vol. 20, No. 394, 2016. Available at SSRN: https://ssrn.com/abstract=2942478
This Article presents empirical research about how the Internet is currently failing laypeople who are searching online for legal help with their life problems, and what a future agenda of user-centered standards and practices for better legal help on the Internet could be.
It first examines the existing literature about how the Internet can best be used as a legal resource and the status quo of legal help sites. Then it surveys and examines negative consumer reports and reviews of legal help websites. Finally, it presents the first study of how laypeople search for resources to resolve a legal issue, how they scout and assess legal help services online, and their feedback on which existing legal help sites they consider to be the most usable, the most trustworthy, and the most valuable.
This data is useful to propose new best practices about how these tech-based services can best serve laypeople, in terms of usability, quality of service, and protection of the users’ interests. It also confirms the importance of the Internet as a legal help service and highlights the need for more research and development on better online legal help sites that fit laypeople’s needs and preferences.
Redesigning Justice Innovation: A Standardized Methodology
Bernal, Daniel W, and Margaret D Hagan. “Redesigning Justice Innovation: A Standardized Methodology.” Stanford Journal of Civil Rights and Civil Liberties XVI, no. 2 (June 2020): 335–84. https://law.stanford.edu/publications/redesigning-justice-innovation-a-standardized-methodology/
Post Turner v. Rogers, courts, advocates, and academics are increasingly investing in access to justice research and development. However, despite many descriptions of how past justice interventions developed, and established methodologies for rigorous evaluation of outcomes, no consensus has yet emerged on which design methodologies produce the best justice innovations. Without an intentional, replicable approach to developing usable and useful justice interventions, interventionists are more likely to create products that few people use, or to waste time and money on expensive randomized trials. To address this need, this Article integrates existing expert-oriented and user-centered approaches and presents a first attempt at establishing a standard methodology for creating and vetting new justice interventions. In addition, to demonstrate the dangers of designing without a comprehensive framework and the difficulties of applying an ideal framework in the real world, we offer a detailed case study of the initial version of Arizona Eviction Help. Ultimately, we argue that just as randomized field experiments have become the status quo in evaluation of justice interventions, a human-centered, participatory approach should become the standard in their design.
A Human-Centered Design Approach to Access to Justice
Hagan, Margaret D. (2018) “A Human-Centered Design Approach to Access to Justice: Generating New Prototypes and Hypotheses for Intervention to Make Courts User-Friendly,” Indiana Journal of Law and Social Equality: Vol. 6 : Iss. 2 , Article 2, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3186101
How can the court system be made more navigable and comprehensible to unrepresented laypeople trying to use it to solve their family, housing, debt, employment, or other life problems? This Article chronicles human-centered design work to generate solutions to this fundamental challenge of access to justice. It presents a new methodology: human-centered design research that can identify key opportunity areas for interventions, user requirements for interventions, and a shortlist of vetted ideas for interventions. This research presents both the methodology and these “design deliverables” based on work with California state courts’ Self Help Centers. It identifies seven key areas for courts to improve their usability, and, in each area, proposes a range of new interventions that emerged from the class’s design work. With its proposed hypotheses and prototypes for new interventions, this research lays the groundwork for pilots and randomized controlled trials that can be evaluated and, ideally, have a practical effect on how comprehensible, navigable, and efficient the civil court system is.
Research on how people use the Internet to find legal help
Justice Connect. “Seeking Legal Help Online: Understanding the ‘missing majority’.” Victoria, Australia. 2020. https://justiceconnect.org.au/about/digital-innovation/missing-majority-report/
Michigan Advocacy Program and Graphic Advocacy Project. “Build a better _____: Strategies for user-informed legal design.” 2021. https://www.lsntap.org/sites/lsntap.org/files/MAP%20x%20GAP%20Report%20-%20Strategies%20for%20User-Informed%20Legal%20Design.pdf
Denvir, Catrina. “What Is the Net Worth? Young People, Civil Justice and the Internet.” University College London, 2014. https://pdfs.semanticscholar.org/b584/c82bbc1baebd435a36ac1aa25001930344fa.pdf.
Denvir, Catrina, Nigel J. Balmer, and Pascoe Pleasence. “Surfing the Web – Recreation or Resource? Exploring How Young People in the UK Use the Internet as an Advice Portal for Problems with a Legal Dimension.” Interacting with Computers 23, no. 1 (January 2011): 96–104. https://doi.org/10.1016/j.intcom.2010.10.004.
Denvir, Catrina. “Online and in the Know? Public Legal Education, Young People and the Internet.” Computers and Education 92–93 (2016): 204–20. https://doi.org/10.1016/j.compedu.2015.10.003.
Denvir, Catrina, Nigel J. Balmer, and Pascoe Pleasence. “Portal or Pot Hole? Exploring How Older People Use the ‘Information Superhighway’ for Advice Relating to Problems with a Legal Dimension.” Ageing and Society 34, no. 04 (December 7, 2012): 670–99. https://doi.org/10.1017/S0144686X12001213.
Pleasence, Pascoe, Nigel J. Balmer, and Catrina Denvir. “Wrong about Rights: Public Knowledge of Key Areas of Consumer, Housing and Employment Law in England and Wales.” Modern Law Review 80, no. 5 (September 1, 2017): 836–59. https://doi.org/10.1111/1468-2230.12290.
Balmer, Nigel J, Alexy Buck, Ash Patel, Catrina Denvir, and Pascoe Pleasence. “Knowledge, Capability and the Experience of Rights Problems: Report for PLEnet.” London, UK, 2010.
Balmer, Nigel J., Marisol Smith, Catrina Denvir, and Ash Patel. “Just a Phone Call Away: Is Telephone Advice Enough?” Journal of Social Welfare …, 2012, 1–32. http://www.tandfonline.com/doi/abs/10.1080/09649069.2012.675465.
Pleasence, Pascoe T, Nigel J Balmer, and Catrina Denvir. “Navigating the Legal Advice Maze-Knowledge, Expectations and the Reality of Advice Seeking.” Accessed January 25, 2020. https://ssrn.com/abstract=2730405.
Hagan, Margaret. “The Justice Is in the Details: Evaluating Different Self-Help Designs for Legal Capability in Traffic Court.” Journal of Open Access to the Law 7, no. 1 (2019). https://doi.org/10.2139/ssrn.3475124.
Hagan, Margaret. “The User Experience of the Internet as a Legal Help Service : Defining Standards for the next Generation of User-Friendly Online Legal Services.” Va. JL & Tech. 20, no. 2 (2016): 395–465. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2942478.
Ronkainen, Anna. “Software Usability and Legal Informatics.” Available at SSRN 2162380, 2012, 1–10. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2162380.
Clarke, John A, and Bryan D. Borys. “Usability Is Free: Improving Efficiency by Making the Court More User Friendly.” Future Trends in State Courts, 2011. https://ncsc.contentdm.oclc.org/digital/collection/ctadmin/id/1844/.
Mastarone, Ginnifer L., and Susan Feinberg. “Access to Legal Services : Organizing Better Self-Help Systems.” 2007 IEEE International Professional Communication Conference, October 2007, 1–5. https://doi.org/10.1109/IPCC.2007.4464041.
Research on legal taxonomies, ontologies, and semantic web
This collection of resources is about how we can standardize legal concepts and resources, to improve how people find and use them — and how to make more automation and tools that can build on top of them.
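A small sketch can show why standardized codes matter for automation: if an intake tool and a resource directory share one taxonomy, they can interoperate without custom mapping. The codes and labels below are invented for illustration, loosely modeled on hierarchical legal issue taxonomies such as LIST (the Legal Issues Taxonomy).

```python
# Hypothetical hierarchical issue codes -- not a real taxonomy's codes.
TAXONOMY = {
    "HO": "Housing",
    "HO-02": "Housing / Eviction",
    "HO-02-01": "Housing / Eviction / Notice received",
    "MO": "Money and Debt",
    "MO-01": "Money and Debt / Debt collection",
}

def ancestors(code):
    """Return a code and each of its parents, most specific first."""
    parts = code.split("-")
    return ["-".join(parts[:i]) for i in range(len(parts), 0, -1)]

# A resource tagged at the broader "Eviction" level still matches a more
# specific user problem, because the codes nest predictably.
resource_tags = {"HO-02"}
user_problem = "HO-02-01"
matches = any(c in resource_tags for c in ancestors(user_problem))
print(matches)  # True: an eviction resource matches a "notice received" problem
```

The predictable nesting is the point: any tool that understands the code scheme can do this matching, without knowing anything about the other tool.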
Manzoli, Serena. “Taxonomies Make the Law. Will Folksonomies Change It?” VoxPopuLII, Legal Information Institute at Cornell Law School. Ithaca, NY, April 2013. https://blog.law.cornell.edu/voxpop/2013/04/29/taxonomies-make-the-law-will-folksonomies-change-it/.
Costantini, Federico. “#Unfair #Law: Folksonomies & Law between Openness and Knowledge.” In CEUR Workshop Proceedings, Vol. 1105, 2014. http://www.dati.gov.it.
Lauritsen, Marc. “Intelligent Tools for Managing Legal Choices.” Proceedings of the 13th International Conference on Artificial Intelligence and Law – ICAIL ’11, 2011, 106–10. https://doi.org/10.1145/2018358.2018373.
Moens, Marie-Francine, Erik Boiy, Raquel Mochales Palau, and Chris Reed. “Automatic Detection of Arguments in Legal Texts.” Proceedings of the International Conference on AI & Law, 2007, 225–30. https://doi.org/10.1145/1276318.1276362.
Despres, Sylvie, and Sylvie Szulman. “Construction of a Legal Ontology from a European Community Legislative Text.” Jurix, 2004, 79–88.
Breuker, Joost, Abdullatif Elhag, Emil Petkov, and Radboud Winkels. “Ontologies for Legal Information Serving and Knowledge Management,” n.d.
Casellas, Nuria. “Semantic Enhancement of Legal Information… Are We Up for the Challenge?” VoxPopuLII, 2011. https://blog.law.cornell.edu/voxpop/2011/01/18/semantic-enhancement-of-legal-information…-are-we-up-for-the-challenge-revised-repost/
Mommers, Laurens. “On the Ontological Status and Representation of Legal Concepts.” Proceedings of the Eleventh Conference of Legal Knowledge-Based Systems (JURIX’98), 1998, 45–58.
Research on how people use the Internet to solve problems
Much of this research comes from engineers, designers, information scientists, and computer scientists. You can search for more in the ACM Library. Sometimes it is framed around Information Retrieval (discovering appropriate resources for a problem); sometimes around User Engagement, Experience, and Learning Gains (measuring how people use and benefit from online resources).
Ortiz-Cordova, Adan, and Bernard J Jansen. “Site-Searching Strategies of Searchers Referred from Search Engines.” Accessed January 3, 2021. https://doi.org/10.5555/2655780.2655845.
Jiang, Jiepu, Daqing He, and James Allan. “Searching, Browsing, and Clicking in a Search Session: Changes in User Behavior by Task and over Time.” In SIGIR 2014 – Proceedings of the 37th International ACM SIGIR Conference on Research and Development in Information Retrieval, 607–16. New York, NY, USA: Association for Computing Machinery, 2014. https://doi.org/10.1145/2600428.2609633.
Wu, Zhijing, Mark Sanderson, B. Barla Cambazoglu, W. Bruce Croft, and Falk Scholer. “Providing Direct Answers in Search Results: A Study of User Behavior.” In International Conference on Information and Knowledge Management, Proceedings. 2020, pp 1635-1644. https://dl.acm.org/doi/10.1145/3340531.3412017
Bilal, Dania, and Joe Kirby. “Differences and Similarities in Information Seeking: Children and Adults as Web Users.” Accessed November 7, 2019. www.elsevier.com/locate/infoproman.
Rose, Daniel E, and Danny Levinson. “Understanding User Goals in Web Search.” In Thirteenth International World Wide Web Conference Proceedings, WWW2004, 13–19, 2004. https://doi.org/10.1145/988672.988675.
Kohli, Shruti, Sandeep Kaur, and Gurrajan Singh. “A Website Content Analysis Approach Based on Keyword Similarity Analysis,” 2012. https://doi.org/10.1109/WI-IAT.2012.212.
Reinecke, Katharina, Tom Yeh, Luke Miratrix, Rahmatri Mardiko, Yuechen Zhao, Jenny Liu, and Krzysztof Z Gajos. “Predicting Users’ First Impressions of Website Aesthetics With a Quantification of Perceived Visual Complexity and Colorfulness,” n.d.
Hannak, Aniko, Piotr Sapiezynski, Arash Molavi Kakhki, Balachander Krishnamurthy, David Lazer, Alan Mislove, and Christo Wilson. “Measuring Personalization of Web Search.” In Proceedings of the 22nd International Conference on World Wide Web – WWW ’13, 527–38. New York, New York, USA: ACM Press, 2013. https://doi.org/10.1145/2488388.2488435.
Robertson, Ronald E., Shan Jiang, Kenneth Joseph, Lisa Friedland, David Lazer, and Christo Wilson. “Auditing Partisan Audience Bias within Google Search.” Proceedings of the ACM on Human-Computer Interaction 2, no. CSCW (November 1, 2018): 1–22. https://doi.org/10.1145/3274417.
Epstein, Robert, and Ronald E Robertson. “The Search Engine Manipulation Effect (SEME) and Its Possible Impact on the Outcomes of Elections,” 2015. https://doi.org/10.1073/pnas.1419828112.
Manning, Christopher D., Prabhakar Raghavan, and Hinrich Schütze. Introduction to Information Retrieval. Cambridge University Press, 2008. https://nlp.stanford.edu/IR-book/.
Research on Internet interventions to solve social problems
Cheng, Qijin, and Elad Yom-Tov. “Do Search Engine Helpline Notices Aid in Preventing Suicide? Analysis of Archival Data.” Journal of Medical Internet Research 21, no. 3 (March 1, 2019): e12235. https://doi.org/10.2196/12235.
Sillence, Elizabeth, and Pam Briggs. “Please Advise: Using the Internet for Health and Financial Advice,” 2004. https://doi.org/10.1016/j.chb.2004.11.006.
Heussner, Ki Mae. “Google ‘Suicide’ Search Feature Offers Lifeline.” ABC News, 2019. https://abcnews.go.com/Technology/google-suicide-search-feature-offers-lifeline/story?id=10313064.
Aramaki, Eiji, Sachiko Maskawa, and Mizuki Morita. “Twitter Catches The Flu : Detecting Influenza Epidemics Using Twitter.” Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing 2011 (2011): 1568–76. https://doi.org/10.1136/emermed-2011-200617.7
Young, Sean D. “A ‘Big Data’ Approach to HIV Epidemiology and Prevention.” Preventive Medicine, 2015. https://doi.org/10.1016/j.ypmed.2014.11.002
Eckhoff, Philip A., and Andrew J. Tatem. “Digital Methods in Epidemiology Can Transform Disease Control.” International Health, March 1, 2015. https://doi.org/10.1093/inthealth/ihv013
Salathé, Marcel, Linus Bengtsson, Todd J. Bodnar, Devon D. Brewer, John S. Brownstein, Caroline Buckee, Ellsworth M. Campbell, et al. “Digital Epidemiology.” Edited by Philip E. Bourne. PLoS Computational Biology 8, no. 7 (July 26, 2012): e1002616. https://doi.org/10.1371/journal.pcbi.1002616
Burris, Scott, Alexander C Wagenaar, Jeffrey Swanson, Jennifer K Ibrahim, Jennifer Wood, and Michelle M Mello. “Making the Case for Laws That Improve Health: A Framework for Public Health Law Research.” The Milbank Quarterly 88, no. 2 (2010): 169. https://doi.org/10.1111/j.1468-0009.2010.00595.x
Thangarajan, Narendran, Nella Green, Amarnath Gupta, Susan Little, and Nadir Weibel. “Analyzing Social Media to Characterize Local HIV At-Risk Populations.” In Proceedings of the Conference on Wireless Health – WH ’15, 15:1–8, 2015. https://doi.org/10.1145/2811780.2811923
Susumpow, Patipat, Patcharaporn Pansuwan, Nathalie Sajda, and Adam W Crawley. “Participatory Disease Detection through Digital Volunteerism.” In Proceedings of the 23rd International Conference on World Wide Web – WWW ’14 Companion, 663–66, 2014. https://doi.org/10.1145/2567948.2579273
Ho, Daniel E., and Kristen M Altenburger. “When Algorithms Import Private Bias into Public Enforcement: The Promise and Limitations of Statistical Debiasing Solutions.” Journal of Institutional and Theoretical Economics 175, no. 1 (2018): 98. https://doi.org/10.1628/jite-2019-0001
Research on measuring quality and value of online resources
Suh, Hyewon, Nina Shahriaree, Eric B Hekler, and Julie A Kientz. “Developing and Validating the User Burden Scale.” In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems – CHI ’16, 3988–99, 2016. https://doi.org/10.1145/2858036.2858448.
Jain, Mohit, Pratyush Kumar, Ramachandra Kota, and Shwetak N Patel. “Evaluating and Informing the Design of Chatbots,” 895–906, 2018. https://doi.org/10.1145/3196709.3196735.
Lagun, Dmitry, and Mounia Lalmas. “Understanding and Measuring User Engagement and Attention in Online News Reading.” In WSDM 2016 – Proceedings of the 9th ACM International Conference on Web Search and Data Mining, 113–22. New York, NY, USA: Association for Computing Machinery, Inc, 2016. https://doi.org/10.1145/2835776.2835833.
Lalmas, Mounia, and Liangjie Hong. “Tutorial on Metrics of User Engagement: Applications to News, Search and E-Commerce.” In WSDM 2018 – Proceedings of the 11th ACM International Conference on Web Search and Data Mining, 2018-February:781–82. New York, NY, USA: Association for Computing Machinery, Inc, 2018. https://doi.org/10.1145/3159652.3162010.
Roy, Nirmal, Felipe Moraes, and Claudia Hauff. “Exploring Users’ Learning Gains within Search Sessions.” In CHIIR 2020 – Proceedings of the 2020 Conference on Human Information Interaction and Retrieval, 432–36. New York, NY, USA: Association for Computing Machinery, Inc, 2020. https://doi.org/10.1145/3343413.3378012.
Eliason, Emma, and Jonas Lundberg. “The Appropriateness of Swedish Municipality Web Site Designs.” Accessed May 9, 2018. http://delivery.acm.org.stanford.idm.oclc.org/10.1145/1190000/1182481/p48-eliason.pdf.
Lai, Cora Sio Kuan, and Guilherme Pires. “Testing of a Model Evaluating E-Government Portal Acceptance and Satisfaction.” The Electronic Journal Information Systems Evaluation 13, no. 1 (2010): 35–46. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.212.3650&rep=rep1&type=pdf.
Tsohou, Aggeliki, Habin Lee, Zahir Irani, Vishanth Weerakkody, Ibrahim Osman, Abdel Anuz Latif, and Tunc Medeni. “Evaluating E-Government Services From a Citizens’ Perspective: A Reference Process.” 2012, 146–53.
Barendrecht, Maurits. A Handbook for Measuring the Costs and Quality of Access to Justice, 2010. http://books.google.com/books?id=eZ4cCsfaXyUC&pgis=1.
Work on Schema.org markup, semantic web, and search companies
This thread of research has fewer academic studies, and comes more from practitioners at technology companies or SEO consultants.
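A concrete sense of what this markup looks like may help. Schema.org does define a `LegalService` type with properties like `name`, `url`, `telephone`, and `areaServed`; the organization and values below are hypothetical. The sketch builds the JSON-LD that a legal help site could embed in a `<script type="application/ld+json">` tag so search engines can recognize it as a legal service for a particular area.

```python
# Build hypothetical Schema.org JSON-LD for a legal aid organization.
# LegalService and these properties exist in the schema.org vocabulary;
# the organization, URL, and phone number are made up.
import json

markup = {
    "@context": "https://schema.org",
    "@type": "LegalService",
    "name": "Example County Legal Aid",
    "url": "https://legalaid.example.org",
    "telephone": "+1-555-0100",
    "areaServed": {"@type": "AdministrativeArea", "name": "Example County"},
}

# Serialize for embedding in a page's <script type="application/ld+json"> tag.
print(json.dumps(markup, indent=2))
```

Structured data like this is one of the levers the readings below examine: it lets search engines surface jurisdiction and service type directly, rather than inferring them from page text.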
Schema.org. “How We Work,” n.d. http://schema.org/docs/howwework.html.
Guha, R. V., Dan Brickley, and Steve MacBeth. “Schema.Org: Evolution of Structured Data on the Web.” Queue 13, no. 9 (November 1, 2015): 10–37. https://doi.org/10.1145/2857274.2857276.
Price, Chuck. “What Is Schema Markup & Why It’s Important for SEO.” Search Engine Journal, 2019. https://www.searchenginejournal.com/technical-seo/schema/.
Montti, Roger. “Google Confirms That Structured Data Improves Targeting.” Search Engine Journal, April 3, 2018. https://www.searchenginejournal.com/structured-data-and-ranking/246993/#close.
PCMag. “Google Is Making It Easy to Find Where and When You Can Vote.” Accessed November 23, 2020. https://www.pcmag.com/news/google-is-making-it-easy-to-find-where-and-when-you-can-vote.
Hansell, Saul. “Google Keeps Tweaking Its Search Engine – New York Times.” The New York Times, 2007. https://www.nytimes.com/2007/06/03/business/yourmoney/03google.html?ex=1338523200&en=f003faaf084c0a72&ei=5124&partner=digg&exprod=digg.
Grind, Kirsten, Sam Schechner, Robert McMillan, and John West. “How Google Interferes With Its Search Algorithms and Changes Your Results.” The Wall Street Journal, 2019. https://www.wsj.com/articles/how-google-interferes-with-its-search-algorithms-and-changes-your-results-11573823753.
Meusel, Robert, Dominique Ritze, and Heiko Paulheim. “Towards More Accurate Statistical Profiling of Deployed Schema.Org Microdata.” Journal of Data and Information Quality 8, no. 1 (October 1, 2016): 1–31. https://doi.org/10.1145/2992788