
Are We Simply Keeping Up? Discussing Predictive Searching With New Legal Researchers

“Only by understanding the biases of the media through which we engage with the world can we differentiate between what we intend, and what the machines we’re using intend for us–whether they or their programmers even know it.”[1]

Slaw previously published an excellent post written by Amelia Landenberger, Legal Information Librarian at the Boston University Fineman and Pappas Law Libraries. It outlines a research activity where students are asked to find a pair of black dress shoes online. The exercise reveals the complexity behind a simple research question. Students learn about personal bias, what questions to ask before beginning a search, where to search, and database-specific tools like filters and sorting options.[2] It is a fantastic, approachable activity for introducing students to thinking critically about research. If you teach research in any capacity, I recommend reading the original post and trying the activity. I would also like to suggest an additional prompt that creates space for a broader discussion about bias and information-seeking behaviours.

I have tried this activity a few times with (mostly) great success. However, during one attempt with an online class, I had students dropping links to shoes in the chat before I had finished explaining the activity. I asked these incredibly fast students to share their search process so I could replicate it for the class. You can guess what came next: they had simply googled “black shoes” (without the quotation marks or any mention of the word dress). Though I was expecting this, I realized I had failed to consider that searching for a product would show students targeted ads with images. For example, these are the top results when I search for black shoes:

[Image: top Google results for a “black shoes” search, showing shopping ads with product images]

I chalked the experience up to my own poor instruction and a few external variables. However, later that day as I reflected, I realized that this activity had mimicked current legal research trends and information-seeking behaviours quite accurately. Researchers want fast, simplified results, and legal research platforms are coming up with innovative ways to deliver them. I should have used the moment to teach about predictive research suggestions, search engine bias, and innovation bias.

Google does provide autocomplete predictions, but they are based on geolocation, popularity, and similar signals.[3] The targeted ads with images are closer to the predictive research suggestions on legal research platforms in that they attempt to provide the answer to your question. They are also similar in the way they draw attention to what appears to be the quickest, simplest option and away from the remainder of the potential search results. Jakob’s law tells us that users prefer that all websites, and search engines, work the same way as the ones they already know.[4] It makes sense that legal research platforms have adopted this feature, and that they will continue to provide improved predictions over time with increased use and availability of user data.

Based on available data from the fall of 2021, the majority of students entering a Canadian law school in 2022 are between 23 and 25 years old.[5] This means they were born between 1997 and 1999. Avoiding the use of a dreaded generational label, this incoming cohort of students is part of a technology-proficient demographic. They have grown up with a tsunami of information available to them at any minute through natural language searches. Fast and easy.

As a legal research example, Westlaw Edge Canada uses predictive research suggestions. The feature is described as “… an intuitive starting point, especially when you need quick answers [emphasis added].”[6] One characteristic of an efficient legal researcher is speed, though this is a skill that is developed over time through practice. When speaking about technology, intuitive usually means easy to use. Fast and easy.

However, fast and easy is not always accurate. Many studies have shown that “… the presentation of search engine results affects users’ credibility judgment, selection making, and belief and attitude shaping of information.”[7] Predictive research suggestions are not inherently bad, but they can distract new users from learning how to judge the quality, relevance, and significance of research results in the specific context of their question. The ideal result is not always at the top of the list (search engine bias), and it is not always the newest feature that finds it (innovation bias). Pointing out the similarities between targeted ads and predictive suggestions can create an opportunity for students to consider search engine bias and innovation bias. This is a good entry point to a discussion about both personal and technological bias in legal research.

Legal research platforms are following Jakob’s law and trying to meet users’ existing skills and preferences. I am constantly reminding myself to try to do the same while teaching. I do love a bound volume of statutes and prefer holding the pages I am reading, but I am excited about the future of legal research technology and have tried to incorporate more of it into my teaching. Students entering law school now will have access to technologies far beyond those available today, and they need to be prepared with research skills that are transferable across platforms and tools. As I was writing this post, Thomson Reuters announced Westlaw Precision, the newest iteration of Westlaw (US), with six new features ranging from augmented content to improved AI capabilities.

While I am intrigued to see where this takes legal research, for now I will continue using the shoe shopping activity to remind students to consider both their personal bias and the bias rooted in all research platforms. They know what they want and, although suggestions can be helpful, they should control their research. I will close this post with another Rushkoff quote:

“… we tend to think less about how to integrate new tools into our lives than about how simply to keep up.”[8]

 

 ________________________

[1] Douglas Rushkoff, Program or Be Programmed (Berkeley: Soft Skull Press, 2011) at 27.

[2] Amelia Landenberger, “Shoe Shopping as an Entry Point to Teach Legal Research” (4 March 2021) online: Slaw <slaw.ca/2021/03/04/shoe-shopping-as-an-entry-point-to-teach-legal-research/>.

[3] Danny Sullivan, “How Google autocomplete predictions are generated” (8 October 2020) online: Google <blog.google/products/search/how-google-autocomplete-predictions-work/>.

[4] “Jakob’s Law” (last visited 15 September 2022) online: Laws of UX <lawsofux.com/en/jakobs-law/>.

[5] Hannah Steeves, “2021 1L Statistics for Canadian Law Schools” (15 September 2022) online: <docs.google.com/spreadsheets/d/1IOQz6U3Fb6RS6j1fUdwq9IbC6jRELtcHuiJTOr1CKb8/edit?usp=sharing>.

[6] Westlaw Edge Canada, “Common Queries” (last visited 15 September 2022) online: Thomson Reuters <thomsonreuters.ca/en/westlaw-edge/features/common-queries.html>.

[7] Ruoyuan Gao & Chirag Shah, “Toward Creating a Fairer Ranking in Search Engine Results” (2020) 57:1 Information Processing and Management 102138 at 2.

[8] Supra note 1 at 21.
