MyReviewers LLC is pleased to receive a second NSF award: I-Corps. The goal of I-Corps is to advance and guide entrepreneurs as they commercialize technology funded by NSF. Please see our abstract below and keep an eye on our blog as we update you on progress.
Abstract for Investigating the Commercialization of Peer Review Tools and Writing Analytics, PI: Joe Moxley, firstname.lastname@example.org
Few educators or students dispute the importance of quality feedback on students’ writing. Critical feedback and the opportunity to revise form the foundation of secondary and postsecondary education. However, over the past 100 years, researchers have repeatedly found that teachers’ grades lack validity and reliability; that teachers’ comments on papers often lack helpful, critical commentary; and that students do not know how to critique other students’ writing even though they are frequently assigned peer reviews. While teachers’ shared use of rubrics has led to high inter-rater reliability scores, comparisons of teachers’ scores across sections of the same course, even in contexts where the assignments and rubrics are shared, have found great variance in teachers’ scores and comments. Given both this variance in teachers’ grades and comments and the effects of grade inflation, teachers’ grades are often not considered a valid measure of student learning. Current approaches to coaching and assessing student writing, however time-consuming and well-intentioned, may therefore fail to provide students with the feedback they need to improve as writers and peer reviewers.

Improving feedback on writing may also help the U.S. compete globally. U.S. literacy rankings have been declining over the past decade: in 2012, the Programme for International Student Assessment reported that the U.S. had fallen from 10th to 20th in its latest global literacy rankings. In 2013, the College Board reported that 57% of SAT takers did not qualify as college ready; the ACT found that 31% of high school graduates failed to meet ACT College Readiness Benchmarks; and the most recent NAEP Writing Report determined that 73% of 12th graders scored Below Basic or Basic, as opposed to Proficient or Advanced, in 2011.
This research seeks to help expedite document markup, peer-review, team-project, eportfolio, and writing-assessment processes; empower instructors and students to provide more helpful, timely feedback; better prepare students for peer reviews and team projects; and enable high schools, colleges, and universities to more accurately assess learning, improve retention, and prepare accreditation reports.

During the I-Corps program, the team will interview stakeholders (writing program administrators, faculty, and university administrators concerned with retention) to evaluate mockups for writing analytics, student-retention alerts, data visualizations, and intelligent tutoring systems. For example, we will ask writing program administrators whether they would be likely to adopt and use tools that report inter-rater agreement between instructors’ scores and students’ peer-review scores, or that provide lexical analysis of instructors’ and students’ written commentaries. Additionally, we will hold focus groups with instructors and students to ascertain the helpfulness of NLP-based reports that vet peer reviews. Ultimately, this research will explore the efficacy of second-generation digital tools: tools that feed writing analytics back to users based on how they use the tool, such as lexical analysis of comments, reports of inter-rater agreement among reviewers, reports that analyze the length and quality of feedback, or badges that incentivize quality reviews.

By the end of the I-Corps program, the team will have prioritized a development plan and a business plan for commercializing the technology. Ultimately, commercializing a second-generation assessment tool that improves feedback and assessment will enable the U.S. to compete on global measures of cognitive, interpersonal, and intrapersonal competencies.
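To give a concrete sense of the inter-rater agreement reporting mentioned above, here is a minimal Python sketch computing Cohen’s kappa, a standard chance-corrected agreement statistic, between an instructor’s rubric scores and aggregated peer-review scores. The function name and sample data are hypothetical illustrations, not part of the MyReviewers codebase:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who assigned categorical scores to the same set of documents."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must score the same non-empty set of documents")
    n = len(rater_a)
    # Observed agreement: fraction of documents scored identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(
        (counts_a[label] / n) * (counts_b[label] / n)
        for label in counts_a.keys() | counts_b.keys()
    )
    if p_expected == 1:
        return 1.0  # degenerate case: both raters gave every document one score
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical rubric scores (1-4) for five drafts:
instructor = [3, 2, 4, 3, 1]
peer_panel = [3, 2, 3, 3, 1]
print(round(cohens_kappa(instructor, peer_panel), 2))  # → 0.71
```

A report built on a statistic like this could flag course sections where instructor and peer scores diverge sharply, which is one way the proposed analytics might surface feedback-quality problems to writing program administrators.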