We are pleased to announce that the University of South Florida, in partnership with the Massachusetts Institute of Technology, North Carolina State University, Dartmouth, and the University of Pennsylvania, has received $677,811 in funding from the National Science Foundation. Between January 2016 and January 2019, approximately 10,000 students taking STEM courses at USF, NCSU, Dartmouth, MIT, and Penn will use MyReviewers to upload papers and to receive feedback on them from STEM instructors and peers.
As part of the NSF PRIME award, a team of researchers is deeply engaged with the MyReviewers corpus: Val Ross, Director of Critical Writing at UPenn; Christiane Donahue, Director of Writing at Dartmouth; Chris Anson, Director of the Campus Writing and Speaking Program at North Carolina State University; Suzanne Lane, Director of New Media at MIT; Alex Rudniy, a computer science professor at Fairleigh Dickinson; Dave Eubanks, Vice President at Furman; and Norbert Eliot, professor emeritus at NJIT. The researchers are investigating the efficacy of document critique and peer review in STEM courses and mapping the development of three competency domains (cognitive, interpersonal, and intrapersonal) from the perspective of self-efficacy, demographics, socioeconomic status, and gender.
This research is shaped by the National Research Council’s (NRC’s) depiction of 21st-century competencies as a mixture of cognitive, intrapersonal, and interpersonal skills and knowledge. We are especially keen to explore critical feedback and collaborative work from a non-cognitive perspective, given that “cognitive test scores explained about 6 percent of the variance in performance, leaving 94 percent to be explained by other factors” (Pellegrino & Hilton, 2012, p. 50) and that “empirical studies show that cognitive competencies are able to account for only a small fraction of the association between education and earning” (p. 49).
As a byproduct of this research effort, STEM faculty at these institutions will develop disciplinary Community Comments: hyperlinked comments that reviewers can place on student papers, each linking out to an article, a video, and “try it” exercises related to the comment. By analyzing instructor and student reviews by demographics, we hope to disrupt the failure narrative our research has identified, in which students in the lowest quartile drop out or fail to improve over time as writers or collaborators.
The Promoting Research and Innovation in Methodologies for Evaluation (PRIME) program seeks to support research on evaluation with special emphasis on: (1) exploring innovative approaches for determining the impacts and usefulness of STEM education projects and programs; (2) building on and expanding the theoretical foundations for evaluating STEM education and workforce development initiatives, including translating and adapting approaches from other fields; and (3) growing the capacity and infrastructure of the evaluation field.
This project will have critical significance for Science, Technology, Engineering, and Mathematics (STEM) educators by improving students’ writing and collaboration skills, areas of importance to economics, science, and national security. This study focuses on teacher and peer interactions and on writing quality and improvement in the context of undergraduate STEM courses. Specifically, the project will map the development of three competency domains (cognitive, interpersonal, and intrapersonal) by researching the effects of teacher and peer response on writing improvement and knowledge adaptation in STEM courses. The project utilizes a web-based assessment tool called MyReviewers (MyR). The tool will be piloted by STEM faculty in college-level introductory biology or chemistry courses on the campuses of the University of South Florida (USF), North Carolina State University (NCSU), Dartmouth, the Massachusetts Institute of Technology (MIT), and the University of Pennsylvania (UPenn). Research domains include both academic performance and inter/intrapersonal competencies. Project deliverables will provide new tools and procedures to assist in the assessment of students’ knowledge, skills, and attitudes for project and program evaluation.
Approximately 10,000 students enrolled in STEM courses at USF, NCSU, Dartmouth, MIT, and UPenn will upload their course-based writing to MyReviewers, an assessment tool, and use the tool to conduct peer reviews and team projects. This information is supplemented by surveys of demographics and dispositions along with click patterns within the toolset. Researchers will subsequently analyze this wealth of data using predictive modeling of student writing ability and improvement, including text-based methods to identify useful features of comments, papers, peer reviews, student evaluations of peers’ reviews, and instructor and student meta-reflections. Outcome goals are to (1) demonstrate ways the assessment community can use real-time assessment tools to create valid measures of writing development; (2) provide quantitative evidence regarding the likely effects of particular commenting and scoring patterns on cohorts of students; (3) offer a domain map to help STEM educators better understand student success in the STEM curriculum; and (4) inform STEM faculty regarding the efficacy of peer review.