Unlike a typical Learning Management System such as Canvas or Blackboard, which operates as a digital repository for student work, MyReviewers is a digital ecology and learning platform that provides actionable analytics for a variety of stakeholders. MyReviewers structures learning and supports student success through online survey tools that measure self-regulation and self-efficacy, ongoing opportunities for reflection during the writing process, and workflows such as Peer Review and Revision Plans. Because the platform is criterion-based, student drafts and final submissions are scored according to rubrics tailored to each assignment.
Beginning in the Fall 2017 semester, instructors and students will be able to access MyReviewers from your institution’s Canvas LMS via a Canvas navigation button, with Canvas handling single sign-on. Alternatively, MyReviewers can run as a stand-alone application integrated with your institution’s single sign-on.
The current price of MyReviewers for English students at the University of South Florida is $47 per student per semester. This price covers the four custom digital e-books that students use in MyReviewers. Under this adoption model, instructors and administrators have free access, and students purchase access through a paywall. If desired, we can offer a similar pricing structure for your institution’s writing program at a reduced rate: $35 per semester, or $50 per year, including access to all four e-books.
Four Custom E-books
Analytics Tools and Corpus Access
Peer Review Workflow
All users can file a help ticket. The Student, Instructor, and Administrator Manuals provide video tutorials and simple instructions for our intuitive interface.
Students can upload PDFs, Word docs, and PowerPoints, as well as embed YouTube or Vimeo URLs. Instructors and administrators can upload documents and e-books.
Students and instructors can leave annotations, sticky notes, and media-rich comments on student documents or in rubric text boxes.
Instructors and admins can customize the following: Courses, Project Templates, Rubrics, and Grade Scales. Admins can standardize assessment and accreditation reporting.
Users can create a comment library of their own or take advantage of MyReviewers’ media-rich comments.
Students can review all the feedback they received from instructors and peers in one place, allowing them to quickly determine areas that need improvement.
Instructors can assign peer review teams with no limits on the number of teams or students per team. The tool streamlines grading and holds peers accountable by aggregating feedback.
After reviewing their aggregated feedback and scores, students reflect on the comments they received and form a plan to improve the next draft or project.
Admins can assign multiple graders to score a student’s work. If needed, they can assign additional rounds of grading, in which another grader helps resolve major discrepancies. E-portfolio reports compute inter-rater agreement, identify students at risk of failing, compare instructors’ lexical comments, visualize grade distributions, and facilitate grade norming.
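For readers curious how an agreement statistic of this kind works, the following is a minimal sketch of Cohen’s kappa for two graders’ letter-grade scores. It is illustrative only, using made-up grades, and does not reflect MyReviewers’ internal implementation:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores (e.g., letter grades)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items where both raters gave the same score.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical letter grades from two graders on the same six drafts.
grades_a = ["A", "B", "B", "C", "A", "B"]
grades_b = ["A", "B", "C", "C", "A", "A"]
print(round(cohens_kappa(grades_a, grades_b), 2))  # prints 0.52
```

Kappa near 1 indicates strong agreement beyond chance; values near 0 suggest the graders agree no more often than random scoring would.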
MyReviewers’ surveys include a demographic survey, a writing background survey, and an end-of-semester survey. Self-reflection and rate-the-rater measures are also integrated.
Reports include data on grade distribution, aggregated comments and feedback, completed assignments, average grades, and more.
Administrators and approved researchers can access all data (e.g., research surveys, comments on papers, and scores) in a single Excel file that is sortable by variable. This information is provided both as institutional data and as IRB opt-in data.
This feature is intended for manual analysis. An Excel file is produced for each rubric an institution uses. Every row contains a single review, including a clickable link to a text file, school code, writer code, grader code, grader type (instructor or student), class code, class name, draft name (initial, intermediate, final, etc.), project name (CV, response letter, etc.), comments left by the reviewer for each rubric criterion, rubric criterion scores, weights assigned to rubric criteria, and the final draft score with its corresponding letter grade. If surveys are used, responses from the pre-, post-, and upload surveys (which contain demographic questions, among others) are included.
This export is intended for automated analysis using a programming language (e.g., R, Perl, or Python). The data are the same as above, except that reviews and related data are exported to a single large delimited text file with linked draft files; connections are preserved by document IDs.
This export is likewise intended for automated analysis using a programming language and/or for convenient import into database-driven storage (e.g., Microsoft Access, Microsoft SQL Server, MySQL, or MongoDB).
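As an illustration of the database-import workflow, the sketch below loads a small tab-delimited sample into SQLite using Python’s standard library. The column names (school_code, writer_code, etc.) are hypothetical stand-ins; the real export’s schema depends on your institution’s rubrics:

```python
import csv
import io
import sqlite3

# Hypothetical sample of the delimited export; real column names may differ.
export = io.StringIO(
    "school_code\twriter_code\tgrader_type\tdraft_name\tscore\n"
    "USF\tW001\tinstructor\tfinal\t92\n"
    "USF\tW002\tstudent\tinitial\t78\n"
)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE reviews (school_code TEXT, writer_code TEXT, "
    "grader_type TEXT, draft_name TEXT, score INTEGER)"
)
rows = csv.DictReader(export, delimiter="\t")
conn.executemany(
    "INSERT INTO reviews VALUES (?, ?, ?, ?, ?)",
    [(r["school_code"], r["writer_code"], r["grader_type"],
      r["draft_name"], int(r["score"])) for r in rows],
)
# Once loaded, standard SQL queries apply.
avg = conn.execute("SELECT AVG(score) FROM reviews").fetchone()[0]
print(avg)  # prints 85.0
```

The same DictReader loop works against the full export file in place of the in-memory sample, and the target could as easily be MySQL or SQL Server.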
MyReviewers allows each student to opt out, i.e., to prohibit the use of his or her data in research studies. The three datasets mentioned above are exported for (1) all students and (2) opt-ins only. All data can be used by the Institutional Research department, while the opt-in data may be shared with academic researchers after de-identification.
Each dataset is also de-identified by removing student and instructor names from draft texts. An institution may choose to receive identified data, de-identified data, or both.
Several data summaries with descriptive statistics are produced in Excel format, showing counts of drafts per class, per instructor, and per draft type.
An Excel pivot table is produced showing counts of drafts, which end users can arrange using multiple variables from the original dataset.
A report contains the most frequent n-grams, n-gram frequency histograms, and word clouds.
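As a rough illustration of how such n-gram frequencies can be computed, here is a short Python sketch; it is not MyReviewers code, and the sample sentence is invented:

```python
from collections import Counter

def ngram_counts(text, n=2):
    """Count word n-grams in a text (lowercased, whitespace-tokenized)."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

sample = "revise the thesis and revise the evidence"
print(ngram_counts(sample).most_common(1))  # prints [(('revise', 'the'), 2)]
```

Run over a corpus of student drafts or reviewer comments, the resulting counts feed directly into frequency histograms and word clouds.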
The MyReviewers team is excited to plan a program of instruction and assessment for your institution.
For further information, please contact Joe Moxley at firstname.lastname@example.org.
USF Connect Tampa Bay Technology Incubator
3802 Spectrum Boulevard
Box 8, Slot 5
Tampa, FL 33612