Increasing our Pace: Measuring Student Learning using Rubrics and ePortfolios


Summary

Our Interdisciplinary ePortfolio Assessment Pilot began in Spring 2011. Each spring, faculty who use ePortfolio in their teaching volunteer to allow their students’ ePortfolios to be reviewed for evidence that students are achieving our University Core Curriculum learning outcomes. Students upload work to their ePortfolios to demonstrate competence in at least one of the following outcomes: Written Communication, Information Literacy and Research, and Analysis. A group of reviewers consisting of faculty and academic support staff reviews the student work using rubrics developed by the ePortfolio Assessment Committee, many adapted from the VALUE Rubrics. A similar project is underway in our suburban campus English Department, in which faculty and instruction librarians use rubrics to measure student learning in Written Communication and Information Literacy and Research. Our hope is that the Interdisciplinary ePortfolio Assessment Pilots can serve as a model for measuring student learning across all of the Pace Core Curriculum Learning Outcomes.

Authors

Sarah Burns Feyl, Linda Anstendig

Overview of ePortfolio-related Outcomes Assessment on our Campus 

Part I:  Setting the Stage (Outcomes Assessment at Pace University)

Outcomes assessment at Pace takes place in a variety of formats and processes, depending on the school, program or course. Our main accrediting body is the Middle States Commission on Higher Education. A number of programs are accredited by other professional bodies, including:

  • Chemistry – American Chemical Society
  • Physician Assistant Program – Accreditation Review Commission on Education for the Physician Assistant (ARC-PA)
  • Bachelor of Science in computer science and in information systems – Computing Accreditation Commission (CAC) of ABET, Inc.
  • College of Health Professions – Commission on Collegiate Nursing Education (CCNE)
  • Lubin School of Business – Association to Advance Collegiate Schools of Business (AACSB International)
  • School of Education (SOE) – National Council for Accreditation of Teacher Education (NCATE)
  • School of Law – American Bar Association
  • Doctor of Psychology (Psy.D.) – American Psychological Association

In general, assessment work is led by the faculty and administrators within departments and programs. Outcomes assessment is conducted using methods such as Major Field Tests, common questions embedded in course exams, portfolio review (print and electronic), grading using rubrics, licensing exam pass rates, review of student work by outside evaluators, assessment of capstone course projects, and more.

Assessment of the Core Curriculum is led by the Dyson College in partnership with other departments on campus. Our University Assessment Committee includes representatives from all schools and the college, as well as representatives from Faculty Council and from a number of academic support units including the Library, First Year Programs, Career Services, and ITS. The director and staff of the Office of Planning, Assessment, and Institutional Research (OPAIR) lead many University-wide assessment initiatives, and provide assistance and guidance to departments and programs in their unit-based assessment projects.

In 2013, the University is opening up discussion of assessment and student learning as part of our Dyson College Outcomes Assessment project. These discussions may not be directly related to ePortfolios, but the goal is to improve and increase faculty participation in, and ownership of, the assessment of student learning. In the past, however, it has been difficult to get buy-in from Dyson faculty and to get departments to participate in outcomes assessment initiatives, beyond going through the motions during the Middle States Accreditation Review process. We have also had little success with closing the assessment loop, especially with respect to making curricular changes.

Part II:  Outcomes Assessment Developmental Story

Various ePortfolio (eP) formats have been used by a number of Pace University programs over the last decade, including home-grown systems as well as proprietary platforms either piloted or purchased. The use of ePortfolios at Pace began in Fall 2001 as a partnership between the English department and the Center for Teaching, Learning and Technology, which piloted the University’s first portfolio/e-portfolio project in several English 101 Composition classes. After this first project, various incarnations of the eP team were convened, and many received internal as well as external grants to fund eP projects. Internal grants included the Presidential Learning Assessment Grant and the Dyson College Assessment Grant; external funders included Thinkfinity and, of course, C2L. The projects funded by these grants shared the goal of determining whether ePortfolios could allow us to teach and capture reflective, integrative student learning, and all had an element of student learning outcomes assessment. When the University’s Core Curriculum was revised in 2003, our eP team received an internal assessment grant and drafted grading criteria for reviewing student work via ePortfolios in the areas of communication and analysis. Within the various eP projects over the years, other goals were identified, such as using ePortfolios to assess service learning and career services activities, and to facilitate learning and assessment for specific populations such as the Honors College.

From day one the ePortfolio initiatives served as a connector, bringing together representatives from the Dyson College, the Center for Teaching, Learning & Technology, the Library, Career Services, the Honors College, the office for Institutional Research (now OPAIR, mentioned above), and Information Technology. These connections continue, and grow. In the past decade many of our professional programs began using portfolio assessment and then ePortfolio as a method of outcomes assessment, and we have been able to work with many of these departments and programs, including Educational Technology, Media and Communication Arts, and Management for Public Safety & Homeland Security Professionals.

The English Department and the Instructional Services Librarians at the Mortola Library have had a special partnership for many years. It began as a review of student research papers in paper portfolios, and has now progressed to the ePortfolio system. We have presented on this partnership at the Assessment Institute at IUPUI, and in 2009 we were fortunate to publish a chapter on this partnership, “Multi-faceted Portfolio Assessment: Writing Program Collaboration with Instructional Librarians and Electronic Portfolio Initiative” in Trudy Banta’s book Designing Effective Assessment: Principles and Profiles of Good Practice.

Our university-wide ePortfolio platform (Mahara) was made accessible to all students and faculty in Fall 2010. The primary goal of the current ePortfolio initiative is similar to where we started back in 2001: to acclimate the University community—faculty, students and staff—to the tool and to promote its use as a means of housing, sharing, reflecting upon and refining students’ best work. A secondary goal is to promote the ePortfolio as a means of authentic assessment of student learning. To achieve this goal we have facilitated a number of ePortfolio Assessment Pilots, described above. Within these pilots, faculty and reviewers also have the opportunity to reflect on the process: faculty members who taught with ePs, and whose students’ work is being reviewed, are surveyed, and the rubric spreadsheet that reviewers use asks them specific questions about the review process, both to encourage their reflection and so that we can use their feedback to improve the process.

We have presented our results to the Deans, to the Provost’s Office, and at the Dyson Day Annual Conference for Dyson faculty members, and we continue to update the Pace community via the ePortfolio Advisory Board Newsletter. We also share the results of our Assessment Pilots, as well as the rubrics themselves, with our eP Teaching Circle participants each semester.

More work needs to be done on reporting data and findings to faculty and other stakeholders, and on holding in-depth discussions about the strengths and weaknesses in students’ competencies revealed by our assessment pilots.

As more faculty and staff members participate in this project, our community’s understanding and enthusiasm for ePortfolios as a means of authentic assessment of student learning can only grow.

As an enhancement to our ePortfolio Assessment work in the Pilots described above, Linda Anstendig and two students researched student learning by closely reading student ePs, especially their reflective comments, and by interviewing students to see whether and how our ePs are capturing student learning. You can read more about this project here: https://eportfolio.pace.edu/view/view.php?t=phFKwQLR9Mylm6sTfctG

Part III:  Conceptual Framework

Having now completed three years’ worth of Assessment “Pilots” using rubrics to review student work in ePortfolios, we can say with confidence that this is no easy task. In terms of effecting change within the curriculum, we are not yet at that point. The team acknowledges that the main weakness of our assessment pilot programs is the uncompleted assessment loop, though we are making some progress in this area. The Writing Faculty who conducted ePortfolio reviews the past two Decembers did discuss the rubric and how to interpret the ratings, so progress was made there. There was no explicit discussion of how student writing instruction might change the next semester, however. What was discussed was how to better train and support students in the use of ePortfolio, such as having a greater eTern presence in those courses, especially to help with the issue of permissions. The Writing Faculty Assessment Pilots did lead to the current “Writing Portfolio” page template being implemented in the English department.

The template page, implemented by the Pleasantville English department in Fall 2013, is currently being piloted and will be required in Fall 2014. Students upload the appropriate documents after completing each required English course.

We look forward to working with the Writing Faculty and Instructional Librarians as students and faculty use this template in the future. The Instructional Services Librarians have also discussed the results of the past two iterations of the Writing Course pilot. They have not made any major changes to their instruction programs, because what they saw was not a surprise: the students’ weaknesses lie in citation, which was already known. See the attached documents for the results of our numerous Assessment “Pilots”:

EPortfolio Assessment Pilot Summary 2011

Summary of December 2011 Writing eP Grading Sheets

Summary of December 2011 Writing eP for Info Lit

Summary of May 2012 Writing eP Grading Sheets

EPortfolio Assessment Pilot 2011 and 2012 Summary

EPortfolio Assessment Pilot 2012 Summary

EPortfolio Core Assessment 2013 Summary

Part IV: Correlations between ePortfolio and NSSE results 2011 through 2013

With the help of the Office for Planning, Assessment and Institutional Research, we have begun to gather encouraging data to show positive correlations between ePortfolio use and NSSE results. We have posted this data under the “Evidence” section of our What We’ve Learned page.

Connections to Other Sectors of the Catalyst 

Since spring 2011 we have conducted three multidisciplinary assessment pilots using ePortfolios to evaluate evidence of three Core Curriculum learning outcomes:  Written Communication, Analysis, and Information Literacy and Research Skills. Over the course of the three pilots, reviewers had the opportunity to reflect upon their experiences using the rubrics to grade student work. In response to the prompt “What did/didn’t work overall?” responses included these comments:

– Reviewers voiced frustration with student permissions not being set correctly.
– Some noted that not all assignment types were adequately addressed by the rubrics. For example, not all criteria of the Written Communication rubric apply to blog entries, and the Information Literacy rubric does not address academic dis/honesty or field research.
– In 2013, reviewers frequently noted their lack of access to student work (due to students not setting permissions correctly), but also noted their comfort with the rubrics.
– Reviewers reported general satisfaction with students’ reflective statements, but also noted that many reflections were general; two reviewers suggested the need for separate rubrics for reflections.
– Some reviewers noted that student improvement was evident in ePs containing multiple semesters of work.
– Written Communication reviewers reported satisfaction with the rubric, while Information Literacy reviewers noted that student work did not satisfactorily meet many of the rubric’s standards. Analysis reviewers also noted that student work did not evenly meet all dimensions of the rubric, and one reviewer suggested that providing more guidance to students on the analysis outcome would benefit their writing.
– In 2013, the eight reviewers who completed the reflection tab reported spending about 20 minutes reviewing each ePortfolio, with Written Communication reviewers needing the least time (10 to 15 minutes) and Analysis reviewers the most (about 25 minutes per ePortfolio).
– Many reviewers noted that the variety of student work was interesting but not always easy to evaluate with their assigned rubrics.
– Some reviewers offered suggestions for future ePortfolio assessment projects: one suggested including a course name and description with each student portfolio, and another recommended further instruction for faculty on permission setting, student reflections, and the importance of uploading analytical papers.

In fall 2011 we began an assessment pilot with the Pleasantville campus English Department writing faculty in partnership with instructional librarians to assess students’ Written Communication and Information Literacy competencies via ePortfolios, using similar rubrics. Writing Faculty and librarian reviewers also had an opportunity to reflect on the pilot and contribute to our understanding of how we can improve this process. In response to the prompt “What did/didn’t work overall?” in 2012, reviewers cited:

– the ease and convenience of viewing student work electronically,
– difficulty entering comments into the Excel rubric grading sheet,
– dissatisfaction with the whole-integer (per-dimension) grading options,
– uneven quality in students’ reflective statements (the majority of reviewers reported that some were thoughtful and some were superficial), and
– general satisfaction with the variety of student work available in the electronic portfolios, along with comments about the availability of student work and suggestions for improving the setting of permissions.

In spring 2013, we hoped to scale up the pilot with the English Department, as the chair of the department expected all writing faculty to begin using ePortfolio and to participate in ePortfolio assessment review. Unfortunately, not all faculty members participated. The English department has now instituted a required “Writing Portfolio” page for student ePortfolios, which provides a template for uploading and reflecting upon writing in the ENG 110, ENG 120 and ENG 201 courses. We hope to continue the review of student work using this new template as it develops and is implemented by the English department. Dyson College is providing funding to incentivize faculty to participate and to hire a graduate Writing Center assistant to help with data collection and analysis.

Although we recognize the need to scale up our Outcomes Assessment work, we face some challenges because of a lack of support from some top administrators. Our Strategic Plan does not mention the use of ePortfolios for Outcomes Assessment. We do, however, have an opportunity to include ePortfolios in the Pace Path, a new University-wide initiative in its early stages, for which each School and College is charged with creating a distinctive experience for students that will include the opportunity for them to showcase their learning.

Conclusion

The central way we will advance the use of ePortfolios in outcomes assessment is to grow the two pilots described above. We will continue to learn from and adapt our Interdisciplinary ePortfolio Assessment Pilot so that we can expand this method of assessing Core Curriculum Learning Outcomes. We will work with the English department on our Pleasantville campus to have all Writing Faculty use the “Writing Portfolio” ePortfolio page in their courses, and to use ePortfolios as a way to facilitate assessment of Written Communication and Information Literacy skills.

In order to be successful, we must continue to recruit and retain enthusiastic faculty members in our Teaching Circles, and we must provide training and support for Writing Faculty so all are comfortable using ePortfolios.

For a number of years, we have sponsored ePortfolio Contests. Students self-nominate, and their ePs are reviewed using criteria that judge competencies in selection, reflection and writing mechanics. We have not compiled or used these grading sheets in any comprehensive outcomes assessment effort, but there is potential here for future development; the ePortfolios of contest winners could, for example, serve as evidence of student learning when accrediting bodies visit Pace. Through a combination of all these efforts, we plan to improve and enhance our use of ePortfolios as a vehicle for outcomes assessment.