There are many ways to assess learning outcomes. Our plan begins with the low-hanging fruit: data that are already collected on a regular basis. This past year, we categorized all of our programs and services across the division into eight major themes:
- Academic Engagement
- Campus Connections
- Career Exploration
- Community Engagement
- Diversity and Inclusion
- Health and Wellness
- Involvement and Leadership
- Pride and Traditions
We plan to compile a comprehensive list of participants for each of the eight themes and link those participation records to other data that are already collected regularly at the university. Once this is done, we will be able to compare results between participants and non-participants and determine whether there are any significant differences. By approaching the learning outcomes assessment in this way, we will be able to examine the impact of our programs and services across the division as a whole.
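The linking-and-comparison step described above could be sketched in code as follows. Everything in this sketch is hypothetical: the student IDs, the GPA outcome measure, and the participant roster are illustrative placeholders rather than real university data, and an actual analysis would use dedicated statistical software with appropriate safeguards for privacy and selection bias.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical institutional record: student ID -> end-of-term GPA.
# These IDs and values are illustrative placeholders, not real data.
gpa = {
    "s01": 3.4, "s02": 2.9, "s03": 3.8, "s04": 3.1,
    "s05": 2.6, "s06": 3.5, "s07": 3.0, "s08": 2.8,
}

# Hypothetical participant roster for one theme (e.g. Academic Engagement).
participants = {"s01", "s03", "s06", "s07"}

def compare_groups(outcomes, roster):
    """Split an outcome measure by participation status and return the
    two group means plus a Welch t statistic as a rough indicator of
    whether the groups differ."""
    in_group = [v for k, v in outcomes.items() if k in roster]
    out_group = [v for k, v in outcomes.items() if k not in roster]
    m1, m2 = mean(in_group), mean(out_group)
    s1, s2 = stdev(in_group), stdev(out_group)
    n1, n2 = len(in_group), len(out_group)
    t = (m1 - m2) / sqrt(s1**2 / n1 + s2**2 / n2)
    return m1, m2, t

m_part, m_non, t_stat = compare_groups(gpa, participants)
print(f"participants: {m_part:.2f}  non-participants: {m_non:.2f}  t: {t_stat:.2f}")
```

In practice the same comparison would be run once per theme, and a full analysis would control for pre-existing differences between the groups before attributing any gap to program participation.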
In terms of assessment, we reviewed the different sources of data that could potentially be used to measure our learning outcomes. The first set of assessments includes six surveys conducted by the Office of Institutional Research (IR). These surveys are administered in a rotating cycle over a three-year period, and we have intentionally aligned our assessment plan with IR's survey schedule. By collaborating with IR, we will be able to reduce the assessment burden on students, gather valuable information about our goals for learning, and maximize the use of university data.
The second set of assessments includes evaluations and surveys that are administered within our division. To maintain consistency, we will focus on assessments that are administered regularly and will include ad hoc surveys and evaluations as appropriate. Lastly, we began exploring the possibility of creating new assessments. We have identified specific sections of the AAC&U VALUE Rubrics that we could implement in the future; however, more discussion is needed before we move forward.
Our initial assessment plan relies heavily on staff to collect and submit program and service participation data. Many departments already do this on a continual basis. To facilitate additional data collection, we will need to continue investing in assessment technology, support, and training. Our plan also assumes that we will continue to collaborate with the Office of Institutional Research and share data.
A summary of our initial assessment plan is available here (PDF). We understand that many of our assessments are indirect measures of learning. We plan to explore the possibility of implementing more direct measures during the next academic year. Going forward, we will review our assessment plan on an annual basis and make adjustments as needed.