Tools and Training

This page provides helpful information about the different types of assessment projects, the assessment process, tools and resources available to support assessment at the university, as well as tips for making assessment easier. Feel free to read it from start to finish or skip to the sections that are most relevant to your needs and interests. The topics on this page include the following:


An assessment glossary is available on the university's Outcomes Assessment website.

Back to top

Assessment Planning

Creating a plan makes it more likely that the project will be a success. The plan serves as a guide throughout the project. An early discussion with the study team will help answer team members' questions and provide an opportunity to address potential issues. A good assessment plan includes the following:

  • Description of the study (e.g., purpose, who will be studied, methodology)
  • A scope of work (list of tasks, timeline, and persons responsible for each task)
  • Information about resources needed in order to conduct the study (e.g., participant list, marketing materials, participant incentives, access to systems and training, fees for using the instrument, fees for external evaluators, supplies, printing, personnel, food, space)
  • The written materials needed in order to carry out the study (e.g., consent form, instrument, invitations and reminder messages, scripts, marketing materials, resource list)
  • Plan for analyzing the information
  • Communication plan for sharing the results

The Director of Student Affairs Assessment can work with you on developing a solid assessment plan.

Tip: Some offices find it easier to plan all of their assessments for the year at the same time. If you're new to assessment or want to conduct an assessment that hasn't been done before, start planning around 2-3 months before data collection. Use this time to review literature, sources of available data, and assessment instruments related to your topic. For established assessments, or if you have done assessment before, start planning around 6-8 weeks before data collection. Remember, the sooner you start planning, the better.

Tip: The Director of Student Affairs Assessment may be able to recommend third parties to assist with external evaluations (e.g., grant evaluations) and other options for outsourcing assessment projects. They may also be able to provide recommendations on the scope of work and participate in interviews for potential evaluators.

Back to top

Assessment Scheduling

In order to enhance participants' overall assessment experience and improve assessment response rates across the university, all members of the CWRU community are encouraged to collaborate on assessments and reduce overlap in assessment schedules as much as possible.

Other factors that should be considered when planning an assessment schedule include the amount of time required to conduct the study, time needed to complete training, the time of day the messages are sent, and other activities that might compete with the study (e.g., orientation, moving days, exams, breaks and holidays, major travel dates, athletic championships, campus crises, homecoming, senior week, concerts, conferences, local events).

Back to top

Audience Polls

Audience polls are excellent ways to learn about group interests, assess levels of knowledge or comfort in order to guide discussions and program content, determine the quality of programs, and vote on various topics. Up to 100 iClickers can be reserved through the Office of Student Affairs by calling 216.368.2020.

Back to top

Benchmarking/Comparison Studies

Comparison studies, as they are known in Academic Affairs, are sometimes called benchmarking studies in Student Affairs. These assessments involve reviewing practices at other institutions. This type of assessment can be very useful for informing strategic plans and helping new leaders understand the strengths, weaknesses, opportunities and challenges of their office. The easiest way to begin benchmarking is to review the websites of peer institutions and then follow up with phone calls later on. Think carefully about who makes the phone calls: people at peer institutions are more likely to answer questions if they are contacted by their counterparts at CWRU (e.g., a director contacting a director).

CWRU has been a member of the Association of American Universities (AAU) since the 1970s. Our peer institution lists often start with the members of the AAU and are then whittled down based on the context of the study. For example, each division, department, etc. at CWRU may have its own peer list based on its programs, services, facilities, etc. Additionally, administrators may have specific schools that they want to see included in the list. Some institutions may be considered aspirants rather than peers; these colleges and universities may also be included in the study.

The Division of Student Affairs uses the following institutions for benchmarking/comparison studies:

  • Brandeis University
  • Carnegie Mellon University
  • Dartmouth College
  • Duke University
  • Emory University
  • Johns Hopkins University
  • Massachusetts Institute of Technology
  • New York University
  • Northwestern University
  • Tulane University
  • University of Chicago
  • University of Rochester
  • Vanderbilt University
  • Washington University in St. Louis

If you are looking to start a benchmarking/comparison study project, reach out to the Student Affairs Assessment Director to see if they have any templates, advice or resources for making the process easier.

Back to top

Choosing a Methodology

There are many forms of assessment. Some of the more common methodologies include benchmarking, one-on-one interviews, focus group interviews, observations, surveys, evaluations, rubrics, and third-party assessments. Depending on the context of the study, some methodologies will work better than others.

When selecting a methodology, consider the following factors:

  • Level of detail that needs to be gathered
  • Characteristics of the population (e.g., technical skills, access to technology, preferred communication methods, ability to read, access to transportation, willingness to participate, comfort level with assessment)
  • Sensitivity of the topic
  • Timeframe available
  • Access to a list of participants and their contact information
  • Where data collection will take place
  • Characteristics of the people involved in data collection (e.g., demographic background, languages, technical skills, disabilities)
  • Access to technology and likelihood that it will work correctly
  • Cost of the project and resources available to support the project
  • Level of security and privacy needed for data collection, entry and storage

Back to top

Consent Forms

Consent forms protect both the participant and the person conducting the assessment. The form states the terms of the study and ensures the participant understands them. The consent form is the first page a participant sees after the invitation to participate in the study. Topics covered in the consent form include the following:

  • The purpose of the study
  • Benefits and risks of participation
  • Incentives
  • Nature of the study (confidential vs. anonymous)
  • A statement that participation is voluntary and that there are no consequences for not participating in the study
  • Steps taken to ensure data security
  • Contact information for the study coordinator
  • Contact information for the Director of Student Affairs Assessment
  • A place for the participant to indicate their agreement to participate in the study

Additional information is sometimes included in the consent form. The Research Compliance Office at CWRU has provided a comprehensive Informed Consent Document template.

Participants must give their consent before the study can begin. In an electronic survey, this means that a skip pattern/skip logic is used to move them to the end of the survey if they mark "no" to the consent form. In other forms of assessment, they might not receive a copy of the assessment instrument unless they mark "yes" on the consent form. If they mark "no" on the consent form but then participate in the study, their results must be discarded.
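The skip-pattern behavior described above can be sketched in a few lines of Python. This is an illustrative sketch of the logic only, not how Qualtrics implements it; `run_survey` and `clean_results` are hypothetical helper names.

```python
# Sketch of consent skip logic: participants who decline consent are routed
# past all survey questions, and any responses recorded without consent are
# discarded before analysis.

def run_survey(consented, answers):
    """Return the participant's responses, or None if consent was declined."""
    if not consented:
        # Skip pattern: jump straight to the end of the survey.
        return None
    return answers

def clean_results(raw_results):
    """Discard any responses submitted without consent."""
    kept = []
    for consented, answers in raw_results:
        result = run_survey(consented, answers)
        if result is not None:
            kept.append(result)
    return kept

# Two hypothetical participants: one consents, one declines.
print(clean_results([(True, {"q1": 5}), (False, {"q1": 2})]))  # [{'q1': 5}]
```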

Note: Minors must be removed from the study unless parents are notified about the study and given the opportunity to remove their children from the study prior to the start of data collection.

Note: Anonymous means you are not collecting any information that could be used to identify the participant (e.g., ID number, name, IP address, email address, home address, latitude, longitude). Confidential means you are collecting information that could be used to identify the participant; however, you are promising not to share their name with anyone who is not on the study team.

Tip: In Qualtrics, the person who designs the survey/evaluation can click the anonymize button in the Survey Options tab to prevent the program from collecting identifiable information.

Back to top


Continuing Research Education Certification

This certification provides a baseline for understanding assessment and research ethics/risks. Everyone involved in assessment, particularly those responsible for coordinating assessment efforts for their department, should complete the Continuing Research Education Certification.

Back to top

Data Collection, Storage, Security, and Systems

For the Division of Student Affairs, the following products are in compliance with the university's security standards:

  • CWRU U Drive: storing and sharing files
  • sharing data files
  • Excel: manual data entry, data cleaning and analysis
  • iClickers: polls
  • Qualtrics: survey and program evaluation design, data collection, analysis and reporting
  • NVivo: qualitative data entry
  • OrgSync: surveys and program evaluation design, data collection, and attendance tracking
  • SPSS: quantitative data entry and analysis (must purchase an administrative license for each person using this product)
  • SPSS Modeler: merging data and predictive modeling
  • Tableau: data visualization (must purchase a license for each person using this product)

Note: For security purposes, Google Drive and SurveyMonkey are not recommended for assessment activities. Check with the Director of Student Affairs Assessment and [U]Tech before using these products.

Access to documents and data files should be limited to the study team. Tangible materials (e.g., consent forms, surveys, evaluations, transcripts, reports) should be stored in a locked cabinet in a locked office and/or software programs approved for these activities. Data should never be opened or stored on personal property (e.g., home, car, personal computer, personal server or iCloud account). Data should be disposed of in a secure fashion once it is no longer useful (e.g., shredder, shredding company, University Archives).

In some cases, data may need to be shared with third parties. Confidentiality agreements, security measures, and data disposal plans need to be included in contracts before the information is shared.

Back to top


Tracking program and service participation is the first step in assessment. It helps us understand who we are and are not serving.

Back to top

Existing Data

Universities collect a wealth of information about students, faculty and staff. Participants don't like being asked the same questions over and over again; after a while, they stop participating in the project and, worse, stop participating in all assessments. We can show participants that we respect their time and feedback by making sure the data we collect is used to make change before we ask for feedback again. The Director of Student Affairs Assessment may be able to save you some time if you check in with them. Click here to view some of the data that is currently available.

Back to top

External Evaluators

External evaluators may be required or requested by external funders. The Director of Student Affairs Assessment should be involved in reviewing and selecting an external evaluator and providing oversight and support for the project. The American Evaluation Association provides a list of evaluators within each state. Some criteria to consider when selecting an evaluator include the following:

  • Ability to complete the work within the given timeframe
  • Charges for indirect costs
  • Clientele
  • Company location(s)
  • Contract process
  • Data transfer, security and disposal
  • Experience working with programs related to the study topic
  • Institutional Review Board requirements for both parties
  • Number of staff
  • Number of staff with doctoral degrees
  • Number of years in business
  • Other services available (e.g., grantwriting)
  • Previous experience working with CWRU programs and services
  • Price
  • Proximity to campus or to the program
  • Quality of reports
  • Quality of the website
  • Type of services provided
  • University affiliation
  • Who will be assigned to do the work (e.g., graduate student, professional staff, consultant)

Back to top


Incentives

Incentives are often used to increase response rates. They provide a token of appreciation for those who take time to participate in a study. Incentives should hold some value to the participants, but not so much value that choosing not to participate would threaten some aspect of their livelihood. Examples of coercive incentives include cars, tuition reimbursement, large amounts of cash or gift cards, payment for all of a participant's course books or lab fees, payment for their meal plan, a parking space, etc.

The more likely a person is to receive an incentive, the more likely they are to participate in a study. Typically, response rates are highest when participants know they will receive an immediate reward for their participation; however, coordinating this can be difficult. Knowing your budget may help you decide between giving all participants an incentive, entering participants into a raffle, some combination of the two, or no incentive at all. If you don't have a budget for incentives, get creative. There may be ways to provide an incentive at no cost, or there may be other groups on and off campus who would be willing to donate incentives. For example, some businesses will make donations in exchange for marketing opportunities.
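The budget trade-off between rewarding everyone and running a raffle comes down to simple arithmetic. The functions and dollar figures below are hypothetical, for illustration only.

```python
# Hypothetical comparison: a small incentive for every participant
# vs. a raffle of a few larger prizes, against a fixed budget.

def cost_all(expected_participants, incentive_each):
    """Total cost if every participant receives the same incentive."""
    return expected_participants * incentive_each

def cost_raffle(prize_values):
    """Total cost if a fixed set of prizes is raffled off."""
    return sum(prize_values)

budget = 500  # illustrative budget in dollars
print(cost_all(200, 5))            # 1000: a $5 gift for 200 people exceeds $500
print(cost_raffle([100, 50, 50]))  # 200: fits the budget, but each person's
                                   # chance of a reward is much lower
```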

Tip: Choosing good incentives is all about knowing what people want.

Incentive Ideas

  • Accessories
  • Books
  • Candy
  • Case Cash (see Dining Services)
  • Cleveland/Ohio attractions
  • Clothing
  • Drink/food coupons
  • Few hours off work
  • Free consultation
  • Gaming systems
  • Gift baskets
  • Gift cards
  • Health services (e.g., massage)
  • Hotel
  • Latest technology
  • Limo ride
  • Meal with an important person
  • Movie
  • Organizational membership
  • Professional development
  • Tickets to an event

Back to top


Interviews

In all interviews, the participant is given two copies of the consent form: one to sign and one to keep for their records. Interviews are typically recorded and transcribed. If an interviewer wants an additional record of the participant's consent, they can record themselves reading the consent form and then asking the person whether they agree to participate in the study. To maintain confidentiality, the interviewer may ask the participant to choose a pseudonym.

The interviewer has a script that they use to explain the purpose of the interview, the structure of the interview, the rules related to the interview, and the way in which they intend to use the results. The same set of questions is asked in every interview, although follow-up questions may differ across interviews. The standard questions tend to be related to the larger picture.

It is important that the interviewers are not in a position of power or authority over the participants, are trusted by the participants, and present themselves as neutral parties who are simply gathering information. They must maintain confidentiality, be good listeners, ask appropriate follow up questions (e.g., tell me more, can you describe what you said in more detail, help me understand...), and paraphrase what they are being told accurately.

Once the interview is transcribed, the interviewer shares the transcript with the interviewee(s) and asks them to confirm and/or edit parts that don't align with what they intended to say. The people responsible for analyzing the data read the complete transcript to familiarize themselves with it before looking for themes across all of the interviews. Results can be analyzed by up to three team members. Member checks are done to ensure that everyone analyzing the data is on the same page in terms of the thematic coding.

Individual interviews may include 1-2 interviewers and one interviewee while focus group interviews include one facilitator, one moderator and up to 10 interviewees. The facilitator and moderator may need to go through specialized training before conducting interviews.

The moderator's role is to host the conversation, ensure that no one monopolizes the conversation and encourage all participants to share their thoughts. The facilitator's role is to fill in the seating chart, collect the consent forms, take notes throughout the interview, summarize the discussion at the end of the interview, write down anything that was missed, and help distribute incentives (if applicable).

Tip: Food is often provided by the interviewers as part of the incentives for focus group interviews.

Back to top

Making Sure Assessments Work

Piloting an assessment means asking others who aren't familiar with the topic to take the assessment from the perspective of a potential participant. The person(s) who designed the assessment should also take it from the perspective of a potential participant. This helps them understand the participant's experience and refine the assessment. It is also the best way to prevent errors in data collection.

It's important to build trust with the person piloting the assessment. Explain that they are not required to answer honestly and that you are more concerned with their feedback than with their actual responses. Ask them to take the assessment from a specific perspective (e.g., male, female, underrepresented minority, international student, student with a specific major). As they take the assessment, they should answer the questions below.

  • Start time
  • Stop time
  • On what question did you get bored and feel like you wanted to stop participating?
  • On what question would you stop participating if you were taking the evaluation for real?
  • Was there anything in the directions or questions which was confusing, awkward, or uncomfortable?
  • Did you find any of the questions to be redundant or unimportant?
  • Were there any questions that did not have a response option that met your needs?
  • What concerns, if any, did you have about answering the questions truthfully?
  • Other comments about the evaluation?

Back to top


Marketing

A marketing plan helps increase the number of people who participate in a study (also known as the "response rate"). Marketing often occurs before and during the data collection period. Creating marketing materials in advance helps streamline the marketing process later. The Director of Student Affairs Assessment may be able to provide examples of marketing materials that have been used in the past and provide guidance on developing and carrying out a marketing plan for your project.

Examples of marketing techniques include, but are not limited to, the following:

  • Articles (e.g., newspapers, newsletters)
  • Class/listserv/meeting announcement scripts
  • Email messages
  • Flyers
  • Invitations/reminder messages
  • Posters
  • Social media posts
  • Table tents
  • Television ads
  • Websites
  • Word-of-mouth

Marketing materials should cover the following information:

  • Purpose of the study
  • How they were chosen for the study
  • How long it will take to participate in the study
  • A statement that there are no consequences if they decide not to participate in the study
  • Incentives they may receive if they participate in the study
  • When the study will end
  • How the results from the study have been used to make change in the past
  • Plans for sharing/using the results in the future

Some systems have options for branding and logos. For example, Qualtrics has a template branded in CWRU's colors and with CWRU's logo that is automatically used when a survey or evaluation form is created. This helps participants know that the survey/evaluation isn't spam and that it's safe to participate in the study.

Tip: Individuals are more likely to respond to requests from people they know. It may be more effective to ask someone who has established a relationship with the participants to sponsor your study. Sponsorship simply means that they agree to let you send a message to the participants on their behalf, asking them to participate in the study. If they agree to sponsorship, you should give them the opportunity to review and edit the message. It's important that the message matches the sponsor's natural tone and language.

Tip: The title of a study and the subject lines of email messages need to be relevant and interesting to participants in order to gain their participation.

Tip: Response rates for student surveys and evaluations should be at least 30%. Studies with lower response rates are unlikely to accurately represent the views and experiences of the larger population (e.g., the student body). It usually takes 3-5 messages to achieve a 30% response rate from students, faculty and staff. Participation rates tend to decline substantially three days after each message is sent.
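The arithmetic behind this tip is straightforward. The invitation and response counts below are made up for illustration.

```python
# Rough response-rate check against the suggested 30% minimum.

def response_rate(responses, invited):
    """Fraction of invited participants who completed the assessment."""
    return responses / invited

invited = 1200    # hypothetical invitation list size
responses = 342   # hypothetical completed surveys so far

rate = response_rate(responses, invited)
print(f"Response rate: {rate:.1%}")        # Response rate: 28.5%
print("Meets 30% minimum:", rate >= 0.30)  # False: another reminder is warranted
```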

Back to top

Office Visitor Tracking

Equipment is available to track student service office visits and the reasons for visits. The equipment needs to be displayed on a front desk or mounted to a wall or piece of furniture near the front door, and staff need to make sure that everyone checks in upon arrival.

The office typically provides up to seven reasons for visiting, with the eighth reason always listed as "Other." These are displayed on the equipment for the visitor to select upon arrival. If needed, the reader can also include a question about the visitor's affiliation with the university (e.g., undergraduate student, graduate student, faculty, staff, alumni, campus partner). Aggregated reports on visitor demographics can be generated quickly by the director and staff.

Back to top

Providing Support for Participants

Resource lists describe where a participant can go for help related to the assessment topic (e.g., academic tutoring, community service opportunities, counseling) and/or to cope with a negative experience related to an assessment (e.g., triggering a negative memory, a panic attack, depression). Some CWRU departments provide resource lists on their webpage (e.g., Sexual Misconduct and Title IX website).

For all assessments where the possibility of harm, discomfort or inconvenience may occur, resource lists should be provided in the invitation and reminder messages. This ensures that the participants will have the resource list readily available if needed and be able to get support as quickly as possible.

Assessments with little to no risk may or may not include a resource list at the end of the assessment.

Example:

  For help with... | Resource | Contact Information
  Individual study and time management strategies | Educational Services for Students | 216.368.5230

Back to top


Qualtrics

Qualtrics is a survey product purchased by [U]Tech that is available to university staff, faculty, and students free of charge. The product is user-friendly, offers many features, and is a good tool for building complex surveys. The training resources are well designed and the Qualtrics team is very good about responding to questions. This product is used by many educational organizations and, at last count, there were over 1,500 users at CWRU.

The Division of Student Affairs has its own panel within the program that allows for administrative oversight and support. If you would like to set up an account, please contact Dennis Rupert, Associate Vice President of Operations and Planning, at or 216.368.6061.

Back to top

Survey & Evaluation Design Tips

Learn how to create an effective survey on the Survey & Evaluation Design Tips page.

Back to top

Third Party Assessments

Some companies have purchased or designed assessments as part of their efforts to offer marketing, data collection, data analysis, and reporting services to colleges and universities. These assessments are often copyrighted and have been tested for reliability and validity. Additionally, higher education organizations and professional organizations sometimes hire companies to design and manage assessments on behalf of their members.

These assessments can be particularly valuable. The reports often include comparison information for similar colleges and universities, information regarding all of the colleges and universities that participated in the assessment in a given year, and historical information for individual institutions if the assessment is conducted on a regular basis.

In order to meet the needs of all colleges and universities participating in the study, the language of third-party assessments tends to be general. To offset this, some companies give institutions the opportunity to add a specific number of their own questions to the end of the assessment. These questions can take the place of shorter, separate assessments, thereby decreasing the number of assessment requests sent to participants.

At CWRU, the Office of Institutional Research conducts the following third-party surveys on a rotating cycle:

  • National Survey of Student Engagement (NSSE)
  • Beginning College Survey of Student Engagement (BCSSE)
  • CIRP Freshman Survey
  • Your First College Year
  • College Senior Survey

In addition to the third-party assessments described above, staff and students may receive requests from individuals or groups who are interested in conducting assessments on members of the CWRU community as part of their coursework or faculty research. These requests should be sent to the Director of Student Affairs Assessment for follow up.

Back to top