Asking the Right Questions: A Guide to Effective Assessment in College Health Education Programs

November 3, 2023

By: Rachel Perse, MPH

Assistant Dean of Students and Director of the Sandler Center for Alcohol and Other Drug Education, University of Miami


Many of us do health education and outreach through programming, which is vital for promoting healthier and safer campuses. However, to ensure the effectiveness of these programs, assessment is key. Assessment helps gather valuable insights into the program's impact and potential areas of improvement. By carefully planning, executing, and analyzing assessment, you can optimize your peer education program, ensuring it meets the needs of your campus community.

If you are looking to start doing assessment, or looking to improve assessment you’re already doing, let this serve as a starting point. This guide outlines six questions you should be asking yourself about assessment and offers tips to help you answer them.

1. What do I hope to gain/understand through the assessment?

Before you dive into assessment, clearly outline your objectives and the specific goals you hope to achieve. Some common goals of assessment are:

  • Evaluating program effectiveness
    • Did students learn what you intended them to from the program? Were your learning objectives met? By assessing program effectiveness, you can identify areas for improvement and make adjustments for future programs, whether that’s reallocating budget, adjusting staffing, or revising content.
  • Understanding student experiences with your programs
    • Gauging student satisfaction levels, engagement, and overall perception of your programs helps you tailor them to better meet the needs and expectations of students moving forward.
  • Identifying gaps in programming
    • Are there any behaviors that aren’t being addressed through your current programming? Are there any student populations you aren’t reaching?
    • For example: At the Sandler Center for Alcohol & Other Drug Education, our assessment shows that most of our program engagement is from first- and second-year students. That tells us we need to adjust our programming strategies to better reach students in their third year and beyond.

2. How will I implement the assessment?

When planning how you will implement your survey, here are some things to consider:

  • Which tool to use:
    • Most universities offer free survey tools, such as Qualtrics, Google Forms, and SurveyMonkey. Find out which tool(s) your university offers and become familiar with the software. If more than one is available, determine which has features that best align with your assessment needs.
  • Timing of implementation:
    • Assessment can be implemented pre-program, immediately post-program, or as follow-up after the program has ended, or at a combination of these time points.
  • Survey Access:
    • The survey should be as accessible as possible. You can display a QR code, send a link via email, or create a shortened link for easy access.

While this guide primarily focuses on survey-based assessment methods, it's important to note that other assessment approaches, such as interviews, focus groups, and observations, can also provide valuable insights. Depending on the specific context and goals of your assessment, you may choose to incorporate these methods in addition to or instead of surveys.

3. What questions should I ask to gather the information I need?

To gather meaningful information, you must carefully and intentionally formulate your questions. Good assessment questions have the following qualities:

  • Alignment with your learning objectives and assessment goals
    • For example:
      • Learning objective: Students will be able to recognize standard drink sizes.
      • Assessment question: In a red solo cup, the first line from the bottom is a standard drink size of:
        • Beer (5% alcohol)
        • Wine (12% alcohol)
        • Hard liquor (40% alcohol)
  • Concise & clear
    • Questions should be concise, clear, and easily comprehensible to the students. Avoid vague or overly broad questions, double negatives, or complicated sentence structure. A student should immediately grasp what is being asked of them.
  • Appropriate question type
    • Choose the right question type based on the information you're trying to gather. Examples include:
      • True/false
      • Multiple choice
      • Multiple select
      • Open-ended
    • Combining multiple question types in your survey can help you gather more comprehensive data.

Try this: Before finalizing questions, consider pre-testing them with a small sample group to identify any potential issues or ambiguities.

4. How will I analyze the data?

Once you've collected the data, it's time to analyze it to derive meaningful insights. The approach you take in analyzing the data should be directly aligned with the types of questions you asked:

  • Quantitative data:
    • Statistical techniques, such as counts, percentages, mean, median, standard deviation, and correlation analysis, are commonly used to derive insights from numerical data.
    • Analysis can also reveal trends over time, such as across a semester or academic year, or between pre- and post-event surveys.
  • Qualitative data:
    • Open-ended questions can be coded and analyzed for recurring themes, patterns, and perspectives present in the responses, offering a deeper understanding of participants' thoughts and attitudes.
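For teams comfortable with a scripting language, the quantitative summaries described above (counts, percentages, mean) take only a few lines to compute. Here is a minimal sketch using Python's standard library; the satisfaction ratings are hypothetical, not real survey data:

```python
from statistics import mean

# Hypothetical 1-5 satisfaction ratings from a post-program survey
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

count = len(ratings)
average = mean(ratings)
# Percentage of respondents who rated 4 or 5 ("satisfied")
pct_satisfied = 100 * sum(1 for r in ratings if r >= 4) / count

print(f"Responses: {count}")
print(f"Mean rating: {average:.2f}")
print(f"% satisfied (4 or 5): {pct_satisfied:.0f}%")
```

The same pattern scales to a spreadsheet export: read each question's column of responses into a list and summarize it the same way.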

Consider your team's or your own analytical capacity and skills in the chosen methods before finalizing the assessment questions. If needed, seek assistance from campus partners with expertise in data analysis to ensure accurate interpretation of the collected data.

5. How will I utilize the analyzed data? How will I report it and to whom?

Now that you have collected and analyzed your data, you don’t want it to go to waste! Consider how your data will best serve you. Here are some ways assessment data can be utilized effectively:

  • Adjust program content and/or delivery methods to address identified gaps in programming and participant feedback.
  • Allocate resources more effectively based on identified priorities and areas of improvement.
  • Demonstrate program effectiveness to donors, senior leadership, and grant or award programs.


Here is a real-life example of assessment in practice at the Sandler Center for Alcohol and Other Drug Education at the University of Miami. In this example, we hosted Narcan trainings for our campus community.

What do I hope to gain/understand through the assessment?

Our goals for this assessment were to determine if learning objectives were met and to evaluate participant satisfaction with the training. Our learning objectives were as follows:

  1. Participants will be able to identify how long it takes for Narcan to take effect.
  2. Participants will be able to identify how long Narcan's effects last.
  3. Participants will feel more confident using Narcan in an overdose situation.

How will I implement the assessment? What tools/software are available to me?

We created the survey using Qualtrics and displayed a QR code on the screen at the end of the presentation. To encourage participation, we entered all survey participants into a t-shirt raffle.  

What questions should I ask to gather the information I need?

Here are five of the survey questions that align with the learning objectives and goals above:

  1. How long does it take for Narcan to take effect?
     • 1-2 minutes
     • 2-3 minutes
     • 4-5 minutes

  2. How long do the effects of Narcan last?
     • 30 minutes-1 hour
     • 2 hours
     • 4 hours

  3. After this training, I feel more confident using Narcan in an overdose situation.
     • Strongly agree
     • Agree
     • Disagree
     • Strongly disagree

  4. Please indicate your level of satisfaction with this training.
     • Extremely satisfied
     • Somewhat satisfied
     • Somewhat dissatisfied
     • Extremely dissatisfied

  5. You indicated that you were [insert answer to previous question] with the training. Can you please elaborate on why?

How will I analyze the data?

The multiple-choice questions were analyzed by calculating the percentage of students who answered correctly versus incorrectly, or agreed versus disagreed. These results were compared with those from previous trainings. The open-ended responses were reviewed, and common themes were noted and summarized.
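That percent-correct calculation can be sketched in a few lines of Python. The responses and the keyed correct answer below are illustrative placeholders, not our actual survey data:

```python
from collections import Counter

# Hypothetical responses to a multiple-choice knowledge question
responses = [
    "2-3 minutes", "2-3 minutes", "1-2 minutes",
    "2-3 minutes", "4-5 minutes",
]
correct_answer = "2-3 minutes"  # placeholder answer key for illustration

tally = Counter(responses)
pct_correct = 100 * tally[correct_answer] / len(responses)

print(f"Answer breakdown: {dict(tally)}")
print(f"Percent correct: {pct_correct:.0f}%")
```

Running the same tally on exports from successive trainings gives the semester-over-semester comparison described above.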

How will I utilize the analyzed data? How will I report it and to whom?

The data showed high recall on questions one and two, high confidence in using Narcan, and high satisfaction with the training. Open-ended responses revealed two common themes for improvement: the training was too long, and students wanted to see a demonstration. Adjustments were made for future trainings. The assessment data was also used as part of an award nomination, leading to the Narcan Training winning the Division of Student Affairs Laurel Award for Outstanding Collaborative Initiative! The analyzed data was included in monthly and annual reports and was presented at the Presidential Commission on Mental Health, Alcohol and Other Drug Issues.


In summary, effectively utilizing assessment data is crucial for enhancing the impact of health education programs on college campuses. Use this guide as a foundational resource in evaluating and reimagining your current assessment practices. By asking yourself these six questions, you can more fully embrace and harness the power of assessment!

Connect with me by emailing [email protected] to brainstorm assessment ideas or to learn more about the Sandler Center’s assessment efforts.