
What is Program Assessment?

For every academic program, faculty have learning goals for their students. Program-level assessment is conducted by faculty who wish to improve the quality of student learning in their programs and provide justification for the resources needed to maintain them. These assessments involve the systematic collection of information about student learning across courses in an academic program to (a) determine how well students are meeting program goals and (b) inform decisions about program content, delivery and pedagogy.

Note: Academic departments conduct two types of assessment projects: (1) curricular and learning assessment, and (2) a comprehensive self-study, conducted once every few years, known as Academic Program Review (APR). The latter aims to identify a program’s challenges and opportunities in areas of quality, service, efficiency, and resources. Assessment of student learning is just one component of APR. This page focuses on the process for the former.

How to Assess an Academic Program

When designing an assessment activity, it is important to consider the following:

  1. Assessment is an opportunity to address faculty concerns related to student learning and should start with genuine questions.
  2. It is important to articulate the learning goals of the program and the ways in which progress on these goals can be meaningfully measured.
  3. Good assessment is manageable and sustainable; it is neither prudent nor productive to measure everything.

Remember that assessment is an ongoing process, and create a long-term strategy for the assessment of your program’s learning goals.

Step 1. Define Curricular Goals

It can help to start with a mission statement — a concise description of the program’s purpose, values, and principles that guide its curriculum.

Encourage program faculty to discuss the following questions:

  • What is the program’s purpose? 
  • What are the program’s most important activities, services, and offerings?
  • What values and principles guide the program’s activities?
  • How does the program contribute to the education and career goals of its students?

Example mission statement:

The mission of the Honors program is to gather engaged faculty and academically motivated, curious students from across campus within a holistic learning community that emphasizes active learning, intellectual exploration, and civic engagement.

A program’s curricular goals are broad statements about how we want students to be different as a result of completing the academic program (Suskie, 2009). More specifically, program goals specify the knowledge, values, or skills we want students to have acquired. To define the goals for students in a program, engage faculty in the following questions:

  • What achievements do you expect of all graduates from your program?
  • Describe the ‘ideal student’ graduating from your program.
    • What does this student know?
    • What can this student do?
    • What does this student care about?
    • What program experiences contributed to the development of this ideal student?

You may want to collect syllabi or course descriptions from capstone courses, or review the curricular goals of similar programs at other institutions.

Example Curricular Goals:

  1. “To teach students how to be conscientious consumers of information.”
  2. “To prepare students for careers in the Accounting industry.”

Step 2. Set Program Learning Objectives

Program Learning Objectives (PLOs) describe what students will be able to demonstrate when a learning goal has been met. Once the program’s curricular goals have been articulated, it’s time to translate these into measurable objectives. This will help point to the ways in which progress on these goals can be meaningfully measured.

Example PLOs (aligned to the curricular goals in Step 1):

  1. “Students will develop the disposition and skills to gather, organize, and evaluate the credibility of information and ideas.”
  2. “Students will describe and apply accounting principles and rules to a variety of business situations.”

Use of action verbs from Bloom’s Taxonomy can help to ensure that your PLOs are measurable. Some examples include:

    • identify…
    • explain…
    • differentiate…
    • integrate…
    • challenge…
    • compose…

You can find more examples of Program Learning Objectives here.

Importantly, program learning objectives should be clearly stated on all program materials (e.g., the program website, student handbook, etc.). This helps students to understand (1) what they can expect from the program, and (2) what they are expected to achieve in the program.

Step 3. Map for Alignment

A “Curriculum Map” identifies how the courses in a program contribute to the program’s learning goals and facilitates understanding of how courses are sequenced and “fit” together.

This can help you to structure program content in ways that best support students, and it also shifts instructors’ focus from “my course” to “our program.” Curriculum mapping supports faculty collaboration and curriculum revision, as the activity often reveals curricular strengths, gaps, and unnecessary overlaps.

How to develop a basic curriculum map:

  1. Place the program learning goals in a row across the top of an Excel sheet
  2. List all required courses in a column down the left side
  3. Convene the faculty teaching within your program for 1 to 2 hours
  4. Have faculty identify the program learning goals they are covering in their courses
  5. Follow up by examining the completed map together

Ask questions such as:

  • Is the curriculum well-sequenced? How well are new majors prepared for mid-level courses? How well are majors prepared for capstone courses? 
  • Do students have enough practice to achieve each curricular goal?
  • What does the curriculum say about our priorities? 
  • How do the required learning experiences of our curriculum shape a graduate of our program? How well are graduates prepared for life after college?

Carnegie Mellon’s Eberly Center offers a great Blank Mapping Tool (xls).
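
If a department prefers a short script to a spreadsheet, the same map can be drafted and sanity-checked in code. The sketch below is illustrative only: the course numbers, PLO labels, and the “I/R/M” (introduced/reinforced/mastered) notation are assumptions, not a required format.

```python
# A minimal sketch of a curriculum map built in code rather than a
# spreadsheet. Course numbers, PLO labels, and the I/R/M notation are
# hypothetical placeholders; substitute your own program's courses and PLOs.
import pandas as pd

plos = ["PLO 1", "PLO 2", "PLO 3"]           # learning objectives (columns)
courses = ["101", "201", "301", "Capstone"]  # required courses (rows)

# "I" = introduced, "R" = reinforced, "M" = mastered, "" = not addressed
curriculum_map = pd.DataFrame(
    [
        ["I", "I", ""],
        ["R", "",  "I"],
        ["",  "R", "R"],
        ["M", "M", "M"],
    ],
    index=courses,
    columns=plos,
)

# Flag gaps: objectives no course addresses, and courses tied to no objective
uncovered_plos = [p for p in plos if (curriculum_map[p] == "").all()]
unaligned_courses = [c for c in courses if (curriculum_map.loc[c] == "").all()]

print(curriculum_map)
print("PLOs with no coverage:", uncovered_plos or "none")
print("Courses aligned to no PLO:", unaligned_courses or "none")
```

A quick scan like this surfaces the same gaps and overlaps that faculty look for when examining the map by hand.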

NOTE: When program curriculum maps are made available to students, this facilitates understanding of how (1) each course is intended to contribute to specific knowledge or skills of the program, and (2) elective courses could be strategically selected to further strengthen knowledge or skills or to explore new areas of interest.

Step 4. Choose Assessment Methods

Identify the possible measures and/or data source(s) your department will use to gauge progress. Note that high-quality assessment incorporates at least some direct measures along with some indirect measures*. For example:

    • Pre- and post-assessments of a particular desired outcome
    • Attainment in required courses (DFW rates, student surveys, etc.)
    • Blind ratings of student artifacts using a faculty-developed rubric
    • Retention rates, graduation rates, and within-group gaps
    • Comparisons to other departments, national norms, etc.
    • Placement success: jobs/salaries, graduate schools
    • Alignment with industry/professional needs

 *More examples are provided in the sections below on direct and indirect measures.
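
As one illustration of the first measure above, a pre-/post-assessment can be summarized with a simple paired comparison. This is a minimal sketch using fabricated placeholder scores; a department might extend it with a paired t-test or an effect size.

```python
# Minimal sketch: summarizing a pre-/post-assessment of one learning
# objective. All scores below are fabricated placeholders for illustration;
# each position pairs the same student's pre- and post-scores.
from statistics import mean, stdev

pre_scores = [62, 70, 55, 68, 74, 60, 66, 71]
post_scores = [75, 82, 63, 80, 79, 72, 78, 85]

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

print(f"Mean pre-score:  {mean(pre_scores):.1f}")
print(f"Mean post-score: {mean(post_scores):.1f}")
print(f"Mean gain:       {mean(gains):.1f} (SD {stdev(gains):.1f})")
print(f"Students who improved: {sum(g > 0 for g in gains)} of {len(gains)}")
```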

Important considerations: When designing an assessment activity, consider both the logistics of the activity as well as the validity of the assessment. Often, assessing progress on program learning goals involves a series of decisions dependent on your sample size, resources, needs, and other project variables. To conduct meaningful and worthwhile assessment activities, consider the following questions:

    • Is the activity clearly related to the mission and goals?
    • Will it provide meaningful, actionable results?
    • Is the program prepared to carry out the activity?
    • Is the scope of the activity suitable for the timeframe planned?
    • For assessments that span multiple years, how well does the activity integrate with the overall project?

Be sure to explain and document the procedures used to determine whether the curriculum is effective in achieving the goals sought by the department. Copies of surveys or other instruments used should be included in the appendices of your assessment report.

NOTE: The OIE supports assessment activities with institutional data and survey projects. Learn more here: OIE Assessment Support

■ Examples of Direct Measures

When properly aligned to curricular goals and learning objectives, assessment methods help us determine whether our students are learning what we think they’re learning, as well as how well program content, delivery and pedagogy support students’ opportunities to achieve program learning goals.

Aim to measure both directly and indirectly. Direct measures capture how well students demonstrate that they have achieved a learning goal. Some examples include:

  • Capstone Courses: Senior seminars or designated assessment courses where program learning goals are integrated into assignments.
  • Collective Portfolios: Faculty assemble samples of de-identified student work from various courses and use this “collective portfolio” to assess specific program learning goals. Portfolios should be assessed using normed scoring rubrics and expectations should be clarified before portfolios are examined.
  • Content Analysis: A procedure that analyzes the content of written documents. The analysis begins with identifying the unit of observation, such as a word, phrase, or concept, and then creating meaningful categories to which each item can be assigned. For example, a student’s statement that “I learned that I could be comfortable with someone from another culture” could be assigned to the category of “Positive Statements about Diversity.” The frequency of this type of response can then be quantified and compared with neutral or negative responses addressing the same category (a tallying sketch follows this list).
  • Course Assessment: Data collected from de-identified course assessments can be analyzed to assess program learning goals if the assessments are aligned to these outcomes.
  • Embedded Questions in Assignments and Exams: Questions related to program learning goals are embedded within course assignments or exams. For example, all sections of “research methods” could include a question or set of questions relating to your program learning outcomes. Faculty score and grade the exams as usual and then copy exam questions that are linked to the program learning outcomes for analysis. The findings are reported in the aggregate.
  • Locally developed essay questions: Faculty develop essay questions that align with program learning goals. Performance expectations should be made explicit prior to obtaining results.
  • Locally developed exams: Faculty create an exam that is aligned with program learning goals. Performance expectations should be made explicit prior to obtaining results.
  • Normed Scoring Rubrics: When developed and normed by faculty, these rubrics can be used to holistically score any product or performance such as essays, portfolios, recitals, oral exams, research reports, etc. A detailed scoring rubric that delineates criteria used to discriminate among levels is developed and used for scoring. Generally two raters are used to review each product and a third rater is employed to resolve discrepancies.
  • Observations: Observations can be made of any social phenomenon, such as student presentations, students working in the library, or interactions at student help desks. They can be recorded as a narrative or in a highly structured format, such as a checklist, and they should be focused on specific program objectives.
  • Primary Trait Analysis: A process of scoring student assignments by defining the primary traits that will be assessed and then applying a scoring rubric for each trait.
  • Standardized Achievement and Self-Report Tests: Select standardized tests that are aligned to your specific program learning goals. Score, compile, and analyze data. Develop local norms to track achievement across time and use national norms to see how your students compare to those on other campuses.

Source: Allen, M., Noel, R. C., Rienzi, B. M., & McMillin, D. J. (2002). Outcomes Assessment Handbook. Long Beach, CA: California State University, Institute for Teaching and Learning.
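
To make the tallying step of content analysis concrete, here is a minimal sketch. The category labels and coded responses are hypothetical placeholders; in practice, trained raters assign each unit of observation to a category before anything is counted.

```python
# Minimal sketch: tallying coded content-analysis categories. The coded
# responses below are hypothetical placeholders; real coding is done by
# trained raters against an agreed-upon category scheme.
from collections import Counter

coded_responses = [
    "Positive Statements about Diversity",
    "Positive Statements about Diversity",
    "Neutral Statements about Diversity",
    "Negative Statements about Diversity",
    "Positive Statements about Diversity",
]

tally = Counter(coded_responses)
total = len(coded_responses)
for category, count in tally.most_common():
    print(f"{category}: {count} ({count / total:.0%})")
```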

■ Examples of Indirect Measures

Again, aim to measure both directly and indirectly. Indirect measures capture students’ (or others’) perceptions of how well students have achieved a learning goal. Some examples include:

  • Curriculum Maps: Curriculum Maps (or Degree Maps) are used to summarize the relationship between program goals and courses, course assignments, or course syllabus objectives to examine congruence and to ensure that all objectives have been sufficiently structured into the curriculum.
  • Focus Groups: A series of planned discussions among homogeneous groups of 6-10 students who are asked a carefully constructed series of open-ended questions about their beliefs, attitudes, and experiences. The session is typically recorded and later transcribed for analysis. The data are studied for major issues and recurring themes along with representative comments.
  • Interviews: Conversations or direct questioning with a student or group of students. The interviews can be conducted in person or on the telephone. Interviewers should be trained to follow agreed-upon procedures (protocols).
  • Surveys: can be used to (a) assess learning, (b) assess student needs, (c) obtain student feedback, and (d) find out what happens to students after graduation. A program assessment survey should cover a number of topics, including: program goals, courses, instructors, educational supports, DEI, and career prep/professional development. Please request assistance with survey design and/or administration from OIE. If designing your own instrument, please consult Best Practices in Survey Research, as good data only comes from good methodology. See our Program Assessment Survey Template or our Alumni Outcomes Survey Template for examples of appropriate question items and answer options.
  • Transcript Analysis: Transcripts are examined to see whether students followed expected enrollment patterns, or to answer specific research questions, such as exploring differences between transfer students and students who entered as freshmen.

Source: Allen, M., Noel, R. C., Rienzi, B. M., & McMillin, D. J. (2002). Outcomes Assessment Handbook. Long Beach, CA: California State University, Institute for Teaching and Learning.

Step 5. Document your Assessment Plan

An Assessment Plan documents an assessment activity according to the stages of the assessment cycle: stating the department’s mission and goals, mapping goals to courses, and selecting assessment methods.

Guiding questions for designing an assessment project:

  • Which program goals should we assess? 
  • How might we use assessments already embedded in our courses to measure learning at the program level?
  • How can we assure that we use the same criteria when assessing a particular outcome? Should we test interrater reliability? (A sketch of one common statistic follows this list.)
  • What is our implementation plan and timeline?
  • What do we expect to find?
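
On the interrater reliability question above: when two raters score the same set of artifacts, Cohen's kappa is a common agreement statistic. The sketch below computes it from scratch using hypothetical rubric scores.

```python
# Minimal sketch: Cohen's kappa for two raters scoring the same artifacts
# on a 1-4 rubric. All ratings below are hypothetical placeholders.
from collections import Counter

rater_a = [3, 2, 4, 3, 1, 2, 4, 3, 2, 3]
rater_b = [3, 2, 3, 3, 1, 2, 4, 2, 2, 3]
n = len(rater_a)

# Observed agreement: proportion of artifacts given identical scores
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Agreement expected by chance, from each rater's marginal frequencies
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
categories = set(rater_a) | set(rater_b)
p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"Observed agreement: {p_observed:.2f}")
print(f"Expected by chance: {p_expected:.2f}")
print(f"Cohen's kappa:      {kappa:.2f}")
```

Values near 1 indicate strong agreement beyond chance; low values suggest the rubric or the norming session needs revisiting before results are trusted.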

Keep in mind that the aim of assessment is improvement over time. Assessment should be useful, actionable, manageable, and sustainable. We ask that you assess progress towards at least two of your department’s goals each year, with an eye towards how you will assess other goals in the future. Over time, progress towards all goals should be assessed every few years (ideally every 5 years or less).

Before you submit your Assessment Plan, we recommend that you complete this Academic Assessment Progress Report to guide discussion on sustainable assessment practices.

Download a copy of the Academic Assessment Planning Template here.

Step 6. Submit your Report and “Close the Loop”

Given what you’ve learned, where can changes be made? Assessment empowers faculty to make evidence-based decisions about program design and resource allocation.

“Closing the Loop” means identifying ways to use your results. Assessment findings may point to a program’s strengths or areas of concern. If the activity reveals the latter, what concrete actions will the department take to address this?

Use assessment results to spark meaningful conversation and collaboration among faculty as to what steps can be taken (content, structure, alignment, delivery, pedagogy, etc.), and how implemented changes might be assessed in the future. Other questions to address include:

  • In hindsight, was this a useful activity? What, if anything, should we do differently?
  • Are students meeting some program goals better than others?
  • Does the report indicate the actions to be taken based on the results of this activity?
  • Are there suggestions for future activities based on the results of this one?

Then complete and submit an assessment report. The annual report is an opportunity to share progress in key areas, request support, and scaffold the department’s periodic Academic Program Review Self-Study.

Download the Academic Assessment Annual Reporting Template.

NOTE: Remember, no individual faculty member has sole responsibility for ensuring that students will acquire one or more of the program’s learning goals. Student attainment of learning goals should result from the collective learning experiences students engage in during their time at the college. Therefore, learning assessment must not be used to evaluate any individual faculty member or to measure the performance of any individual instructor.

Support for Program Assessment

Assessment Coordinators guide departmental assessment committees and department chairs throughout the assessment process.

The Office of Institutional Effectiveness (OIE) provides data and applied research to support assessment projects.

The Assessment Council provides feedback to academic departments about their assessment plans.

Rebekah Chow, Associate Provost of Institutional Effectiveness, and Chris Hanusa, Faculty Liaison for Evaluation and Assessment, provide guidance on assessment plans and reports.

The assessment cycle at QC: Inquiry, Evidence, Insight, Action, Improvement

To learn more about program-level learning assessment methods, see our FAQs, browse our Assessment Resources, or check out our Events page for upcoming professional development opportunities. Also, feel free to peruse guides to program-level assessment at other colleges, such as Clark College or Carnegie Mellon.

Assessing Our General Education Program

General education at Queens College is an education in the liberal arts and sciences – courses that introduce students to the perspectives and knowledge of many disciplines. Our goals for providing students an education in the liberal arts and sciences have endured since the college was founded in 1937, even as the courses and requirements have changed over the years. Since 2013, entering freshmen and transfer students have followed a liberal arts curriculum that fits the framework of the CUNY Pathways Initiative. Pathways course proposals are reviewed by several committees to ensure that they achieve the required set of Pathways Student Learning Outcomes, which represent the broad skills and knowledge that all undergraduate students are expected to attain as a result of their educational experiences at the College.

As the core of the undergraduate student experience, assessment of the general education program is of utmost importance. A recent Pathways Syllabi Project and Assessment of English 110 (First Year Writing at QC) indicate we need to rethink the general education program at QC. To this end, the Curriculum Strategic Planning Committee, a faculty- and staff-led working group, is tasked with determining how our General Education curriculum might be improved.