Assessment Definitions

Terms associated with educational objectives and rubrics

A variety of terms express the educational objectives of institutions, programs, and courses, and these objectives are phrased differently for different audiences. The more rigorous and precise phrasing used for standards, outcomes, and performance levels is what allows assessment tools to be applied.

Mission - The statement of the broadest educational intentions of an institution or program, describing the attributes of students upon graduation. For example, at Marietta College our Mission Statement is:

"We achieve this mission by offering undergraduates a contemporary liberal arts education and graduate students an education grounded in advanced knowledge and professional practice. Intellectual and creative excellence defines the Marietta experience."

Educational Goals - Goals are more narrowly defined than a mission statement and more specifically identify the objectives of a learning experience. Our institutional goals are expressed in the Seven Core Values and the General Education Curriculum.

Outcomes - Outcomes are skills and attributes students demonstrate as evidence of achieving an assessment goal. Outcomes are often classified into three categories based upon the type of learning being assessed:

  • Cognitive outcomes - What should students know?
  • Affective outcomes - What should students think or care about?
  • Behavioral outcomes - What should students be able to do?

Outcomes are also classified based upon the educational level to which they apply:

  • Institutional Student Learning Outcomes (ISLOs)
  • Program Student Learning Outcomes (PSLOs)
  • Course Student Learning Outcomes (CSLOs)

Institutional Student Learning Outcomes

We have seven sets of ISLOs, each focused upon a particular type of learning objective:

  • Inquiry skills
  • Communication skills
  • Quantitative Reasoning skills
  • Competence in Cultural and Social Diversity
  • Practice of Ethical and Responsible Citizenship
  • Artistic Literacy
  • Critical Thinking

Critical Thinking is treated as a "meta-group", and draws upon selected assessment data from the other areas. As examples, these are the Assessment Outcomes for two ISLO sets:

Communication Skills Set
1. Responsiveness to Purpose
2. Conveyance of Central Message
3. Demonstrated Understanding of Content
4. Application of Disciplinary Conventions
5. Use of Source Information
6. Use of Syntax and Diction
7. Quality of the Delivery

Inquiry Skills Set
1. Selection of Question
2. Use of Information Sources
3. Identification of Thesis
4. Application of Inquiry Methodology
5. Analysis of Findings*
6. Drawing of Conclusions*

*Asterisk indicates an outcome that contributes to assessment of critical thinking


A rubric is a tool used to assess student learning. It consists of a specific outcome and descriptions of performance levels that reflect different levels of learning achievement. For example, the rubric for the "Responsiveness to Purpose" outcome of the Communications Skills set looks like this:

Performance levels for "Responsiveness to Purpose"
4 - Demonstrates a thorough understanding of context, audience, and purpose that is responsive to the assigned task and focuses all elements of the work.
3 - Demonstrates an appropriate consideration of context, audience, and purpose that is focused on the assigned task.
2 - Demonstrates a partial consideration of context, audience, and/or purpose that is focused in a limited way on the assigned task.
1 - Demonstrates a minimal consideration of context, audience, and/or purpose that lacks focus on the assigned task.
Interpretation of the performance levels
4 - Demonstrates a quality of work that might be expected of an experienced professional or an unusually capable individual.
3 (Target) - Demonstrates a quality of work that we expect for our graduating seniors.
2 - Demonstrates some of the desired attributes, but with clearly recognizable deficiencies.
1 - Deficiencies predominate, although there is some demonstration of the desired attributes.
0 - Work does not demonstrate application of appropriate process or skills to the task; in other words, no demonstrated attributes.
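The 0-4 scale and its target level can be sketched in code. This is a minimal, hypothetical illustration (the level descriptions are paraphrased from above; the function and variable names are our own, not part of any institutional system):

```python
# Hypothetical sketch: the 0-4 performance scale as a lookup table,
# with a helper that reports whether a score meets the target of 3.
PERFORMANCE_LEVELS = {
    4: "Quality expected of an experienced professional or unusually capable individual",
    3: "Quality of work expected of graduating seniors (Target)",
    2: "Some desired attributes, with clearly recognizable deficiencies",
    1: "Deficiencies predominate, with some demonstration of desired attributes",
    0: "No demonstrated attributes",
}

TARGET_LEVEL = 3

def meets_target(score: int) -> bool:
    """Return True if a rubric score meets or exceeds the target level."""
    if score not in PERFORMANCE_LEVELS:
        raise ValueError(f"Score must be 0-4, got {score}")
    return score >= TARGET_LEVEL
```

A program might apply `meets_target` to each scored artifact and report the fraction of students at or above target for a given outcome.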

Terms associated with types of assessment

Authentic assessment - a form of direct measure, defined by John Mueller as "A form of assessment in which students are asked to perform real-world tasks that demonstrate meaningful application of essential knowledge and skills."

Formative assessment - assessment done in 'real time' to evaluate how well skills and concepts are being learned, with results ("feedback") given back to students during the course to improve the learning process.

Summative assessment - assessment done upon completion of an assignment or at the conclusion of a course, where the results are used to improve the teaching process in future semesters. Summative assessment can also be used as part of the grading process.

Terms associated with assessment tools

Portfolio - an archived collection of work submitted by a student for assessment purposes.

Direct measure - measures of students' actual knowledge, skills, and/or behaviors.

Indirect measure - measures of perceptions, beliefs, or opinions of students' knowledge, skills, and/or behaviors.


Terms associated with assessment quality

Assessment Reliability - refers to how reproducible the results are. Forms of reliability include:

  • Intra-rater (How consistently does an assessment tool yield similar results when used repeatedly by the same instructor?)
  • Inter-rater (How consistently does an assessment tool yield similar results when used by different instructors?)
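Inter-rater reliability can be quantified. One common statistic (not mentioned above, but a standard choice) is Cohen's kappa, which corrects raw agreement between two raters for the agreement expected by chance. The scores below are invented for illustration, imagining two instructors independently rating the same ten artifacts on a 0-4 rubric:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' scores of the same items."""
    n = len(a)
    # Observed agreement: fraction of items where the raters gave the same score.
    observed = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    # Chance agreement: probability both raters assign the same level at random,
    # given each rater's marginal distribution of scores.
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores from two raters on the same ten student artifacts.
rater_a = [3, 3, 2, 4, 3, 2, 1, 3, 4, 2]
rater_b = [3, 2, 2, 4, 3, 2, 1, 3, 3, 2]
print(round(cohens_kappa(rater_a, rater_b), 2))
```

By convention, kappa values above roughly 0.6 are often read as substantial agreement, though interpretation thresholds vary.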

Assessment Validity - The degree to which evidence and theory support the interpretations of assessment results; or, more simply, how well an assessment tool measures what it is intended to measure. Forms of validity include:

  • Content validity - Do the skills/abilities being measured actually reflect the learning outcome being assessed?
  • Construct validity - Are the assessment tools properly designed and sufficient to measure the learning outcome being assessed?
  • External validity - Does an assessment tool yield results similar to those of other assessment tools measuring the same indicator of a learning outcome?
  • Predictive validity - Do assessment results predict students' success in applying the learning outcome in a job or other 'real-world' setting?

Assessment Fairness -



Other Assessment Terms

Assessment Cycle - The schedule by which data for different assessment goals or outcomes are summarized and reported. Collection of assessment results is an ongoing process.