How will you know that a student’s learning experience is successful as they progress toward mastery? You need to identify specific criteria that indicate learning has taken place and that, when met, provide a quantifiable measure of student progress.
EXAMPLE OF CRITERIA:
(Criteria relate to Performance Indicators)
- (PI) Student is able to construct and implement well-formed objects.
- (criteria) Object contains required structures such as a name and a constructor.
- (criteria) Object created demonstrates composition.
- (criteria) Object created demonstrates polymorphism.
- (PI) Student is able to create valid classes that provide functionality defined in a specification.
- (criteria) Classes created compile.
- (criteria) Classes created execute without unhandled errors.
- (criteria) Classes created provide functionality defined by a specification that is not contained in the object definition.
- (PI) Student is able to develop code that adheres to the concept of separation of concerns.
- (criteria) Code created demonstrates effective use of encapsulation.
- (criteria) Code created demonstrates effective use of polymorphism.
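To make the criteria above concrete, here is a minimal sketch of the kind of student submission they could be applied to. The class and method names (`Engine`, `Vehicle`, `Truck`, `describe`) are hypothetical, chosen only to illustrate a named class with a constructor, composition, polymorphism, and encapsulation in one place.

```python
class Engine:
    """Component class used to demonstrate composition."""

    def __init__(self, horsepower):
        self._horsepower = horsepower  # (criteria) encapsulated attribute

    def describe(self):
        return f"{self._horsepower} hp engine"


class Vehicle:
    """Base class; subclasses override describe() (polymorphism)."""

    def __init__(self, name, engine):
        self._name = name      # (criteria) object contains a name, set in the constructor
        self._engine = engine  # (criteria) composition: a Vehicle has an Engine

    def describe(self):
        return f"{self._name} with a {self._engine.describe()}"


class Truck(Vehicle):
    def describe(self):  # (criteria) polymorphism via method overriding
        return "Truck: " + super().describe()


truck = Truck("Hauler", Engine(400))
print(truck.describe())  # → Truck: Hauler with a 400 hp engine
```

A grader applying the rubric could check each criterion directly against such code: the constructor and name are present, `Vehicle` composes an `Engine` rather than inheriting from it, and `Truck` overrides `describe()` while internal state stays behind underscore-prefixed attributes.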
You will see this ordered approach built into the LRM Rubric builder design. The guardrails in place guide users to begin with Competencies, then define Performance Indicators, and finally Criteria. Without this structure, data collected through rubrics would not map back to competencies.
When a rubric is selected, a list of the criteria used in that rubric is displayed. Criteria may be re-used across many rubrics tied to the same Performance Indicator. From there, the user can define descriptors for each criterion and performance level in the selected rubric.
The Scales tab gives users access to the Performance Scales used by the institution. We recommend that institutions create scales that can be used by all Instructors and Curriculum Designers.