There are multiple levels of assessment. Assessment of how students are learning can occur at the level of an individual classroom session, a semester-long course, the culmination of learning throughout a major, or the learning that occurs in all courses a student takes from starting at CSULB until graduation. We've assembled a few websites here to help guide you in assessment at all levels.
The Program Review and Assessment division of Academic Affairs is a great resource when considering assessment at the programmatic (i.e., each major, each department) level. Each department submits an annual Assessment Report to convey how effectively a program meets its stated learning outcomes. This website offers links to learning outcomes for each department and our GE courses. It also offers guidance for creating learning outcomes and for measuring how well they are being met.
According to the website, the "Program Review and Assessment Division helps departments and academic programs ensure students graduate with highly valued degrees by:
CSULB has larger learning outcomes that should link to the learning outcomes from each College, Department, and to some degree, each course. When you want to see the big picture as you create your learning outcomes, check out this link.
This AAC&U PKAL report provides links to several rubrics to assess student learning that are applicable to STEM (see the VALUE rubrics).
Written for ecologists, this website nicely defines formative vs. summative assessment (good for those RTP files!).
If you are not already part of the Tomorrow's Professor listserv—get on it! This posting discusses a few ways to assess student learning (with the main example coming from changes in a calculus class at Berkeley). From the article: "A great way to keep students on track with their reading assignments is to assign what I call "Preparation Quizzes" -- I provide online, short quizzes, and they are open book. Students are required to complete the quiz prior to class. I can review the item analysis and determine where students were having difficulty with the reading -- and focus more class time on those areas."
These are examples of involving critical thinking, hypothesis testing, problem solving, etc. in MC exams. Just a few examples-- but helpful ones!
An excellent general guide from Vanderbilt University on writing good MC questions, including tips for distractors.
These two pages help demystify item analysis for MC tests (the information ParScore provides). The first walks step by step through why item analysis is done, how to do it, how to interpret the results, and how to crunch your own numbers to evaluate your test. The second gives definitions and tips on improving MC questions based on analysis of answers.
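If you'd rather crunch the numbers yourself than rely on the printout, the two core item-analysis statistics are simple to compute. A minimal sketch (the scoring matrix below is hypothetical, and the top/bottom-half split is one common way to estimate discrimination):

```python
# Item analysis for a multiple-choice test.
# scores[s][i] = 1 if student s answered item i correctly, else 0.
# (Hypothetical data for illustration.)

scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

n_students = len(scores)
n_items = len(scores[0])
totals = [sum(row) for row in scores]

# Difficulty index: proportion of students answering the item correctly.
difficulty = [sum(row[i] for row in scores) / n_students for i in range(n_items)]

# Discrimination index: difference in proportion correct between the top
# and bottom halves of the class (ranked by total score).  Values near or
# below zero flag items worth rewriting.
ranked = sorted(range(n_students), key=lambda s: totals[s], reverse=True)
half = n_students // 2
top, bottom = ranked[:half], ranked[-half:]
discrimination = [
    sum(scores[s][i] for s in top) / half - sum(scores[s][i] for s in bottom) / half
    for i in range(n_items)
]

for i in range(n_items):
    print(f"item {i + 1}: difficulty={difficulty[i]:.2f}, "
          f"discrimination={discrimination[i]:+.2f}")
```

An item everyone gets right (difficulty near 1.0) tells you little; an item your strongest students miss more often than your weakest (negative discrimination) is usually miskeyed or ambiguous.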
This is not well written, and it is for English/Drama rather than STEM (bear with us a bit on this)-- but it does contain some interesting observations on how to approach writing an essay test: useful tips on constructing questions, a convincing rationale for not allowing students to select among two or more essay choices on a timed test, suggestions for using compare/contrast/defend [the hypothesis that], etc., rather than vaguer prompts like "discuss" or "explain" (unless you really guide students toward what you want), and the always important tip of writing your key before you send your test to be photocopied!
This short, bulleted (and sometimes difficult to follow) list is an overall useful guide to scoring essay questions. In science and math, essay scoring is often "easier" (students get the correct answer or they don't); however, when students are asked to describe a process or pathway, it helps not only to have a key with the correct key words/ideas listed but also to attempt a more objective grading system.
This comprehensive assessment resource page from the Center for Teaching Excellence at Duquesne University has very useful information on grading, using rubrics, and writing essay questions.
The Field Tested Learning Assessment Guide (FLAG) is a great resource for in-class assessments and discipline-specific learning tools for STEM educators.
More resources from:
This can be one you create from your own lecture material, or it could be a general, discipline-specific one (for several diagnostic tests for physics, see Hestenes et al., 1992 - then scroll down to find out how to get the tests). For example, give a quick 10-question MC test on the first or second day of class and then give it again at the end of the course. A few caveats to think about: 1) if you give no credit, students might not take it seriously (less of a problem if you keep the test short, but it can still be an issue where students simply answer C, C, C all the way down); 2) because many students will panic at being graded on a pre-test, you might consider one or two extra-credit points for completing both pre- and post-test, or a point given only for improvement on the post-test; 3) however, if you tell students there is a post-test and you award points for improvement, they might "bomb" the pre-test.
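When it comes time to report pre/post results, one widely used statistic (common in the physics-education literature around the diagnostics cited above) is the normalized gain, the fraction of possible improvement actually achieved. A minimal sketch with hypothetical percent scores:

```python
# Normalized gain for a pre/post diagnostic:
# g = (post - pre) / (max_score - pre), i.e., the fraction of the
# room for improvement that a student actually gained.
# (Hypothetical scores for illustration.)

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of possible improvement realized between pre- and post-test."""
    if pre >= max_score:          # perfect pre-test: no room to improve
        return 0.0
    return (post - pre) / (max_score - pre)

pre_scores = [30, 45, 60, 20, 50]
post_scores = [60, 70, 80, 55, 65]

# Per-student gains, plus the gain computed from class averages.
gains = [normalized_gain(p, q) for p, q in zip(pre_scores, post_scores)]
class_gain = normalized_gain(sum(pre_scores) / len(pre_scores),
                             sum(post_scores) / len(post_scores))

for p, q, g in zip(pre_scores, post_scores, gains):
    print(f"pre={p:3d}  post={q:3d}  g={g:.2f}")
print(f"class-average gain: {class_gain:.2f}")
```

Because the gain is scaled by how much room each student had to improve, it lets you compare sections whose incoming preparation differs.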
The one-minute paper is a technique used across many disciplines. Students begin or end a class period by writing down what they don't understand. Teachers can then skim those papers and tailor the lesson accordingly. One idea for large classrooms might be to adapt this to a Discussion on Beachboard: students could post the topic they don't understand on the discussion board.
It's old (it's photocopied and actually looks like it was created on a typewriter!) and it's long (it's a handbook), but it does give nice background data on why assessment is needed, along with some ideas to use in class.
Content knowledge, problem-solving skills, critical thinking, and technique ability can all be assessed-- in fact, these types of assessments are included on most of our exams, laboratory reports, quizzes, papers, and other assignments. When you are thinking about your course, take some time to consider what you specifically want to improve about the course, and how that can be measured and analyzed. You can measure more than grades-- though overall final grade distribution (number of As-Fs), number of Ws, exam/quiz scores, performance on a particular problem or type of problem on an exam or assessment, class participation, and performance on a large synthesis assignment like a paper or project certainly are good items to consider.
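Tallying the grade distribution and W counts mentioned above is quick to script, which makes it easy to compare semesters before and after a course change. A minimal sketch with a hypothetical grade roster:

```python
# Tally a course's final-grade distribution and withdrawal count.
# (Hypothetical grade roster for illustration.)
from collections import Counter

grades = ["A", "B", "B", "C", "A", "W", "D", "F", "C", "B", "W", "A"]

dist = Counter(grades)
n = len(grades)

# Print each letter's count and its share of the roster.
for letter in ["A", "B", "C", "D", "F", "W"]:
    count = dist.get(letter, 0)
    print(f"{letter}: {count:2d}  ({100 * count / n:.0f}%)")
```

Running the same tally on rosters from before and after a pedagogical change gives you a simple, concrete comparison to report.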
You can study attitude as well as performance-- sometimes an improved attitude about math and science can go a long way! Don't forget to reflect on your own affect—how do you feel about teaching now that you've made this change? Since we aren't as familiar with creating surveys about how students feel about learning, turn to these or other expert references as you tailor surveys to fit your learning goals.
The background, scale items, and scoring instructions are all found here.
A mini paper with chemistry specific examples of in class assessment.
A great resource with surveys, ideas, and geoscience specific examples of assessment.
Writing Better Physics Exams, by Stith et al., The Physics Teacher, 26(3), pp. 138-144, March 1988
(We have access via CSULB computer or library)
In class math assessment techniques (a list in alphabetical order!) for math courses at multiple levels.