Sunday, November 30, 2014

Assessing creativity and critical thinking

Our middle school has two fifth grade teachers who are piloting project-based and mastery-based learning in a subject-integrated classroom this year.  I have been obsessed with these approaches for a long time, and when the chance came last year to bring the idea to life at the middle school, I wanted to be a part of it in whatever way I could.  From my own experience developing new curriculum on the fly, I knew the classroom teachers would be completely exhausted just trying to get through each day, and collecting and analyzing data to track the long-term goals would likely take a back seat.  Thus, I became the data guy for the pilot.

Mastery learning and PBL are simply approaches, not ends in themselves.  The primary objectives of the pilot were to increase critical thinking and creativity among students (without sacrificing core content skills).  Unfortunately, these are hard things to measure, and there seems to be no "gold-standard" assessment out there.  Of the assessments that do exist, many are proprietary and carry a large, ongoing fee.  We wanted something we could use for years, at little to no cost, to measure the pilot against the control group of other 5th graders in the district.

As much as possible, we also wanted the assessments to be reasonably objective to score, though we would trade some precision for a task that required authentic thinking and creativity.  Our goal was to measure class performance (averages and the distribution of scores), not individual performance, which meant that no individual score needed to be perfectly precise as long as we had a large enough sample to average together (thank you, Central Limit Theorem).
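To make that concrete, here is a minimal simulation (with made-up numbers, not our data) of why a class average stays stable even when individual rubric scores are noisy:

```python
import random
import statistics

random.seed(1)

# Made-up numbers: suppose each student's "true" skill is 60/100 on
# average, and our rubric adds +/- 15 points of scoring noise per student.
TRUE_MEAN, SCORING_NOISE, CLASS_SIZE, TRIALS = 60, 15, 50, 10_000

class_means = []
for _ in range(TRIALS):
    scores = [random.gauss(TRUE_MEAN, SCORING_NOISE) for _ in range(CLASS_SIZE)]
    class_means.append(statistics.mean(scores))

# Individual scores are imprecise (sd of 15), but the class average is
# much tighter: its standard error is roughly 15 / sqrt(50), about 2.1.
print(round(statistics.stdev(class_means), 2))
```

With 50 students, a 15-point scoring error on any one student shrinks to roughly a 2-point standard error on the class mean, which is plenty of precision for comparing the pilot against the control group.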

Before we could measure these 21st century skills, we needed to get beyond the Twitter definitions (no offense, Tweeps) and find something concrete.  We started with the 6-12 Critical Thinking Rubric for PBL and the 6-12 Creativity and Innovation Rubric for PBL, both from the Buck Institute.  To further clarify critical thinking and reasoning, I constantly reference this page from criticalthinking.org and this very brief summary from Robert Ennis.  Creativity is not as well studied or written about as critical thinking, but one expert I kept finding my way back to is E. Paul Torrance.  The Torrance Tests of Creative Thinking focus on a student's ability to problem-solve and think divergently, but they also get very specific.  Though I thought the tests themselves were too canned and inauthentic, the criteria for creativity laid out on the Wikipedia page provide an awesome start to a task rubric.

Armed with definitions the team agreed upon, I started my desperate search for authentic tasks that could be assessed.  I was fortunate enough to stumble on the Ennis-Weir critical thinking assessment at 2am one night.  It is a letter from a citizen to the editor of a newspaper arguing for a change in city policy.  Students read the letter and respond, paragraph by paragraph, saying whether or not the author is exhibiting "good thinking" and why.  The task is very clearly defined, it does a good job of assessing a student's ability to read and think critically, and it comes with a reasonably clear rubric for scoring each response.   Because it is targeted at an older audience, I modified it from an 8-paragraph to a 5-paragraph letter, added worksheet-style structure, greatly simplified the language, and made the "reason" section partially multiple choice.  Before starting the task, I will read an instruction sheet with the definitions of all of the choices so students have at least some familiarity with the terms.  See the example task I created.  See the rubric used to score the task.
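To give a sense of the scoring shape this structure enables, here is a hypothetical sketch; the paragraph judgments, reason choices, and point values below are all invented for illustration and are not the actual rubric linked above:

```python
# Hypothetical scoring sketch for the modified Ennis-Weir task: one entry
# per letter paragraph, with the expected good/bad judgment and the
# acceptable multiple-choice reasons. All content here is made up.
ANSWER_KEY = {
    1: {"judgment": "bad", "reasons": {"circular reasoning"}},
    2: {"judgment": "bad", "reasons": {"irrelevant evidence", "emotional appeal"}},
    3: {"judgment": "good", "reasons": {"uses relevant evidence"}},
    4: {"judgment": "bad", "reasons": {"overgeneralization"}},
    5: {"judgment": "good", "reasons": {"considers other viewpoints"}},
}

def score_response(responses: dict) -> int:
    """1 point for the right judgment, 1 more for an acceptable reason."""
    total = 0
    for paragraph, answer in ANSWER_KEY.items():
        given = responses.get(paragraph, {})
        if given.get("judgment") == answer["judgment"]:
            total += 1
            if given.get("reason") in answer["reasons"]:
                total += 1
    return total

student = {1: {"judgment": "bad", "reason": "circular reasoning"},
           2: {"judgment": "good", "reason": "emotional appeal"}}
print(score_response(student))  # -> 2 (only paragraph 1 is fully credited)
```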

For creativity, I never did find a good starting point online.  Fortunately, I knew quite a bit about one of the tasks designed by E. Paul Torrance, thanks to spending grades 7-12 competing in Future Problem Solving (FPS).  The "team global issues" component asks groups of four students to spend a couple of months researching a given topic of future significance, such as water, nutrition, or cyber security.  On the day of competition, the team is put in a room together with two hours to read and respond to a futuristic scenario related to the topic of study (such as this one about space).  As a team, you generate up to 16 possible challenges you see in the scenario, in as many categories as possible.  Next, you choose the one challenge that seems most important and write it as your "underlying problem": Since (statement from scenario), how might we (verb phrase) in order to (purpose) for (topic, place, time)?  Continuing from here, you write up to 16 solutions to your underlying problem, again in as many categories as possible.  To finish, you write 5 relevant criteria for judging your solutions, rank the solutions against each criterion (a grid sketched below), and write a two-page elaboration of your winning solution.  For a painfully detailed guide to assessing an FPS booklet, read this.
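That convergent step, as I understand it, works like a grid: each solution is ranked against each criterion, and the rank totals pick the winner.  Here is a minimal sketch with invented solutions and criteria (trimmed to 4 criteria rather than the competition's 5):

```python
# Sketch of an FPS-style criteria grid: each solution gets a rank per
# criterion (3 = best of the 3 solutions; higher is better), and the
# rank totals pick the winning solution. All names here are invented.
SOLUTIONS = ["desalination drones", "rainwater co-ops", "greywater recycling"]

grid = {
    "cost to implement": [1, 3, 2],
    "speed of impact":   [2, 1, 3],
    "people helped":     [3, 2, 1],
    "feasibility":       [1, 3, 2],
}

totals = {solution: sum(ranks[i] for ranks in grid.values())
          for i, solution in enumerate(SOLUTIONS)}
winner = max(totals, key=totals.get)
print(totals)             # {'desalination drones': 7, 'rainwater co-ops': 9, ...}
print("winner:", winner)  # rainwater co-ops
```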

I modified the FPS process to make it something any student could do in 30 minutes, with no more than a few minutes of training.  Students are given a 2-sentence description of a new invention.  They then generate up to 7 possible groups of people who might want to use the invention; choose the most important group and give a reason; generate up to 7 possible improvements to the original invention that would specifically benefit that user group, explaining how each one does so; write 4 criteria for judging the improvement ideas; defend why their favorite idea is good based on the criteria; and elaborate on that favorite idea.  This simplification cuts out a lot of the beauty in the FPS process, but I think it serves as a simple and powerful diagnostic of divergent and reason-based convergent thinking.  See the example task I created.  See the rubric used to score the task.
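The linked rubric is the real scoring guide; purely as an illustration, here is how the Torrance-style divergent-thinking criteria mentioned earlier (fluency: how many relevant ideas; flexibility: how many distinct categories they span) could be tallied once a scorer has categorized a response.  The ideas and category labels are invented:

```python
# Illustrative tally of two Torrance-style divergent-thinking criteria:
# fluency (count of distinct, relevant ideas) and flexibility (count of
# distinct categories those ideas span). Example content is made up.
def divergent_scores(ideas: list[tuple[str, str]]) -> dict:
    """ideas is a list of (idea, category) pairs judged relevant by the scorer."""
    return {
        "fluency": len(ideas),
        "flexibility": len({category for _, category in ideas}),
    }

# Hypothetical response: user groups for an invention, each tagged with
# a category by the scorer.
response = [
    ("nurses on night shifts", "health care"),
    ("surgeons", "health care"),
    ("long-haul truckers", "transportation"),
    ("new parents", "home life"),
]
print(divergent_scores(response))  # -> {'fluency': 4, 'flexibility': 3}
```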

For both assessments, I made a complete example version.  Note that these examples are not used with students -- the format is identical, but the exact content of the letter or invention is changed to prevent advance preparation on the topic.  The intention is to allow public critique of the process and to make the test and rubric as open-source as possible.  Just as open source once was in software, open-source assessment runs against the grain of common practice.  People worry that if they give away too much detail about the test, teachers will just teach to the test -- but that is exactly what I want!  If the test is very well designed and teachers teach to the test, then students will learn critical thinking and creativity skills.  Teachers will buy into the assessment as a valid measure of what they are trying to improve and will use the results to change their practice when it is not working.  I want to design something that impacts instruction, not something that ends up in a buried Excel file.  By opening up the design, I also get to tap into the thoughts of the rest of the world and get critiques from people who know a lot more about this than I do.  That strengthens the assessment and helps validate it as a measure.  If other people start using it or its derivatives, that could also lead to a larger pool of comparative data, improved scoring rubrics, and a community around free K-12 assessments of creativity and critical thinking.  Keeping the design of the test cloaked in mystery offers so little upside compared to everything it misses out on.

Finally, my charge to you: please comment on, critique, and question both the tasks I made and the process I used to develop them.  If you are also interested in this kind of assessment, please reach out and let's work together to refine these tasks and rubrics further and make them truly awesome.  I will admit that I only have a couple of days to make changes, as I start administering the pre-assessments to the 5th graders later this week.  Longer-term, though, I am more interested in an improved version that better measures student thinking and creativity and better guides teachers who care about helping their students improve these skills.