Science literacy has been a hot topic of conversation in education for several years now. As part of the national push for more STEM focus, science literacy encompasses a number of skills (as opposed to content) that are essential for STEM and other professions in the 21st century workforce.
Our science team centered its goal this year around science literacy:
Based on the fact that students currently score below the state and national averages on MCAS, AP, and SAT II exams, our goal is to increase scientific literacy across grade levels. We will develop monthly assessments that measure proficiency in scientific literacy skills. We will review student performance on these monthly assessments, and if 70% of the class does not receive a 75% or higher, we will reteach and reassess.
We made this year’s goal in response to data showing that our students are consistently unprepared for the rigor of high-level assessments — most of the time not due to a lack of content knowledge, but a lack of skills in breaking down and interpreting complex texts, graphs, data, etc. The skill deficiency was also noticed by 12th grade teachers, who get wave after wave of students lacking the skills to research, write, and defend their senior thesis.
The need for these skills is even more urgent for us now because of the Common Core standards and the accompanying PARCC exam. Last year, students struggled with both the ELA and Math PARCC pilot tests — again, not due to content knowledge, but because they were unable to parse the questions and figure out what was even being asked.
So our administration basically said: working top-down from the skills we know students are missing in 12th grade AP, PARCC, and the senior defense, everybody align all the way down — every grade, every content area, every student.
Over the summer, we used two references as guidelines to construct a draft vertical alignment. Both are attached. The first is a PDF of the pages relevant to Scientific and Technical Literacy from the Common Core ELA standards, which are PARCC-aligned. These standards will serve as classroom-level guides for constructing tasks, assessments, projects, etc. All major projects and assessments should include components from them.
The second is the NMSI Process Skills Progression chart, which is based on the NGSS Science and Engineering Practices. The nine skills are broken down into three levels of increasing abstraction: Factual Knowledge, Conceptual Understanding, and Reasoning & Analysis. We have loosely decided to base the assessments for our Science Team goal on these skills. We will assess one of the nine skills per month and try to establish baseline data on where our students fall on the progression for each skill, by grade level. Then next year, we will use that baseline data as the starting point for constructing a full vertical alignment of what needs to be taught at each grade level, and in what depth.
Both of these overlap very well with what we’ve been using to design projects up to now, the Hess Cognitive Rigor Matrix (also attached). We will continue to measure our major projects against the Hess rubric.
That’s about all I really know at the moment, since we are just starting this initiative. I’ll try to post updates on any significant developments throughout the year as the work continues.