SCIENCE CLASSROOM
J.Fox
Inquiry Cycle 1:
CSTP - Effective Lesson Design
Universal Design for Learning (UDL)
Standard - 4 CSTP
Planning Instruction and Designing Learning Experiences for All Students
Element 4.4
Planning instruction that incorporates appropriate strategies to meet the learning needs of all students
​
Dates: 22 September 2022 - 24 November 2022
DRIVING Question
​
How can I create a supportive environment for multilingual learners and students with learning differences to access the information and material within the module, while also being able to show mastery on assignments?
​
How did I plan to close the gap?
​
Incorporating the full domains of language development: listening, speaking, reading, & writing through various strategies implemented within a module.
Students will have the opportunity to listen to teacher modeling pronunciation of vocabulary and proper use.
Students will have the opportunity to speak to peers using vocabulary accurately in a sentence.
Students will have the opportunity to read segments of text using the vocabulary to build context for understanding.
Students will have the opportunity to write a comprehensive paragraph utilizing the vocabulary at the end of the module.
Reflection​
​
To answer the driving question and incorporate supportive vocabulary-development practices that enable access to the information, I sampled the midterm exam scores of nine students across low-, middle-, and high-performing groups, including multilingual learners and students with learning differences. Each student sample was analyzed by looking at the score on a 4-question matching vocabulary section and the written responses on the exam. After implementing language-supporting activities, I reassessed their vocabulary use in a mastery assignment at the end of a recent unit.
​
The 4-question matching vocabulary section was analyzed by counting the number of correctly answered questions out of the 4 possible points. The middle-performing and high-performing students all answered the 4 matching questions correctly, while the low-performing students answered 2 questions correctly.
​
The free-response answers were analyzed against a set of standards that included attempts to use vocabulary and correct use of vocabulary. The high-performing group had a median of 5 attempts and a median of 4 correct uses. The middle-performing group had a median of 3 attempts and a median of 3 correct uses. The low-performing group had a median of 1 attempt and 0 correct uses of the vocabulary.
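The group medians described above can be reproduced with a short sketch. The per-student tallies below are hypothetical stand-ins chosen to be consistent with the reported medians, not the actual exam data:

```python
from statistics import median

# Hypothetical (attempts, correct uses) tallies per student on the
# free-response section, grouped by performance level.
groups = {
    "high":   [(5, 4), (5, 4), (6, 5)],
    "middle": [(3, 3), (3, 3), (4, 2)],
    "low":    [(1, 0), (1, 0), (2, 1)],
}

for name, scores in groups.items():
    attempts = median(a for a, _ in scores)
    correct = median(c for _, c in scores)
    print(f"{name}: median attempts = {attempts}, median correct = {correct}")
```

Computing the two medians separately per group mirrors the analysis in the paragraph: attempts measure willingness to use the vocabulary, correct uses measure grasp of meaning in context.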
​
What this data demonstrated was that students attempted to use vocabulary in most cases, but students who had a better grasp of the meaning of the vocabulary were able to use it correctly in context. Three support structures for increasing understanding of vocabulary in context that I have implemented over the past two units are "vocabulary call-outs," "vocabulary sentences," and writing prompts. Vocabulary call-outs are a call-and-response activity where students hear me say the vocabulary word out loud and then practice saying it out loud themselves. Vocabulary sentences involve small groups of students creating sentences that use the vocabulary word in context without redefining the term; as the groups discuss their sentences, I monitor and ask two different groups to share out their sentence for each vocabulary word. This aligns with my closing-the-gap goals of intentionally providing space for students to listen, speak, and understand the context of a vocabulary word relating to science. The writing prompt takes a sample free-response question that connects the previous unit to the current unit so that students can actively bridge their understanding and create connections between the information learned.
​
At the end of the unit, there are mastery assignments that provide opportunities for students to construct paragraphs and responses utilizing their vocabulary. This assignment was analyzed with the same set of students to compare their growth. The high-performing students had a median of 5 attempted uses and 4 correct uses of vocabulary from the module. The middle-performing students had a median of 7 attempted uses and 6 correct uses. The low-performing students do not have data because, at the time, they had not submitted their mastery assignment. Given these data, the high-performing students' medians stayed constant while the middle-performing students saw growth in both their attempts and their correct use. This indicates that the language supports I implemented in class aided their understanding and language skills, specifically benefiting the middle-performing group of students.
​
As a result of this inquiry cycle, I want to focus on increasing submission rates for assignments, with an emphasis on more effectively supporting my multilingual learners (MLLs) and students with learning differences (SWLDs), so that I can determine the effectiveness of the current language supports and adjust accordingly.
​
Inquiry Cycle 2:
CSTP - Designing Student-Centered Learning
Universal Design for Learning (UDL)
Standard - 4 CSTP
Planning Instruction and Designing Learning Experiences for All Students
Element 4.4
Planning instruction that incorporates appropriate strategies to meet the learning needs of all students
​
Dates: 15 December 2022 - 2 February 2023
DRIVING Question
​
How can I increase submission rates for lower-performing students, students with learning differences, and multilingual learners, so that I can better identify gaps in their understanding and needed supports in the classroom?
​
How did I plan to close the gap?
​
I plan to incorporate intentional monitoring of student progress as they answer the questions on assignments, as well as monitoring within specific chunks of time during the block. The outcomes of this monitoring will be recorded and analyzed to understand the impact on student success.
​
Reflection
​
While focusing on increasing submission rates, I also wanted to implement more monitoring of progress through assignments to see if there was a disconnect between completing work in class and submitting it after a work day. The image below illustrates the data on progress through an assignment during a monitored work day. For example, for the Module 8 Land & Water mastery assignment, the image shows the question number each student reached on Thursday, from 0 (no questions completed) to 5 (all 5 questions completed); N/A represents students who were absent or had a meeting during class and so did not have the full time to work on their assignment. The breakdown of the number of questions completed and on-time submissions is as follows:
​
5 Questions: 10 students; 7 on time
4 Questions: 6 students; 4 on time
3 Questions: 8 students; 7 on time
2 Questions: 14 students; 5 on time
1 Question: 8 students; 5 on time
0 Questions: 12 students; 1 on time
N/A (absent): 2 students; 2 on time
With these numbers in mind, I also tracked a lab assignment that demonstrated mastery of the concepts discussed in class about wastes created by humans. The breakdown of questions completed and on-time submissions is as follows:
​
4 Questions: 29 students; 25 on time
3 Questions: 17 students; 14 on time
2 Questions: 3 students; 3 on time
1 Question: 2 students; 0 on time
0 Question: 2 students; 0 on time
N/A (absent): 7 students; 4 on time
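Tallying the two breakdowns above gives an overall on-time rate for each assignment. This is a minimal sketch that simply sums the counts listed (total students, on-time submissions, per question-count bucket):

```python
# Buckets map questions-completed -> (total students, on-time submissions),
# copied from the two breakdowns above.
module8 = {5: (10, 7), 4: (6, 4), 3: (8, 7), 2: (14, 5), 1: (8, 5), 0: (12, 1), "N/A": (2, 2)}
lab     = {4: (29, 25), 3: (17, 14), 2: (3, 3), 1: (2, 0), 0: (2, 0), "N/A": (7, 4)}

def on_time_rate(buckets):
    """Return (on-time submissions, total students, on-time percentage)."""
    total = sum(n for n, _ in buckets.values())
    on_time = sum(t for _, t in buckets.values())
    return on_time, total, round(100 * on_time / total, 1)

print("Module 8:", on_time_rate(module8))  # (31, 60, 51.7)
print("Lab:     ", on_time_rate(lab))      # (46, 60, 76.7)
```

The tallies make the contrast concrete: the lab assignment had a noticeably higher on-time rate than the Module 8 mastery assignment.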
​
​
Initially looking at this information, I found that students were able to make progress during class on the assignment, but not everyone who completed questions submitted the assignment on time. The consistent monitoring was beneficial because it helped students stay on track with making progress on the assignment in class and revealed a different gap. Monitoring which questions students had completed created space to ask more pointed questions of each student to determine whether they needed more specific assistance or needed to build their comprehension. It also demonstrated that there might be a different set of barriers to submission that are not directly related to classroom management.
Inquiry Cycle 3:
CSTP - Engagement Strategies for Learning
Standard - 1 CSTP
Engaging and Supporting All Students in Learning
Element 1.5
Promoting critical thinking through inquiry, problem solving, and reflection
​
Dates: 9 February 2023 - 15 March 2023
DRIVING Question
​
How will intentional monitoring and prompting impact student submission rates and demonstrations of mastery assignments?
How do I plan to close the gap?
​
I plan to incorporate intentional monitoring of student progress through answering the questions on assignments as well as specifically monitoring and checking in with students about their thought progress for the assignment. The outcomes of this monitoring will be recorded and analyzed to understand the impact on student success.
​
Reflection​
​
This data was collected from students' most recent submissions of their mastery assignment. In total, there are 60 students in AP Environmental Science. For this deadline, 48 students turned their assignment in on time, which is an increase from the previous module's submission rate. The assignment had 3 questions that involved reading and analyzing articles, as well as a request to draw conclusions from the information that was discussed in class.
​
During class, I monitored each student while circulating the classroom. This enabled me to prompt students if they were stuck on answering questions and needed direction for their writing or clarification of a topic. At the end of both blocks of AP Environmental Science, the totals were as follows:
3 Questions: 8 Students
2 Questions: 25 Students
1 Question: 20 Students
N/A: 7 Students; the student was not in class or had a meeting during that time.
​
To assess mastery demonstration by students, there was a rubric aligned to the concepts covered in class, as detailed by the College Board for the unit. Most of the standards were scored as one point; some of the more complex standards were scored as two points. Within the scoring, there was a 5-mark scale:
​
​
0 = no marks; not mentioned
1 = few marks; mentioned briefly, not defined
2 = some marks; mentioned briefly, partial definition (incomplete or wrong)
3 = half marks; mentioned and defined
4 = most marks; mentioned, defined, and partially explained
5 = full marks; mentioned, defined, and fully explained
​
​
Overall, a shortened mastery assignment with clear monitoring allowed students to submit their assignment on time while also asking for clarification during class. Turning to their demonstrations of mastery: the average score in the grade book was 11.5 points out of 15 points, or 76.7%. This indicates that while there were more submissions of the assignment, there is still a gap in demonstrating mastery of the information. Overall, students were able to demonstrate most of their understanding.
​
Inquiry Cycle 4:
CSTP - Assessing for Learning
Standard - 5 CSTP
Assessing Students for Learning
Element 5.4
Using assessment data to establish learning goals and to plan, differentiate, and modify instruction
​
Dates: 13 April 2023 - 11 May 2023
DRIVING Question
​
How can the data I collect from assessments inform my adjustment of instruction to prepare students for taking the APES exam?
How do I plan to close the gap?
​
I plan to provide students with real-time feedback based on the results of the College Board AP Prep progress check quizzes as well as time in class to review concepts from previous units paired with the progress check quizzes. With this information, I plan to illustrate to students where their gaps of understanding are so they can target those before taking the AP exam.
​
Reflection​
​
The image below illustrates all progress checks taken by students who took the AP Environmental Science exam. There were, in total, 9 units to review, each with a progress check to complete. Some of the units had two parts because there was more information to cover. The key for the color-coding is based on the percentage of correctly answered questions.
​
Dark Green is 75%-100% and equivalent to a 4 on the exam
Light Green is 50%-74.99% and equivalent to a 3 on the exam
Light Yellow is 25%-49.99% and equivalent to a 2 on the exam
Yellow is 0%-24.99% and equivalent to a 1 on the exam
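The color key's thresholds can be written as a small helper. This is just a restatement of the key above in code form; the boundary handling (which band a score of exactly 75% falls into, for example) is an assumption:

```python
def ap_equivalent(pct):
    """Map a progress-check percentage (0-100) to the color key's AP-score equivalent."""
    if pct >= 75:    # dark green
        return 4
    if pct >= 50:    # light green
        return 3
    if pct >= 25:    # light yellow
        return 2
    return 1         # yellow

print(ap_equivalent(80), ap_equivalent(62.5), ap_equivalent(30), ap_equivalent(10))  # 4 3 2 1
```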
​
The scores are sorted by progress check number, number of questions answered correctly, and percentage received. There are different numbers of scores for each progress check because not all students who were taking the AP exam were able to finish the progress checks by the exam date. Most students were able to take the progress checks, which allowed them to focus their studying and target their gaps or incomplete understandings.
​
When monitoring the results of the progress checks in preparation for the AP Environmental Science exam, I was able to analyze students' trajectories for the topics within these units. As I looked through my students' results, it became clearer which topics students struggled with, not only collectively but also individually. This platform allows for more insight into students' understandings and their ability to apply their knowledge to practice exam questions.
It also suggested that this type of practice would be beneficial as a pre-quiz before the unit and a post-quiz at the end, after learning the information. The pre-quiz would give time for instruction to be adjusted to incorporate information that bridges students' background knowledge and the new content. The post-quiz would give insight into how they built their understanding and whether there is a need to work reteaching into the next unit or to rework a lesson in the future. Through this inquiry cycle, I learned more about how to use data to inform choices about instruction and reteaching in the classroom. It made data-informed teaching feel more accessible, and I will be using this practice in the future to ensure that I am supporting my students as effectively as possible.
​