2.8 Data Analysis
Candidates model and facilitate the effective use of digital tools and resources to systematically collect and analyze student achievement data, interpret results, communicate findings, and implement appropriate interventions to improve instructional practice and maximize student learning. (ISTE 2h)
Artifact: Data Overview
I created the Data Overview assignment in the ITEC 7305: Data Analysis & School Improvement course. This was an extensive artifact to create, largely because of the sheer amount of data I had to disaggregate. For this assignment I had to locate and review both demographic and student achievement data for my school as measured by the statewide accountability test. The assignment had to include three to five years of longitudinal data for comparison. Since this course was taken during the summer, access to this data was limited, so I had to rely on data from our previous CRCT scores.
I began this project by communicating with my System Testing Coordinator, Michael Huneke. During our phone conference I learned what data was available to me without having to petition our Associate Superintendent for access. With his help I began collecting, analyzing, and interpreting the available data from the GOSA State Report Card. I initially wanted to use our Measure of Academic Progress (MAP) scores, but we have only been using this testing instrument for three years. Unfortunately, the third year's data was not available, and I would have needed to petition our Associate Superintendent for access, which could take weeks. I collected CRCT data for grades 3-5, but while analyzing this data I decided to focus on just fourth grade mathematics. In the Data Overview I tried to create a concise summary of our student body and how our demographics and individual subgroups compare to those of our school system. Within this Data Overview I also included our achievement results, highlighting our strengths and weaknesses and our areas of improvement and celebration, and I encouraged my staff to discuss possible reasons behind the data, the instructional implications, and what steps we should take next. To model and implement our school's progress, I had to become proficient in Excel and in disaggregating data.
While I was creating this Data Overview I felt overwhelmed and excited all at once. I enjoyed drilling down into the data to understand our students' performance. I felt that I presented the data in a logical manner to help my audience understand what they were going to see and what I hoped to accomplish with the project. Reflecting on this assignment, I believe I could have refined my graphs further. All of my graphs contained most of the required information except the number of students tested. I included this information in the notes section, but not on the actual graph. This number would help the audience understand whether 30% of economically disadvantaged (ED) students represents many students or only a few.
I shared this Data Overview with my administration and our School Improvement Specialists this year. The data was shared, along with our MAP scores, with the grade level teams, and they were able to make some necessary changes to our instructional planning and the interventions available for our students. This year our data PLCs are focused on increasing the percentage of students reaching typical and high growth in both reading and math. The impact can be assessed by reviewing our current School Improvement Plan and our students' academic progress this year.