Showing posts with label instrumentation. Show all posts

Monday, August 22, 2016

Decisions about Qualitative Coding



Groups of researchers worked with C-PEER to code archival documents for specified constructs: (a) STEM equity; (b) teachers' use of available time (TAUT); (c) English Language Learning (ELL) supports; and (d) professional learning structures with coaching for Culturally Relevant Pedagogy (CRP). Taking guidance from Speer and Basurto's (2012) calibration of qualitative data as sets for qualitative comparative analysis (QCA), researchers compared elementary schools systematically while trying to "give justice to within-case complexity" (Rihoux & Ragin, 2009; Speer & Basurto, 2012, p. 156). Our research team decided not to perform a full qualitative comparative analysis, though the QCA approach of defining conditions related to outcomes was an important part of our analysis framework. I identified the types of documents I wanted accessed and designed initial codes based on expectations from the current literature. I used those initial codes with my co-researchers and then refined them, performing multiple check-ins and corrections to ensure inter-coder agreement. Agreement was reached informally through code trials, discussions, and revisions of disagreements. This reflexive method of developing codes across coders allowed me to plan, code, monitor, and adjust throughout our coding sessions.
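The informal agreement checks described above can be made concrete with simple statistics. Below is a minimal Python sketch (the coder labels are hypothetical examples, not the study's data) that computes percent agreement and Cohen's kappa between two coders over a shared set of documents:

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of documents the two coders labeled identically."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    po = percent_agreement(a, b)                      # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)  # expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical labels from one code trial, using the study's constructs
coder1 = ["STEM", "TAUT", "ELL", "CRP", "STEM", "ELL"]
coder2 = ["STEM", "TAUT", "ELL", "STEM", "STEM", "ELL"]

print(round(percent_agreement(coder1, coder2), 3))  # 0.833
print(round(cohens_kappa(coder1, coder2), 3))       # 0.76
```

Percent agreement alone can look inflated when a few codes dominate; kappa corrects for agreement expected by chance, which is why it is a common companion statistic in code-trial discussions like the ones described above.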

Data Collection Tools


We recruited seven elementary schools in an urban Colorado school district to participate in this study (N = 7). Within these schools, we surveyed 186 teachers and educational leaders, 105 of whom made up our sample subset (n = 105). Schools were selected using the Colorado Department of Education's School Performance Framework. The seven participating schools ranked as follows: three "Performance Level" schools, two "Improvement Level" schools, one "Priority Improvement Level" school, and one "Turnaround Level" school. Using a range of elementary schools (multi-site) allowed researchers to analyze STEM foundational thinking and instructional activities as a comparative case study. Results from student perception surveys and leader and teacher surveys (quantitative data) were analyzed against student achievement and school academic growth data from schools' Unified Improvement Plans (UIPs) and relevant trend data (qualitative data), as well as archival structural data (e.g., school schedules and team and committee workflows). These included observational data from classroom visits and curriculum and course schedule archives. In collaboration with C-PEER, researchers followed a short-cycle, iterative approach to research, working in collaborative learning teams. C-PEER and doctoral candidates formed research teams to co-design and create survey items.

Using Qualtrics survey software, we administered the effective learning communities teacher survey by sending links to building principals, who disseminated them to their staff. This allowed the greatest number of teachers to participate while remaining anonymous. Teachers were randomly assigned to take one of two versions of the Qualtrics survey, which simultaneously increased the total number of items surveyed and kept each survey short; we wanted teachers to stay engaged throughout. The questions on each version were similar in context (e.g., STEM) but randomized in order. We grouped survey questions into nine areas: (a) lesson preparation; (b) staff collaboration; (c) student use of feedback and reflection on learning; (d) academic work relevant to students; (e) teacher access to instructional resources; (f) teachers engaging students in problem solving; (g) students participating in problem-solving activities; (h) teacher engagement with students' families; and (i) students showcasing mastery of academic content. This design supported our collection of reliable and valid data.
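The split into two survey versions with per-respondent question order can be sketched in a few lines. This is an illustrative Python sketch only, not the Qualtrics mechanism; the teacher IDs, seeds, and function names are assumptions:

```python
import random

# The nine question areas named above, abbreviated for illustration
AREAS = [
    "lesson preparation", "staff collaboration", "feedback and reflection",
    "relevant academic work", "instructional resources",
    "teachers engage problem solving", "students do problem solving",
    "family engagement", "mastery of content",
]

def assign_version(teacher_id: str, seed: int = 2016) -> str:
    """Pseudo-randomly (but reproducibly) assign a teacher to version A or B."""
    rng = random.Random(f"{seed}:{teacher_id}")
    return rng.choice(["A", "B"])

def question_order(teacher_id: str, seed: int = 2016) -> list:
    """Return the question areas in a per-respondent random order."""
    rng = random.Random(f"order:{seed}:{teacher_id}")
    order = list(AREAS)
    rng.shuffle(order)
    return order

version = assign_version("teacher-042")          # hypothetical respondent ID
order = question_order("teacher-042")
print(version in ("A", "B"), sorted(order) == sorted(AREAS))  # True True
```

Seeding the generator per respondent keeps the assignment reproducible for analysis while still balancing teachers across the two versions in expectation.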

Quantitative Data Analysis


Teacher response rates for the Effective Learning Teacher Survey ranged from 37% to 100%. With regard to STEM-foundational thinking, teachers at each elementary school responded as follows: Annie Easley: 84%; Benjamin Banneker: 100%; Richard Spikes: 39%; Aprille Ericsson: 81%; Mae Jemison: 50%; Shirley Jackson: 37%; and Elijah McCoy: 50%.
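As a quick check on the figures above, the per-school rates can be summarized in a few lines of Python (school names and percentages as reported; the mean is computed here, not stated in the original):

```python
# Response rates (%) for the Effective Learning Teacher Survey, as reported
rates = {
    "Annie Easley": 84, "Benjamin Banneker": 100, "Richard Spikes": 39,
    "Aprille Ericsson": 81, "Mae Jemison": 50,
    "Shirley Jackson": 37, "Elijah McCoy": 50,
}

low, high = min(rates.values()), max(rates.values())
mean = sum(rates.values()) / len(rates)
print(low, high, round(mean, 1))  # 37 100 63.0
```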

Monday, August 8, 2016

Planned Study Design

Our proposed research design was a mixed-methods, multi-site, comparative case study using both quantitative and qualitative processes. Combining quantitative and qualitative methods allowed for a more complete analysis of the research questions and findings (Tashakkori & Teddlie, 1998) and provided a broader basis for generalization of results (Simons, 1996). This study intended to answer: (a) What elementary school structures support students in STEM curricular areas? (b) Do these supports differ for subgroups of students, i.e., students of color, students in poverty, and English language learners? (c) What are the components of elementary STEM opportunities to learn that foster interest, participation, and academic success in STEM content areas, especially for marginalized students of color? Our mixed-methods approach was guided by these research questions and "ultimately reflect[ed] a value of both subjective and objective [STEM] knowledge" (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2003). We planned to use data sources centered on (a) recruiting schools to develop a focused priority research agenda; (b) conducting trend analyses of participating schools; and (c) collaboratively analyzing the research questions and recommendations for further action. It is important to note that mixed-methods studies such as this one are strengthened when research teams are composed of individuals from a variety of disciplines (Simons, 1996). This allows researchers to engage in a "mixed methods way of thinking" while discussing "different ways of seeing, interpreting, and knowing" about the data (Sammons, 2010, p. xi). Our C-PEER research team includes doctoral candidates from various K-12 backgrounds, including expeditionary learning, professional development, distributive leadership, and coaching teams.



          Planned instrumentation. The C-PEER team planned to use instruments organized around specific constructs. For example, teacher surveys would measure teachers' shared values and vision, as well as support conditions and relationships within the building. We also intended to collect information from both teachers and administrators using the Teaching, Empowering, Leading, and Learning (TELL) Colorado surveys. These anonymous instruments would allow researchers to assess teaching conditions within school buildings as well as throughout participating districts. Since these surveys are designed to support school and district improvement planning and to inform policy decisions, we expected them to be reliable and valid measures. We intended to design a study that would yield high-quality evidence for educators and school districts.