Monday, August 29, 2016

Final Instrumentation (Effective Learning Teacher Survey for STEM-related Outputs)

The Effective Learning Teacher Survey (ELTS) was co-designed by C-PEER and doctoral candidates/researchers as a way to study comparative perspectives on teaching and learning in public schools. For STEM-related outputs, survey constructs were adapted from the Teaching and Learning International Survey, examining STEM-foundational thinking and instructional activities through the lens of teachers. Researchers wanted to examine how STEM-foundational thinking is perceived and implemented in elementary schools and classrooms. Teacher survey questions were split into two survey blocks, and question order was randomized in order to improve the survey’s validity and reliability. The use of two survey blocks decreased the time required of participants, so that the survey could be completed in 20 minutes or less. Each ELTS block (see Fig. 15) is approximately 40 questions. Survey questions related to STEM-foundational thinking used a five-point Likert scale, where 1 is “Almost Never” and 5 is “Almost Daily” (see Fig. 15). Teachers who took the survey were de-identified through the use of a participant-constructed, confidential identification code, which allowed researchers to connect responses across schools without collecting any identifiable personal information. Subjects accessed instructions for completing the survey at a specified URL. The results of the survey were compiled and analyzed by C-PEER.

These features of the ELTS need to be taken into account when interpreting the results. For example, while teacher responses offer insight into the culture of a building, they are still subjective data points, akin to interviewing individual teachers. C-PEER and the researchers took great care in designing this instrument to ensure that the data are reliable and valid across elementary schools.

Effective Learning Leader Survey

The Effective Learning Leader Survey (ELLS) was also co-designed by C-PEER and doctoral candidates/researchers as a way to better understand the role that participation in teacher leadership networks plays in supporting and retaining effective teachers in urban schools. Researchers wanted to understand how opportunities for collaboration and leadership (within and beyond the classroom) can increase teacher efficacy and effectiveness for STEM-foundational thinking, while improving the retention of highly effective teachers. The ELLS was designed to be completed in less than 20 minutes and is approximately 30 questions (see Fig. 16). ELLS questions related to STEM-foundational thinking and instructional activities used a four-point Likert scale (1 is “Not at all true,” and 4 is “Very much like my school”), as well as text-response and multiple-choice items. The ELLS was also administered online using the Qualtrics Survey Software. Participants who took the survey were de-identified through the use of a participant-constructed, confidential identification code, which allowed researchers to connect responses across schools without collecting any identifiable personal information. Subjects accessed instructions for completing the survey at the provided URL. The results of the survey were compiled and analyzed by C-PEER.

Student Perception Survey

The Student Perception Survey (SPS) is designed to provide important feedback regarding teacher behaviors and the classroom environment. SPS results can point to strengths and opportunities for growth in teachers’ pedagogical practice. I focused primarily on how students perceived STEM-foundational instructional activities in the classroom. The survey (see Table 1) comprises 30 questions and can be administered in 45 minutes. According to the school district’s LEAP Handbook, “the SPS is administered once per year in the late fall to students in grades 3-12,” so that administrators and teachers can use results from the survey to make adjustments to instructional practices (LEAP, 2015). Responses are scored on a four-point Likert scale, where 1 is “Never” and 4 is “Always.” For this focused study, the responses for items under the SPS construct of Facilitates Learning were analyzed. The results of the survey were compiled by the school district. Results for the participating schools, provided by the district, were analyzed for variance and correlation by C-PEER. The STEM-foundational thinking questions were derived from a variety of sources (see Technical Report) (Seidel et al., 2016).

LEAP Framework for Effective Teaching

To analyze STEM-foundational thinking, researchers examined results for teachers in participating schools on ten indicators of the Framework for Effective Teaching (LEAP, 2015). From the Framework for Effective Teaching Observation Domain, these include: 
  • LE.1: Demonstrates knowledge of, interest in, and respect for diverse students’ communities and cultures in a manner that increases equity (Positive Classroom Culture and Climate).
  • LE.3: Implements high, clear expectations for students’ behavior and routines (Effective Classroom Management).
  • I.1: Clearly communicates the standards-based content-language objective(s) for the lesson, connecting to larger rationale(s) (Masterful Content Delivery).
  • I.2: Provides rigorous tasks that require critical thinking with appropriate digital and other supports to ensure students’ success (Masterful Content Delivery).
  • I.3: Intentionally uses instructional methods and pacing to teach the content-language objective(s) (Masterful Content Delivery).
  • I.4: Ensures all students’ active and appropriate use of academic language (Masterful Content Delivery).
  • I.5: Checks for understanding of content-language objective(s) (High-Impact Instructional Moves).
  • I.6: Provides differentiation that addresses students’ instructional needs and supports mastery of content-language objective(s) (High-Impact Instructional Moves).
  • I.7: Provides students with academically-focused descriptive feedback aligned to content-language objective(s) (High-Impact Instructional Moves).
  • I.8: Promotes students’ communication and collaboration utilizing appropriate digital and other resources (High-Impact Instructional Moves).
Teachers are evaluated on a four-point rubric, where 1 is “Not Meeting” and 4 is “Distinguished.” The results of these evaluations for teachers in the district were compiled by the district and analyzed for each participating school. Researchers ran school-by-school ANOVA scale checks and analyzed possible correlations.
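The school-by-school ANOVA checks described above can be sketched as follows. The rubric scores and school groupings here are hypothetical stand-ins, not actual evaluation data, and a production analysis would use a statistics library to obtain p-values; this minimal version shows only how the F statistic compares between-school to within-school variance.

```python
def one_way_anova(groups):
    """Pure-Python one-way ANOVA: returns (F statistic, df_between, df_within)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: how far each school's mean sits from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: score spread inside each school
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical rubric scores (1 = "Not Meeting" ... 4 = "Distinguished") for
# teachers at three schools; the actual data were the district's compiled
# LEAP evaluations for all seven participating schools.
easley   = [3, 4, 3, 3, 2, 4]
banneker = [2, 3, 2, 3, 3, 2]
spikes   = [1, 2, 2, 3, 1, 2]

f_stat, df_b, df_w = one_way_anova([easley, banneker, spikes])
print(f"F({df_b}, {df_w}) = {f_stat:.2f}")  # a large F suggests school means differ
```

A large F relative to the critical value for (df_between, df_within) would indicate that mean indicator scores differ across schools more than chance alone would predict.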
Figure 15. Sample questions from teacher survey

Figure 16. Sample questions from leader survey
Survey Item | SPS Category
My teacher listens to me. | Supports Students
My teacher helps me understand my mistakes so that I can do better next time. | Facilitates Learning
My teacher makes sure that the class rules are clear. | High Expectations of Students
My teacher makes learning interesting. | Facilitates Learning
In my teacher's class, I have to work hard. | High Expectations of Students
My teacher explains what we are learning and why. | Facilitates Learning
My teacher ignores me (reverse-coded). | Supports Students
My teacher wants me to think about things I learn and not just memorize them. | Facilitates Learning
My teacher encourages me to share my ideas. | Facilitates Learning
My teacher makes sure that we all treat each other with respect. | High Expectations of Students
My teacher helps me learn new things. | Facilitates Learning
My teacher uses examples in class that I understand. | Facilitates Learning
I like the way my teacher treats me. | Supports Students
In my teacher's class, we learn to correct our mistakes. | Facilitates Learning
My teacher hurts my feelings. | Filtering Use Only (not used in scoring)
My teacher checks to make sure I understand. | Facilitates Learning
In my teacher's class, I have to think hard about the work I do. | High Expectations of Students
My teacher believes in me. | Supports Students
My teacher makes sure that students do what they're supposed to be doing. | High Expectations of Students
My teacher only accepts my best effort. | High Expectations of Students
My teacher is good at explaining things that are hard to understand. | Facilitates Learning
I get bored in my teacher’s class. | Filtering Use Only (not used in scoring)
My teacher explains things in different ways. | Facilitates Learning
My teacher makes sure that students in this class behave well. | High Expectations of Students
In my teacher's class, I have to explain my answers. | Facilitates Learning
My teacher is nice to me when I need help. | Supports Students
My teacher makes sure I do my best in school. | High Expectations of Students
The rules in my teacher's class are fair. | Supports Students
My teacher knows when the class does not understand. | Facilitates Learning
My teacher cares about me. | Supports Students

Table 1. Questions from student perception survey with coding categories
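The scoring logic implied by Table 1 can be sketched as follows. The item keys, responses, and the small construct grouping are illustrative assumptions, not actual SPS data; the sketch shows only the two mechanics the table implies, flipping reverse-coded items and averaging items within a construct such as Facilitates Learning.

```python
def reverse_code(value, scale_max=4):
    """Flip a Likert response so higher always means more favorable (1<->4, 2<->3)."""
    return scale_max + 1 - value

# Hypothetical responses on the 4-point scale (1 = "Never" ... 4 = "Always").
responses = {
    "listens_to_me": 4,              # Supports Students
    "ignores_me": 1,                 # Supports Students (reverse-coded)
    "makes_learning_interesting": 3, # Facilitates Learning
    "checks_understanding": 4,       # Facilitates Learning
}
reverse_coded = {"ignores_me"}

# Flip reverse-coded items before any construct-level scoring.
scored = {item: reverse_code(v) if item in reverse_coded else v
          for item, v in responses.items()}

# Construct score = mean of the (recoded) items mapped to that construct.
facilitates = ["makes_learning_interesting", "checks_understanding"]
construct_mean = sum(scored[i] for i in facilitates) / len(facilitates)
print(construct_mean)  # 3.5
```

Items flagged “Filtering Use Only” in Table 1 would simply be excluded from the construct lists before averaging.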

Monday, August 22, 2016

Decisions about Qualitative Coding

Groups of researchers worked with C-PEER to code archival documents for specified constructs, including: (a) STEM equity; (b) teachers’ use of available time (TAUT); (c) English Language Learning (ELL) supports; and (d) professional learning structures with coaching for Culturally Relevant Pedagogy (CRP). Taking guidance from Speer and Basurto’s (2012) calibration of qualitative data as sets for qualitative comparative analysis (QCA), researchers compared elementary schools systematically while trying to “give justice to within-case complexity” (Rihoux & Ragin, 2009; Speer, 2012, p. 156). Our research team decided not to perform a full qualitative comparative analysis, though the approach (thinking about how to define conditions related to outcomes) was an important part of our analysis framework. I specified the types of documents I wanted accessed, and designed some initial codes based on expectations from the current literature. I used those initial codes with my co-researchers and then refined them, performing multiple check-ins and corrections to ensure inter-coder agreement. Agreement was arrived at informally through code trials, discussions, and revision of disagreements. This reflexive method of developing codes across coders allowed me to plan, code, monitor, and adjust throughout our coding sessions.
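The informal inter-coder agreement described above can also be quantified with a chance-corrected statistic such as Cohen’s kappa. The code assignments below are hypothetical examples using our construct labels, not actual coding data; the function itself is a standard pure-Python implementation.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders over the same documents."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of documents both coders labeled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement: chance overlap given each coder's label frequencies.
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum(counts_a[l] * counts_b[l] for l in labels) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codes two researchers assigned to ten archival documents,
# using the construct labels from our codebook.
a = ["STEM", "TAUT", "ELL", "CRP", "STEM", "TAUT", "STEM", "ELL", "CRP", "STEM"]
b = ["STEM", "TAUT", "ELL", "CRP", "TAUT", "TAUT", "STEM", "ELL", "STEM", "STEM"]
print(round(cohens_kappa(a, b), 2))  # 0.72
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is the level our code trials and revision cycles were aiming for.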

Data Collection Tools

We recruited seven elementary schools in an urban Colorado school district to participate in this study (N=7). Within these schools, we surveyed 186 teachers and educational leaders, 105 of whom (n=105) formed our sample subset. Schools were chosen using the Colorado Department of Education’s School Performance Framework. The seven participating schools ranked as follows: three “Performance Level” schools, two “Improvement Level” schools, one “Priority Improvement Level” school, and one “Turnaround Level” school. Using a range of elementary schools (multi-site) allowed researchers to analyze STEM-foundational thinking and instructional activities as a comparative case study. Results from student perception surveys and leader and teacher surveys (quantitative data) were analyzed against student achievement and school academic growth data from the schools’ Unified Improvement Plans (UIP), relevant trend data (qualitative data), and archival structure data (e.g.: school schedules and team and committee workflows). These include observational data from classroom visits and curriculum and course-schedule archives. In collaboration with C-PEER, researchers followed a short-cycle, iterative approach to research, working in collaborative learning teams. C-PEER and doctoral candidates formed research teams to co-design and create survey items.

Using Qualtrics Survey Software, we administered the effective learning communities teacher survey by sending links to building principals, who disseminated them to their staff. This allowed the greatest number of teachers to participate while remaining anonymous. Teachers were randomly assigned to take one of two versions of the Qualtrics survey. This had the simultaneous effect of increasing the total number of items surveyed and keeping the survey short; we wanted teachers to stay engaged throughout the survey. The questions on each survey were similar in context (e.g.: STEM) but randomized in order. For example, we grouped survey questions into nine areas: (a) lesson preparation; (b) staff collaboration; (c) student use of feedback and reflection on learning; (d) academic work relevant to students; (e) teacher access to instructional resources; (f) teachers engaging students in problem solving; (g) students participating in problem-solving activities; (h) teacher engagement with students’ families; and (i) students showcasing mastery of academic content. This design supported our collection of reliable and valid data.
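The two-block design can be sketched as follows. The item names, block split, and assignment logic are illustrative assumptions (the actual instrument was built and randomized inside Qualtrics); the sketch shows only the idea of splitting an item bank into two ~40-item blocks, assigning each teacher one block, and shuffling item order.

```python
import random

# Hypothetical item bank: two blocks of ~40 items each, as in the ELTS design.
item_bank = [f"item_{i:02d}" for i in range(80)]
random.seed(2016)                     # fixed seed so the block split is reproducible
random.shuffle(item_bank)
block_a, block_b = item_bank[:40], item_bank[40:]

def assign_survey(teacher_id):
    """Assign a teacher one of the two blocks, with item order shuffled.

    Seeding on the anonymous participant code makes the assignment
    reproducible without storing any identifying information.
    """
    rng = random.Random(teacher_id)
    block = rng.choice([block_a, block_b])
    return rng.sample(block, len(block))  # randomized item order within the block

survey = assign_survey("teacher_001")
print(len(survey))  # 40
```

Seeding on the participant-constructed code mirrors the study’s de-identification scheme: the same code always yields the same survey version, yet no roster of names is needed.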

Quantitative Data Analysis

The teacher response rates for the Effective Learning Teacher Survey ranged from 37% to 100%. With regard to STEM-foundational thinking, teachers at each elementary school responded at the following rates: Annie Easley: 84%; Benjamin Banneker: 100%; Richard Spikes: 39%; Aprille Ericsson: 81%; Mae Jemison: 50%; Shirley Jackson: 37%; and Elijah McCoy: 50%.

Monday, August 15, 2016

Methodology (Mixed Methods Comparative Cases)

In collaboration with the Center for Practice Engaged Education Research (C-PEER), I analyzed available and relevant trend data for participating schools and school districts; archival data (e.g.: school schedules and team and committee workflows); data from common teacher and leader surveys developed for the project (including questions asking teachers about 21st-century teaching methods identified in the extant literature); and extant data from the schools’ district about students’ perceptions of school and evaluation data about teachers. We focused on understanding “effective learning community” systems through the lens of STEM-foundational thinking. Effective school learning communities, whether inside the classroom, among teachers, or in relationship to school leadership, work together to support or hinder a STEM mindset in students, especially students of color. We collected data to help schools understand how school structures and resources (e.g.: time, curriculum/instructional programs, equity of access, procedures), climate (e.g.: trust, leadership systems, STEM culture in buildings), and personal aspects (e.g.: teachers’ efficacy, student persistence and motivation) intersect. While each of these has been the subject of focused, disconnected studies, our collaborative approach with C-PEER brought data from each element of the system together for a comprehensive look at the interacting factors for improving access and opportunity to STEM curricula.

For our triangulation research design, we used a mixed-methods, multi-site comparative case study, employing both quantitative and qualitative processes, in order to measure STEM-foundational thinking and instructional activities at the elementary school level. In some instances, statistical conclusions were limited by small sample size, but we received a very high response rate from school teachers and leaders. Where survey data did not lend themselves to standard quantitative analysis, they could still be considered as qualitative findings. Specifically, I am answering: (a) What elementary school structures support students in STEM curricular areas? (b) Do these supports differ for sub-groups of students, i.e., students of color, students in poverty, and English language learners? (c) What are the components of elementary STEM opportunities to learn that foster interest, participation, and academic success in STEM content areas, especially for marginalized students of color? The modified cultural historical activity theory (CHAT) framework, which undergirds this study, allows us as researchers to triangulate our data points and understand which STEM-oriented activities (object) and goal-directed actions lead to STEM-foundational thinking. Activity systems analysis is a method that helps to capture multi-mediational processes in human activity (Engeström, 1987, 1999). For example, in researching elementary school structures that give students of color STEM access and opportunity, we need to capture information that will inform us about teachers’ mediational processes.

A triangulation design is the best methodological choice for ascertaining what practices effective learning organizations have in place for recruiting and engaging students of color in STEM curricula. Prior research has focused on separate lines of inquiry (e.g.: STEM perspectives, STEM frameworks, Critical Race Theory, Culturally Responsive Education). By using a triangulation, mixed-methods comparative case study design, we were able to modify an established framework (CHAT), collect several quantitative and qualitative data points, and analyze elementary schools for STEM-foundational thinking. The framework also surfaces tensions within an activity system; for example, something in the rules/policies corner may conflict with students’ ability to take on the role of a collaborative peer with others in their classroom. Researchers chose participating schools in an urban public school district based on a range of academic performance, using the Colorado Department of Education (CDE) School Performance Framework (SPF) to identify elementary schools based on student achievement and student growth in the 2015-2016 academic school year (ASY).

Monday, August 8, 2016

Planned Study Design

Our proposed research design was a mixed-methods, multi-site comparative case study using both quantitative and qualitative processes. Combining quantitative and qualitative methods allowed for a more complete analysis of the research questions and findings (Tashakkori & Teddlie, 1998), as well as providing a broader basis for generalization of results (Simons, 1996). For example, this study intended to answer: (a) What elementary school structures support students in STEM curricular areas? (b) Do these supports differ for subgroups of students, i.e., students of color, students in poverty, and English language learners? (c) What are the components of elementary STEM opportunities to learn that foster interest, participation, and academic success in STEM content areas, especially for marginalized students of color? Our mixed-methods approach was guided by these research questions, and “ultimately reflect[ed] a value of both subjective and objective [STEM] knowledge” (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2003). We planned to use data sources centered on (a) recruiting schools to develop a focused priority research agenda; (b) conducting trend analyses of participating schools; and (c) a collaborative analysis of the research questions and recommendations for further action. It is important to note that mixed-methods studies, such as this one, are strengthened when research teams are composed of individuals from a variety of disciplines (Simons, 1996). This allows researchers to engage in a “mixed methods way of thinking” while discussing “different ways of seeing, interpreting, and knowing” about the data (Sammons, 2010, p. xi). Our C-PEER research team includes doctoral candidates from various K-12 backgrounds, including expeditionary learning, professional development, distributive leadership, and coaching teams.

          Planned instrumentation. The C-PEER team planned to use instruments organized around specific constructs. For example, teacher surveys would measure the shared values and vision of teachers, as well as support conditions and relationships within the building. We also intended to collect information from both teachers and administrators using the TELL (Teaching, Empowering, Leading, and Learning) Colorado surveys. These anonymous instruments would allow researchers to assess teaching conditions within school buildings, as well as throughout participating districts. Since these surveys were designed to support school and district improvement planning, as well as to inform policy decisions, our hope was that they would be highly reliable and valid measures. It was our intention to design a study that would yield high-quality evidence for educators and school districts.

Monday, August 1, 2016

Chapter III. Mixed Method Comparative Case Study Design

Presently, there is no established agreement on the proper methodology for integrating STEM in elementary schools. However, I felt that interdisciplinary STEM education would be most successful at the elementary level, because students spend most of their academic day with the same teacher in all content areas. Research indicates that traditionally underserved, minoritized populations of students who engage in problem-based learning activities (e.g.: STEM-foundational thinking, instructional activities, and assessments) exhibit increased performance in their overall academic achievement, critical thinking skills, and cooperative learning strategies (Cole, 1995; Tharp et al., 2004). Therefore, we invited seven urban elementary schools from a large urban district to participate in research designed to understand the school systems that may support STEM-foundational thinking and activities. We focused on the elementary school level because it is in these early academic years that students find their interests in STEM either helped or hindered. Our research analyzed school performance and process data (both quantitative and qualitative), and each school received a report designed to help it incorporate the findings into school improvement efforts. Seven elementary schools agreed to participate. Participation in this project was intended to simultaneously help individual schools learn with and from other sites engaged in similar work. Led by the Center for Practice Engaged Education Research (C-PEER) at the CU Denver School of Education and Human Development, this technical assistance brought research expertise to support school improvement efforts. It helped fill gaps in local staff time, bridge challenges in accessing performance and process data, and provide access to additional resources and learning from other school sites.