
MSPnet Academy Discussions

Topic: "MSPnet Academy: Students as Citizen Scientists"

MSPnet Academy Discussion
April 24th - May 8th, 2018

Presented and Moderated by: Lauren Birney and Bob Midden

This is a follow up discussion to the April 24th MSPnet Academy Webinar.
(Webinar recording now available below.)

Webinar Description:
This webinar will summarize findings from two MSP projects in which elementary and middle school students participate in real science research that is integrated into their classroom instruction.

The first project, titled Curriculum and Community Enterprise for the Restoration of New York Harbor with New York City Public Schools, involves over forty schools, eighty teachers, and 8,640 students in densely populated, low-income urban areas where resources and access to natural areas are limited. Students study New York Harbor and the extensive watershed that empties into it, and they conduct field research in support of restoring native oyster habitats, building on the existing Billion Oyster Project of the New York Harbor School. A number of partners play key roles in the project, including Pace University, the New York City Department of Education, the Columbia University Lamont-Doherty Earth Observatory, the New York Academy of Sciences, the New York Harbor Foundation, and the New York Aquarium. The project includes five interrelated components: a teacher education curriculum, a student learning curriculum, a digital platform for project resources, an aquarium exhibit, and an afterschool STEM mentoring program.

The second project, titled iEvolve with STEM, involves more than 3,000 students and 75 teachers in two medium-sized Midwestern school districts: one low-income urban district and one neighboring district with mixed demographics. Each grade level participates in a different research project that is aligned with some of the state learning standards. Lessons have been developed that integrate the students' research into all four core academic subject areas: science, math, social studies, and language arts. Professional scientists affiliated with a number of partners lead the research projects and provide scientific consultation to ensure the scientific validity of students' research and to assist with their contributions to national and international citizen science research projects.

This presentation will provide a summary of the key features of these projects and will report what has been found regarding the factors and resources that make it possible for students and teachers to participate in science research in meaningful and successful ways, as well as the outcomes and benefits of integrating this research into classroom instruction.

Discussion forums have been archived; posting is no longer available.

This topic has 6 posts, showing all.

Happy to answer questions about this presentation

posted by: Bob Midden on 4/24/2018 4:18 pm

I am happy to try to answer any questions posed about the presentation that I gave today along with Dr. Lauren Birney about student participation in citizen science research as part of classroom instruction.

I noticed that there were multiple questions during the webinar about the motivation and engagement instrument that we developed. I will post separately about that later when I have time to compose a more thorough response. I must go now to other commitments for the rest of today but intend to post soon about that instrument.

Life after funding

posted by: Brian Drayton on 4/25/2018 7:41 am

I enjoyed the presentations yesterday. One question that would be interesting to explore is that old bugbear, sustainability.
Your projects have both pulled together partnerships with many members. Each partner has its own priorities and agendas, of course, and so participating in your projects requires an allocation of time and resources (especially attention!) that NSF funding has supported during this time.
How do you see the various partners sustaining their involvement, if at all?
In a project Joni Falk and I were involved in in the 1990s, scientists, teachers, and schools collaborated intensively for a while, but the relationships mostly didn't last, though of course there was probably continued benefit accruing from the teachers' new capacities.

Sustaining the changes

posted by: Bob Midden on 4/25/2018 7:59 am

We consider sustaining the changes made in the project an important aspect of our efforts and have been devoting significant effort to that since about year 3.

We have redesigned student science research projects so that they can be continued using available materials, expertise, and support. Fortunately, there are some local agencies and organizations that include support of K-12 education as part of their mission. They are providing some of the scientific research support that the teachers and students need to continue their science research.

Some of the changes in practices that have been promoted by iEvolve have been institutionalized by the districts to some extent and at this point, in the last formal year of the project, it appears that the curriculum will largely continue to be used for the foreseeable future. I estimate that perhaps 60% of the teachers, or more, will continue conducting science research with their students at the maximum (optimum) level and perhaps another 20% will continue some type of science research activity.

We have tried to develop mechanisms for induction of new teachers.

But it is uncertain how well this will all continue after we are no longer formally and extensively involved. I think that will depend, in part, on the perception of the benefits that it provides and the opinions of school leadership.

I suggest it would be a worthwhile endeavor to observe these districts with minimal interaction over the next few years to determine the level at which the iEvolve practices are sustained, using methods designed to identify the factors that account for that. Perhaps research grants devoted to investigating sustainability are warranted?

Survey of Student Engagement and Motivation

posted by: Bob Midden on 4/25/2018 8:23 am

As I promised in my post yesterday, here is information about the instrument that we are using to assess student engagement and motivation for learning science in school.

During the spring and summer of 2013, a survey was developed to measure students' motivation to learn and engagement in science.

The survey consists of two parts: twenty-six items measure students' motivation to learn science in school, and nineteen items measure students' school science engagement.

The motivation items were developed using the expectancy-value theory of academic motivation as a framework (e.g., Wigfield & Eccles, 2000). Scales were created to measure self-efficacy and the four components of subjective task value: intrinsic value (what we also called interest), attainment value (what we also called importance), utility value (what we also called usefulness), and cost. The research team developed two scales around attainment value: one related to personal importance, and another related to societal importance. Several items were selected and modified from the Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich & De Groot, 1990), the Test of Science Related Attitudes (TOSRA; Fraser, 1981), and the science-oriented expectancy-value scales developed by Berger and Karabenick (2011). The research team wrote several items to ensure that each scale had a sufficient number of items.
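To make the subscale structure described above concrete, here is a minimal sketch of how scores on such an instrument might be computed. The subscale names follow the post; the item IDs, the item-to-scale assignments, and the 1-5 Likert responses are invented for illustration and are not the project's actual items.

```python
from statistics import mean

# Hypothetical map of motivation subscales to the item IDs they contain.
# Real assignments would come from the instrument's documentation.
SCALES = {
    "self_efficacy":       ["m01", "m02", "m03"],
    "intrinsic_value":     ["m04", "m05", "m06"],   # "interest"
    "personal_importance": ["m07", "m08"],          # attainment value
    "societal_importance": ["m09", "m10"],          # attainment value
    "utility_value":       ["m11", "m12"],          # "usefulness"
    "cost":                ["m13", "m14"],
}

def subscale_scores(responses):
    """Average a student's Likert responses within each subscale."""
    return {scale: mean(responses[item] for item in items)
            for scale, items in SCALES.items()}

# One student's made-up responses: all 3s except one item rated 5.
student = {f"m{i:02d}": 3 for i in range(1, 15)}
student["m04"] = 5
print(subscale_scores(student)["intrinsic_value"])  # mean of items m04-m06
```

Averaging within subscales rather than over the whole instrument is what lets the separate expectancy-value components (self-efficacy, interest, importance, usefulness, cost) be tracked independently.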

The majority of the engagement items were modified from the School Engagement Scale (Fredricks, Blumenfeld, Friedel, & Paris, 2005), which measures three types of engagement: behavioral, emotional, and cognitive. Most of the modifications were made so that the items ask about students' engagement in learning *science* in school rather than school in general. Similar wording changes were made to the motivation items as well. The intention was to better detect changes in students' engagement in learning science, since that is the focus of the iEvolve project and thus the change of interest.

The initial draft of the survey was read and critiqued by an external reviewer who specializes in student motivation. Then nine students (three from each of grades three to five) were interviewed using a cognitive interview process in which they completed the survey while talking aloud about what they were thinking. Items were adjusted based on student responses in the cognitive interviews to better align students' interpretation of the survey items with the research intent.

The survey was administered 2-5 weeks after the beginning of the school year and in the spring within the last 6 weeks of the school year depending, in part, on when teachers felt they could fit it into their classroom instruction.

This instrument was administered by all iEvolve teachers who teach science, to all of their students. It was also administered by teachers in the schools selected for comparison that are not participating in iEvolve.

To select comparison schools, we examined information provided by teachers enrolled in iEvolve and used that to select control teachers with similar characteristics from schools that are similar to those of the experimental (iEvolve) schools. Several key characteristics such as school rating (e.g., Academic Watch, Excellent), environment (e.g., urban, rural), size, and distribution of student demographic variables (e.g., race, socioeconomic status) were considered. We targeted our comparison teacher recruitment efforts at schools that best matched the key characteristics of the experimental schools. Since these key characteristics, especially the distribution of student demographic variables, tend to predict student outcomes such as academic achievement, starting our recruiting efforts at the school level helped to ensure the student variables studied in iEvolve (i.e., engagement, motivation, attitudes, and science content knowledge) are comparable at the outset for students in the experimental and comparison groups.

Once comparable schools were identified, teachers within those schools who teach grades 3-5 (for the first cohort) and 6-8 (for the second cohort) were asked to provide relevant demographic information and the same baseline data (e.g., attitudes and beliefs about science and teaching science; content preparedness; instructional practices) as the experimental teachers. Teachers were offered an incentive to provide this initial information. Using these data, we selected teachers for the comparison group who best matched the characteristics of the teachers in the experimental group. (These comparison teachers' students are considered comparison students.) This established a common baseline for both experimental and comparison teachers from which changes are measured. This also helps to control for extraneous "teacher effects" on student outcomes. In other words, experimental and comparison teachers were matched on their initial beliefs about teaching and instructional practices, so they hypothetically were teaching students in similar ways at the beginning of the project. Differences between teachers and students in the experimental and comparison groups should then be attributable to the knowledge and resources teachers acquire during iEvolve.
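The school-matching step described above can be sketched in code. This is a minimal illustration of the idea, not the project's actual procedure: rank candidate comparison schools by how closely their characteristics match an experimental school's. The field names, the example values, and the equal weighting of characteristics are all assumptions made for illustration.

```python
def match_score(experimental, candidate):
    """Lower is better: count categorical mismatches and add a scaled
    difference for each numeric characteristic."""
    score = 0.0
    for key in ("rating", "environment"):      # categorical: match or not
        score += 0 if experimental[key] == candidate[key] else 1
    for key in ("size", "pct_low_income"):     # numeric: relative difference
        score += abs(experimental[key] - candidate[key]) / max(experimental[key], 1)
    return score

# A hypothetical iEvolve school and two candidate comparison schools.
experimental = {"rating": "Academic Watch", "environment": "urban",
                "size": 400, "pct_low_income": 75}
candidates = [
    {"name": "School A", "rating": "Excellent", "environment": "rural",
     "size": 250, "pct_low_income": 20},
    {"name": "School B", "rating": "Academic Watch", "environment": "urban",
     "size": 380, "pct_low_income": 70},
]
best = min(candidates, key=lambda c: match_score(experimental, c))
print(best["name"])  # School B: same rating and environment, closer numerics
```

The same scoring idea extends to the teacher-level matching step: replace the school characteristics with teachers' baseline beliefs and practices and again take the lowest-scoring candidates.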

This process of comparison group construction was used in the first year to select a comparison group matching the first iEvolve cohort (third through fifth grade teachers), and again in the third year to select a group matching the second iEvolve cohort (sixth through eighth grade teachers). Throughout the project, teachers and students in the comparison group completed the same evaluation and research instruments as the experimental group.

The comparison group will help to determine how teachers and students "naturally" change (or don't change) with regard to our variables of interest over the course of the project.

The results of analysis of the survey data to date have not indicated a substantially higher level of motivation or engagement of iEvolve students relative to comparison students. Yet iEvolve teachers have uniformly reported that they have observed substantial increases in those aspects of student behavior. Thus we are not highly confident that this instrument is the most effective way to assess student motivation and engagement, especially for younger students. We are examining the results in greater detail and depth to try to discern factors that may account for this discrepancy in teacher reports and survey responses and will report our findings later.

In the meantime, we are not confident that this survey is the best instrument to assess student motivation and engagement to learn science in these grade levels. Nevertheless, we welcome inquiries about it and also welcome interest in collaborating on a further investigation of the validity and utility of this instrument.


posted by: Betsy Stefany on 4/26/2018 7:57 am

I am following your research, since systemic measurement is a challenge in projects that extend beyond the classroom. Measuring motivation and engagement is an area our project also approaches with interest.

We try to keep that aspect fueled by allowing engagement to be self-monitored and expressed through students' choice of role. This yields evidence of actual performance, as well as data from the tools involved, which draws motivation along with it.

Group dynamics have a sustainable energy when that system can be captured. I find the use of visual evidence critical: the words in surveys, or even in verbally recounting an experience, are formalized, often difficult for students, and shaped by prior expectations when asked of teachers or group leaders.

Teaching teachers and leaders to use visual capture was an important part of their professional development in our project, as they practice physical positioning to listen and watch rather than direct.

Visual documentation is increasingly important in projects that practice STEM, as the accurate display of data relies on trusting the full system of progress.

Will watch for your further reports.

Visual Observation of Engagement

posted by: Bob Midden on 4/30/2018 12:55 pm

I can see the benefit of teachers learning to observe and assess student engagement, and I recognize that this may be superior to a survey as a means of assessing engagement. Do you have references or documentation you could provide about how that is best done?