MSP Workshop: Assessment of Student Learning
February 1-3, 2004

THE NATIONAL ACADEMIES
Advisers to the Nation on Science, Engineering, and Medicine

NATIONAL RESEARCH COUNCIL
NATIONAL SCIENCE RESOURCES CENTER

MATH/SCIENCE PARTNERSHIPS WORKSHOP
ASSESSMENT OF STUDENT LEARNING

The Keck Center, 500 Fifth St. NW, Room 100
Washington, DC
February 1-3, 2004

TABLE OF CONTENTS

AGENDA
Sunday, February 1 | Monday, February 2 | Tuesday, February 3


February 1 (Sunday)

1:00   Opening Remarks
Jay Labov, National Research Council (NRC)
Biosketch | Powerpoint Presentation

Sally Goetz Shuler, National Science Resources Center
Biosketch | Transcript

Martin Orland, NRC Center for Education
Biosketch | Transcript

Elizabeth VanderPutten, National Science Foundation Representative
Biosketch | Transcript of Presentation

Overview of Workshop:
Melvin George, President Emeritus, University of Missouri, and Chair of the NRC Mathematics and Science Partnerships Committee
Biosketch | Transcript of Presentation

1:30   Assessment as a Primary Means for Promoting Student Learning
Lorrie Shepard, Dean of the School of Education and Chair of the Research and Evaluation Methodology Program, University of Colorado, Boulder
Biosketch | Powerpoint Presentation | Transcript of Presentation | Transcript of Final Questions and Discussion*

2:30   What Assessment Issues Are MSPs Currently Confronting?
Panel: Two MSP teams discuss assessment decisions
Panel Members:
Deborah Poland, County Science Coach
Wendy Williams, County Math Coach
Transcript of Presentation
MSP: Allegheny Intermediate Unit:
Nancy R. Bunt, PI
J. Kevin Kelly, MSP Coordinator
Transcript of Presentation
MSP: Stark County, OH:
Michael Kestner, Specialist, U.S. Dept. of Education
Transcript of Presentation

3:30   An Assessment Exercise
Andy Porter, Patricia and Rodes Hart Professor of Educational Leadership and Policy, and Director of the Learning Sciences Institute, Vanderbilt University

Tests send messages to teachers and students about which content is most important to learn. Tests can be more or less aligned with state and professional content standards, and instruction can be more or less aligned with tests and/or standards. Because tests consist of collections of items (tasks), tools have been developed that provide powerful analytic descriptions of the content messages of tests and standards, along with tools for describing the alignment among instruction, assessment, and standards. These tools will be discussed, and participants will use them to analyze the content of a test.
Biosketch | Powerpoint Presentation | Transcript of Presentation | Transcript of Final Questions and Discussion* | Handout 1 | Handout 2
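
As a rough illustration of the kind of content-analysis tool described above, one published formulation of an alignment index compares the proportions of items falling in each content cell (topic by cognitive demand) of two documents. The sketch below is a hypothetical, minimal implementation of that idea; the cell labels, item counts, and function names are invented for illustration and are not taken from the workshop handouts.

# Hypothetical sketch of a content-alignment calculation in the spirit of the
# tools described above. Cell labels and counts are invented for illustration.

from collections import Counter

def proportions(cell_counts):
    """Convert raw counts of items per content cell into proportions."""
    total = sum(cell_counts.values())
    return {cell: n / total for cell, n in cell_counts.items()}

def alignment_index(doc_a, doc_b):
    """1 - (1/2) * sum of absolute differences of cell proportions.
    Returns 1.0 for identical content distributions, 0.0 for disjoint ones."""
    a, b = proportions(doc_a), proportions(doc_b)
    cells = set(a) | set(b)
    return 1 - 0.5 * sum(abs(a.get(c, 0.0) - b.get(c, 0.0)) for c in cells)

# Each cell is a (topic, cognitive demand) pair; counts are how many test items
# (or standards objectives) fall in that cell.
test_items = Counter({("fractions", "perform procedures"): 12,
                      ("fractions", "solve non-routine problems"): 3,
                      ("geometry", "memorize"): 5})
standards = Counter({("fractions", "perform procedures"): 6,
                     ("fractions", "solve non-routine problems"): 8,
                     ("geometry", "memorize"): 4,
                     ("measurement", "perform procedures"): 2})

print(f"alignment index: {alignment_index(test_items, standards):.2f}")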

4:45   Break

5:00   Debriefing of Assessment Exercise and Participant Discussion
Andy Porter, Lorrie Shepard, and the MSP team panel
Transcript of Presentation and Discussion

5:45   Adjourn

6:00   Dinner

7:15   Meeting: Facilitators, Presenters, NRC and NSRC Staff


February 2 (Monday)

8:00   Breakfast

8:30   Connecting Cognition and Assessment
James Pellegrino, Co-Director, Center for the Study of Learning, Instruction, & Teacher Development, University of Illinois at Chicago

This presentation will focus on critical connections between cognition and assessment, based on the research findings summarized in several NRC reports. Special focus will be given to the conception of assessment offered in the Knowing What Students Know report and its connection to major ideas about the nature of cognition and learning described in the How People Learn report. Issues regarding the design and use of assessment at the classroom, school, district, and state levels will be highlighted. A special attempt will be made to provide examples of quality assessment design and practice in math and science and to discuss how they exemplify the concepts and principles about assessment laid out in Knowing What Students Know. Topics covered in this presentation should complement and help set the stage for the plenary and breakout sessions that follow.
Biosketch | Powerpoint Presentation | Transcript of Presentation | Transcript of Final Questions and Discussion*

10:00   Break

10:15   Equity in Assessment
William Trent, Professor of Education Policy Studies and Sociology, University of Illinois, Urbana-Champaign

This session will focus primarily on the NRC report, "High Stakes: Testing for Tracking, Promotion, and Graduation." Discussion will include fairness in testing, equity issues, and accommodations in the testing process. Dr. Trent will stress the need for access to high quality, advanced courses for all students and how "tracking" fits into the assessment picture.
Biosketch | Powerpoint Presentation | Transcript of Presentation | Transcript of Final Questions and Discussion*

11:15   Classroom Assessment of Learning: What Does It Mean for MSPs?
Lorrie Shepard, Dean of the School of Education and Chair of the Research and Evaluation Methodology Program, University of Colorado, Boulder
Powerpoint Presentation | Transcript of Presentation | Transcript of Final Questions and Discussion*

12:15   Lunch

1:00-2:00   Implications for MSPs of Large-Scale Assessments
Marge Petit, Senior Associate, National Center for the Improvement of Educational Assessment

Under the requirements of NCLB, each state must develop a set of mathematics grade-level expectations for grades 3-8 and one grade at the high school level. These grade-level expectations are to form the foundation for aligned state-level assessments in grades 3-8 and one high school grade, to be administered beginning in the 2005-2006 school year. In this session participants will explore the implications of these requirements for MSPs and how "bridging the gap" between large-scale and classroom assessment may play out differently for different Mathematics and Science Partnerships.
Biosketch | Powerpoint Presentation | Transcript of Presentation | Transcript of Final Questions and Discussion*

2:15   Break-outs:

  • C-TOOLS: Concept-map Tools for Online Learning in Science
    Diane Ebert-May, Professor, Plant Biology, Michigan State University

    C-TOOLS is a web-based concept mapping tool (Java applet) with an automatic scoring function (Robo-grader). C-TOOLS enables students in large (or small) introductory science classes to visualize their thinking online and to receive immediate formative feedback. The value of concept maps is that they provide visual evidence of how students understand the relations and organization among many principles, evidence not easily obtained from multiple-choice questions or even extended responses. In this workshop we will use C-TOOLS as an assessment tool and discuss how to apply it in the classroom to motivate students to reflect on, revise, and share their thinking with peers as an extension of the learning process. (A rough, illustrative sketch of one common approach to scoring concept maps follows the list of break-out sessions below.) Participants are encouraged to explore the website. C-TOOLS web URL: http://ctools.msu.edu
    [Transcript Not Available]


  • Ongoing Classroom Assessment of Mathematics: Assessing for Understanding
    Marge Petit, Senior Associate, National Center for the Improvement of Educational Assessment

    In this session participants will examine a model of ongoing classroom assessment in development by the Vermont Mathematics Partnership. The model draws on findings and recommendations in Adding It Up: Helping Children Learn Mathematics (2001), Knowing What Students Know: The Science and Design of Educational Assessment (2001), Assessment in Support of Instruction and Learning: Bridging the Gap Between Large-Scale and Classroom Assessment (2003), How People Learn: Brain, Mind, Experience, and School, Expanded Edition (2000), Jim Minstrell's work on facets of learning, and cognitive research on the development of specific mathematical concepts. The model is an evolving example of "bridging the gap" between large-scale and classroom assessment in support of student learning. Participants will explore the model, examine student work from early one-on-one interviews, and have an opportunity to provide feedback on the developing model.
    Handout 1 | Handout 2 | Handout 3 | Handout 4 | Handout 5 | [Transcript Not Available]


  • Information Technology: Implications for Assessment of Learning
    Ellen Mandinach, Associate Director for Research, EDC Center for Children & Technology

    This session will deal with some of the pressing issues facing researchers and practitioners concerning the assessment and evaluation of the impact of educational technology on teaching activities and student learning. NCLB, the What Works Clearinghouse, and the U.S. Department of Education's Institute of Education Sciences' focus on scientifically based research all have a major impact on how researchers and school personnel can effectively assess the impact of emerging technologies on student learning. The session will touch upon the intersection of technology, instruction, and assessment. It will also draw upon a recent paper commissioned for the National Education Technology Plan that traces twenty years of educational technology policy documents and recommendations. Particular attention will be paid to how researchers and practitioners can work together to meet the challenges of NCLB without compromising school infrastructure and ethics. The session will be informal, consisting of interactive exchanges of ideas among the attendees.
    Biosketch | Handout 1 | Handout 2 | Handout 3 | [Transcripts Not Available]


  • Implications of Formative Assessment for Curriculum Development and Professional Development
    Lorrie Shepard, Dean of the School of Education and Chair of the Research and Evaluation Methodology Program, University of Colorado, Boulder [Transcripts Not Available]

  • Formative Assessment: Small Strategies that Work in a Big Way
    Mary Colvard, Consultant, New York State Education Department, and Steering Committee Member

    Formative assessment is a tool too often overlooked by educators. Quick and easy formative assessment strategies appropriate for use in both classroom and professional development settings will be modeled during this session. Participants will work through a variety of formative assessment activities designed to provide feedback on student learning, followed by a discussion of when and how to use several of the techniques. Participants will receive materials and strategies to take home and implement in a classroom setting.
    Biosketch | Handout 1 | Handout 2 | Handout 3 | Handout 4 | Handout 5 | [Transcripts Not Available]
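
As flagged in the C-TOOLS break-out above, the sketch below illustrates one common, generic approach to automatically scoring a concept map: counting how many of an instructor's propositions (concept-link-concept triples) appear in a student's map. This is a hypothetical, minimal illustration, not the C-TOOLS Robo-grader algorithm; all names and example data are invented.

# Minimal sketch of proposition-based concept-map scoring. This is NOT the
# C-TOOLS Robo-grader algorithm (which is not described in the session text);
# it only illustrates comparing a student map to an instructor map.

def propositions(edges):
    """A concept map as a set of (concept, link label, concept) propositions."""
    return {(a.lower(), label.lower(), b.lower()) for a, label, b in edges}

def proposition_score(student_edges, instructor_edges):
    """Fraction of the instructor's propositions the student's map contains."""
    student, instructor = propositions(student_edges), propositions(instructor_edges)
    return len(student & instructor) / len(instructor)

instructor_map = [("photosynthesis", "produces", "glucose"),
                  ("photosynthesis", "requires", "light"),
                  ("glucose", "is used in", "respiration")]
student_map = [("photosynthesis", "produces", "glucose"),
               ("photosynthesis", "requires", "water")]

print(f"proposition score: {proposition_score(student_map, instructor_map):.2f}")  # 0.33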

3:45   Break

4:00   MSP Teams: Team Networking and Time with Facilitators

5:15   Adjourn and Dinner (on your own)

5:30   Feedback Panel - 11th Floor Collaboration Room


February 3 (Tuesday)

8:30   Breakfast

9:00   Concept-mapping: A Research Tool
Diane Ebert-May, Professor of Plant Biology, Michigan State University

Concept mapping can be a formative assessment tool and a research tool. This session will focus on the implications of concept-mapping as a research tool. Dr. Ebert-May will discuss some of the work currently in progress and possible applications.
Powerpoint Presentation | Handout | Transcript of Presentation | Transcript of Final Questions and Discussion*

10:30   Planning for Change in Assessment
Mark Kaufman, Co-Director of the Center for Education Partnerships, TERC

In this session MSP teams will use the knowledge gained from the workshop to plan for the challenge of making changes in local assessment policies and practices. Teams will examine a specific set of conditions in their partnerships that can support or hinder the process of making needed changes.
Biosketch | Powerpoint Presentation | Transcript of Presentation | Transcript of Final Questions and Discussion* | Handout

11:45   Committee Reflections and Participant Discussion
Transcript of Discussion

12:30   Box Lunch and Adjourn
