
 

 

 

 

 

 

How Are Our Graduates Teaching? 

Looking at the Learning Environments in Our Graduates' Classrooms

 

 

 

 

 

 

A paper presented at the annual meeting of the

Association for the Education of Teachers in Science

Akron, OH

January 2000

 

 

 

 

 

 

 

 

 

Bruce Johnson

University of Arizona

Teaching and Teacher Education

Room 719, College of Education

P. O. Box 210069

Tucson, AZ 85721-0069

Phone: (520) 621-7889

Fax: (520) 621-7877

Email: brucej@email.arizona.edu

 

and

 

Robert McClure

St. Mary’s University

Winona, MN
Introduction

 

As teacher educators, we have numerous opportunities to observe our science education students teaching and to talk with them about their teaching.  This happens almost exclusively before they become full-time teachers.  We rarely have opportunities to see them teach and to talk with them about their teaching once they leave us for classroom teaching positions.  This study is part of a larger effort to help us improve our science teacher education programs by extending our relationships beyond our preservice programs.

 

The classroom learning environment, sometimes referred to as the educational environment or the classroom climate, is the social atmosphere in which learning takes place.  Fraser (1994) regards these learning environments as "the social-psychological contexts or determinants of learning."  Several studies have indicated that the classroom learning environment is a strong factor in determining and predicting students' attitudes toward science (Lawrence, 1976; Simpson & Oliver, 1990).  Talton and Simpson (1987) argued that classroom learning environment was the strongest predictor of attitude toward science in all grades.

 

How are our graduates really teaching once they leave us?  SciMathMN funded the Teacher Research Network (TRN) in an attempt to find some answers to that question.  Five teacher education institutions in Minnesota participated in the first year of studies of recent graduates teaching in K-12 schools.  Researchers from the five institutions have been meeting for joint planning and processing and have been pooling data to construct a more complete picture of our graduates' teaching.  In addition to interviewing teachers and observing them teach, an existing learning environment instrument was used to capture the perceptions of both teachers and their students.  This paper describes the use of that instrument.

 

Instrument

            The Constructivist Learning Environment Survey (CLES) was developed "… to enable teacher-researchers to monitor their development of constructivist approaches to teaching school science…" (Taylor, Dawson, & Fraser, 1995, p. 1).  Originally developed by Peter Taylor and Barry Fraser at Curtin University of Technology in Perth, Australia (Taylor, Fraser, & Fisher, 1993), the CLES consisted of 28 items, seven each in four scales: autonomy, prior knowledge, negotiation, and student-centredness.  The instrument was later revised to incorporate a critical theory perspective because "… our ongoing research program had revealed major socio-cultural constraints (e.g., teachers acting in accordance with repressive cultural myths of cold reason and hard control) that worked in concert to counter the development of constructivist learning environments" (Taylor, et al., 1995, p. 2). 

 

            The revised CLES that was used in the present study (see Appendix A) consists of 30 items, six each in five scales (see Table 1).  Rather than having items from different scales mixed together throughout the instrument, items in this version are grouped by scale.  In addition, there is only one item that is negatively worded.  The items attempt to reveal teachers' perceptions of the learning environment in their classrooms.  There are versions for both science and for math as well as for teachers and for students.  All four versions were used in the present study. 

 


 

Table 1

 

Constructivist Learning Environment Survey (CLES) Scale Descriptions

 

Personal Relevance -

"… concerned with the connectedness of school science to students' out-of-school experiences.  We are interested in teachers making use of students' everyday experiences as a meaningful context for the development of students' scientific knowledge."

 

Uncertainty -

"… has been designed to assess the extent to which opportunities are provided for students to experience scientific knowledge as arising from theory-dependent inquiry, involving human experience and values, evolving and non-foundational, and culturally and socially determined."

 

Critical Voice -

"… assesses the extent to which a social climate has been established in which students feel that it is legitimate and beneficial to question the teacher's pedagogical plans and methods, and to express concerns about any impediments to their learning."

 

Shared Control -

"… concerned with students being invited to share control with the teacher of the learning environment, including the articulation of their own learning goals, the design and management of their learning activities, and determining and applying assessment criteria."

 

Student Negotiation -

"… assesses the extent to which opportunities exist for students to explain and justify to other students their newly developing ideas, to listen attentively and reflect on the viability of other students' ideas and, subsequently, to reflect self-critically on the viability of their own ideas."

________________________________________________________________________

Note:  All scale descriptions are taken from: Taylor, et al., 1995.

 

 

Methods

 

            In the first year of this study, the CLES was used in several different ways in an attempt to answer these questions.

 

1)  Does the CLES provide useful information about our graduates' classrooms?

2)  Are revisions needed before using the CLES in a larger study in future years?

 

The CLES was administered to a wide range of people, including inservice and preservice elementary and secondary science and math teachers and elementary and secondary science and math students (see Table 2).  Participants recorded their responses on a computer-scorable answer sheet.  They were also asked to note, directly on the survey, any items they found difficult to understand.

 

 

Table 2

 

Number of Respondents on Each Form of the CLES

 

 

CLES Form          Number of Completed Surveys

Science teacher                            290
Math teacher                                 2
Science student                            145
Math student                                39

__________________________________________

 

 

            Once the data were screened and prepared, several analyses were conducted.  The first was an exploratory factor analysis (EFA), conducted only on the responses to the science teacher form of the CLES.  This form was selected because it had the largest number of respondents, 290.

 

            As mentioned earlier, the version of the CLES used in this study has five scales.  There has been no published report of a factor analysis, however.  EFA was used in the present study to analyze the relationships between items.  Principal axis factoring (PAF) and oblimin rotation were used.  These methods were selected because there was an underlying theoretical factor structure (five scales) and because it was assumed that the scales might themselves be related within a larger construct, classroom learning environment.  The analysis was run using SPSS 8.0 for Windows.  Missing responses were replaced with the item mean.  Since five scales were hypothesized, the analysis was constrained to five factors.  The factors were then rotated to maximize their variance.  Items that loaded strongly on each factor were then examined to see if they actually fit together.
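The extraction step described above can be sketched in a few lines of code.  What follows is a generic iterated principal axis factoring routine with item-mean substitution, not the SPSS 8.0 procedure the study actually used, and it omits the oblimin rotation step for brevity.

```python
import numpy as np

def principal_axis_factoring(data, n_factors, n_iter=50):
    """Iterated principal axis factoring (unrotated loadings).

    data: (respondents x items) array; np.nan marks a missing response.
    """
    # Replace missing responses with the item mean, as in the study.
    item_means = np.nanmean(data, axis=0)
    filled = np.where(np.isnan(data), item_means, data)

    R = np.corrcoef(filled, rowvar=False)
    # Initial communality estimates: squared multiple correlations.
    h2 = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
    for _ in range(n_iter):
        R_reduced = R.copy()
        np.fill_diagonal(R_reduced, h2)      # reduced correlation matrix
        vals, vecs = np.linalg.eigh(R_reduced)
        top = np.argsort(vals)[::-1][:n_factors]
        loadings = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
        h2 = (loadings ** 2).sum(axis=1)     # updated communalities
    return loadings
```

Loadings below a cut-off such as .30 would then be suppressed before judging which items belong with which factor.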

 

            Internal consistency of the CLES as a whole and of items within each scale was also investigated by running an alpha reliability analysis.  Coefficient alpha is based on the idea that items within a factor are really measuring the same thing, in this case the scale, and to some extent all of the items in an instrument are measuring the same broad construct, in this case classroom learning environment (Pedhazur & Schmelkin, 1991).  Alpha reliability coefficients range from 0 to 1.0, with higher values indicative of higher internal consistency.  While there is no set value that must be obtained, coefficients of .70 and higher are generally considered to be adequate for this type of instrument.
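For reference, coefficient alpha can be computed directly from item scores.  The sketch below is a textbook implementation of the formula, not the SPSS routine used in the study, and the data it runs on are simulated.

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for a (respondents x items) score matrix."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated example: six items driven by one common trait should be
# internally consistent, so alpha comes out well above the .70 benchmark.
rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))
scores = trait + 0.5 * rng.normal(size=(200, 6))
print(round(cronbach_alpha(scores), 2))
```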

 

            Written comments from respondents were also read and considered.  Items which participants felt were confusing or overly redundant were noted.  Informal comments from teachers after survey administration and during interviews were also considered.

 

Finally, means of the scales were examined.   Teachers' responses were compared to those of their students to see if the teachers' perceptions differed from their students' perceptions.  Participating teachers received the results for their classrooms.
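Computing those scale means is straightforward once responses are coded numerically.  The sketch below assumes a hypothetical 5-to-1 coding of the A-E choices (the paper does not state the coding used) and the six-items-per-scale grouping of the version administered here.

```python
import numpy as np

# Hypothetical coding of the response choices (not stated in the paper):
# A "Almost Always" = 5 down to E "Almost Never" = 1.
CODES = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}
SCALES = ["Personal Relevance", "Uncertainty", "Critical Voice",
          "Shared Control", "Student Negotiation"]

def scale_means(responses):
    """Mean score per scale for 30-letter response strings (6 items/scale)."""
    coded = np.array([[CODES[c] for c in r] for r in responses], dtype=float)
    return {name: coded[:, i * 6:(i + 1) * 6].mean()
            for i, name in enumerate(SCALES)}
```

A teacher's own scale means can then be set beside the means of the class's responses to see where the two perceptions diverge.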

 

 

 


Results

            Factor loadings (see Table 3) were examined; loadings of less than .30, a commonly used cut-off, were suppressed.  Most items loaded strongly on their hypothesized scale.  There were exceptions, however.  Item six in the personal relevance scale, the instrument's only negatively worded item, had a much lower factor loading (.30) than the other items in that scale.  Item seven, in the uncertainty scale, similarly had a low factor loading (.36).  Item 18, in the critical voice scale, had similar loadings on its own scale (.41) and on the uncertainty scale (.42).  Items 22 and 24, in the shared control scale, had lower loadings (.43 and .38) than the other items in that scale. 

 


Table 3

 

EFA Factor Loadings

 

Item     PR     UN     CV     SC     SN

  1     .73
  2     .56
  3     .65
  4     .62
  5     .66
  6     .30
  7            .36
  8            .62
  9            .67
 10            .64
 11            .61
 12            .67
 13                   .65
 14                   .82
 15                   .75
 16                   .77
 17                   .55
 18            .42    .41
 19                          .87
 20                          .73
 21                          .79
 22                          .43
 23                          .78
 24                          .38
 25                                 .65
 26                                 .81
 27                                 .81
 28                                 .77
 29                                 .73
 30                                 .86

Note.  PR is Personal Relevance; UN is Uncertainty; CV is Critical Voice;

SC is Shared Control; SN is Student Negotiation.

 

 

Alpha reliability coefficients for the five scales were also examined (see Table 4).  While all of the coefficients were high enough to be considered adequate, there were items that did not contribute as heavily as others did.  Analysis revealed that eliminating items two and six from the personal relevance scale would lead to a higher alpha coefficient for that scale.  Similarly, eliminating item seven from the uncertainty scale, item 18 from the critical voice scale, items 22 and 24 from the shared control scale, and item 25 from the student negotiation scale would increase those alpha coefficients.

 

           

Table 4

 

Internal Consistency Results

 

Scale (factor)           Alpha coefficient

Personal Relevance                     .80
Uncertainty                            .81
Critical Voice                         .83
Shared Control                         .85
Student Negotiation                    .91
Overall Instrument                     .88

__________________________________

 

 

Relatively few participants chose to write comments on the survey forms.  A review of the comments that were written, however, along with conversations with some of the participating teachers, revealed two common concerns.  First, many participants felt there was too much redundancy; some questioned the need for six items asking essentially the same thing.  Second, some of the items were confusing.  The items identified as problems by the factor analysis and the internal consistency analysis were also the ones most frequently mentioned by participants.

 

            The results were presented to the TRN team at a meeting following the end of the academic year.  Discussions revealed a consensus on the answers to our two questions.  The CLES provided valuable information, and it needed to be revised to reduce redundancy and eliminate confusing items.  A decision was made to keep the five scales but reduce the number of items in each to four, in the process eliminating the single negatively worded item.  Small groups were each given one scale to revise using the science teacher form of the CLES.  

 

            For each scale, items were examined to see if there were four different aspects of the scale construct that the items addressed.  Using information from the factor analysis and the internal consistency analysis as well as a description of the scale, items that were redundant or confusing were eliminated or rewritten. 

 

            The result is a revised, more parsimonious form of the CLES (see Appendix B).  It contains 20 items, four each in five scales.  Terms that were found to be confusing were eliminated, as was the instrument's only negatively worded item.  Some items were also rewritten to ensure that different aspects of each scale's construct were addressed.

 

Discussion

            The CLES can provide valuable information about teachers' and students' perceptions of their classroom learning environment, particularly when it is used in conjunction with teacher interviews and classroom observations.  The instrument is relatively easy to administer without requiring large amounts of valuable classroom learning time. 

 

            In the second year of the TRN study, additional institutions have joined.  A small number of beginning teachers and student teachers are participating in an in-depth part of the study that includes extensive teacher interviews and classroom observations as well as the revised CLES.  A much larger number are participating by completing the CLES.  It is hoped that such a joint approach, with both depth and breadth, will provide rich information that will influence how we prepare our teachers.

 

 

 

 

Notes:

 

Further information on the Teacher Research Network can be obtained from the project co-directors:

 

Dr. George R. Davis, EdD

Regional Science Center

Moorhead State University

Moorhead, MN 56563

Phone: (218) 236-2904

Email: davisg@mhd1.moorhead.msus.edu

 

Dr. Patricia Simpson

Department of Biological Sciences

St. Cloud State University

St. Cloud, MN 56301-4498

Phone: (320) 255-3012

Email: psimpson@stcloudstate.edu

 

 

Further information on SciMathMN can be obtained from:

 

Cyndy Crist

Manager, Transforming Teacher Education Initiatives

SciMathMN

1500 Highway 36 West

Roseville, MN 55113

Phone: (651) 582-8762

Fax: (651) 582-8877

Email: Cyndy.Crist@state.mn.us

 

 

Participating Institutions and Principal Investigators:

 

Gustavus Adolphus College - Bruce Johnson

Moorhead State University - George Davis

St. Mary's University - Robert McClure

St. Cloud State University - Patricia Simpson

University of Minnesota-Duluth - Thomas Boman

 


Appendix A

 

Constructivist Learning Environment Survey - version used in the present study.

 

What Happens in My Science Classroom -- Teacher Form

 

Response choices for all items are:

Almost Always     Often     Sometimes     Seldom     Almost Never
      A             B           C            D             E

 

Learning about the world (Personal Relevance)

In this class …

1.  Students learn about the world outside of school.

2.  New learning starts with problems about the world outside of school.

3.  Students learn how science can be a part of their out-of-school life.

4.  Students get a better understanding of the world outside of school.

5.  Students learn interesting things about the world outside of school.

6.  What students learn has nothing to do with their out-of-school life.

 

Learning about science (Uncertainty)

In this class …

7.  Students learn that science cannot provide perfect answers to problems.

8.  Students learn that science has changed over time.

9.  Students learn that science is influenced by people's values and opinions.

10. Students learn that different sciences are used by people in other cultures.

11. Students learn that modern science is different from the science of long ago.

12. Students learn that science is about inventing theories.

 

Learning to speak out (Critical Voice)

In this class …

13. It's OK for students to ask me "Why do we have to learn this?"

14. It's OK for students to question the way they are being taught.

15. It's OK for students to complain about activities that are confusing.

16. It's OK for students to complain about anything that stops them from learning.

17. It's OK for students to express their opinion.

18. It's OK for students to speak up for their rights.

 

Learning to learn (Shared Control)

In this class …

19. Students help me to plan what they are going to learn.

20. Students help me to decide how well they are learning.

21. Students help me to decide which activities are best for them.

22. Students have a say in deciding how much time they spend on an activity.

23. Students help me to decide which activities they do.

24. Students help me to assess their learning.

 

Learning to communicate (Student Negotiation)

In this class …

25. Students get the chance to talk to other students.

26. Students talk with other students about how to solve problems.

27. Students explain their ideas to other students.

28. Students ask other students to explain their ideas.

29. Students are asked by others to explain their ideas.

30. Students explain their ideas to each other.


Appendix B

 

Constructivist Learning Environment Survey - revised version as a result of the present study.

 

What Happens in My Science Classroom -- Teacher Form

 

Response choices for all items are:

Almost Always     Often     Sometimes     Seldom     Almost Never
      A             B           C            D             E

 

Learning about the world (Personal Relevance)

In this class…                                                                                      

  1.  Students learn about the world in and outside of school.               

  2.  New learning relates to experiences or questions about the world in and outside of school.

  3.  Students learn how science is a part of their in- and outside-of-school lives.

  4.  Students learn interesting things about the world in and outside of school.

 

Learning about science (Uncertainty)

In this class…

  5. Students learn that science cannot always provide answers to problems.                         

  6. Students learn that scientific explanations have changed over time.                       

  7. Students learn that science is influenced by people's cultural values and opinions.

  8. Students learn that science is a way to raise questions and seek answers.                      

 

Learning to speak out (Critical Voice)

In this class…

  9. Students feel safe questioning what or how they are being taught.                                 

10. I feel students learn better when they are allowed to question what or how they are being taught.

11. It's acceptable for students to ask for clarification about activities that are confusing.

12. It's acceptable for students to express concern about anything that gets in the way of their learning.

Learning to learn (Shared Control)

In this class…

13.  Students help me plan what they are going to learn.                                                                     

14.  Students help me to decide how well they are learning.                                                

15.  Students help me to decide which activities work best for them.                                     

16.  Students let me know if they need more/less time to complete an activity.           

 

Learning to communicate (Student Negotiation)

In this class…

17.  Students talk with other students about how to solve problems.                                              

18.  Students explain their ideas to other students.                                                                               

19.  Students ask other students to explain their ideas.                                                     

20.  Students are asked by others to explain their ideas.                                                     

 

 

 

 

 

 

 

 

 

 

 

 

 


References: 

 

Fraser, B. J. (1994). Research on classroom and school climate. In D. Gabel (Ed.), Handbook of Research on Science Teaching and Learning (pp. 493-541). New York: Macmillan.

 

Lawrence, F. P. (1976). Student perception of the classroom learning environment in biology, chemistry and physics courses. Journal of Research in Science Teaching, 13, 351-353.

 

Pedhazur, E. J., & Schmelkin, L. P. (1991). Measurement, design, and analysis: An integrated approach. Hillsdale, NJ: Lawrence Erlbaum Associates.

 

Simpson, R. D. & Oliver, J. S. (1990). A summary of major influences on attitude toward and achievement in science among adolescent students. Science Education 74(1), 1-18.

 

Talton, E. L. & Simpson, R. D. (1987). Relationships of attitude toward classroom environment with attitude toward and achievement in science among tenth grade biology students. Journal of Research in Science Teaching 24(6), 507-526.

 

Taylor P., Dawson, V., & Fraser, B. (1995, April). A constructivist perspective on monitoring classroom learning environments under transformation. Paper presented at the Annual Meeting of the American Educational Research Association, San Francisco, CA.

 

Taylor, P., Fraser, B., & Fisher, D. (1993, April). Monitoring the development of constructivist learning environments. Paper presented at the Annual Convention of the National Science Teachers Association, Kansas City, MO.