The Impact of Systemic Reform Efforts in Promoting Constructivist Approaches in High School Science

 

Micheal Dryden, Dallas Public Schools, Texas, USA

Barry J. Fraser, Curtin University of Technology, Australia

 (Paper presented at the annual meeting of the American Educational Research Association, San Diego CA, April 1998)

 

            During the past four years, a large school district has been part of an Urban Systemic Initiative funded by the National Science Foundation (NSF).  Over the same four years, the National Key Centre for School Science and Mathematics at Curtin University in Western Australia has assisted the district in monitoring instructional changes in the learning environment.  This paper is a compilation of those efforts and focuses on the impact of the reform initiative in changing high school science instruction towards a more constructivist approach.

 

Perspectives or Theoretical Framework

 

            The recent national reform movements in the United States are grounded in a constructivist approach to learning.  That is, students should find personal relevance in their studies, share control over their learning, feel free to express concerns about their learning, view science as ever changing, and interact with each other to improve comprehension (Taylor, Dawson, & Fraser, 1995; Taylor, Fraser, & Fisher, 1997). However, reform is often difficult to implement in large systems with inertia set by years of tradition and entrenched beliefs.  How teachers in this district view their students, and how they believe that students learn, is currently unknown.  It is known, however, that the majority of teachers do not think that understanding how students learn is important (Dryden, 1996).  Jakubowski and Tobin (1991) show that teachers who embrace a realist epistemology "emphasize technical interests and adopt strategies that controlled what students were to learn and how they were to engage".

            In keeping with recommendations that qualitative and quantitative methods be combined in research on learning environments (Fraser & Tobin, 1991; Tobin & Fraser, 1998), over 250 mathematics classroom observations were documented to confirm that a realist approach was the overwhelming practice of teachers in this district (Dryden, 1997).  While fewer science teachers were observed, the same trend towards a realist teaching approach was found (Dryden & Fraser, 1998).  This paper presents evidence that the realist tradition prevails and that efforts at reform towards constructivism are limited.

 

Methods and Techniques

 

            Reform in a large system is complex, and multiple sources of data and techniques must be used.  These sources should all converge to give the same consistent story.  The evaluation of the USI reform implementation had three implementation phases across the district.  For the purposes of this study, three high schools per implementation phase were selected.  The selection for Phase I was fixed, as only three high schools had been participants in the USI since 1993-1994.  Phase II and Phase III schools were selected to be representative of the district.

            In this study, two methodologies to assess the implementation of the reform initiative were used.  First, pretest and posttest administrations of the Constructivist Learning Environment Survey  (CLES; Taylor, Dawson & Fraser, 1995; Taylor, Fraser, & Fisher, 1997) were given to high school science students at the beginning of the USI and again three years later.  Second, science classes were observed to determine the nature of instruction, with a learning environment checklist being used by the observers.  Teacher interviews and teacher surveys were also conducted, but the results obtained from the application of these techniques are beyond the scope of this paper.

            The data in this large study involved nine high schools, pretest administration of the CLES to 440 students, posttest administrations of the CLES to 351 students, and 29 classroom observations by five evaluators.

 

The Field of Learning Environments and the Constructivist Learning Environment Survey

 

Field of Classroom Environment Research

            Over the previous two decades or so, considerable interest has been shown internationally in the conceptualization, assessment and investigation of perceptions of psychosocial characteristics of the learning environment of classrooms at the elementary, secondary and higher education levels (Fraser, 1986, 1994, 1998; Fraser & Walberg, 1991). Use of student perceptions of classroom environment as predictor variables has established consistent relationships between the nature of the classroom environment and student cognitive and affective outcomes (McRobbie & Fraser, 1993). Furthermore, research involving a person-environment fit perspective has shown that students achieve better where there is greater congruence between the actual classroom environment and that preferred by students (Fraser & Fisher, 1983).

            Studies involving the use of classroom environment scales as criterion variables have revealed that classroom psychosocial climate varies between Catholic and government schools (Dorman, Fraser & McRobbie, 1997). Researchers and teachers have found it useful to employ classroom climate dimensions as criteria of effectiveness in curriculum evaluation because they have differentiated revealingly between alternative curricula when student outcome measures have shown little sensitivity (Fraser, Williamson & Tobin, 1987). Research comparing students' and teachers' perceptions showed that, first, both students and teachers prefer a more positive classroom environment than they perceive as being actually present and, second, teachers tend to perceive the classroom environment more positively than do their students in the same classrooms (Fraser, 1994). In small-scale practical applications, teachers have used assessments of their students' perceptions of their actual and preferred classroom environment as a basis for identification and discussion of actual-preferred discrepancies, followed by a systematic attempt to improve classrooms (Fraser & Fisher, 1986).

 

Background to the CLES

            The Constructivist Learning Environment Survey (CLES) enables researchers and teacher-researchers to monitor the development of constructivist approaches to teaching school science and mathematics. The original version of the CLES (Taylor & Fraser, 1991) was based largely on a psychosocial view of constructivist reform that focused on students as co-constructors of knowledge but which remained blind to the cultural context framing the classroom environment. Although the original CLES was found to contribute insightful understandings of classroom learning environments and to be psychometrically sound with Australian high school students in science and mathematics classes, as well as in a number of studies in other countries (Lucas & Roth, 1996; Roth & Bowen, 1995; Roth & Roychoudhury, 1993, 1994; Watters & Ginns, 1994), its theoretical framework supported only a weak program of constructivist reform.

            Our ongoing research program revealed major cultural constraints that can counteract the development of constructivist learning environments, such as powerful cultural myths rooted in the histories of science or mathematics and of schooling (Taylor, in press; Milne & Taylor, 1996). Because of the importance of teachers and students becoming critically aware of how their teaching and learning roles are being unduly restrained by these otherwise invisible forces, we decided to redesign the CLES to incorporate a critical theory perspective on the cultural framing of the classroom learning environment.

            As part of the design process, the viability of the new CLES for monitoring constructivist transformations to the epistemology of school science and mathematics classrooms was examined.  Trialing early versions of the CLES in two classroom-based collaborative research studies enabled critical scrutiny of both the conceptual soundness and psychometric structure of the questionnaire. During our interpretive research inquiry (Erickson, 1986), we visited classrooms as participant-observers, observed teaching and learning activities, analysed curriculum documentation, and interviewed teachers and students. In these two case studies, we investigated both the way in which students made sense of responding to CLES items and the way that CLES data enabled us to make sense of our observations of the classroom environment (Taylor, Dawson & Fraser, 1995; Taylor, Fraser & White, 1994).

            These qualitative studies led to important modifications to both the content and format of the CLES. Some of these changes signal a departure from traditional practices in learning environment research. First, by rejecting items whose wording was conceptually complex and by minimising the use of negatively-worded items, we produced a more economical and less conceptually complex 30-item version comprising five six-item scales. Second, we created a more meaningful context for responding to items by abandoning the traditional cyclic format for items in learning environment instruments and grouping items in their respective scales, each with a ‘user-friendly’ title. Third, in order to focus student thinking on the ‘immediate’ classroom learning environment, a prompt was included, “In this science class . . .”.

 

Description of the CLES

            Each scale of the new version of the Constructivist Learning Environment Survey (CLES) was designed to obtain measures of students' perceptions of the frequency of occurrence of five key dimensions of a critical constructivist learning environment:

 

·      Personal Relevance focuses on the connectedness of school science to students' out-of-school experiences, and on making use of students' everyday experiences as a meaningful context for the development of students' scientific knowledge.

·      Uncertainty involves the extent to which opportunities are provided for students to experience scientific knowledge as arising from theory-dependent inquiry involving human experience and values, and as evolving, non-foundational, and culturally and socially determined.

·      Critical Voice involves the extent to which a social climate has been established in which students feel that it is legitimate and beneficial to question the teacher's pedagogical plans and methods, and to express concerns about any impediments to their learning.

·      Shared Control is concerned with students being invited to share with the teacher control of the learning environment, including the articulation of learning goals, the design and management of learning activities, and the determination and application of assessment criteria.

·      Student Negotiation assesses the extent to which opportunities exist for students to explain and justify to other students their newly developing ideas, to listen attentively and reflect on the viability of other students' ideas and, subsequently, to reflect self-critically on the viability of their own ideas.

 

            The CLES contains 30 items altogether, with six items in each of the five scales. The response alternatives for each item are Almost Always, Often, Sometimes, Seldom, and Almost Never. A complete copy of the CLES is contained in Appendix A.

 

Validity and Reliability of the CLES

 

            As part of our evaluation of the NSF Urban Systemic Initiative, the CLES was used with a large sample of approximately 1,600 students in 120 grades 9-12 science classes to establish district-wide baseline information (Dryden & Fraser, 1996).  Table 1 shows the internal consistency reliability (Cronbach alpha coefficient) obtained for each CLES scale for the Dallas sample, with the individual student as the unit of analysis.  Reliability values ranged from 0.61 to 0.89.
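The Cronbach alpha coefficients reported in Table 1 are computed from the item-by-item response matrix of each six-item scale. A minimal sketch of the calculation follows; the response data here are simulated for illustration and are not the actual Dallas sample.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated six-item scale with responses coded 1-5 (illustrative only)
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))              # each student's underlying perception
responses = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(200, 6))), 1, 5)
print(round(cronbach_alpha(responses), 2))
```

Alpha rises when the six items covary strongly relative to their individual variances, which is why it is reported per scale rather than for the whole 30-item instrument.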

 

Table 1.  Item Factor Loadings and Scale Alpha Reliabilities

                        Factor Loadings

Item      Personal     Uncertainty   Critical    Shared      Student
          Relevance    of Science    Voice       Control     Negotiation
Q1          0.58
Q2          0.54
Q3          0.37
Q4          0.66
Q5          0.66
Q6
Q7
Q8                       0.56
Q9                       0.54
Q10                      0.38
Q11                      0.54
Q12                      0.40
Q13                                    0.52
Q14                                    0.64
Q15                                    0.62
Q16                                    0.65
Q17                                    0.70
Q18                                    0.79
Q19                                               0.72
Q20                                               0.66
Q21                                               0.79
Q22                                               0.78
Q23                                               0.78
Q24                                               0.59
Q25                                                           0.44
Q26                                                           0.70
Q27                                                           0.79
Q28                                                           0.81
Q29                                                           0.74
Q30                                                           0.78

Alpha
Reliability 0.70         0.61          0.82       0.89        0.89

Only factor loadings ≥ .40 are included.
Sample size was approximately 1,600 students.

 

            The structure of the CLES was also explored using factor analysis (principal components with varimax rotation). Separate factor analyses were conducted using the individual student and the class mean as the units of analysis.  Table 1 shows that, with the student as the unit of analysis, the orthogonal structure of the CLES held up, supporting the claim that each CLES scale assesses a unique aspect of constructivism within the classroom environment.  With the exception of two items, every CLES item had a factor loading of 0.40 or higher with its a priori scale.  No item had a factor loading as high as 0.40 with any of the other four scales.
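The procedure described above, principal components followed by varimax rotation, can be sketched as below. This is a generic illustration of the technique, not the software actually used in the study, and the data are simulated rather than the real CLES responses (items are generated in blocks of six per factor, mirroring the CLES structure).

```python
import numpy as np

def pca_loadings(data, n_factors):
    """Unrotated principal-component loadings from a data matrix."""
    corr = np.corrcoef(data, rowvar=False)
    eigval, eigvec = np.linalg.eigh(corr)
    order = np.argsort(eigval)[::-1][:n_factors]     # largest eigenvalues first
    return eigvec[:, order] * np.sqrt(eigval[order])

def varimax(loadings, tol=1e-6, max_iter=100):
    """Varimax rotation of a loading matrix (Kaiser's criterion, SVD updates)."""
    p, k = loadings.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # SVD of the gradient of the varimax criterion gives the next rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - L * (L**2).sum(axis=0) / p))
        R = u @ vt
        var_new = s.sum()
        if var_new - var_old < tol:
            break
        var_old = var_new
    return loadings @ R

# Simulated data: 30 items driven by 5 factors, six items per factor
rng = np.random.default_rng(1)
factors = rng.normal(size=(400, 5))
items = np.repeat(factors, 6, axis=1) + rng.normal(scale=0.9, size=(400, 30))
rotated = varimax(pca_loadings(items, 5))
```

Because the rotation is orthogonal, each item's communality (the sum of its squared loadings) is unchanged; varimax only redistributes loadings so that each item loads mainly on one factor, which is the pattern Table 1 reports.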

            The CLES also was administered in Western Australia to 494 13-year-old students in 41 science classes from 13 schools.  The Cronbach alpha reliability coefficient for four of the CLES scales (i.e., Personal Relevance, Critical Voice, Shared Control, Student Negotiation) was above 0.80.  When factor analyses with varimax rotation were performed with the Australian data, the orthogonal structure found for the American sample (shown in Table 1) was replicated.

 

Results:  Students’ Pretest and Posttest Scores on Constructivist Learning Environment Survey

            If reform that is standards-based is truly occurring, then improvements in scores on the five CLES scales should be evident after three full years of implementation.  In this study, the original Phase I schools responded to the CLES as a pretest in 1994 and as a posttest again in 1997.  In general, physical science was taught in the ninth grade in 1994-1995 and biology was taught in the tenth grade.  At some schools, this was reversed and, in many schools, honors students skip physical science to accelerate their academic program.  As a result of the USI reform movement, integrated science has replaced physical science, although this course is still mainly designed to cover physical science concepts.  Thus, in 1997-1998, students took integrated science in the ninth grade (it is not offered in the tenth grade) and biology in the tenth grade.  Again, honors students skip integrated science and take biology in the ninth grade.

            The CLES was administered to 440 students in 1994 and 351 students in 1997.  Mean scores and t tests for differences between the two years are shown in Table 2. Scores on CLES scales were converted to a scale from 0 to 100 by scoring 0 for Almost Never, 25 for Seldom, 50 for Sometimes, 75 for Often and 100 for Almost Always.
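The 0-100 conversion, and a t statistic computed from the resulting summary statistics, can be sketched as follows. The group sizes in the example are hypothetical (the paper reports only the combined totals of 440 and 351 students), so the result is not expected to match Table 2 exactly.

```python
import math

# The five CLES response alternatives mapped onto the 0-100 metric
SCORE = {"Almost Never": 0, "Seldom": 25, "Sometimes": 50,
         "Often": 75, "Almost Always": 100}

def scale_mean(responses):
    """Mean 0-100 score across one student's responses to a six-item scale."""
    return sum(SCORE[r] for r in responses) / len(responses)

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for two independent samples, from summary statistics."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean2 - mean1) / se

print(scale_mean(["Often", "Often", "Sometimes", "Often", "Seldom", "Often"]))  # → 62.5
# Personal Relevance in biology, 1994 vs 1997 (group sizes are hypothetical)
print(round(welch_t(60.2, 18.6, 220, 56.9, 17.6, 175), 1))
```

A negative t here corresponds to a posttest mean below the pretest mean, the direction reported for Personal Relevance in Table 2.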

            Table 2 shows that, among biology students, only the Personal Relevance scale had a posttest mean which was significantly different from the pretest mean, but this represented a decline in this dimension.  For the physical science/integrated science courses, Table 2 shows there were no significant changes in scale means among any of the CLES dimensions.  This implies that student-perceived change is not occurring, at least not towards a constructivist approach to instruction.

 

TABLE 2.  Changes in Student Perceptions of Dimensions of Constructivism Between 1994 and 1997

                              Biology                  Integrated Science
CLES Scale       Year    Mean    SD      t        Mean    SD      t
Personal         1994    60.2    18.6   -2.0*     55.5    17.3    0.3
Relevance        1997    56.9    17.6             56.1    16.8

Critical         1994    64.7    22.9    0.3      65.3    21.1    0.5
Voice            1997    65.3    23.8             66.5    24.3

Uncertainty      1994    64.0    16.7   -1.5      56.6    15.0    0.6
of Science       1997    61.5    19.1             57.8    19.0

Shared           1994    32.0    25.1   -1.1      26.3    21.5    1.8
Control          1997    29.5    24.5             31.2    24.6

Student          1994    54.7    24.9    0.6      51.6    24.2    1.3
Negotiation      1997    56.1    23.8             55.1    24.0

*p<0.05
0 = Almost Never, 25 = Seldom, 50 = Sometimes, 75 = Often, 100 = Almost Always

 

            The mean scores in Table 2 indicate that the practices encompassed by all CLES scales except Shared Control were perceived by students to occur with a frequency of between Sometimes and Often.  Although this suggests that a moderate level of constructivist practices was perceived by students in 1994 (in terms of Personal Relevance, Critical Voice, Uncertainty of Science and Student Negotiation), it is noteworthy that increases in these dimensions did not occur during the three years of the USI.

            Perhaps the most striking pattern in Table 2 is the low mean scores for Shared Control for both biology and integrated science both in 1994 and 1997.  Mean scores suggest a frequency of only around Seldom for the practices encompassed by this scale.  This important aspect of constructivism clearly is largely absent in this USI.

 

Results:  Classroom Observations

 

            During the fall of 1997 and early spring of 1998, five evaluators observed 14 biology, 15 integrated science and 12 mathematics classes (a total of 41 classes) to document practices in these targeted high schools.  During these observations, an observer rating form was used to give a quick impression of the learning environment.

            The observational instrument, which is shown in Appendix B, involves the observer in rating the nine areas of (1) instructional pacing, (2) high expectations for all students, (3) informal assessment, (4) respect and equity, (5) enthusiasm for teaching, (6) meaningful instruction, (7) student-centeredness, (8) student involvement, and (9) student thinking.  Each area was rated by the observers using a three-point frequency scale consisting of the alternatives of Often, Sometimes and Not Observed.

            A factor analysis (principal components with varimax rotation) of the nine scores for each of the 41 classes yielded the two clear factors shown in Table 3.  A factor named Instructional Delivery encompasses six of the areas in the observational scheme (high expectations for all students, respect and equity, informal assessment, enthusiasm for teaching, instructional pacing, and meaningful instruction) and a factor named Learner-Centeredness covers the other three areas in the observational scheme (student-centeredness, student involvement, and student thinking).  Each of the nine areas observed had a factor loading ranging from 0.62 to 0.86 with its own factor, and a loading of less than 0.30 with the other factor.

            Table 3 also shows that the internal consistency (Cronbach alpha reliability) was 0.88 for Instructional Delivery and 0.82 for Learner Centeredness.  Overall, the data in Table 3 support the factorial validity and internal consistency reliability of the observational scheme, even though five separate researchers were involved in conducting the observations in different classrooms.

 


Table 3:  Factor Structure and Internal Consistency (Alpha Reliability) for Classroom Observation Learning Environment Instrument

                                          Factor Loading
Scale                                Instructional    Learner
                                     Delivery         Centeredness
High expectations for all students     0.81
Respect and equity                     0.77
Informal assessment                    0.76
Enthusiasm for teaching                0.71
Instructional pacing                   0.68
Meaningful instruction                 0.62
Student-centeredness                                    0.86
Student involvement                                     0.73
Student thinking                                        0.69

Alpha reliability                      0.88             0.82

Factor loadings of less than 0.30 have been omitted.

 

            A comparison was made of biology classes (N=14) with integrated science classes (N=15) with respect to scores on the two factors of Instructional Delivery and Learner-Centeredness. Table 4 shows the mean and standard deviation for each factor, together with the results of an ANOVA comparing scores for the two subjects. (Means were calculated in such a way that a mean of 100, 50 or 0 would occur when all areas belonging to a factor were scored, respectively, Often, Sometimes or Not Observed by the observers.)  According to Table 4, there is slightly more emphasis on Instructional Delivery in biology classes than in integrated science classes, and somewhat more emphasis on Learner-Centeredness (about 0.4 standard deviations) in integrated science classes than in biology classes.  However, Table 4 shows that these differences between biology and integrated science classes were not statistically significant.

 

Table 4:  Comparison of Biology and Integrated Science Classes on Observed Instructional Delivery and Learner-Centeredness

Factor          Subject               N     Mean(a)   SD      F
Instructional   Biology              14     71.4     30.4    0.6
Delivery        Integrated Science   15     68.9     26.0

Learner-        Biology              14     39.3     33.8    1.1
Centeredness    Integrated Science   15     53.3     37.4

(a) Often = 100, Sometimes = 50, Not Observed = 0

 

            Another noteworthy pattern in Table 4 is the relatively lower means for Learner-Centeredness than for Instructional Delivery, especially for biology.  The mean of 39.3 for Learner-Centeredness for biology indicates that, on average, observers thought that the three constituent areas (student-centeredness, student involvement, and student thinking) were observed less often than Sometimes in biology classes and approximately Sometimes in integrated science.  Given that Learner-Centeredness is such a key aspect of constructivism, it is noteworthy that it was observed to occur less frequently than desirable.

            On the other hand, relative to Learner-Centeredness, aspects of Instructional Delivery (e.g. pacing, high expectations, enthusiasm) were observed at reasonably high frequencies (between Sometimes and Often, with a mean of around 70 in Table 4).
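The "about 0.4 standard deviations" difference between subjects on Learner-Centeredness can be reproduced from Table 4's summary statistics with a standard pooled-SD effect size (Cohen's d); a minimal check:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d using the pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled

# Learner-Centeredness means, SDs and Ns taken from Table 4
d = cohens_d(39.3, 33.8, 14, 53.3, 37.4, 15)
print(round(d, 2))  # → 0.39, i.e. about 0.4 standard deviations
```

An effect of this size can still fail to reach statistical significance with only 14 and 15 classes, consistent with the non-significant F reported in Table 4.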

 

Conclusion

 

            This paper has attempted to evaluate the success of an Urban Systemic Initiative (USI) in terms of students’ perceptions on the Constructivist Learning Environment Survey (CLES) in 1994 and 1997, as well as external observers’ classroom observations.  Although moderate levels of the CLES dimensions of Personal Relevance, Critical Voice, Uncertainty of Science, and Student Negotiation were perceived by students in 1994, these levels did not increase during the three years of the USI.  (In fact, Personal Relevance declined significantly in biology.)

            The most striking finding from the student questionnaire survey was the low level of Shared Control (i.e., the teacher sharing with students decisions about curriculum, teaching methods and assessment) in both biology and integrated science.  Furthermore, a negligible shift in this dimension occurred between 1994 and 1997.  The low level of, and negligible shift in, Shared Control can be considered to be rooted in the system.

            The teachers are part of a state and district accountability system that threatens to fire teachers and administrators if examination scores do not increase.  The typical teacher response to this threat, whether real or perceived, has been to focus on those skills thought necessary to score well on examinations.  The shift in training generally has been from pedagogy in the general sense (i.e., how to design a lesson, how to maintain discipline) to program-specific pedagogy (i.e., how to use NSF standards-based materials).  During this training, some effort was made to incorporate research on brain theory or child development principles but, in general, the focus was on how to use the materials.  While this materials-based training is occurring, the Board of Education and the public in general are demanding that content be taught.

            ‘Content’ is viewed as fixed and there is a strong belief that learning is the acquisition of a fixed set of knowledge.  Although the system has focused on using materials, it has never attempted to focus on how students learn and how ‘content’ is dependent on the experiences of the learner.

            Although teachers often attempt to make lessons personally relevant for students, the approval of a small set of textbooks and the presence of an accountability system that measures gains using only multiple-choice examination items make the assessment system part of the problem.

            While not reported in this study, the predominant form of instruction was guided discussion, in which the teacher presents the material and guides students by asking questions.  Most of these questions were asked, and ultimately answered, by the teacher.  At no time were student-initiated questions observed, even with the teacher's prodding.  Ironically, the best example of students negotiating meaning among themselves occurred when recently immigrated Latino students of limited English proficiency sought the assistance of more English-proficient Latino students.

            The Critical Voice scale on the student questionnaire had one of the highest ratings, but it changed little between 1994 and 1997.  It has always been a tradition for students to express their opinions.  For example, within the past five years and on more than one occasion, hundreds of students have descended upon District headquarters to protest the actions of central administration.  However, the students' voice typically has not been used in a productive way in the classroom to produce more student-centered and student-involved teaching and learning.

            Based on classroom observations, moderate levels of Instructional Delivery (e.g. pacing, high expectations and enthusiasm) were found, but aspects of Learner-Centeredness were observed to occur with quite low frequencies.

 

References

 

Dorman, J., Fraser, B.J., & McRobbie, C.J. (1997). Classroom environments in Australian Catholic and government secondary schools. Curriculum and Teaching, 12(1), 3-14.

Dryden, M. (1996).  Dallas Urban Systemic Initiative:  Evaluation report, 1995-96 (REIS96-450-2, technical report).  Dallas, Texas:  Dallas Public Schools.

Dryden, M. (1997).  Dallas Urban Systemic Initiative:  Evaluation report, 1996-97 (REIS97-450-2, technical report).  Dallas, Texas:  Dallas Public Schools.

Dryden, M. & Fraser, B.J. (1996, April). Use of classroom environment instruments in monitoring urban systemic reform. Paper presented at the annual meeting of the American Educational Research Association, New York.

Dryden, M. & Fraser, B.J. (1998, April).  Awareness versus acceptance of systemic change among secondary science teachers.  Paper presented at the annual meeting of the National Association for Research in Science Teaching, San Diego, CA.

Erickson, F. (1986). Qualitative methods in research on teaching. In M.C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 119-159). New York: Macmillan.

Fraser, B.J. (1986).  Classroom environment.  London:  Croom Helm.

Fraser, B.J. (1994).  Research on classroom and school climate.  In D. Gabel (Ed.),  Handbook of research on science teaching and learning (pp. 493-541).  New York:  Macmillan.

Fraser, B.J. (1998).  Science learning environments:  Assessment, effects and determinants.  In B.J. Fraser and K.G. Tobin (Eds.), International handbook of science education (pp. 527-564).  Dordrecht, The Netherlands:  Kluwer.

Fraser, B.J., & Fisher, D.L. (1983). Student achievement as a function of person-environment fit: A regression surface analysis. British Journal of Educational Psychology, 53, 89-99.

Fraser, B.J., & Fisher, D.L. (1986). Using short forms of classroom climate instruments to assess and improve classroom psychosocial environment. Journal of Research in Science Teaching, 23, 387-413.

Fraser, B.J. & O'Brien, P. (1985).  Student and teacher perceptions of the environment  of elementary-school classrooms.  Elementary School Journal, 85, 567-580.

Fraser, B.J. & Tobin, K. (1991).  Combining qualitative and quantitative methods in classroom environment research.  In B.J. Fraser and H.J. Walberg (Eds.), Educational environments:  Evaluation, antecedents and consequences (pp. 271-292).  Oxford:  Pergamon.

Fraser, B.J. & Walberg, H.J. (Eds.) (1991).  Educational environments: Evaluation,  antecedents and consequences.  Oxford:  Pergamon.

Fraser, B.J., Williamson, J.C., & Tobin, K. (1987). Use of classroom and school climate scales in evaluating alternative high schools. Teaching and Teacher Education, 3, 219-231.

Jakubowski, E. & Tobin, K. (1991).  Teachers’ personal epistemologies and classroom learning environments.  In B.J. Fraser and H.J. Walberg (Eds.), Educational environments:  Evaluation, antecedents and consequences (pp. 201-214).  Oxford:  Pergamon.

Lucas, K.B. & Roth, W.M. (1996). The nature of scientific knowledge and student learning: Two longitudinal case studies. Research in Science Education, 26, 103-129.

McRobbie, C.J. & Fraser, B.J. (1993). Association between student outcomes and psychosocial science environments. Journal of Educational Research, 87, 78-85.

Milne, C. & Taylor, P.C. (1996, April). School science: A fertile culture for the evolution of myths. Paper presented at the annual meeting of the National Association for Research in Science Teaching, St Louis, MO.

Roth, W.M. & Bowen, G.M. (1995). Knowing and interacting: A study of culture, practices, and resources in a grade 8 open-inquiry science classroom guided by a cognitive apprenticeship metaphor. Cognition and Instruction, 13, 73-128.

Roth, W. M. & Roychoudhury, A. (1993). The nature of scientific knowledge, knowing and learning: The perspectives of four physics students. International Journal of Science Education, 15, 27-44.

Roth, W.M. & Roychoudhury, A. (1994). Physics students' epistemologies and views about knowing and learning. Journal of Research in Science Teaching, 31, 5-30.

Steffe, L.P. & Gale, J. (Eds.). (1995). Constructivism in education.  Mahwah, New Jersey: Lawrence Erlbaum Associates.

Taylor, P.C. (in press). Mythmaking and mythbreaking in the mathematics classroom. Educational Studies in Mathematics.

Taylor, P.C. & Dawson, V. (in press). Critical reflections on a problematic student-supervisor relationship. In J. Malone, W. Atweh, & J. Northfield (Eds.), The practice of postgraduate research supervision. Dordrecht, The Netherlands: Kluwer.

Taylor, P.C., Dawson, V. & Fraser, B.J. (1995, April).  Classroom learning environments under transformation:  A constructivist perspective. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.

Taylor, P.C. & Fraser, B.J. (1991, April). Development of an instrument for assessing constructivist learning environments. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Taylor, P.C., Fraser, B.J., & Fisher, D.L. (1997).  Monitoring constructivist classroom learning environments.  International Journal of Educational Research, 27, 293-302.

Taylor, P.C., Fraser, B.J. & White, L.R. (1994, April). The revised CLES: A questionnaire for educators interested in the constructivist reform of school science and mathematics. Paper presented at the annual meeting of the American Educational Research Association, Atlanta, GA.

Tobin, K. & Fraser, B.J. (1998). Qualitative and quantitative landscapes of classroom learning environments. In B.J. Fraser and K.G. Tobin (Eds.), International Handbook of Science Education (pp. 623-640). Dordrecht, The Netherlands: Kluwer.

Watters, J.J. & Ginns, I.S. (1994). Self-efficacy and science anxiety among preservice primary teachers: Origins and remedies. Research in Science Education, 24, 348-357.

 



Appendix A

Constructivist Learning Environment Survey


Directions

 

 

1.    Purpose of the Questionnaire

       This questionnaire asks you to describe important aspects of the science classroom which you are in right now. There are no right or wrong answers. This is not a test and your answers will not affect your assessment. Your opinion is what is wanted. Your answers will enable us to improve future science classes.

 

 

2.    How to Answer Each Question

       On the next few pages you will find 30 sentences. For each sentence, circle only one number corresponding to your answer. For example:

 







                                                      Almost                               Almost
In this class . . .                                   Always   Often   Sometimes   Seldom  Never

 8   The teacher asks me questions.                      5       4         3          2      1

          If you think this teacher almost always asks you questions, circle the 5.

          If you think this teacher almost never asks you questions, circle the 1.

          Or you can choose the number 2, 3 or 4 if one of these seems like a more accurate answer.

 

 

3.    How to Change Your Answer

       If you want to change your answer, cross it out and circle a new number. For example:

 8   The teacher asks me questions.                      5       4         3          2      1
     (on the printed form, the old answer is shown crossed out and the new answer circled)

 


Learning about the world

                                                      Almost                               Almost
In this class . . .                                   Always   Often   Sometimes   Seldom  Never

 1   I learn about the world outside of school.          5       4         3          2      1

 2   My new learning starts with problems
     about the world outside of school.                  5       4         3          2      1

 3   I learn how science can be part of
     my out-of-school life.                              5       4         3          2      1

 4   I get a better understanding of
     the world outside of school.                        5       4         3          2      1

 5   I learn interesting things about
     the world outside of school.                        5       4         3          2      1

 6   What I learn has nothing to do with
     my out-of-school life.                              5       4         3          2      1


Learning about science

                                                      Almost                               Almost
In this class . . .                                   Always   Often   Sometimes   Seldom  Never

 7   I learn that science cannot provide
     perfect answers to problems.                        5       4         3          2      1

 8   I learn that science has changed over time.         5       4         3          2      1

 9   I learn that science is influenced by
     people's values and opinions.                       5       4         3          2      1

10   I learn about the different sciences
     used by people in other cultures.                   5       4         3          2      1

11   I learn that modern science is different
     from the science of long ago.                       5       4         3          2      1

12   I learn that science is about inventing
     theories.                                           5       4         3          2      1


Learning to speak out

                                                      Almost                               Almost
In this class . . .                                   Always   Often   Sometimes   Seldom  Never

13   It's OK for me to ask the teacher
     "why do I have to learn this?"                      5       4         3          2      1

14   It's OK for me to question the way
     I'm being taught.                                   5       4         3          2      1

15   It's OK for me to complain about activities
     that are confusing.                                 5       4         3          2      1

16   It's OK for me to complain about anything
     that prevents me from learning.                     5       4         3          2      1

17   It's OK for me to express my opinion.               5       4         3          2      1

18   It's OK for me to speak up for my rights.           5       4         3          2      1


 


Learning to learn

                                                      Almost                               Almost
In this class . . .                                   Always   Often   Sometimes   Seldom  Never

19   I help the teacher to plan
     what I'm going to learn.                            5       4         3          2      1

20   I help the teacher to decide
     how well I am learning.                             5       4         3          2      1

21   I help the teacher to decide
     which activities are best for me.                   5       4         3          2      1

22   I help the teacher to decide
     how much time I spend on activities.                5       4         3          2      1

23   I help the teacher to decide
     which activities I do.                              5       4         3          2      1

24   I help the teacher to assess my learning.           5       4         3          2      1


Learning to communicate

                                                      Almost                               Almost
In this class . . .                                   Always   Often   Sometimes   Seldom  Never

25   I get the chance to talk to other students.         5       4         3          2      1

26   I talk with other students about
     how to solve problems.                              5       4         3          2      1

27   I explain my ideas to other students.               5       4         3          2      1

28   I ask other students to explain their ideas.        5       4         3          2      1

29   Other students ask me to explain my ideas.          5       4         3          2      1

30   Other students explain their ideas to me.           5       4         3          2      1


Appendix B

 

Observer Rating of Classroom Learning Environment

 

                                                                  Often   Sometimes   Not Observed

1.  Instructional pacing                                            1          2            3
      Teacher keeps students on-task and rarely allows downtime;
      is prepared for the lesson and paces instruction appropriately;
      keeps the flow of the lesson moving to maintain high involvement;
      has classroom rules and enforces them to keep students on task.

2.  High expectations for all students                              1          2            3
      Teacher encourages active participation of all students;
      challenges students;
      allows students time to consider other points of view or multiple solutions;
      assigns open-ended problems for students to solve.

3.  Informal assessment                                             1          2            3
      Teacher asks direct questions to elicit conceptual understanding;
      asks "what if" or "suppose that" questions;
      modifies lesson based on student questioning or other information;
      has students explain their reasoning when answering a question.

4.  Respect and equity                                              1          2            3
      Teacher fosters a climate of respect for student ideas and contributions;
      uses language and behavior that demonstrate sensitivity to all students;
      encourages slower learners or reluctant participants;
      interacts with off-task students in a dignified manner.

5.  Enthusiasm for teaching                                         1          2            3
      Teacher is enthusiastic in presentation of material;
      appears confident in his/her ability to teach math/science;
      praises and reinforces student efforts.

6.  Meaningful instruction                                          1          2            3
      Teacher builds lesson from simpler to more complex ideas or concepts;
      provides age-appropriate concrete examples of concepts to be learned;
      makes lesson relevant by relating lesson to students' interests.

7.  Student-centeredness                                            1          2            3
      Students explore ideas collaboratively;
      think about and relate examples from their own experiences;
      are given meaningful assignments when completing an activity early.

8.  Student involvement                                             1          2            3
      Students gather, record, represent, and/or analyze data;
      design or implement investigations in math or science (hands-on);
      attempt to solve open-ended problems or investigations.

9.  Student thinking                                                1          2            3
      Students attempt to justify their ideas and explain their thoughts;
      ask and pursue questions on their own;
      formulate conjectures or make reasonable guesses;
      look for patterns, cycles or trends as they learn new things.


The Impact of Systemic Reform Efforts in Promoting Constructivist
Approaches in High School Science

 

Michael Dryden
Dallas Public Schools
3700 Ross Avenue
Dallas  TX  75204,  USA

 

Barry J. Fraser
Curtin University of Technology
GPO Box U1987
Perth 6845, Australia

 

 


The Impact of Systemic Reform Efforts
 on Instruction in High School Science Classes

 

Barry J. Fraser

Curtin University of Technology
GPO Box U1987
Perth 6845, Australia

 

Michael Dryden
Dallas Public Schools
3700 Ross Avenue
Dallas  TX  75204,  USA

 

Peter Taylor
Curtin University of Technology
GPO Box U1987
Perth 6845, Australia

 

 




Constructivist Learning Environment Survey (CLES)

·     Personal Relevance focuses on the connectedness of school science to students' out-of-school experiences, and on making use of students' everyday experiences as a meaningful context for the development of students' scientific knowledge.

·     Uncertainty involves the extent to which opportunities are provided for students to experience scientific knowledge as arising from theory-dependent inquiry involving human experience and values, and as evolving, non-foundational, and culturally and socially determined.

·     Critical Voice involves the extent to which a social climate has been established in which students feel that it is legitimate and beneficial to question the teacher's pedagogical plans and methods, and to express concerns about any impediments to their learning.

·     Shared Control is concerned with students being invited to share with the teacher control of the learning environment, including the articulation of learning goals, the design and management of learning activities, and the determination and application of assessment criteria.

·     Student Negotiation assesses the extent to which opportunities exist for students to explain and justify to other students their newly developing ideas, to listen attentively and reflect on the viability of other students' ideas and, subsequently, to reflect self-critically on the viability of their own ideas.

The CLES contains 30 items altogether, with six items in each of the five scales. The response alternatives for each item are Almost Always, Often, Sometimes, Seldom, and Almost Never. A complete copy of the CLES is contained in Appendix A.
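To make the scoring concrete, a minimal Python sketch is given below. The item-to-scale grouping follows the order of the five scales in Appendix A, and the 0-100 rescaling matches the convention noted under Table 2; the function and dictionary names are illustrative, not part of the CLES itself.

```python
# Hypothetical scorer for one student's 30-item CLES response sheet.
# Items 1-6: Personal Relevance, 7-12: Uncertainty of Science,
# 13-18: Critical Voice, 19-24: Shared Control, 25-30: Student Negotiation
# (grouping assumed from the scale order in Appendix A).
SCALES = {
    "Personal Relevance": range(0, 6),
    "Uncertainty of Science": range(6, 12),
    "Critical Voice": range(12, 18),
    "Shared Control": range(18, 24),
    "Student Negotiation": range(24, 30),
}

def score_cles(responses):
    """responses: 30 circled values (5 = Almost Always ... 1 = Almost Never).
    Returns each scale's mean item score rescaled to 0-100."""
    if len(responses) != 30:
        raise ValueError("expected 30 item responses")
    return {name: sum((responses[i] - 1) * 25 for i in idx) / len(idx)
            for name, idx in SCALES.items()}
```

A student who circles 3 ("Sometimes") on every item scores 50 on every scale, matching the 50 = Sometimes anchor used for the scale means reported later.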


Table 1.  Item Factor Loadings and Scale Alpha Reliabilities

                                Factor Loadings
Item      Personal    Uncertainty    Critical    Shared     Student
          Relevance   of Science     Voice       Control    Negotiation

Q1          0.58
Q2          0.54
Q3          0.37
Q4          0.66
Q5          0.66
Q6
Q7
Q8                      0.56
Q9                      0.54
Q10                     0.38
Q11                     0.54
Q12                     0.40
Q13                                    0.52
Q14                                    0.64
Q15                                    0.62
Q16                                    0.65
Q17                                    0.70
Q18                                    0.79
Q19                                                0.72
Q20                                                0.66
Q21                                                0.79
Q22                                                0.78
Q23                                                0.78
Q24                                                0.59
Q25                                                           0.44
Q26                                                           0.70
Q27                                                           0.79
Q28                                                           0.81
Q29                                                           0.74
Q30                                                           0.78

Alpha
Reliability  0.70       0.61           0.82        0.89       0.89

Factor loadings of less than 0.30 have been omitted.
Sample size was approximately 1600 students.
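The alpha reliabilities in the bottom row follow Cronbach's standard formula: the number of items times one minus the ratio of summed item variances to the variance of the total score. A minimal sketch (the function name is ours, and population variances are assumed throughout):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one inner list per respondent, one value per item.
    Returns k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(item_scores[0])
    item_vars = [pvariance([resp[i] for resp in item_scores]) for i in range(k)]
    total_var = pvariance([sum(resp) for resp in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

When every item carries the same information (perfectly correlated responses), the statistic reaches its ceiling of 1.0; weakly related items pull it toward 0.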


 

Table 2.  Changes in Student Perceptions of Dimensions of Constructivism Between 1994 and 1997

                             Biology                   Integrated Science
CLES Scale      Year     Mean    SD      t         Mean    SD      t

Personal        1994     60.2    18.6   -2.0*      55.5    17.3    0.3
Relevance       1997     56.9    17.6              56.1    16.8

Critical        1994     64.7    22.9    0.3       65.3    21.1    0.5
Voice           1997     65.3    23.8              66.5    24.3

Uncertainty     1994     64.0    16.7   -1.5       56.6    15.0    0.6
of Science      1997     61.5    19.1              57.8    19.0

Shared          1994     32.0    25.1   -1.1       26.3    21.5    1.8
Control         1997     29.5    24.5              31.2    24.6

Student         1994     54.7    24.9    0.6       51.6    24.2    1.3
Negotiation     1997     56.1    23.8              55.1    24.0

*p<0.05
0 = Almost Never, 25 = Seldom, 50 = Sometimes, 75 = Often, 100 = Almost Always
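The t values in the table compare the 1994 and 1997 group means on each scale. A minimal sketch of a two-sample t statistic computed from summary statistics is shown below; the pooled-variance form is an assumption on our part, since the paper does not state which variant was used, and the sample sizes in the example are illustrative.

```python
from math import sqrt

def two_sample_t(m1, s1, n1, m2, s2, n2):
    """Pooled-variance two-sample t statistic from summary statistics
    (mean, SD, n for each group); returns (m2 - m1) / standard error."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return (m2 - m1) / sqrt(sp2 * (1 / n1 + 1 / n2))
```

Plugging in the Biology Personal Relevance rows (1994 first) yields a negative t, consistent with the reported decline from 60.2 to 56.9.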


Table 3.  Factor Structure and Internal Consistency (Alpha Reliability) for Classroom Observation Learning Environment Instrument

                                            Factor Loading
Scale                                Instructional    Learner
                                     Delivery         Centeredness

High expectations for all students      0.81
Respect and equity                      0.77
Informal assessment                     0.76
Enthusiasm for teaching                 0.71
Instructional pacing                    0.68
Meaningful instruction                  0.62

Student-centeredness                                     0.86
Student involvement                                      0.73
Student thinking                                         0.69

Alpha reliability                       0.88             0.82

Factor loadings of less than 0.30 have been omitted.


Table 4.  Comparison of Biology and Integrated Science Classes on Observed Instructional Delivery and Learner-Centeredness

Factor           Subject               N     Mean(a)   SD      F

Instructional    Biology              14      71.4     30.4    0.6
Delivery         Integrated Science   15      68.9     26.0

Learner-         Biology              14      39.3     33.8    1.1
Centeredness     Integrated Science   15      53.3     37.4

(a) Often = 100, Sometimes = 50, Not Observed = 0
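The F values above come from comparing the two groups of observed classes on each factor, which is a one-way ANOVA (for two groups, F is simply the square of the corresponding t). A minimal sketch with illustrative names; the observer ratings would first be coded Often = 100, Sometimes = 50, Not Observed = 0 as in the table note:

```python
def one_way_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square. groups: list of lists of factor scores."""
    means = [sum(g) / len(g) for g in groups]
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    k = len(groups)
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

Identical group means give F = 0; the small F values in Table 4 indicate that biology and integrated science classes did not differ reliably on either factor.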


 

