Category Archives: assessment

General education core science assessment

In early August 2009 I was tasked with setting up an initial assessment of the following two program learning outcomes:

Outcome 3.4: Define and explain scientific concepts, principles, and theories of a field of science.
Outcome 3.5: Perform experiments that use scientific methods as part of the inquiry process.

A 20 August 2009 worksheet number two document records the evaluation question as “Can students demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process?”

The chosen method of evaluation was to collect laboratory reports from three courses: SC 120 Biology, SC 117 Tropical Pacific Island Environment, and SC 130 Physical Science. These courses were being taught system-wide and were felt to represent the general education science laboratory experience well.

A rubric was developed during the fall of 2009 based on one in use at McKendree University (the page was no longer extant as of 2011).

Performance factor scores run from 4 (highest) to 1 (lowest).

Metric: Scientific procedures and reasoning
4: Accurately and efficiently used all appropriate tools and technologies to gather and analyze data.
3: Effectively used some appropriate tools and technologies to gather and analyze data with only minor errors.
2: Attempted to use appropriate tools and technologies but information inaccurate or incomplete.
1: Inappropriate use of tools or technology to gather data.

Metric: Strategies
4: Used a sophisticated strategy and revised strategy where appropriate to complete the task; employed refined and complex reasoning and demonstrated understanding of cause and effect; applied scientific method accurately.
3: Used a strategy that led to completion while recording all data; used effective scientific reasoning; framed or used testable questions, conducted experiment, and supported results with data.
2: Used a strategy that led to partial completion of the task/investigation; some evidence of scientific reasoning used; attempted but could not completely carry out testing, recording all data and stating conclusions.
1: No evidence of procedure or scientific reasoning used; so many errors, task could not be completed.

Metric: Scientific communication/using data
4: Provided clear, effective explanation detailing how the task was carried out; precisely and appropriately used multiple scientific representations and notations to organize and display information; interpretation of data supported conclusions and raised new questions or was applied to new contexts; disagreements with data resolved when appropriate.
3: Presented a clear explanation; effectively used scientific representations and notations to organize and display information; appropriately used data to support conclusions.
2: Incomplete explanation; attempted to use appropriate scientific representations and notations, but these were incomplete; conclusions not supported or only partly supported by data.
1: Explanation could not be understood; inappropriate use of scientific notation; conclusion unstated or data unrecorded.

Metric: Scientific concepts and related content
4: Precisely and appropriately used scientific terminology; provided evidence of in-depth, sophisticated understanding of relevant scientific concepts, principles or theories; revised prior misconceptions when appropriate; observable characteristics and properties of objects, organisms, and/or materials used; went beyond the task investigation to make other connections or extend thinking.
3: Appropriately used scientific terminology; provided evidence of understanding of relevant scientific concepts, principles or theories; evidence of understanding observable characteristics and properties of objects, organisms, and/or materials used.
2: Used some relevant scientific terminology; minimal reference to relevant scientific concepts, principles or theories; evidence of understanding observable characteristics and properties of objects, organisms, and/or materials used.
1: Inappropriate use of scientific terminology; inappropriate references to scientific concepts, principles or theories.

Laboratory assignments were collected from a single laboratory in each of the three courses system-wide. A team of science faculty members marked the laboratories, with no faculty member marking their own laboratories.

In a worksheet number three document that this author received on 05 August 2010, the following findings were reported. Note that scores were the sum of two readers, hence the four-point scale above is reported as an eight-point scale below. Four metrics, four points maximum per metric, and two readers yield a total possible score of 32 points.

1b. Summary of Assessment Data Collected (3-9):

  • Overall average points on lab reports was 14.95 out of 32 possible points.
  • Scientific Procedures & Reasoning = 3.89 out of 8 possible points.
  • Strategies = 4.05 out of 8 possible points.
  • Scientific communication using data = 3.38 out of 8 possible points.
  • Scientific concepts and related content = 3.63 out of 8 possible points.
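As a quick arithmetic check, the four metric averages reported above sum to the overall average, and the scoring scheme yields the 32-point maximum. The sketch below simply recomputes the worksheet figures:

```python
# Consistency check on the 2009-2010 scores reported above.
# Values are copied from the worksheet number three document.
metric_means = {
    "Scientific Procedures & Reasoning": 3.89,
    "Strategies": 4.05,
    "Scientific communication using data": 3.38,
    "Scientific concepts and related content": 3.63,
}

# Four metrics, four points maximum each, summed over two readers.
total_possible = 4 * 4 * 2
print(total_possible)  # 32

# The metric averages should sum to the reported overall average.
overall = sum(metric_means.values())
print(round(overall, 2))  # 14.95, matching the reported overall average
```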

The following plans were made to seek improvement in the 2010-2011 academic year.

1c: Use of Results to Improve Program/Unit Impact/Services[Closing the loop] (3-10):
Increase the students’ ability to demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process by increasing the overall rating from 14.95 to at least 16.

  • Have students write more lab reports than the reported one to three.
  • Collaborate with the Lang/Lit and English divisions to help prepare students to write scientific papers.
  • Provide science faculty with ways to help students write better lab reports.

A questionnaire had accompanied the request to collect laboratories.

2a. Means of Unit Assessment & Criteria for Success:
Students were asked [on a cover page], “How confident are you in writing this lab report?”

2b. Summary of Assessment Data Collected:
Only one group of students was not asked to complete the cover page for this part of the assessment. Fifty of the 60 students assessed completed the form, with the following results:

  • 33 (66%) of the 50 students reported that they felt confident or very confident in completing the lab report.
  • 7 (14%) of the 50 students reported that they were nervous, unsure, or uncomfortable in writing a lab report.
  • Student confidence did not match the low ratings.

In a workshop on or about 05 August 2010, a decision was made to repeat the science assessment activity with the same courses and the same rubric. The above results were restated as three multi-part comments on planning worksheet two completed on 05 August 2010:

Comment 1. Increase the students’ ability to demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process by increasing the overall rating from 14.95 to at least 16 by:
C1a. Having students write more lab reports than the reported one to three.
C1b. Collaborating with the Lang/Lit and English divisions to help prepare students to write scientific papers.
C1c. Providing science faculty with ways to help students write better lab reports.

Comment 2. Only one group of students was not asked to complete the cover page for this part of the assessment. Make sure to collect the same data for all lab reports submitted when the project is run again.
C2a. Repeat directions more than a few times.
C2b. Remind assignment administrators again just before distributing the assignment.

Comment 3. Increase student confidence in their writing.
C3a. Provide immediate feedback.
C3b. Find ways for expectations from instructors to be clear to students such as providing students with rubrics before the assignment is due.
C3c. Provide more opportunities for scientific writing.

The above comments (recommendations) were to be circulated to all science faculty. A new assessment coordinator came on board in this same time frame. Laboratories were requested in the fall of 2010 and submitted to the assessment coordinator who later assembled a marking team. This author received a worksheet number three on 03 August 2011. The worksheet had a creation date of 18 July 2011.

The first item on worksheet number three appears to have been an addition made by the assessment coordinator at that time, as the metric was not called for in the August 2010 worksheet number two document. The first item reported course completion rates, noting that a 70% completion rate was expected for students in the courses.

A table in the report, reproduced below, reported on the completion rate.

Course Completion Rate for Fall 2010
Course Enrollment Total Passed Completion Rate
SC117 57 53 93%
SC120 105 77 73%
SC130 51 46 90%
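The completion rates in the table can be reconstructed directly from the enrollment and passed counts, as a minimal sketch:

```python
# Reconstructing the reported fall 2010 completion rates
# (total passed / enrollment), using the figures from the table above.
courses = {
    "SC117": (57, 53),
    "SC120": (105, 77),
    "SC130": (51, 46),
}
for course, (enrolled, passed) in courses.items():
    rate = passed / enrolled
    print(course, f"{rate:.0%}")  # SC117 93%, SC120 73%, SC130 90%
```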

Completion itself was not fully defined; the worksheet noted only that, “Criteria for success: General Weighted Mean of 70% or higher.”

The common science laboratory assignment was reported in sections 2a and 2b. The data was reported in a slightly different manner than in the prior year:

2b. Summary of Assessment Data Collected:
Overall average score on the science assignment: 2009 – 14.95; 2010 – 21.9.

Campus     2009 Total   2010 Total
Pohnpei    13.4         21.8
Kosrae     19.4         24
National   15           19.8
Yap        16           22

While all campuses saw scores improve, there was no report on whether implementation of the comments (recommendations) had occurred. Thus there was no way to determine whether the earlier worksheet two recommendations caused the score improvement.

A second potential complication was that the marking teams were not the same for the two years, thus there was the possibility that the two teams had interpreted the rubric in different manners.

Score improvements were seen in all four metrics when looked at system-wide:

Metric                          2009   2010
Sc Proc & Reasoning             3.9    5.5
Strategies                      4.1    5.4
Sc Comm/Using Data              3.4    5.2
Sc Concepts & Related Content   3.6    5.1

Again, whether the score improvement was due to specific actions taken by instructors, random variation, or a difference in the way the two marking teams marked could not be determined. Were more laboratories assigned? Did collaboration occur with language and literature instructors? Did instructors provide rubrics prior to assigning the laboratories? Was feedback as immediate as was reasonably possible?

Section 3a and 3b of the 18 July 2011 worksheet number three reported on student confidence.

3a. Means of Unit Assessment & Criteria for Success (3-8):
Students were asked, “How confident are you in completing the Science lab?”

3b. Summary of Assessment Data Collected (3-9):
Confidence level:

      2010    2009
High   43%    66%
Med.   10%
Low    12%    14%

Comments (made by the then assessment coordinator): Confidence levels on the low end are similar in both years. Perhaps in 2010 students’ confidence at the high and medium levels is more in line with lab scores.

On 05 August 2011, worksheet two was prepared for the 2011-2012 academic year. The recommendations on worksheet number two (listed as comments) were as follows:

Comments:

The following recommendations derive from the 2009-2010 assessment cycle for general education science. During the 2010-2011 school year no data was gathered on whether these comments were implemented. Improvements seen in the ratings might be due to rater bias: different raters were used year-on-year.

Comment 1. Increase the students’ ability to demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process by increasing the overall rating from 14.95 to at least 16 by:
C1a. Having students write more lab reports than the reported one to three.
C1b. Collaborating with the Lang/Lit and English divisions to help prepare students to write scientific papers.
C1c. Providing science faculty with ways to help students write better lab reports.

Comment 2. Only one group of students was not asked to complete the cover page for this part of the assessment. Make sure to collect the same data for all lab reports submitted when the project is run again.
C2a. Repeat directions more than a few times.
C2b. Remind assignment administrators again just before distributing the assignment.

Comment 3. Increase student confidence in their writing.
C3a. Provide immediate feedback.
C3b. Find ways for expectations from instructors to be clear to students such as providing students with rubrics before the assignment is due.
C3c. Provide more opportunities for scientific writing.

Comment 4. Document whether recommendations one to three were actually implemented system-wide and, if not, what issues militated against implementation. Bear in mind that the comments/recommendations did not have system-wide buy-in.
C4a. Specific solutions yet to be determined. Possible use of a self-report survey.

Comment 5. Resolve the issue of consistency in marking of the laboratories year-on-year.
C5a. Specific solution yet to be determined.

Comment 6. Begin discussion of whether current assessment is providing useful, actionable information on accomplishment of program learning outcomes. Consider alternative assessment options for 2012-2013 school year.

Comments C4a and C5a were highlighted in the original document.

The absence of an assessment coordinator to help coordinate the tracking and collection of the metadata needed to report on whether the recommendations (comments) were implemented, and if so to what extent, once again puts at risk the ability to interpret the raw data and to report on the recommendations in May 2012.

This article is intended to help improve institutional memory by gathering in a single place the key results of the general education core science assessment for the past three years, along with the rubric being used. At present this information is scattered across multiple files that were exchanged only via email over a three-year period. This end note is intended only to make plain that blogs are a useful way to report and make transparent assessment efforts, data, and results.

Teacher Corps Assessment

At the end of a week-long mathematics and science workshop, the 21 participants were asked to respond to the following questions. A report on the workshop exists as two blog articles, Teacher Corps and Teacher Corps II. Responses were obtained from 17 participants.

1. What was the most useful activity for you as a teacher?
2. What was the least useful activity for you as a teacher?
3. What was the most interesting activity?
4. What was the least interesting activity?
5. What was the most surprising experience during this past week?
6. What was the most fun?
7. What would you change if such a workshop were run in the future for Teacher Corps?

The table below is an excerpt from a larger table of responses. Responses were tallied and common responses were combined. The table includes only those responses which appeared three or more times. Note that respondents were permitted to cite more than one activity per question if they chose to do so.

Response Most useful Least useful Most interesting Least interesting Surprising Fun Sum
Plant names 4 3 2 7 3 19
Field trip 2 3 2 7 14
Floral litmus 1 5 1 2 9
None 7 2 9
All 6 6
Constructions 1 1 2 1 5
El Niño 1 1 2 1 5
Local materials 3 1 4
Speed of sound 1 3 4
Fibonacci ratio 1 1 1 3
Marble math 1 1 1 3

The “plant names” response refers to a number of walks on which plants were identified by the instructor in the local language of the participants. The participants did not all know the plant names in their own languages, and many found this interesting and surprising.

A field trip to the Pwunso botanic garden to view spice plants, timber trees, and learn about the benefits of local foods from the Island Food Community garnered the most votes for being fun.

A laboratory that used boiled flowers to generate floral litmus solutions was the most interesting activity for the participants, followed by a laboratory that determined the speed of sound using sticks from the forest, orally counted seconds, and echoes.
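The echo-based speed-of-sound estimate can be sketched as follows. The distance, clap count, and timing below are illustrative assumptions, not measurements from the workshop laboratory:

```python
# A sketch of an echo-based speed-of-sound estimate of the kind
# described above: clap sticks in time with the returning echo,
# count the claps, and time them with orally counted seconds.
# All numbers here are illustrative assumptions.
distance_to_wall = 85.0   # metres from the clappers to the reflecting surface
claps = 30                # claps made in sync with the echoes
elapsed = 15.0            # orally counted seconds for those claps

# Each clap-echo cycle covers a round trip: twice the distance to the wall.
speed_of_sound = claps * 2 * distance_to_wall / elapsed
print(speed_of_sound)  # 340.0 m/s, near the accepted ~343 m/s at 20 °C
```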

Six of the seventeen respondents felt that all of the activities engaged in were useful to them as teachers and as future teachers. Seven felt that none of the activities could be classified as least useful to them.

Constructing circles, triangles, squares, pentagons, and hexagons with a string and straight edge generated the strongest even split of opinion of all the activities.

Although not shown in the excerpt above because the number of responses was only two, a side unit on a sound wave done in the computer laboratory, a unit on logic (categorical propositions, the square of opposition, and categorical syllogisms), and a batteries and bulbs activity were the only activities to receive more than one negative response.

The participants were also asked what they would change if the workshop were to ever be run again. The following responses are in descending order of popularity.

Move the start time one hour later, from 8:00 to 9:00 (4 respondents)
Ensure lunch is arranged (4 respondents)
Run the workshop at dates that do not fall so close to a major holiday (3)
Cover how to prepare a science worksheet for lower grades (1)
Have more field trips (1)
Shorten the workshop day to four hours (1)
Extend the workshop to three weeks (1)
Spend more time outside (1)

Teacher Corps II

Thursday morning, day four of the workshop, opened with a focus on captivating students’ attention. No attention, no learning. Rather than say this up front, however, the concept was made concrete by putting a teacher, supported by two other teachers, on a RipStik caster board.

With a teacher standing on the board, the difference between dynamic and static stability was explained. Having a teacher held up on the unstable, stationary board focused the attention of at least the teacher on the board, if not the class.

With the definition illustrated, the concept was extended to climate change. If the global climate is essentially statically stable, then small perturbations in that system should engender nothing more than small, fairly stable changes in the global climate. If the global climate system is only dynamically stable, then small changes may have unexpected effects including potentially large changes as described in runaway climate change scenarios.

Following this presentation, the instructor used the RipStik to introduce waves. The RipStik leaves behind a distinctive wave on the paper. The wave form provides an opportunity to introduce terminology such as crest, trough, wavelength, and amplitude. The RipStik also makes frequency concrete as the number of “wiggles” per second.

Dana on a RipStik laying down a waveform

Rapid wiggling generates a high frequency and a short wavelength. Slow wiggling generates a low frequency and a long wavelength. Thus the caster board demonstrates well the inverse relationship between wavelength and frequency that is seen in many systems.

Best of all, for the caster board the wave speed (frequency times wavelength) is exactly the linear board speed.
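The wave relationships from the RipStik demonstration can be sketched with a few lines of arithmetic. The frequency and wavelength below are assumed, illustrative values, not measured workshop data:

```python
# Wave relationships from the RipStik demonstration, with
# illustrative assumed numbers (not measured workshop values).
frequency = 2.0    # wiggles per second (Hz)
wavelength = 0.8   # metres per wiggle, read off the paper trace

# Wave speed is frequency times wavelength; for the caster board
# this equals the linear board speed.
wave_speed = frequency * wavelength
print(wave_speed)  # 1.6 m/s

# The inverse relationship: at a fixed board speed, doubling the
# wiggle frequency halves the wavelength laid down on the paper.
board_speed = 1.6
for f in (1.0, 2.0, 4.0):
    print(f, board_speed / f)  # wavelength shrinks as frequency rises
```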

Images of the tracks with labeled features were illustrated in an article written by the workshop lead in October 2011. The activity is used in conjunction with a unit on waves in physical science.

The board ridden on paper on concrete provides a way to bring wave phenomena down into grades below the high school level. The boards do cost money, and one has to either ride the board or have a rider, yet there are a fair number of young riders even here on Pohnpei, and thus it might be an option for a teacher. Simply have a student ride their board across the paper.

Inside the classroom, transverse waves on a length of chain and longitudinal waves in a Slinky spring were demonstrated.

Following the Thursday morning break, the 10:00 session started with geometric math standard 2.3.1, recognize common shapes. But in a twist on standard 2.8.1, all shapes were constructed using only a length of string as a compass and a meter stick from the forest. The meter sticks had been built on Monday. Constructions based on these limitations are well covered by Zef Damen.

Constructions started with a circle and moved on to equilateral triangles, hexagons, squares, and finally a pentagon. The proof of the Pythagorean theorem was also presented, along with a proof of the irrationality of the square root of two. This last fact was problematic for the Pythagoreans and their math system, which effectively postulated that all one needs to do mathematics are “marbles and pompoms”, along with ratios of marbles and pompoms.
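The irrationality proof presented was presumably the classic argument by contradiction, which runs along these lines:

```latex
% Sketch of the standard proof that the square root of two is irrational.
Suppose $\sqrt{2} = p/q$ with $p, q$ integers sharing no common factor.
Then $p^2 = 2q^2$, so $p^2$ is even, and hence $p$ is even: $p = 2k$.
Substituting gives $4k^2 = 2q^2$, so $q^2 = 2k^2$ and $q$ is also even,
contradicting the assumption that $p/q$ was in lowest terms.
Therefore $\sqrt{2}$ cannot be written as a ratio of integers.
```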

Rustem analyzes acids and bases, Trevor on the right

In the background above the pentagon/pentagram construction can be glimpsed, on the far right is part of the Pythagorean proof.

The square root of two is not, however, expressible in the Al Mat marbular system, much to the consternation of the Pythagoreans. Thousands of years later Cantor would show that the infinity of the irrational numbers is a higher order of infinity than that of the integers.

By the end of the session the class had moved up from 2.3.1 recognize common shapes and 2.4.1 identify and classify shapes, past 2.8.1 constructions, 2.8.4 Pythagorean theorem, and 1.8.3 square roots, and on into a presentation on the proof of the irrationality of the square root of two.

After the lunch break the class spent half an hour in the computer laboratory where a Fourier sound applet was demonstrated. The applet showed the connection between wavelength and frequency for sound waves, along with a graphical representation of a sound wave.

Then the teachers moved downstairs to engage in a laboratory using floral litmus solutions to detect acids and bases. This was based directly on physical science laboratory thirteen.

Yasko with acid, base, and neutral detection

The session served science standards Sci 1.hs.1 and the Sc hs benchmark chemistry bullet item number eighteen, the study of acids, bases and salts.

Benskin with test tubes

The laboratory also demonstrated the use of minimal glassware and locally available materials including common household chemicals in a chemistry experiment.

Cheryl tests a floral litmus solution against a known base

In the final session of the day, which began at 3:00 in the afternoon, the teachers returned to the computer laboratory, where science outcomes 3.4.2 and 3.5.2, El Niño, La Niña, tropical storm formation, and climatic patterns were presented using presentations put together by Chip Guard of the National Weather Service on Guam. The instructor owes a deep debt of thanks to Chip Guard and the NWS for sharing those presentations.

Friday morning began with a discovery learning session using batteries and bulbs. This particular exercise derives from physical science laboratory 12 and served FSM science learning outcome 2.8.7. A Pohnpei Utility Corporation bill was also worked.

Lorleen measures Deffeny

Following this session, the class determined their FiboBelly ratio, an exercise selected from a series of exercises centered on numeric patterns including the Fibonacci sequence.

Yasko measures Benskin

This exercise is often done in conjunction with Fibonacci factors and Pigonacci. Pigonacci includes the connection between the Fibonacci numbers and Pascal’s triangle.
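The numeric pattern behind these exercises can be sketched briefly: ratios of successive Fibonacci numbers approach the golden ratio, and the "shallow diagonal" sums of Pascal's triangle reproduce the Fibonacci sequence, which is presumably the connection the Pigonacci exercise explores:

```python
from math import comb

# Ratios of successive Fibonacci numbers approach the golden ratio,
# the pattern underlying the FiboBelly ratio exercise.
fib = [1, 1]
for _ in range(18):
    fib.append(fib[-1] + fib[-2])
ratio = fib[-1] / fib[-2]
print(round(ratio, 6))  # 1.618034, the golden ratio to six places

# The Pascal's triangle connection: summing the "shallow diagonals"
# of Pascal's triangle regenerates the Fibonacci sequence.
def diagonal_sum(n):
    return sum(comb(n - k, k) for k in range(n // 2 + 1))

print([diagonal_sum(n) for n in range(8)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```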

FiboBelly ratios on the white board

The final white board with the teachers’ FiboBelly ratios.

After the morning sessions the teachers prepared workshop evaluations and assessment. Some of the teachers chose to work in the computer laboratory.

A204 Computer laboratory

The workshop wrapped up on Friday the 23rd of December with a pizza luncheon and certificates of completion for all participants.

As an addendum to the Thursday afternoon presentation, the following is an account, held in the Pacific Digital Library, of the damage done by Typhoon Lola in November 1957.

Typhoon Lola Pays A Call

A most unladylike intruder by the name of Lola paid a call upon the Trust Territory in mid-November 1957.

Lola was a typhoon of major proportions. Sweeping along like a bulldozing broom, she smashed down valuable breadfruit and coconut trees, submerged crops, wrecked homes and generally produced havoc as she rolled on from the Marshalls through Ponape, Truk, Guam, and up to Rota.

The typhoon which caused more over-all damage than any previously recorded within the territory, brought no loss of life and no major bodily injuries as far as is known, although many times tragedy knocked hard and close. In the face of danger, numerous spontaneous acts of valor came to the fore.

Lola entered Ponape District on November 12, leaving havoc, destruction and debris as she whirled on her way. Not for fifty years had Ponape had a typhoon. It was generally considered to be out of the typhoon path. But reports from atolls and islands throughout the area repeated the story of coconut and breadfruit trees destroyed, and of food shortage imminent after the windfall of nuts on the ground would have been made into copra or consumed for food, and the breadfruit eaten.

Kolonia, the Ponape District center, was in the direct path of the storm, as were the islands immediately around it. Knowing that the typhoon was coming, the people of Kolonia took shelter in the hospital building and warehouse, District Administration office, Intermediate School, agriculture station, and in churches and other buildings of the religious missions. For some 250 or more storm refugees in these shelters, C-rations (individual canned foods), rice and sugar were issued by the Administration, also small quantities of kerosene to provide fuel for the ranges on which people prepared hot food and beverages.

The damage to buildings and utilities at Kolonia was considerable. Destroyed were the temporary warehouses and carpenter shed on the site of the new Pacific Islands Central School, and ruined was all of the bagged cement therein, a total loss representing some five thousand dollars. Ponape’s power and telephone systems were heavily hit by falling trees; roads were eroded, and bridges and culverts damaged, with a loss of approximately thirty-three thousand dollars in government property alone.

In addition to buildings damaged and public works systems affected, four vessels went aground in the bay – all privately owned. These were the LUCKY, the CULVER, the MARU, and the ASCOY. All except the first were expected to be refloated. The LUCKY, which was directly hit and forced high onto the reef, was not thought to be salvable.

Cacao pod production in Ponape was reduced by at least fifty per cent by the typhoon, according to estimates, and copra production here also is diminishing as a result of the high winds which blew immature nuts to the ground, or weakened them so that they began falling off before ripening.