Monthly Archives: January 2012

INS Half Marathon

The INS half marathon for “2011”, delayed from 17 December, went off on 14 January 2012 under cloudy skies. Although I had done some base building into the 17th of December, on the 19th I went into a week-long, 40-hour teacher corps workshop that dropped my running to zero. The following week included at least three Christmas sakau sessions with family and friends and zero running.

Dana outbound in Dolihner at 4:13 P.M.

Although I was nowhere near the shape I wanted to be in, I took the point of view that the shame is not in failing to finish, but in failing to start. To remain on the sidelines, or the couch, when I could at least toe the line and put in some mileage, seemed the more shameful.

Dana inbound to the finish line at 6:23 P.M.

The 2012 half marathon was certainly one to remember. Instead of the tropical sun melting the runners, intense tropical downpours turned the late afternoon road into early night. Dark, wind-driven rain lashed in from the north Pacific. At points I felt that if I stopped running I would become chilled.

Dana

Amidst the fierce rain, dark road conditions, low visibility, and Saturday evening traffic, I opted not to juggle from the U turn-around inbound. My tennis balls were sodden, and while the weight difference is small, at about 150 tosses per minute the small differences add up over the course of a couple of hours.

First place winner Ardos

Although my own goal was the turn-around in U, from where I could ride a truck back, I felt good enough to continue on.

Took first in zoris!

I did pick up two juice packs at the donut shop in Nett outbound, plus I found a small bottle of Gatorade G at a container store in Nett near the U border. My own supply of Gatorade missed the truck to the U turn-around: the truck was in, but left before I realized it was headed to the turn-around.

Inbound in U, photo by Zillah

Inbound I picked up two more juice packs at the donut shop in Nett. I did juggle from the Kolonia town post in. Thus the run was a DQ for joggling, but a finish for running.

Where 2011 threw challenges at those closest to me, 2012 seems to be throwing challenges directly at me.

At the finish of the half marathon, photo by Laurel

I finished in 2:23:30, which surprised me as that was faster than my 2:27:15 in 2010. Note that the INS half marathon usually runs in September. This year INS proposed running the race in December. Then in December the race was delayed to 14 January 2012. Thus it has been almost a year and a half since the September 2010 half. The next half is slated for September 2012.

Counting pillars and posts

The students were paired off on day one and told to count the number of pillars and posts on campus. The only definition given was that the pillar or post should be standing free from a structure – not a wall element.

Counting pillars
Counting pillars

I did not mention how to count the two-story tall pillars for the A and B buildings. This led to a natural introduction of the importance of definitions. The counting took 25 minutes, with some groups finishing faster and a few slower.

Pillar and post counts by class section:

                 M8     M9     M10
                275    258    266
                228    261    281
                152    281    283
                312    286    285
                316    298    285
                       305    293
                       309    296
                       310    326
sample size n     5      8      8

Notes on selected counts:
275 – unspecified how A, B building pillars were counted
258 – A, B pillars counted as one, top to bottom
305 – A, B pillars counted as two, top to bottom


A good sample description must include clear definitions. In the 9:00 class the students noted that they were counting the pillars differently for the A and B building. Some counted the pillar top to bottom as one pillar, some counted “per floor” and wound up with double the number of pillars. I suspect this happened at 8:00 as well.

Nancyleen tallies data

Note that the 152 is an outlier: that group did not go all the way to the cafeteria – they each thought the other one had gone there. This provided a chance to note that z-scores can be used to help identify outliers that could potentially be data errors.
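
For readers who want to see the mechanics, here is a minimal Python sketch (not part of the class exercise) computing z-scores for the M8 counts in the table above:

```python
# Minimal sketch: z-scores for the M8 pillar counts from the table
# above; not part of the class exercise itself.
import statistics

counts = [275, 228, 152, 312, 316]  # M8 section counts

mean = statistics.mean(counts)   # 256.6
sd = statistics.stdev(counts)    # sample standard deviation, about 68.4

for x in counts:
    z = (x - mean) / sd
    print(f"count={x:3d}  z={z:+.2f}")

# The 152 count has the most extreme z-score, about -1.53.  Note that
# with only five values no |z| can exceed (n-1)/sqrt(n), about 1.79,
# so the usual |z| > 2 rule of thumb must be relaxed for tiny samples.
```

Even with a relaxed cutoff, the 152 stands well apart from the other four counts, consistent with the known reason for the undercount.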

General education core science assessment

In early August 2009 I was tasked with setting up an initial assessment of the following two program learning outcomes:

Outcome 3.4: Define and explain scientific concepts, principles, and theories of a field of science.
Outcome 3.5: Perform experiments that use scientific methods as part of the inquiry process.

A 20 August 2009 worksheet number two document records the evaluation question as “Can students demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process?”

The chosen method of evaluation was to collect laboratories from three courses, SC 120 Biology, SC 117 Tropical Pacific Island Environment, and SC 130 Physical Science, as these courses were taught system-wide and were felt to represent the general education science laboratory experience well.

A rubric was developed during the fall of 2009 based on one in use at McKendree University (page no longer extant as of 2011).

Performance factor score: 4 (highest) to 1 (lowest)

Metric: Scientific procedures and reasoning
4 – Accurately and efficiently used all appropriate tools and technologies to gather and analyze data.
3 – Effectively used some appropriate tools and technologies to gather and analyze data with only minor errors.
2 – Attempted to use appropriate tools and technologies but information inaccurate or incomplete.
1 – Inappropriate use of tools or technology to gather data.

Metric: Strategies
4 – Used a sophisticated strategy and revised strategy where appropriate to complete the task; employed refined and complex reasoning and demonstrated understanding of cause and effect; applied scientific method accurately.
3 – Used a strategy that led to completion while recording all data; used effective scientific reasoning; framed or used testable questions, conducted experiment, and supported results with data.
2 – Used a strategy that led to partial completion of the task/investigation; some evidence of scientific reasoning used; attempted but could not completely carry out testing, recording all data and stating conclusions.
1 – No evidence of procedure or scientific reasoning used; so many errors, task could not be completed.

Metric: Scientific communication/using data
4 – Provided clear, effective explanation detailing how the task was carried out; precisely and appropriately used multiple scientific representations and notations to organize and display information; interpretation of data supported conclusions and raised new questions or was applied to new contexts; disagreements with data resolved when appropriate.
3 – Presented a clear explanation; effectively used scientific representations and notations to organize and display information; appropriately used data to support conclusions.
2 – Incomplete explanation; attempted to use appropriate scientific representations and notations, but were incomplete; conclusions not supported or were only partly supported by data.
1 – Explanation could not be understood; inappropriate use of scientific notation; conclusion unstated or data unrecorded.

Metric: Scientific concepts and related content
4 – Precisely and appropriately used scientific terminology; provided evidence of in-depth, sophisticated understanding of relevant scientific concepts, principles or theories; revised prior misconceptions when appropriate; observable characteristics and properties of objects, organisms, and/or materials used; went beyond the task investigation to make other connections or extend thinking.
3 – Appropriately used scientific terminology; provided evidence of understanding of relevant scientific concepts, principles or theories; evidence of understanding observable characteristics and properties of objects, organisms, and/or materials used.
2 – Used some relevant scientific terminology; minimal reference to relevant scientific concepts, principles or theories; evidence of understanding observable characteristics and properties of objects, organisms, and/or materials used.
1 – Inappropriate use of scientific terminology; inappropriate references to scientific concepts, principles or theories.

Laboratory assignments were collected from a single laboratory in each of the three courses system-wide. A team of science faculty members marked the laboratories, with no faculty member marking their own laboratories.

In a worksheet number three document that this author received on 05 August 2010, the following findings were reported. Note that scores were the sum of two readers, hence the four-point scale above is reported as an eight-point scale below. Four metrics at four points maximum from each of two readers yields a total possible of 32 points (a small worked sketch follows the findings below).

1b. Summary of Assessment Data Collected (3-9):

  • Overall average points on lab reports was 14.95 out of 32 possible points.
  • Scientific Procedures & Reasoning = 3.89 out of 8 possible points.
  • Strategies = 4.05 out of 8 possible points.
  • Scientific communication using data = 3.38 out of 8 possible points.
  • Scientific concepts and related content = 3.63 out of 8 possible points.
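
To make the scale arithmetic concrete, here is a small illustration with hypothetical reader scores (the metric names and numbers below are invented for the example, not taken from the worksheet):

```python
# Hypothetical illustration of the 32-point scale: four rubric metrics,
# each scored 1-4 by two readers, summed across readers and metrics.
reader_1 = {"procedures": 2, "strategies": 3, "communication": 2, "concepts": 2}
reader_2 = {"procedures": 3, "strategies": 2, "communication": 2, "concepts": 2}

per_metric = {m: reader_1[m] + reader_2[m] for m in reader_1}  # 2-8 per metric
total = sum(per_metric.values())                               # out of 32

print(per_metric)                   # {'procedures': 5, 'strategies': 5, ...}
print(f"total: {total} / {4*4*2}")  # total: 18 / 32
```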

The following plans were made to seek improvement in the 2010-2011 academic year.

1c: Use of Results to Improve Program/Unit Impact/Services [Closing the loop] (3-10):
Increase the students’ ability to demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process by increasing the overall rating from 14.95 to at least 16.

  • Have students write more lab reports than the reported 1 – 3.
  • Collaborate with Lang/Lit, and English divisions to help prepare students to write scientific papers.
  • Provide science faculty with ways to help students write better lab reports.

A questionnaire had accompanied the request to collect laboratories.

2a. Means of Unit Assessment & Criteria for Success:
Students were asked [on a cover page], “How confident are you in writing this lab report?”

2b. Summary of Assessment Data Collected:
Only one group of students was not asked to complete the cover page for this part of the assessment. 50 of the 60 students assessed completed the form with the following results:

  • 33 (66%) of the 50 students reported that they felt confident or very confident in completing the lab report.
  • 7 (14%) of the 50 students reported that they were nervous, unsure, or uncomfortable in writing a lab report.
  • Confidence of students does not match the low ratings.

In a workshop on or about 05 August 2010, a decision was made to repeat the science assessment activity with the same courses and the same rubric. The above results were restated as three multi-part comments on planning worksheet two completed on 05 August 2010:

Comment 1. Increase the students’ ability to demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process by increasing the overall rating from 14.95 to at least 16 by:
C1a. Having students write more lab reports than the reported 1 – 3.
C1b. Collaborating with Lang/Lit, and English divisions to help prepare students to write scientific papers.
C1c. Providing science faculty with ways to help students write better lab reports.

Comment 2. Only one group of students was not asked to complete the cover page for this part of the assessment. Make sure to collect the same data for all lab reports submitted when the project is run again.
C2a. Repeat directions more than a few times.
C2b. Remind assignment administrators again just before distributing the assignment.

Comment 3. Increase student confidence in their writing.
C3a. Provide immediate feedback.
C3b. Find ways for expectations from instructors to be clear to students such as providing students with rubrics before the assignment is due.
C3c. Provide more opportunities for scientific writing.

The above comments (recommendations) were to be circulated to all science faculty. A new assessment coordinator came on board in this same time frame. Laboratories were requested in the fall of 2010 and submitted to the assessment coordinator who later assembled a marking team. This author received a worksheet number three on 03 August 2011. The worksheet had a creation date of 18 July 2011.

The first item on worksheet number three appears to have been an addition made by the assessment coordinator at that time, as the metric was not called for in the August 2010 worksheet number two document. This item reported course completion rates, noting that a 70% completion rate was expected for students in the courses.

A table in the report, reproduced below, reported on the completion rate.

Course Completion Rate for Fall 2010

Course    Enrollment    Total Passed    Completion Rate
SC117         57              53              93%
SC120        105              77              73%
SC130         51              46              90%

Completion itself was not fully defined; the worksheet noted only that, “Criteria for success: General Weighted Mean of 70% or higher.”
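
If completion is taken to mean passed divided by enrolled – an assumption, given the missing definition – the table’s percentages can be reproduced directly:

```python
# Assumes completion rate = passed / enrolled; this reproduces the
# percentages in the Fall 2010 table above.
courses = {"SC117": (57, 53), "SC120": (105, 77), "SC130": (51, 46)}

for course, (enrolled, passed) in courses.items():
    print(f"{course}: {passed / enrolled:.0%}")
# SC117: 93%  SC120: 73%  SC130: 90%
```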

The common science laboratory assignment was reported in sections 2a and 2b. The data was reported in a slightly different manner than in the prior year:

2b. Summary of Assessment Data Collected:
Overall average score on the science assignment: 2009 – 14.95; 2010 – 21.9

Campus      2009 Total    2010 Total
Pohnpei        13.4          21.8
Kosrae         19.4          24
National       15            19.8
Yap            16            22
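
A quick sketch of the year-on-year gains implied by the table (values transcribed from above):

```python
# Year-on-year gain per campus, from the 2009/2010 totals above.
totals = {
    "Pohnpei":  (13.4, 21.8),
    "Kosrae":   (19.4, 24.0),
    "National": (15.0, 19.8),
    "Yap":      (16.0, 22.0),
}

for campus, (y2009, y2010) in totals.items():
    print(f"{campus}: +{y2010 - y2009:.1f}")
# Pohnpei: +8.4  Kosrae: +4.6  National: +4.8  Yap: +6.0
```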

While all campuses saw scores improve, there was no report on whether the comments (recommendations) had been implemented. Thus there was no way to determine whether the earlier worksheet two recommendations caused the score improvement.

A second potential complication was that the marking teams were not the same for the two years, so the two teams may have interpreted the rubric differently.

Score improvements were seen in all four metrics when looked at system-wide:

Metric                           2009    2010
Sc Proc & Reasoning               3.9     5.5
Strategies                        4.1     5.4
Sc Comm/Using Data                3.4     5.2
Sc Concepts & Related Content     3.6     5.1

Again, whether the score improvement was due to specific actions taken by instructors, random variation, or a difference in the way the marking teams marked was not determinable. Were more laboratories assigned? Did collaboration occur with language and literature instructors? Did instructors provide rubrics prior to assigning the laboratories? Was feedback as immediate as was reasonably possible?

Section 3a and 3b of the 18 July 2011 worksheet number three reported on student confidence.

3a. Means of Unit Assessment & Criteria for Success (3-8):
Students were asked, “How confident are you in completing the Science lab?”

3b. Summary of Assessment Data Collected (3-9):
Confidence level:

      2010    2009
High   43%    66%
Med.   10%
Low    12%    14%

Comments (made by the then assessment coordinator): Confidence levels on the low end are similar both years. Perhaps in 2010, students’ confidence at the high and medium levels is more in line with lab scores.

On 05 August 2011, worksheet two was prepared for the 2011-2012 academic year. The recommendations on worksheet number two (listed as comments) were as follows:

Comments:

The following recommendations derive from the 2009-2010 assessment cycle for general education science. During the 2010-2011 school year no data was gathered on whether these comments were implemented. Improvements seen in ratings might be due to rater bias – different raters were used year-on-year.

Comment 1. Increase the students’ ability to demonstrate understanding and apply scientific concepts and principles of a field of science and apply scientific methods to the inquiry process by increasing the overall rating from 14.95 to at least 16 by:
C1a. Having students write more lab reports than the reported 1 – 3.
C1b. Collaborating with Lang/Lit, and English divisions to help prepare students to write scientific papers.
C1c. Providing science faculty with ways to help students write better lab reports.

Comment 2. Only one group of students was not asked to complete the cover page for this part of the assessment. Make sure to collect the same data for all lab reports submitted when the project is run again.
C2a. Repeat directions more than a few times.
C2b. Remind assignment administrators again just before distributing the assignment.

Comment 3. Increase student confidence in their writing.
C3a. Provide immediate feedback.
C3b. Find ways for expectations from instructors to be clear to students such as providing students with rubrics before the assignment is due.
C3c. Provide more opportunities for scientific writing.

Comment 4. Document whether recommendations one to three were actually implemented system-wide and, if not, what issues militated against implementation. Bear in mind that the comments/recommendations did not have system-wide buy-in.
C4a. Specific solutions yet to be determined. Possible use of a self-report survey.

Comment 5. Resolve the issue of consistency in marking of the laboratories year-on-year.
C5a. Specific solution yet to be determined.

Comment 6. Begin discussion of whether current assessment is providing useful, actionable information on accomplishment of program learning outcomes. Consider alternative assessment options for 2012-2013 school year.

Comments C4a and C5a were highlighted in the original document.

The absence of an assessment coordinator to track and collect the meta-data required to report on whether the recommendations (comments) were implemented, and, if so, to what extent, once again puts at risk the ability to interpret the raw data and to report on the recommendations in May 2012.

This article is intended to help improve institutional memory by gathering in a single place the key results of the general education core assessment for the past three years, along with the rubric being used. At present this information is scattered across multiple files exchanged only via email over a three-year period. This end note is intended only to make plain that blogs are a useful way to report assessment efforts, data, and results, and to make them transparent.