Westmont College Biology Department Program Review 2015

Table of Contents


  1. Introductory Summary

  2. Findings

    1. Student Learning

    2. Alumni Reflections

    3. Curriculum Review

    4. Program Sustainability

    5. Additional Analysis

      1. Research Capability and Resources

      2. Student Focus Groups

        1. Independent Research

        2. Faith-Learning Throughout the Curriculum

      3. Facilities

      4. Internships

      5. Interactions with Other Departments

  3. Looking Forward: Changes and Questions


Appendices

  1. Program mission, vision, goals, and program learning outcomes for the current six-year cycle, or the link to the documents posted on the departmental website

  2. Summary of assessment results for every PLO

  3. Rubrics and assessment instruments for every PLO

  4. Reports on closing-the-loop activities for every PLO

  5. Relevant syllabi for major changes in the curriculum, such as a new capstone course, senior seminar, internship requirement, experiential learning course, etc.

  6. Curriculum Map and the PLOs Alignment Chart

  7. Alumni Survey

  8. Peer institution comparison

  9. Full-time faculty CVs

  10. Core faculty instructional and advising loads

  11. Faculty race/ethnicity and gender breakdown

  12. Adjunct faculty profiles

  13. Overview of proposed changes

  14. Student race/ethnicity and gender breakdown

  15. Student graduation rates

  16. Review of library holdings

  17. Internships report

  18. Budget

  19. Inventory of Educational Effectiveness Indicators

  20. Student Focus Groups

  21. Student survey on curriculum

  22. Facilities and faculty size comparison

  23. Total graduates and numbers in each track

I. Introductory Summary

Our program review was a fruitful opportunity for the Biology Department to take stock, examining the things we are doing well and identifying improvements that may be necessary. In addition to the student assessment we have conducted annually for the last six years, we connected with both current and former students to get a comprehensive view of their satisfaction with our program. This past academic year, we conducted two student focus groups and administered a student survey on curriculum. Our alumni survey yielded a 45.4% response rate, meaning we heard from 103 of our graduates from the last six years. We examined the curricula at comparable institutions to see whether there are any major deficits in our program, and we surveyed Biology department colleagues at other colleges and universities about their facilities to see how we currently compare. Finally, we held a departmental retreat in May to reflect together on the new data we collected and how we might respond, and to discuss the obstacles that may be preventing us from achieving our individual professional goals. While there are some concerns and areas for improvement, overall this was an encouraging six-year cycle for us.

Our annual student learning assessment projects revealed no major deficits in any of our program learning outcomes except one: the lack of significant plant biology in our program, a deficit we identified long ago. Otherwise, we found that our students struggled in the areas known to be challenging (e.g., constructing arguments, framing a bioethical decision in a scriptural context, distinguishing what information belongs in the Results versus the Discussion section of a scientific article). Identifying these weaknesses helped us modify our instruction accordingly. The assessment also highlighted the classes and assignments that worked well and helped our students accomplish our student learning outcomes. For example, we were satisfied and encouraged with how our students grew in their ability to recognize a range of perspectives on the creation/evolution debate, to be self-critical of their own position, and to accept a degree of ambiguity when certainty is unattainable. These findings confirmed the importance of retaining a key debate assignment in our Introductory Biology course and of offering one of our capstone courses every year instead of every other year.

Most student and alumni feedback was overwhelmingly positive. In our alumni survey, 99% of respondents said that they were “satisfied” or “extremely satisfied” with their Westmont education as a whole. Over 90% said that the teaching in the Biology department specifically was strong or superior. Other departmental strengths described by alumni were the hands-on experience in the lab and field and the biology/faith discussions. Indeed, our current student focus group on biology/faith made it clear that these discussions permeate our entire curriculum, occurring in almost every class. The focus group with our current independent research students also indicated they were satisfied with their research experience, with no student rating their experience lower than an 8 on a 10-point scale. Finally, a large majority of alumni (81%) and current students (82%) were either “mostly satisfied” or “completely satisfied” with the breadth of course offerings in the Biology department. Nevertheless, alumni suggested that improvements could be made in expanding curricular options, in career direction and job preparation, and in offering more research opportunities. Half of current students also said that they wanted more Biology-specific off-campus programs.

Our comparisons to Biology programs at other institutions were less encouraging. Based on the data we gathered: (1) when controlling for enrollment, we have a smaller faculty than five of the six institutions we used for curricular comparison (this, of course, was immensely improved after our last program review in 2009, which resulted in the approval of an additional FTE); (2) compared to other CCCU schools, we rank last by a considerable margin in square footage per student (most CCCU schools have twice the space; the better-equipped ones have three to four times as much).

Based on the data we collected and our reflection as a group, it is clear that many of our current challenges are not new but have perhaps been compounded with time. The paucity of plant biology in our curriculum has long been recognized, and we have made staffing decisions knowing they would result in more of a specialization in animal biology in our Eco/Evo/NH track. While the Biology department is grateful for all the small and large renovation projects in our facilities, many comparable institutions have had newly built life sciences facilities or large additions to their current buildings in the last decade (e.g., Wheaton, APU, Gordon, Houghton, Point Loma, Taylor), and our facilities may therefore communicate to prospective students and parents that Biology is not a priority here at Westmont. This might be one contributing factor in the decreased enrollment in our Eco/Evo/NH track. Most faculty members identified research as the area of greatest struggle. Insufficient dedicated research space, limited funds for the typical consumables needed for biological research, an unreliable strategy for funding our summer research student program, and gaps in needed basic infrastructure, such as the establishment of an IACUC, have all put strain on many of our research ambitions.

While some of the challenges feel beyond our immediate control, we recognize that departmental effort is needed in certain areas to improve our program. Institutional momentum clearly exists for helping our students “launch” into careers after graduation and the department needs to take its share of the responsibility in improving the discipline-specific resources we offer, the guidance we give in advising, our internship opportunities, and our collaboration with the Career Development and Calling office. We need to seek creative strategies to incorporate more Plant Biology into the current classes we teach. We need to make sure the new faculty member hired to replace Frank Percival will expand our curriculum in one or more of the needed areas. Finally, we need to increase our efforts in acquiring external funds for our research programs. All these improvements seem reasonable and within our reach.
II. Findings

A. Student Learning

PLO #1: Students will effectively identify and explain fundamental principles of life processes at different levels of structural organization.

In 2014, we administered the Major Field Test in Biology to assess the SLO: students will score competitively on a national comprehensive standardized exam in Biology. Thirty-two students in the department’s capstone courses, Bioethics Seminar (BIO 196, both Fall and Spring) and Biology and Faith (BIO 197), took the Major Field Test in Biology, developed by the Educational Testing Service, during the 2013-2014 academic year. The following three benchmarks were used to interpret the test results, relative to the National Liberal Arts Colleges using the test only for program assessment (our “Main Reference Group”): (a) BS General Track students’ mean score will be at the 50th percentile for the whole exam, (b) BA students’ mean score will be at the 40th percentile for the whole exam, and (c) the mean scores for our specialized tracks (Cellular and Molecular Biology and Environmental/Natural History) on the relevant subsections of the exam will be at the 60th percentile.

The full results can be seen in Appendix 2A. Our “average student’s” total exam score ranked at the 58th percentile of all students taking the exam (45,260 students from 507 institutions), and this placed us at the 69th percentile when comparing our mean total exam score to those of the other institutions in the database. Our student mean score for the total examination was virtually the same as the mean score for all students from our Main Reference Group (157 compared to 156.8, 48th percentile), and a Z test indicated essentially no difference between our students’ scores and those of the reference population as a whole (P = 0.929). For comparison, Westmont’s 75th percentile SAT score (1310) places us at the 54th percentile in relation to other institutions in this group. Looking just at the BS General Track students’ results, their mean score was 159 (54th percentile in the Main Reference Group), while our BA students, with a mean score of 154, were at the 40th percentile. Thus, these groups met the first two benchmarks defined above. The MFT’s “assessment indicators” revealed the relative strengths and weaknesses of our program. Overall, our students did relatively well on questions dealing with animal biology and biochemistry (79th and 66th percentiles in our Main Reference Group) and relatively poorly on questions dealing with plant biology and ecology (16th and 29th percentiles in our Main Reference Group). Nevertheless, students in specialized tracks within the major have area scores that reflect the emphases of their studies. The Cellular/Molecular track students had mean subscores in Cell Biology, Molecular Biology and Genetics, and Organismal Biology that were well above the 50th percentile in the reference group, but their mean subscore for Population Biology, Evolution, and Ecology was at the 41st percentile.
Conversely, the Environmental/Natural History students had their highest score (55th percentile) for Population Biology, Evolution, and Ecology, but scored well below the 50th percentile in the other three areas. Therefore, while the BS Cellular/Molecular students met the benchmark of the 60th percentile for the subsections most relevant to their studies, the BS Environmental/Natural History students did not.
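The Z test quoted above compares our students’ mean score to the reference-population mean. As a rough sketch of that calculation in Python: the report does not state the reference group’s standard deviation, so the value of 12.7 below is a hypothetical placeholder used only to illustrate how the test works.

```python
import math

def one_sample_z_test(sample_mean, pop_mean, pop_sd, n):
    """Two-tailed one-sample Z test: does the sample mean
    differ from the reference-population mean?"""
    z = (sample_mean - pop_mean) / (pop_sd / math.sqrt(n))
    # Two-tailed p-value from the standard normal CDF,
    # computed with the stdlib error function.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Westmont mean (157) vs. Main Reference Group mean (156.8), n = 32 students.
# NOTE: 12.7 is a hypothetical standard deviation; the actual value
# is not reported here.
z, p = one_sample_z_test(157, 156.8, 12.7, 32)
```

With a tiny mean difference relative to the spread of scores, the p-value comes out far above any conventional significance threshold, which is consistent with the report’s conclusion of essentially no difference.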
Closing the loop: We are undertaking a review of our introductory core courses for our majors in the current academic year, with an eye toward possibly restructuring the way in which we introduce students to the study of Biology.

PLO#2: Experimental Investigation: Students will grow in their ability to carry out scientific investigation of living systems.
In 2011, we used a series of direct and indirect assessment strategies to examine four SLOs related to our research PLO. We analyzed end-of-semester research posters from Bio-005 to assess two SLOs: a.) students have an appropriate understanding of the experimental question and of the previous research published on the topic, and b.) students will apply appropriate principles of experimental design and data interpretation in their research projects. In groups of two or three, the introductory Bio-005 students develop a research question, plan the experiments, carry out the research, analyze the data, and present their findings in a poster-style format. From the Spring 2011 class, we analyzed all 34 research posters, and in doing so organized our findings into two general areas of competence: experimental design and background, and graphs and statistics. Two faculty members graded the posters (see Appendix 3B for the rubric used), compiled the data, and then presented them to the entire department for discussion. These data, reported as the number of student groups failing to meet the standard in each category (percentages could not be used here because not every category applies to all 34 research posters), are depicted in Appendix 2B.
Although there were more instances of failing in each category than we desired, two in particular stood out. First, students do not use the scientific literature appropriately in their research projects; specifically, they do not do sufficient background research before starting their project (“Use of Literature”). One obvious reason is that, as introductory students, they simply are not comfortable engaging the literature, even though one laboratory session is devoted to introducing them to its use. Another likely explanation is that, even with some previous engagement with the literature and a minimal level of comfort with reading papers, they do not know how to apply it specifically toward designing and carrying out a research project. Second, in interpreting their data, many students do not use the appropriate format to report their findings (“Proper graphs used”). For example, some students used a bar graph when an XY-style plot would have been appropriate, and of those who did use an XY-style plot, some failed to connect the data points. Our students need to understand that to interpret the meaning of their data successfully, they must convey it in a form that is accessible to the reader.

Closing the loop: We discussed two broad changes to the curriculum that will be implemented in response to these observations:

  1. The process of making a project proposal for Bio5 students was made more formal, in order to help students with experimental design. Specifically,

  1. The Bio5 instructors designed a form that students will complete during or after consulting with instructional staff; it draws their attention to things like appropriate controls, units of measurement, and variables subject to manipulation.

  2. The grading rubric was updated to be more explicit about how the scientific question should be developed.

  3. The Bio5 instructors added more emphasis to the proper methods for doing background literature searches so that students can better frame their research question.

  2. We revised a section of the Bio-005 lab manual that addresses statistical tests.

We used a student survey to assess two other SLOs: c.) students will develop competence with current research methods, tools, and techniques, and d.) students will conduct their research with enthusiasm and commitment. Eight of the nine Bio 198 and/or summer research students who worked during the summer of 2010 or the 2010-11 academic year completed a survey evaluating their research experience. The survey (supplied as Appendix 3B) was designed to capture the students’ perspective on their experience and their self-assessment of their abilities. For each question, students were asked to respond on a scale of 1 (strongly disagree) to 5 (strongly agree). The results are summarized in Appendix 2B. Our benchmark was an average value of 4.0 or above in each category.

Overall, we were pleased with students’ self-evaluation of their independent research experiences. We met our benchmark in all but one category and identified two areas for improvement. The first is a noticeable trend in the data: students rate their ability to actually do the lab work higher than they rate their ability to engage the theoretical or background components of their work (compare the last two categories in the figure in Appendix 2B, which evaluate lab bench competency [average value = 4.9], with the first four categories, which evaluate their understanding of and ability to articulate their research project [average value = 4.2]). Perhaps this is not surprising, as a majority of our students tend to view research in the lab as a series of techniques to master rather than a set of scientific questions to answer. Second, responses were relatively high in every category except one: students did not agree that part of their motivation for independent research was that it would result in published papers or research posters (the latter would entail presentations at research conferences). We interpret these data to mean that students are not engaging their work at what we feel is a critical level: connecting what they do at the bench or in the field to the broader scientific community. This further suggests that the summative work in which nearly every one of our research students engages – constructing and presenting a research poster at the Westmont Research Symposia that take place in the fall and spring – is considered by the students to be just another component of the research experience, rather than a capstone of their time in the lab or field.
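The benchmark check described above (an average rating of 4.0 or higher in every survey category) can be sketched as a simple comparison. The category names and the below-benchmark value of 3.4 are hypothetical placeholders for illustration; the 4.9 and 4.2 figures echo the averages quoted in the text, and the full category means appear in Appendix 2B.

```python
# Department benchmark: mean student rating (1-5 scale) of 4.0 or above
# in every survey category.
BENCHMARK = 4.0

# Hypothetical category means for illustration only.
category_means = {
    "articulate the research question": 4.2,
    "motivated by publications/posters": 3.4,
    "lab bench competency": 4.9,
}

def below_benchmark(means, benchmark=BENCHMARK):
    """Return the categories whose mean rating misses the benchmark."""
    return {category: m for category, m in means.items() if m < benchmark}

shortfalls = below_benchmark(category_means)
```

A check like this makes the single below-benchmark category (here, motivation tied to publications and posters) stand out immediately.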
Closing the loop: We would like to nurture our research students to think of themselves as contributing scholars in the field, rather than simply as “experiment doers,” and to understand the big picture of how their experimental findings contribute to answering larger research questions. Toward this end, we discussed requiring each summer research student (i.e., those who have ten weeks allotted to engage their research projects full-time) to:

  1. submit a written report on how the primary literature that serves as the background information for their own project helps to frame the research question.

  2. give a summative lab meeting-type presentation using their research poster as a visual aid, with their research peers and faculty mentor in the audience. In this presentation, the student’s faculty mentor will provide guidance on how to “tell a story” rather than link together a series of experiments.

PLO #3: Scientific Communication: Students will be able to present the findings and implications of scientific research through written research reports, oral presentations and scientific posters.
In 2012, we examined upper-division lab reports to assess the SLO: students write research reports, produce posters, and give oral presentations in appropriate format and style as instructed. Three of our upper-division biology classes (Bio-110, Microbiology; Bio-130, Cell Biology; and Bio-132, Molecular Biology) require an extensive paper that summarizes experimental work performed, stemming either from lab experiences planned by the instructor or from independent projects that students have developed on their own. From these selected courses, we analyzed nine research papers, three from each of the three courses, randomly selecting an “A,” a “B,” and a “C” paper from each. All department faculty members read all nine papers and scored elements of scientific format and style using a rubric (see Appendix 3C). Unbeknownst to us, two of the “C” papers from two different courses were actually written by the same student. We therefore omitted data from one of these “C” papers, reducing our student sample size to eight. Reviewers’ scores were then compiled and averaged. Our benchmark was that the group mean for each element would be 4.0 or higher. Results are shown in the graph in Appendix 2C.

We found that upper-division students do a good job writing in an appropriate scientific tone and style, as we met the target benchmark for each element in this category. With regard to adherence to a prescribed format, students showed some deficiencies in effective use of the three core sections of scientific reports – the Introduction, Results, and Discussion – as well as (for a few) in the writing of a paper’s Abstract. Specifically, we observed that students 1) need more help in understanding the rationale for doing what they are doing and in connecting their research question to previously published literature (Introduction), 2) need more help relating their own data to previously published literature (Discussion), 3) (some) need assistance in understanding the requirement for statistical analysis (Results), by reviewing and then clearly indicating which statistical tests would be most appropriate for the work being done, and 4) (some) need further guidance in choosing the most relevant literature sources (although significant progress has been made in discerning the difference between peer-reviewed literature and non-scholarly sources). Finally, a clearer understanding of the Introduction, Results, and Discussion sections should then improve the format of the Abstract (the only other section with a mean score below 4.0), since it summarizes the contents of all the other sections.

Closing the loop: We discussed two categories where work needs to be done:

  1. How can we help more of our students improve in their ability to write scientifically, especially with regards to format?

    1. The Biology faculty met in May 2014 to review the lab report guidelines for Bio-114: Genetics, the writing-intensive course that all our majors are required to take. Our main observations and reflections were:

        1. Overall, descriptions of the section are very clear.

        2. We noted sub-discipline distinctions in format but all of the faculty were comfortable with this format in Genetics.

        3. We decided we needed to add a clear statement of using parentheses for reference to tables/figures and statistics.

        4. We noted that Chemistry requires an “Error Analysis” section which biology students feel compelled to include regardless of whether data suggest there might be any errors.

The Bio-114 instructor made changes based on this feedback.

    2. We also agreed to spend departmental money to pay academically strong, upper-division biology students to serve as departmental tutors during the second semester of this academic year. In Spring 2013 and Spring 2014, we required the Genetics lab TAs to hold 4 hours of open tutoring/help sessions each week to assist the lower-division students in our core courses (Bio-005 and Bio-114). While they helped with general comprehension of lecture material, the student tutors also assisted Bio-005 students with correct poster formatting, utilizing appropriate literature resources, and using the correct graph format (see assessment results for PLO #2), and provided Bio-114 students with needed guidance on writing lab reports. Anecdotal evidence suggests the tutors have been useful.

  2. How can we improve our strategy in assessing this particular PLO?

    1. We need to modify the way we select student papers to read, considering a larger, completely random sample of upper-division senior papers. We will rethink this strategy before our next round of assessing this PLO.

