Bush Faculty Development Committee
Sherry Barkley, Karen Dorn, Margot Nelson, Susan Schrader, and Glenda Sehested attended the November 16-17 Collaboration Conference "Making Assessment Meaningful: Practical Approaches to Documenting and Using Evidence for Student Learning" in Bloomington, MN. Here are some reflections by Sue, Margot, Karen, and Sherry. Also see pp. 5-6 for some thoughts from the incoming members of the Bush Committee.
By Susan Schrader
The Friday keynote address was delivered by Barbara Cambridge, VP for Programs at the American Association for Higher Education. She began by asking the audience, "How many of you are interested in making a difference in the classroom?" She described assessment in broad strokes as a way to find out if we actually are making a difference. In her address, she differentiated among teaching, scholarly teaching (where a teacher engages others and the literature on how students might learn better, with a goal of improving student learning), and the scholarship of teaching and learning (where a faculty member studies the pedagogy scientifically, uses methods to evaluate teaching strategies and student learning, and then disseminates those findings to the broader community).
Cambridge then asked how institutions and departments might evaluate the extent to which they have a culture that supports, affirms, and rewards the scholarship of teaching and learning. She answered her question by highlighting ten areas in a campus setting to map progress on assessment and improved student learning: (1) mission, infrastructure, integration, (2) participation on the campus, (3) campus support, including time and money, (4) faculty selection and development, (5) faculty evaluation, (6) collaboration internally and among other institutions, (7) uses of technology, (8) initiatives that haven't worked yet, (9) promising signs of progress, and (10) opportunities yet untapped. Cambridge gave illustrations of institutions across the nation and what they are doing to foster a culture of assessment. In closing, Cambridge argued that being interested in the scholarship of teaching and learning improves student learning, provides evidence of accountability, more clearly defines one's discipline, and ultimately is very satisfying.
Using Classroom Assessment Techniques to Improve Online Teaching and Learning
By Margot Nelson
This session was grounded in Angelo and Cross's work Classroom Assessment Techniques: A Handbook for College Teachers, 2nd edition, 1993. Dr. Kristi Pearce reviewed the dual purposes of classroom assessment and differentiated classroom assessment (process evaluation) from outcomes assessment. The dual purposes are to ascertain how well students are learning and how effectively faculty are teaching. Classroom assessment techniques (CATs) focus on HOW, not WHAT, students are learning. A basic assumption is that the quality of student learning is directly, although not exclusively, related to the quality of teaching.
Pearce shared several resources, besides the book by Angelo and Cross, for CAT tools and ideas. One is MERLOT–funny, I always thought that was a type of wine. In this case, it is a Multimedia Educational Resource for Learning and Online Teaching. Their website contains teaching materials and commentary by faculty users, sample assignments, and opportunities for dialogue. Visit them at www.merlot.org. Two other sites were also intriguing: Classroom Assessment Techniques (http://www.siue.edu/~deder/assess/catmain.html) and Field-tested Learning Assessment Guide for teachers of science (http://www.wcer.wisc.edu/cl1/flag/). The website for Classroom Assessment Techniques would be worth visiting for both a review of principles and specific strategies such as having students write about "The Muddiest Point" in a given class or "The Minute Paper."
The Winona Assessment Project
By Margot Nelson
This session consisted of a presentation by Susan Hatfield, Theresa Waterbury, and Phil Jirsa, faculty and information technologists at Winona State University. They have received funding from the US Department of Education for the development of an integrated database system focused on assessment, both cross-sectional and longitudinal. There are six databases in the system: student records, self-report (both by faculty and students), financial aid, admissions, placement, and learning outcome data. Periodic collection of data from students (through web-based forms) has allowed for the development of a student "risk profile" (Who are the students who fail to successfully complete their program of study?) as well as a "success profile" (What student behaviors and characteristics predict high grades and program completion?).
Students are asked, at various times in their program, to submit data about themselves, their evaluation of the campus climate, satisfaction, activities, evaluation of the quality of instruction, social behaviors, and skill development. Each year all students participate in an Assessment Day in February, when they submit by computer the information requested of them, as determined by their status.
Faculty data are being collected and analyzed through online storage of vitae and inventories of course strategies being used. The latter database is still in development, but its intent is that teaching approaches can be evaluated according to standards and also equated with student outcomes. It is a very industrious project and one that offers ideas for systematizing institutional assessment.
More than Merely "Pretty to Think So": Assessment Data and Meaningful Change
By Susan Schrader
This session, presented by English faculty at Winona State, set out to demonstrate that assessment can be used for change (specifically to improve student learning). The topics for table conversation in this session were well crafted. For example, small groups wrangled with issues of (a) resources, (b) challenges/liabilities, and (c) opportunities/rewards related to assessment. Some of the opportunities/rewards identified included (i) enabling students to translate the liberal arts to themselves and future employers so they better understand what skills they have in hand, (ii) measured and intentional creation of a seamless curriculum, (iii) a campus culture that supports cross-campus dialogue about assessment and ways to improve student learning, (iv) less scrambling prior to accreditation visits, and (v) improved student learning.
Presenters Eddy and Johnson walked through the model used in their department at Winona State and described the measures they used (portfolio combined with surveys, interviews, anecdotal evidence, and institutionally generated data). They made it clear that assessing the data took time and careful interpretation; for example, they felt that who interpreted the portfolio and through what lens was especially important. They left the small groups with five scenarios that generated lively conversation. This was a very well-done session that fostered some useful understandings of the benefits of assessment as well as some of the actual measures that could be used to assess the extent to which a department improves student learning.
Evaluating Faculty Development Programs
By Karen Dorn
At the faculty development coordinators breakfast, Collaboration representatives presented a model for use in evaluating faculty development programs. The Logic Model, developed by Michael Patton (1997) and described in the book Utilization-Focused Evaluation, third edition, is based on Bennet's (1979) theory of action. The model is useful for doing short-term evaluation for results that may not be observed for several years, as occurs in education and human service areas. In this model, for each phase or event, there is matching evidence that can be used for evaluation. While the program chain of events may occur in sequence, the matching evaluation criteria reflect increasingly complex levels of evidence that are used for evaluation. The challenge to each of us in evaluating our faculty development activities and their impact on student learning is to move up the hierarchy with respect to the levels of evidence. Parallels to classroom teaching-learning activities and assessment of student learning can be made. The table below illustrates the model and hierarchy of evaluation criteria.
SOURCE: Adapted from Bennet 1979
REFLECTIONS OF A FIRST-TIME PARTICIPANT
By Sherry Barkley
I have attended many professional conferences over the years, but this was my first Collaboration Conference. The conference theme was appealing, and I was hoping to collect lots of information that I could put to use in my classes. However, I wasn't quite sure what to expect...and I was a little bit nervous! I was reassured at the preconference session when I discovered three things (that I've also noticed at other conferences I've attended): (1) Based on what the "experts" were telling me, I was already doing some things "right"; (2) Some of the issues that I struggle with are the same things that other professors, even those with more experience, also struggle with; and (3) There were a couple of "pearls" of information that I could take home and put into practice.
At the preconference workshop entitled "Designing and Evaluating Writing Assignments," one point that hit home with me had to do with the teacher's response to student writing. Marion Hogan Larson reminded us to pay attention to the emphasis of our written comments on student papers. If we tend to emphasize mechanics, and 80% of the comments are about grammar and punctuation, that is what the students will think they need to fix. Ms. Larson also suggested that it's a good idea to skim the whole paper before making any comments, and to focus comments on work that can still be revised. Since I tend to start reading writing assignments with pencil in hand, ready to circle every misspelled word and out-of-place comma, a "guilty" sign flashed in my mind. I resolved to spend more time commenting on the content and organization of my students' papers, putting an "X" in the margin if there is an error in punctuation or spelling, and letting the students do their own proofreading.
The concurrent sessions on Thursday afternoon and Friday morning offered a variety of topics to choose from. Again, I could usually find one or two pearls of useful information to bring home and apply in my classes. The two sessions I most enjoyed were entitled "Assessing Learning in Internships: Learning Agreements, Journals, and Portfolios" and "The Development of Rubrics as Assessment Tools to Facilitate Feedback and Enhance Learning."
Overall, the conference was a positive experience for me. My challenge was to take the time to incorporate some of these new ideas into my own teaching and assessment of student learning.
FROM THE INCOMING MEMBERS OF THE BUSH COMMITTEE
As a former recipient of Bush Grant largesse, I am excited by the opportunity to serve on the Bush Committee. The resources provided by the Bush Foundation play a crucial role in promoting and sustaining faculty development programs at Augustana. And I think this is a particularly invigorating time to be part of this activity. In the midst of a renewal cycle of Augustana's grant, we are in a position to continue programs that have proven fruitful in the past and to look to new and exciting possibilities for the future.
Greetings from the Bush Steering Committee. I am excited about the work of this committee and the opportunities our Bush funding provides for faculty development relative to student learning. It has been a pleasure working with the outgoing committee members, Gary Earl, Susan Schrader, and Richard Swanson. The incoming committee greatly appreciates the transition time we had together as we wrote the renewal grant. A big thanks for all your work these past few years! It is also a pleasure working with Karin Lindell, who keeps us all faithfully on task. Please let us know how we might better serve you in the coming years.
I am excited about serving on the Bush Committee and helping to support projects that the college might not otherwise be able to fund, especially those promoting collaboration and interdisciplinarity. The opportunity Bush Grants provide at Augustana for "opening windows" (you remember that vector, right?!), academically and otherwise, seems especially important as we prepare students to go out and serve in a diverse world. Working with faculty and learning what they are doing creatively in their teaching and scholarship will be fun as well as enriching. I suspect I'll get a lot more out of the experience of serving on the Bush Committee than I give. One thing is certain: our predecessors on the committee--Gary Earl, Sue Schrader, and Dick Swanson--have certainly been enthusiastic about its work!
* * * * * * * * * * * * *
Coming Soon to the Faculty Resource Collection
2000 Tips for Lecturers, Phil Race, ed. 1999
The Departmental Guide and Record Book for Student Outcomes Assessment and Institutional Effectiveness, James O. Nichols and Karen W. Nichols, Third Edition, 2000
Mark your calendars for the February 14-15 Collaboration Conference "Values, Citizenship, and Community: Preparing Students for Leadership in a Democratic Society." The Bush Grant will cover conference fees and hotel accommodations and provide for transportation. If you need a registration form or have any questions, please contact Karin Lindell at ext. 4808 or e-mail klindell.