Out-of-School Time Resource Center
Conference Evaluation Toolkit
Reference Guide
March 2006
The contents of this document were created by the Out-of-School Time Resource Center at the
University of Pennsylvania and may not be reproduced without permission.
Copyright 2006 The Trustees of the University of Pennsylvania. All rights reserved.
Authors:
Jennifer Buher-Kane, Dr. Susan Kinnevy and Nancy Peter
Editors:
Suzanne Bouffard, Dr. Cheri Fancsali, John Price,
Thomas Schaeffer and Nyeema Watson
The Out-of-School Time Resource Center is supported by a grant from the
William Penn Foundation
Table of Contents
Introduction
  Overview of the Evaluation Toolkit
  Key Definitions
  Common Questions
Conference Evaluation Toolkit Surveys
  Theoretical Basis
  Survey Descriptions
  Survey Description Summary Chart
  Survey Combination Options
Elements of a Successful Evaluation
  Planning
  Implementation
  Analysis
Getting Started
  Free "Start Up" Consultation with the OSTRC
  Fee-Based Services
    Training
    On-site Services
    Analysis
  Suggested Levels of Evaluation
Terms of Use
Appendix A: Development of the OSTRC Toolkit
Appendix B: Additional Resources
Introduction
Overview of the Evaluation Toolkit
This Evaluation Toolkit is designed for those persons who develop, provide, and/or
evaluate OST conferences. By using this Toolkit and contributing to a growing database
maintained by the OSTRC, organizations will help provide the OST community with up-to-date,
research-based information to effectively evaluate professional development offerings.
This guide will:
• Introduce Toolkit Surveys and how they work
• Discuss how to use the Toolkit for successful evaluation
• Introduce the OSTRC's "Promising Practices for Out-of-School Time Professional Development Conferences" and other useful templates for planning your conference evaluation
• Discuss various ways in which organizations can collaborate with the OSTRC
Key Definitions
The OSTRC uses the following key definitions for the purpose of this Reference Guide:
Conference Evaluation: "A process involving planning, implementing survey instruments, and analyzing survey data."

Professional Development: "Workshops, conferences, technical assistance, resource centers, peer mentoring, electronic listservs, and other supports designed to promote improvement, enrichment, and achievement in OST staff, programs, and youth."
Common Questions
What is in this Toolkit?
This Toolkit includes research-based evaluation instruments that organizations can use to
evaluate OST conferences.
Included in this Toolkit are the following Surveys:
• Pre Workshop Survey
• Post Workshop Survey
• Comprehensive Post Workshop Survey
• Follow-Up Workshop Survey
• Overall Conference Survey
• Comprehensive Conference Survey
• Presenter Self-Assessment Survey
Why did the OSTRC develop this Toolkit?
The OSTRC developed this Toolkit because comparable research-based evaluation instruments did not exist.
How was this Toolkit developed by the OSTRC?
The OSTRC performed a two-year study, pilot-testing these surveys at various OST
conferences, which yielded 4,733 survey responses.
Why should organizations use this Toolkit?
The enclosed research-based evaluation tools help measure the impact of professional
development on staff, programs, and students. OST organizations are increasingly responsible
for providing this evidence.
Can this Toolkit be used to evaluate workshops other than in conference settings?
The OSTRC does not recommend using these surveys outside of conference settings.
First, the surveys were pilot-tested, and are therefore only reliable, in conference settings.
Second, there are differences between stand-alone workshops and conference workshops that
are not addressed by these surveys. The OSTRC is currently pilot-testing other surveys that are
being designed to evaluate workshops outside of conferences.
Is the Evaluation Toolkit free?
The following items and services are free to all who use the OSTRC Conference Evaluation
Toolkit:
• A one-hour "Start Up" consultation with the OSTRC
• Access to the surveys
• Access to an online PowerPoint® presentation describing the processes of planning, implementing, and analyzing conference evaluations
• Access to a blank SPSS database (used if an organization is licensed to use this statistical software)
The following services can be purchased from the OSTRC for a nominal fee:
• Training on how to use the surveys by an OSTRC staff member
• Onsite facilitation of your conference evaluation
• Analysis of your survey data
Is the OSTRC interested in data collected by the surveys?
Organizations are expected to share their survey data with the OSTRC. These data will serve primarily to create a data bank that tracks survey use over time on a national level and, secondarily, to inform survey revisions.
For more information, please contact Jennifer Buher-Kane, OSTRC Senior Research Coordinator
at (215) 898-2505 or jbuher@sp2.upenn.edu.
Conference Evaluation Toolkit Surveys
Theoretical Basis
Each of the surveys in this Toolkit incorporates the theoretical principles of evaluating
professional development. Formal education researchers Donald Kirkpatrick, Thomas Guskey,
and Joellen Killion each presented models built on "levels" of evaluation, which this Toolkit
consolidates into five:
1) Participant Satisfaction
2) Participant Learning (knowledge, skills, attitude)
3) Organizational Support/Integration
4) Application of New Knowledge/Skills
5) Student/Youth Outcomes
For the purpose of adapting this construct to out-of-school time, a sixth level was introduced by
Nancy Peter, Director of the OSTRC.
6) Extension of New Knowledge/Skills (e.g., adaptation to other programs and
sharing information with colleagues)
For more information on the development of the Toolkit, including citations, see Appendix A.
Survey Descriptions
A. Pre Workshop Survey
The Pre Workshop Survey gathers three types of information:
1) Baseline data regarding participants' level of knowledge, skills, and belief in the
importance of the workshop topic in terms of benefiting program youth
2) Whether or not the participants have previously attended specific workshops on this topic, and
3) Why the participants chose to attend these workshops (e.g., mandatory versus elective
attendance).
B. Post Workshop Survey
The Post Workshop Survey gathers a different set of data, including:
1) Comparative responses regarding participants' level of knowledge, skills, and belief in
the importance of the workshop topic in terms of benefiting program youth
2) Perceptions of various components of the workshops (relevance of information presented,
quality of presentation, effectiveness of activities, etc.), and
3) A second round of baseline data regarding participants' expectations of how successfully
they will apply the new information, covering elements such as likelihood of application,
organizational support, sharing information with others, and whether or not program
youth will benefit from the new information. These questions are asked again in the
Follow-Up Survey in order to ascertain intended versus actual application of new
knowledge.
C. Comprehensive Post Workshop Survey
The Comprehensive Post Workshop Survey combines elements of both the Pre and Post
Workshop Surveys into one format that is distributed to participants after workshops. As such,
this Survey does not gather separate before-and-after measures of participants' level of
knowledge, skills, and belief pertaining to the importance of the workshop topic or its potential
benefit to program youth. Rather, participants are asked to rate their own level of increase in
each of these areas in one consolidated Survey. In all other respects, this Survey is similar to the
Post Workshop Survey (see above).
D. Follow-Up Workshop Survey
The Follow-Up Workshop Survey gathers information about the actual process of
applying new information. It is a web-based survey emailed to participants one month after they
attend the conference. It provides insight into whether new information was used, what barriers
participants encountered in the process, and what support systems were utilized to apply new
knowledge/skills.
E. Overall Conference Survey
The Overall Conference Survey is used in conjunction with individual workshop Surveys.
It gathers complementary information regarding the conference, reactions to keynote speakers
and exhibitors, and any elements unique to the conference as a whole. This survey's results
provide feedback to conference planners for use in improving future efforts with the same
population (such as an annual conference for OST providers).
F. Comprehensive Conference Survey
The Comprehensive Conference Survey combines elements of the Overall Conference
Survey with the Comprehensive Post Workshop Survey. If an organization decides to gather
only minimal information about both the overall conference and individual workshops,
participants can fill out this single Survey.
G. Presenter Self-Assessment Survey
The Presenter Self-Assessment Survey should be used in all conference evaluations.
Similar questions are posed to both workshop participants and presenters, which allows the
analyst to compare their perceptions of the workshop. Presenters are also asked to comment on
the overall conference, the support they received from the conference committee(s), the
conference organization, and any "surprises" they encountered in their workshops. Examples of
the latter include logistical issues, a different audience size than anticipated, and a different level
of participant knowledge and/or experience than expected.
Survey Description Summary Chart
Below is a summary of the data that can be gathered through each of the Surveys, labeled
A through G:

A. Pre Workshop Survey
B. Post Workshop Survey
C. Comprehensive Post Workshop Survey
D. Web-based Follow-Up Survey
E. Overall Conference Survey
F. Comprehensive Conference Survey
G. Presenter Self-Assessment Survey

Data Domains/Categories and the number of Surveys that gather each:
• Demographics: all seven Surveys
• Satisfaction: five Surveys
• Content: five Surveys
• Logistics: three Surveys
• Knowledge, Skill, & Belief: five Surveys
• Expectations of Workshop: four Surveys
• Organizational Support: three Surveys
• Application: three Surveys
• Youth Outcomes: three Surveys
• Extension: three Surveys
• Workshop Activities: three Surveys
• Presentation Skills: four Surveys
Survey Combination Options
Given that specific conferences have specific needs, this Toolkit was designed to be
flexible. Below are the options for combining Toolkit Surveys:
Combination Option 1:
• Pre Workshop Survey
• Post Workshop Survey
• Follow-Up Workshop Survey
• Presenter Self-Assessment Survey
• Overall Conference Survey

Combination Option 2:
• Comprehensive Post Workshop Survey
• Follow-Up Workshop Survey
• Presenter Self-Assessment Survey
• Overall Conference Survey

Combination Option 3:
• Comprehensive Conference Survey
• Follow-Up Workshop Survey
• Presenter Self-Assessment Survey
The following explores the options and the advantages and disadvantages of each.
Combination Option 1
This option uses a pre/post-test survey design. Participants are given a Pre Workshop
Survey before the sessions begin and a Post Workshop Survey upon completion of the
professional development offering. Also, participants receive a web-based Follow-Up Survey
via email one month after the conference.
The Overall Conference Survey is distributed to all conference participants during or at
the end of the conference. This Survey gathers specific information regarding the conference as a
whole, including the logistics of registration, the overall program, food and facilities, etc. This is
most effective when used in conjunction with workshop surveys.
The Presenter Self-Assessment Survey compares the perceptions of participants and
presenters, offering a unique perspective on how these two sets of perceptions differ or agree.
Typically, the results are shared with presenters to assist in their own professional development.
This Survey also allows presenters to give feedback to conference planners in areas such as
logistical issues and problem audiences.
Advantages
Option 1 provides a comprehensive overview of a conference and gathers specific
information in several areas that will inform future planning efforts.
Disadvantage
One downside of using a pre/post-test format is the need to individually match
participants' pre- and post-test surveys. All Surveys include a "unique identifier": participants
are asked to provide a combination of letters and numbers through which to match pre- and
post-test answers while maintaining data confidentiality. Participants may not always provide
this information, which prevents comparison between their pre and post surveys and thus
decreases the total number of matched surveys collected.
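To make the matching step concrete, here is a minimal sketch in Python (pandas), assuming hypothetical field names (unique_id, knowledge_rating) rather than the Toolkit's actual survey fields:

```python
import pandas as pd

# Hypothetical pre- and post-workshop records keyed on the self-reported
# "unique identifier"; one pre-survey respondent omitted the identifier.
pre = pd.DataFrame({
    "unique_id": ["AB12", "CD34", "EF56", None],
    "knowledge_rating": [2, 3, 1, 4],
})
post = pd.DataFrame({
    "unique_id": ["AB12", "CD34", "XY99"],
    "knowledge_rating": [4, 4, 5],
})

# An inner join keeps only respondents who supplied the same identifier on
# both surveys; missing or mistyped identifiers shrink the matched sample.
matched = pre.merge(post, on="unique_id", suffixes=("_pre", "_post"))
print(f"{len(matched)} of {len(pre)} pre-surveys matched a post-survey")
```

In this toy example, only two of the four pre-surveys can be compared, which is exactly the drop-off described above.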
Combination Option 2
The second option is identical to the first, except for its use of a posttest-only survey
design to gather workshop information. Participants are only given one Survey, the
Comprehensive Post Workshop Survey, which combines elements of both Pre and Post
Workshop Survey questions.
Advantages
Option 2 allows participants to complete only one survey per workshop. This is less
time-consuming and may elicit more participation in the evaluation as a whole. Also, the
posttest-only survey design does not require the analyst to match pre and post surveys, and may
thereby yield a higher response rate.
Disadvantages
There is disagreement among researchers as to which method is more accurate: a
pre/post-test or a posttest-only design. For more information on this literature, see the citations
in Appendix B.
Combination Option 3
The third option is the least intensive. The elements of the Overall Conference Survey
and the Comprehensive Post Workshop Survey are combined into one document, the
Comprehensive Conference Survey, which is administered to participants at the end of the
conference. This option still includes both the Follow-Up Workshop Survey and the Presenter
Self-Assessment Survey.
Advantages
Option 3 uses the fewest surveys and has the potential to yield the highest overall
response rate.
Disadvantages
This option gathers only limited information about the workshop sessions and the
conference as a whole, and does not provide as much specific feedback as the other options.
Elements of a Successful Evaluation
As previously stated, the OSTRC describes conference evaluation as including three
phases: planning, implementation, and analysis. Each phase contributes to an effective
evaluation.
Planning
When planning a conference evaluation, it is important to first determine the desired end
result(s). Once an organization determines what information it wishes to gather, it can proceed
to choosing an effective evaluation process.
Implementation
When implementing a conference evaluation, it is crucial to clearly communicate the
importance of the evaluation to all involved. Both participants and presenters respond best to
this evaluation process when the assessment tool's purpose and importance are highlighted in
pre-conference and on-site literature. The OSTRC has developed customizable templates
through which an organization can communicate this message to its audience. Emphasizing that
all survey responses will remain confidential and will be used to help plan future conferences is
an essential message, worth repeating.
To implement this evaluation process, organizations should determine a survey
distribution and collection system well in advance. Assigning an individual "room monitor" or
facilitator to each workshop provides a systematic and effective way of disseminating and
collecting workshop surveys.
Analysis
The surveys in this Toolkit provide a wealth of information for analysis. In general, this
series of survey instruments utilizes a rich mix of close-ended (quantitative) and open-ended
(qualitative) questions. Three types of analysis can be performed on these survey data (see the
sketch after this list):
1) Basic Analysis (which determines "frequencies": how often participants give each
response to specific questions)
2) Intermediate Analysis (which draws upon the unique design of the surveys and
allows several comparisons to be made)
3) Advanced Analysis (which addresses significance testing)
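As an illustration only, here is a minimal sketch of these three analysis levels in Python (pandas and SciPy), using a hypothetical export of matched survey responses; the column names are stand-ins, not the Toolkit's actual field names. An organization analyzing in SPSS would run the equivalent Frequencies, Descriptives, and Paired-Samples T Test procedures.

```python
import pandas as pd
from scipy import stats

# Hypothetical matched survey responses on 1-5 rating scales.
surveys = pd.DataFrame({
    "satisfaction": [4, 5, 3, 4, 5, 4],
    "knowledge_pre": [2, 3, 1, 2, 3, 2],
    "knowledge_post": [4, 4, 3, 3, 5, 4],
})

# 1) Basic: frequencies of each response to a question.
print(surveys["satisfaction"].value_counts().sort_index())

# 2) Intermediate: compare mean self-rated knowledge before and after.
print(surveys[["knowledge_pre", "knowledge_post"]].mean())

# 3) Advanced: paired t-test for a statistically significant change.
t_stat, p_value = stats.ttest_rel(surveys["knowledge_pre"],
                                  surveys["knowledge_post"])
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```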
Through the OSTRC's research, the "Promising Practices for Out-of-School Time
Conference Evaluation Checklist" below was developed to assist your organization in
successfully evaluating its conference.
OSTRC's Promising Practices for Out-of-School Time
Conference Evaluation Checklist

Planning
• First decide on the end result (i.e., what should be learned).
• Decide what data must be gathered to meet long-term goals; then pick evaluation instruments (and combinations of instruments) accordingly.
• Plan the evaluation well in advance; don't let it be an afterthought.
• Discuss the evaluation, early and often, with key stakeholders such as conference co-sponsors and conference planning committee members.
• Facilitate open discussions with key stakeholders regarding the benefits of performing a successful evaluation: providing essential information to multiple parties and improving future conferences.
• When soliciting workshop proposals and/or specific presentations, introduce the evaluation to prospective presenters.
• When disseminating conference advertisements and registration materials, introduce the evaluation to prospective participants.
• Create written materials (based on OSTRC templates in Appendix A) to communicate the importance of the evaluation process to stakeholders, participants, presenters, and conference volunteers.

Implementation
• Focus on garnering buy-in during the conference by clearly communicating the evaluation purpose and process to participants, presenters, and conference volunteers.
• Copy each type of survey (e.g., Pre Workshop vs. Post Workshop Surveys) on different color paper.
• Distribute prepared written materials to participants, presenters, and conference volunteers.
• Assign conference volunteers as "room monitors" in each workshop, who will be responsible for distributing and collecting surveys.

Analysis
• Enlist qualified professionals to analyze and interpret survey results for key stakeholders. Beware of claiming causal relationships when interpreting results.
• Use the data to identify and focus on specific trends, questions, concerns, etc.
• Acknowledge limitations of survey data in all written reports.
Getting Started
The OSTRC wants to ensure that these survey instruments are used effectively. We
encourage organizations to contact us with any questions that arise before, during, or after the
evaluation process.
As previously discussed, the following items and services are free to all who use the
Evaluation Toolkit:
• A one-hour "Start Up" consultation with the OSTRC
• Access to the surveys
• Access to an additional "Implementation Guide" (including an online PowerPoint® presentation describing the processes of planning, implementing, and analyzing conference evaluations; tips for successful evaluations; and templates that your organization can customize and use)
• Access to a blank SPSS database (used if an organization is licensed to use this software)
The following services can be purchased from the OSTRC at a nominal fee:
• Training by an OSTRC staff member
• Onsite facilitation of your conference evaluation
• Analysis of your survey data
Free "Start Up" Consultation with the OSTRC

After reviewing the information on this website, you may have some questions. The first
step is to call the OSTRC for more information. During your "Start Up" Consultation, OSTRC
staff will discuss how this Toolkit can help you facilitate an effective conference evaluation, as
well as its terms of use. You will decide which of the three survey package options best suits
your needs. After your consultation, you will be given access to the surveys of your choice, an
online PowerPoint® presentation providing helpful tips, a blank SPSS database (if your
organization is licensed to use this software), and templates that your organization can customize
to facilitate the implementation process.
Fee-Based Services
OSTRC staff will also discuss ways in which you can utilize the OSTRC's fee-based
services to further assist with your conference evaluation. The design of this Toolkit is flexible:
your organization can choose from a range of options, from simply using our surveys for free to
hosting an OSTRC staff member to train your staff, facilitate the implementation of the
evaluation at your conference, and perform the analysis.
Training
In some circumstances, training may be helpful to familiarize organizations with the
details of the procedures. Organizations can access an online PowerPoint® presentation that
describes how to use these surveys appropriately and where to go for help.
Organizations can also choose to have OSTRC staff personally train their staff onsite regarding
various aspects of the process, including planning, implementing, and analyzing data.
On-site Services
The OSTRC provides two onsite services. First, the OSTRC can facilitate the evaluation
of your conference on-site. This would include OSTRC evaluation staff working with your
organization's conference planners and other key personnel to coordinate a successful process of
distributing and collecting the conference evaluation surveys. As participants and presenters
often have questions regarding the evaluation, it is useful to have evaluation staff circulating
among the conference sessions, answering questions and troubleshooting problems as they arise.
Second, the OSTRC can observe a sample of conference workshops and provide your
organization with a qualitative assessment of the observations.
Your organization can choose to purchase either of these services separately, or both.
Analysis
The OSTRC recommends that organizations either enlist qualified staff members or work
with the OSTRC to perform the analysis. OSTRC staff perform three types of analysis on these
survey data: basic, intermediate, and advanced.
Suggested Levels of Evaluation
Below are a few examples of various levels of involvement that your organization may choose.
Keep in mind that these are only suggestions; your organization may select a unique combination
of these elements to best suit its needs.

Each fee-based service is priced according to an hourly rate. The exact charges will be
negotiated on a case-by-case basis, depending on the number of hours required for each service
and which level of evaluation is requested.
Level 1: "Start-Up" Consultation, Access to Surveys, Access to the additional Implementation Guide, and Access to the SPSS database (all free)

Level 2: All Level 1 items, plus Training ($225-$450)

Level 3: All Level 1 items, plus On-site Services* (Facilitation: $525-$2,625; Observation: $525-$3,675; Both: $1,050-$6,300)

Level 4: All Level 1 items, plus Analysis ($150-$5,625)

Level 5: All Level 1 items, plus Training, On-site Services*, and Analysis

Individual prices will be negotiated based on individual requests.

* Travel expenses are negotiated separately.
Terms of Use
The goal of this Toolkit is to facilitate the use of research-based survey instruments
within OST professional development conferences. The OSTRC performed an extensive pilot
study that resulted in the current design of these survey instruments, and is extremely invested in
their success.
We request that organizations share data gathered through the use of these survey
instruments with the OSTRC. Doing so will facilitate the creation of a data bank to track the use
of these surveys over time, on a national level. These surveys will undoubtedly undergo future
revisions, and the shared data will inform those revisions, helping to create products that are
useful, efficient, and effective in evaluating OST professional development conferences.
Organizations will receive a "Data Sharing Agreement" to be co-signed by the
organization and the OSTRC at the University of Pennsylvania. This agreement requests that
organizations share their data with the OSTRC by providing survey copies or originals.

Upon returning a signed copy of the "Data Sharing Agreement" to the OSTRC Senior
Research Coordinator, organizations can schedule a one-hour "Start-Up" consultation session in
which specific requests will be discussed and a process for performing the training,
implementation, and analysis will be developed. The OSTRC will then provide the surveys
requested for the specific conference evaluation.
In the event that an organization elects to perform its own analysis, the OSTRC will
provide a blank copy of its SPSS database. The OSTRC does not recommend that organizations
unfamiliar with this software perform their own data entry or analysis.
Appendix A: Development of the OSTRC Toolkit
In the summer of 2004, the OSTRC attempted to locate research-based survey
instruments used to evaluate OST professional development. After an extensive literature review
and conversations with key stakeholders, it was determined that these instruments did not exist.
The OSTRC then implemented a mixed-method pilot study to design and test multiple survey
instruments for use in OST workshop and conference settings.
As part of the planning process, the OSTRC reviewed literature relating to effectively
implementing professional development [1], including several models of professional
development evaluation from researchers Guskey, Killion, Kirkpatrick, and others [2]. These
models have elements in common, as each defines "levels" of evaluating professional
development (participants' satisfaction, learning, application, and results).
However, the above-mentioned models were designed for formal education and relate
only peripherally to OST programs. In creating an OST model, the OSTRC added another level
of evaluating professional development: extension, referring to adapting knowledge to suit a
particular program and/or sharing this knowledge with others such as OST staff, programs, or
youth [3].
Using the theoretical frameworks described in the literature, the OSTRC developed a
series of survey instruments that utilize a pre/post-test design and measure indicators of
knowledge, skills, and attitudes as well as intended vs. actual application. The surveys
incorporate both close-ended (quantitative) and open-ended (qualitative) questions.
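As an illustration of the intended-versus-actual comparison, here is a minimal sketch in Python (pandas) with hypothetical column names; it assumes the post-workshop and follow-up responses have already been matched on the unique identifier:

```python
import pandas as pd

# Hypothetical matched responses: intention reported on the Post Workshop
# Survey, behavior reported on the Follow-Up Survey one month later.
matched = pd.DataFrame({
    "intends_to_apply": ["yes", "yes", "no", "yes", "no"],
    "actually_applied": ["yes", "no", "no", "yes", "yes"],
})

# A cross-tabulation shows how often stated intentions matched behavior.
print(pd.crosstab(matched["intends_to_apply"], matched["actually_applied"]))
```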
Research Questions
Initially, the OSTRC established several research questions:
1. In what ways is an OST conference setting typically a valuable professional development
experience for its participants and in what ways is it limited?
2. What steps are involved in the process of learning and applying knowledge gained in an
OST conference workshop?
3. Which types of workshops contribute to a substantial change in participants' content
knowledge, skills, and/or belief in the importance of the topic in terms of benefiting
program youth? And to what extent do participants apply new knowledge/skills or
experience a change in belief one month after the conference?
[1] National Staff Development Council (1995). National Staff Development Council's Standards for Staff Development, http://www.nsdc.org/standards/about/index.cfm; Gardner, H. (1983). Frames of Mind: The Theory of Multiple Intelligences. New York: Basic Books; PA Pathways (2002). PA Pathways Training Resource Manual: Guidelines for Training Organizations, http://www.papathways.org/PDFs/TRM_AdLearn.pdf.
[2] Guskey, T. (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press; Killion, J. (2002). Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council; Kirkpatrick, D.L. (1994). Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler.
[3] Peter, N. (2004). Out-of-School Time (OST) Professional Development Workshops: An Evaluation Framework, http://www.sp2.upenn.edu/ostrc/pdf/OSTWorkshopEvaluation.pdf.
4. Is there typically a difference in how presenters and participants view the same workshop
and if so, what types of differences occur?
Pilot study
The OSTRC performed two major pilot tests of the survey instruments. The first
consisted of one local conference, while the second consisted of two conferences: one statewide
and one regional [4]. Between the first and second pilots, the OSTRC also conducted five focus
groups with 50 local OST staff, including those who worked directly with children as well as
those in administrative positions [5]. The purpose of the focus groups was to determine how
participants felt they benefited from professional development, specifically in terms of effecting
positive change in participants' knowledge, skills, and attitudes.
The surveys were revised after each pilot test as well as after the focus groups. Some
trends identified in the focus groups were incorporated into the surveys as new or modified
questions, while other findings were used to inform analysis of the survey data.
Between November 2004 and May 2005, a total of 1,426 OST staff participated and
4,733 surveys were collected. Specifically, the OSTRC collected 1,863 Pre Workshop Surveys,
2,084 Post Workshop Surveys, 457 Follow-Up Surveys, 93 Presenter Self-Assessment Surveys,
and 236 Overall Conference Evaluations.
[4] "Regional" in this context refers to the Mid-Atlantic region of the United States.
[5] For more information on the findings from the OSTRC focus groups, see http://www.sp2.upenn.edu/ostrc/pdf/OSTRCFocusGroupSummary.pdf.
Appendix B: Additional Resources
Bouffard, S. (2004). "Promoting Quality Out-of-School Time Programs through Professional
Development." Harvard Family Research Project, The Evaluation Exchange, Volume X, No. 1, Spring 2004.
Gardner, H. (1993). Frames of Mind: The Theory of Multiple Intelligences. New York: Basic Books.
Guskey, T. (2000). Evaluating Professional Development. Thousand Oaks, California: Corwin Press,
Inc.
Killion, J. (2002). Assessing Impact: Evaluating Staff Development. Oxford, Ohio: National Staff
Development Council.
Kirkpatrick, D.L. (1994). Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-
Koehler.
National Staff Development Council (1995). National Staff Development Council¡¯s Standards for Staff
Development. Retrieved September 20, 2004 from http://www.nsdc.org/standards/about/index.cfm.
Halpern, R. (1999). After-school programs for low-income children: Promise and challenges. The Future
of Children, 9, 81-93.
Harvard Family Research Project, Evaluation Exchange. Issue Topic: Professional Development.
Volume XI, No. 4, Winter 2005/2006.
Lauver, S. (2004). Issue Topic: Evaluating Out-of-School Time Program Quality. The Evaluation
Exchange, 10 (1).
Shortt, J. (2002). Out-of-School Time programs: At a critical juncture. New Directions for Youth
Development, vol. 2002, no. 94, 119-123.
PA Pathways (2002). PA Pathways Training Resource Manual: Guidelines for Training Organizations,
http://www.papathways.org/PDFs/TRM_AdLearn.pdf.
Harms, T., Jacobs, E. & White, D. (1995). School Age Care Environmental Rating Scale. New York:
Teachers College Press.
Heck, S., Loucks, S.F., et al. (1981). Measuring Innovation Configurations. Austin, Texas: Southwest
Educational Development Laboratory.
Horsley, D.L, & Loucks-Horsley, S. (1998). CBAM brings order to the tornado of change. Journal of
Staff Development, Vol. 19, No. 4. Retrieved September 20, 2004, from
http://www.nsdc.org/library/publications/jsd/horsley194.cfm.
Brookfield, S.D. (1986). Understanding and Facilitating Adult Learning. San Francisco: Jossey-Bass.
Knowles, M.S. (1980). The Modern Practices of Adult Education: From Pedagogy to Andragogy (Rev.
Ed.). Englewood Cliffs: Cambridge Adult Education.

