Re: [Videolib] Program evaluation

Media2 (media2@bellatlantic.net)
Wed, 06 Aug 2003 18:49:46 -0400

Hi Linda:
I'll send you our last one as soon as I can find it. Wow, a topic I know
something about. Had a "polls and surveys" course in my graduate program
and use the skills and research all the time.

The most visible evaluation tools are your circulation, turndown, and
shared-materials records. Reduce turndowns, monitor titles that get
shared a lot, and maintain a circulation that is 45 to 50 percent of
your student population for once-a-week deliveries (e.g., roughly 9,000
to 10,000 weekly circs in a 20,000-student service area).

Second is the material condition form we put in every video that goes
out. We get back about 100 a week (about 1% of circ); of those, roughly
30% are negative, 60% are positive, and about a third of the total have
suggestions written on the back.
The key question on the form runs across the bottom: "Did this video
help meet the objectives of your lesson? __ Y ___ N. If no, please
explain on the back." Valuable input.

Every three years or so we go through the motions of sending out a
formal user evaluation survey. We stratify it by HS, MS, and elementary
level and target it directly to about 300 of our regular users by name
(obtained from the circ system) using a random-start, skip-interval
approach. We get about 99% returned and a clear picture of what we need
to improve, do differently, or leave alone. Asking a lot of questions of
people who don't use the service just skews the results and gets
depressing. For example, out of 7,700 teachers only 2,300 actually
borrow materials from us :- ( . The percentage holds across most RMCs.
But it is better than most ITV stations, which estimate that only about
2% of the teachers in their service areas actually watch and/or tape
daytime ITV programs at least once a month.
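If you want to script the selection, here is a rough sketch in Python.
The file name, field names, and quotas are made up for illustration,
since every circ system exports its borrower list differently; the idea
is just to split the list by level, then take every k-th name from a
random starting point until each level's quota is filled.

    import csv
    import random

    # Hypothetical circ-system export: one row per regular borrower,
    # with (at least) "name" and "level" columns. Illustrative only.
    with open("borrowers.csv", newline="") as f:
        borrowers = list(csv.DictReader(f))

    QUOTAS = {"Elementary": 100, "MS": 100, "HS": 100}  # ~300 names total

    sample = []
    for level, quota in QUOTAS.items():
        stratum = [b for b in borrowers if b["level"] == level]
        k = max(1, len(stratum) // quota)   # skip interval
        start = random.randrange(k)         # random starting point
        sample.extend(stratum[start::k][:quota])

    for b in sample:
        print(b["name"])

Mail merge the resulting list and you have your 300 named recipients.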

We try to keep the questionnaire open-ended and on a single sheet, two
sides, about 10 questions max. We decide ahead of time what the focus
of the survey will be for that round. In 2000, it looked at cross-
curriculum needs. We wanted to know whether our perception was accurate:
are a majority of grade 1-8 curricula moving to more of a cross-
content/curriculum approach, and how would this affect purchasing
patterns or subject needs? We also asked for input on new units teachers
had developed. At the high school level we were looking at desktop
access (prepping to market our digital delivery system) and also at
whether certain subject areas were moving toward more project learning
and less passive instructional approaches.

The last section is always real open:

What is the best part of our service? (We don't ask them what they
"like" about the service.)

What part of our service needs improvement? Specifically . . .

What would make borrowing materials easier for you? (This gets
good responses.)

These last three get some thoughtful answers, many of which we can't
accommodate, but they provide ammunition to use with the board of
directors when we are trying to justify an upgrade or new services.

Several times we sent questionnaires to USERS asking questions about
rookie teachers and how we could get them to become regular users, or
how we could better reach non-user veteran teachers.

Never ask users what they WANT from the Center. They don't know.
Consumers, by and large, are blind to future possibilities. They THINK
they want all the videos on DVD because that's what the advertising and
the articles in Newsweek say. If we had asked ten years ago, "Do you
want an on-line catalog?" the answer would have been "no." Surveys don't
drive improvement. Good leadership drives improvement. Have the
technology and content available when they are ready for it.

If you are trying to improve your program or operational quality I
suggest learning how to benchmark your operations in terms of both scope
of service and excellence of service provided. You never know how good
you are until you can compare your operation against several other RMCs
in and out of state. You may be the benchmark others will be compared
against!

It's a Good Life if You Don't Weaken -

M. Richie

Linda Fox wrote:

>Hello;
>We are exploring ways to evaluate our Instructional Media Service. Does
>anyone have a form that they ask users to complete? What type of
>questions do you ask? How is your rate of return on these forms? Does
>the use of such a survey really drive changes in your program or
>services? If you have a form and are willing to share, please send it to
>me electronically. I'll share mine after we've developed it.
>Thank you.
>Linda Fox
>Director
>Capital Region BOCES School Library System and Instructional Media
>Service
>Suite 102, 900 Watervliet-Shaker Road
>Albany, NY 12205
>(518) 464-5104 (phone)
>(518) 464-5101 (fax)
>lfox@gw.neric.org
>www.crbsls.org

_______________________________________________
Videolib mailing list
Videolib@library.berkeley.edu
http://www.lib.berkeley.edu/mailman/listinfo/videolib