This is part one of a presentation I’ll be giving tomorrow at the annual meeting of the Pacific Central District of Unitarian Universalist congregations. As you will see, growth is not rocket science; growth is all about patient attention to detail. I think you will find this presentation to be quite different from other Unitarian Universalist approaches to growth: it’s kind of geeky; it’s not exciting; it lacks sexy jargon terms; and it’s all about management and administration. However, since the exciting, sexy, theological approaches don’t seem to be working all that well, maybe you should check out my approach….
Welcome!
We’re going to talk about transforming and growing your programs and ministries for children and youth. And my emphasis is going to be on growth. I believe that there are many families who would love to have their children participate in our programs and ministries for young people, and we need to make room for them in our congregations. Furthermore, research by the Search Institute shows that regular participation by youth in a religious congregation correlates with a decrease in risky behaviors such as substance abuse; therefore, by having more kids participating regularly in our congregations, we are literally saving lives.
If this is not the workshop you were expecting, feel free to leave now or at any time without embarrassment. I only want you to be here if you want to be here.
And if you want to ask questions, please write them down (legibly). I am going to post the entire presentation online, and I want to include your questions online. I will stop periodically during this workshop to take your questions.
How to measure growth
If you really want to grow your programs and ministries for children and youth, the first thing you have to do is figure out how you’re going to measure growth. More often than not, you get exactly the kind of growth you measure for. This is so important that we’re going to take fifteen minutes right now to go over this.
Perhaps the most common way to measure the size of a religious education program is to count the number of children and youth for whom you have signed registration forms — after all, this is the number the Unitarian Universalist Association (UUA) asks congregations to report each year. The strength of this measure of program size is that it is well-defined: you gather all the paper registration forms and count up the number of names on them. However, this measure of program size has two big weaknesses. First, if you’re aggressive at getting everyone who walks in the door with kids to fill out a registration form, you can drive your numbers up very quickly, but you will not be accurately measuring the number of kids who actually show up at your program on a regular basis. Second, if you’re not very aggressive at getting families to fill out registration forms, there is always a significant percentage of families (sometimes as high as 30-50%) who just never fill out the form. In short, in my experience the number of registered children and youth usually is not a good measure of the actual number of warm bodies in your classrooms.
The next most common way to measure the size of a religious education program is to count the number of children and youth who are physically present on every Sunday of the year, and then to calculate the average Sunday attendance. Most of us keep good attendance records, because we are dealing with legal minors, so usually it is a simple matter to calculate average weekly attendance over the year. Furthermore, the UUA is now requiring annual attendance figures, so most of us are keeping records on attendance anyway. However, this measure of program size also has its weaknesses. First, when you have intergenerational services, it’s difficult to count the number of children and youth present, and the children and youth are simply counted as part of the adult attendance. Second, there are often non-traditional programs that perhaps do not take place on Sunday morning (e.g., OWL, Coming of Age, Children’s Choir, etc.), but which attract children and youth who do not attend other programs, and it is never entirely clear whether those children and youth should be included in the total attendance or not. Finally, volunteers sometimes forget to take attendance, or aren’t accurate in their attendance taking. While average attendance provides a more accurate measure of the size of a religious education program, the margin of error might be as large as five to ten percent.
A third measure of program size which I have found useful is what I call enrollment. I define enrollment as follows: children are enrolled in the religious education program if (a) their parents/guardians have filled out and signed a registration form; and/or (b) they have attended three or more times in the previous six to eight months; and children or youth who have not attended at all in the last six to eight months are taken off the enrollment list if no new registration form has been submitted. I find enrollment helps me to correct some of the problems inherent in registration numbers. I calculate enrollment twice annually: once in late August, and once in late December. In August, enrollment allows me to plan for the maximum expected attendance on any given week, an important consideration when recruiting volunteers and allocating rooms. In December, enrollment is the number which I report to the UUA as total registration. Looking at enrollment lists twice a year also allows me to determine which families have drifted away, and what new families have arrived. However, enrollment is still a more artificial number than actual average attendance.
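Since I end up keeping these records on spreadsheets anyway, the enrollment rule above can also be written down as a short script. This is a minimal sketch, not a real system: the function name and record format are my own inventions, and I treat "registered" as meaning a current registration form is on file.

```python
from datetime import date, timedelta

def is_enrolled(registered, attendance_dates, today, window_months=8):
    """Apply the enrollment rule: a child stays on the list if a
    current registration form is on file and/or they attended three
    or more times in the last six to eight months; a child with no
    attendance at all in that window and no new form is dropped."""
    window_start = today - timedelta(days=window_months * 30)
    recent = [d for d in attendance_dates if d >= window_start]
    if not recent and not registered:
        return False  # drifted away: take them off the enrollment list
    return registered or len(recent) >= 3

today = date(2012, 8, 31)
# A child who attended four times since March stays enrolled
# even without a signed registration form.
print(is_enrolled(False,
                  [date(2012, 3, 4), date(2012, 4, 1),
                   date(2012, 5, 6), date(2012, 6, 3)],
                  today))
```

In practice the same logic is easy to express as a spreadsheet formula counting attendance dates within the window; the point is simply that the rule is mechanical enough to apply consistently twice a year.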
Rather than using just one measure of program size, I think it is wise to use at least two measures of program size together. I like to look at enrollment and average attendance. And I look at these numbers as dynamic, not static; that is, they are constantly changing over time, which means I can look at trends. Ideally, I would collect enrollment and attendance data for at least twenty-five years so I could get a picture of the long-term trends in a given congregation. I also look at the relation between enrollment and attendance figures: if attendance is less than 50% of enrollment then I suspect that the enrollment numbers were padded, or that the program was lousy; if attendance is greater than 70% of enrollment, then I suspect the attendance figures were padded or the program was unbelievably good.
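The attendance-to-enrollment sanity check above is simple arithmetic, and can be sketched as a one-line ratio test. The 50% and 70% cutoffs come straight from my rule of thumb; the function name is hypothetical.

```python
def attendance_ratio_flag(avg_attendance, enrollment):
    """Flag attendance-to-enrollment ratios that suggest padding."""
    ratio = avg_attendance / enrollment
    if ratio < 0.50:
        return "suspect padded enrollment, or a lousy program"
    if ratio > 0.70:
        return "suspect padded attendance, or an unbelievably good program"
    return "plausible"

# Average attendance of 45 against an enrollment of 100 (45%)
# falls below the 50% threshold.
print(attendance_ratio_flag(45, 100))
```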
I further break down these two big numbers into more detailed numbers. Perhaps most importantly, I break down enrollment into age groups. This allows me to see if we start losing kids at a certain age. Most often, you’ll see a slight drop off in enrollment beginning at about grade 5; then there’s typically a sharp drop off in grades 7 and 8, because we lose about half our UU kids in middle school. Next in importance, I break down attendance into monthly averages, and I graph that to see when we typically have high and low attendance. I also break down enrollment into number of families, and I look at the number of families who pledge or make substantial financial contributions.
As you can see, measuring the size of a program is not a simple matter. And what you measure really makes a difference when you are striving for growth. Let me give you two examples:
In the first example, let’s say that you have decided to measure growth solely by registration numbers. I once served as a religious educator in a congregation back east where one of my predecessors did precisely this: this person set a goal of growing the program, and measured growth by signed registration forms. This staff person was notorious for “swooping down on newcomers” the moment they walked into the building, and basically forcing them to sign a registration form. On the basis of increased registration numbers, this staff person successfully advocated for a substantial increase in hours and compensation. But when I tracked down attendance records, I discovered that attendance was basically flat throughout this person’s tenure. So by this other measure, the program had not grown at all.
In the second example, let’s look at the congregation I’m currently serving. In 2009, the Board hired a consultant who showed us how to measure the size of the congregation by average attendance, and we subsequently determined that we would aim to increase the average attendance of Sunday morning programs. In 2011, the Sunday morning religious education programs achieved the phenomenal growth rate of 21% over the previous year, as measured by attendance. However, the Board pointed out that there was no increase in people who signed the membership book, adult attendance had not increased, and furthermore a closer look at the numbers revealed that most of the average increase came from a very successful summer program, whereas attendance in winter months was essentially flat.
At the same time, pledge income increased 15% in the 2011 canvass over the previous year, so perhaps the growth wasn’t entirely fictional; increased attendance can be linked to increased satisfaction, which can in turn be linked to increased pledging. (Parenthetical note: pledges increased once again in the 2012 canvass, and several families with children are in the top quintile of givers.) But on the other hand, while average annual religious education attendance increased by 21%, the total average annual Sunday morning attendance did not increase; the increase in religious education attendance turned out to be linked to a slight decrease in adult attendance. Unfortunately, due to defects in our congregational database, we were unable to determine if our enrollment was up or not; but enrollment was probably flat.
So while we had originally agreed upon average attendance as the way to measure growth, in the end this measure did not prove satisfactory to key stakeholders. The final consensus was that the RE program grew in attendance by about 5% in 2011 (but not in certified members); grew significantly in satisfaction as indicated on evaluations; and grew in terms of pledges. We got the growth we measured for, but we didn’t get the growth we were really hoping for.
It becomes clear that if your goal is to grow your program, you have to be careful what you measure, and how you measure it.
I believe the best way to measure the size and growth of a religious education program is by looking at three numbers simultaneously: enrollment, attendance, and financial contributions. If your enrollment is going up year by year; if your average attendance is increasing in absolute numbers and is either increasing or remaining constant considered as a percentage of enrollment; if pledge income from families is increasing both in absolute numbers and increasing as fast as or faster than inflation — then you can be sure you are growing.
Now: if your enrollment is up but neither of the other numbers is up, then most probably your program is not growing; you are merely being aggressive about getting newcomers to sign registration forms. Or if your attendance is up but enrollment is not, then probably satisfaction is increasing, and demands on volunteer and staff time are increasing, but the consensus among congregational leaders will be that you’re not growing. Or if your pledge income is up, but enrollment and attendance are not up, then maybe you just had really lousy giving in previous years — or maybe you just have really generous people in your congregation, which is a really nice thing to have, but it’s not growth the way most of us think about growth.
Therefore, I say unto you: measure enrollment, average attendance, and pledge income. When they are all going up, then you’re growing.
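The three-number test in the last few paragraphs can be reduced to a single yes-or-no check. Here is a minimal sketch, assuming you have this year’s and last year’s figures plus an inflation rate; all the variable names and the 3% default inflation figure are mine, not part of any official formula.

```python
def is_growing(enrollment_now, enrollment_prev,
               attendance_now, attendance_prev,
               pledges_now, pledges_prev, inflation=0.03):
    """True only when all three measures point the same way:
    enrollment up; attendance up in absolute numbers and holding
    steady or gaining as a share of enrollment; and pledge income
    from families rising at least as fast as inflation."""
    enrollment_up = enrollment_now > enrollment_prev
    attendance_up = (attendance_now > attendance_prev and
                     attendance_now / enrollment_now >=
                     attendance_prev / enrollment_prev)
    pledges_up = pledges_now >= pledges_prev * (1 + inflation)
    return enrollment_up and attendance_up and pledges_up

# Enrollment 90 -> 100, average attendance 55 -> 62,
# family pledges $40,000 -> $44,000: all three signals are up.
print(is_growing(100, 90, 62, 55, 44000, 40000))
```

If any one signal fails, the function returns False, which matches the point of this section: a single rising number, on its own, is not growth.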
One final note about measuring growth: I have rarely found that the main congregational database can track the kind of information I need to measure growth. Pretty much the only way I have been able to get accurate information is to track attendance and enrollment on spreadsheets that I maintain on my own computer.
Any questions about how to measure growth?