Martha L. Henderson
Political Science and Criminal Justice
Department of Criminal Justice
Mt. Mercy College
M. Joan McDermott
Southern Illinois University Carbondale
Center for the Study of Crime, Delinquency, and Corrections
EVALUATION RESEARCH ENABLES policy analysts and policymakers to identify best practices, to sort out what works from what doesn’t work, and to select the best program or policy alternative for solving a social problem. Because process evaluations are intended to examine how programs are implemented, they are particularly useful for identifying barriers to effective program implementation. In this paper we examine the implementation of a School-based Probation Program in a rural county in the Midwest. The study is designed to explore the individual, organizational, and systemic barriers to implementation that inhibited program development and evaluation.
School-based Probation in Context
The practice of bringing probation to the schools is best understood within the context of the multiple needs of youth and the corresponding drive for a multi-agency, collaborative response to problems of juvenile delinquency and crime. The services that work best with juvenile offender populations are services offered by and within a variety of social and justice agencies within the community (Leone, Quinn and Osher 2002). Because youth in trouble or at risk often have multiple needs, Brown, DeJesus, Maxwell and Schiraldi (2000:12) state, “ . . . effective youth programs collaborate and form connections with other agencies to strengthen their outcomes and enable them to refer youth offenders to agencies and programs better suited to attend to their other needs.”
Collaborative efforts employed to address juvenile justice issues within school settings have taken three primary forms: law enforcement education programs provided during the school day; school-based initiatives for a greater law enforcement presence in schools; and other collaborations, usually including a broader network of community organizations. The majority of collaborative efforts to address juvenile justice issues in the school environment have been law enforcement education efforts and have only indirectly involved the juvenile court system (Gottfredson, Wilson, and Najaka 2002). More recently, school-based probation has been advocated as a program that can increase juvenile accountability, reduce violence within schools, increase success rates with juvenile probationers, and foster better communication between probation departments and schools (Decker 2000; Torbet, Ricci, Brooks and Zawacki 2001). The program provides more intensive supervision of students on probation than is found within traditional juvenile probation services. Consequently, school-based probation is argued to increase public safety because a student who is in school does not pose a threat to the public outside of those surroundings (Seyko 2001). Additionally, because school-based probation officers are a visible force in school, many believe that school-based probation might also serve as a deterrent for non-court-involved youth (Seyko 2001).
Preliminary research on school-based probation indicates that such programs are effective in several areas. School-based probation programs have been found to increase student attendance, reduce absenteeism, and lower dropout rates (Clouser 1995; Griffin 1999; Metzger 1997; Torbet, Ricci, Brooks and Zawacki 2001). Academic performance among juvenile probationers has also improved (Clouser 1995; Stephens and Arnette 2000). In several states school-based probation decreased detentions and in-school suspensions among juvenile probationers (Clouser 1995; Griffin 1999). Thus, research indicates that school systems with school-based probation programs may reduce the number of serious violations among juvenile probationers.
Research on Program Implementation
The focus of program evaluation research is slanted heavily toward outcome analysis. The determination of whether or not a program is successful traditionally is based on the impact the program has on the target population. Relatedly, explanations for why a program is successful or unsuccessful tend to focus on factors specific to the targeted program participants. However, program implementation and service delivery are directly relevant to program outcomes because these factors determine the availability of a particular program to the selected participants (Etheridge and Hubbard 2000; Heinrich and Lynn 2001).
Evaluation researchers have suggested that program evaluation research should incorporate public management and organizational theoretical frameworks as part of the methodology for determining program outcomes (Etheridge and Hubbard 2000; Heinrich and Lynn 2002; Marsden 1998). These theoretical frameworks allow evaluation researchers to examine how program implementation and delivery are affected by factors outside the specific program environment (Marsden 1998; Mead 1997; Sandfort 2000). These factors are referred to as system, organization, or process variables (Douzenis 1994; Etheridge and Hubbard 2000; Lynn and Heinrich 2001; Marsden 1998). Research in the field of public management suggests that it is important to consider these implementation and delivery factors in order to enrich the utility of evaluation research.
Research has identified macro-level factors that affect program outcomes beyond the individual outcome differences among program participants (Heinrich and Lynn 2001; Mead 1997). For example, the variability of treatment outcomes across similarly situated community-based substance abuse treatment programs may be a result of variation in the implementation and delivery of the programs as opposed to individual participant factors (D’Aunno, Sutton and Price 1991; Etheridge and Hubbard 2000; Gerstein and Harwood 1990). In addition, Kramer, Laumann and Brunson (2000) found that rural schools had difficulty maintaining a school-based program for children dealing with divorce or death, due to implementation and delivery roadblocks in three structural contexts: community, school, and family. Research on other types of school-based programs has also found that program implementation and delivery in rural areas is particularly affected by a disconnect among administration, staff, and participants (Helge 1981). In an evaluation of rural special education programs, Helge (1981) found that teacher retention and recruitment, resistant attitudes in the administration, and travel issues all impacted the successful implementation and delivery of the programs.
These process factors are significant in evaluating difficulties in program implementation and delivery, but are often difficult to quantify in evaluation research (Heinrich and Lynn 2001). This difficulty is increased when there is no organizing context or model in which evaluators can categorize unobserved factors (Heinrich and Lynn 2001). Therefore, Heinrich and Lynn (2001) suggest that evaluators utilize a multi-level framework to discuss qualitative factors related to program implementation and delivery. We attempt to follow this suggestion by analyzing barriers to the effective implementation and delivery of a rural, School-based Probation Program in the context of individual, organizational, and systemic factors.
Individual Practitioner Barriers
A growing body of literature indicates that despite strategic planning efforts, new correctional initiatives often fail as a result of individual practitioner-related barriers to implementation (Klein and Sorra 1996; Miller, Koons-Witt and Ventura 2004; Porporino 2005; Simpson 2002). Community-based correctional staff may lack the basic knowledge, skills and abilities to effectively implement the correctional initiative. Staff members who lack these skills are not able to carry out the minimum requirements of the job outlined by a new correctional initiative (Liddle et al. 2002; Simpson 2002). The lack of knowledge, skills, and abilities often has multiple causes. Some community-based staff members have never participated in formal pre-service training (Miller, Koons-Witt and Ventura 2004) and consequently are unsure of job expectations. In addition, a significant percentage of staff members having direct contact with offenders hold only a high-school education and lack the credentials to deliver more sophisticated services (Miller, Koons-Witt and Ventura 2004). Others have in fact received some form of training, but the training received was not relevant to the requirements of their employment (Porporino 2005). Staff must be trained, monitored, and evaluated on content relevant to the nature of their job for an intervention to be successful (Farabee, Prendergast, Cartier, Wexler, Knight and Anglin 1999; Liddle et al. 2002; Simpson 2002; Young 2004).
Even when staff have the knowledge, skills, and abilities to do the job, implementation barriers may exist due to lack of clarity in goals, burnout, poor supervision by managers, and role conflicts experienced by staff (Dansereau and Dees 2002; Lehman, Greener and Simpson 2002; Young 2004). Effective interventions provide an opportunity for staff to practice intervention strategies, receive feedback from supervisors, and receive positive reinforcement for effectively implementing a new initiative (Andrzejewski, Kirby, Morral, and Iguchi 2001; Dansereau and Dees 2002; Simpson 2002). Such efforts allow for greater communication between line-staff and supervisors as well as providing opportunities to clarify expectations and adjust programmatic issues. Confusion among staff over the scope of their responsibilities compromises their capacity for engaging clients and following implementation plans (Latessa 2004; Lehman, Greener and Simpson 2002). Role conflict also occurs when staff perceive aspects of the new initiative as reflecting the interests of administrators rather than what line-staff perceive to be key needs and requirements for effective programming (Lehman, Greener and Simpson 2002). Staff resistance to change is an inevitable part of the implementation process and often results from inadequate motivation for change, lack of feeling of self-efficacy by staff, or other negative attitudes toward programming (Liddle et al. 2002; Young 2004). The importance of staff attitudes for implementation is documented in a study by Fulton, Stichman, Travis, and Latessa (1997), which indicated that intensive supervision probation officers who had received training on the principles of effective interventions and who held attitudes supportive of rehabilitation favored behavioral change models.
Similarly, low levels of motivation for implementing correctional practices have been documented as leading to the failure of new initiatives (Latessa 2004; Lehman, Greener and Simpson 2002). Staff must perceive the changes to have utility and be confident in their abilities to implement the initiatives for success to occur (Liddle et al. 2002; Simpson 2002; Young 2004).
Organizational Barriers
Organizational barriers to implementing correctional initiatives are legion (Liddle et al. 2002; Mears, Kelly and Durden 2001; Simpson 2002; Young 2004). Latessa (2004) has termed such implementation barriers “organizational responsivity,” which can limit full implementation of evidence-based and effective correctional practices. At the local level, administrators of community-based programs often fail to engage in strategic planning prior to implementing a new initiative, resulting in a failure to address service delivery issues (Mears, Kelly, and Durden 2001; Simpson 2002). As part of the strategic planning process for effective interventions, most correctional agencies determine the staffing configuration, frequency of sessions, length of sessions, programmatic content, and physical location of the program, all of which have been found to impact outcomes (Simpson 2002). For example, locating programs in an area that is inconvenient for clients or holding sessions during times that do not fit with client work or school schedules may result in client “no-shows” and lack of success for the program (Lehman, Greener and Simpson 2002; Miller, Koons-Witt and Ventura 2004). The failure to engage in more than cursory strategic planning for a new initiative can also result in a disjunction between the goals of the program and actual practice.
Another barrier to effective implementation at the organization level is the challenge of recruiting and retaining staff to work in community-based programs, particularly when the program is located in a rural community (Miller, Koons-Witt, and Ventura 2004). According to Miller, Koons-Witt, and Ventura (2004), corrections has a lengthy history of staff retention problems related to location, low base-rate salaries, and the inability to provide contractual incentives for experienced labor. Losing trained staff can damage staff member morale and feelings of self-efficacy. As those most experienced with the treatment modality leave, inexperienced staff are left without an essential tool for clarifying implementation issues. Moreover, effective implementation also requires strong leaders. The correctional literature on evidence-based practices has revealed the importance of engaged and charismatic leaders (Latessa 2004; Roman and Johnson 2002; Simpson 2002). These leaders are change agents whose presence signals the organization’s commitment to the change process, increases staff buy-in, and ensures greater communication of ideas within and between agencies (Roman and Johnson 2002; Simpson 2002).
In addition, administrators confront the common implementation barrier of finding money to start up new initiatives and to support existing programmatic structures. Some correctional programs will rush to implement new initiatives simply because federal, state, or local agencies are willing to provide grant funding, only to discover that the program cannot continue to support the initiative once the grant period ends (Brown and Campbell 2005). Relatedly, in light of the recent emphasis on performance measures and efficiency for determining and maintaining funding levels within community-corrections agencies, lack of quality in the data available for monitoring, quality assurance, and evaluation can be a significant barrier to the continued existence of a new initiative (Henderson and Hanley 2006). Unfortunately, problems in data quality among correctional agencies are well documented (Miller, Koons-Witt and Ventura 2004). Without data indicating efficiency, performance, and quality assurance, new initiatives cannot be evaluated and ultimately run the risk of losing funding (Latessa and Holsinger 1998; Parent and Barnett 2004).
Systemic Barriers
Programs are not stand-alone entities; they are embedded within larger criminal justice and social service systems. Consequently, the barriers to successful program implementation are often cross-cutting. For example, competition between agencies over scarce funding streams, difficulties in starting and sustaining interagency collaborations, and lack of support from the courts have been found to significantly impact outcomes (Brown and Campbell 2005; Young 2004). When community-based practitioners do not have the power to enforce treatment participation and other programmatic components, clients may be dissuaded from full participation within the program (Miller, Koons-Witt, and Ventura 2004). Furthermore, the inability of correctional agencies to sustain community-based partnerships has been well documented (Brown and Campbell 2005; Byrne 2004; Joplin et al. 2005; Parent and Barnett 2004). Often the inability to sustain community partnerships results from mutual distrust between agencies (Young 2004). For example, according to Roskes and Feldman (1999:1615), mental health agencies are “often viewed (by correctional staff) as soft on crime, uninterested in public safety, and as making excuses for criminals with mental illness.”
Methodology
This paper draws on a process evaluation of a School-based Probation Program in a rural county with a couple of small towns in the Midwest (Author citation, 2004). A mix of qualitative and quantitative methods was used in the research. Researchers examined numerous program documents (grant proposals, internal reports and reports to the funding agency, and assorted documents such as job descriptions, lists of officers and caseloads, and so on). In-depth interviews, lasting from one to two hours, were conducted with four key administrative personnel in the courts and probation, as well as with five probation officers (three school-based officers and two line officers, explained below) and one former school-based officer. As this was a process evaluation and there was continuous contact over many months with the officers and their supervisors, a number of respondents were interviewed more than once, and the researchers had frequent informal conversations with them.
During the course of the project, researchers did 20 ride-alongs to 10 different schools with school-based probation officers (SBOs) to conduct observations of juvenile contact in schools. Three of the ride-alongs included home visits. Each ride-along lasted approximately 20 to 90 minutes, depending on the location of the schools, the number of students and school officials contacted, and the number of schools visited by the school-based officer. A school visit typically began with a stop at a secretary’s or school attendance officer’s office, where the SBO often picked up grades or attendance reports and was informed of any problems. Then anywhere from no juveniles (sometimes the probationers were not in school) to a maximum of three probationers were seen individually by the SBO.
The project also included data from a survey of school personnel, although the response rate was poor (9 of 40 potential school respondents identified by SBOs, or 22.5 percent). And, while the original research proposal called for interviews with a sample of juveniles and their parents who would be identified and recruited through home visits with school-based probation officers, these interviews were not conducted. At approximately the time the interviews were to begin, officers stopped doing home visits. Finally, data were collected from probation files on juveniles (which included probation and court data as well as limited school data) as well as probation files on officers (travel log data). For the purposes of this paper we draw primarily on the interview and observation (ride-along) data. We mention the other data collection efforts here because we return to them in the discussion of evaluation.
The School-based Program
During the course of the evaluation, from May 2003 through July 2004, the School-based Probation Program was staffed with either two or three school-based officers, who operated out of a satellite office approximately seven miles from the main county probation office. What differentiates the program of School-based Probation discussed here from similar programs in urban areas is that the school-based officers did not have a full-time presence in particular schools, nor did they have offices in schools. Instead, because of the rural and small-town nature of the county, 18 schools were covered under a system in which each of the three school-based officers had a juvenile caseload that included a group of schools. Two officers had caseloads defined by all of the schools in one or the other of two small cities, and the third officer had a caseload that included schools for youth with behavioral disorders (called the “BD schools” by the officers) and the county’s more rural schools with few probationers. Each juvenile probationer had the same main probation officer (internally called the “line officer”), and this officer did not visit schools.
The county School-based Probation Program was funded through a state grant. The program grant proposal, written and funded in the spring of 2001, outlines six goals. The first goal was juvenile recognition of probation monitoring. That is, juvenile probationers would see their school-based officers regularly in the school setting. The second goal was to improve the relationship between probation and the schools through more and better contact and information exchange between school personnel and school-based officers. The third goal of the program was improved relationships between probation officers and parents. It was hoped that through the School-based Program parents would recognize a team approach to monitoring juvenile probationers.
The fourth goal was more immediate remedial attention to potential violations. The idea was that through more intensive contact with juvenile probationers and school personnel, the school-based officers would be in a better position to recognize when the probationers were at risk of violating probation. The fifth goal was a decrease of 20 percent in juvenile offenses. This goal was based on the presumed deterrent effect of the school-based officer’s enhanced monitoring of probationers. The sixth and final goal was improvement in the quality of education, not only for juvenile probationers (who would demonstrate increased attendance, decreased school disciplinary measures, decreased dropout rates, and increases in grade point averages), but also for other students in the school because the school would, presumably, be a safer place.
Barriers to Implementation
Barriers at the Individual Practitioner Level
Document analysis and interviews confirmed that individual practitioner characteristics impacted the implementation of the School-based Probation Program. Effective implementation of school-based correctional programs requires staff competency, staff efficacy, and staff knowledge of local school and community programming. For example, because experienced officers were not attracted to these positions due to lack of job security (explained below), almost all school-based officers were completely new to the job of probation. This meant that a great amount of time was spent in training the SBOs in the basics of probation as well as in the job of the SBO, taking away from time they could have spent in schools. Moreover, the new officers also had to learn the network of referral agencies in the communities. For example, the SBOs generally were not aware that they could call upon truant officers for assistance. As explained by a supervisor, it took nine months to a year to completely train the SBO, who was then eligible to move from the school-based position to a probation officer position with job security. Thus, almost always the School-based Probation Program was staffed by inexperienced probation officers who were unfamiliar with the job, the schools, and communities in which they worked. In addition, because they were generally young men and women, often fresh college graduates, they lacked the interpersonal skills and confidence necessary for working with school personnel in an effective way. Some of the inexperienced SBOs were fearful of carrying out duties associated with their position and lacked confidence in their abilities. SBOs reported feeling afraid to go into certain neighborhoods and consequently, did not make the required home visits for probationers who missed school. Others reported feeling unprepared to handle school and family-related issues in the community.
Interviews with line officers and SBOs revealed the existence of role confusion and some resistance to the implementation of the program. As an administrator explained in an interview:
The concept of school based probation involves a team approach to the monitoring of juveniles in the probation system – one officer handles court, family, and out of school matters, while the school based officer makes all contacts with the juvenile during the school day, and works closely with the teachers and school personnel who are best suited to know the most about the daily successes and failures of the student.
Despite having different job responsibilities, both the line officer and SBOs reported blurring of job roles and confusion. When asked whether the staff worked as a team, one line officer replied:
. . . . From the beginning this has been the big question as to who has what responsibilities, where are the lines drawn, how do you share a case, and, I don’t think it’s defined quite yet.
This line officer expressed frustration with the division of tasks in juvenile supervision and with not knowing exactly when or how to ask school-based officers for their assistance.
The school-based officers also pointed to role confusion, noting that the lines between the two jobs were often blurred, although in their opinion the SBOs were also there to provide assistance to the line officer. One school-based officer’s description of his relationship with the line officer makes clear that the line officer, who is technically not in a supervisory position, is the one with overall responsibility for the case. The SBO put it this way:
No, he doesn’t supervise us, but in a way it kind of looks like that. It’s kind of a weird situation actually. I mean he’s . . . I wouldn’t do anything very important with a kid as far as trying to get him into a certain program without conferring with him to make sure it would be something he’s wanting to do.
In summary, an ongoing difficulty in the program was the distinction between line officer and school-based officer functions. This problem was further exacerbated by the fact that the distinction changed in the summer months when school was not in session and also by the very high turnover rate in school-based officers. This latter factor meant that the relationship between school-based officers and the line officer had to be continually explained and worked out with new SBOs.
Through most of the period of the research the juvenile probation staffing configuration in the county office consisted of one line officer and three school-based officers. While the school-based officers were trained probation officers, and while they occasionally performed duties (such as making court appearances) normally handled by the line officer, their job was designed with a focus on working with juveniles in schools on matters related to schooling (attendance, grades, behavior in school). SBOs were to coordinate their work with the line officer to provide an overall team approach to juvenile supervision. The supervisor of the juvenile officers tried to minimize the “line” functions performed by the SBOs and emphasized the differences between the SBO and the line officer jobs. The county probation director, in an interview, also mentioned this conflict as a significant obstacle in implementing the program. According to the director,
. . . the problem that really sticks out in my mind that we had to overcome was Line staff and the new program people sharing the kids. You know, they were wanting to know where are the boundaries at, what do they do, what do I do, I don’t want them superseding something that I’ve done or said with a kid and vice versa. . . . I just really stressed communication between the two. . . . the School-based officers handle school related issues and the Line staff handles everything else, but there’s going to be times where those overlap, you can’t help it.
Finally, staff turnover was an enormous problem in the School-based Probation Program. Between April 2001 when the program began and July 2004 when the evaluation concluded, a total of nine different probation officers were employed as SBOs. Excluding the two officers still employed in July 2004, the average tenure of school-based officers was 8.6 months. The exceptional turnover appeared to be connected to the fact that the SBO Program was funded through a grant. As explained by a supervisor:
Veteran probation officers are not easily attracted to the position of School-based Officer because it is a grant position. Under the union contract (the Fraternal Order of Police is the union), positions that are funded through grants are lost when the grant expires and probation officers in these positions are not entitled to bump other, less experienced officers in regular positions. This means that when the positions are lost the persons in these positions are unemployed. Once a new officer has successfully completed his/her probationary employment period of nine months, transfer to another position that becomes open in the (multiple county area) becomes possible. Thus, grant-funded employees will ‘bail’ to regular positions when they can, generating a relatively high turnover rate, higher than in other positions.
This school-based officer turnover produced problems working with the schools. The school survey revealed that many school personnel were unfamiliar with the specific goals of the program. School personnel were fairly regularly introducing new SBOs to matters such as how grades were kept, when and how disciplinary reports were filed, whether a particular student’s absence was something to be investigated, and so on.
Organizational Barriers to Implementation
Service delivery barriers were identified by school-based and line probation officers as hindering implementation of the School-based Probation Program. The frequency and length of sessions in school, the physical location, and the type of services provided were all reported to negatively impact implementation of the program. First of all, the original proposal requested funding for two school-based probation officers. According to the proposal, “These officers would be responsible for school based intervention with a case load of approximately 90 juveniles in the 18 schools of _________ County. They would be assigned to a new office to be located in the most populous city and would be supported by a secretary in that office.” It made great sense to locate a satellite office in that small town because most of the juveniles on probation (83 of 129 probation/supervision cases at the time of the proposal) lived there. However, locating SBOs in the satellite office and the line officer in the central probation office created confusion for probationers. An example of this confusion given by SBOs is that a probationer who would have an office appointment with the line officer would skip the appointment if he saw the SBO in school or in the satellite office.
Secondly, the turnover in SBOs produced practical problems in service delivery. Because it often took a month or more to fill vacant SBO positions, caseloads were often shifted among officers and there were frequent gaps in service. For example, juvenile probationers in X School would become familiar with Y School-based officer, but then she would leave her job and it might be a month or more before they met Z School-based officer. Relatedly, the program initially implemented a pager system so that schools might contact the SBOs if they needed immediate assistance with a juvenile probationer. Only one SBO reported the pager system as working well; the majority said the system did not work well, and the school-based officers stopped carrying their pagers. In fact, no evidence exists to indicate that school officials knew of the existence of this system or that it was widely used.
Another significant implementation problem related to the school-based officer job description was the question of what the SBO was to do in the summer months when school was out. A supervisor referred to this as “kind of a flaw in the concept of school-based probation.” During the period of the research the SBOs did run a few workshops for probationers during the summer. These workshops focused on educating probationers on things like how to fill out job applications and how to prepare for job interviews. However, the officers could not mandate that the probationers attend these workshops and, as a rule, attendance was poor (around five or six probationers).
Another service delivery problem concerned parents. According to the job description and the grant proposal, the SBO was supposed to be working with parents, but exactly how this was to happen was never clearly articulated. During the school year, the school-based officers generally did not conduct home visits except when a juvenile probationer was absent from school; in that case, the SBO was supposed to go to the juvenile’s home to find out the reason for the absence. Yet even though many probationers skipped school or were absent for legitimate reasons, the SBO rarely followed up with a home visit to check on the reason for absenteeism. Moreover, parents generally did not attend office visits with the juvenile probationers, so SBOs were unable to discuss school, family, or other concerns with a custodial parent. Parental contact was also limited during the summer months, when SBOs made only infrequent visits to probationer homes where such contact might have been expected to occur. Thus, SBO contact and involvement with parents was severely restricted throughout the evaluation period.
In addition to barriers related to service delivery, administrators and staff reported that a lack of strategic planning, workload disparities, and a lack of supervision contributed to the problems experienced by the program. A review of documents and interviews with key stakeholders indicated that the School-based Probation initiative was developed with only cursory local-level planning. The grant proposal outlined six goals and suggested some performance indicators. When questioned about the thinking behind these goals, the administrator who wrote the grant proposal admitted frankly that the goals seemed “like plausible measures of success,” but were also “absolutely arbitrary wishes.” He talked about the difficulty of setting measures of success for any probation program that increased surveillance: in his view, increasing the contact between probation officers and juvenile probationers also increased the likelihood of problems being discovered. He also admitted that the goal of a 20 percent decrease in offenses was selected because the figure sounded “significant.”
Moreover, rather than follow a strategic planning model for implementation, the School-based Probation Program was planned primarily by the first school-based officer hired, with little direction from administrators and without supporting documentation. The first school-based officer reported trying to create a model program based on his observations of a school-based probation program in another county. Consequently, a major problem mentioned in interviews by program administrators, and noted by the school-based officers as well, was that there was no “how to” manual for performing the job of the school-based officer; SBOs needed more guidance on how to do their jobs. They were essentially expected to figure out on their own how to work with school personnel, what kinds of questions to ask probationers, what kinds of academic or behavioral goals were reasonable to set for probationers, and so forth, and in doing so to ensure successful program implementation. This was too great a task for several of the young, inexperienced officers.
In addition to the organizational impediments stemming from the failure to engage in strategic planning, disparate workloads created a significant impediment to full implementation of the initiative. Through most of the research period, one line officer performed all of the line responsibilities for all of the juvenile probationers in the county – a caseload of approximately 70 to 90 juveniles at any given time. By contrast, with three SBOs on staff most of the time, both the caseload size and the range of responsibilities were substantially smaller for the SBOs, each of whom had specifically school-related duties for about 20 to 30 juveniles in a handful of schools. The disparity in workloads produced friction between the line officer and the SBOs.
The SBOs and the line officer also had difficulties “sharing cases.” A supervisor referred to this as “territorial non-cooperation” on the part of the line officer, who, after all, had general responsibility for what happened with each case. As the SBO program was implemented, responsibilities such as paperwork, court appearances, and home visits kept shifting back and forth between the line officer and the SBOs, causing confusion.
From the management perspective, there were other problems with the job performance of the SBOs. The satellite office, which had opened only with the School-based Program, had no full-time supervisory staff; there were simply no funds for on-site supervision. The SBOs set their own schedules and were in and out of the satellite office because of school visits. With no on-site supervision of largely inexperienced officers who had small caseloads, a problem of accountability arose. Several of the SBOs exhibited less than professional behavior, arriving late in the morning, leaving early in the afternoon without returning to the office, and taking long lunch hours.
Another problem was records management. A number of errors occurred in record keeping, processing of cases, communication with the line officer, and so on. Files on juvenile probationers were kept both by the line officer in the main county office and by the school-based officers in the satellite office, seven miles away. As all officers interviewed explained it to researchers, the central file was kept in the county office, while the SBOs kept satellite files containing mostly school-related information, which was then duplicated for the central file. However, the researchers’ examination of files revealed that what was actually kept in files varied substantially across schools and probationers. School data (attendance, grades, disciplinary records, etc.) were reported inconsistently by the 18 schools in the county with probationers, and not all school-related information in the SBO files was duplicated for the line officer’s central files. It was never clear whether the information was given to the line officer and simply not filed, or whether there was some problem in providing school information to the line officer. What was clear was that there was a fairly significant gap overall in school-related information on juvenile probationers, which in turn had implications for the evaluation project.
According to one SBO:
We do each have a file, but anything that’s in my files is in (the line officer’s) files. But it’s not the other way around. I don’t have everything from his file in my file. Ours are smaller because we’re focused on school issues and things like that.
Most often, grades and attendance reports were found in SBO files. Sometimes the SBOs were given copies of disciplinary reports from schools, but not always. For example, an alternative school in one small town in the county did not give disciplinary reports to SBOs because, in the words of one SBO, “Anytime something happens they just send them home.” Not all schools reported grades in the same way, and schools were uneven in supplying data to the SBOs. Schools that were fully automated could report grades and attendance more easily than schools that were not.
Systemic Barriers to Implementation
The partnership between the local schools and the probation department collapsed within a year of start-up for several reasons. First, the relationship between the schools and probation was never formalized through a contractual or binding agreement. The results of this informal arrangement were disparities in the data available from schools, a lack of knowledge among school officials about the purpose of the program, and a lack of participation by school administrators in the School-based Program. A survey of school personnel revealed that the majority of school officials were neither familiar with the objectives of the program nor acquainted with the SBOs. Moreover, some schools decided to handle problem juveniles on their own without probation help and primarily supplied grades. Second, the probation department administrators failed to follow through on their expressed intentions to involve SBOs in the daily routine of the school. The SBOs did not have offices in the schools, nor did they have the contacts necessary for making connections with service providers in the community to address probationer needs. Because school-based officers were prohibited from ordering any treatment not specifically provided for in the court order, their ability to intervene in the lives of juveniles when problems (such as mental health issues, anger management, drug problems, or the need for tutoring) were detected was limited. This in turn affected the ability of the program to reduce juvenile offenses. The Regional Director, referring specifically to mentoring services for juveniles, explained the problem succinctly:
And being totally candid a lot of our problems stem from us not being able to require the minors on probation to participate in programs. . . I have beat myself up trying to get these kids into services and then I can’t do anything. I have no teeth, you know, in it so it’s kind of that double-edged sword there that happens. I think this program would be . . . much more effective if we had the backing of the court and we don’t.
Conclusions and Implications for Evaluation
Following the suggestion of evaluation researchers that the utility of program evaluation research can be increased by expanding its focus beyond individual outcome variables to include process variables, our research analyzes a school-based probation program in terms of individual, organizational, and systemic barriers to implementation and delivery. The implementation and delivery of the School-based Probation Program were severely hampered by barriers of all three types. The individual practitioner barriers included inexperienced staff, turnover, role confusion, and role conflict. Organizational barriers consisted of the location of the program offices; a lack of communication among SBOs, the line officer, supervisors, and the schools; “territorial non-cooperation” between SBOs and the line officer; and workload disparities. These barriers also included inaccurate and incomplete record keeping and a lack of supervision by management. The individual and organizational barriers were compounded by systemic barriers in the form of non-cooperation between the schools and the probation department and the failure to incorporate treatment services into the program structure. As with other program evaluation research, our study underscores the importance of examining the existence and influence of these types of process factors when completing a program evaluation. In this particular program, these factors paralyzed the effective operation of the program and ultimately signaled its death.