Comments on Rossi
Michael Laracy, former director of Policy, Planning, and Program
Evaluation, New Jersey Department of Human Services
I cannot comment on the technical and methodological issues that Dr. Rossi raises
(see Chapter X). I am not an expert in quantitative analysis and am not prepared
to weigh in on whether, for example, a logit or probit regression, with or
without a Huber correction, is an appropriate analytic methodology.
Nevertheless, Dr. Rossi's conclusions are reasonable, and his assessment is
fair, balanced, and warranted. Specifically, the three major criticisms on which
he elaborates throughout his chapter are accurate.
Even putting aside the question of whether the regression models are
appropriate, the other criticisms that Dr. Rossi makes of the impact evaluation
justify his conclusion that "the deficiencies [of the Rutgers evaluation] are
serious enough to cast strong doubts on the validity of the findings."
The Rutgers impact evaluation of FDP (Camasso, Harvey, Jagannathan, and
Killingsworth 1998a, 1998b) is suggestive and informative, but not conclusive,
definitive, or authoritative; in the absence of other, stronger studies, it can
help inform the debate about the effects of FDP, but by no means does it settle
anything. I have several historical and contextual observations about why the
Rutgers study, particularly the research on the family cap, was so flawed.
First, objective and rigorous evaluation of FDP was a low priority
for the Florio administration when the evaluation decisions were made in 1991
and 1992. When the FDP laws were originally enacted and the state sought
"Section 1115" waivers, senior officials within the governor's office (who
were engaged in the negotiations) were actively opposed to rigorous evaluation
of FDP, particularly to any evaluation of "social" versus "economic"
effects. The Florio administration recognized that the family cap was highly
controversial and did not want to provide any evidence for its opponents. The
sponsor (Senator Wayne Bryant) and the governor's staff realized that
objective evaluations might be used to challenge their predictions and
pronouncements of early success, both in the courts and in the press. In
negotiations with the U.S. Department of Health and Human Services (HHS), the
governor's staff were instructed to resist the 1115 waiver requirements for
evaluations, and the state acquiesced only under unrelenting federal
insistence. In my experience, no complex, challenging, expensive,
controversial, and lengthy evaluation can succeed without genuine support from
those in leadership roles. Because such support was absent from the project's
inception, the odds were always stacked against the NJDHS and the Rutgers
evaluators.
Second, the state budget crisis of the period militated against
allocating sufficient resources for the evaluation. To minimize outlays, the
state opted for the least expensive modes and methodologies of evaluation.
Although most of the major research evaluation organizations attended the RFP
bidders' conference, none submitted bids. Only three bids were submitted, and
the state selected the cheapest proposal, despite some misgivings about its
adequacy. The Rutgers proposal was bid at roughly $1 million, half the cost of
the next lowest bid. Frankly, the resulting evaluation was a bargain-basement
job. For example, the difficult task of tracking down and interviewing
participants about their views on FDP fell to minimally trained students, not
professionals. Thus, the dismal survey response rate was hardly surprising. The
state got what it paid for.
Third, as Dr. Rossi details, the implementation of FDP was
inconsistent, and the treatment and control groups were badly contaminated. In
the rush to implement FDP, little was done to ensure that the treatment and
control groups received the appropriate services and were affected by the
correct policies. Likewise, little was done to inform treatment and control
groups about their status and how the law would or would not affect them.
Surveys of members of the treatment and control groups showed, not surprisingly,
widespread misperceptions about their status. Moreover, because New Jersey has a
county-operated welfare system, there was no one FDP program; rather, there were
21 programs, with considerable variation across the counties. Because the
interventions were not correctly or consistently applied, it is impossible to
attribute any outcomes to them. Also, no attempts were made to deal with the
"community effects" and media saturation to which both treatment and control
subjects were equally subjected. Dr. Rossi's comments in this regard are
particularly well taken.
Fourth, FDP was a package of reforms; no analytic attempt was made
to disentangle the effects, only to estimate the net aggregate effects of the
whole package. As Dr. Rossi�s assessment mentions,
seven simultaneous interventions occurred within FDP, with no means in
the evaluation methodology to determine which reforms were causing which
outcomes. Some of FDP's reforms may plausibly have operated in opposition to
each other, negating each reform's effects. For example, although the family
cap might have encouraged those affected by its provisions to exit welfare more
quickly, the law's heavy emphasis on secondary and postsecondary education
(rather than workforce attachment) might tend to encourage longer stays on
welfare. Or the heavy emphasis on schooling might somehow have contributed to
changes in birth rates, either reinforcing or weakening any effects of the
family cap.
Fifth, societal
changes were taking place, altering the environment within which FDP and the
family cap were operating. Social
attitudes and behaviors regarding nonmarital births and teen births began
changing nationally in quite dramatic ways in 1991 and 1992. In many ways, the
family cap was one crystallization of these changing societal values and norms.
For instance, starting in 1992, concurrent with the adoption of the New Jersey
reforms, teen births started to decline nationwide (after six years of steady
increases). The declines have continued every year since then, becoming quite
pronounced. With that type of
social change, it becomes almost impossible to tease out the effects of one
policy change. Likewise, starting in 1994, welfare caseloads began to decline
nationwide and in New Jersey, primarily a result of the robust economic growth
of the past nine years. Consequently, the residual caseload in 1996 had quite
different characteristics from the 1992 caseload, including age, race,
education, average length of stay, and family composition.
Any pre-post comparison that does not adjust for the compositional
changes is flawed.
Sixth, it is one thing to evaluate labor market participation; it
is far harder to understand and measure such social behaviors as reproductive
decisions. Decades of research on welfare-to-work demonstrations have generated
a pretty good set of theoretical constructs, models, assumptions, and contexts
within which to interpret evaluation findings about earnings, labor force
participation, and so forth. In contrast, we know little about how financial
incentives affect reproductive decisions; few models and constructs are widely
accepted, the relevant data are not very reliable, and the values and hypotheses
are wildly confusing and conflicted. This lack of shared knowledge and
assumptions makes it hard to appraise weak findings such as those of the Rutgers
evaluation. By comparison, the Rutgers findings on FDP's welfare-to-work
effects, costs, and benefits were consistent with other studies and are not
controversial (Camasso, Harvey, and Jagannathan 1998). Despite the many
limitations and flaws of the evaluation, most analysts will probably accept the
Rutgers findings on the labor-market effects as true, if not convincingly
proved.
Finally, a single evaluation within one state should never have
been expected to provide a definitive answer to such a complex and controversial
policy reform. Even if the Rutgers findings had been far more robust and
unequivocal, no serious analyst would want to make policy decisions or draw
conclusions on the basis of one study. No one ever should have thought that one
evaluation in one state would provide anything more than a single contribution
to a complex mosaic of analyses and studies. The conclusions and lessons about
the relative effects of workforce attachment versus human capital investment
models took dozens of separate evaluations and a decade of intensive analysis.
The effects of this type of unprecedented social policy reform will take at
least as much study before firm conclusions can be reached.
My observations might be construed as being critical of either the
Rutgers evaluation team or the NJDHS staff responsible for overseeing the
evaluation. That would be wrong and unfortunate. Many of the decisions and
circumstances that so compromised the potential and results of the evaluation
were largely beyond the control of the people most directly involved in the
project. The decisions to avoid, constrain, minimize, and underfund the
federally mandated evaluation were made by the governor�s office, not by
anyone at NJDHS. Likewise, the inability of the researchers and their NJDHS
colleagues to isolate the treatment and control groups from dramatic secular and
environmental changes or to make midcourse corrections was not their fault.
Finally, the unrealistic expectations and wishes of the public, the media, and
most policy makers for definitive and conclusive findings (notwithstanding
repeated disclaimers by NJDHS staff) destined the project to disappoint. Under
the best of circumstances, evaluating the effects of the family cap would have
been a daunting task. Under the miserable circumstances that prevailed, it was
doomed.
References
Camasso, M. J.; Harvey, C.; Jagannathan, R.; and Killingsworth, M. 1998a. A
final report on the impact of New Jersey's Family Development Program. New
Brunswick, NJ: Rutgers University.
Camasso, M. J.; Harvey, C.; Jagannathan, R.; and Killingsworth, M. 1998b. A
final report on the impact of New Jersey's Family Development Program: Results
from a pre-post analysis of AFDC case heads from 1990 to 1996. New Brunswick,
NJ: Rutgers University.
Camasso, M. J.; Harvey, C.; and Jagannathan, R. 1998. Cost-benefit analysis of
New Jersey's Family Development Program: Final report. New Brunswick, NJ:
Rutgers University.