Notes

Introduction

1. Daniel Patrick Moynihan, Maximum Feasible Misunderstanding: Community Action in the War on Poverty (New York, NY: Free Press, 1969), 193.

2. U.S. House of Representatives, Committee on Ways and Means, 1996 Green Book, 104th Congress, 2nd session, 4 Nov. 1996, 1333–96; Mark Greenberg and Steve Savner, "A Detailed Summary of Key Provisions of the Temporary Assistance for Needy Families Block Grant of H.R. 3734," Center for Law and Social Policy, 13 Aug. 1996.

Current Evaluations

1. The mandatory AFDC caseload excludes those adults who are exempt (because of the age of their children, a disability, or other specified factors) or have good cause for not participating. Nationally, for participation rate purposes under FSA requirements, less than half of the AFDC caseload was considered mandatory in fiscal year 1995.
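
For illustration only, a minimal sketch in Python of the participation-rate arithmetic described above; the function and all figures are hypothetical, and the actual FSA rules contain details not modeled here:

    # Hypothetical sketch: the participation rate is computed against the
    # mandatory caseload only, i.e., adults who are neither exempt nor
    # excused for good cause. Figures below are invented for illustration.
    def participation_rate(caseload, exempt_or_good_cause, participating):
        mandatory = caseload - exempt_or_good_cause
        return participating / mandatory

    # If 55,000 of 100,000 cases are exempt or have good cause, only the
    # remaining 45,000 count toward the denominator.
    rate = participation_rate(100_000, 55_000, 18_000)
    print(f"Participation rate: {rate:.1%}")  # 40.0%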

2. Stephen Freedman and Daniel Friedlander, The JOBS Evaluation: Early Findings on Program Impacts in Three Sites (Washington, DC: U.S. Department of Health and Human Services and U.S. Department of Education, Sept. 1995). Associated reports include: (1) Thomas Brock and Kristen Harknett, "Separation versus Integration of Income Maintenance and Employment Services: Which Model Is Best? Findings from a Case Management Experiment," Manpower Demonstration Research Corporation, Jan. 1997. This study examines the impact of integrating income maintenance and employment services in Columbus, Ohio, within the context of a human capital development approach. (2) Stephen Freedman, Daniel Friedlander, Kristen Harknett, and Jean Knab, "Preliminary Impacts on Employment, Earnings, and AFDC Receipt in Six Sites in the JOBS Evaluation," Manpower Demonstration Research Corporation, Jan. 1997. This study presents two-year findings on the effectiveness of ten programs in six sites. Some emphasized the labor force attachment approach and others the human capital development approach. Six of the programs showed positive impacts on earnings, and nine reduced the average number of months of AFDC receipt. (3) Gayle Hamilton, The JOBS Evaluation: Monthly Participation Rates in Three Sites and Factors Affecting Participation Levels in Welfare-to-Work Programs (Washington, DC: U.S. Department of Health and Human Services and U.S. Department of Education, Sept. 1995). This report analyzes the participation patterns of recipients in three sites, including the extent of participation and reasons for nonparticipation. (4) Kristin A. Moore, Martha J. Zaslow, Mary Jo Coiro, Suzanne M. Miller, and Ellen B. Magenheim, The JOBS Evaluation: How Well Are They Faring? AFDC Families with Preschool-Aged Children in Atlanta at the Outset of the JOBS Evaluation (Washington, DC: U.S. Department of Health and Human Services and U.S. Department of Education, Sept. 1995). This report provides a description of a range of child outcomes near the beginning of the evaluation in one site, Fulton County, Georgia; it finds that the families in the study are disadvantaged in many ways. Other reports from MDRC describe preliminary impacts in specific sites.

3. Although the two-year findings for the labor force attachment approach are more impressive than those for the human capital model, MDRC cautions that education and training programs may initially keep some participants on welfare longer but are intended to build the skills needed to increase self-sufficiency in the long run. Thus, longer follow-up will be necessary to identify the more effective approach. MDRC also cautions that the results should be considered preliminary because survey data were available for only 39 percent of the full sample (so that sample sizes were small) and follow-up lasted only two years, which may not be long enough for the human capital development model to show its full impact.

4. Jan L. Hagen and Irene Lurie, Implementing JOBS: Progress and Promise (Albany, NY: The Nelson A. Rockefeller Institute of Government, Aug. 1994).

5. Rebecca Maynard, ed., Building Self-Sufficiency Among Welfare-Dependent Teenage Parents: Lessons from the Teenage Parent Demonstration (Princeton, NJ: Mathematica Policy Research, Inc., Jun. 1993).

6. Anu Rangarajan, Taking the First Steps: Helping Welfare Recipients Who Get Jobs Keep Them (Princeton, NJ: Mathematica Policy Research, Inc., 1996).

7. Earl Johnson and Fred Doolittle, Low-Income Parents and the Parents' Fair Share Demonstration: An Early Qualitative Look at Low-Income Noncustodial Parents (NCPs) and How One Policy Initiative Has Attempted to Improve Their Ability to Pay Child Support (New York, NY: Manpower Demonstration Research Corporation, 1996).

8. Steve Savner and Mark Greenberg, The CLASP Guide to Welfare Waivers: 1992–1995 (Washington, DC: Center for Law and Social Policy, 23 May 1995).

9. Dan Bloom and David Butler, Implementing Time-Limited Welfare (New York, NY: Manpower Demonstration Research Corporation, 1995).

10. LaDonna Pavetti and Amy-Ellen Duke (with Clemencia Cosentino de Cohen, Pamela Holcomb, Sharon K. Long, and Kimberly Rogers), Increasing Participation in Work and Work-Related Activities: Lessons from Five State Welfare Reform Projects, vol. I and II (Washington, DC: The Urban Institute, Sept. 1995).

11. Douglas J. Besharov, Kristina Tanasichuk White, and Mark B. Coggeshall, Health-Related Welfare Rules (Washington, DC: American Enterprise Institute for Public Policy Research, Nov. 1996).

12. Rosina M. Becerra, Alisa Lewin, Michael N. Mitchell, and Hiromi Ono, California Work Pays Demonstration Project: January 1993 through June 1995 (Los Angeles, CA: School of Public Policy and Social Research, UCLA, Dec. 1996).

13. Peggy Cuciti, Colorado Personal Responsibility and Employment Program: Preliminary Analysis (Denver, CO: University of Colorado at Denver, Feb. 1997).

14. Dan Bloom, James J. Kemple, and Robin Rogers-Dillon, The Family Transition Program: Implementation and Early Impacts of Florida's Initial Time-Limited Welfare Program (New York, NY: Manpower Demonstration Research Corporation, 1997); Dan Bloom, The Family Transition Program: An Early Implementation Report on Florida's Time-Limited Welfare Initiative (New York, NY: Manpower Demonstration Research Corporation, Nov. 1995).

15. Larry Kerpelman, David Connell, Michelle Ciurea, Nancy McGarry, and Walter Gunn, Preschool Immunization Project Evaluation: Interim Analysis Report (Cambridge, MA: Abt Associates, Inc., 1 May 1996).

16. Thomas Fraker, Lucia Nixon, Jan Losby, Carol Prindle, and John Else, Iowa's Limited Benefit Plan (Washington, DC: Mathematica Policy Research, Inc., May 1997).

17. Schaefer Center for Public Policy, Maryland's Primary Prevention Initiative: An Interim Report (Baltimore, MD: University of Baltimore, 22 Nov. 1995).

18. Alan Werner and Robert Kornfeld, The Evaluation of To Strengthen Michigan Families: Final Impact Report (Cambridge, MA: Abt Associates, Inc., Sept. 1997).

19. Virginia Knox, Amy Brown, and Winston Lin, MFIP: An Early Report on Minnesota's Approach to Welfare Reform (New York, NY: Manpower Demonstration Research Corporation, Nov. 1995).

20. William L. Hamilton, Nancy R. Burstein, August J. Baker, Alison Earle, Stefanie Gluckman, Laura Peck, and Alan White, The New York State Child Assistance Program: Five Year Impacts, Costs, and Benefits (Cambridge, MA: Abt Associates Inc., Oct. 1996).

21. Johannes M. Bos and Veronica Fellerath, Final Report on Ohio's Welfare Initiative to Improve School Attendance: Ohio's Learning, Earning, and Parenting Program (New York, NY: Manpower Demonstration Research Corporation, Aug. 1997); David Long, Judith M. Gueron, Robert G. Wood, Rebecca Fisher, and Veronica Fellerath, Three-Year Impacts of Ohio's Welfare Initiative to Improve School Attendance Among Teenage Parents: Ohio's Learning, Earning, and Parenting Program (New York, NY: Manpower Demonstration Research Corporation, Apr. 1996); Dan Bloom, Hilary Kopp, David Long, and Denise Polit, Implementing a Welfare Initiative to Improve School Attendance Among Teenage Parents: Ohio's Learning, Earning, and Parenting Program (New York, NY: Manpower Demonstration Research Corporation, Jul. 1991).

22. Utah Department of Human Services, Utah Single Parent Employment Demonstration Program: It's About Work, Preliminary Two Year Report (undated).

23. State of Wisconsin Legislative Audit Bureau, An Evaluation of Third Semester Effects of the Wisconsin Learnfare Program (Madison, WI: 1 May 1996).

24. See generally Institute for Research on Poverty, "Monitoring the Effects of the New Federalism: A Conference," Focus 18, Special Issue 1996, 12–17.

25. Janet C. Quint, Johannes M. Bos, and Denise F. Polit, New Chance: Final Report on a Comprehensive Program for Disadvantaged Young Mothers and Their Children (New York, NY: Manpower Demonstration Research Corporation, Jul. 1997); Janet Quint, Denise Polit, Hans Bos, and George Cave, New Chance: Interim Findings on a Comprehensive Program for Disadvantaged Young Mothers and Their Children (New York, NY: Manpower Demonstration Research Corporation, Sept. 1994).

26. David Card and Philip Robins, Do Financial Incentives Encourage Welfare Recipients to Work? Initial 18-Month Findings from the Self-Sufficiency Project (Ottawa, Ontario: Social Research and Demonstration Corporation, Feb. 1996).

27. Dudley Benoit, The New Hope Offer: Participants in the New Hope Demonstration Discuss Work, Family, and Self-Sufficiency (New York, NY: Manpower Demonstration Research Corporation, 1996).

Future Evaluations

1. The Survey of Income and Program Participation (SIPP), produced by the Census Bureau, collects monthly information on about 20,000 households for a period of two-and-a-half years, including detailed information on employment, income, and participation in social programs. Because it is longitudinal, it is particularly useful for analyzing changes in income and program participation over time. The Census Bureau's Current Population Survey (CPS), the primary source of information on income and poverty in the United States, may also be used by researchers to analyze the impact of the new welfare law. Based on a sample of 60,000 households surveyed each March, it collects data on the demographic and economic characteristics of the sample individuals and households in the preceding year.
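
Because SIPP follows the same households from month to month, researchers can measure program entries and exits directly. A minimal sketch in Python of that kind of longitudinal tabulation, using an invented panel extract (the column names are illustrative, not SIPP's actual variable names):

    import pandas as pd

    # Invented longitudinal extract: one row per household per month.
    panel = pd.DataFrame({
        "household": [1, 1, 1, 2, 2, 2],
        "month":     [1, 2, 3, 1, 2, 3],
        "on_afdc":   [True, True, False, False, True, True],
    })
    panel = panel.sort_values(["household", "month"])
    prev = panel.groupby("household")["on_afdc"].shift()

    # An exit is on-program last month but not this month; an entry is the
    # reverse. A single cross-section cannot distinguish these flows from
    # a simple change in the overall caseload.
    exits = ((prev == True) & ~panel["on_afdc"]).sum()
    entries = ((prev == False) & panel["on_afdc"]).sum()
    print(f"exits: {exits}, entries: {entries}")  # exits: 1, entries: 1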

2. Devolution is the assignment of planning and decisionmaking responsibilities to lower levels of government or even to communities.

3. Mary Jo Bane and David Ellwood, Welfare Realities: From Rhetoric to Reform (Cambridge, MA: Harvard University Press, 1994).

4. LaDonna Pavetti, "Who Is Affected by Time Limits?" in Welfare Reform: An Analysis of the Issues, ed. Isabel V. Sawhill (Washington, DC: The Urban Institute, 1995).

Evaluating the Evaluations

1. See Matthew Birnbaum and Michael Wiseman, "Extending Assistance to Intact Families: State Experiments with the 100-Hour Rule," Focus 18, Special Issue 1996, 38–41.

2. George Galster, "The Challenges for Policy Research in a Changing Environment," in The Future of the Public Sector (Washington, DC: The Urban Institute, Nov. 1996).

3. Sheila Zedlewski, Sandra Clark, Eric Meier, and Keith Watson, Potential Effects of Congressional Welfare Reform Legislation on Family Incomes (Washington, DC: The Urban Institute, 26 Jul. 1996).

4. John Harwood, "Think Tanks Battle To Judge the Impact of Welfare Overhaul," The Wall Street Journal, 30 Jan. 1997, A1.

5. Dan Bloom, The Family Transition Program: An Early Implementation Report on Florida's Time-Limited Welfare Initiative (New York, NY: Manpower Demonstration Research Corporation, Nov. 1995).

6. Jodie Allen, "An Introduction to the Seattle/Denver Income Maintenance Experiment: Origins, Limitations, and Policy Relevance," Proceedings of the 1978 Conference on the Seattle and Denver Income Maintenance Experiments (Olympia, WA: Department of Social and Health Services, 1979), 18.

7. David J. Fein, "Waiver Evaluations: The Pitfalls–and the Opportunities," Public Welfare, Fall 1994, 27.

8. Paul Decker, REACH Welfare Initiative Program Evaluation: Estimating the Effects of the REACH Program on AFDC Receipt (Princeton, NJ: Mathematica Policy Research, Inc., Aug. 1991), 1.

9. State of Wisconsin Legislative Audit Bureau, An Evaluation of Third Semester Effects of the Wisconsin Learnfare Program (Madison, WI: 1 May 1996).

10. Charles Manski and Irwin Garfinkel, "Introduction," in Evaluating Welfare and Training Programs, ed. Charles Manski and Irwin Garfinkel (Cambridge, MA: Harvard University Press, 1992), 8.

11. Robert LaLonde, "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review 76, Sept. 1986, 604–20.

12. Thomas Fraker and Rebecca Maynard, "Evaluating Comparison Group Designs with Employment-Related Programs," Journal of Human Resources 22, Spring 1987, 194–227.

13. Robert LaLonde, "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review 76, Sept. 1986, 617.

14. Ibid.

15. Thomas Fraker and Rebecca Maynard, "Evaluating Comparison Group Designs with Employment-Related Programs," Journal of Human Resources 22, Spring 1987.

16. James J. Heckman and Jeffrey A. Smith, "Assessing the Case for Social Experiments," Journal of Economic Perspectives 9, Spring 1995, 85–110.

17. Ibid., 91.

18. See James J. Heckman and Joseph V. Hotz, "Choosing Among Alternative Nonexperimental Methods for Estimating the Impact of Social Programs: The Case of Manpower Training," Journal of the American Statistical Association 84, Dec. 1989, 862–74.

19. James J. Heckman and Jeffrey A. Smith, "Assessing the Case for Social Experiments," Journal of Economic Perspectives 9, Spring 1995, 91.

20. James Riccio, Daniel Friedlander, and Stephen Freedman, GAIN: Benefits, Costs, and Three-Year Impacts of a Welfare-to-Work Program (New York, NY: Manpower Demonstration Research Corporation, Sept. 1994).

21. Ibid., 125.

22. Ibid., 10.

Appendix A

1. Gary Burtless, "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives 9, Spring 1995, 69.

2. David Greenberg and Mark Shroder, Digest of Social Experiments (Madison, WI: Institute for Research on Poverty, University of Wisconsin, 1991).

3. Erica Baum, "When the Witch Doctors Agree: The Family Support Act and Social Science Research," Journal of Policy Analysis and Management 10, Fall 1991, 603–15.

4. Larry L. Orr, Howard S. Bloom, Stephen H. Bell, Winston Lin, George Cave, and Fred Doolittle, The National JTPA Study: Impacts, Benefits, and Costs of Title II-A (Bethesda, MD: Abt Associates Inc., Mar. 1994).

5. See generally Peter H. Rossi and Howard E. Freeman, Evaluation: A Systematic Approach, 5th ed. (Newbury Park, CA: SAGE Publications, Inc., 1993).

6. Of course, whether a particular intervention makes someone better off or worse off cannot be determined a priori.

7. Michael J. Puma, Janet DiPietro, Jeanne Rosenthal, David Connell, David Judkins, and Mary Kay Fox, Study of the Impact of WIC on the Growth and Development of Children. Field Test: Feasibility Assessment. Final Report: Volume I (Cambridge, MA: Abt Associates Inc., 1991).

8. Anne Gordon, Jonathan Jacobson, and Thomas Fraker, Approaches to Evaluating Welfare Reform: Lessons from Five State Demonstrations (Princeton, NJ: Mathematica Policy Research, Inc., Oct. 1996).

9. Peter H. Rossi and Howard E. Freeman, Evaluation: A Systematic Approach, 5th ed. (Newbury Park, CA: SAGE Publications, Inc., 1993).

10. Charles E. Metcalf and Craig Thornton, �Random Assignment,� Children and Youth Services Review 14, 1992, 152.

11. Peter Rossi, �What the New Jersey Experiment Results Mean and Do Not Mean,� in Addressing Illegitimacy: Welfare Reform Options for Congress (Washington, DC: American Enterprise Institute for Public Policy Research, 11 Sept. 1995).

12. Larry C. Kerpelman, David B. Connell, Michelle Ciurea, Nancy McGarry, and Walter Gunn, Preschool Immunization Project Evaluation: Interim Analysis Report (Cambridge, MA: Abt Associates Inc., 1 May 1996).

13. Rossi and Freeman explain: "Although randomly-formed experimental and control groups are 'statistically equivalent' at the start of an evaluation, non-random processes may threaten their equivalence as the experiment progresses. Differential attrition may introduce differences between experimentals and controls. In the income maintenance experiments, for example, families in the experimental groups who received the less generous payment plans and families in the control groups were more likely to stop cooperating as subjects." Peter H. Rossi and Howard E. Freeman, Evaluation: A Systematic Approach, 5th ed. (Newbury Park, CA: SAGE Publications, Inc., 1993).
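
A minimal simulation sketch, in Python, of the attrition problem Rossi and Freeman describe: random assignment starts the groups out equivalent, but if low-earning controls drop out disproportionately, the comparison among those who remain is biased. All numbers are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 50_000
    true_effect = 500.0                       # true impact on earnings

    baseline = rng.normal(10_000, 3_000, n)   # pre-program earnings
    treated = rng.random(n) < 0.5             # random assignment
    outcome = baseline + true_effect * treated

    # Differential attrition: low-earning controls stop cooperating far
    # more often than anyone else.
    drop_prob = np.where(~treated & (baseline < 8_000), 0.5, 0.05)
    stays = rng.random(n) > drop_prob

    estimate = (outcome[treated & stays].mean()
                - outcome[~treated & stays].mean())
    print(f"true effect: {true_effect:.0f}, "
          f"estimate under attrition: {estimate:.0f}")
    # The estimate falls far below 500, because the surviving control
    # group overrepresents higher earners.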

14. Robert Moffitt, "Evaluation Methods for Program Entry Effects," in Evaluating Welfare and Training Programs, ed. Charles Manski and Irwin Garfinkel (Cambridge, MA: Harvard University Press, 1992), 231–52.

15. Irwin Garfinkel, Charles F. Manski, and Charles Michalopoulos, "Micro Experiments and Macro Effects," in Evaluating Welfare and Training Programs, ed. Charles Manski and Irwin Garfinkel (Cambridge, MA: Harvard University Press, 1992), 253–76.

16. James J. Heckman and Jeffrey A. Smith, "Assessing the Case for Social Experiments," Journal of Economic Perspectives 9, Spring 1995, 85–110.

17. Demetra S. Nightingale, Lynn C. Burbridge, Douglas Wissoker, Lee Bawden, Freya L. Sonenstein, and Neal Jeffries, Experiences of Massachusetts ET Job Finders: Preliminary Findings (Washington, DC: The Urban Institute, 1989).

18. Anne Gordon, Jonathan Jacobson, and Thomas Fraker, Approaches to Evaluating Welfare Reform: Lessons from Five State Demonstrations (Princeton, NJ: Mathematica Policy Research, Inc., Oct. 1996), 23.

19. Research Triangle Institute, Final Report: Evaluation of the 1981 AFDC Amendments (Research Triangle Park, NC: Research Triangle Institute, 15 Apr. 1983); Ira Moscovice and William J. Craig, "The Omnibus Budget Reconciliation Act and the Working Poor," Social Service Review 58, Mar. 1984, 49–62; and U.S. General Accounting Office, An Evaluation of the 1981 AFDC Changes: Initial Analyses (Washington, DC: U.S. Government Printing Office, 1984).

20. Paul Decker, REACH Welfare Initiative Program Evaluation: Estimating the Effects of the REACH Program on AFDC Receipt (Princeton, NJ: Mathematica Policy Research, Inc., Aug. 1991).

21. Peter H. Rossi and Howard E. Freeman, Evaluation: A Systematic Approach, 5th ed. (Newbury Park, CA: SAGE Publications, Inc., 1993), 250. "In general, then, simple before/after reflexive designs provide findings that have a low degree of credibility. This is particularly the case when the time elapsed between the two measurements is appreciable–say, a year or more–because over time it becomes more and more likely that some process occurring during the time period may obscure the effects of the program, whether by enhancing them or by diminishing them." (p. 343).
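
A minimal simulation sketch, in Python, of the weakness Rossi and Freeman describe: a simple before/after comparison attributes any economy-wide trend over the follow-up period to the program. All numbers are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    true_effect = 300.0     # true program impact on annual earnings
    secular_trend = 700.0   # economy-wide improvement over the same period

    before = rng.normal(9_000, 2_500, n)
    after = before + true_effect + secular_trend + rng.normal(0, 500, n)

    # The reflexive estimate bundles the trend in with the program effect;
    # nothing in the design can separate the two components.
    reflexive = (after - before).mean()
    print(f"true effect: {true_effect:.0f}, "
          f"before/after estimate: {reflexive:.0f}")
    # Prints roughly 1000, not 300.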

22. Burt Barnow, "The Impact of CETA Programs on Earnings: A Review of the Literature," Journal of Human Resources 22, Spring 1987, 157–93.

23. Peter H. Rossi and Howard E. Freeman, Evaluation: A Systematic Approach, 5th ed. (Newbury Park, CA: SAGE Publications, Inc., 1993).

24. June O'Neill, Work and Welfare in Massachusetts: An Evaluation of the ET Program (Boston, MA: Pioneer Institute for Public Policy Research, 1990).

25. Ibid.

26. Gary Burtless, "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives 9, Spring 1995, 72.

 

© 1997 by the University of Maryland, College Park, Maryland. All rights reserved. No part of this publication may be used or reproduced in any manner whatsoever without permission in writing from the University of Maryland, except in cases of brief quotations embodied in news articles, critical articles, or reviews. The views expressed in the publications of the University of Maryland are those of the authors and do not necessarily reflect the views of the staff, advisory panels, officers, or trustees of the University of Maryland.



