Aquila

STATEMENT OF PURPOSE

The AQUILA Working Group is dedicated to providing accurate, evidence-based information about batterer intervention programs and their impact on men who batter. We are committed to enhancing dialogue and public awareness about these programs and about the potential for change for many men who have a history of domestic violence.


Contact David J. H. Garvin, BISC-MI Chair, to join the Aquila Listserv: dgarvin@biscmi.org


We support and promote program practices that:

  • Center on the safety and well-being of adult victims/survivors of intimate partner violence and children.
  • Promote responsibility and safe, nurturing relationships for men who have a history of domestic violence.
  • Encourage multi-institutional, community, and family capacity to hold men who batter accountable for their conduct and encourage them to change.
  • Acknowledge that many men who attend batterer intervention programs face multiple obstacles to long-term change (such as poverty, exposure to trauma, racism, addiction, and the disproportionate impact of our systems), and promote holistic services to help men deal with issues that destabilize the change process.

The Full Picture of Research on Batterer Intervention Programs

Evidence of a Positive Effect of batterer programs:

  • Evaluating batterer counseling programs: A difficult task showing some effects and implications, Evaluating_Batterer_Programs–CDC_summary-fin
  • To BIP, or not to BIP?, Moyer_paper
  • Countering Confusion About The Duluth Model (Paymar & Barnes)
  • Attorney General’s & Lt. Governor’s Family Violence Council, Position on Effectiveness of Abuser Intervention Programs, effectiveness_pospaper_cover
  • Family Violence Council’s Domestic Violence Abuser Research Collaborative, Position on Effectiveness of Abuser Intervention Programs (March 2002), Nitsch_effectiveness_final. That organization is now known as MAIC (Maryland Abuser Intervention Collaborative). Dr. Chris Murphy, of the University of Maryland, Baltimore County, is the research co-chair, and Lisa Nitsch the practitioner co-chair.
  • Link for Judicial Monitoring Audioconference (no longer available). Featuring Dr. Michael Rempel; Judge Carl Ashley, Milwaukee, WI; and Judge Libby Hines, Ann Arbor, MI. Hosted by Barbara Hart.
  • Contributions of Batterer Programs
  • Quotes on Batterer Program Evaluations
  • Batterer Program Evaluations Using a Systems Perspective and showing an impact of batterer programs in context
    • Gamache, D., Edleson, J., & Schock, M. (1988). Coordinated police, judicial and social service response to woman battering: A multi-baseline evaluation across three communities. In G. Hotaling, D. Finkelhor, J. Kirkpatrick, & M. Straus (Eds.), Coping with family violence: Research and policy perspectives (pp. 193-209). Newbury Park, CA: Sage Publications.
    • Murphy, C., Musser, P., & Maton, K. (1998). Coordinated community intervention for domestic abusers. Journal of Family Violence, 13, 263-285. 
    • Bennett, L., Stoops, C., Call, C., & Flett, H. (2007). Program completion and re-arrest in a batterer intervention system. Research on Social Work Practice, 17, 42-54. 
    • Bouffard, J., & Muftic, L. (2007). An examination of the outcomes of various components of a coordinated community response to domestic violence by male offenders. Journal of Family Violence, 22, 353-366.
    • Bledsoe, L., Bibhuti, S., & Barbee, A. (2006). Impact of coordinated response to intimate partner violence on offender accountability. Journal of Aggression, Maltreatment & Trauma, 13, 109-129.
    • Visher, C., Newmark, L., & Harrell, A. (2006). Final report on the evaluation of the Judicial Oversight Demonstration, Volume 2: Findings and lessons on implementation. Washington, DC: Urban Institute.
    • Macleod, D., Pi, R., Smith, D., & Rose-Goodwin, L. (2008). Evaluation of California Batterer Intervention Systems. Final report to the National Institute of Justice, Washington, DC.
    • Gondolf, E. (2002). Batterer Intervention Systems: Issues, Outcomes, and Recommendations. Thousand Oaks, CA: Sage Publications.
  • The response of Judge Carl Ashley and Judge Libby Hines to Mike Rempel/CCI’s Bronx Study on Judicial Monitoring. A conference call facilitated by Barbara Hart, J.D.
    • Go to: http://www.eaglevoicemail.com/mp3/herf2005-0911.mp3 (no longer available)
      • More information was available at the dropsite: http://drop.io/Muskie_Rempel/login (no longer available)
      • Be sure to use the password bwjp (case sensitive)
  • Solutions to the Research-Practice Gap in Domestic Violence: A Modified Delphi Study with Domestic Violence Coalition Leaders Summary of Findings, Solutions_Research-Practice

Articles on Research issues:

AJA Education Announcement

 

American Judges Association: Effective Adjudication of Domestic Abuse Cases

Articles on “Gender-neutral” research:


Some articles on the limitations of Experimental Program Evaluations:

  • Berk, R. (2005). Randomized experiments as the “bronze standard.” Journal of Experimental Criminology, 1, 416-433.
  • Angrist, J. (2005). Instrumental variables methods in experimental criminological research: What, why, and how? Journal of Experimental Criminology, 1, 23-44.
  • Goldkamp, J. (2008). Missing the target and missing the point: “Successful” random assignment but misleading results. Journal of Experimental Criminology, 4, 83-115.
  • Durlak, J., & DuPre, E. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327-350.
  • Matt, G., & Navarro, A. (1997). What meta-analyses have and have not taught us about psychotherapy effects: A review and future directions. Clinical Psychology Review, 17, 1-32.
  • Dobash, R. E., &  Dobash, R. P. (2000). Evaluating criminal justice interventions for domestic violence. Crime and Delinquency, 46, 252-271.
  • Gondolf, E. (2001). Limitations of experimental evaluations of batterer programs. Trauma, Violence, and Abuse, 2, 79-88.

The book The Future of Batterer Programs (http://www.upne.com/1555537692.html) has a chapter devoted to the effectiveness debate and the oversimplifications and misinterpretations that have come out of it.

gondolf_furureorbattererinterventionprograms

A paragraph that directly offers the counterpoint: A 2007 meta-analysis from the Cochrane Collaboration questions any “doesn’t work” interpretation of the previous meta-analyses more explicitly (Smedslund, Dalsbø, Steiro, Winsvold, & Clench-Aas, 2007): “The methodological quality of the included (experimental) studies was generally low . . . The research evidence is insufficient to draw conclusions about the effectiveness of cognitive behavioral interventions for spouse abusers . . . We simply do not know whether the interventions help, whether they have no effect, or whether they are harmful” (p. 18). An earlier analysis funded by the Centers for Disease Control and Prevention used broader inclusion criteria of fifty intervention and prevention programs and reached a conclusion similar to the more selective Cochrane Collaboration: “The diversity of data, coupled with the relatively small number of (experimental) studies that met the inclusion criteria for the evidence-based review, precluded a rigorous, quantitative synthesis of the findings” (Morrison, Lindquist, Hawkins, O’Neil, Nesius, & Mathew, 2003, p. 4).
