I worry at times that the quality of evaluations is slipping below where it should be. In a broader environment that sometimes devalues research, that’s risky—not just for individual projects, but for the credibility of our entire field. We have to make sure evaluations are rigorous, well-documented, and transparent enough to withstand peer scrutiny.
It’s a tricky balance, though. Programs often just want a short, accessible impact report that helps them make their case externally and guide decisions internally. That’s why, at LRA, we always produce both a brief, visually engaging impact report and a longer, more detailed report that fully documents our methods and analyses. It’s our way of ensuring both clarity and credibility.
Remember, randomized controlled trials are great, but they’re not the only way to validate impact.
We were thrilled to read the results of a four-year randomized controlled trial (RCT) from Big Brothers Big Sisters of America (BBBS) and Arnold Ventures, showing that high-quality mentoring can reduce youth delinquency and substance use.
The rigor of this study—and its compelling findings—will not only strengthen BBBS’s evidence base, but also help build momentum around the power and potential of youth development programs more broadly.
That said, it’s worth remembering that for 90% of youth development programs, a randomized controlled trial isn’t necessary (or even appropriate) to demonstrate impact.
So take joy in these results, but don’t panic: your program doesn’t need an RCT to demonstrate its value.