August 31, 2017

If there’s one thing the anti-trafficking world lacks, it’s reliable data. Good data would help us understand the scope of the epidemic; the shared characteristics of victims, survivors, and perpetrators; the impact of our interventions; and more. The challenge in gathering trustworthy data is part of what makes human trafficking possible in the first place – its invisibility. As human trafficking scholar Ronald Weitzer says:

… in 2010, the U.S. government asserted that 0.18 percent of the world’s population were current trafficking victims—trafficking defined as “forced labor, bonded labor, and forced prostitution” (U.S. Department of State 2010, 7). No sources were provided to document any of these figures. Likewise, Kevin Bales claims that “the number of slaves in the world today is 27 million” (Bales 2004, 8); Bales says the figure is “a good guess” but offers no evidence or even a rough idea of how he arrived at it. Unfortunately, many media and government sources have treated this figure as factual…

On Getting ANY Data

There are organizations working to remedy this dearth of trustworthy data. The Global Slavery Index, a project of the Walk Free Foundation, is perhaps the best-known entity attempting to do so at a global level.

Unlike those who made the claims above, The Global Slavery Index at least presents their methodology – sort of. Unfortunately, the methodology they present is too opaque to be judged reliable.

According to their website, they have conducted 25 surveys with Gallup Inc. since 2014, reaching 42,000 people in 25 countries and 53 languages. Given the costs of surveys, in both time and money, this is an impressive feat. Nevertheless, their research claims to cover 167 countries, and while this year’s methodology doesn’t mention extrapolating from surveyed countries to unsurveyed ones, we can see no other way to cover the 142 countries where surveys weren’t conducted.

From religious views of work to cultural understandings of gender, when it comes to the incredibly complex nexus of issues that lead to human trafficking and slavery, context matters. While extrapolations from one country to another can offer speculative insight – insight that could, say, guide the construction of a hypothesis to be tested – they cannot offer trustworthy data, especially when they are based on general metrics such as similar GDPs or employment rates. Maybe The Global Slavery Index uses a rich set of highly detailed metrics for its extrapolations, but as it stands, we just don’t know.
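To make the worry concrete, here is a minimal sketch of the kind of GDP-matched extrapolation we suspect is at work. Every number, country label, and the matching rule itself are invented for illustration; nothing here reflects The Global Slavery Index’s actual model. A non-surveyed country simply inherits the average prevalence rate of the surveyed countries nearest to it in GDP per capita:

```python
# Hypothetical illustration of prevalence extrapolation; all figures invented.
# A non-surveyed country inherits the average prevalence rate (victims per
# 1,000 people) of the surveyed countries closest to it in GDP per capita.

surveyed = {
    # country: (gdp_per_capita_usd, prevalence_per_1000) -- invented figures
    "A": (1_500, 6.2),
    "B": (2_000, 4.8),
    "C": (9_000, 1.1),
}

def extrapolate(gdp_per_capita, population, k=2):
    """Estimate a victim count for a non-surveyed country by averaging the
    prevalence of the k surveyed countries nearest in GDP per capita."""
    nearest = sorted(surveyed.values(),
                     key=lambda s: abs(s[0] - gdp_per_capita))[:k]
    rate = sum(prev for _, prev in nearest) / k  # per-1,000 rate
    return round(rate * population / 1_000)

# The matched country inherits a rate that ignores religion, gender norms,
# migration, and conflict -- precisely the contextual factors noted above.
estimate = extrapolate(gdp_per_capita=1_800, population=20_000_000)
```

The sketch shows the problem at a glance: two countries with similar GDPs can differ on every contextual factor that actually drives trafficking, yet the method treats them as interchangeable.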

At The Freedom Story, we appreciate the desire behind The Global Slavery Index. We’ve even used some of their data. Though not entirely reliable, their estimates help communicate the likely scope of this epidemic quickly – a scope even the most skeptical person believes to be huge. While this may appear hypocritical, it’s just NGO reality; in our whiplash-paced age, where attention is yanked in a hundred different directions each day, quick communication is key.

We raise the problems of The Global Slavery Index for one reason alone: it’s really – and we mean, really – hard to get reliable data when it comes to human trafficking and modern slavery. When it comes to global numbers, it’s best to be skeptical.

On Trying to Get BETTER Data

The good news is that many organizations and academics are doing the hard work of rigorous, forensic-level research in specific areas, even if that work lacks global coverage and celebrity. Chab Dai is in the midst of an incredibly exciting longitudinal study that has already shed much light on the rehabilitation and reintegration of trafficking survivors. Love146 is another organization conducting phenomenal research.

This past year, we at The Freedom Story began what we hope will become a long-term effort to help fill this lacuna in the counter-trafficking world. (Click here to learn about our Social Impact Assessment and see a summary of our findings!)

Our Process

Recognizing the importance of partnerships, we partnered with Liberty Asia, which “aims to prevent human trafficking through legal advocacy, technological interventions, and strategic collaborations with NGOs, corporations, and financial institutions in Southeast Asia,” to provide data on our interventions and programs. After conducting our first social impact assessment, we’ve taken a more rigorous approach to measuring our own programs. While we look forward to having concrete data to share with the broader counter-trafficking world, the process of conducting research has already paid off.

Not knowing how long it would take to complete, we began our social impact assessment in January 2015, when an independent researcher, Dr. Melissa Anderson-Hinn, came to Chiang Rai, Thailand, to conduct focus groups and in-depth interviews with The Freedom Story’s constituents. Over the course of three weeks, she recorded the interviews and discussions through both shorthand notes and audio. Several months later, she presented her findings to our staff.

Using Dr. Anderson-Hinn’s findings as a guide, we brought on another independent researcher, Athalie Waugh, to help us develop and administer surveys for staff, scholarship students, and parents. There were two reasons for this: 1) to see if we could provide some quantitative(ish) support for Dr. Anderson-Hinn’s qualitative findings, and 2) to explore the feasibility of conducting a longitudinal study.

While we still have a lot to learn about the actual process of conducting research, we’re happy to say that we’ve already learned a lot. You can read about some of it below.

Lessons Learned

Confront Your Assumptions

Since mentorship is such a huge part of our programs, we assumed it was a mutually recognizable concept. As it turns out, this isn’t true. There is, in fact, no Thai word for mentor, and thus some of our students weren’t sure whether they had a mentor, or what this relationship was supposed to be. We didn’t realize this until we looked at the data. As mentorship is a key aspect of our programming, this finding suggests that, at the very least, we need to help our students understand what mentorship is. We also need to take a critical look at how we’re running our mentorship program. We learned that our Thai team needed more clarity on their roles, and that it would be beneficial to formalize several of our processes on the ground in Thailand.

Research is Easy to Plan, Hard to Finish

During the second phase of our social impact assessment, we ran into a lot of unforeseen obstacles. For example, Thai communal culture means people love to talk and share as they work; one byproduct is that “copying” is not frowned upon the way it might be elsewhere. We underestimated how hard it would be for students and staff to understand the importance of completing the surveys on their own.

We also didn’t realize how challenging it would be to administer surveys to the parents of our scholarship students, as they had limited literacy skills and were often difficult to reach at home. We thought having staff help administer surveys would reduce costs and assist parents with limited literacy skills; however, this solution conflicted with the goal of keeping responses anonymous. Moreover, asking staff to visit families in their homes to conduct the surveys would have created an undue burden on staff who already had full schedules. Because staff couldn’t administer the surveys and we didn’t have the budget to hire another outside researcher, we attempted to administer the surveys at our annual parent meeting. This, however, turned out to be another idea that sounded good in theory but not in practice: after we covered the other essential aspects of the meeting, no time was left for the surveys. Thus, administering parent surveys remains a future goal.

Emojis Help

Before administering surveys to all of our scholarship students, we had a couple of students do a test run. They had trouble with the Likert (1–5) scale we used. Abstract thinking develops in adolescence, so it should have come as no surprise that our younger students had trouble thinking in gradations. We decided to insert emojis into our scale. If, two years ago, someone had told me that the expressions spread across the faces of emojis would save a research project, I wouldn’t have believed them; but they would have been right.
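As an illustration of the idea (the wording and emoji choices here are ours, not the actual survey instrument), an emoji-anchored Likert scale simply pairs each numeric level with a concrete facial expression, so a young respondent can reason from a face rather than an abstract gradation:

```python
# Illustrative emoji-anchored Likert scale. The labels and emoji below are
# invented for this sketch, not taken from the survey itself. Pairing each
# number with a face gives younger students a concrete anchor for each level.

LIKERT_EMOJI = {
    1: ("😞", "strongly disagree"),
    2: ("🙁", "disagree"),
    3: ("😐", "neutral"),
    4: ("🙂", "agree"),
    5: ("😄", "strongly agree"),
}

def render_scale():
    """Format the scale roughly as it might appear on a printed survey."""
    return "   ".join(
        f"{n} {face}" for n, (face, _label) in sorted(LIKERT_EMOJI.items())
    )
```

The design choice is the point, not the code: the numbers stay on the page for analysis, while the faces carry the meaning for respondents who can’t yet hold five abstract gradations in mind.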

Knowledge is Hard Won

Even after we put all of our protocols in place to assure the anonymity of the results of the surveys, it’s possible some bias was still introduced. The reason? Our staff administered the surveys. If we really wanted to be objective, we would have hired an outside researcher to do the entire thing. That’s expensive, however, and, as with most non-profits, money is always tight. Though we administered the surveys ourselves, this doesn’t mean we can’t learn from our results. We have already gained valuable information and data.

It does mean, however, that the data we’ve gained from our research might not pass peer review — the rigorous process of subjecting a study’s methodology, objectives, and findings to scrutiny by independent experts in the field. Peer review represents, in each discipline, the gold standard for acquiring trustworthy knowledge. Given the current ascendance of “alternative facts” in the US, it’s all the more important that we’re honest about the best methods we have for wrestling truth from the world.

While happy with the insights we’ve gained thus far, we realize we have room to grow. Our goal is to get to the place where all of the research we produce is able to pass peer review.

Be Careful When Reading Legislation

Finally, one of my roles for the project was to conduct a literature review, part of which consisted in defining human trafficking. Human trafficking was, of course, first defined at the international level in 2000, in the UN agreement commonly known as the Palermo Protocol. That same year, in October 2000, Congress passed The Trafficking Victims Protection Act (TVPA), and President Bill Clinton signed it into law. The law has since been reauthorized and amended several times, the latest iteration having passed Congress in 2015. An important aspect of this legislation is the legal designation of children and youth. For sex workers and others involved in illegal forms of labor, whether they’re considered a youth or an adult can be the difference between receiving compensation for abuse and doing jail time.

While conducting research for the literature review, I read a section of a recent amendment that defined a child as anyone under the age of 11 and a youth as anyone between 11 and 24. I’m a bit interested in cognitive neuroscience, so when I read this, I thought it made perfect sense – the prefrontal cortex, a region of the brain responsible for higher order cognitive functions such as understanding the likely future consequences of present actions, isn’t fully developed until 25 on average. How enlightened of Congress to recognize that the physical substrate for making informed decisions isn’t fully developed in youth, and thus that sex workers, or those caught up in other forms of illegal labor, shouldn’t be held legally responsible for this until it is.

The only problem was that Congress wasn’t that enlightened; I had misread the law. If I now understand correctly, that age range seems to apply only to runaways, not youth in general and not sex workers. Thus, anyone 18 or over who sells sex is considered a criminal, not a victim. I didn’t catch my mistake, however, until later, when I was revisiting the law for a section I was writing for our new website. The final version of our social impact assessment had already gone to press.

This wasn’t the only mistake (I found typos in the printed report – so frustrating!), but it was the biggest. This is, of course, why news agencies and publishing houses employ fact checkers and editors. Our limited budget meant that I was the author, the main editor, and the fact checker. Even with the eagle eyes of Jade Keller, without whom I couldn’t have completed the final report, some mistakes made it through. Maybe not the worst mistakes, and I hope this confession helps atone for them, but every time I see our printed SIA, I feel stupid and have to fight the urge to rip these pages out.

Looking Ahead

While catching errors and stumbling in cultural minefields can be frustrating, the exciting part of this process is how much we learned, and how much stronger and more experienced we feel about future attempts to do follow-up research. Though we must always anticipate new hurdles and new lessons to learn, we come armed with this experience and ready for the next excursion into field research!


Dan Olson is The Freedom Story’s in-house writer and researcher.