Interviews can supplement the quantitative data gathered by a survey

Several firms combine modes of data gathering. They start with a survey emailed to their invitee list or otherwise publicized. Later, the firm (or the service provider it retained) seeks interviews with a subset of the invitees. (We assume that those who were interviewed also completed the survey, but the reports do not confirm that assumption.)

The survey gathers quantitative data while the interviews gather qualitative insights. Interviews cost money, but what firms learn from conversations deepens, clarifies and amplifies the story told by survey data. Interviews also enable the firm to strengthen its connections to participants who care about the topic.

The reports make little of the interview process and provide almost no detail about the interviews; their findings show up as quotes and case studies. DLA Piper Debt 2015, for example, states that 18 interviews were conducted; commendably, it lists the names and organizations of those who were interviewed [pg. 30]. We show the first few in the snippet below.

Reed Smith LondonWomen 2018 [pg. 22] mentions that “Several individuals opted to take part in further discussion through email exchange, in-person meetings and telephone interviews.” As a prelude to those discussions, the firm explained in its invitation to women to take the survey: “We will be inviting those who wish to speak on-the-record to take part in telephone or in-person interviews to impart advice and top tips. If you wish to take part in an interview, please fill in the contact details at the end of the survey.” This background tells us about the firm’s opt-in process, although the report itself does not refer to it.

HoganLovells Cross-Border 2014 [pg. 28] explains that interviews were conducted with 140 “general counsel, senior lawyers, and executives.” As with the other examples here, the report adds no detail about how long the interviews lasted or the questions asked during them.

Clifford Chance Debt 2007 [pg. 3] doesn’t say how many interviews were conducted, only that they took place during November 2007. It would have been better if the firm had said how many people it spoke with and how those people were chosen.

Norton Rose Lit 2017 surveyed invitees, “with a telephone interview campaign following” [pg. 5]; later the report adds [pg. 38] that there was an “interview campaign following [the online survey] across July, August and early September 2017.”

Limited interviews fall short of “data”; glimmers of awareness of machine learning

Two observations arise from a report published by KPMG, “Through the looking glass: How corporate leaders view the General Counsel of today and tomorrow” (Sept. 2016), one about what constitutes “data from a survey” and the other about dawning awareness of data analytics among general counsel.

Regarding the first observation, the report states that its conclusions are based on interviews with 34 “CEOs, Chairmen, General Counsel and Heads of Compliance who made themselves available for interviews and kindly agreed to participate in our research” [pg. 27]. While you can certainly identify themes from interviews, unless you ask everyone the same question (or at least some of the same questions), you can’t quantify your findings. Writing that “risk management is top of mind for GCs” is worlds apart from writing that “Twenty-six out of 34 interviewees mentioned risk management as a significant concern.” Additionally, surveys are designed to gather data that is representative of a larger population. It is unlikely that the particular group of 34 who agreed to speak to the KPMG interviewers is representative of the broader population of global CEOs, Chairmen of the Board of Directors, General Counsel or Chief Compliance Officers. Subjective interpretations of what a limited group of people say fall short of quantified research, although those interpretations have whatever credibility a reader assigns them.
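To make the contrast concrete, here is a minimal sketch in Python of how quantification becomes possible once every interview is coded against the same set of themes. The theme labels and the data are invented for illustration; nothing below comes from the KPMG report.

    from collections import Counter

    # Hypothetical coding of interview transcripts against a shared set of
    # themes; the labels and data are invented for illustration only.
    coded_interviews = [
        {"risk management", "regulation"},
        {"risk management", "cost pressure"},
        {"talent"},
        # ... one set of coded themes per interviewee, 34 in all
    ]

    theme_counts = Counter(theme for themes in coded_interviews for theme in themes)
    n = len(coded_interviews)

    # Because every transcript was coded on the same themes, the tallies can
    # be reported as "x out of n interviewees mentioned y".
    for theme, count in theme_counts.most_common():
        print(f"{count} out of {n} interviewees mentioned {theme!r}")

Without that common coding frame, the most a report can honestly offer is the impressionistic phrasing criticized above.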

The second observation highlights the passing reference (but at least it is a reference) to machine learning software becoming better known to general counsel. “Technology was also cited as an important tool to help the GC improve efficiency, at a time when they are continually being asked to do more with less: ‘New technology helps the GC to be more responsive to the real-time demands of the C-suite of executives,’ says the CEO of a large consumer services company. Companies are making greater use of data analytics and are increasingly moving from descriptive analytics (where technology is used to compress large tranches of data into more user-friendly statistics) to predictive analytics and prescriptive models that extrapolate future trends and behavior. The Office of the GC is being transformed by this process, for example, when performing due diligence on M&A targets or monitoring global compliance.” [pg. 14] The following sentences direct attention to predictive coding in e-discovery, it is true, but at least the report links awareness of predictive analytics to the transformation of law departments.
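The shift the KPMG quote describes, from summarizing data to projecting it forward, can be shown with a small sketch. The quarterly figures below are invented and the forecast is the simplest possible straight-line extrapolation; it assumes Python 3.10 or later for statistics.linear_regression and is meant only to illustrate the difference in kind between descriptive and predictive analytics.

    import statistics

    # Hypothetical data: compliance matters opened per quarter (invented numbers).
    matters_per_quarter = [42, 48, 55, 61, 66, 74]

    # Descriptive analytics: compress the data into user-friendly statistics.
    print("mean per quarter:", round(statistics.mean(matters_per_quarter), 1))
    print("most recent quarter:", matters_per_quarter[-1])

    # Predictive analytics, in its simplest form: fit a straight line through
    # the quarterly counts and extend it one quarter ahead.
    quarters = list(range(len(matters_per_quarter)))
    slope, intercept = statistics.linear_regression(quarters, matters_per_quarter)
    next_quarter = len(matters_per_quarter)
    print("naive forecast for next quarter:", round(intercept + slope * next_quarter, 1))

Real prescriptive or predictive models in a law department would of course use richer data and methods; the point here is only the difference between reporting what happened and estimating what comes next.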