Interviews create or supplement surveys by law firms

Most people who follow surveys by law firms assume that the firms collect their data with an online questionnaire. Interviews by telephone, it turns out, play a significant role in survey data collection. In fact, a small number of reports indicate that the law firm, or an organization it commissioned, collected its data solely through interviews, without using an online survey tool at all.

For example, Allen Overy Innovative 2012 [pg. 5] relied only on interviews: “Interviews tended to last for about an hour and followed a structured questionnaire.” During a structured interview, the interviewer follows a careful script of questions. The script keeps the order and wording of the questions the same, so that the same information is collected even when multiple people carry out the interviews. In a way, an online survey questionnaire is simply a silent structured interview.
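The consistency that a structured interview provides can be sketched in code. This is a purely hypothetical illustration (the questions and the `conduct_interview` helper are invented, not drawn from any report): every interviewer walks the same list of questions in the same order, so the responses are directly comparable.

```python
# Hypothetical sketch of a structured interview script: a fixed, ordered
# list of questions that every interviewer must follow verbatim.
SCRIPT = [
    "How many lawyers work in your legal department?",
    "Has your department's budget grown in the past year?",
    "What is your top risk-management concern?",
]

def conduct_interview(ask):
    """Run the fixed script; `ask` is any function that returns an answer."""
    # Because the script drives the loop, every interview yields the same
    # fields in the same order, regardless of who conducts it.
    return {question: ask(question) for question in SCRIPT}

# A stand-in "interviewee" that answers every question the same way.
canned = {q: "n/a" for q in SCRIPT}
responses = conduct_interview(lambda q: canned[q])
assert list(responses) == SCRIPT  # same questions, same order, every time
```

An online questionnaire enforces the same discipline automatically, which is why the two methods yield comparable data.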

Ashurst GreekNPL 2017 [pg. 2] also collected its information solely through personal interviews, 50 in all, rather than through an online survey.

Proskauer Rose Empl 2016 [pg. 3] combined an online questionnaire with telephone interviews of 100 people.

A variation on the previous method appears in Allen Overy Models 2014 [pg. 2]. That firm deployed two levels of interviews: “The views of 185 individuals were captured through 20-minute structured telephone conversations. A further 13 individuals participated in a longer in-depth interview.”

Sometimes firms compile their data from an online questionnaire, but then turn to selected interviews to gain depth and color. One example is Herbert Smith CorpDebt 2016 [pg. 2], which followed up with some participants to discuss the survey results. The same one-two methodology was employed in Reed Smith LondonWomen 2018 [pg. 22], except that the firm went back to several participants who had opted in. Pinsent Masons Energy 2017 [pg. 5] explains that “The survey included a combination of qualitative and quantitative questions, and all interviews were conducted over the telephone by appointment.”

CMS GC 2017 [pg. 25] pulled off a three-step information-gathering process, as explained in the snippet below. That firm conducted two surveys plus a series of interviews.

The ratio of interviews to online-survey participants varies widely and cannot always be determined from the survey report. White Case Arbitration 2010 [pg. 3] explains that its data comes from 136 questionnaires and 67 interviews, approximately a two-to-one ratio.

Interviews can supplement the quantitative data gathered by a survey

Several firms combine modes of data gathering. They start with a survey emailed to their invitee list or otherwise publicized. At some point later the firm (or the service provider it retained) seeks interviews with a subset of the invitees. (At least we assume that those who were interviewed also completed a survey, but the reports do not confirm that assumption.)

The survey gathers quantitative data while the interviews gather qualitative insights. Interviews cost money, but what firms learn from conversations deepens, clarifies and amplifies the story told by survey data. Interviews also enable the firm to strengthen its connections to participants who care about the topic.

The reports make little of the interview process and provide almost no detail about the interviews themselves; the interviews surface mainly as quotes and case studies. DLA Piper Debt 2015, for example, states that 18 interviews were conducted; commendably, it lists the names and organizations of those interviewed [pg. 30]. We show the first few in the snippet below.

Reed Smith LondonWomen 2018 [pg. 22] mentions that “Several individuals opted to take part in further discussion through email exchange, in-person meetings and telephone interviews.” As a prelude to those discussions, in the invitation to women to take the survey the firm explained: “We will be inviting those who wish to speak on-the-record to take part in telephone or in-person interviews to impart advice and top tips. If you wish to take part in an interview, please fill in the contact details at the end of the survey.” This background tells us about the opt-in process of the firm, although the report itself does not refer to it.

HoganLovells Cross-Border 2014 [pg. 28] explains that interviews were conducted with 140 “general counsel, senior lawyers, and executives.” As with the other examples here, the report adds no detail about how long the interviews lasted or the questions asked during them.

Clifford Chance Debt 2007 [pg. 3] doesn’t say how many interviews were conducted, only that they took place during November 2007. The firm could have said more about how many people it spoke with and how those people were chosen.

Norton Rose Lit 2017 surveyed invitees, “with a telephone interview campaign following” [pg. 5] and adds later in the report [pg. 38] that there was an “interview campaign following [the online survey] across July, August and early September 2017.”

Limited interviews fall short of “data”; glimmers of awareness of machine learning

Two observations arise from a report published by KPMG, “Through the looking glass, How corporate leaders view the General Counsel of today and tomorrow” (Sept. 2016), one about what constitutes “data from a survey” and the other about dawning awareness among general counsel of data analytics.

Regarding the first observation, the report states that its conclusions are based on interviews with 34 “CEOs, Chairmen, General Counsel and Heads of Compliance who made themselves available for interviews and kindly agreed to participate in our research” [pg. 27]. While you can certainly identify themes from interviews, unless you ask everyone the same question (or at least some of the same questions), you cannot quantify your findings. Writing that “risk management is top of mind for GCs” is worlds apart from writing that “Twenty-six out of 34 interviewees mentioned risk management as a significant concern.” Additionally, surveys are designed to gather data that is representative of a larger population. It is unlikely that the particular group of 34 who agreed to speak to the KPMG interviewers is representative of the broader population of global CEOs, Chairmen of the Board of Directors, General Counsel or Chief Compliance Officers. Subjective interpretations of what a limited group of people say fall short of quantified research, although those interpretations have whatever credibility a reader assigns them.
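The difference between a theme and a count is easy to demonstrate. In the hypothetical sketch below (the interview answers are invented, not taken from the KPMG report), asking every interviewee the same question lets the analyst tally mentions rather than merely characterize them.

```python
# Hypothetical tally of interview themes. Because every interviewee answered
# the same question, the finding can be quantified ("N of M mentioned X")
# instead of only characterized ("X is top of mind").
from collections import Counter

# Invented sample data: each set holds the concerns one interviewee named.
answers = [
    {"risk management", "budget"},
    {"risk management"},
    {"regulation", "budget"},
    {"risk management", "regulation"},
]

mentions = Counter(theme for concerns in answers for theme in concerns)
total = len(answers)

print(f"{mentions['risk management']} out of {total} interviewees "
      f"mentioned risk management")
# With this sample: 3 out of 4
```

Without a common question put to all 34 interviewees, no such tally is possible, which is the crux of the objection above.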

The second observation highlights the passing reference (but at least it is a reference) to machine learning software becoming better known to general counsel. “Technology was also cited as an important tool to help the GC improve efficiency, at a time when they are continually being asked to do more with less: ‘New technology helps the GC to be more responsive to the real-time demands of the C-suite of executives,’ says the CEO of a large consumer services company. Companies are making greater use of data analytics and are increasingly moving from descriptive analytics (where technology is used to compress large tranches of data into more user-friendly statistics) to predictive analytics and prescriptive models that extrapolate future trends and behavior. The Office of the GC is being transformed by this process, for example, when performing due diligence on M&A targets or monitoring global compliance.” [pg. 14]. The sentences that follow direct attention to predictive coding in e-discovery, it is true, but at least the report links awareness of predictive analytics to the transformation of law departments.