Need for standardized position descriptions

The laxity in describing the respondent sample by position, and the percentage in each position, should trouble anyone who wants to rely on law-firm research surveys. How much credibility does data have if you don’t know who provided it? But even if a firm classifies its respondents by level, the review I carried out raises another concern: the position descriptions vary wildly.

More specifically, in my set of law firm research surveys 1, the firms used a grand total of 16 different descriptions for titles in law departments. The position “General Counsel” was the most common, but after it the terminology was all over the place.

These are the actual terms used for the legal positions, listed roughly in descending order of corporate level: Chief Legal Officer, General Counsel, Chief Legal Officer/General Counsel (or other head of in-house legal function), Region or Division General Counsel (or equivalent), Chief Legal or Associate/Deputy General Counsel, Deputy General Counsel, Assistant General Counsel, Associate/Deputy/Assistant General Counsel, direct reports to general counsel, senior lawyers, Head of litigation, Senior Counsel, in-house counsel, in-house attorneys/corporate counsel, in-house lawyer, and Counsel. Several of the reports also included an “Other” category for respondents in the law department whose position was not among the selections.

The methodology sections of these reports do not say whether the online questionnaire explained what was meant by the selections of the multiple-choice questions for positions, such as years of experience or scope of responsibility. Probably they simply listed some number of selections and asked respondents to pick one of them (possibly with “Other”) to answer a seemingly simple question: “What is your level, title or position?” The firm that gave a common title and then added “or its equivalent” showed thoughtfulness. It recognized that titles have proliferated in law departments, so it used a label to try to convey the level of responsibility of the respondent.

As for titles of respondents outside the law department, those also vary widely. Quoting from the reports, they include CEO/Director, C-suite executives, senior-level executives, Senior management, Executives, Human Resources Professionals, Risk/compliance, other professionals, and other business contacts.

As with the law department positions, the firms had no standard set of positions to draw from, so they came up with their own categories. Going forward, it would help the legal industry and its movement toward more data analytics to have at least a core of standardized position terms for the law department respondents and client respondents to choose from.
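To make the argument concrete, here is a minimal sketch in Python of what mapping firms’ free-form titles onto a standardized core set could look like. The core position names and the keyword rules are my own illustrative assumptions, not an established industry standard.

```python
# Sketch: normalizing free-form survey titles to a hypothetical core set of
# standardized position terms. The category names and keyword rules are
# illustrative assumptions only.

CORE_POSITIONS = (
    "General Counsel / Chief Legal Officer",
    "Deputy / Associate / Assistant General Counsel",
    "Senior Counsel",
    "Counsel / In-house Lawyer",
    "Other",
)

def normalize_title(raw: str) -> str:
    """Map a raw survey selection to one of the core positions."""
    t = raw.lower()
    if "general counsel" in t or "chief legal" in t:
        # Deputy/associate/assistant variants rank below the department head.
        if any(k in t for k in ("deputy", "associate", "assistant")):
            return CORE_POSITIONS[1]
        return CORE_POSITIONS[0]
    if "senior" in t:
        return CORE_POSITIONS[2]
    if any(k in t for k in ("counsel", "lawyer", "attorney")):
        return CORE_POSITIONS[3]
    return CORE_POSITIONS[4]
```

With a shared mapping like this, each firm could still ask the question in its own words yet report results against the same core categories, making surveys comparable across firms.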


  1. Reports by Allen & Overy, Berwin Leighton Paisner, Carlton Fields, Davies Ward Phillips, Eversheds, Goulston & Storrs, Haynes and Boone, Hogan Lovells, Littler Mendelson, Norton Rose Fulbright, Proskauer Rose, Ropes & Gray, Seyfarth Shaw, White & Case, and Winston & Strawn.

Provide respondent numbers and positional breakdown

Readers of a law firm’s research-survey report 1 want from the start to assess the credibility of the findings. As that credibility in large measure rests on the quality of the survey’s respondents, readers would like to know how many people responded and what were their positions (aka levels, roles, or titles).

A review of the survey reports I have collected suggests that the methodological disclosures of respondent positions fall into five classifications. Here they are, in order of increasing praiseworthiness.

  1. No information. Woefully, two of the reports lack data on both the number of respondents and their positions. How much stock can anyone put in findings from a black box?
  2. Total respondents. One report, by a very eminent law firm, disclosed only how many respondents its survey had collected, but nothing about their positions.
  3. Total respondents and a broad position. As far as I could glean, five firms told how many people had participated (the number of respondents) but gave only the most general description of their position. How useful is it to know they were “senior management”?
  4. Total respondents and some position breakdowns. Three firms went a step further and gave some breakdown of the respondents’ positions. One of those firms gave the percentage of total respondents in one broad category, but oddly provided no other quantitative data regarding positions.
  5. Total respondents and full percentage breakdown. A half dozen firms did well: they laid out how many respondents their survey had, broke them down by three to five positions, and provided the percentage of respondents in each position. These six firms win the coveted Golden Data Analyst Award: Berwin Leighton Paisner, Davies Ward Phillips, Littler Mendelson, Norton Rose Fulbright, White & Case, and Winston & Strawn.

This level of disclosure should be the minimum standard for all research surveys conducted by law firms. Tell your readers who provided your data and give a clear, quantitative breakdown of the percentage at each position 2.
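As a sketch of that minimum standard, the short Python snippet below computes the kind of disclosure argued for here: total respondents plus the percentage in each position. The respondent data is invented purely for illustration.

```python
from collections import Counter

# Invented respondent data, for illustration only: each entry is one
# respondent's reported position.
responses = (
    ["General Counsel"] * 120
    + ["Deputy General Counsel"] * 45
    + ["Senior Counsel"] * 60
    + ["Other"] * 25
)

total = len(responses)

# Percentage of respondents in each position, largest group first.
breakdown = {
    position: round(100 * count / total, 1)
    for position, count in Counter(responses).most_common()
}

print(f"Respondents: {total}")
for position, pct in breakdown.items():
    print(f"  {position}: {pct}%")
```

A table this simple, with real numbers behind it, is all a report needs to let readers judge whose views the findings reflect.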


  1. My set includes reports by Allen & Overy, Berwin Leighton Paisner, Carlton Fields, Davies Ward Phillips, Eversheds, Goulston & Storrs, Haynes and Boone, Hogan Lovells, Littler Mendelson, Norton Rose Fulbright, Proskauer Rose, Ropes & Gray, Seyfarth Shaw, White & Case, and Winston & Strawn.
  2. It is for another time to consider weighting the responses of people by their level of seniority.