Long series of surveys by law firms and their meta-topics

Several law firms have conducted (and may still be conducting) two different survey series each. These firms include Baker McKenzie (Brexit and cloud computing), Berwin Leighton (hotels, in two different geographies), Clifford Chance (M&A and European debt), Freshfields Bruckhaus (corporate crises and whistleblowers), Herbert Smith (M&A and finance), Jackson Lewis (workplace and dress codes), Miller Chevalier (Latin American corruption and tax policy), and Morrison Foerster (the legal industry and M&A).

A few firms have done (or may still be conducting) three survey series on different topics: CMS (the legal industry, Brexit, and M&A in Europe), DLA Piper (compliance, debt in Europe, and M&A), and Pinsent Masons (Brexit and two on construction in different geographies).

We can also look at the broad topics where one or more firms have coordinated a series of at least five years’ length. We have coded the particular topics into broader meta-topics. The next chart shows that three industry meta-topics appear in these long-running series: construction, real estate, and private equity. Firms have also run five-plus-year series on disputes (litigation, class actions, and arbitration). Finally, the most popular subject for research surveys has been mergers and acquisitions, which accounts for three different meta-topics.

[Chart: meta-topics of five-plus-year survey series by law firms]

Titles of law firm reports based on research surveys

The most common style of title starts with a few keywords and then adds another line or a few explanatory words after a colon. Ashurst GreekNPL 2017 is one of many examples.

Titles of survey reports range from functional to fanciful. “Outsourcing Public Services Across Local Government” [Ashfords Outsource 2017] is about as meat-and-potatoes as it gets; “South Florida Real Estate 2016 Outlook Survey” [Berger Singerman SFlaRE 2016] has a similar matter-of-factness.

Some titles expand: “The Good, the Bad, and the Troubling: Fasken Martineau’s 2017 Employer Occupational Health and Safety Survey Report Legal Compliance Challenges For Canadian Employers” [Fasken Martineau OHS 2017] or “Getting it right from the ground up: A survey on construction disputes: The causes and how to avoid them” [Russell McVeagh ConstrDisp 2018].

Only rarely does a title include both the name of the firm and the year, as in “Baker McKenzie 2016 Cloud Survey” [Baker McKenzie Cloud 2016]. A touch more commonly, the year alone appears: “European Acquisition Finance Debt Report 2011” [DLA Piper EuropeDebt 2011].

Titles that stand out entice the reader. Examples include “Mythbusting the common law marriage” [Mills Reeve CommonLaw 2017] or “The Multichannel High Street: A Nation of Shoppers: but is it a nation of shopkeepers?” [Squire Sanders Retail 2013].

Most titles get the job done with simple language and structure. A few approach complexity: “Finding the balance: human touch versus high tech: Millennials and the future of the hotel and restaurant sector” [CMS Restaurants 2018].

When law firms conduct a series of surveys, the titles usually morph in minor ways as the years pass. For instance:

  1. “Survey Of Office Occupiers: Changing Attitudes To Property Needs” [Irwin Mitchell Occupiers 2014]
  2. “Survey Of Office Occupiers – Part III: Changing Attitudes To Property Needs – Autumn 2015” [Irwin Mitchell Occupiers 2015]
  3. “Survey Of Office Occupiers – Part IV: Changing Attitudes To Property Needs and the Impact of Brexit – Summer 2016” [Irwin Mitchell Occupiers 2016]
  4. “Property Trends in 2018 – Survey of Office Occupiers” [Irwin Mitchell Occupiers 2018].

Largest U.S. firms, gross revenue and number of research surveys

From the AmLaw listing for 2017, I looked at the top 25 law firms by gross revenue. To find out which of them have conducted or taken part in data-oriented research surveys, my investigation so far has consisted of searching Google for the name of each firm and the word “survey” and then scanning down the first five or six pages of hits. The better method would be to search the website of the firm itself, which I plan to do eventually.

In any case, at this point it appears that 16 of the 25 highest-grossing U.S. law firms have not been involved in a research survey. In the plot below, they are the firms with no green bar: Latham Watkins, Kirkland Ellis, Skadden Arps, Jones Day, Sidley Austin (which tried a survey a couple of years ago but didn’t complete it), Morgan Lewis, Gibson Dunn, Greenberg Traurig, Sullivan Cromwell (although I ran across a reference to a survey done in 2010 about boards of directors), Simpson Thacher, Cleary Gottlieb, Weil Gotshal, Paul Weiss, Quinn Emanuel, Davis Polk, and Wilmer Cutler.

The other nine firms are known to have sponsored at least one research survey, and six of them have been involved in more than one. The laurel wreath goes to DLA Piper, which at 28 surveys known to me almost equals the combined 32 of the other eight firms.

The plot sorts the law firms in descending order by gross revenue, which shows that five of the top 12 firms have put this tool to use. Overall, however, the majority of these elite, huge U.S. law firms have not seen sufficient reason to take part in or publish a research survey.

More participants in surveys with reports than in surveys without formal reports

Of the 420-some research surveys by law firms that I know about, roughly one in five is known only from a press release or an article; for all the others I have located a formal report. To be forthright, for that remaining fifth the law firm or a co-coordinator may well have published the results in a formal report, but so far I have not located it.

Obviously, it costs less to draft a press release or an article than to produce a report. Reports require design decisions, plots, graphics, and higher standards of quality. Moreover, aside from expense, it seems plausible that firms choose not to go through the effort of preparing a formal report when the number of participants in the survey seems too small.

To test that hypothesis about relative numbers of participants, I looked at nine survey results (call them “non-report surveys”) that disclose the number of survey participants. I also collected data from 204 formal reports that provide participant numbers. The average number of participants in the non-report surveys is 157. Where there is a report, the average is 420, but that number is hugely inflated by one survey with 16,000 participants. When we omit that survey, the average drops to 343, still more than twice the average of the non-report surveys.
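As a check, that adjusted average can be reproduced from the figures just given (204 reports, an overall average of 420, one 16,000-participant outlier); here is a small sketch in Python:

```python
# Reproduce the outlier-adjusted average from the figures in the text.
n_reports = 204    # formal reports that disclose participant counts
mean_all = 420     # average participants with the outlier included
outlier = 16_000   # the single survey with 16,000 participants

total_participants = mean_all * n_reports              # 85,680
mean_trimmed = (total_participants - outlier) / (n_reports - 1)
print(round(mean_trimmed))                             # prints 343
```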

When we compare the median number of participants, the figures are 157 participants in the non-report surveys versus 203 participants in the reports. Thus, the medians show about one-third more participants where a report has been published than where one has not.

A statistical test looks at whether the difference in averages between two sets of numbers suggests a meaningful difference between the sets (here, in numbers of participants). With our data, the crucial test value turns out to be so small that we can confidently reject the hypothesis that no difference exists between the two sets. Larger numbers of participants are strongly associated with reported surveys.

Technical note: For those interested in the statistics, we ran a Welch two-sample t-test and found a p-value of 0.0331. That means that if someone could sample over and over from the universe of reported and non-reported surveys, and if in truth no difference existed between them, a difference in averages this large would show up only about 3% of the time. Such a low percentage justifies statisticians in concluding that the data comes from meaningfully different populations (the results are “statistically significant”). Bear in mind that I have not looked at all the non-report surveys in my collection; a few more of them added to the group analyzed above could change the figures materially, and therefore the statistical test. Or there may be a formal report out there somewhere.
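For readers who want to replicate the calculation, here is a minimal sketch in Python using SciPy; the participant counts below are invented placeholders (only their group means, 157 and 343, echo the figures above), not my actual data.

```python
# Welch two-sample t-test on participant counts.
# The numbers are illustrative placeholders, not the real survey data.
from scipy import stats

non_report = [95, 110, 120, 150, 157, 160, 175, 200, 246]      # 9 surveys
reported = [180, 220, 260, 300, 330, 360, 400, 430, 470, 480]  # sample of reports

# equal_var=False selects Welch's version of the test, which does not
# assume that the two groups share the same variance.
t_stat, p_value = stats.ttest_ind(reported, non_report, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```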

Profusion of research surveys on Brexit and the GDPR

Law firms follow current events, especially those that suggest major legal repercussions. For example, the United Kingdom’s Brexit vote has unleashed a torrent of political and legal ramifications. Accordingly, it is not surprising that law firms have launched surveys to research aspects of Brexit; what may be surprising is that ten or more have been completed.

The ten studies found so far include Brodies Firm Brexit 2017, CMS Brexit 2017, DLA Piper Brexit 2018, Eversheds Sutherland Brexit 2017, Herbert Smith Brexit 2018, HoganLovells Brexometer 2017, Norton Rose Brexit 2017, Pinsent Masons Brexit 2017, Shakespeare Martineau Brexit 2018, and Simmons Simmons Brexit 2017.

Not surprisingly, all the firms are either UK-based or UK-oriented with a major U.S. presence (DLA Piper, Norton Rose). The six Brexit reports available online average 23 pages of plots and text per report.

Likewise, the European Union’s far-reaching regulation of data privacy, the General Data Protection Regulation (GDPR), has ushered in massive economic, political and legal changes. Law firms are keenly aware of all the work awaiting them, so GDPR has resulted thus far in at least six research surveys by law firms.

The GDPR survey research includes Brodies Firm GDPR 2017, Eversheds DublinGDPR 2018, McDermott Will GDPR 2018, Paul Hastings GDPR 2017, and Technology Law GDPR 2017.

On this topic, two UK firms have weighed in, but so have five U.S. firms. It is also quite possible that several other surveys that address cyber-security and hacking include some questions about GDPR.

Five multi-year streaks of research surveys by law firms

As of today, I have collected more than 50 research surveys by law firms. Of the 24 firms represented in that collection, 19 appear to have conducted one-shot surveys. They may have conducted more than one survey, but only for a single year on any topic. Outsiders have no way of figuring out why a firm chose not to follow one survey with another and thereby start to identify trends and build a brand for the survey.

On the other hand, at least four firms have continued a string of surveys on the same topic through 2017. One firm continued a long series but discontinued it five years ago.

Fulbright & Jaworski began its annual survey of litigation in 2004. The firm merged and is now Norton Rose Fulbright, but it kept up its survey effort through 2017, an astonishing 14 years.

Carlton Fields Jordan Burt began a survey focused on class action litigation in 2012 and has done six annual versions since then.

Littler Mendelson runs a survey focused on the views of executive employers. Begun in 2012, the survey streak has reached the six-year mark.

For the past three years, Haynes and Boone has conducted its twice-yearly survey of borrowing base determinations. Even though that constitutes six surveys, for this reckoning the firm has three years under its belt.

Proskauer’s inaugural survey regarding employment issues appeared in 2016 and the firm returned to survey the same topic in 2017.

Another noteworthy streak appears to have ended five years ago. Davies Ward Phillips teamed with the Canadian Corporate Counsel Association (CCCA) in 2005 for a study of Canadian in-house counsel, called the Barometer. I did some online research and believe the series continued eight years until 2012.

The segment plot below presents this data visually.

Pages of research-survey reports devoted to marketing

Once a law firm goes through the effort to design and conduct a survey, then analyze the data and prepare a report, management certainly hopes for a return on that investment. At the top of the list would be calls from prospective clients asking about the firm’s services related to the survey’s topic. Furthermore, the firm would like potential clients to think more favorably of the firm and its contribution to knowledge (the oft-used term, “thought leadership”). Other benefits of surveys come to mind, but this post is about an aspect of marketing: how much space the survey report devotes to touting the firm.

All the reports have an “About the Firm” portion. I estimated how much space those sections occupied using the notion of a full-page equivalent (FPE). Usually, the description of the firm and its services takes a full page or two, which made it easy to count the FPE. Other firms devoted only part of a page to self-promotion, so I estimated the percentage of a full page that the section took up. I did not include forewords or quotes from partners, and I counted pages only if they contained some text about the firm (i.e., not cover pages or back covers that merely carry the firm’s name).

The resulting data appears in the plot below, which converts each of the 16 firms’ FPEs into a percentage of all the pages in that firm’s report.

[Plot: each firm’s marketing FPE as a percentage of total report pages]

With the exception of the firm at the top, most of the firms were relatively reticent in their self-descriptions. After all, they can at least be expected to include some contact information. If you assume some bare minimum of firm information is justified, then the length of the report largely determines the resulting percentage: shorter reports tend to devote a higher share of their pages to the firm.
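That inverse relationship is simple arithmetic: a roughly fixed “About the Firm” allotment shrinks as a share of a longer report. A hypothetical sketch (the firm names and page counts are invented):

```python
# FPE-to-percentage conversion: a fixed one-page "About the Firm"
# section weighs more in a shorter report. All figures are hypothetical.
reports = {
    "Firm A": (1.0, 8),   # (marketing FPE, total pages)
    "Firm B": (1.0, 24),
    "Firm C": (2.0, 40),
}

for firm, (fpe, total_pages) in reports.items():
    pct = 100 * fpe / total_pages
    print(f"{firm}: {pct:.1f}% of {total_pages} pages devoted to the firm")
# Firm A: 12.5%, Firm B: 4.2%, Firm C: 5.0% -- the shortest report shows
# the highest share even though its marketing section is no longer.
```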