Average number of pages in reports by originating law firm’s geography

For the period from 2013 to the present, we have found 154 research surveys that a law firm conducted or sponsored and for which a PDF report was published. That group includes 55 different law firms.

We categorized the firms according to five geographical bases: United States firms, United Kingdom firms, vereins, combinations of U.S. and U.K. firms (“USUK”), and the rest of the world (“RoW” — Australia, Canada, Israel, New Zealand, and South Africa). We thought we would find that the largest firms, either the vereins or the USUK firms, would write the longest reports. Our reasoning was that they could reach more participants and could analyze the more voluminous data more extensively (and perhaps add more marketing pages about themselves).

Broadly, yes. As can be seen in the table below, the average and median page counts for the five geographical groupings of firms stand at roughly similar levels; the number of surveys in each category appears in the column entitled “Number”. Even so, the two large classes of firms do indeed produce more pages per report on average.

GeoBase    Number    AvgPages    MedianPages
RoW            13        25.0           20.0
UK             41        24.1           20.0
US             78        22.5           19.0
USUK           17        30.2           22.0
Verein          5        27.6           28.0
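
A summary like this takes only a few lines of code. Here is a minimal sketch in Python with pandas, assuming a CSV file named surveys.csv with one row per report and columns GeoBase and Pages; the file and column names are our own invention, not something published with the reports.

    import pandas as pd

    # One row per survey report: its geographic category and page count.
    df = pd.read_csv("surveys.csv")  # hypothetical file with GeoBase, Pages columns

    # Count, average, and median pages for each geographic base.
    summary = df.groupby("GeoBase")["Pages"].agg(
        Number="count", AvgPages="mean", MedianPages="median"
    ).round(1)
    print(summary)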

We tested the difference between the average number of pages for the USUK reports and the average for the US reports. We selected those two groups because they show the largest gap (30.2 versus 22.5).

A statistical test called the t-test compares two averages, taking into account the dispersion of the values that make up each one. It yields a p-value: the probability that, if random samples of survey reports were drawn repeatedly from law firms in each geography and only chance were at work, a gap of that size or larger would show up. By convention, if the p-value falls below 0.05, statisticians treat the difference as statistically significant, meaning it can be relied on as reflecting more than chance; if not, you can’t say the difference is due to anything other than chance. On our data, the t-statistic was 1.2 and the p-value 0.24, well above the 0.05 threshold. The gap between the USUK average and the US average may look material, but on the data available we can’t conclude that anything other than random variation accounts for it.
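
Readers who want to replicate the test can do so in a few lines. The sketch below uses Python with scipy; the page counts are illustrative stand-ins, not our actual data.

    import numpy as np
    from scipy import stats

    # Stand-in page counts; the real data set has 17 USUK and 78 US reports.
    rng = np.random.default_rng(0)
    usuk_pages = rng.poisson(lam=30.2, size=17)
    us_pages = rng.poisson(lam=22.5, size=78)

    # Welch's t-test compares the two means without assuming equal variances.
    t_stat, p_value = stats.ttest_ind(usuk_pages, us_pages, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

    # A p-value above 0.05 means the gap could plausibly be chance alone.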

Profusion of research surveys on Brexit and the GDPR

Law firms follow current events and especially those that suggest major legal repercussions. For example, the Brexit vote of the United Kingdom has unleashed a torrent of political and legal ramifications. Accordingly, it is not surprising that law firms have launched surveys to research aspects of Brexit, but that 10 or more have been completed may be surprising.

The ten studies found so far include Brodies Firm Brexit 2017, CMS Brexit 2017, DLA Piper Brexit 2018, Eversheds Sutherland Brexit 2017, Herbert Smith Brexit 2018, HoganLovells Brexometer 2017, Norton Rose Brexit 2017, Pinsent Masons Brexit 2017, Shakespeare Martineau Brexit 2018, and Simmons Simmons Brexit 2017.

Not surprisingly, all the firms are either UK-based or UK-oriented with a major U.S. presence (DLA Piper, Norton Rose). The six Brexit reports available online average 23 pages of plots and text per report.

Likewise, the European Union’s far-reaching regulation of data privacy, the General Data Protection Regulation (GDPR), has ushered in massive economic, political and legal changes. Law firms are keenly aware of all the work awaiting them, so GDPR has resulted thus far in at least six research surveys by law firms.

The GDPR survey research includes Brodies Firm GDPR 2017, Eversheds DublinGDPR 2018, McDermott Will GDPR 2018, Paul Hastings GDPR 2017, and Technology Law GDPR 2017.

On this topic, two UK firms have weighed in, but so have five U.S. firms. It is also quite possible that several other surveys that address cyber-security and hacking include some questions about GDPR.

Co-contributors to law-firm surveys (Part II)

A law firm should decide at the outset whether it wants to conduct the survey on its own or coordinate with others. In the data set we have been examining, many firms teamed on a survey with another group, often one that shared an interest in the topic. In fact, we spotted nine survey reports in which two other organizations coordinated with a law firm. Organizations aplenty can help develop questions, distribute invitations, publicize findings, and analyze data.

Obviously, assistance such as this sometimes comes at a cost. We don’t know how much firms have paid co-contributors, but it could be a fairly substantial amount if the services obtained fall into the broad range of consulting. Having a co-contributor also adds complexity and elapsed time, because the firm must manage the external provider (or be managed by it, since sometimes the other company leads the survey process) and adapt to its scheduling. There is also the matter of finding the third party in the first place.

The benefit of bringing in outside experience is a superior eventual product. A co-contributor can also provide the benefits of a consultant: helping to keep the project on track and on time and bringing experienced talent.

We previously wrote about 18 instances of co-contributors; here are another 13 co-contributors and their respective survey reports.

  1. 451 Research — Morrison Foerster MA 2016
  2. Acritas — Proskauer Rose Empl 2016
  3. Bank Polski and Polnisch-Deutsche Industrie- und Handelskammer — CMS Poland 2016
  4. Biopharm Insight and Merger Market Group — Reed Smith Lifesciences 2015
  5. Economist Intelligence Unit — Bryan Cave CollectiveLit 2007
  6. J.D. Power, Univ. of Michigan — Miller Canfield AutoCars 2018
  7. Local Area Property Association — Simpson Grierson Reserve 2016
  8. Local Government New Zealand — Simpson Grierson LocalGov 2015
  9. Meridian West — Allen Overy Models 2014
  10. Oxford Economics — HoganLovells Brexometer 2017
  11. Rafi Smith Research Institute — Gilad Saar Trust 2018
  12. VB/Research and The Lawyer — DLA Piper Debt 2015
  13. WeAreTheCity — Reed Smith London 2018

At least a dozen kinds of software used in a survey project

The list of software that a law firm might use to carry out a survey project is lengthy. The firm itself does not need to own each of the applications listed below, but from start to finish someone may need to deploy them.

  1. Database software for customer relationship management (CRM), or some other software to provide email addresses of invitees
  2. Bulk email software (e.g., ConstantContact) so that the firm can effectively invite its clients and others to take part in the survey
  3. Word processing software to draft questions, invitations and the text of the report
  4. Survey software (e.g., NoviSurvey, SurveyMonkey, SmartSurvey, Qualtrics, SurveyGizmo) to create an online questionnaire and capture the responses of the participants
  5. Spreadsheet software (e.g., Excel) so that the firm can export the responses from the survey software into a worksheet and manipulate the data (steps 5 through 7 are sketched in code after this list)
  6. Statistical software (e.g., R, Stata, Excel, Tableau) so that an analyst can calculate various statistics
  7. Data visualization software (e.g., R, Excel or PowerPoint) so that an analyst can prepare plots and graphs
  8. Desktop publishing software (e.g., LaTeX, markdown languages, Adobe InDesign) so that the firm can integrate text, plots, tables and other elements into the report
  9. Presentation software (e.g., PowerPoint) or specialized software to prepare infographics
  10. Graphic design software (e.g., GIMP, Photoshop) so that the firm can work with images and photos and otherwise design the report as it wishes
  11. PDF software (e.g., Foxit, Adobe Acrobat, PScript5, ScanSoft, QuarkXPress) so the firm can save its report in a portable document format [see the plot below for more details]
  12. All kinds of other software may also be involved, such as email, instant messaging, social media, website packages, video-conferencing, calendaring, backup software and more.
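
As promised, here is a minimal sketch in Python of steps 5 through 7: loading exported responses, tabulating one question, and plotting the result. The file name and column name are hypothetical placeholders, not taken from any firm’s survey.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Step 5: load the responses exported from the survey software.
    responses = pd.read_csv("responses.csv")  # hypothetical export

    # Step 6: tabulate the answers to one question.
    counts = responses["q1_outlook"].value_counts()  # hypothetical column
    print(counts)

    # Step 7: a plain bar chart of that tabulation.
    counts.plot.bar()
    plt.ylabel("Number of respondents")
    plt.tight_layout()
    plt.savefig("q1_outlook.png")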

The plot below draws on data from 153 survey reports in PDF format. Of that set, 141 include metadata about the software used to create the report. The firms used nine different software packages, although over the years they used multiple versions of the same package. Thus, for example, Adobe InDesign (all versions combined) dominated, with more than 100 reports created with it.
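
Harvesting that metadata can be automated. The sketch below uses the pypdf library to read the Creator and Producer fields from each report; it is an illustration under an assumed folder layout, not the script behind our plot.

    from pathlib import Path
    from pypdf import PdfReader

    # Print the creation-software metadata for every PDF in a folder.
    for pdf_path in sorted(Path("reports").glob("*.pdf")):  # hypothetical folder
        meta = PdfReader(pdf_path).metadata
        if meta:
            # creator is typically the authoring tool (e.g., Adobe InDesign);
            # producer is the software that generated the PDF itself.
            print(pdf_path.name, meta.creator, meta.producer)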

Interviews can supplement the quantitative data gathered by a survey

Several firms combine modes of data gathering. They start with a survey emailed to their invitee list or otherwise publicized. At some point later the firm (or the service provider it retained) seeks interviews with a subset of the invitees. (At least we assume that those who were interviewed also completed a survey, but the reports do not confirm that assumption.)

The survey gathers quantitative data while the interviews gather qualitative insights. Interviews cost money, but what firms learn from conversations deepens, clarifies and amplifies the story told by survey data. Interviews also enable the firm to strengthen its connections to participants who care about the topic.

The reports make little of the interview process and provide almost no detail about the interviews themselves. Their results show up as quotes and case studies. DLA Piper Debt 2015, for example, states that 18 interviews were conducted; commendably, it lists the names and organizations of those who were interviewed [pg. 30]. We show the first few in the snippet below.

Reed Smith LondonWomen 2018 [pg. 22] mentions that “Several individuals opted to take part in further discussion through email exchange, in-person meetings and telephone interviews.” As a prelude to those discussions, the firm explained in its invitation to women to take the survey: “We will be inviting those who wish to speak on-the-record to take part in telephone or in-person interviews to impart advice and top tips. If you wish to take part in an interview, please fill in the contact details at the end of the survey.” This background tells us about the firm’s opt-in process, although the report itself does not refer to it.

HoganLovells Cross-Border 2014 [pg. 28] explains that interviews were conducted with 140 “general counsel, senior lawyers, and executives.” As with the other examples here, the report adds no detail about how long the interviews lasted or the questions asked during them.

Clifford Chance Debt 2007 [pg. 3] doesn’t say how many interviews were conducted, only that they took place during November 2007. It would have been good for the firm to have said more about how many people it spoke with and how those people were chosen.

Norton Rose Lit 2017 surveyed invitees, “with a telephone interview campaign following” [pg. 5] and adds later in the report [pg. 38] that there was an “interview campaign following [the online survey] across July, August and early September 2017.”

Icons in bar plots

Most students of plots reject icons as stand-ins for bars or columns. They distract readers from the figures that matter. Some examples of the practice reinforce the objection.

To its credit, Dykema Gossett MA 2017 [pg. 5] plucks an appropriate icon for growth, but the plant icons nevertheless divert the reader’s attention from the two important numbers. Moreover, it is hard to know the y-axis scale.

DLA Piper RE 2017 [pg. 6] borrows its icons from the world of finance. To be precise, this graphic appears to size the iconic bull and bear in proportion to the percentages they represent.

Pinsent Masons Infratech 2017 [pg. 29] invokes a different role for icons — not as a stand-in for columns but as a pictorial version of column labels. This fillip contributes nothing but complexity.

KL Gates GCDisruption 2017 [pg. 13] places nine icons to the right of its 14-bar plot (the snippet captures only part of the page-tall plot). Not only do the icons not match the bars, but five of them have dotted borders and four have solid borders. If the variation draws no distinction, it is superfluous to vary the borders.

Reed Smith Lifesciences 2015 [pg. 17] scatters random icons of integrated circuits around an array of area-circles. Confusion is the likely effect.

Lists, little used but flexible

Lists appear infrequently in survey reports; they are an underutilized technique. Yet lists summarize key points, break them up and present them better than blocks of text, create an open feel on a page, and allow aesthetic touches such as the bullets themselves (shape, color, size, spacing), the indenting before and after the bullet, and the spacing between items. Here are examples of some of these variations.

Freshfields Bruckhaus Crisis 2013 [pg. 7] uses small blue circles and no indenting from the header. One line fits between each list item.

Morrison Foerster MA 2016 [pg. 4] also uses a simple, small circle as the bullet but indents the items two or three letters from the preceding main text.

Seyfarth Shaw RE 2017 [pg. 11] uses white bullets and indenting (note that the yellow shading is not from the original, which uses white typeface, as in the bottom item). Here, too, the designer reduced the typeface for the list items and nestled the bullets close to the item text.

Dykema Gossett Auto 2012 [pg. 4] chose square blue boxes for the bullets and seems to have left more room between each item than do the previous three examples.

Page texture (Part II of background color)

Miller Canfield AutoCars 2018 [pg. 43] applies a background of dark blue, which means the page needs a white or light color for the typeface.

Thompson Hine Innovation 2018 [pg. 4] also applies a blue texture, although not as dark as Miller Canfield’s shade, and also includes what look to be icons of file folders.

On a divider page, Allen Overy Innovative 2012 [pg. 13] uses a grey hue for the whole page, behind an enormous “1” in green.

Carlton Fields CA 2013 [pg. 12] puts a golden border or background behind parts of its plots. This one shows a hint of the texture on the left side, extending into parts of the top and bottom.

Page texture (background color)

Another design decision for law firms is whether to introduce background color on report pages. We will refer to that background as texture. Quite often the cover and the back page have texture, but we looked for examples of it inside the report itself (other than divider pages). Sometimes the effect and design of texture become more apparent when you see multiple pages and how the texture varies by color, by portion of the page, or by location.

Clyde Co Parental 2015 [pg. 2] shades most of the page in cobalt blue, leaving only the top margin and a sliver on the side white.

Brodies Firm Housebuilding 2015 [pg. 1] places a colored rectangle (a gradient color no less) behind the plot, probably both to emphasize the data and to create aesthetic appeal. Note how much stuff fills just this portion of a single page!

Carlton Fields CA 2015 [pg. 31] highlights a quotation with a khaki green rectangle.

As a final example, Berwin Leighton ArbDocuments 2013 [pg. 3] backgrounds the entire page with light blue.

Additional detail to highlight data in a plot or table

Careful attention to the central message of a plot or table can lead a firm to drill down on, or annotate, that part of the data. Various techniques direct the reader to a key point. Here are three examples.

Dykema Gossett MA 2017 [pg. 10] calls attention to China with the text box and mini-bar plot inside it, and the text tells readers more about the country.

Carlton Fields CA 2013 [pg. 12] outlines the most important data in red braces and lines and also adds in the right margin “> than 50%,” thereby highlighting the most significant findings.

Herbert Smith CorpDebt 2018 [pg. 8] breaks out the 30 percent “yes” slice into six sub-slices. Had the firm not done that, it could have made the donut a seven-slice plot, but that choice would have muted the more detailed insights gained from exploding out the “yes” details.
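
For firms that build their plots in code, here is a minimal matplotlib sketch of one such highlighting technique, emphasizing and annotating the key bar; the data are invented for illustration.

    import matplotlib.pyplot as plt

    # Invented example data: share of respondents choosing each answer.
    labels = ["Strongly agree", "Agree", "Neutral", "Disagree"]
    shares = [18, 52, 20, 10]

    fig, ax = plt.subplots()
    bars = ax.bar(labels, shares, color="lightgray")
    bars[1].set_color("steelblue")  # emphasize the key finding

    # Direct the reader to the key point with a short note and arrow.
    ax.annotate("A majority agree (52%)",
                xy=(1, 52), xytext=(2.0, 60),
                arrowprops=dict(arrowstyle="->"))
    ax.set_ylim(0, 70)
    ax.set_ylabel("Percent of respondents")
    plt.tight_layout()
    plt.savefig("annotated_bars.png")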