Law firms that produced only PDF reports of surveys, only non-reports, or both

As described previously, in my collection of hundreds of law-firm research surveys, 44 firms have released at least one set of survey results only as a press release, a blog post, or an article (a “non-report”). Also, 88 law firms have produced at least one survey report in PDF format (a “formal report”). Some, or perhaps all, of those law firms have published the results of a research survey in both formats, formal report and non-report, but I would have to confirm each firm’s reporting history to be sure of both numbers.

Nevertheless, with the data at hand, 24 law firms are in both camps, having published at least one formal report and at least one non-report. Another 64 formal-report firms have not issued a single non-report, while 20 firms have not produced a single formal report.
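The three groups follow from simple set arithmetic on the two counts above; a minimal sketch in Python, using only the figures quoted in the text:

```python
# Figures from the text: 88 firms with at least one formal (PDF) report,
# 44 firms with at least one non-report, and 24 firms observed in both groups.
formal_report_firms = 88
non_report_firms = 44
both = 24

formal_only = formal_report_firms - both    # firms with formal reports only
non_report_only = non_report_firms - both   # firms with non-reports only
print(formal_only, non_report_only)         # 64 20
```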

The pie chart below visualizes these findings. The largest group, in the bottom slice, represents the 64 law firms that have produced only formal reports in PDF. At the upper left, the green (darkest) segment extends only about a third as far as the largest segment, since it represents the 20 law firms that have never released a formal report of their survey findings (20 is roughly one-third of 64). The third segment, in the upper right, represents the remaining 24 firms that have used both formats.

These preliminary findings of significant variability in reporting practices may reflect the decentralized style of large law firms. Individual practice groups or country offices can launch a research survey on their own and then decide how to release the results they obtain. Then too, as a firm becomes more familiar with research surveys, it may decide to shift how it brings the results to the world’s attention. Marketing functions may have more or less sway over budgets and over standards for releasing data. In the end, however, we must regard these findings as provisional, because further research may shift the composition of the three groups significantly.

Size of law firm correlated to whether it produces a PDF report

As of this writing, I have found 44 law firms that have released at least one set of survey results as a non-report; for those surveys, at least, I have not located a report in PDF format. Some of those law firms have produced PDF reports for other surveys, and I have those reports. As the comparison set, I have found 88 law firms that have produced at least one survey report in PDF format (what I refer to as a “formal report”). Some of those firms have also produced non-reports.

My goal was to find out whether smaller law firms, when they sponsor a research survey, are more likely to resort to non-reports. Unfortunately, it is not easy to determine the number of lawyers (or solicitors) in each of the firms. Precise data often exists, as is the case for large US law firms, but for firms based in the UK, Ireland, and Australia, data sources do not use consistent definitions of who is a practicing lawyer and who is a “legal professional” or “fee earner.” In any event, I did my best to record an appropriate number of lawyers for almost all of the firms noted above.

In short, the underlying data for the following observations is sketchy. So long as we recognize the methodological shortcomings, we can at least look at two summaries.

For the non-report firms, the average number of lawyers in the firm is 1,243; for the firms that produced a formal report, the average fell to 1,024. Considering the median number of lawyers, the non-report firms came in at 700 lawyers while the formal-report firms came in about ten percent lower, at 635. Based on these averages and medians, the two groups of firms seem reasonably similar in size as measured by number of lawyers. If anything, my hypothesis appears to be wrong: the non-report firms are larger!
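The size gap between the two medians can be checked directly; a quick sketch using the figures quoted above:

```python
# Median lawyer headcounts quoted in the text.
non_report_median = 700
formal_report_median = 635

# Percentage by which the formal-report median falls below the non-report median.
pct_lower = 100 * (non_report_median - formal_report_median) / non_report_median
print(round(pct_lower, 1))  # 9.3, i.e. roughly ten percent lower
```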

Increase recently in number of surveys without a published report

Of the more than 400 research surveys by law firms that I have tracked, about one in five is known only from a press release or an article. I have previously explained some caveats about that proportion, and I call the information I have located for those surveys “non-reports.”

In addition to the non-reports, I have learned of another 64 surveys but have categorized them as “Missing.” “Missing” denotes surveys that later surveys refer to but for which I have not located a press release, an article, or any other manifestation; all other evidence is missing.

Setting aside the “Missing” surveys, has the proportion of non-reports to published reports varied over the past few years? The plot that follows addresses that question. The height of the dark segment on top of each column, which represents the number of non-reports that year, jumped noticeably in 2017 and so far in 2018, so it appears that the proportion of non-reports has increased. To take one column as an example, the 2016 column shows 40 reports in the bottom, lighter segment and 5 non-reports in the top, dark segment.

Why might that be? Perhaps more firms are undertaking surveys, which has brought somewhat smaller firms into the pool, and they lack the resources to invest in a report. Alternatively, experience or instinct has led increasing numbers of firms to conclude that the return on investment from a report is not sufficient. Then again, perhaps I simply haven’t found the published reports.

By the way, we have seen no evidence that firms issue a press release and then, considerably later, a report. Rather, our sense is that the first publicity about the survey results arrives simultaneously with the publication of the report and other business development efforts.


More participants in surveys with reports than in surveys without formal reports

Of the 420-some research surveys by law firms that I know about, about one in five is known only from a press release or an article; I have located a formal report for all the others. To be forthright, the law firm or a co-coordinator may have published the results in a formal report that I have not yet located.

Obviously, it costs less to draft a press release or an article than to produce a report. Reports require design decisions, plots, graphics, and higher standards of quality. Moreover, aside from expense, it also seems plausible that firms choose not to go through the effort of preparing a formal report if the number of participants in the survey seems too small.

To test that hypothesis about relative numbers of participants, I looked at nine survey results — let’s call them “non-report surveys” — that disclose the number of survey participants. I also collected data from 204 formal reports that provide participant numbers. The average number of participants in the non-report surveys is 157. Where there is a report, the average is 420, but that number is hugely inflated by one survey with 16,000 participants. When we omit that survey, the average drops to 343 — still more than twice the average number of participants in the non-report surveys.
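The effect of dropping the single outlier on the average can be reproduced from the figures in the text:

```python
# Figures from the text: 204 reported surveys averaging 420 participants,
# one of which is an outlier with 16,000 participants.
n, avg, outlier = 204, 420, 16000

# Average of the remaining 203 surveys once the outlier is removed.
trimmed_avg = (n * avg - outlier) / (n - 1)
print(round(trimmed_avg))  # 343
```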

When we compare the median number of participants, the figures are 157 participants in the non-report surveys versus 203 participants in the reports. Thus, the medians show roughly 30 percent more participants where a report has been published than where one has not.

A statistical test looks at whether the difference in averages between two sets of numbers suggests that the sets are likely to have a meaningful difference — here, on participants. With our data, the crucial test value turns out to be so small that we can confidently reject the hypothesis that no difference exists between the two sets in terms of participants. Larger numbers of participants are strongly associated with reported surveys.

Technical note: For those interested in the statistics, we ran a Welch two-sample t-test and found a p-value of 0.0331, which means that if someone could sample over and over from the universe of reported and non-reported surveys, only about 3% of the time would such a large difference in averages show up by chance. Such a low percentage justifies concluding that the data comes from meaningfully different populations (the results are “statistically significant”). Bear in mind that I have not looked at all the non-reports in my collection; adding a few more of them to the group analyzed above could change the figures, and therefore the statistical test, materially. Or there may be a formal report somewhere.
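For readers curious about the mechanics, the Welch statistic itself is straightforward to compute. A sketch using Python's standard library, on hypothetical participant counts (the actual survey data is not reproduced here; a statistics package such as SciPy would also supply the p-value):

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two samples with (possibly) unequal variances."""
    va, vb = variance(a), variance(b)   # sample variances
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / (va / na + vb / nb) ** 0.5

# Hypothetical participant counts, for illustration only.
non_report = [120, 150, 160, 140, 180, 190, 130, 170, 175]
with_report = [210, 250, 300, 190, 420, 380, 260, 310, 275, 330]
print(welch_t(non_report, with_report))  # negative: non-report mean is smaller
```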

Published reports from law-firm surveys (compared to unpublished)

Most law firms that go through the effort to collect online survey data proceed to publish their results in a report. Almost always those reports are available on the firm’s website in PDF format. Out of the 349 surveys collected so far, 65% of them (227 surveys) are available online in PDF format. In a few instances, I obtained the reports directly from the law firm, so they may not be publicly available on the firm’s website.

Another 22% of the surveys (78) are evidenced only by a Word file I created that captures a press release or some other reference to a survey (in computer-speak, PDF = FALSE). Finally, 13% are deemed “Missing”: I know about the survey, perhaps from a statement in an extant survey, but not even a Word file memorializes it. The survey report is missing in action.
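The three shares quoted above can be recomputed from the raw counts:

```python
# Counts from the text: 227 PDF reports and 78 Word-file non-reports out of
# 349 surveys; the remainder are categorized as "Missing".
counts = {"PDF": 227, "non-report": 78, "missing": 349 - 227 - 78}
total = sum(counts.values())

shares = {k: round(100 * v / total) for k, v in counts.items()}
print(shares)  # {'PDF': 65, 'non-report': 22, 'missing': 13}
```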

For each column year in the plot, the light, yellow segment at the bottom conveys the number of surveys obtainable in PDF format (PDF = TRUE). The tiny green slivers in the middle represent the number of “Missing” surveys, and the remaining dark, purple segment at the top of each column represents the number of Word files.

It is likely that some of the missing and non-PDF surveys are in fact available in PDF format, but the arduous task of tracking them down and confirming that has not been completed. Also, we should note that in the last few years some survey reports have been published in formats other than PDF, as firms have adopted newer graphical-presentation software to create their research reports.