Frequently-used terms

Several terms crop up so frequently here that they deserve definitions, along with mention of the alternative phrasings or synonyms that might appear.

  • Survey. Any questionnaire that a law firm administers to collect information from participants, whether online, in hard copy, by electronic voting, or during an interview. Sometimes it is referred to as a “poll” or a “straw vote.”
  • Contact. Anyone who is invited to participate in a survey. We may sometimes refer to them as “invitees,” “clients” or “prospects.”
  • Participant. Anyone who starts a survey. A participant who submits answers to a survey is a “respondent.”
  • Respondent. A participant who completes and submits a survey.
  • Company. The organization of a person who takes a survey. Most often a company will be an incorporated entity, but the term also applies broadly to partnerships, not-for-profit organizations, governmental entities and any other entity.
  • Report. The electronic file or hard copy publication that contains a survey’s findings and analysis. Most typically an electronic report is in PDF format. It could, however, be in a Word file, PowerPoint deck or other formats.
  • Text. Whatever is written or listed in the survey’s report.
  • Graphic. A plot or table that displays data. We also refer to them as “graphs.” If an element of a report does not convey data, then it would be text or a “design element.”
  • Design element. Anything in a report that is neither text nor a graphic, such as borders, images, pictures, lines, shapes, glyphs, or other elements.

Cure ambiguities in selections for multiple-choice questions

When creating the choices for a multiple-choice question, a careful developer will take time to be sure that the choices are as unambiguous as possible. Helping respondents understand what each choice means may entail writing a definition of the term; your survey software needs to support this capability, or you may have to include the definitions in the body text of the question. Additionally, a conscientious developer will ask several people to vet the choices for clarity before releasing the survey.

The survey conducted by Berwin Leighton Paisner in 2014¹ offers an instructive example of the importance of defining terms. Below you can see the graphical results of the question.

However, the report does not include the actual form of the question asked on the survey, so we do not know whether any of the choices were defined. If we assume that the question asked something like “What is your role?”, we might further assume that the position choices were simply those shown as the five labels along the x axis at the bottom of the plot. Is each of them clear?

If a respondent were the general counsel for North America of a global company that has a global chief legal officer, which selection is appropriate? If a lawyer admitted to practice is working in the risk or compliance group, should she select that group or “In-house lawyer”? This example admittedly uses titles that are quite commonly included in research surveys, but the important lesson remains: with multiple-choice questions, try to wring out blurriness and varied interpretations of key terms.

A second observation about this particular finding concerns the relatively large number of “Other” responses. If BLP’s survey included a text box for respondents to provide a title not covered by the four given, it would have been better to review those write-in titles and create another position or two to account for some or all of them specifically. Without further insight into the positions of respondents who selected “Other,” the category is quite large relative to the remaining four and creates an analytic hole if the law firm wants to analyze responses by position.
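The review of write-in “Other” titles described above can be done with a few lines of scripting. The sketch below, using entirely hypothetical titles (BLP’s actual write-ins are not published), tallies the free-text responses and flags any title that recurs as a candidate for promotion to its own answer choice in a future survey.

```python
from collections import Counter

# Hypothetical free-text titles typed by respondents who chose "Other"
# (illustrative only; the real survey's write-ins are unknown).
other_titles = [
    "Deputy General Counsel",
    "Head of Legal Operations",
    "Deputy General Counsel",
    "Contracts Manager",
    "Head of Legal Operations",
    "Deputy General Counsel",
]

# Tally the write-ins; any title appearing at least twice is a candidate
# for its own answer choice, shrinking the "Other" catch-all.
counts = Counter(other_titles)
candidates = [title for title, n in counts.items() if n >= 2]
print(candidates)
```

The threshold of two is arbitrary; with hundreds of respondents a firm might require five or ten recurrences before adding a choice.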


  1. Previous posts explain the set of research surveys by 15 law firms, of which BLP’s is one.