The Layperson’s Guide to Legal Employment Data

Commentary

Laura LeDuc, Associate Dean for Planning, Assessment and Accreditation | October 24, 2013

A newspaper headline one day reads: U.S. Unemployment Rate Falls to Lowest Point in Three Years. The next day another reads Job Outlook Poor for Recent Graduates. The articles each support their headlines with what appears to be official data from reliable sources, but the official data seems to lead to opposite conclusions about what’s happening in the job market. How is this possible? How is the average reader supposed to know which news item paints the correct picture?

Answering these questions requires readers to think critically and to have some background on the different types of official employment data collected in the United States. There are two main sources of legal employment data: government data and non-government data. Let’s take a look at each type of data, as well as some common pitfalls in data reporting, and explore why there is so much contradiction and confusion--and why there are so many misleading conclusions--in the headlines we read today.

GOVERNMENT DATA SOURCES: The U.S. Department of Labor-Bureau of Labor Statistics

The most comprehensive and most often cited U.S. employment data comes from one source: the U.S. Department of Labor’s Bureau of Labor Statistics (BLS). BLS is the federal government’s official collector and reporter of labor and economic statistics, including employment, unemployment, wages, inflation, the consumer price index, and productivity. BLS data is the most “official” you can get, but citing BLS data alone can still lead to confusion and, sometimes, contradictory conclusions. Why? Because the agency collects employment-related data from multiple sources, under different timelines, and with different methodologies. Understanding how, how often, and from whom the BLS collects its data is the key to understanding the data itself.

The main BLS jobs-related data sources are the Current Population Survey (CPS), Current Employment Statistics (CES), Occupational Employment Statistics (OES), and the Occupational Outlook Handbook (OOH). The information presented below is publicly available on the BLS websites listed for each source.

Source 1: BLS Current Population Survey (CPS)

The CPS is sometimes called the “household survey” because it surveys households--about 60,000 households across all 50 states--every month. The purpose of the monthly survey is to determine the labor force status of the civilian non-institutional population 15 years of age and older. It does this by surveying all members of the household who are of appropriate age. While data is collected for 15 year olds, child labor and compulsory school attendance laws severely limit the work they can do, so the CPS reports include only persons aged 16 and older. The CPS reports also exclude active duty military personnel and any institutionalized persons.

According to the BLS website, the CPS data is:

As such, CPS data is the most often reported in the media. The monthly, quarterly, and annual national unemployment rates reported by the federal government are calculated using CPS data. In addition, BLS uses its CPS data for the following purposes:

In digesting CPS employment data, it is essential to understand who the CPS counts as employed or not employed. The Current Population Survey counts as employed all persons 16 and over who:

(1) did any work at all as paid employees, worked in their own business or profession or on their own farm, or worked 15 hours or more as unpaid workers in a family-operated enterprise; and

(2) all those who did not work but had jobs or businesses from which they were temporarily absent due to illness, bad weather, vacation, childcare problems, labor dispute, maternity or paternity leave, or other family or personal obligations — whether or not they were paid by their employers for the time off and whether or not they were seeking other jobs.

CPS counts as unemployed all persons who:

(1) had no employment during the reference week;

(2) were available for work, except for temporary illness; and

(3) made specific efforts, such as contacting employers, to find employment sometime during the 4-week period ending with the reference week. Persons who were waiting to be recalled to a job from which they had been laid off need not have been looking for work to be classified as unemployed.

In a nutshell, CPS counts as unemployed only those who did not work but were actively seeking work during the survey period. The survey process inevitably finds individuals who do not meet either the CPS definition of employed or unemployed. These individuals are categorized as “not in the labor force.” A good example of a person not in the labor force is a stay-at-home mom who chooses not to work outside the home. There is also a sub-group of persons not in the labor force who want and are available for a job and who have looked for work sometime in the previous 12 months, but who did not look for work during the survey period. These people are designated as “marginally attached to the labor force,” and they are further sub-divided into two groups:

(1) those not currently looking because they believe their search would be futile — so-called discouraged workers; and

(2) those not currently looking for other reasons such as family responsibilities, ill health, or lack of transportation.

For discouraged workers, the reasons for not currently looking for work are that the individual believes that: No work is available in his or her line of work or area; he or she could not find any work; he or she lacks necessary schooling, training, skills, or experience; employers would think he or she is too young or too old; or he or she would encounter hiring discrimination.

It is important to keep these definitions in mind when looking at CPS employment data next to data from other sources, since failing to do so can lead to an unwitting apples-to-oranges comparison.

Source 2: BLS Current Employment Statistics (CES)

The Current Employment Statistics are sometimes called the “establishment survey” because each month, the CES program surveys about 141,000 businesses and government agencies in order to provide detailed industry data on employment, hours, and earnings of workers on nonfarm payrolls. The establishments surveyed each month represent approximately 486,000 individual worksites.

From the establishment survey, BLS calculates and reports monthly estimates of nonfarm payroll jobs in the U.S. economy, work hours, and earnings for the Nation, States, and major metropolitan areas. According to the BLS website, the CES monthly employment series are the first economic indicator of current economic trends each month and are used in computing many gauges of the U.S. economy including:

CES employment data are inputs into other major economic indicators reported by the Bureau:

Since the CES survey recipients are business establishments, the definition of employment differs from the CPS definition. In the CES survey, employment is defined as:

Included as employed are:

Excluded from the data are:

The CPS and CES definitions differ tremendously because the surveys target two very different groups. It is up to the informed reader to decide which definition paints the more accurate picture for any particular statistic or forecast reported from these two surveys.

This leads to an excellent question: Why are there two monthly measures of employment?

A clear answer is given in the BLS Frequently Asked Questions, which states:

The establishment survey and household survey both produce sample-based estimates of employment and both have strengths and limitations. The establishment survey employment series has a smaller margin of error on the measurement of month-to-month change than the household survey because of its much larger sample size. An over-the-month employment change of about 100,000 is statistically significant in the establishment survey, while the threshold for a statistically significant change in the household survey is about 400,000.

The establishment survey provides employment estimates for over 900 detailed industries, for the United States, each State, the District of Columbia, Puerto Rico, the U.S. Virgin Islands, and about 400 metropolitan areas. The household survey has a more expansive scope than the establishment survey, because it includes the self-employed, unpaid family workers, agricultural workers, and private household workers, who are excluded by the establishment survey. The household survey also provides estimates of employment for demographic groups.

Source 3: BLS Occupational Employment Statistics (OES)

Like the CES, the OES program surveys establishments. The OES program surveys approximately 200,000 different establishments by mail every six months, which equates to a sample of 1.2 million establishments over a 3-year cycle. Data from self-employed persons are not collected and are not included in the estimates produced with OES data.

According to the BLS website, the OES program collects data on wage and salary workers in nonfarm establishments in order to produce employment and wage estimates for about 800 different occupations.

The OES program produces occupational estimates by geographic area, and by industry and ownership. The BLS website states that these estimates are used for:

BLS releases dozens of separate reports on a monthly, quarterly, and annual basis from the data in the CPS, CES, and OES. In addition, every two years the Bureau creates 10-year Employment Projections (EP) from compiled BLS and U.S. Census Bureau data.

Source 4: BLS Employment Projections (EP) and Occupational Outlook Handbook (OOH)

The Employment Projections and the Occupational Outlook Handbook are two more BLS data sources. The OOH is based on the EP, so they are discussed together. The EP program develops 10-year economic, employment, and labor force projections for the U.S. labor market as a whole. The projections are updated every two years, the most recent being for 2010 to 2020, published in February 2012.

The BLS acknowledges that the OOH data may differ from CPS and OES data, and it offers the following explanation of those differences:

The employment figure given for a particular occupation in the Occupational Outlook Handbook (OOH) is total employment (for all classes of workers) from the Bureau's National Employment Matrix, which combines employment data from several different sources. Data in the matrix come primarily from the establishment-based OES survey, which reports employment of wage and salary workers only, for each occupation in every industry except agriculture and private households.

Matrix data also come from the household-based CPS, which provides information on the number of self-employed and unpaid family workers in each occupation. In addition, the matrix incorporates CPS employment data for all classes of workers—wage and salary, self-employed, and unpaid family workers—in the agriculture and private household industries.

Furthermore, some OOH occupations combine several occupations listed in the matrix. For these reasons, employment numbers cited in the OOH often differ from employment data provided by the OES survey, CPS, and other employment surveys.

The Bureau also notes that its employment projections are based “on analysis of long-term structural changes to the economy, not short-term business cycle fluctuations. BLS does not attempt to project the peaks and troughs of business cycles, and the current BLS projections model assumes a full employment economy in 2020, the target year. The 2010 (base year) employment for many industries still had not recovered to pre-recessionary levels when the 2010-20 projections were developed. This low employment, coupled with an expected return to full employment over the 10-year projections period, means faster growth rates and more numerous openings than might have been expected in these industries and their occupations had the recession not occurred.”

NON-GOVERNMENT DATA SOURCES:

Non-government sources of legal employment data have recently found their way into news headlines and have produced some of the most confusing, if not the most controversial, conclusions. There are three main sources of legal employment data: the National Association for Law Placement (NALP), the American Bar Association (ABA), and the various commercial publications that produce law school guides and rankings.

Each of these three sources collects placement data and calculates placement rates very differently. Because of this, the reliability of the data varies from source to source. Let’s look at each source and how its data collection, placement rate calculations, and other factors influence the reliability of the data.

SOURCE 5: The National Association for Law Placement (NALP)

The National Association for Law Placement (NALP) is the primary source of national data on the employment of recent law school graduates. NALP is an independent educational association established in 1971 to meet the needs of all participants in the legal employment process, which includes career planning, recruitment and hiring, and the professional development of law students and lawyers. NALP’s membership includes virtually every ABA-approved law school in the US and Canada and hundreds of legal employers from both the public and private sectors.

NALP began collecting employment data for recent law school graduates with the Class of 1975, and since 2001 NALP has used essentially the same basic format and categories to describe the nature of legal employment. NALP’s employment categories are not established by the law schools or the ABA; they were developed by NALP.

NALP collects placement data about recent law school graduates through a written survey instrument that it releases each fall. Along with the survey instrument, NALP provides law schools with detailed instructions, definitions, and best practices for administering the survey, and, by virtue of the published principles and standards law schools agree to through their membership in NALP, law schools agree to follow the survey instructions. Once law schools receive the survey instrument from NALP, they administer it to their recent graduates in both paper and online formats until the survey deadline. Because the employment data comes directly from the graduates, the NALP survey data is similar to the Bureau of Labor Statistics Current Population Survey.

Once the survey period is complete, the law schools report the raw data they receive from their graduates to NALP. The schools themselves do not calculate any employment rates. Note that NALP data is self-reported by the graduates themselves. Law schools are not able to mandate submission of employment information from their graduates, but despite this lack of mandate, over the past ten years, NALP’s data has included the employment status for between 95% and 96% of the graduates of the reporting schools and between 91% and 93% of all ABA graduates. From a research statistics standpoint, the higher the response rate to a survey, the more confidence one can have in the accuracy of the resulting data.

Once schools submit their raw data to NALP, NALP aggregates the data and, each August, NALP releases a publication, entitled Jobs & JDs, which reports detailed national findings on the surveyed class. Included in this publication are NALP’s findings on:

NALP also provides each school with a summary of similar findings for its own graduates.

The main NALP data reported in the media relates to the calculated national employment and unemployment rates. NALP calculates the employment rate by dividing the number of graduates known to be employed by the number of graduates with known employment status. For the Class of 2011, those with known employment status represented about 97% of all graduates reported to NALP and about 94% of all graduates of ABA-approved law schools.

The employment rate calculation method NALP uses has been criticized in the media, but this criticism is unjustified given the data reliability that results from an excellent survey response rate. Critics advocate that the employment rate should be calculated by dividing the number of graduates known to be employed by the total number of ABA graduates. The critics’ proposed method assumes that every graduate who did not report placement data is unemployed, which is a virtual impossibility because a large percentage of the missing 6% of graduates represents the entire graduating classes of law schools that do not report any data to NALP. There is no legitimate basis to conclude that 100% of the graduates from, on average, three non-reporting schools were unemployed, or to believe that these non-reporting schools would have unemployment rates outside the range of the reporting schools. An examination of the placement data from these missing schools--data the schools are required to report to the ABA and which is published in the ABA-LSAC Official Guide to Law Schools--clearly demonstrates that the employment rates for the missing schools closely approximate those of the reporting schools.
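The two competing calculations can be sketched in a few lines. The graduate counts below are hypothetical, chosen only to illustrate how the choice of denominator moves the rate; they are not NALP's actual figures.

```python
# Sketch of the two competing employment-rate formulas.
# All counts are hypothetical, for illustration only.
employed = 36_000      # graduates known to be employed
known_status = 38_000  # graduates whose employment status is known
total_grads = 40_000   # all graduates, including non-reporting schools

# NALP's method: employed divided by those with known status.
nalp_rate = employed / known_status * 100

# Critics' method: employed divided by all graduates, which
# implicitly treats every graduate with unknown status as unemployed.
critics_rate = employed / total_grads * 100

print(round(nalp_rate, 1), round(critics_rate, 1))  # 94.7 90.0
```

The gap between the two rates grows with the share of graduates whose status is unknown, which is why the method matters most for schools or classes with low response rates.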

In news articles, NALP’s calculated employment rate is often erroneously reported in the inverse, meaning that NALP reports the national employment rate but the media reports the remaining percentage as the unemployment rate. This is not the case, and this practice is the main reason NALP’s data can be confusing and misleading. Recent graduates whose employment status is known but who are not listed as employed are categorized by NALP in one of three categories: Enrolled Full-Time in Advanced Degree Program, Unemployed-Seeking Work, or Unemployed-Not Seeking Work.

Graduates who end up in NALP’s Unemployed-Not Seeking and Enrolled Full-Time Advanced Degree categories fit the BLS Current Population Survey definition of “not in the labor force” and as such should not be included in calculating the unemployment rate.

Below is an example of how reporting the inverse of the 2010 NALP employment rate is misleading. NALP reported that 87.6% of 2010 graduates were employed, but a news article reported the inverse, stating that 12.4% of recent law school graduates were unemployed. The actual unemployment rate for the Class of 2010, using the BLS definition, was 6.2%--half the rate reported in the news article.

NALP 2010 GRADUATE DATA                                    % of Grads with Known Employment Status
Graduates Known to be Employed                             87.6%
Graduates Enrolled Full Time in Advanced Degree Program     2.9%
Graduates Unemployed-Seeking Work                           6.2%
Graduates Unemployed-Not Seeking Work                       3.2%
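The arithmetic behind the table can be sketched as follows, using the Class of 2010 percentages above:

```python
# Why the inverse of NALP's employment rate is not the unemployment rate.
# Percentages are the Class of 2010 figures from the table above.
employed = 87.6
advanced_degree_full_time = 2.9   # "not in the labor force" under the BLS definition
unemployed_seeking = 6.2
unemployed_not_seeking = 3.2      # also "not in the labor force"

# What the news article reported: simply 100 minus the employment rate.
inverse_rate = round(100 - employed, 1)

# Under the BLS definition, only those actively seeking work count
# as unemployed.
unemployment = unemployed_seeking

# Most of the gap between the two figures is the share not in the labor
# force (the published shares are rounded, so they do not sum to exactly 100).
not_in_labor_force = round(advanced_degree_full_time + unemployed_not_seeking, 1)

print(inverse_rate, unemployment, not_in_labor_force)  # 12.4 6.2 6.1
```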

Given the consistency of the format and the high response rate over the years, NALP’s data is trustworthy, and it is especially valuable for identifying trends and making comparisons. Unfortunately, the way NALP’s data is reported in the media can be misleading, so readers should use a critical eye when scanning the headlines and blogs.

SOURCE 6: The American Bar Association (ABA)

The American Bar Association (ABA) is the only other official source of national employment data on recent law school graduates. While the ABA publishes placement data by school, it does not calculate any employment or unemployment rates or other statistics about individual schools or law schools collectively. The ABA also does not issue any findings, estimates, or projections based on the data. Essentially, only the raw data is available, so you are unlikely to see many news articles quoting ABA data. It is, however, an excellent resource for research.

Through 2010, the reporting categories used by the ABA mirrored NALP’s categories, and schools were asked to submit their employment data as they reported it to NALP in their annual reports to the ABA. Except for the handful of schools that do not report to NALP, the NALP data and ABA data are essentially the same, with the main difference being that NALP publishes the data in the aggregate while the ABA publishes the data by individual school in its ABA-LSAC Official Guide to Approved Law Schools. Up through the 2012 Official Guide, the data was presented generally in the same format each year, allowing for easy comparisons and trend analyses for individual law schools.

Beginning with the Class of 2010, the ABA changed how law schools were required to report their placement data. Law schools now must submit data on each individual graduate directly to the ABA rather than submitting NALP’s calculated placement rates for the school. The ABA also issued its own instructions and definitions, which generally match those of NALP, though there are minor definitional differences in some categories. The ABA uses the same three sub-categories as NALP for graduates who are not employed: Enrolled Full-Time in Advanced Degree Program, Unemployed-Seeking Work, and Unemployed-Not Seeking Work.

The ABA also no longer publishes placement data in the Official Guide, instead offering it in electronic format on a dedicated website (listed above).

SOURCE 7: Rankings Publications and Guides

All of these commercial publications are secondary data sources. Law schools report data to these publications on a voluntary basis. The publications are under no obligation to report accurately and do no auditing of the data submitted by the schools. As such, they are the least reliable sources of legal employment data.

These publications generally ask schools to submit the same data as they do to NALP or the ABA. The major difference is that some publications take the data submitted by the schools and calculate placement rates using a different formula than NALP uses. For example, until its 2011 law school ranking, U.S. News and World Report chose to arbitrarily count 75% of graduates with an unknown employment status as unemployed. This is a statistically invalid assumption, especially in light of the known data, in which there is a high degree of confidence because the sample size is greater than 95% of the graduating class population.

The current U.S. News employment rate calculation--which it asserts in its methodology is a more realistic presentation of the employment data--divides the total number of employed graduates by the total number of graduates, regardless of whether their employment status was known. This calculation method results in an artificially low employment rate because it assumes that every graduate with an unknown employment status is unemployed, which, as refuted above, cannot possibly be true.

Readers should be wary of any legal employment data presented in these publications, especially those presenting employment and unemployment rates. Readers should ignore these commercial publications and obtain more accurate and reliable legal employment data directly from NALP or the ABA.

Additional Data Pitfall #1: Devil is in the Details

Industry-specific CPS, CES, and OES data is often cited in news articles, and this can lead to confusion because the level of detail being reported often differs among the three. The CPS, CES, and OES programs all use classifications from either the North American Industry Classification System (NAICS, the system used by the U.S. Census Bureau) or the Standard Occupational Classification (SOC, used by the Bureau of Labor Statistics and the Office of Management and Budget). The major difference between the two is fundamental--NAICS classifies establishments, while the SOC classifies worker occupations.

In media articles it is rarely clear which system has been used, or what level of detail is being reported. For example, how is a reader to know whether a news article that simply recounts job losses in legal services is talking about the NAICS “Legal Services” industry classification or the SOC “Legal Occupations” occupational classification or what types of law-related jobs are counted in each? Not knowing the underlying classification system (or sub-classification therein) can lead to misleading interpretations and overly simplified conclusions.

In the above example, it is completely possible for a subset of “Legal Occupations” to have a very different trend than that reported for the classification as a whole. How? Under the SOC heading “Legal Occupations” are two very different sub-groups:

(1) Lawyers, Judges, and Related Judicial Workers, and
     (a) Lawyers and Judicial Law Clerks
     (b) Judges, Magistrates, and Other Judicial Workers (Mediators, Arbitrators, etc.)
(2) Legal Support Workers
     (a) Paralegals and Legal Assistants
     (b) Miscellaneous Legal Support Workers

The first group requires a completely different educational background than the second group, and workers in each group tend to have very different pay scales. Because of these distinctions, changes in the economic climate affect each group differently. The number of workers in each sub-group, relative to the overall classification, also impacts the bottom line.

A news item reports that the legal sector gained only 54,000 jobs from 2010 to 2011, and based on this CPS statistic, the article either implies or directly concludes that the job market for lawyers is weak. A look at the CPS detail behind this headline shows that the article is misleading: the lawyers, judges, and paralegals and legal assistants categories collectively gained 105,000 jobs during that same period, but that gain was offset by 50,000 jobs lost by miscellaneous legal support workers. In reality, job gains for lawyers, judges, and paralegals/legal assistants were nearly double the reported number.

Broad statements about wider industries should be a signal for the reader to look deeper into the details behind the headline. This is especially true if conclusions are drawn about sub-groups based on the wider group, but there is no accompanying data for any sub-groups.

Additional Data Pitfall #2—Definition of Employed

Some non-government data sources, and media articles based on them, claim that lawyers or recent law grads working as solo practitioners should not be counted as employed. Some of these sources even question whether those working in small law firms should be counted as employed. These claims are shocking when you consider that, according to the most recent statistics from the American Bar Association, 49% of all lawyers in private practice in the U.S. are solo practitioners. Another 14% work in small firms of fewer than six attorneys, which means 63% of all lawyers in private practice work solo or in small firms--nearly half of all lawyers in the country.

Excluding solo practitioners and small firm lawyers from the employment definition is even more illogical in light of changing trends in the lawyer-client relationship. Increasingly over the past decade, clients are turning toward small, boutique law firms and away from the big firms. Today’s clients are seeking both small-firm expertise and the more cost-effective solution to legal representation that small firms provide. This demand for small-firm services is higher than ever, and the market is still adjusting to meet this increased demand, so many of the available jobs are at small firms.

Another claim made by some non-government sources is that anyone employed anyplace other than a traditional law firm should not count as employed. They use this argument in an attempt to justify the conclusion that obtaining a law degree is no longer a worthwhile or financially lucrative pursuit. This is another argument that defies both logic and historical data. Earning a JD degree has never been only about working in a law firm. Those with JD degrees can and do succeed in a wide variety of settings, and have done so since NALP began tracking the data in 1975. It is precisely because the skills that JD earners possess are so versatile that the legal profession has been able to weather slow economic periods better than almost every other profession.

Following the premise that only those employed in traditional law firms count as employed would exclude anyone employed in government, non-profit groups, academia, and virtually every type of business and industry. That logic would exclude the following people, all of whom indisputably had successful careers and held financially lucrative jobs:

More businesses and universities than ever before are hiring lawyers for their most prestigious administrative posts, because the skills lawyers possess match the skills these complex positions require. It should be noted that, with the exception of the Jerry Sandusky prosecutor example, none of these jobs requires a JD degree; they are in a category known as “JD Advantage” or “JD Preferred.” Some non-government data sources and media outlets may not count people working in these types of jobs as employed because the positions do not require bar passage.

Additional Data Pitfall #3—Conclusions Based on Data that Lack Historical Context

Media articles, especially headlines, often reach conclusions based on too little data to support them. For example, a news article states that the job market has “plummeted the last three years.” Often missing from this type of conclusory statement is any analysis of how the data compares in a wider context. Any article that analyzes or draws conclusions from three years’ worth of data without providing context for that data should be viewed skeptically. Words like “plummeted” prompt the question: plummeted from what? Is the starting point of the downward trend a historical high or an average year? Was it a typical year when compared to the last ten, twenty, or thirty years? It is also important to know how the data in the stated trend period compares to the mean and median for the same period and for longer periods that encompass the trend years.

What seems to be a drastic reduction over three consecutive years could, in fact, be a market adjustment to the previous five years of increases. This is precisely the case for legal employment if you consider the three consecutive decreases in employment that occurred for 2008, 2009, and 2010 graduates. This three-year “trend” makes it appear as though the proverbial sky is falling, but when you look at these same three years in a wider context, their impact is far less apocalyptic. Pictures are worth much more than words in illustrating this point:

Chart 1: The 3-Year Data

NALP 3-Year Employment Data

This chart isolates the data and fails to give any comparative context. It also fails to show that 2007 was a 10-year high.

Chart 2: The 3-Year Data in a 10-Year Context

NALP 10-Year Employment Data

The 10-year chart provides more context than the three-year chart because it illustrates that 2007 was a 10-year peak and it shows the employment rate each year compared to the 10-year mean. However, in instances where you are trying to analyze how the current legal employment market compares to those during and following recessions, there is still insufficient historical context in the 10-year chart.

Chart 3: The 3-year Data in a 36-Year Context

NALP 36-Year Employment Data

The 36-year chart provides the greatest context for analyzing the data. It includes multiple upward and downward cycles for comparison and illustrates how the 3-year data looks in comparison to historical highs and lows. This chart also reveals something important that neither of the previous two charts shows--the historical low. It shows the frequency and severity of the downward trends and provides information on how the market rebounded after each, allowing readers to see how the current downward trend compares. Consumers should understand that without context, most data is meaningless and can be easily manipulated to support a desired conclusion. Any data analysis, trends, or conclusions based on data that lack adequate context should be questioned.

Additional Data Pitfall #4—The Percentage Game

Understanding that percentages, such as employment and unemployment rates, can be calculated many different ways is vital to analyzing data reported in the media. Be wary of any article that cites a percentage without showing you how that percentage was calculated. Also be wary of any article that provides only a portion of data and leaves the reader to calculate a percentage based on that partial data.

For example, a news article that accurately states the ABA's reported number of employed graduates and the total number of graduates for any given law school, but fails to provide the school's reported number of graduates for whom employment status is known, is asking the reader to calculate a percentage based on a statistically invalid assumption. Unless a school knows the employment status of 100% of its graduates, any percentage calculated using the total number of graduates as the denominator, instead of the total number of graduates for whom employment status is known, will result in a lower employment rate because it assumes that every graduate with an unknown status is unemployed.

To illustrate with numbers: Law School A's data is reported in an article as 100 total graduates and 50 employed. Missing is the reported number of graduates with known employment status, which is 75. The article leaves readers to do the math based on only the 100 total grads and 50 reported employed. The unwitting reader incorrectly concludes that Law School A has a 50% employment rate. The employment rate should instead be calculated by dividing 50 by 75, yielding a 66.7% employment rate because, as discussed above, there is no statistically valid reason to conclude that the graduates who did not report their status are all unemployed.
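The arithmetic above can be written out in a few lines of Python, using the hypothetical Law School A figures from the example:

```python
# Hypothetical Law School A figures from the example above.
total_graduates = 100   # all graduates, including those with unknown status
employed = 50           # graduates known to be employed
known_status = 75       # graduates whose employment status was reported

# Misleading rate: implicitly treats every unknown-status graduate
# as unemployed by using all graduates as the denominator.
naive_rate = employed / total_graduates

# Valid rate: uses only graduates whose status is actually known.
valid_rate = employed / known_status

print(f"Naive rate: {naive_rate:.1%}")   # Naive rate: 50.0%
print(f"Valid rate: {valid_rate:.1%}")   # Valid rate: 66.7%
```

The only difference between the two figures is the choice of denominator, which is exactly the detail a careful reader should demand from any article that reports a placement percentage.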

CONCLUSION

After reviewing the fundamental differences between the seven main sources of legal employment data presented here, it is easy to see why interpreting that data can be so confusing. The best way to combat confusion is to read critically, examine the source of the data, and look further for detail when general statements are made. This guide has borrowed liberally from, paraphrased, and often directly quoted the language found on the source websites, particularly the methodology portions of those sites. Quotations and footnotes were not used, to enhance readability. The relevant websites are identified in each section, and the methodology sections are easily navigable from those main sites. These websites are excellent resources to check for further details and explanations.

In general, the BLS household survey (CPS) and the NALP data on recent graduates are reliable data sources on lawyer employment and unemployment. They are both particularly useful resources for analyzing industry trends, and because both sources collect data directly from individual workers, looking at the BLS and NALP trends together can assist with understanding the whole legal employment picture.

The ABA is the only source of data available for school-by-school analysis. Each school's data is reported publicly as raw numbers, so any placement rates in an article based on ABA data were calculated not by the ABA but by the author of the article.

In conclusion, here are a few key things to remember when analyzing legal employment data:

  1. Be an informed consumer—make sure you understand which data source is being quoted, and check the accuracy of the data by going directly to the source.
  2. Unemployment is not the reciprocal of employment. The entire picture is made up of four groups—the known employed, the known unemployed job seekers, those known to be not in the labor force (those who are unemployed and not seeking work plus those in graduate school full time) and those with unknown status. These definitions hold true even for figures broken down by job type.
  3. Determine whether a given unemployment figure includes only those who are actually in the job market. Remember that the official Bureau of Labor Statistics national unemployment rate does not include persons who are not actively seeking work, so to get an apples-to-apples comparison to NALP’s data, those who reported that they were attending graduate school full time or were not seeking work should be excluded from the NALP unemployment figure.
  4. Determine whether the data figures, trends, or conclusions for a broad group are the same for each sub-group within it. (Pitfall #1)
  5. Determine whether any categories of jobs are being excluded from the total number of employed. (Pitfall #2)
  6. Determine whether conclusions or trends based on a few years worth of data have adequate context to support those conclusions. (Pitfall #3)
  7. Determine how any percentages have been calculated and whether those percentages were calculated by the data source or by the author of the article. (Pitfall #4)
  8. For any news item, analyze the overall adequacy of detail provided, the context provided for any conclusions, and whether or not the author has calculated any percentages fairly and logically; if any of these areas is lacking, you should question the article’s reliability.
  9. Finally, use common sense and think critically—asking whether the author or media source has a particular bias or goal that they are trying to achieve is often the only question that needs to be answered.