An Examination of the American Coal Industry (Part 1)

Thomas Shelby
38 min read · Jan 10, 2023


It is often said that the American coal industry is a prime example of worker exploitation: miners worked long hours under harsh conditions and were supposedly trapped in lifelong debt because their wages were eaten up by the company stores in the company towns where they lived.

However, I will argue in Part 1 that, contrary to this popular account, improvements in productivity and the accumulation of capital allowed wages to rise and working conditions to improve.

Company stores, which will be addressed in Part 2, were often the best alternative miners had for access to goods; the prices they charged for the same basket of goods were similar to those of independent stores. The same could be said of company towns, which varied in quality but nonetheless provided adequate accommodations for coal miners.

Wages in the Coal Industry

In writing about the coal industry, economic historian Price Fishback states in Soft Coal, Hard Choices: The Economic Welfare of Bituminous Coal Miners, 1890–1930 that wages were quite high relative to other sectors of the economy in order to compensate for the risk that came with the job, and that wages in the industry were highly correlated with changes in coal prices:

Bituminous coal miners in the early 1900s are commonly perceived as receiving low pay for dangerous work in isolated regions. The perception is only partially right. Estimates from various sources show that hourly earnings in coal mining were substantially higher than in manufacturing until the late 1920s. However, the mines typically were open seventy fewer days a year than manufacturing concerns, leaving annual earnings in coal about the same or slightly lower than in manufacturing. High hourly earnings in coal mining helped compensate miners for accepting greater risk of injury, a limit on work opportunities, and living in an isolated area. On the other hand, a worker had to face the rigors of work seventy more days in manufacturing just to earn the same or slightly higher income he could earn in coal mining. Further, a worker had far more independence in making workplace decisions in coal mining than in manufacturing.

Since the United Mine Workers of America (UMWA) was among the most successful unions in the early 1900s, the story of the coal industry has often been told in terms of union struggles. The UMWA has been given credit for much of the improvement in coal wages during the boom in coal mining through World War I. Many see the UMWA’s demise as the cause of the decline in wages during the 1920s. The timing of the rise and fall of the U.S. average coal earnings and the UMWA seems so right that few question this description. Yet there were other secular trends that may be as important if not more so. Labor demand in the coal industry rose and fell with changes in coal prices, and coal companies were forced to match the secular rise in manufacturing wages if they wanted to continue to attract workers. Regression analysis of the U.S. average hourly earnings in coal mining helps sort out the relative importance of the union and these other factors.

(studies that support an occupational hazard wage premium: here, here, here, here, here)

Source: Soft Coal, Hard Choices

In the early 1890s, coal hourly earnings were roughly 11 to 13 percent higher than manufacturing earnings before plunging to 11 percent less in 1897. As the United Mine Workers established their base in the Central Competitive Field, coal hourly earnings jumped to 28 percent above male manufacturing earnings by 1900.

The tremendous coal boom in the early 1900s kept coal earnings between 35 and 44 percent higher than manufacturing earnings until 1913. From 1913 through World War I coal hourly earnings stayed around 30 percent higher than manufacturing earnings. Surprisingly, during the period of labor strife and the severe economic downturn of the early 1920s, coal mining hourly earnings rose relative to manufacturing hourly earnings. Although 1921 and 1922 brought significant drops in coal annual earnings, hourly earnings rose to levels 71 and 82 percent greater than manufacturing earnings. As coal demand stagnated and the strength of the United Mine Workers waned, coal hourly earnings dropped relative to manufacturing hourly earnings, bottoming out at 3 percent less in 1933…

The gap between coal and manufacturing earnings in Table 6–2 is smaller than in Table 6–1 in part because the Conference Board’s survey was skewed toward larger firms in the Northeast. However, both tables show the pattern of a sharp rise to 1922 and then a decline through the rest of the 1920s. Until the late 1920s, coal hourly earnings were higher than earnings in such industry classifications as automobiles, iron and steel, and foundries and machine shops.

In fact, coal earnings exceeded the earnings of all the Conference Board industries except printing until the late 1920s. Comparisons so far may be misleading, because the skills required in various industries differed. We can control for skill differences across industries by examining the earnings of the unskilled workers listed in Table 6–2. Unskilled workers generally had more mobility between industries than other workers because they were less likely to have skills specific to only one industry. Inside laborers and outside laborers in coal mining definitely were unskilled.

Most loaders received minimal training, much of which was teaching the man how to stay alive while loading coal. Many saw the replacement of pick miners with cutting machines and coal loaders as a major form of deskilling of the coal miner’s task.

However, the large differences in earnings between loaders and inside laborers in Table 6–2 suggest that loaders were more than unskilled workers. Even if coal loaders were semiskilled workers, the comparisons of their earnings and those of unskilled manufacturing workers are instructive. A coal loader who switched to manufacturing was likely to start as an unskilled worker.

Depending on the level of risk that came with an individual's position in coal mining operations, those exposed to greater safety hazards received a compensating wage differential, as detailed below. Coal mining paid more than manufacturing, and the pay cut a miner would take by switching to another job, such as factory work, generally deterred workers from switching:

The gap between the coal and manufacturing earnings of unskilled workers is wider than the gap for all workers. In 1922 outside coal laborers, the lowest paid group of unskilled coal workers, earned 61 percent more than unskilled manufacturing workers. Yet outside laborers were only a small fraction of the unskilled workers at coal mines. Working inside increased the laborer’s earnings at least 10 percent, so that he earned 13 percent more per hour than the typical unskilled manufacturing worker in 1922; the advantage fell to 11 percent in 1929. The inside-outside difference is probably a premium for accident risk, since working inside was significantly more dangerous than outside the mines. Working as a coal loader raised the unskilled coal worker’s earnings another order of magnitude as coal loaders earned 207 percent of the male manufacturing unskilled wage in 1922, although the difference fell to 122 percent by 1929. Clearly for unskilled workers and for loaders, the tremendous cut in hourly pay was a major obstacle they faced in moving into manufacturing.

However, if the average hourly wages that coal miners received were generally competitive, why is it so often claimed that coal miners were underpaid? Fishback explains that annual earnings were subject to constant change from external factors such as economy-wide downturns and the demand for coal:

The workers’ earnings experience in coal mining depended heavily on the period in which they worked because coal miners experienced tremendous fluctuations in their relative fortunes. Some fluctuations were caused by external forces. Economy-wide downturns caused greater declines in annual earnings in mining than in manufacturing. During the depression of the 1890s Greenslade’s lower-bound estimates of coal earnings fell below 70 percent of manufacturing earnings.

In the Great Depression, miners’ annual earnings fell to lows of 50 to 60 percent of male manufacturing earnings. Miners experienced milder relative drops during the recessions of 1904, 1907–08, 1914, and 1921. The increase in coal demand during World War I caused the miners’ relative fortunes to peak, as their annual incomes rose between 13 and 28 percent above incomes for male manufacturing workers. Some fluctuations were caused by internal strife in the coal industry, as miners struck for higher wages or better working conditions in the face of the owners’ intransigence. Major strikes in 1897, 1919, and 1922 caused large drops in coal annual earnings. Smaller strikes also contributed to reduced working time and reductions in annual earnings.

Another factor that often played a large role in determining coal miners' wages was limited working time. Coal miners on average worked fewer days and fewer hours than workers in manufacturing, as Fishback points out:

In average years while most manufacturing concerns worked 270 to 300 days a year, the typical coal mine was open only 200 to 220 days a year due to overcapacity, seasonal fluctuations, and problems in obtaining railroad cars. Thus, on average the worker’s annual earnings in coal mining were 92 percent to 104 percent of annual earnings in manufacturing. The same general pattern of higher hourly earnings, more limited working time, and the same or lower annual earnings in coal mining compared with manufacturing also was present within the major coal states… Assume the worst, that coal miners earned on average 8 percent less each year than manufacturing workers. Would workers freely choose coal mining over manufacturing?

High hourly wages served to compensate coal workers for constraints on their working time caused by layoffs when the mines were closed. Consider a year like 1915, when Greenslade’s lower-bound estimate of coal annual earnings was at the average of 92 percent of manufacturing earnings. In manufacturing average annual male earnings were $625; workers worked 279 nine-hour days for 24.9 cents per hour. In coal mining the mines were closed many more days so that coal workers averaged only 203 8.6-hour days for the year. Say coal employers had tried to pay the manufacturing hourly wage of 24.9 cents an hour. They would have faced enormous problems in trying to attract workers because coal workers would have earned only $434 for the year, while being laid off intermittently the rest of the year.

To attract enough workers to meet coal demand, employers had to raise hourly earnings to 33.1 cents per hour, which gave coal miners annual earnings of $577. Coal employers did not find it necessary to raise hourly earnings enough to set coal annual earnings equal to manufacturing annual earnings because workers were willing to give up some income for a significant reduction in the time they endured at work. This willingness to make an “income-leisure tradeoff” is a standard subject in labor economics textbooks. In 1915 the worker who chose coal mining over manufacturing gave up $48 or 8 percent in annual income, but in return his rigorous workload was reduced by seventy-six days as well as almost a half hour each day…
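To make the arithmetic concrete, here is a minimal sketch in Python that reproduces the 1915 comparison from the figures quoted above. Note that the book works with rounded annual totals, so the final dollar amounts can differ by a dollar or two.

```python
# Fishback's 1915 coal-vs-manufacturing comparison, using the figures
# quoted above. Wages are nominal 1915 dollars per hour.

mfg_days, mfg_hours, mfg_wage = 279, 9.0, 0.249   # manufacturing
coal_days, coal_hours = 203, 8.6                  # coal mining

mfg_annual = mfg_days * mfg_hours * mfg_wage
print(f"manufacturing annual earnings: ${mfg_annual:.2f}")    # ~$625

# Had coal employers tried to pay the manufacturing wage of 24.9 cents:
print(f"coal at 24.9 cents/hour: ${coal_days * coal_hours * mfg_wage:.2f}")  # ~$434

# The wage coal employers actually had to pay to attract enough workers:
coal_annual = coal_days * coal_hours * 0.331
print(f"coal at 33.1 cents/hour: ${coal_annual:.2f}")         # ~$577

# The income-leisure tradeoff: roughly $48 of income given up in return
# for 76 fewer days of work and about half an hour less per day.
print(f"income given up: ${mfg_annual - coal_annual:.2f}")
print(f"days of work avoided: {mfg_days - coal_days}")
```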

But even if coal miners had worked the same number of days, they would still have been paid more because of the risks of coal mining:

If the coal operators had been able to cut down on layoffs and keep the mines open longer, they would have been able to pay lower wages and still attract coal workers. Coal employers sought to pay the lowest hourly earnings that still allowed them to hire a full work force. If mines had been open the same number of days as manufacturing firms, coal employers would have paid an hourly wage much closer to the manufacturing wage of 24.9 cents per hour.

Coal employers probably could not have gotten away with paying the manufacturing hourly wage because high coal hourly earnings also compensated miners for accepting more accident risk and for living in isolated communities. Table 6–3 shows that in 1926 bituminous coal mining paid higher hourly earnings than any other industry to unskilled workers, the workers who were most likely to be able to switch industries with little loss in earning power. In part the high earnings compensated coal workers for working at least nine fewer hours per week than in any other industry. The measures of accident risk listed by industry in Table 6–3 show that coal mining was more dangerous than any manufacturing industry. State workers’ compensation funds forced industries with higher fatal and nonfatal accident rates to pay higher premiums.

In West Virginia bituminous coal mines paid $2.10 in workers’ compensation premiums for every $100 in wage payments, 60 cents more than the next highest industry in the table and more than double the premiums paid in most industries. The coal industry’s relative record for fatal accidents was far worse. U.S. bituminous mines experienced two fatal accidents per million man hours worked. The next highest industry was lumber and mill work at less than one-fourth that level…

Source: Soft Coal, Hard Choices

Workers clearly responded to the changes in relative wages in mining and manufacturing. Greenslade notes that when coal hourly wages rose relative to manufacturing wages, coal employment expanded. This was particularly true during the early 1920s. Coal hourly earnings reached levels 80 percent above those in manufacturing at the time. Despite a drop in days worked below 150 in 1920 and 1921, and thus low annual earnings, coal employment expanded. When coal hourly earnings fell from that peak, the exodus from coal mining was huge. Coal employment fell by over 28 percent from over 700,000 in 1923 to 500,000 in 1929.

Contrary to what many historians say, Fishback argues that the rise and fall of the United Mine Workers cannot by itself explain the rise and fall in hourly earnings; other factors, such as competition with the manufacturing sector, coal prices, and the number of days available to work in a year, played an important role in dictating wages alongside unionization:

Real hourly earnings in coal rose 91.5 percent from 1890 to 1929, while real annual earnings rose 48 percent. Neither trend was steady. Real hourly earnings rose 169 percent from 1890 to 1923. Annual earnings peaked about three years earlier, as recessions and labor strife in the early 1920s reduced days worked, and artificial shortages drove coal prices and hourly earnings higher. Both annual and hourly earnings declined after 1923, reaching a trough in 1933. Annual earnings finally reached their World War I level again during the late 1930s.

Labor historians and many contemporaries attributed the rise and fall in coal hourly earnings to the rise and fall of the United Mine Workers. Although union strength played an important role, other factors are of equal and possibly greater importance. The derived demand for labor in coal mining was stimulated by increases in coal prices. As manufacturing wages rose over time, the miners’ opportunity cost rose, forcing coal employers to pay higher wages to attract miners. Further, fluctuations in days the mines were open affected wages by forcing coal employers to raise hourly earnings when days fell and allowing them to lower hourly earnings when days rose.

Source: Soft Coal, Hard Choices

Increases in coal hourly earnings were associated with increases in coal prices, increases in union strength, and increases in the opportunity cost wage in manufacturing; all coefficients are statistically significant at the 90 percent level. Consistent with the discussion on the miners’ choice between mining and manufacturing, the days coefficient suggests that coal earnings rose more when the mines were closed more days. Decompositions in Table 6–4, based on the coefficients above, show that changes in union membership were not the only important determinant of the long-term rise in coal hourly earnings. The primary cause of the near doubling in coal hourly earnings over the entire period from 1890 to 1929 was an increase in the opportunity cost wage in manufacturing, which accounted for roughly 66 percent of the rise. Increased membership in the United Mine Workers, starting at 5 percent of the work force in 1890 and ending at 31 percent in 1929, contributed only 36 percent to the rise in coal earnings. Meanwhile changes in real coal prices had small effects on the long-term rise in hourly earnings.

Source: Soft Coal, Hard Choices

Analysis of the long-term trend understates the impact of the UMWA on earnings in important subperiods of the hand-loading era. Changes in union strength had much stronger effects in the first and last subperiods in Table 6–4 than had the other explanatory variables. From 1890 to 1902, membership in the UMWA rose from 5 percent of the bituminous work force to roughly 50 percent, as the union established its major stronghold in the Central Competitive Field after a successful strike in 1897 and 1898. This gain in strength more than explains the rise in real hourly wages, and the union contribution to the rise in coal earnings is more than twice as large as the combined effects of the increases in manufacturing wages and the coal price. A sharp drop in union strength from 68 percent to 31 percent of the work force also contributed the most to the 52 cent drop in coal hourly earnings from 1923 to 1929, accounting for 61 percent of the decline. Meanwhile, a sharp drop in coal prices contributed another 36 percent to the decline. Coal earnings fell despite an increase in the miners’ opportunity cost earnings in manufacturing.

Changes in union strength were not the primary contributor to the rise in coal earnings in the remaining subperiods. From 1902 to 1913, while the UMWA’s relative strength remained unchanged, an increase in manufacturing earnings explains 80 percent of the rise in coal earnings. As coal hourly earnings soared from 1913 to 1923, the rise in manufacturing earnings contributed about 23 percent, while the rise in union strength from 50 to 68 percent of the work force contributed about 20 percent, and an increase in coal prices contributed 13 percent. If changes in coal prices were caused by changes in unionization, our measure of the impact of unions may be understated. Yet, even under the extreme assumption that all changes in coal prices were caused by changes in unionization, the relative importance of unionization and manufacturing earnings in explaining coal earnings in the various time periods is not changed much.

Source: Soft Coal, Hard Choices
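The decomposition behind Table 6–4 is mechanically simple: multiply each estimated coefficient by the change in its variable over the period, then express that product as a share of the observed change in earnings. Below is a minimal sketch of the method; the coefficients and changes are made-up placeholders, not Fishback's actual estimates.

```python
# Sketch of a regression-based decomposition in the spirit of Table 6-4.
# All numbers here are hypothetical placeholders for illustration.

coefficients = {                 # dollars of hourly wage per unit change
    "manufacturing_wage": 0.9,
    "union_share": 0.5,
    "coal_price": 0.2,
}
changes_1890_1929 = {            # change in each variable over the period
    "manufacturing_wage": 0.45,
    "union_share": 0.26,         # e.g. 5% -> 31% of the work force
    "coal_price": -0.10,
}
observed_wage_change = 0.60      # observed rise in real hourly earnings

for name, coef in coefficients.items():
    contribution = coef * changes_1890_1929[name]
    share = 100 * contribution / observed_wage_change
    print(f"{name}: {share:+.0f}% of the observed rise")

# The shares need not sum to 100: the remainder is residual, and an
# offsetting negative contribution (like a falling coal price) can push
# other shares above 100, as happens in some of the book's subperiods.
```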

The U.S. average wage for coal mining disguises substantial variation in coal hourly and annual earnings across states and coal districts:

Table 6–5 contains the Bureau of Labor Statistics’ (BLS) estimates of hourly earnings of loaders, who accounted for roughly 45 percent of coal workers and received earnings around the mean for all coal workers. The BLS estimates are the most accurate available for piece rate workers because they are compiled from payroll surveys where the time spent in the mine by piece rate workers was explicitly measured.

Throughout the early 1900s, coal workers consistently earned the most per hour in the Central Competitive Field states of Illinois, Indiana, and Ohio, where the UMWA was strongest. They earned the least in the southern, mostly nonunion states of Alabama, Tennessee, and Virginia. Colorado was the exceptional nonunion state with wages above the national average, at times ranking among the top three states. West Virginia’s ranking is particularly interesting. West Virginia was the site of several major struggles between the miners’ union and coal employers, including all-out warfare in 1912–13 and 1919–21. UMWA rhetoric described West Virginia miners as virtual slaves, while coal operators claimed that their miners earned as much as union miners. Neither side was very accurate. West Virginia miners earned less per hour than miners in the Central Competitive Field, but they fared well relative to manufacturing workers. During the 1920s hand loaders in West Virginia earned about the national coal average per hour, and therefore more per hour than the typical manufacturing worker. Earlier, in 1902 and 1909, West Virginia miners earned less per hour than the national coal average but still more than the average U.S. manufacturing worker; see Table A-1 in Appendix A. The high hourly earnings help explain why so many miners stayed in West Virginia despite the wage cutting and labor strife of the 1920s.

Source: Soft Coal, Hard Choices

Despite lower daily earnings, coal employers in nonunion districts in West Virginia and Pennsylvania claimed that their workers were better off than union workers. They argued that nonunion workers, by avoiding collective action, had more work opportunities and earned more annually. Note the irony. Coal employers emphasized high hourly earnings in mining-manufacturing comparisons, but within the coal industry nonunion employers emphasized the advantages of greater work opportunities in nonunion-union comparisons. We can examine the nonunion operators’ claims with two measures of annual earnings for tonnage men and daymen reported in Table 6–7. The earnings for full-time workers are earnings of men who worked in all twenty-four payroll periods. Due to high turnover in 1921, full-time workers generally accounted for 10 to 50 percent of average employment, as shown in the far right-hand column. The annual earnings of the average worker is the product of earnings per start and average starts per man.

Annual earnings comparisons for daymen generally deny the nonunion operators’ claims. Union daymen generally received both higher hourly earnings and higher annual earnings than nonunion daymen. This was true when comparing earnings in the union states of Illinois, Ohio, and Indiana with earnings in nonunion Pennsylvania and West Virginia. It was also true in three of four comparisons of annual earnings in union and nonunion districts within West Virginia and Pennsylvania.

The nonunion operators found mixed support for their claims of higher annual earnings among the different measures of annual earnings for tonnage workers. Consider full-time tonnage workers in the eastern United States. The highest full-time earnings are found in nonunion Maryland and the union districts in Illinois, Indiana, and Ohio. Relative to the top union districts, the nonunion areas of Pennsylvania and West Virginia fared poorly. However, within both states full-time earnings were higher in nonunion than in union areas. Of course, full-time earnings focus only on workers who stayed at the same mine all year. The nonunion districts in West Virginia and Pennsylvania fare better in comparisons of the annual earnings estimates for the average worker. Annual earnings in nonunion West Virginia exceed annual earnings not only in union West Virginia but also in Indiana and Ohio. Earnings in the Pennsylvania nonunion districts were higher than in the Pennsylvania union districts, although still slightly below the earnings in Indiana and Ohio.

However, even though the previous tables showed that union districts generally paid higher wages, a multivariate analysis that isolates unionization as a variable finds its impact on wages to be smaller than the raw comparisons suggest:

While Tables 6–5 and 6–6 show that union districts generally paid higher wages, a multivariate analysis shows more clearly how much of the wage differences across states are directly attributable to unionization. Collective action by miners helped to raise wages not only through union membership but also through strike activity. Labor demand theory and numerous wage studies show that hourly earnings are functions of the final price of the product and the productivity of workers. Wages also tend to rise to compensate workers for reductions in work opportunities, higher accident rates, and lower payments to injured workers in the absence of workers’ compensation.

Table 6–8 contains the coefficients of a standard reduced-form wage equation. The regression sample is a pool of state averages in twenty-three coal states for the years 1912 to 1923. The wage is the average hourly wage rate for inside daymen, deflated to 1967 dollars. A listing of the wages is in Table B-1 in Appendix B. Although most workers were tonnage men, the piece rates reported for them were not easily comparable, due to differences in mine conditions within and across states. Since both daymen and tonnage workers were hired in the same labor market, the hourly earnings for the two were generally highly correlated. However, the union effect may be stronger in this sample of daymen’s wages than was actually the case for all coal workers. Tables 6–6 and 6–7 show that daymen fared relatively better than tonnage men in union districts. Working in the other direction, the union effect on real earnings might be understated slightly because the hourly earnings are not adjusted for regional differences in the cost of living, and evidence collected by the U.S. Coal Commission suggests that company store prices were lower in union districts.

Table 6–8 reports both ordinary least squares (OLS) estimates and weighted least squares (WLS) estimates. Both sets of estimates are from fixed-effects regressions, which include dummy variables for all states except Alabama and all years except 1912. The state dummies control for state-related factors that influence earnings but are not included among the regression variables, for example, differences in the cost of living in various states. The year dummies capture time-related influences like the effect of government influence in the labor market during World War I that are not captured in the other regression variables.

Source: Soft Coal, Hard Choices
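For readers who want to see the mechanics, here is a self-contained sketch of this kind of fixed-effects regression on synthetic data. The column names, the magnitudes, and the choice of employment as the WLS weight are illustrative assumptions on my part, not details taken from the book.

```python
# Minimal sketch of a fixed-effects wage regression of the type Fishback
# describes: a state-year panel with state and year dummies, estimated by
# OLS and by WLS. The data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = [{"state": s, "year": y,
         "union_pct": rng.uniform(0, 70),      # UMWA share of work force
         "coal_price": rng.uniform(1.0, 4.0),  # real coal price
         "days_open": rng.uniform(150, 250),   # days mines were open
         "employment": rng.uniform(5_000, 80_000)}
        for s in ["AL", "IL", "WV", "PA"] for y in range(1912, 1924)]
df = pd.DataFrame(rows)
df["wage"] = (0.5 + 0.004 * df.union_pct + 0.10 * df.coal_price
              - 0.001 * df.days_open + rng.normal(0, 0.05, len(df)))

# Dummy variables for state and year give the fixed effects.
formula = "wage ~ union_pct + coal_price + days_open + C(state) + C(year)"
ols = smf.ols(formula, data=df).fit()
wls = smf.wls(formula, data=df, weights=df.employment).fit()
print(ols.params[["union_pct", "coal_price", "days_open"]])
print(wls.params[["union_pct", "coal_price", "days_open"]])
```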

Even after controlling for other influences, union representation and strike activity aided miners in obtaining higher wages. Coal workers in states with a higher percentage of paid-up membership in the UMWA received higher wages, although the union coefficient in the WLS regression is not statistically significant. In states where the UMWA percentage was 5.4 percent higher (10 percent of the mean union percentage), real hourly wages were higher by 0.3 to 0.6 percent. Membership in the UMWA measures only part of the strength of collective action by miners. Union members and nonunion miners also exercised bargaining power through strikes. The coefficients in Table 6–8 imply that when strike days per employee rose 10 percent from a mean of 13.6 to 15, hourly wages rose by 0.1 to 0.2 percent. Actually, the miners in one state benefitted from spillover benefits when miners in the rest of the country struck. A 10 percent increase in strike days per employee in the rest of the United States raised hourly wages by 0.4 to 0.5 percent.

While collective action contributed to raising wages, the factors with the largest effects on coal workers’ wages were the standard labor demand variables, the price of coal and output per man-hour. A 10 percent increase in coal prices raised the hourly wage by 2.9 to 3.3 percent, compared with unionization and strike effects well below 1 percent. A 10 percent increase in output per man-hour boosted hourly wages between 0.5 and 1.7 percent, although the OLS coefficient was not statistically significant.

Coal wages also adjusted to offset changes in nonwage aspects of coal employment. Miners experienced large drops in annual earnings when days the mines were open fell. The loss from fewer workdays was partially offset by an increase in hourly wages, although the WLS days coefficient was not statistically significant. In states that worked 10 percent fewer days the average hourly wages were 0.5 to 1 percent higher.

Besides compensation in the form of higher hourly wages for taking on risky work, coal miners received compensation for workplace accidents and injuries:

During the sample period from 1912 to 1923, there were two alternative state legal systems under which coal workers could obtain direct compensation when injured in accidents. Under the negligence system, an employer was not required to compensate miners for accidents unless the accident had been caused by the employer’s negligence. As a result, only about 50 percent of families of injured miners received some form of compensation for major injuries or deaths under the negligence system. When workers’ compensation was introduced in many states, payments to injured miners jumped markedly. All serious accidents were compensated, no matter who was at fault, and the amount of compensation jumped…

Higher wages at least partially offset the lower payments for injuries in states without workers’ compensation. The coefficients imply that workers received 1.3 to 2.3 percent higher wages in states with no workers’ compensation law. Workers also received higher wages in states with higher accident rates; however, it is not clear how much to trust the accident rate coefficient estimate.

The estimate is not statistically significant in either equation, which means that we cannot reject the hypothesis that higher accident rates had no effect on the wage rate. Since the wage equation is a reduced form, the accident-rate coefficient may be disguising a compendium of labor market interactions. In an earlier study I tried to sort out these interactions by estimating a system of accident-rate, labor-demand, and labor-supply equations. The results implied that higher accident rates had offsetting effects on wages. Increases in accident rates reduced labor demand, driving wages down, while miners reacted to higher accident rates by reducing their labor supply, driving wages up.

To add to Fishback's analysis: prior to federal mandates, a great many workers were covered by private industrial sickness funds, as well as by funds organized by unions, as the Economic History Association shows below:

Industrial sickness funds provided an early form of health insurance. They were financial institutions that extended cash payments and in some cases medical benefits to members who became unable to work due to sickness or injury. The term industrial sickness funds is a later construct which describes funds organized by companies, which were also known as establishment funds, and by labor unions.

These funds were widespread geographically in the United States; the 1890 Census of Insurance found 1,259 nationwide, with concentrations in the Northeast, Midwest, California, Texas, and Louisiana (U.S. Department of the Interior, 1895). By the turn of the twentieth century, some industrial sickness funds had accumulated considerable experience at managing sickness benefits. A few predated the Civil War. When the U.S. Commissioner of Labor surveyed a sample of sickness funds in 1908, it found 867 non-fraternal funds nationwide that provided temporary disability benefits (U.S. Commissioner of Labor, 1909). By the time of World War I, these funds, together with similar funds sponsored by fraternal societies, covered 30 to 40 percent of non-agricultural wage workers in the more industrialized states, or by extension, eight to nine million nationwide (Murray 2007a). Sickness funds were numerous, widespread, and in general carefully operated…

Industrial sickness funds were among the earliest providers of any type of health or medical benefits in the United States. In fact, their earliest product was called “workingman’s insurance” or “sickness insurance,” terms that described their clientele and purpose accurately. In the late Progressive Era, reformers promoted government insurance programs that would supplant the sickness funds.

Source: Economic History Association

Working Conditions in the Coal Industry

In writing about the working conditions in the coal industry, Fishback describes how dangerous it was to be a coal miner:

In the early 1900s coal mining was nearly four times as dangerous as it is today. Prior to the drop in employment in the late 1920s, between 1500 and 2000 miners were killed in U.S. coal mines each year. The fatal accident rates in Table 7–1 show that in the United States before 1930 slightly more than two deaths occurred for every million man-hours worked, or roughly three to four fatal accidents for every thousand workers each year. Accident rates varied across states. Fatality rates were highest in the western states of Utah, Colorado, and Oklahoma; lowest in Texas and Missouri. West Virginia held the dubious distinction of having the highest accident rate of states east of the Mississippi River.

However, as he notes, the responsibility for ensuring workplace safety fell largely to individual miners, who had a great deal of independence:

The vast majority of deaths occurred one at a time, receiving attention mostly from family and friends before joining the long roll of fatalities in state mining reports. The typical accident occurred in the miner’s room when the roof fell or explosives misfired. Operators paid the miners piece rates and gave them a great deal of independence. The miner therefore explicitly saw the trade-off between income and safety while he made nearly all of the accident prevention decisions within his own workplace. He decided how often to timber the roof to prevent roof falls, and how large a blast to use in dislodging the coal…

The division of safety tasks between miners and operators typically assigned responsibilities to the lower-cost preventer. While the miner was the primary preventer of accidents in his own room, the operator was primarily responsible for providing safety-related “public goods” and services for which there were economies of scale. This led management to take responsibility for ventilation, mine gas inspections, watering of coal dust to prevent the spread of mine fires and explosions, and provision of precut timbers to use as roof props.

In several ways the operators’ and miners’ safety responsibilities overlapped. State laws and many mines’ safety rules assigned the mine foreman the role of safety supervisor. On visits to a workplace, the foreman often examined the mine roof and could force the workman to make his workplace safe before he resumed mining. Yet the foreman visited at most once a day, leaving the miner alone to make nearly all the decisions about safety in his workplace. The operator also provided the large capital equipment, such as track and motors for haulage and cutting machines. Both the operator’s choice of safety features on such equipment and the worker’s care in handling it determined the probability of equipment-related accidents.

Fishback explains that because individual miners exercised independent judgment in their work, this independence often produced the sort of unsafe conditions that characterized the industry:

Early in the period much of the blame for accidents was placed on the miners’ work habits and safety attitudes. More recently, a historical backlash has sought to assign operators most of the blame. Unfortunately, both miners and operators at times ignored or relaxed safety precautions. State mine inspectors regularly complained that miners inadequately timbered their workplaces, rode illegally on mine cars, brought too much powder into the mine, overcharged their shots, and ignored many of the rules for “shooting off the solid,” blasting the coal without making an undercut at the base of the wall. Ignorance of proper techniques was an often-cited cause, especially in areas where there were many new immigrants, who lacked mining experience and often the ability to speak English.

Miners also cut corners when they thought that the precautions unnecessarily hindered their earning power. Government mine officials also blamed coal operators, who failed at times to provide enough mine timbers, or meet state requirements with respect to ventilation, training in proper mining techniques, and supervision of miners.

As mentioned before, the main reason people were willing to work in such conditions was the high wages offered to compensate for the risk:

Within the coal industry, workers in the more dangerous jobs inside the mines received wages up to 14 percent higher than similarly skilled workers outside the mines. The regression analysis summarized in Table 6–8 of Chapter 6 shows that miners received higher wages when they received less compensation for injuries under the negligence liability system.

Since neither workers’ compensation nor negligence lawsuits paid miners the full value of lost working time, miners still sought higher wages in areas with higher accident rates. Although the accident rate coefficient was positive in Table 6–8, we could not reject the hypothesis that there was no direct relationship between wages and accident rates across states. One reason wages did not appear to adjust may be that cross-state differences in coal mining fatality rates were not large enough to be obvious to the miners. Thus workers sought and obtained higher wages for obvious differences in accident rates between coal mining and other industries. But miners were less successful at getting wages fine-tuned to less obvious differences across states in coal mining accident rates.

Another reason the results in Table 6–8 might not show a relationship between safety and earnings is that it is a reduced-form equation that fails to fully illuminate several offsetting relationships between wages and accident rates. Accident rates were higher at mines where the natural conditions of the mine were more dangerous or the employer offered inadequate safeguards. Accident rates increased when more accident-prone (typically less-experienced) workers were hired. And higher accident rates may have reflected decisions by workers or employers to work with less regard for safety when wages changed.

Each cause of higher accident rates influenced different aspects of the wage accident relationship. If a mine was more dangerous, either naturally or because the employer skimped on safeguards, miners would adjust their supply of labor to the mine, requiring higher wages before agreeing to work there. Thus, in labor-supply relationships, wages and accident rates were likely to be positively correlated.

Moreover, the relationship between wages and accidents ran in both directions: higher accident rates increased wages, and higher wages encouraged coal employers to lower accident rates in order to lower wages:

Changes in wage rates caused by factors beyond the control of miners and coal employers also affected accident prevention by employers and workers. An increase in the wage rate had offsetting effects on accident prevention by employers. The wage increase might have caused employers to cut costs by spending less on safety. However, higher wages also gave employers more incentive to prevent accidents because higher wages raised the compensation employers paid to each injured worker and the risk premium they paid in wages.

The miners’ response to exogenous wage changes depended on how well they could adjust their incomes across time periods. Some writers argue that miners had such low incomes and so few opportunities to save or borrow that they were forced to maximize income in each year to survive. When wage rates were cut, accident rates rose as miners “gambled” their lives more to maintain their meager standard of living.

However, the earnings evidence in Chapter 6 suggests that the miners’ incomes were generally above the subsistence level. Miners actually saved during upturns and dissaved, sometimes accumulating debts, during downturns. Thus they were able to shift income over a longer time horizon. If miners maximized their “permanent” (long-term) income, they faced some incentive to increase safety efforts when wages fell. Since they were paid piece rates, the opportunity cost of time spent preventing accidents was the earnings lost from producing less coal. When piece rates fell, miners gave up less in earnings when they devoted more time to safety, leading them to increase accident prevention at the expense of earnings. This implication is derived from a formal model describing the actions of a risk-neutral miner who maximizes permanent income. Generally, miners were risk-averse and the effect of a wage cut was ambiguous. A miner’s aversion to risk reduced his willingness to accept higher risk despite the fall in the opportunity cost of accident prevention.

Source: Soft Coal, Hard Choices

In attempting to estimate a statistically significant relationship between accident rates and wages, Fishback found the following:

To examine each of these relationships between wages and accident rates, I estimated a system of simultaneous equations — an accident prevention equation, a labor-demand equation, and a labor-supply equation — using evidence from the twenty-three major coal mining states for the years 1912–1914, 1917, 1919–1923. Table 7–3 lists the estimated relationships between wages and accident rates from those equations. The estimates marked by an asterisk are ones where statistical tests reject the hypothesis that there was no relationship.

Only the labor-supply relationship where miners required higher wages to work at more dangerous mines consistently passes statistical significance tests. Holding union membership, hours worked, and strike activity constant in the equation for small-scale underground accidents, coal workers were willing to accept an added risk of one death for every 10 million man-hours for a wage increase of 1.5 cents (in 1967 dollars) per hour. This wage premium offers an implicit estimate of the value miners attached to their lives.

If the miner was risk-neutral, the wage premium of 1.5 cents is equal to the additional expected loss from a small-scale fatal accident (the change in probability of a fatal accident (1/10,000,000) multiplied by the value the miner attaches to his life). The value of life implied by the accident rate coefficient in the small-scale-fatality rate equation is $150,000, by the coefficient in the total fatality rate equation $60,000, and by the coefficient in the roof-fall equation $200,000 (all in 1967 dollars).

The estimates are similar to the range of $140,000 to $260,000 estimated by Thaler and Rosen with labor market evidence from the late 1960s. The lowest estimate of $60,000 is approximately equal to the discounted present value of lifetime earnings of a coal miner in the 1920s. Miners in 1920 earned roughly $2000 in 1967 dollars each year. Assuming a real interest rate of 2 percent and a working life of fifty years, the present value of that stream of earnings is $62,847 in 1967 dollars. To the extent that the fluctuations in accident rates reflected differences in accident proneness or inexperience, the labor demand results suggest that employers paid less for more accident-prone workers. However, we cannot reject the hypothesis in most of the estimates that higher accident rates had no effect on labor demand.
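Both of these calculations can be verified in a few lines. The sketch below reproduces the implied value of life and the present-value figure from the passage above:

```python
# Reproducing the two calculations quoted above (1967 dollars).

# 1) Implied value of life: a 1.5-cent hourly premium for an added risk
#    of one death per 10 million man-hours.
wage_premium = 0.015             # dollars per hour
added_risk = 1 / 10_000_000      # deaths per man-hour
print(wage_premium / added_risk)         # 150000.0

# 2) Present value of lifetime earnings: $2000 a year for fifty years,
#    discounted at a 2 percent real interest rate.
annual_earnings, r, years = 2000, 0.02, 50
pv = annual_earnings * (1 - (1 + r) ** -years) / r
print(round(pv))                         # 62847, matching the book
```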

The gambling hypothesis is tested in the relationship between wages and accident rates in the accident rate equation. We cannot reject the hypothesis that accident rates were not affected by changes in the wage rate. The results seem inconsistent with the gambling hypothesis in that lower wages were not associated with higher accident rates. However, the test does not fully refute the gambling hypothesis. Mixed in with the workers’ responses to higher wages are the responses of employers, which economic theory predicts would be indeterminant. Further, the analysis could not hold the mining experience (and accident proneness) of the work force constant. When wages were cut, less experienced miners were less likely to enter mining, causing accident rates to fall. Therefore our estimate of the relationship in the accident rate equation was biased against the gambling hypothesis.

But this raises the question: did unions help improve workplace safety? Fishback examines the two ways unions might have enhanced miners’ economic welfare with respect to accidents:

One was to raise the wage rate paid, holding accident rates constant. The regression results in Table 6–8 of Chapter 6 show that the unions met that challenge. The union coefficient in the reduced-form wage equation shows that a shift in the work force from fully nonunion to fully union would have caused hourly earnings to rise between 8 and 15 cents per hour (in 1967 dollars), or roughly 6.3 and 10.1 percent (based on the mean). Collective action by miners was also effective. Had all workers in the state gone on strike for ten days, the hourly wage would have risen by 0.8 to 1.6 percent…

Full unionization shifted the labor supply curve such that wages would have risen 40 to 44 cents per hour (see Table 7–3) had the union been able to force the employers to continue to hire the same number of workers (H_N). However, the higher wage demands of the union pushed employers back along their labor-demand curve, causing them to hire fewer workers for fewer hours (H_U). Therefore, the overall impact of the union on the wage was the 8-to-15-cent rise found in Table 6–8 in Chapter 6.
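A toy linear supply-and-demand model makes this mechanism clear. The slopes below are made-up numbers chosen only so the result lands in the estimated range; they are not parameters from the book.

```python
# Toy illustration: the union shifts the labor-supply curve up by ~42
# cents, but employers slide back along their demand curve, so the
# equilibrium wage rises much less. All parameters are invented.

supply_shift = 0.42   # upward shift of the supply curve (dollars/hour)
b = 0.4               # slope of the inverse labor-demand curve
d = 1.0               # slope of the inverse labor-supply curve

# With linear curves w = a - b*H (demand) and w = c + d*H (supply), an
# upward supply shift S raises the equilibrium wage by S*b/(b+d) and
# lowers employment by S/(b+d).
wage_rise = supply_shift * b / (b + d)
print(f"equilibrium wage rise: {wage_rise:.2f} dollars/hour")   # ~0.12
print(f"employment falls by: {supply_shift / (b + d):.2f} units")
```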

The second way unions might have enhanced the miners’ welfare was to lower accident rates directly:

Unions potentially offered the miners an effective voice for negotiating improvements in safety, particularly for safety public goods, like ventilation or keeping coal dust down to prevent the spread of explosions. Public goods like ventilation benefitted all miners, but there were free rider problems if miners negotiated individually. Each individual miner received only a small part of the overall benefits of ventilation and the benefits would have come to him if someone else successfully negotiated for ventilation. Thus each miner faced incentives to let other miners negotiate for ventilation while he focused on negotiating for his own wage and workplace. The union might have negotiated better for ventilation because it represented all miners and could get the miners to pay union dues to cover the costs of negotiating for ventilation.

In fact, the UMWA did not have much effect on accident rates. The statistical tests for a relationship between UMWA membership and fatal accident rates shown in Table 7–3 cannot reject the hypothesis that the UMWA had no impact on accident rates. Graebner suggests that mine safety was a secondary goal of the UMWA. Most of its energies were devoted to organizing drives in nonunion states, where expansions in coal production threatened the strength of the union. The UMWA seemed ambivalent toward mine safety legislation. It sought certification of miners to reduce the number of inexperienced workmen, but this may have been an attempt to obtain more control over the labor supply. In fact, results later in the chapter show that accident rates were no lower and possibly higher in the states where state boards certified miners. Union pit committees might have promoted safety by providing a grievance mechanism that protected workers from dismissal when they complained of unsafe conditions. Graebner notes, however, that few of the grievances adjudicated by pit committees were related to safety. Miners were less likely to use the grievance mechanism to protest unsafe conditions than to seek reinstatement of a miner fired for violating safety provisions in the contract.

If unions had little impact, did government do better? Fishback states that, by most accounts, it did not improve workplace safety either:

If the laws reduced accident rates, the coefficients in the table would be negative and the t-tests in parentheses would reject the hypothesis that the coefficients were zero. In each of the equations, the coefficients of the licensing laws — state licenses of foremen, licensing of miners by state boards, and requirements that foremen train miners — are positive. This does not necessarily imply that the licensing laws raised total accident rates and roof fall accident rates because the t-tests do not reject the hypothesis that there was no effect. Most of the remaining laws also lack much impact.

Although some coefficients are negative, we generally cannot reject the hypothesis for most laws that their coefficients were zero. There were some exceptions to this rule. States that passed laws preventing miners from riding on coal cars saw declines in both the overall accident rate and the small-scale underground accident rate. The coefficient of -0.213 in the equation for the small-scale accident rate implies that the law saved an extra life for every 2934 men who worked an average work year of 200 eight-hour days. Required use of permissible explosives served to lower small-scale accident rates a little more, saving an extra life for every 2551 men who worked an average work year.

The U.S. Bureau of Mines claimed that closer supervision would reduce problems with roof falls. Their claim is supported here by the negative and statistically significant coefficient on the law requiring a minimum number of daily visits to each workplace by the foreman. The coefficient of −0.219 implies that for an additional required daily visit by the foreman, an extra life was saved from a roof fall death for every 2854 men working an average work year.

There are two possible reasons why most of the laws had little or no impact on accident rates.

First, many of these laws may have just codified practices followed by most mines already. Many mines, whether regulated or not, had daily inspections by fire bosses, provided mine timbers, and insulated electric wires. The laws preventing riding of coal cars and requiring use of permissible explosives probably had impact because they changed behavior in a major way. Second, the laws may not have been enforced very effectively.

The average state mining department spent $2.60 (in 1967 dollars) on mine inspections for every thousand tons of coal produced. The coefficients and t-tests in Table 7–5 imply that increased expenditures on inspections would have lowered small-scale and roof fall accident rates. Doubling the average mine inspection budget to $5.20 per thousand tons of coal would have lowered the small-scale accident rate per million man hours by 0.177, saving an extra life for every 3535 men working an average work year.
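These "one extra life per N men" figures follow from a simple unit conversion on the accident-rate coefficients. A minimal sketch, using the book's average work year of 200 eight-hour days:

```python
# Converting an accident-rate coefficient (lives per million man-hours)
# into "one extra life saved per N man-years of work", as quoted above.

HOURS_PER_MAN_YEAR = 200 * 8     # 200 days of 8 hours = 1600 hours

def men_per_life_saved(coeff_per_million_man_hours: float) -> float:
    """Man-years of work per extra life saved by a given coefficient."""
    lives_per_hour = coeff_per_million_man_hours / 1_000_000
    return 1 / (lives_per_hour * HOURS_PER_MAN_YEAR)

print(round(men_per_life_saved(0.213)))  # 2934: no riding on coal cars
print(round(men_per_life_saved(0.219)))  # 2854: extra foreman visit
print(round(men_per_life_saved(0.177)))  # 3531: doubled inspections
                                         # (book reports 3535; rounding)
```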

The state mining inspectors were more successful than the Occupational Safety and Health Administration (OSHA) is today. Numerous economic investigations of OSHA show that it has negligible impact on workplace safety, partly because OSHA’s enforcement resources, measured as real appropriations per worker covered, are even lower than those for the old state mining departments.

Appropriations for the OSHA compliance budget in 1975 were roughly $0.54 per nonagricultural employee in the United States, adjusted for inflation in terms of 1967 dollars. In West Virginia in 1916 total appropriations for mine safety — inspectors’ salaries plus travel, clerks’ salaries, and equipment — were almost four times higher at $2.10 (in 1967 dollars) per coal worker.

The U.S. Bureau of Mines also had no statistically significant impact on mine safety. Lewis-Beck and Alford find that federal government intervention first lowered coal mining accident rates after 1941 when U.S. Bureau of Mines inspectors were allowed to inspect mines, although without coercive power. Accident rates were nearly halved from their 1930s level when appropriations for coal mine inspections and investigations by the Bureau reached approximately $7.85 (in 1967 dollars) per mine worker in 1949.

Accident rates then fluctuated around a constant trend for two decades, despite additional increases in appropriations, rising to $65 (in 1967 dollars) per mine worker in 1969. The plateau ended with the passage of the stringent Coal Mine Health and Safety Act of 1969. The added costs of halving accident rates again were substantially higher than they were in the 1940s. The level of appropriations required rose to $173 (in 1967 dollars) per miner in 1972 before stabilizing at $137 (in 1967 dollars) per mine worker by 1975. The resource costs to society were even greater, as coal mining productivity holding accident rates constant fell significantly.

Regarding union efforts and state regulations meant to improve workplace safety, Fishback concludes the following:

While coal miners faced numerous dangers in the mine, they did receive some compensation for accepting those risks. Wages were higher in coal mining than in other industries and higher for jobs inside than for jobs outside the mines. Miners were sensitive to the risks and adjusted their labor supply to require higher wages before working at more dangerous mines. The United Mine Workers succeeded in improving the employment package for workers by raising wages holding the accident rate constant. However, the states where the UMWA was strong were not associated with lower accident rates.

Progressive Era safety legislation in many areas was a disappointment although some specific laws may have enhanced safety. The U.S. Bureau of Mines offered large amounts of information about improving mine safety, but the information seemed to have little impact on accident rates. The federal government generally had little impact on coal mining accident rates until the federal agency began inspecting mines in the 1940s with a budget roughly four times the level per worker of the budgets of state mining departments. State laws to license miners and foremen failed to reduce accident rates.

In fact, the only specific laws that lowered accident rates were rules against riding coal cars and requiring the use of permissible explosives. Increases in the required number of visits by foremen to the workplace also lowered the roof fall accident rate. Most of the remaining laws either codified existing practices or were not enforced. Giving more resources to state mining departments for inspections, however, would have lowered accident rates such that doubling the typical budget would have reduced fatal accident rates by roughly 10 percent.

Lastly, Fishback addresses the impact of changes in liability rules for workplace accidents. It turned out that workers’ compensation laws promoted moral hazard, in that they increased workplace accidents:

Incentives for accident prevention were also established by the liability rules for compensation of accidents. Prior to 1910, liability in most states was established by common law rules of negligence. The rules imposed most of the costs of accidents on the miners. Thus the employer often was not required to compensate miners for accidents in their rooms, although the employer typically paid compensation for accidents where he was the least-cost preventer. With the passage of workers’ compensation legislation, miners were compensated for all serious injuries and received on average more compensation than they did under negligence liability. Miners benefitted because they now received much more compensation for their injuries. The extra compensation provided more of a safety net that allowed them to raise earnings by working with less regard to safety. Unfortunately, this meant higher accident rates because employers faced high costs of trying to prevent the extra accidents that resulted. The employers thus chose to pay the compensation instead of the extra costs of preventing the accidents.

One must wonder, then: if early efforts by state intervention and organized labor to mitigate fatal accidents in the coal industry failed, what exactly explains the dramatic improvement in coal mining fatalities that we see today?

Source: Statista

The simple answer is increases in worker productivity, which are only made possible when firms accumulate capital and reinvest it in improving their workers' output through the purchase and development of new equipment and tools that also improve safety.

As the Economic History Association notes:

As the coal industry expanded, it also incorporated new mining methods. Early slope or drift mines intersected coal seams relatively close to the surface and needed only small capital investments to prepare. Most miners still used picks and shovels to extract the coal, but some miners used black powder to blast holes in the coal seams, then loaded the broken coal onto wagons by hand. But as miners sought to remove more coal, shafts were dug deeper below the water line.

As a result, coal mining needed larger amounts of capital as new systems of pumping, ventilation, and extraction required the implementation of steam power in mines. By the 1890s, electric cutting machines replaced the blasting method of loosening the coal in some mines, and by 1900 a quarter of American coal was mined using these methods. As the century progressed, miners raised more and more coal by using new technology. Along with this productivity came the erosion of many traditional skills cherished by experienced miners.

Source: 2005 Study cited here

A 1966 study examining output, employment, and productivity in the United States (1893–1913) elaborates on these improvements in worker productivity, particularly in the coal industry:

During the first two decades of the twentieth century, trends in output per man-day differed widely between anthracite and bituminous coal mining. In the anthracite industry production per man-day hardly changed. In contrast to the bituminous industry, deteriorating resource conditions of the long-worked deposits were not offset by development of richer or more easily accessible supplies. The average width of the seam declined steadily, while the depth of the anthracite mines increased. The industry was particularly difficult to mechanize, largely because of the steep slope of many of the coal beds. In the bituminous coal industry, on the other hand, one-quarter of the total production was mined by machines by 1900. Ten years later this share had risen to 42 percent and by the end of World War I to 56 percent. But even here the most intensive progress in mechanization and the most rapid increase in productivity still lay in the future.

The impact of such mechanization can be seen in gold mines, for instance, as a 2020 study showed that mechanization increases worker productivity and decreases the rate of accidents:

New resource determinations enable gold mines to change from the conventional system of mining and introduce mechanized mining methods. This decision could however be plagued with problems. It is necessary to review mechanization to determine its impact. In this study safety, gold production, productivity, manpower, maintenance and equipment cost data was collected and analyzed along with questionnaire and interviews from a mine to determine its performance before and after mechanization. From the study, mechanization reduced mining grade cut-off by 86%, increased gold production per annum by 94%, has increased tonnage productivity per man-month by 564%, improved the skill levels of workers but reduced manpower by 53% and has potential to creating labour unrest. It has helped reduce Loss Time Injury Frequency Rate by 1160%, has helped reduce accidents by 94%, but has potential to increase fatality on a mine.
