Gout: Epidemiology and Risk Factors




Key Points





  • Unlike osteoarthritis and rheumatoid arthritis, gout is typically an episodic arthritis. The intervals between attacks, during which patients are completely asymptomatic, can last as long as decades. These factors, together with errors in diagnosis, intrinsically complicate assessment of gout epidemiology.



  • The case definitions used for epidemiologic studies of gout are seldom as rigorous as those used for clinical diagnosis. For example, investigators in the Sudbury Study could validate only 44% of self-reported cases using Rome or New York criteria, and in a study of health professionals, only 70% of cases could be validated by ACR criteria. However, in one study of physicians, it was reported that 100% of self-reported cases could be validated by ACR criteria and medical record review. The variable magnitude and direction of this measurement error mean that all statistical conclusions about gout epidemiology need to be carefully scrutinized.



  • A large proportion of people with gout have relatively minor and self-limiting forms of the disease; these patients seldom require medical attention, and the diagnosis is often not confirmed.



  • The appropriate use of gout medications in primary care is notoriously poor; patients taking gout medications do not necessarily have gout, in contrast to the diagnostic certainty implied by, for example, chemotherapy for cancer or insulin for diabetes.



  • There is no way to reliably distinguish the many possible health outcomes of gout from those of concurrent osteoarthritis and other comorbid conditions.



  • Although several authors have defined “primary” gout as the disease occurring in the absence of other causes such as diuretics, this definition is difficult to apply in epidemiologic studies.





Introduction


Gouty arthritis (gout) is common in industrialized nations. Gout is a complex disease involving the metabolic, renal, cardiovascular and immunologic systems; advances in clinical care of the disease have followed the lava-lamp model ( Figure 6-1 ).




Figure 6-1


The lava lamp model of clinical advances. Clinical advances are represented as traversing the lamp. The liquid ( tan ) and the denser material ( red ) in the lamp represent epidemiologic and “basic” science, respectively. Advances can occur by several trajectories, and there is no one path to a useful solution. Some solutions will involve just clinical science, and others will involve a mixture of basic science and epidemiology. Importantly, the definitions of basic science and epidemiology, and their relative positions, may change with time. The rate-limiting step is the ability to see through the lamp and then traverse it.

(Modified from Rees J. Complex disease and the new clinical sciences. Science 2002;296(5568):698-700.)




Epidemiologic Principles


The building blocks of epidemiologic inference are incidence rates. Accurate measurement of the incidence of any disease depends, to some extent, on the case definition and classification criteria used. The primary epidemiologic risk factors for gout include increasing age; male gender; menopausal status among women; renal dysfunction; hypertension and other comorbidities that decrease renal uric acid excretion, such as metabolic syndrome and obesity; use of diuretics (which increase renal urate reabsorption); dietary and alcohol-related factors (see Chapter 12 ); comorbidities that increase purine turnover, such as psoriasis; and rare single-gene disorders such as HPRT deficiency. The relationships among these risk factors are complex, and a nuanced understanding of the epidemiology of gout requires recognition of the first principles cited in the Key Points at the head of this chapter.


Descriptive Epidemiology


The two metrics used to characterize the frequency of chronic diseases in a population are incidence and prevalence. The incidence rate represents the number of newly diagnosed cases in a well-defined population over a specified time, usually expressed per 1000 individuals per year. The prevalence proportion represents the proportion of individuals alive with a specific condition at a given time. For episodic arthritis syndromes such as gout, one must differentiate between 1-year prevalence (i.e., an attack within the past year) and lifetime prevalence (i.e., ever having had a gout attack). For chronic diseases with low mortality, prevalence can be expected to increase with age even when incidence rates remain steady.
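As a worked illustration of these two definitions, the sketch below (hypothetical counts, not data from any study cited in this chapter) computes an incidence rate per 1000 person-years and a prevalence proportion per 100,000:

```python
# Illustrative sketch with hypothetical numbers: the two metrics defined above
# for a single community followed for one year.

def incidence_rate_per_1000(new_cases: int, population_at_risk: int) -> float:
    """Newly diagnosed cases per 1000 individuals per year of follow-up."""
    return 1000 * new_cases / population_at_risk

def prevalence_per_100_000(existing_cases: int, population: int) -> float:
    """Proportion of the population alive with the condition at a given time."""
    return 100_000 * existing_cases / population

# Hypothetical community of 50,000 adults followed for one year:
print(incidence_rate_per_1000(new_cases=90, population_at_risk=50_000))   # 1.8 per 1000 per year
print(prevalence_per_100_000(existing_cases=470, population=50_000))      # 940 per 100,000
```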


Prevalence of gout


Table 6-1 summarizes the relevant studies of gout prevalence in the United States. The prevalence figures vary substantially among studies because some authors used period prevalence and some used lifetime prevalence; in other studies, the definition used was not clearly stated. In the most recent NHIS survey that asked about gout (1996), the 1-year period prevalence was 940 per 100,000 adults aged 18 years or older in the United States. In 2008, approximately 2 to 3 million U.S. adults were estimated to have self-reported gout. Managed care data (based on claims codes) suggest that prevalence rates have increased over time, especially among older adults (older than 65 years), as shown in Figure 6-2 .



Table 6-1

Prevalence of Gout in the United States (a)

                                                              Prevalence per 100,000
Source and year of study/gout definition (b)                  Age, y           Male     Female   Total

REGIONAL POPULATION STUDIES
Tecumseh Community Health Study, 1960/“Rome” (d)              ≥20              720      480      ND (c)
Framingham Heart Study, 1964/arbitrary (e)                    ≥42 (mean 58)    2850     390      1480
Sudbury Study, 1972/Rome and New York                         ≥15              660      100      370

NATIONAL SURVEY STUDIES
NHIS, 1988/self-report (1-year prevalence) (f)                ≥18              ND       ND       850
                                                              18–44            290      90       310
                                                              45–64            3350     950      2100
                                                              ≥65              4110     1700     2700
NHIS, 1992/self-report (1-year prevalence) (f)                ≥18              ND       ND       840
                                                              18–44            440      30       380
                                                              45–64            2630     810      1680
                                                              ≥65              4410     1820     2900
NHIS, 1996/self-report (1-year prevalence) (f)                ≥18              ND       ND       940
                                                              18–44            340      20       180
                                                              45–64            3350     1200     2240
                                                              ≥65              4640     1950     3080
NHANES III, 1988–1994/self-report (lifetime prevalence) (g)   ≥20              3800     1600     2600
                                                              20–29            200      500      400
                                                              30–39            2100     100      1100
                                                              40–49            2600     900      1700
                                                              50–59            5600     2300     3900
                                                              60–69            9400     3200     6100
                                                              70–79            11600    5200     8000
                                                              ≥80              7100     5300     5900

a Data from Lawrence RC, Felson DT, Helmick CG, et al. Estimates of the prevalence of arthritis and other rheumatic conditions in the United States. Part II. Arthritis Rheum 2008;58(1):26-35.


b NHIS, National Health Interview Survey; NHANES III, National Health and Nutrition Examination Survey III.


c ND = no data.


d Rome = Rome criteria used “insofar as possible.”


e Arbitrary indicates that at least two of the following three features were present: a typical attack of arthritis, an attack of arthritis with a prompt response to colchicine therapy, and/or hyperuricemia.


f One-year prevalence of gout was ascertained by the question, “Have you or any member of your household had gout within the past year?”


g Lifetime prevalence of gout ascertained by the question, “Has a doctor ever told you that you had gout?” Interviewers were instructed to emphasize the word “doctor.” If the respondent stated that it was another health professional who gave the diagnosis of gout, the answer was coded as “no.”




Figure 6-2


Prevalence of gout in U.S. men, 1990–1999.

(From Wallace KL, Riedel AA, Joseph-Ridge N, et al. Increasing prevalence of gout and hyperuricemia over 10 years among older adults in a managed care population. J Rheumatol 2004;31(8):1582-7.)


The unadjusted prevalence of gout in the U.K. population has been estimated at 1.4%; a general practice–based study with a smaller sample size reported an unadjusted prevalence of 1%. Almost all recent national data from the United Kingdom come from general practice–based registers, including smaller consortia, the General Practice Research Database (GPRD), and the Royal College of General Practitioners Weekly Returns Service sentinel general practice network in England and Wales. The second and fourth U.K. National Morbidity Studies demonstrated a threefold increase in gout prevalence between 1971 and 1991. In a study using the large GPRD, the unadjusted prevalence of gout in 1999 was 1.4%, with the highest rate, 7.3%, observed in men aged 75 to 84 years. A more recent study in Germany and the United Kingdom showed the same 1.4% prevalence of gout in both countries over the period 2000 to 2005. Yearly incidence rates for the United Kingdom derived from the GPRD for 1990 through 1999 showed modest increases in the early 1990s in older men and women but a return toward 1990 values by the end of the decade. However, such trends were not observed in the Royal College of General Practitioners study ( Figure 6-3 ).




Figure 6-3


Incidence of gout in the United Kingdom (1994–2007). A, Annual incidence of acute attacks of gout per 10,000 population (1994–2007); data are shown for all ages (total) and for males and females. B, Age-specific annual incidence of acute attacks of gout per 10,000 population (1994–2007).

(From Elliot AJ, et al. Seasonality and trends in the incidence and prevalence of gout in England and Wales 1994-2007. Ann Rheum Dis 2009;68:1728-33.)


Incidence


There have been relatively few studies that have examined the incidence of gout. Methodologically, the most rigorous was a population-based study from Rochester, Minnesota, that compared incidence rates of new gout cases, defined by American College of Rheumatology classification criteria, between 1977/1978 and 1995/1996. The authors concluded that the unadjusted incidence rate of gout rose from 45 per 100,000 in 1977/1978 to 62 per 100,000 in 1995/1996. Such an increase would have contributed to the increasing prevalence of gout over time. The U.S. hospitalization data are also consistent with this interpretation ( Figure 6-4 ).




Figure 6-4


Trends in hospitalizations with gout as a principal diagnosis ( A ) and the presence of gout as a comorbid diagnosis in the United States over time ( B ). The data are weighted national estimates from the Healthcare Cost and Utilization Project (HCUP) Nationwide Inpatient Sample (NIS), Agency for Healthcare Research and Quality (AHRQ), based on data collected by individual states and provided to AHRQ.


Epidemiology of Tophaceous Gout


The population epidemiology of tophi is harder to study than that of other clinical manifestations of gout, since visible tophi tend to be asymptomatic and their recognition on physical examination requires some degree of sophistication. The proportion of tophaceous disease among those with gout varies substantially: the Sudbury Study reported that about 23% of patients with gout had tophaceous disease, the Framingham Heart Study about 7%, and the study of the Hmong 31%.


Menopausal Women and Gout


As expected from differences in the distribution of serum urate concentrations, women in general have a lower prevalence of gout than men ( Figure 6-5 ), and this disparity is evident across all age groups. There is a belief that women are protected from gout before menopause and that in the postmenopausal period their rates “catch up” with those of men. This hypothesis is inherently difficult to test in epidemiologic studies because age, age at menopause, and menopausal status are highly correlated. The prevalence of gout among premenopausal women is low compared with that among men of the same age range, but the prevalence in women remains relatively lower in older age groups as well. In the NHANES III surveys, the lifetime prevalence of gout among men increased from 2600 per 100,000 in the 40- to 49-year-old age group to 9400 per 100,000 in the 60- to 69-year-old age group, a 3.6-fold increase. The corresponding figures among women were 900 and 3200 per 100,000, a similar 3.5-fold increase. A comparable proportional increase was also observed in the U.K. GPRD data (see Figure 6-5 ). These data suggest that the increase in gout prevalence among menopausal women merely reflects age-related changes and is unrelated to menopause. On the other hand, a cross-sectional study of the U.S. population reported a definite but modest (about 0.30 mg/dl) increase in serum urate concentration associated with menopause, compared with premenopausal women after adjustment for age and other risk factors. In this study, as in others, postmenopausal women who used hormones had serum urate levels about 0.2 mg/dl lower than nonusers; one interpretation is that the “estrogen-attributable fraction” of hyperuricemia may be too modest to be of any clinical utility. Interestingly, data on gout incidence from the Nurses’ Health Study indicate an approximately 20% increase in the risk for gout attributable to menopause and an approximately 20% decrease in risk with hormone replacement. The significance of this investigation lies in its implications for the prevention of gout.
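The similarity of the proportional increase in men and women can be checked directly from the Table 6-1 figures; the short arithmetic check below uses only the numbers quoted above.

```python
# Quick check of the fold increases quoted above, using the NHANES III lifetime
# prevalence figures (per 100,000) from Table 6-1.
men_40_49, men_60_69 = 2600, 9400
women_40_49, women_60_69 = 900, 3200

print(men_60_69 / men_40_49)      # ~3.6-fold increase in men
print(women_60_69 / women_40_49)  # ~3.56-fold increase in women (quoted as 3.5-fold)
```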




Figure 6-5


Prevalence of gout by age and gender. The proportionate increase with age is similar in men and women.

(From Mikuls TR, Farrar JT, Bilker WB, et al. Gout epidemiology: results from the UK. General Practice Research Database, 1990-1999. Ann Rheum Dis 2005;64(2):267-72.)


Seasonality of Gout


There is some evidence of seasonality in gout incidence and flares, but the magnitude of this seasonality is negligible compared with that of other seasonal illnesses such as hay fever and influenza. Gout has been described in the popular literature as the “scourge of the holiday season” (November-December), presumably because of dietary indiscretions; however, this is not supported by data. Schlesinger et al. reported that in patients from Pennsylvania, crystal-proven gout flares were lowest in the winter and highest in the spring, with no consistent correlation between flares and temperature or humidity. These observations were confirmed in a study of patients from Ferrara, Italy, in whom the peak incidence of flares occurred in April. It is notable that these studies defined flares by crystal demonstration in synovial fluid, so the pattern of diagnosis might be influenced by how often arthrocentesis is performed in different months of the year. Another Italian study examined the first-ever attack of gout in a cohort of 73 patients and concluded that June, July, and December were the peak incidence months. The largest study to date confirmed that new diagnoses of gout peak between late April and mid-September.


Geography, Ethnicity, and Gout


Wide variations in the incidence, prevalence, and severity of gout among ethnic groups have been observed, as illustrated in Figure 6-6 . Most notable is the high prevalence of gout among Pacific Islanders. Maoris in particular have greater severity of gout, manifested by hospitalization rates six times those of the New Zealand general population. Asian/Pacific Islanders in the United States (especially Filipinos, Tongans, and Samoans) have an almost threefold higher frequency of gout than the age- and gender-adjusted white population. The proposed explanation is an underlying inability of people of these ethnicities to clear uric acid through the kidneys when challenged with a high-purine diet. This hypothesis is supported by the observation that rates of gout among the Maoris have increased with increasing adoption of a European diet and lifestyle. Heterogeneity of methodology explains some but not all of this variation; most is likely due to complex gene–environment interactions. Whether Western levels of dietary fructose consumption are a factor is not yet clear. (See also Chapter 11 .)




Figure 6-6


Geography, ethnicity and prevalence rates of gout. An asterisk indicates that prevalence of gout was assessed by the COPCORD group of investigators. The rates are not necessarily directly comparable to each other, but the markedly high rates in indigenous people of Australia are consistent with observations in other indigenous Pacific Islanders. The mean U.S. prevalence rate (indicated by ∗∗ ) was calculated from the National Institutes of Health estimates of 6 million lifetime prevalent cases in a population of 250 million.

Data for the United Kingdom were obtained from the GPRD (United Kingdom General Practice Research Database).


Risk Factors and Causality


In most epidemiologic studies, the true objective is to detect and measure the effect of an etiologic factor or an intervention. Sometimes the effect is not directly observable, and we have to settle for observing the association of that specific factor or intervention with the outcome. Measuring such a substitute may work well, but the association does not necessarily equal the effect. For example, the effect of caffeine consumption on arthritis is difficult to measure, whereas the association between self-reported coffee consumption and arthritis risk is not difficult to assess; the latter, however, may or may not reflect the true effect of caffeine. On the other hand, use of moonshine whiskey as a surrogate for lead exposure may provide a good reflection of the effect of lead on gout incidence.


The term risk factor denotes a factor associated with an increased risk of disease or other poor outcomes. Risk factors are associations, and correlation does not imply causation. For example, watching more than 5 hours of television every day may be a risk factor for gout, but it is not causal; it is the association with a lifestyle characterized by low physical activity, obesity, and insulin resistance that constitutes the real pathway to gout.


Causality is a subject of perennial discussion among professional philosophers. In the context of epidemiologic research, causal relationships have a narrower interpretation, and the features that “upgrade” a risk factor to a causal factor are well accepted ( Table 6-2 ). In the case of gout, few risk factors have successfully passed through this “filter of truth” ( Figure 6-7 ).



Table 6-2

Characteristics Distinguishing Causal Relationships From Correlative Observations

TEMPORALITY
There is a time relationship between cause and effect in that the effect occurs after the cause. If some delay is expected between cause and effect, then that delay should be observed.
STRENGTH OF ASSOCIATION
The stronger the statistical association between the putative cause and the effect, and the more consistently it is observed in repeated events or experiments, the more likely the relationship is causal. A perfect correlation has a correlation coefficient of 1; a weaker association between cause and effect shows greater variation.
BIOLOGICAL GRADIENT (DOSE-RESPONSE)
In treatment, there might be expected to be a relationship between the dose given and the response of the patient. This may not be a simple linear relationship and may have minimum and maximum thresholds.
CONSISTENCY
One apparent success does not prove a general cause and effect in wider contexts. For example, to prove a treatment is useful, it must give consistent results in a wide range of circumstances.
PLAUSIBILITY
The apparent cause and effect must make sense in the light of current theories and results. If a causal relationship appears to lie outside current science, then significant additional hypothesizing and testing will be required before a true cause and effect can be established.
SPECIFICITY
A specific relationship is found if there is no other plausible explanation. This is not always the case in medicine, where any given symptom may have a range of possible causative conditions.
EVIDENCE
Very strong proof of cause and effect comes from the results of experiments in which many significant variables are held stable to prevent them from interfering with the results. Other evidence is also useful, but cause and effect cannot be determined as readily as in controlled experiments.
ANALOGY
When something is suspected of causing an effect, other factors similar or analogous to the supposed cause should also be considered and either evaluated as possible causes or eliminated from the investigation.
COHERENCE
If laboratory experiments, in which variables are controlled, and external everyday evidence are in alignment, the relationship is said to be coherent.



Figure 6-7


Association to causality: Epidemiologic truth filters weed out misleading associations from true causative factors. These filters are critical for identification of points of intervention that can be used to successfully reduce the burden of illness for the individual and the population.


Multiple Causation


Within an individual patient, barring certain genetic syndromes, gout is likely to be caused by several factors. “Multiple causation” is the canon of contemporary epidemiology, and its metaphor and model is the “web of causation.” Several of these causes facilitate gout but are not individually sufficient to cause it ( Figure 6-8 ). The most important categories of risk factors are (1) the metabolic “six-pack” of obesity, diet, inactivity, shared genes, hypertension, and hyperlipidemia (discussed later); (2) other comorbidities that increase uric acid production or decrease uric acid excretion; (3) other genetic factors; and (4) medications. All of these influence the risk for gout by affecting the serum urate concentration (see Figure 6-8 ).




Figure 6-8


Web of causation in gout. All known risk factors for gout are linked through hyperuricemia. Notably, lifestyle and metabolic factors are tightly correlated and possibly act on the kidney through hyperinsulinemia.


Hyperuricemia: A Necessary but Not Sufficient Cause of Gout


The U.S. Veterans Normative Aging Study reported that among those with serum urate concentrations greater than 9.0 mg/dl, the cumulative 5-year incidence of gout was 22%, whereas the cumulative incidence for those with a lower serum urate concentration (7.0 to 8.9 mg/dl) was about 3%. The annual incidence rates were about 5% in the former group, compared with 0.5% in the latter and 0.1% among those with serum urate concentrations less than 7.0 mg/dl. In the Framingham Study, the 12-year cumulative incidence was 36% among those with serum urate concentrations greater than 8 mg/dl and 1.8% among those with concentrations less than 6.0 mg/dl. Such stark differences in risk are likely exaggerated, as those with higher urate concentrations are more likely to be older and male, whereas those with the lowest serum urate concentrations are likely to be younger and female.
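As a back-of-the-envelope consistency check, and only under the assumption of a constant, independent annual risk (an assumption made here for illustration, not a claim of the Normative Aging Study), the annual rates quoted above imply cumulative 5-year incidences close to those reported:

```python
# Relationship between a constant annual incidence and the cumulative incidence
# over several years, assuming independent annual risks (illustrative only).

def cumulative_incidence(annual_risk: float, years: int) -> float:
    """Probability of at least one gout attack over `years` at a constant annual risk."""
    return 1 - (1 - annual_risk) ** years

print(round(cumulative_incidence(0.05, 5), 3))    # 0.226 -> consistent with ~22% for urate > 9.0 mg/dl
print(round(cumulative_incidence(0.005, 5), 3))   # 0.025 -> consistent with ~3% for urate 7.0-8.9 mg/dl
```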


Although serum urate concentration is clearly a risk factor, other clues suggest that the link between hyperuricemia and gout may not be linear. Several studies have documented the presence of intraarticular urate crystals without clinical inflammatory manifestations. This is instructive because urate crystals are strong stimulants of innate immunity, and the absence of gout in some individuals with crystals, but not in others, may signify heterogeneity of immune responses to the crystals.


Web of Causation of Gout


All known risk factors for new-onset gout are linked through hyperuricemia (see Figure 6-8 ). The metabolic “six-pack” comprises six overlapping factors that lead to hyperinsulinemia and related effects (putatively including increased leptin in obesity), which in turn reduce the renal clearance of uric acid. These include the components of the metabolic syndrome (hypertension, hyperlipidemia, and obesity, especially truncal obesity) together with lifestyle factors such as low physical activity and a high-risk diet. The latter risk factors are discussed separately.


The risk factors for incident gout (i.e., a first episode of gout in a person) may differ from those for recurrent gout flares. A novel case-crossover study examined these factors and observed that, in addition to alcohol and diuretic use, the chronicity of gout and comorbid conditions are the two major determinants of recurrent gout flares.


It is notable that hyperuricemia was included as a criterion for the metabolic syndrome when the term was coined by Haller in 1977. Two commonly used sets of criteria for the metabolic syndrome are shown in Table 6-3 . The World Health Organization (WHO) criteria require the presence of dysglycemia, whereas the NCEP criteria do not; renal dysfunction is included in the WHO criteria but not in the NCEP criteria. Nevertheless, both sets of criteria identify populations with existing or imminent insulin resistance and subclinical renal dysfunction, factors that lead to reduced renal clearance of uric acid.

