Just eat meat.

Created by Michael Goldstein (@bitstein)

Gout

What is Gout?

Gout refers to the disease that occurs in response to the presence of monosodium urate (MSU) crystals in joints, bones, and soft tissues. Hyperuricemia (having high uric acid levels in the blood) is a necessary predisposing factor for gout. Uric acid reacts with physiologic sodium to form MSU, which can crystallize in painful places. Not everyone who has hyperuricemia goes on to develop gout - individual differences in the formation of crystals and/or in inflammatory responses to those crystals may play a role in whether a person with hyperuricemia will develop the disease.

British physician Alfred Garrod, in the mid-19th century, identified uric acid as the causative agent; the idea being that uric acid accumulates in the circulation [and] crystallizes into needle-sharp urate crystals. These crystals then lodge in the soft tissues and in the joints of the extremities – classically, the big toe – and cause inflammation, swelling and an excruciating pain that was described memorably by the 18th century bon vivant Sydney Smith as like walking on one’s eyeballs.

Ash Simmonds

As usual the conventional wisdom is arse-about with cause and effect: gout isn’t primarily caused by purine-rich or high-protein foods upping uric acid, it’s the inability to effectively clear out uric acid, which eventually forms crystals in the joints and stones in the kidneys - this stems from a feedback loop of inflammation, and usually there’s a catalyst that induces it.

If you are newish to low carb/keto and wondering about the dangers: uric acid levels can rise substantially in the first 2-4 weeks, but they level out after 6-8 weeks. This is explained in the literature later in this thread; basically, ketones compete with uric acid for excretion, but as you keto-adapt this is no longer a problem. The worst thing you can do is go back and forth between keto and carbs while you are still adapting. Either stick to it religiously for at least 2 months and adapt properly, or don’t bother and save yourself some possible flare-ups.

Gary Taubes

Gout and the condition known technically as hyperuricemia, or elevated levels of uric acid, are the most recent examples of this kind of institutional neglect of the potential health effects of fructose, and how pervasive it can be.

Gout itself is an interesting example because it is a disease that has gone out of fashion in the last century and yet the latest reports suggest it is not only as prevalent as ever, but becoming more so. Recent surveys suggest that nearly 6 percent of all American men in their fifties suffer from gout, and more than 10 percent of those in their seventies. The proportion of women afflicted is considerably less at younger ages but still rises to over 3 percent by age 60.(1) Moreover, the prevalence of gout seems to have doubled over the last quarter century, coincident (perhaps not coincidentally) with the reported increase in obesity, and it may have increased five- or even six-fold since the 1950s, although a large portion of that increase may be due to the aging of the population.(2)

Until the late 17th century, when the spread of gout reached almost epidemic proportions in Britain, the disease afflicted almost exclusively the nobility, the rich and the educated, and so those who could afford to indulge an excessive appetite for food and alcohol. This made gout the original example of a disease linked to diet and over-consumption, and so, in effect, the original disease of civilization.

But once gout became easily treatable, in the early 1960s, with the discovery of the drug allopurinol, clinical investigators and researchers began to lose interest. And the pathology of gout has been understood since the British physician Alfred Garrod, in the mid-19th century, identified uric acid as the causative agent; the idea being that uric acid accumulates in the circulation to the point that it falls out of solution, as a chemist would put it, and so crystallizes into needle-sharp urate crystals. These crystals then lodge in the soft tissues and in the joints of the extremities – classically, the big toe – and cause inflammation, swelling and an excruciating pain that was described memorably by the 18th century bon vivant Sydney Smith as like walking on one’s eyeballs.(3) Because uric acid itself is a breakdown product of compounds known as purines – constituents of nucleic acids and of energy-carrying nucleotides such as ATP – and because purines are at their highest concentration in meat, it has been assumed for the past 130-odd years that the primary dietary means of elevating uric acid levels in the blood, and so causing first hyperuricemia and then gout, is an excess of meat consumption.

The actual evidence, however, has always been less than compelling: Just as low-cholesterol diets have only a trivial effect on serum cholesterol levels, for instance, and low-salt diets have a clinically insignificant effect on blood pressure, low-purine diets have a negligible effect on uric acid levels. A nearly vegetarian diet, for instance, is likely to drop serum uric acid levels by 10 to 15 percent compared to a typical American diet, but that’s rarely sufficient to return high uric acid levels to normal, and there is little evidence that such diets reliably reduce the incidence of gouty attacks in those afflicted.(4) Thus, purine-free diets are no longer prescribed for the treatment of gout, as the gout specialist Irving Fox noted in 1984, “because of their ineffectiveness” and their “minor influence” on uric acid levels.(5) Moreover, the incidence of gout in vegetarians, or mostly vegetarians, has always been significant and “much higher than is generally assumed.” (One mid-century estimate, for instance, put the incidence of gout in India among “largely vegetarians and teetotalers” at 7 percent.)(6) Finally, there’s the repeated observation that eating more protein increases the excretion of uric acid from the kidney and, by doing so, decreases the level of uric acid in the blood.(7) This implies that the meat-gout hypothesis is at best debatable; the high protein content of meats should be beneficial, even if the purines are not.

The alternative hypothesis is suggested by the association between gout and the entire spectrum of diseases of civilization, and between hyperuricemia and the metabolic abnormalities of Syndrome X. In the past century, gout has manifested all of the now-familiar patterns, chronologically and geographically, of diseases of civilization, and so those diseases associated with western diets. European physicians in World War I, for instance, reported a reduced incidence of gout in countries undergoing food shortages.(8) In primitive populations eating traditional diets, gout was virtually unknown or at least went virtually unreported (with the conspicuous exception of Albert Schweitzer, who said he saw it with surprising frequency). The earliest documented cases reported in Asia and Africa were in the late 1940s.(9) And even in the 1960s, hospital records from Kenya and Uganda suggested an incidence of gout lower than one in a thousand among the native Africans. Nonetheless, by the late 1970s, uric acid levels in Africa were increasing with westernization and urbanization,(10) while the incidence of both hyperuricemia and gout among South Pacific islanders was reportedly sky-rocketing. By 1975, the New Zealand rheumatologist B.S. Rose, a colleague of Ian Prior’s, was describing the native populations of the South Pacific as “one large gouty family.”(11)

Gout has also been linked to obesity since the Hippocratic era, and this association is the origin of the assumption that high-living and excessive appetites are the cause. Gouty men have long been reported to suffer higher rates of atherosclerosis and hypertension, while stroke and coronary heart disease are common causes of death.(12) Diabetes is also commonly associated with gout. In 1951, Menard Gertler, working with Paul Dudley White’s Coronary Research Project at Harvard, reported that serum uric acid levels rose with weight, and that men who suffered heart attacks were four times as likely to be hyperuricemic as healthy controls.(13) This led to a series of studies in the 1960s, as clinical investigators first linked hyperuricemia to glucose intolerance and high triglycerides, and then later to high insulin levels and insulin resistance.(14) By the 1990s, Gerald Reaven, among others, was reporting that insulin resistance and hyperinsulinemia raised uric acid levels, apparently by decreasing uric acid excretion by the kidney, just as they raised blood pressure by decreasing sodium excretion. “It appears that modulation of serum uric concentration by insulin resistance is exerted at the level of the kidney,” Reaven wrote, “the more insulin-resistant an individual, the higher the serum uric acid concentration.” (15)

These observations would suggest that anything that raised insulin levels would in turn raise uric acid levels and might cause gout, which would implicate any high carbohydrate diet with sufficient calories. But this neglects the unique contribution of fructose. The evidence arguing for sugar or fructose as the primary cause of gout is two-fold. First, the distribution of gout in western populations has for centuries paralleled the availability of sugar specifically, not that of refined carbohydrates in general. It was in the mid-17th century that gout went from being exclusively a disease of the rich and the nobility to spreading downward and outward through British society, reaching near-epidemic proportions by the 18th century. Historians refer to this as the “gout wave,”(16) and it coincides precisely with the birth and explosive growth of the British sugar industry(17) and the transformation of sugar, in the words of the anthropologist Sidney Mintz, from “a luxury of kings into the kingly luxury of commoners.”(18) British per capita sugar consumption in the 17th century was remarkably low by modern standards, a few pounds per capita per year at the turn of the century, but the change in consumption over the next century and a half was unprecedented: between 1650 and 1800, following the British acquisition of Barbados, Jamaica and other “sugar islands,” total sugar consumption in England and Wales increased 20- to 25-fold.(19)

The second piece of evidence is much less circumstantial: simply put, fructose increases serum levels of uric acid. The “striking increase” in uric acid levels with an infusion of fructose was first reported in the Lancet in the late 1960s by clinicians from Helsinki, Finland, who referred to it as fructose-induced hyperuricemia.(20) This was followed by a series of studies through the late 1980s confirming the existence of the effect and reporting on the variety of mechanisms by which it came about. Fructose, for instance, accelerates the breakdown of a molecule known as ATP, which is the primary source of energy for cellular reactions and is loaded with purines. (ATP stands for adenosine triphosphate; adenosine contains the purine adenine.) And so this in turn increases formation of uric acid. Alcohol apparently raises uric acid levels through the same mechanism, although beer also has purines in it.(21) Fructose also stimulates the synthesis of purines directly, and the metabolism of fructose leads to the production of lactic acid, which in turn reduces the excretion of uric acid by the kidney and so raises uric acid concentrations indirectly by that mechanism.(22)

These mechanistic explanations of how fructose raises uric acid levels were then supported by a genetic connection between fructose metabolism and gout itself. Gout often runs in families, so much so that those clinicians studying gout have always assumed the disease has a strong hereditary component. In 1990, Edwin Seegmiller, one of the few veteran gout researchers in the U.S., and the British geneticist George Radda, who would go on to become director of the Medical Research Council, reported that the explanation for this familial association seemed to be a very specific defect in the genes that regulate fructose metabolism. Thus, individuals who inherit this defect will have trouble metabolizing fructose and so will be born with a predisposition to gout. This suggested the possibility, Seegmiller and Radda concluded, that this defect in fructose metabolism was “a fairly common cause of gout.”(23)

As these observations appeared in the literature, the relevant investigators were reasonably clear about the implications: “since serum-uric-acid levels are critical in individuals with gout, fructose might deserve consideration in their diet,” noted the Helsinki clinicians in The Lancet in 1967, and so the chronic consequences of high-fructose diets on healthy individuals required further evaluation.(24) Gouty patients should avoid high-fructose or high-sucrose diets, explained Irving Fox in 1984, because “fructose can accelerate rates of uric acid synthesis as well as lead to increased triglyceride production.”(25) None of these investigators, however, seemed willing to define what precisely constituted a high-fructose or a high-sucrose diet. Was it 50 pounds of sugar a year? 100 pounds? 150 pounds? 300 pounds? And would high-fructose diets induce gout in healthy individuals or would they only exacerbate the problem in those already afflicted? In 1993, the British biochemist Peter Mayes published an article on fructose metabolism in the American Journal of Clinical Nutrition that is now considered the seminal article in the field. (This was in the special issue of the AJCN dedicated to the health effects of fructose.) Mayes reviewed the literature and concluded that high-fructose diets in healthy individuals were indeed likely to cause hyperuricemia, and he implied that gout could be a result as well, but the studies to address that possibility had simply never been done. “It is clear,” Mayes concluded, “that systematic investigations in humans are needed to ascertain the precise amounts, both of fructose consumption and of its concentration in the blood, at which deleterious effects such as hyperlipidemia and hyperuricemia occur.”(26) Add to this Reaven’s research reporting that high insulin levels and insulin resistance will increase uric acid levels, and it suggests, as Mayes had remarked about triglycerides, that sugar (sucrose) and high fructose corn syrup would constitute the worst of all carbohydrates when it comes to uric acid and gout. The fructose would increase uric acid production and decrease uric acid excretion, while the glucose, through its effect on insulin, would also decrease uric acid excretion. Thus, it would be reasonable to assume, or at least to speculate, that sugar is a likely cause of gout, and that the patterns of sugar consumption explain the appearance and distribution of the disease.

Maybe so, but this hypothesis has never been seriously considered. Those investigators interested in gout have focused almost exclusively on alcohol and meat consumption, in part because these have historical precedents and in part because the implication that gouty individuals, and particularly obese gouty individuals, should shy away from meat and alcohol fit in well with the dietary prescriptions of the 1970s onward.

More than anything, however, this sugar/fructose hypothesis was ignored, once again, because of bad timing. With the discovery and clinical application of allopurinol in the 1960s, those clinical investigators whose laboratories were devoted to studying the mechanisms of gout and purine metabolism – James Wyngaarden’s, for instance, at Duke, and Edwin Seegmiller’s at NIH – began focusing their efforts either on working out the nuances of allopurinol therapy or on applying the new techniques of molecular biology to the genetics of gout and rare disorders of hyperuricemia or purine metabolism. Nutritional studies were simply not considered worthy of their time, if for no other reason than that allopurinol allowed gout sufferers to eat or drink whatever they wanted. “We didn’t care so much whether some particular food might do something,” says William Kelley, who is a co-author with Wyngaarden of the 1976 textbook Gout and Hyperuricemia and who started his career in Seegmiller’s lab at NIH. “We could take care of the disease.”(27)

This exodus, however, coincided with the emergence of research on fructose-induced hyperuricemia. By the 1980s, when the ability of fructose and sucrose consumption to raise uric acid levels in human subjects was demonstrated repeatedly, the era of basic research on gout had come to an end. The major players had left the field and NIH funding on the subject had dwindled to a trickle. Wyngaarden published his last research paper in 1977 and spent the years 1982 to 1989 as director of the National Institutes of Health. Kelley published his last papers on the genetics of gout in 1989, when he became dean of medicine at the University of Pennsylvania. Irving Fox, who did much of the basic research on fructose- and alcohol-induced hyperuricemia in Kelley’s lab, went to work in the biotechnology industry in the early 1990s. Only Edwin Seegmiller remained interested in the etiology of gout, and Seegmiller says that when he applied to the NIH for funding to study the relationship between fructose and gout, after elucidating the genetic connection with Radda in 1990, his grant proposals were rejected on the basis that he was too old and, as an emeritus professor, technically retired.(28) “In the 1950s and 1960s, we had the greatest clinical scientists in the world working on this disease,” says Kelley. “By the 1980s and 1990s, there was no one left.”

Meanwhile, the medical journals would occasionally run articles on the clinical management of gout, but these would concentrate almost exclusively on drug therapy. Discussions of diet would be short, perhaps a few sentences, and confused about the science. On those occasions when the authors would suggest that gouty individuals might benefit from low-purine diets, they would invariably include “sugars” and “sweets” among the recommended foods with low purine content.(29) In a few cases – a 1996 article in the New England Journal of Medicine, for instance(30) – the articles would also note that fructose consumption would raise uric acid levels, suggesting only that the authors had been unaware of the role of fructose in “sugars” and “sweets.” Even when the New England Journal published a report from Walter Willett and his Harvard colleagues in March 2004, this same kind of nutritional illiteracy manifested itself. Willett’s article had reported that men with gout seemed to eat more meat than healthy men. But Willett, who by this time was arguably the nation’s most influential nutritional epidemiologist, later explained that they had never considered sugar consumption in their analysis because neither he nor his collaborators had been aware of the hyperuricemic effect of fructose. Willett’s co-author, Gary Curhan, a nephrologist and gout specialist with a doctorate in epidemiology, said he might have once known that fructose raised uric acid levels, but it had slipped his mind. “My memory is not what it used to be,” he said. He also acknowledged, in any case, that he never knew sucrose was half fructose.

The addenda to this fructose-induced hyperuricemia story may be even more important. When the New England Journal of Medicine published Willett’s gout study, it ran an editorial to accompany it written by the University of Florida nephrologist Richard Johnson. Over the past decade, Johnson’s research has supported the hypothesis that elevating the uric acid concentration in the circulation also damages the blood vessels leading into the kidneys in such a way as to raise blood pressure directly, and so suggests that fructose consumption will raise blood pressure.

This is another potentially harmful effect of fructose that post-dates the official reports exonerating sugar in the diet. And it is yet another mechanism by which sugar and high fructose corn syrup could be a particularly unhealthy combination. The glucose in these sugars would raise insulin levels, which in turn would raise blood pressure by inhibiting the kidney’s secretion of sodium and by stimulating the sympathetic nervous system, as we discussed in an earlier chapter, and the fructose would do it independently by raising uric acid levels and so damaging the kidney directly. If this were the case, which has never been tested, it would potentially explain the common association of gout and hypertension and even of diabetes and hypertension.(31) Johnson is only now looking into this possibility, however. Unlike Willett and his colleagues, Johnson had long been aware of the ability of fructose to raise uric acid levels, and so was studying that phenomenon in his laboratory. But it was only in the summer of 2004, he explained, three months after his NEJM editorial was published, that he realized that sucrose was half fructose and that his research of the past years was even relevant to sugar.(32)

A decade later, Thomas Benedek described the epidemiology of gout in The Cambridge World History of Human Disease this way: “Worldwide the severity and prevalence of gout have changed paradoxically since the 1940s. In the highly developed countries, as a result of the advent of effective prophylactic drug therapy, the disease is now rarely disabling. Elsewhere, however, it has become more prevalent, predominantly as a result of ‘improved diets.’”

The economist and historian Ralph Davis estimates that the supply of sugar from the Caribbean into Britain rose from three or four thousand tons a year in the late seventeenth century to over two hundred thousand tons by the 1770s, an increase of over fifty-fold. (Davis R, The Rise of the Atlantic Economies, Cornell University Press, 1973, pp. 251, 255)

Resources

Abstract

BACKGROUND: Gout, an inflammatory arthritis, reportedly afflicts more than 2 million men and women in the United States. Previous reports have suggested an association between gout and kidney stone disease; however, these studies did not adjust for such important potential confounders as obesity and the presence of hypertension. To our knowledge, no published study has examined the independent association between gout and kidney stone disease.

METHODS: We used a national probability sample of the US population to determine the independent association between reported gout and history of kidney stone disease.

RESULTS: Among men and women 20 years and older, 5.6% (10 million) reported the previous passage of a kidney stone and 2.7% (5.1 million) reported a diagnosis of gout by a physician. Moreover, 8.6% of individuals who reported the passage of a kidney stone on two or more occasions had a history of gout. Conversely, the prevalence of previous kidney stones in subjects with reported gout was 13.9%. In the age-adjusted model, gout was associated with an increased odds ratio (OR) for previous kidney stones (OR, 1.97; 95% confidence interval [CI], 1.37 to 2.83). After further adjustment for sex, race, body mass index, and presence of hypertension, the OR for previous kidney stones in individuals with gout decreased to 1.49 (95% CI, 1.04 to 2.14).

CONCLUSION: The independent association between kidney stone disease and gout strongly suggests that the two conditions share common underlying pathophysiological mechanisms. Identification of these mechanisms may lead to improved preventive strategies for both conditions.
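
The abstract reports age-adjusted and fully adjusted odds ratios but does not spell out the model. A common way to produce such estimates is a logistic regression of prior kidney stones on gout with the listed covariates; the sketch below assumes that approach, uses hypothetical column names, and ignores the survey's sampling weights and design.

```python
# Hypothetical sketch of the adjusted odds ratios described in the abstract:
# a logistic regression of prior kidney stones on gout, first age-adjusted,
# then additionally adjusted for sex, race, BMI, and hypertension.
# Column names and the data file are assumptions; survey weights are ignored.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_subset.csv")  # hypothetical extract: one row per respondent

# Age-adjusted model
m_age = smf.logit("kidney_stone ~ gout + age", data=df).fit()

# Fully adjusted model
m_full = smf.logit(
    "kidney_stone ~ gout + age + C(sex) + C(race) + bmi + hypertension",
    data=df,
).fit()

# Odds ratio and 95% CI for gout in the fully adjusted model
or_gout = np.exp(m_full.params["gout"])
ci_lo, ci_hi = np.exp(m_full.conf_int().loc["gout"])
print(f"Adjusted OR for gout: {or_gout:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```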

Abstract

OBJECTIVE: To determine whether the incidence of gout is higher in 1995-1996 compared to 1977-1978.

METHODS: Using the Rochester Epidemiology Project computerized medical record system, all potential cases of acute gout in the city of Rochester, Minnesota during the time intervals of 1977-1978 and 1995-1996 were identified. The complete medical records of all potential cases were screened and all who fulfilled the 1977 American College of Rheumatology proposed criteria for gout were included as incidence cases. Demographic data, body mass index, clinical presentation, and associated comorbid conditions were abstracted. The overall and age-gender adjusted incidence rates from the 2 cohorts were calculated and compared.

RESULTS: A total of 39 new cases of acute gout were identified during the 2 year interval 1977-1978 representing an age and sex-adjusted annual incidence rate of 45.0/100,000 (95% CI: 30.7, 59.3). For the interval 1995-1996, 81 cases were diagnosed, representing an annual incidence rate of 62.3/100,000 (95% CI: 48.4, 76.2). There was a greater than 2-fold increase in the rate of primary gout (i.e., no history of diuretic exposure) in the recent compared to the older time periods (p = 0.002). The incidence of secondary, diuretic related gout did not increase over time (p = 0.140).

CONCLUSION: Our results indicate that the incidence of primary gout has increased significantly over the past 20 years. While this increase might be a result of improved ascertainment of atypical gout, it may also be related to other, as yet unidentified, risk factors.
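
For readers unfamiliar with the units, the rates above are cases per 100,000 person-years. A minimal sketch of the crude arithmetic is below; the person-year denominator is an assumption chosen for illustration, and the paper's age and sex adjustment is not reproduced.

```python
# Crude incidence-rate arithmetic behind figures like those in the abstract.
# The person-year denominator is assumed; the published rates are age- and
# sex-adjusted, which this sketch does not attempt.
import math

cases = 81                 # new gout cases identified in 1995-1996
person_years = 130_000     # assumed person-years observed over the 2-year window

rate = cases / person_years * 100_000          # cases per 100,000 person-years

# Rough 95% CI using a normal approximation to the Poisson count
se = math.sqrt(cases) / person_years * 100_000
ci_low, ci_high = rate - 1.96 * se, rate + 1.96 * se
print(f"{rate:.1f} per 100,000 (95% CI {ci_low:.1f}, {ci_high:.1f})")
```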

Abstract

OBJECTIVE: To provide a single source for the best available estimates of the national prevalence of arthritis in general and of selected musculoskeletal disorders (osteoarthritis, rheumatoid arthritis, juvenile rheumatoid arthritis, the spondylarthropathies, systemic lupus erythematosus, scleroderma, polymyalgia rheumatica/giant cell arteritis, gout, fibromyalgia, and low back pain).

METHODS: The National Arthritis Data Workgroup reviewed data from available surveys, such as the National Health and Nutrition Examination Survey series. For overall national estimates, we used surveys based on representative samples. Because data based on national population samples are unavailable for most specific musculoskeletal conditions, we derived data from various smaller survey samples from defined populations. Prevalence estimates from these surveys were linked to 1990 US Bureau of the Census population data to calculate national estimates. We also estimated the expected frequency of arthritis in the year 2020.

RESULTS: Current national estimates are provided, with important caveats regarding their interpretation, for self-reported arthritis and selected conditions. An estimated 15% (40 million) of Americans had some form of arthritis in 1995. By the year 2020, an estimated 18.2% (59.4 million) will be affected.

CONCLUSION: Given the limitations of the data on which they are based, this report provides the best available prevalence estimates for arthritis and other rheumatic conditions overall, and for selected musculoskeletal disorders, in the US population.

Gout has been recognised since antiquity, and descriptions of the disease date back to the Babylonian empire. It was certainly well described by Hippocrates. Called the “king of diseases” and the “disease of kings,” gout is more the “disease of plenty.” In Gout, Porter and Rousseau track medical thinking about the disease across the centuries, from Hippocrates and Galen to Paracelsus, Harvey, Archibald Garrod in the Victorian era, and beyond. They discuss the cultural, moral, religious, and personal qualities associated with the condition, examining social commentary, personal writings, cartoons and visual arts, and literature (including novels by Dickens, Thackeray, and Joseph Conrad). Many of the quotations still ring true: “gout the bitter fruit of luxury” (Cheyne), “the seeds of this evil are frequently derived from the parents!” (Blackmore), “those most vulnerable were men with hale and athletic constitution,” who must pursue “abstinence in eating, temperance in drinking strong liquors and proper exercise.” Folklore deemed gout a disease of the better sort, a superiority tax, a celebrity complaint “fit for a man of quality.” Gout was the distemper of a gentleman whereas the rheumatism was the distemper of a hackney coachman. In addition, gout was viewed by Georgian doctors such as Walpole “as one of nature’s solutions to depravities of the humours. If not entirely expelled, peccant humours or morbific matter were exiled to far-flung parts. Hence it should be left to do its work.” Gout could be seen as a desideratum, a life insurance rather than a death sentence. While gout was in possession of the body, no truly deadly enemy such as palsy, dropsy, or apoplexy could strike. However, Walpole’s contemporary Samuel Johnson pursued quite different strategies: “That the gout is a medicine I never perceived, for when I had it most in my foot, I had the spasms in my breast. At best the gout is only a dog that drives the wolf away and eats the sheep himself, for if the gout has time for growth, it will certainly destroy and destroy by long and lingering torture.” King George IV mutinied against the royal physicians’ policy of “quieta non movere” and phlebotomy: “I have borne your half measures long enough to please you; now I shall please myself and take colchicum.” Gout has been the sign of distinction, the patrician malady to top all others. One of the aims of this book is to delve deeply into how people come to “choose” to be ill and, being ill, which sickness they select and how they sell it. In this inquiry Porter and Rousseau explore in an exemplary manner medical writings, literary texts, journals and diaries, and the visual arts. Weaving these threads together, the authors provide an enjoyable account that integrates the medical and the moral, the scientific and the humanistic, the verbal and the visual across an impressive sweep of time. Although recently published, the book’s appearance has the flavour of the 1950s, with old fashioned notes (40 pages) and an extensive bibliography and index that historians will love.

This review summarizes the recent data on lifestyle factors that influence serum uric acid levels and the risk of gout and attempts to provide holistic recommendations, considering both their impact on gout and their other health implications. Large-scale studies have clarified a number of long-suspected relations between lifestyle factors, hyperuricemia, and gout, including purine-rich foods, dairy foods, various beverages, fructose, and vitamin C supplementation. Furthermore, recent studies have identified the substantial burden of comorbidities among patients with hyperuricemia and gout. Lifestyle and dietary recommendations for gout patients should consider overall health benefits and risk, since gout is often associated with the metabolic syndrome and an increased future risk of cardiovascular disease (CVD) and mortality. Weight reduction with daily exercise and limiting intake of red meat and sugary beverages would help reduce uric acid levels, the risk of gout, insulin resistance, and comorbidities. Heavy drinking should be avoided, whereas moderate drinking, sweet fruits, and seafood intake, particularly oily fish, should be tailored to the individual, considering their anticipated health benefits against CVD. Dairy products, vegetables, nuts, legumes, fruits (less sugary ones), and whole grains are healthy choices for the comorbidities of gout and may also help prevent gout by reducing insulin resistance. Coffee and vitamin C supplementation could be considered as preventive measures as these can lower urate levels, as well as the risk of gout and some of its comorbidities.

5. Hydrick and Fox, pp. 748-749.

7. Hydrick CR and Fox IH, Nutrition and Gout, in Present Knowledge in Nutrition, Fifth Edition, The Nutrition Foundation, Washington DC, 1984, p. 743.

8. Duncan's Diseases of Metabolism, p. 638.

9. Traut EF, Rheumatic Diseases, Diagnosis and Treatment, The C.V. Mosby Company, St. Louis, 1952, p. 303.

9. Benedek, in The Cambridge World History of Human Disease.

9. Trowell HC, A Case of Gout in a Ruanda African, The East African Medical Journal, Oct. 1947, pp. 346-348.

Abstract

The prevalence of gout and the frequency distribution of serum uric acid (SUA) concentrations have been studied in four South African populations. Approximately 450 respondents over the age of 15 years were investigated in each of the following: a tribal Xhosa community in Transkei; a rural Tswana community in the northwestern Transvaal; an urban Negro population in Johannesburg; and a Caucasian community in the same city. No case of gout was encountered in any of the Negro groups, while the prevalence among the urban Caucasians was 13/1 000 men and 3/1 000 women. The mean SUA concentrations showed two consistent trends: (i) the levels rose with age in all four populations and in both sexes; (ii) they were generally higher in men than in women throughout the age range. There was, moreover, an increase in SUA concentrations with increasing sophistication of lifestyle (P < 0.01), the lowest levels occurring in the tribal Africans, and the highest in the urban communities. This latter finding could not be explained on a genetic basis, nor were there significant differences in physical configuration and nutritional status among the three Negro groups. It is suggested that hyperuricaemia, and possibly the clinical manifestations of gout, have a polygenic aetiology in which acculturation plays an important contributory role.

Background/Purpose: Recent research has suggested that renaming gout to a pathophysiological illness label (urate crystal arthritis) avoids inaccurate lay perceptions of gout and promotes more effective management strategies. In Aotearoa/New Zealand, Māori (indigenous New Zealanders) have high prevalence of gout, with early onset and severe disease. It is unknown how a change in illness label would impact on indigenous New Zealanders who are disproportionally affected by gout. The aim of this study was to examine the effect of changing the illness label of gout on the perceptions of the disease and its management in Māori in Aotearoa/New Zealand. Methods: Supermarket shoppers in rural and urban locations with large Māori communities were recruited into a study examining the perceptions of different types of arthritis. Participants were randomised 1:1 to complete a questionnaire examining the perception of the same disease description labelled as either ‘gout’ or ‘urate crystal arthritis’ (UCA). Participants rated likely causal factors for the disease, illness perceptions and the usefulness of various management strategies using Likert scales. Differences between the two illness labels were tested using independent sample t-tests. Results: Completed questionnaires were available from 172 Māori participants. The gout-labelled illness was most likely to be viewed as caused by diet (P=0.003), whereas the UCA-labelled illness was most likely to be viewed as caused by aging (P=0.001). ‘UCA’ was seen as having a wider range of factors as responsible for the illness, with stress or worry, hereditary factors, chance and pollution more likely to be viewed as causes of ‘UCA’. ‘Gout’ was less likely to be viewed as having a chronic timeline than ‘UCA’ (mean (SD) for ‘Gout’ 6.9 (2.8) and for ‘UCA’ 7.9 (2.4), P=0.013). ‘Gout’ was also viewed as better understood than ‘UCA’ (mean (SD) for ‘Gout’ 6.3 (3.1) and for ‘UCA’4.4 (3.3), P=0.001). Other illness perceptions did not differ between the illness label groups. Changing to a healthier diet was perceived as more helpful for ‘Gout’ compared to ‘UCA’ (mean (SD) for ‘Gout’ 8.5 (2.3) and for ‘UCA’ 7.3 (2.7), P=0.003). Participants also viewed stopping or restricting alcohol use as more helpful for ‘Gout’ than ‘UCA’ (mean (SD) for ‘Gout’ 8.1 (2.8) and for ‘UCA’ 7.0 (3.1), P=0.017). There were no differences between ‘Gout’ and ‘UCA’ in perceptions that adopting regular exercise, losing weight or taking long-term medications would be helpful for managing the illness (P>0.23 for all). Conclusion: In an indigenous population that is disproportionately affected by gout, causal beliefs and management strategies for a gout-labelled illness are consistent with widely-held lay beliefs that gout is a disease caused by self-inflicted dietary excess. Renaming gout to urate crystal arthritis promotes more complex causal beliefs, a longer timeline for the disease, and is likely to avoid perceptions that dietary modification and alcohol restriction are the main strategies for effective management.
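
The group comparisons above were made with independent-samples t-tests on Likert ratings. As a minimal sketch, the reported "chronic timeline" result can be approximately reconstructed from the published summary statistics; the per-group n of 86 is an assumption (172 participants randomised 1:1), and the original analysis may have used slightly different group sizes or a Welch correction.

```python
# Sketch of the independent-samples t-test reported in the abstract, rebuilt
# from summary statistics for the "chronic timeline" item:
# 'Gout' label: mean 6.9, SD 2.8; 'UCA' label: mean 7.9, SD 2.4.
# Group sizes of 86 each are an assumption based on the 1:1 randomisation.
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=6.9, std1=2.8, nobs1=86,   # 'Gout' label group
    mean2=7.9, std2=2.4, nobs2=86,   # 'UCA' (urate crystal arthritis) label group
    equal_var=True,                  # classic Student's t-test; Welch is also plausible
)
print(f"t = {t:.2f}, p = {p:.3f}")   # close to the reported P = 0.013
```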

12. Duncan's Diseases of Metabolism, 1947, p. 631.

13. Gertler MM, et al., Serum Uric Acid in Relation to Age and Physique in Health and in Coronary Heart Disease, Ann Intern Med. 1951 Jun;34(6):1421-31. Reiser S, Uric Acid and Lactic Acid, in Reiser S and Hallfrisch J, Metabolic Effects of Fructose, CRC Press, Boca Raton FL, 1987, pp. 113-134.


14. Duncan's Diseases of Metabolism, p. 631.

14. Reaven GM, The Kidney: An Unwilling Accomplice in Syndrome X, Am J Kidney Dis, Vol. 30, No. 6, December 1997, pp. 928-931.

15. Facchini F, et al., Relationship Between Resistance to Insulin-Mediated Glucose Uptake, Urinary Uric Acid Clearance, and Plasma Uric Acid Concentration, JAMA, December 4, 1991, Vol. 266, No. 21, pp. 3008-3011.

Abstract

The concept of an abnormality of glutamine metabolism in primary gout was first proposed on the basis of isotope data: when [15N]glycine was administered to gouty subjects, there was disproportionately great enrichment of N-(3 + 9) of uric acid, which derive from the amide-N of glutamine. An unduly high concentration of 15N in glutamine was postulated, and attributed to a hypothetical defect in catabolism of glutamine. Excess glutamine was proposed as the driving force of uric acid overproduction.

We have reexamined this proposition in four gouty subjects: one mild overproducer of uric acid with “idiopathic gout,” one marked overproducer with high-grade but “partial” hypoxanthine-guanine phosphoribosyltransferase deficiency, and two extraordinary overproducers with superactive phosphoribosylpyrophosphate synthetases. In the last three, the driving force of excessive purine biosynthesis is a known surplus of α-5-phosphoribosyl-1-pyrophosphate. Disproportionately high labeling of N-(3 + 9) was present in all four gouty subjects, most marked in the most flamboyant overproducers. The precursor glycine pool was sampled by periodic administration of benzoic acid and isolation of urinary hippuric acid. Similarly, the precursor glutamine pool was sampled by periodic administration of phenylacetic acid and isolation of the amide-N of urinary phenylacetylglutamine. The time course of 15N enrichment of hippurate differed from that of the amide-N of glutamine. Whereas initial enrichment values of hippurate were very high, those of glutamine-amide-N were low, increasing to a maximum at about 3 h, and then declining less rapidly than those of hippurate. However, enrichment values of hippurate and of phenylacetylglutamine were normal in all of the gouty subjects studied. Thus, preferential enrichment of N-(3 + 9) in gouty overproducers given [15N]glycine does not necessarily reflect a specific abnormality of glutamine metabolism, but rather appears to be a kinetic phenomenon associated with accelerated purine biosynthesis per se.

In addition, greater enrichment of N-9 than of N-3 on days 1 and 2 provided suggestive evidence for a second pathway for synthesis of the initial precursor of purine biosynthesis, phosphoribosylamine, perhaps utilizing ammonia rather than the amide-N of glutamine as nitrogen donor. In this limited study, the activity of this potential second pathway did not appear to be selectively increased in gout.

A fascinating, persuasive history of how sugar has shaped the world, from European colonies to our modern diets. In this eye-opening study, Sidney Mintz shows how Europeans and Americans transformed sugar from a rare foreign luxury to a commonplace necessity of modern life, and how it changed the history of capitalism and industry. He discusses the production and consumption of sugar, and reveals how closely interwoven are sugar’s origins as a “slave” crop grown in Europe’s tropical colonies with its use first as an extravagant luxury for the aristocracy, then as a staple of the diet of the new industrial proletariat. Finally, he considers how sugar has altered work patterns, eating habits, and our diet in modern times. “Like sugar, Mintz is persuasive, and his detailed history is a real treat.” -San Francisco Chronicle

Significance

Human susceptibility to gout is driven by the fact that we have a pseudogene for uricase that prevents a functional enzyme from being produced. Our inability to convert highly insoluble uric acid into a more soluble molecule makes us vulnerable to disease and other health complications. We have exploited ancestral sequence reconstruction to better understand how and why apes lost this functional enzyme. Our ancient proteins support one hypothesis that the progressive loss of uricase activity allowed our ancestors to readily accumulate fat via the metabolism of fructose from fruits. This adaptation may have provided our ancestors with an advantage when the energy-rich rainforests of Europe and Asia were displaced by temperate forests by the end of the Oligocene.

Abstract

Uricase is an enzyme involved in purine catabolism and is found in all three domains of life. Curiously, uricase is not functional in some organisms despite its role in converting highly insoluble uric acid into 5-hydroxyisourate. Of particular interest is the observation that apes, including humans, cannot oxidize uric acid, and it appears that multiple, independent evolutionary events led to the silencing or pseudogenization of the uricase gene in ancestral apes. Various arguments have been made to suggest why natural selection would allow the accumulation of uric acid despite the physiological consequences of crystallized monosodium urate acutely causing liver/kidney damage or chronically causing gout. We have applied evolutionary models to understand the history of primate uricases by resurrecting ancestral mammalian intermediates before the pseudogenization events of this gene family. Resurrected proteins reveal that ancestral uricases have steadily decreased in activity since the last common ancestor of mammals gave rise to descendent primate lineages. We were also able to determine the 3D distribution of amino acid replacements as they accumulated during evolutionary history by crystallizing a mammalian uricase protein. Further, ancient and modern uricases were stably transfected into HepG2 liver cells to test one hypothesis that uricase pseudogenization allowed ancient frugivorous apes to rapidly convert fructose into fat. Finally, pharmacokinetics of an ancient uricase injected in rodents suggest that our integrated approach provides the foundation for an evolutionarily-engineered enzyme capable of treating gout and preventing tumor lysis syndrome in human patients.

Gout—a disease of red, painful, swollen joints—has an unfair reputation as a disease that only affects the wealthy after a lifetime of overindulgence. In reality, it’s the legacy of evolutionary changes that took place more than 20 million years ago, which we’re still paying for now. Gout was once called the “king of diseases and the disease of kings”. It could equally be the “disease of apes”.

The substance responsible for the condition is uric acid, which is normally expelled by our kidneys, via urine. But if there’s too much uric acid in our blood, it doesn’t dissolve properly and forms large insoluble crystals that build up in our joints. That explains the painful swellings. High levels of uric acid have also been linked to obesity, diabetes, and diseases of the heart, liver and kidneys.

Most other mammals don’t have this problem. In their bodies, an enzyme called uricase converts uric acid into other substances that can be more easily excreted. Uricase is an ancient invention, one that’s shared by bacteria and animals alike. But for some reason, apes have abandoned it. Our uricase gene has mutations that stop us from making the enzyme at all. It’s a “pseudogene”—the biological version of a corrupted computer file. And it’s the reason that our blood contains 3 to 10 times more uric acid than that of other mammals, predisposing us to gout.

How did it come to this? Why did we do away with such an important enzyme? And when? To find out, a group of scientists led by Eric Gaucher at the Georgia Institute of Technology resurrected long-gone editions of uricase that haven’t been seen for millions of years. Team members James Kratzer, Miguel Lanaspa and Michael Murphy compared the uricases in modern mammals to infer the sequences of ancestral varieties. “It’s like what a historical linguist does, when they study modern languages and try to understand how ancient ones were pronounced,” says Gaucher. Then, the team actually built these ancient enzymes in their labs, and compared their ability to process uric acid.

The oldest version, which was wielded by the last common ancestor of all mammals 90 million years ago, was the most active one. It outperformed all its modern descendants. Since that ancient heyday, things have gradually gone downhill. Throughout mammalian evolution, and especially during primate evolution, uricase has picked up mutations that have made for progressively less efficient enzymes. In the last common ancestor of all apes, uricase had already been hobbled to the point of near-uselessness. The ape-specific mutations that turned our uricases into broken pseudogenes merely disabled something that was already FUBARed to begin with.

Why this slow, creeping decline? Gaucher suspects that the answer involves fruit. The biggest drops in uricase’s efficiency coincided with a time when the Earth’s climate was cooling. The ancient fruit-eating primates of Europe and Asia faced a glut of food in the summer, but risked starving in winter when fruit was unavailable. Here’s where uric acid comes in. Our cells produce the stuff when they break down fructose, the main sugar in fruit. In turn, uric acid stimulates the build-up of fat—a process that uricase counters. Indeed, when Gaucher’s team dosed human cells with the ancient, efficient uricases, they became less good at making fat when exposed to fructose. But with later inactive uricases, they produced a substantial amount of fat.
So, disable uricase and you risk building up high levels of uric acid, but you also become a champion at turning fruit into fat. For ancient primates facing an increasingly seasonal food supply, that trade-off may have been worth it. It’s a nice story, although it only explains the final act of uricase’s downfall. Other factors almost certainly played a role in the enzyme’s gradual decline. For example, Michael Hershfield from Duke University notes that early primates lived in rainforests, had easy access to water, and could make a lot of urine—all the better for getting rid of surplus uric acid. He speculates that these conditions might have reduced the need for uricase enough to allow the enzyme to accumulate disabling mutations.

Still, Gaucher’s results provide some support for an old but unproven idea called the thrifty gene hypothesis. Proposed in 1962, it says that humans have genes that suited our ancestors during times of scarce food, but predispose us to diabetes and obesity in the modern age of free-flowing calories. Uricase is the first good example. Our broken version may have helped our primate ancestors to thrive but it leaves us prone to gout and other illnesses linked to uric acid, whose rates have soared in recent years.

The team’s resurrected enzymes may be able to help with that too.

For over 20 years, pharmaceutical companies have tried to develop treatments for gout by using working versions of uricase from other mammals. But you can’t simply inject a pig uricase into a human patient—our immune reaction would go nuts in the presence of such a foreign enzyme.

Hershfield’s group developed a workaround by fusing the pig uricase with the baboon version. The pig bit does the heavy metabolic lifting, and the baboon bit cloaks it from our immune system. In 2010, the US Food and Drug Administration approved this chimeric enzyme, known as Krystexxa, for treating severe chronic gout.

Gaucher thinks that we can find better solutions by looking to the past. His team found that the oldest of their resurrected enzymes is both more efficient than the raw pig-baboon chimera, and lasts longer in rats. And despite its ancient nature, it’s a closer match for human uricase than even the baboon version, so it might be even less provocative to the immune system. The team have now filed a patent for the ancient uricases and formed a start-up company to turn them into an actual drug.

Hershfield anticipates bumps along the way. “It took about 17 years from the time I conceived of developing a recombinant uricase for treating refractory gout to the time it received FDA approval,” he says. “I wish [them] success, but I suspect I may not be around to witness approval of their drug.” Gaucher counters that the existing drug has already paved the way for FDA approval: “We’ll either jump through fewer hoops or we won’t have to jump as high.”

He undoubtedly has a long way to go, but it’s an enticing notion that an enzyme that hasn’t been around since the dinosaurs ruled the world might help gout sufferers in the future. As Belinda Chang from the University of Toronto says in a related commentary, “We are all prisoners of our history, but perhaps we can find better solutions for the future by learning from the past.”

Abstract

The hyperuricemia responsible for the development of gouty arthritis results from a wide range of environmental factors and underlying genetically determined aberrations of metabolism. 31P magnetic resonance spectroscopy studies of children with hereditary fructose intolerance revealed a readily detectable rise in phosphomonoesters with a marked fall in inorganic phosphate in their liver in vivo and a rise in serum urate in response to very low doses of oral fructose. Parents and some family members heterozygous for this enzyme deficiency showed a similar pattern when given a substantially larger dose of fructose. Three of the nine heterozygotes thus identified also had clinical gout, suggesting the possibility of this defect being a fairly common cause of gout. In the present study this same noninvasive technology was used to identify the same spectral pattern in 2 of the 11 families studied with hereditary gout. In one family, the index patient’s three brothers and his mother all showed the fructose-induced abnormality of metabolism, in agreement with the maternal inheritance of the gout in this family group. The test dose of fructose used produced a significantly larger increment in the concentration of serum urate in the patients showing the changes in 31P magnetic resonance spectra than in the other patients with familial gout or in nonaffected members, thus suggesting a simpler method for initial screening for the defect.

Abstract

A marked increase in gout was observed in England during the 17th to 20th centuries. Many have ascribed this rapid increase in gout to the introduction of wines that were laced with lead. In this article, we suggest another likely contributor, which is the marked increase in sugar intake that occurred in England during this period. Sugar contains fructose, which raises uric acid and increases the risk for gout. Sugar intake increased markedly during this period due to its introduction in liquors, tea, coffee and desserts. We suggest that the introduction of sugar explains why gout was originally a disease of the wealthy and educated, but gradually became common throughout society.

Sugar increases serum uric acid levels and increases the risk for gout

Sugar, or sucrose, is a disaccharide containing fructose and glucose. Upon ingestion, the enzyme sucrase, which is present in the small intestine, converts sucrose to monosaccharides, which are then absorbed. The metabolism of glucose and fructose is similar except for the first few enzymic steps, and this is key in our understanding of how sugar raises uric acid. When glucose is metabolized by glucokinase, the initial phosphorylation of glucose is carefully regulated, and adenosine triphosphate (ATP) levels are always maintained in the cell. In contrast, the metabolism of fructose by fructokinase C, the principal isoform of fructokinase, results in the rapid phosphorylation of fructose to fructose-1-phosphate, resulting in transient depletion of intracellular phosphate and ATP. As intracellular phosphate falls, the enzyme adenosine monophosphate deaminase is activated, resulting in the shunting of adenosine monophosphate to inosine monophosphate, inosine, hypoxanthine and eventually uric acid [11]. The rise in intracellular uric acid is marked, and results in a spillage into the circulation and a rapid rise in uric acid levels. The ability of fructose to acutely raise uric acid levels has been so consistent that it has been developed as a test (fructose tolerance test), as some individuals, such as those with gout and hypertension, show a greater increase than that observed in the otherwise healthy individual [12-14]. In addition, chronic ingestion of fructose also turns on the de novo synthesis of uric acid [15]. Consistent with these findings, subjects chronically fed sugar (or fructose) show an increase in fasting serum uric acid levels [16, 17]. Epidemiological studies have linked increased intake of sugary soft drinks with elevated serum uric acid levels and increased risk for gout [18, 19]. In contrast, the relationship of fruits (which also contain fructose) with gout is less strong, perhaps because many fruits are rich in vitamin C (ascorbate), which lowers serum uric acid and stimulates urate excretion [20, 21]. Nevertheless, physicians such as Sir William Osler felt that restricting fruit intake was critical in the gouty patient to prevent recurrent arthritic attacks [22].
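
As a rough schematic of the mechanism described above (a simplified sketch: cofactors and minor intermediates are omitted, and the final hypoxanthine-to-uric-acid steps proceed via xanthine oxidase):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Simplified schematic of the fructose-to-uric-acid pathway described in the
% text above (cofactors and minor intermediates omitted).
\begin{align*}
\text{fructose} + \text{ATP}
  &\xrightarrow{\ \text{fructokinase C}\ } \text{fructose-1-phosphate} + \text{ADP} \\
\text{falling ATP and } \mathrm{P_i}
  &\;\Longrightarrow\; \text{AMP deaminase is activated} \\
\text{AMP}
  &\xrightarrow{\ \text{AMP deaminase}\ } \text{IMP}
  \rightarrow \text{inosine}
  \rightarrow \text{hypoxanthine}
  \rightarrow \text{xanthine}
  \rightarrow \text{uric acid}
\end{align*}
\end{document}
```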

Sugar consumption increases in Shakespearean England Sugarcane was originally grown in India in the Ganges River Valley, where it was harvested as early as 500 BC. FIG. 1 The introduction of gout. A painting from 1818 by George Cruikshank shows a wealthy man who is eating fruits and drinking wine and is about to be introduced to the pleasures of gout. Note the eruption of Mount Vesuvius in the background as a premonition for this special event. Nevertheless, sugar remained largely unknown to Europe until the Middle Ages, when some of the earliest reports came from the Crusaders fighting in the Holy Lands. By approximately 1000 AD, however, sugar was entering Europe from Egypt and Ceylon primarily via the port of Venice [23]. At first sugar was so expensive that it was available only in small amounts and was primarily used as a medicine. For those who were rich, however, sugar became a special, sought-after food. Many kings loved sugar. For example, court records showed that King Edward I of England ordered 1877 pounds of sugar in 1287 and 6258 pounds of sugar in 1288 [24]. Saint Thomas Aquinas of Italy declared that sugar, while being a medicine, could be eaten during the fast [25]. Aquinas went on to become extremely fat himself. Beginning in the 1400s sugar production was occurring in southern Spain, as well as by the Portuguese in the Madeira Islands and Sao Tome off the coast of Africa [26]. The discovery of Hispaniola by Columbus and of Brazil by Cabral led to a further expansion of the sugar industry. By 1505 the first boat carrying slaves from Africa arrived in the Americas, primarily to work in the sugarcane fields, and by 1516 the first boats loaded with sugar were returning to England [24]. A triangle of trade developed in which slaves were sent from Africa to the Americas to produce sugar that would then be sent to Europe, followed by the shipping of manufactured goods to Africa. During the next three centuries between 10 and 20 million African slaves were brought to the Americas, creating one of the darkest periods of history. While Spain and Portugal initially received much of the imported sugar, these countries largely lost control of the sugar trade to England during the 1500s. There were likely multiple reasons, including the rebellion (the Dutch Revolt) of the Low Countries from Spain in the 1560s and the loss of the Spanish Armada to England in 1588. By 1544 England had also established its first sugar houses in London [23]. This was fuelled by increasing shipments of sugar to England from Barbados and other islands in the West Indies. England imported 1200 barrels (termed hogsheads) of sugar in 1660, 50 000 barrels in 1700 and 11 000 barrels in 1753 [24]. The only other country that was importing sugar to the same extent was Holland, due largely to the formation of the Dutch East Indies Company in 1602. The Dutch brought sugar to Amsterdam from the East Indies. One of the major sources was Java, where the number of sugar mills increased from 20 in 1650 to 130 in 1710. Sugar becomes increasingly available to the English As sugar imports increased, sugar prices fell and it became increasingly available to the common man. The intake of sugar in England increased 5-fold between 1700 and 1800, whereas it remained unchanged in France (Fig. 2) [27]. Per capita intake of sugar increased from an average of 4 pounds/year in 1700, to 18 pounds in 1800 and to 90 pounds in 1900 [28]. 
While part of this increase was driven by increasing sugar imports, the tax on sugar was also stepwise reduced until it was completely eliminated in 1874. One of the first uses of sugar was in drinks, particularly sweet wines and sweet ciders (Fig. 3). Some wines, such as those from Portugal and the Madeira Islands, were often rich in sugar content. Spain and the Canary Islands were also famous for producing sack, in which spirits from grapes, sugar beets or sugarcane were added to the wine. If the spirits were added before fermentation was complete, then the wine would have a high residual sugar content. Sack was imported to England in relatively large amounts beginning around 1517 and was often combined with additional sugar to make the famous sack and sugar drink. Sack became a common drink available in taverns in England by the late 1500s, where it could cost as little as 2 shillings a gallon. Falstaff, a character in several of Shakespeare's plays, took the nickname of Sack and Sugar because of his love for this drink and, perhaps not surprisingly, developed gout in the play Henry IV (Fig. 4). Another common drink was hippocras, in which sugar, cinnamon and nutmeg were added to wine. By the early 1630s, punch became a common drink in which initially wine and brandy, and later Jamaican rum, were combined with sugar, lemon and spices. Rum is derived from sugarcane and became the official drink of the British Navy following the fall of Kingston, Jamaica, to the British in 1655. Sugar was also commonly used in foods (sweetmeats) and desserts (such as sack posset, a custard-like dessert made from cream, eggs and wine). One of the more common uses of sugar, however, was to add it to tea or coffee. Interestingly, the habit of adding sugar to tea or coffee was introduced in England in the mid-1600s, and by the turn of that century the first teahouses and coffee houses were present in London [27, 29].

Background/Purpose The conventional low-purine dietary approach to gout offers limited efficacy, palatability, and sustainability, and promotes increased consumption of refined carbohydrates and saturated fat that can actually worsen gout’s cardiovascular (CV)-metabolic comorbidities. In contrast, effective dietary approaches to reduce CV-metabolic conditions (including obesity) could also lower serum uric acid (SUA) levels by lowering adiposity and insulin resistance. Similarly, high-protein, low-carbohydrate diets such as the Atkins diet may lower SUA despite substantial purine loading and ketogenesis. Indeed, a small study (n=13) that employed a high-protein diet with reduced calories found that mean SUA levels decreased from 9.6 to 7.9 mg/dL, with reduced gout attacks over 16 weeks (Ann Rheum Dis 2000). Additional benefits included an improved lipid profile. We investigated the SUA response to the Atkins diet among overweight or obese individuals over a 6 month period.

Methods Our study population was derived from the Dietary Intervention Randomized Controlled Trial (DIRECT) of overweight or obese participants (BMI ≥ 27). The Atkins diet (i.e., high-protein, low-carbohydrate, no calorie restriction) was one of DIRECT's intervention groups and was the focus of the current analysis. We used serum samples stored at -80°C to compare SUA levels at baseline and 6 months among 74 participants with complete datasets, and analyzed the SUA response as well as changes in lipid profile, weight, fasting insulin levels, and glucose levels.

Results The mean age was 51 years and the mean BMI was 31. Most participants (91%) were men. The overall rate of adherence to the diets in DIRECT was > 95% during our 6-month study period. Baseline SUA level was 6.0 mg/dL and the overall SUA change at 6 months was -0.8 mg/dL. This change varied substantially according to baseline characteristics (Table), particularly baseline SUA levels. Individuals (N=18) with SUA levels > 7 mg/dL (above the saturation point) showed a decrease in mean SUA levels from 7.9 to 5.5 mg/dL (p < .0001). Of the 18, 11 (61%) reached SUA < 6 mg/dL (the usual anti-gout SUA therapeutic target) and 6 (33%) reached SUA < 5 mg/dL (the SUA therapeutic target for advanced gout) (Figure). Those with obesity and younger individuals (<50 years) tended to have a larger SUA decline. Additional benefits included significant improvements in HDL-cholesterol, total cholesterol/HDL-C ratio, triglyceride levels, and fasting insulin levels (p < .0001).

Conclusion Our findings suggest that the Atkins diet (i.e., a high protein diet without calorie restriction) can reduce SUA levels despite substantial purine loading. This effect may be more pronounced and clinically meaningful among those with hyperuricemia or obesity. Comparative effectiveness research with other proven CV-metabolic diets would help determine the optimal dietary approach to lower SUA levels.

Abstract

Introduction:

Circulating concentrations of uric acid may be affected by dietary components such as meat, fish and dairy products, but only a few studies have compared uric acid concentrations among individuals who exclude some or all of these foods from their diet. The aim of this study was to investigate differences in serum uric acid concentrations between meat eaters, fish eaters, vegetarians and vegans.

Subjects and Methods:

A sample of 670 men and 1,023 women (424 meat eaters, 425 fish eaters, 422 vegetarians and 422 vegans, matched on age and sex) from the European Prospective Investigation into Cancer and Nutrition Oxford cohort were included in this cross-sectional analysis. Diet was assessed using a semi-quantitative food frequency questionnaire and serum concentrations of uric acid were measured. Mean concentrations of uric acid by diet group were calculated after adjusting for age, body mass index, calcium and alcohol intake.

Results:

In both men and women, serum uric acid concentrations differed significantly by diet group (p < 0.0001 and p = 0.01, respectively). The differences between diet groups were most pronounced in men; vegans had the highest concentration (340, 95% confidence interval 329–351 µmol/l), followed by meat eaters (315, 306–324 µmol/l), fish eaters (309, 300–318 µmol/l) and vegetarians (303, 294–312 µmol/l). In women, serum uric acid concentrations were slightly higher in vegans (241, 234–247 µmol/l) than in meat eaters (237, 231–242 µmol/l) and lower in vegetarians (230, 224–236 µmol/l) and fish eaters (227, 221–233 µmol/l).

Conclusion:

Individuals consuming a vegan diet had the highest serum concentrations of uric acid compared to meat eaters, fish eaters and vegetarians, especially in men. Vegetarians and individuals who eat fish but not meat had the lowest concentrations of serum uric acid.
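This abstract reports serum uric acid in µmol/l, while the DIRECT analysis above uses mg/dL, so a quick conversion helps when comparing the two. The helper below is not from either study; it assumes only the standard molar mass of uric acid (about 168.1 g/mol), which works out to roughly 59.5 µmol/l per 1 mg/dL.

```python
# Convert serum uric acid between the two unit systems used in the
# abstracts above. Assumes the molar mass of uric acid, ~168.11 g/mol:
# 1 mg/dL = 10 mg/L = 10 / 168.11 mmol/L ~= 59.5 umol/L.
URIC_ACID_G_PER_MOL = 168.11
UMOL_PER_MG_DL = 10_000 / URIC_ACID_G_PER_MOL  # ~59.5

def umol_l_to_mg_dl(umol_l: float) -> float:
    return umol_l / UMOL_PER_MG_DL

def mg_dl_to_umol_l(mg_dl: float) -> float:
    return mg_dl * UMOL_PER_MG_DL

print(round(umol_l_to_mg_dl(340), 1))  # vegan men above: ~5.7 mg/dL
print(round(mg_dl_to_umol_l(7.0)))     # ~7 mg/dL saturation point: ~416 umol/L
```

By that conversion, the 340 µmol/l reported for vegan men corresponds to roughly 5.7 mg/dL, still below the roughly 7 mg/dL saturation threshold cited in the Atkins analysis above.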

Abstract Most of the metabolic effects of fructose are due to its rapid utilization by the liver and its by-passing of the phosphofructokinase regulatory step in glycolysis, leading to far-reaching consequences for carbohydrate and lipid metabolism. These consequences include immediate hepatic increases in pyruvate and lactate production, activation of pyruvate dehydrogenase, and a shift in balance from oxidation to esterification of nonesterified fatty acids, resulting in increased secretion of very-low-density lipoprotein (VLDL). These effects are augmented by long-term absorption of fructose, which causes enzyme adaptations that increase lipogenesis and VLDL secretion, leading to triglyceridemia, decreased glucose tolerance, and hyperinsulinemia. Acute loading of the liver with fructose causes sequestration of inorganic phosphate in fructose-1-phosphate and diminished ATP synthesis. Consequently, the inhibition by ATP of the enzymes of adenine nucleotide degradation is removed and uric acid formation accelerates, with consequent hyperuricemia. These effects are of particular significance to potentially hypertriglyceridemic or hyperuricemic individuals.

Abstract

Aging and lipotoxicity are two major risk factors for gout that are linked by the activation of the NLRP3 inflammasome. Neutrophil-mediated production of interleukin-1β (IL-1β) drives gouty flares that cause joint destruction, intense pain, and fever. However, the metabolites that impact the neutrophil inflammasome remain unknown. Here, we identified that a ketogenic diet (KD) increases β-hydroxybutyrate (BHB) and alleviates urate crystal-induced gout without impairing immune defense against bacterial infection. BHB inhibited the NLRP3 inflammasome in S100A9 fibril-primed and urate crystal-activated macrophages, which serve to recruit inflammatory neutrophils into joints. Consistent with reduced gouty flares in rats fed a ketogenic diet, BHB blocked IL-1β in neutrophils in an NLRP3-dependent manner in mice and humans irrespective of age. Mechanistically, BHB inhibited the NLRP3 inflammasome in neutrophils by reducing both the priming and assembly steps. Collectively, our studies show that BHB, a known alternative metabolic fuel, is also an anti-inflammatory molecule that may serve as a treatment for gout.

Highlights

Men who consume two or more sugary soft drinks a day have an 85% higher risk of gout compared with those who drink less than one a month, a study suggests. Cases in the US have doubled in recent decades and it seems fructose, a type of sugar, may be to blame, the British Medical Journal study reports. UK experts said those with gout would be advised to cut out sugary drinks. Dr Hyon Choi, from the University of British Columbia in Vancouver, said dietary advice for gout had focused on restricting purine-rich foods, such as red meat and beer. He said practitioners should advise patients with gout to reduce their fructose intake.

Abstract The effects of ketone body metabolism suggest that mild ketosis may offer therapeutic potential in a variety of different common and rare disease states. These inferences follow directly from the metabolic effects of ketosis and the higher inherent energy present in d-beta-hydroxybutyrate relative to pyruvate, the normal mitochondrial fuel produced by glycolysis, leading to an increase in the ΔG' of ATP hydrolysis. The large categories of disease for which ketones may have therapeutic effects are: (1) diseases of substrate insufficiency or insulin resistance, (2) diseases resulting from free radical damage, and (3) diseases resulting from hypoxia. Current ketogenic diets are all characterized by elevations of free fatty acids, which may lead to metabolic inefficiency by activation of the PPAR system and its associated mitochondrial uncoupling proteins. New diets comprised of ketone bodies themselves or their esters may obviate this present difficulty.

Abstract BACKGROUND: The association between alcohol consumption and risk of gout has been suspected since ancient times, but has not been prospectively confirmed. Additionally, potential differences in risk of gout posed by different alcoholic beverages have not been assessed.

METHODS: Over 12 years (1986-98) we used biennial questionnaires to investigate the relation between alcohol consumption and risk of incident gout in 47,150 male participants with no history of gout at baseline. We used a supplementary questionnaire to ascertain whether reported cases of gout met the American College of Rheumatology survey gout criteria.

FINDINGS: We documented 730 confirmed incident cases of gout. Compared with men who did not drink alcohol, the multivariate relative risk (RR) of gout was 1.32 (95% CI 0.99-1.75) for alcohol consumption of 10.0-14.9 g/day, 1.49 (1.14-1.94) for 15.0-29.9 g/day, 1.96 (1.48-2.60) for 30.0-49.9 g/day, and 2.53 (1.73-3.70) for ≥50 g/day (p for trend <0.0001). Beer consumption showed the strongest independent association with the risk of gout (multivariate RR per 12-oz serving per day 1.49; 95% CI 1.32-1.70). Consumption of spirits was also significantly associated with gout (multivariate RR per drink or shot per day 1.15; 95% CI 1.04-1.28); however, wine consumption was not (multivariate RR per 4-oz serving per day 1.04; 95% CI 0.88-1.22).

INTERPRETATION: Alcohol intake is strongly associated with an increased risk of gout. This risk varies substantially according to type of alcoholic beverage: beer confers a larger risk than spirits, whereas moderate wine drinking does not increase the risk.

r/Ketoscience Interpretation: The more carbohydrate an alcoholic drink has, the worse it is for you.
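A rough serving-size calculation helps relate the per-serving relative risks to the g/day exposure categories above. The snippet below is illustrative only: the ABV figures are typical assumptions rather than values from the study, and the only fixed inputs are the density of ethanol (about 0.789 g/mL) and the volume of a US fluid ounce.

```python
# Approximate grams of ethanol per serving for the beverage sizes cited above.
# ABV values are typical assumptions, not from the Lancet study.
ETHANOL_DENSITY_G_PER_ML = 0.789
ML_PER_US_FL_OZ = 29.57

def ethanol_grams(volume_oz: float, abv: float) -> float:
    return volume_oz * ML_PER_US_FL_OZ * abv * ETHANOL_DENSITY_G_PER_ML

print(round(ethanol_grams(12, 0.05), 1))   # 12-oz beer at ~5% ABV   -> ~14 g
print(round(ethanol_grams(4, 0.12), 1))    # 4-oz wine at ~12% ABV   -> ~11 g
print(round(ethanol_grams(1.5, 0.40), 1))  # 1.5-oz spirits at 40%   -> ~14 g
```

Since a standard serving of beer, wine or spirits delivers a broadly similar ethanol dose, the gap between the per-serving estimates for beer (RR 1.49) and wine (RR 1.04) suggests that beverage constituents other than ethanol, consistent with the carbohydrate point above, contribute to the difference.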

Low Carb Bloggers

Peter Dobromylskyj

Amy Berger

Dr. Andreas Eenfeldt, MD

Burton Abrams

Dr. Barry Groves

Stephen Phinney and Jeff Volek

Prof Richard Johnson

Mark Sisson

Chris Kresser

Created by Travis Statham @travis_statham Jan 15, 2019