Here Are the New Drugs and Treatments We Could See in 2024

2023 was a strong year for innovative new drugs, bringing new medications for Alzheimer’s disease and weight loss, as well as the first treatment based on the gene-editing technology CRISPR.

But 2024 is also shaping up to be a milestone year for some exciting therapies. Here’s what to expect.

Another new Alzheimer’s drug

Eli Lilly could debut a new treatment for Alzheimer’s disease that targets amyloid, the protein that builds up in the brains of patients. In studies the company has submitted to the U.S. Food and Drug Administration (FDA) for approval, people receiving the drug showed 35% slower decline on cognitive tests than those getting a placebo, and 40% less decline in their ability to perform daily activities such as driving or holding conversations. That’s slightly higher efficacy than the existing medications for the neurodegenerative disease, and experts hope that if patients start taking it early enough, they might be able to hold off the worst effects of memory loss and cognitive decline for a few years. The FDA is expected to make a decision about the drug in early 2024.

Innovative blood-disorder treatments

After approving the first CRISPR treatment, Casgevy, for sickle cell anemia, the FDA is reviewing the same therapy for another genetic blood disorder called beta thalassemia. In both conditions, people have abnormal blood cells that can’t carry enough oxygen, which leads to painful attacks and frequent blood transfusions and hospitalizations. The gene-editing therapy is a one-time treatment that allows people to make more healthy blood cells, which can reduce the number of painful episodes. U.K. health authorities have granted Casgevy conditional marketing authorization, and the FDA is expected to decide in March whether to approve the treatment for beta thalassemia.

The agency is also considering other gene-based treatments that don’t use CRISPR but rely on more traditional virus-based methods. One is for hemophilia B. Patients with the condition experience moderate-to-severe bleeding episodes because they lack a coagulation factor that the gene therapy provides. In studies by the manufacturer, Pfizer, the therapy lowered the risk of annual bleeding among a few dozen men who tested it by 71%. The FDA is expected to make a decision about the treatment in the second quarter of 2024.

A novel schizophrenia drug

Later in the year, the FDA will review a new drug treatment for schizophrenia—the first for the psychiatric condition in decades. Karuna Therapeutics’ drug departs from existing antipsychotics, which focus on dopamine, by targeting a different brain chemical. In a study of a couple hundred people with schizophrenia, the drug, which works on the muscarinic receptors in the brain involved in regulating positive and negative thoughts, helped to reduce the extremes of symptoms typical of the condition. If approved, it could help more people with schizophrenia relieve their worst symptoms, since many stop taking the existing medications because of their side effects.

New year, same old questions of access

As exciting as the possible new medications are, they also raise questions about affordability and accessibility. Innovative drug treatments involving gene therapy and CRISPR, for example, are designed to be one-time treatments that can mitigate the need for repeated and often lifelong medical care. But that means higher upfront costs, and it’s not clear whether insurers will cover such hefty price tags.

As more therapies reach the market, however, they could change the reimbursement structure as insurers will likely feel increasing pressure to cover treatments that could be not just life-changing but also potentially curative—and save millions in long-term health care costs.

Wearing Hearing Aids May Help You Live Longer

Hearing loss has huge consequences, but you might not know that based on how infrequently it’s treated. Roughly a third of adults who develop hearing loss report never seeing a physician about it—in part because of stigma and fear surrounding aging, disability, and hearing aids.

Dr. Janet Choi, an otolaryngologist at Keck Medicine of USC who herself wears a hearing aid, wanted to know how the devices might affect a person’s long-term health. Previous research has shown that adults with untreated hearing loss have shorter lifespans than those without any hearing loss—but what, she wondered, about those who use hearing aids? In her new study, published Jan. 3 in The Lancet Healthy Longevity, Choi and her co-authors found that regular use of hearing aids is associated with a 24% reduction in mortality among adults with hearing loss.

Choi analyzed federal data from more than 10,000 adults with hearing loss who were age 20 or older. All of them were surveyed by the U.S. Centers for Disease Control and Prevention between 1999 and 2012 about their hearing loss and hearing aid use. Mortality data on the group were also analyzed. Choi and her team were surprised by how strong the link was between hearing aid use and longevity, especially considering that her definition of “regular” hearing aid users included those who wore their device for as little as five hours per week. “The results were significant even after adjusting for age and severity of hearing loss, socioeconomic status, and other medical conditions,” she says.

Read More: Hearing Aids Are Now Available Over the Counter. What to Know Before Buying One

The finding was also surprising because researchers don’t yet know exactly how or why hearing loss might shorten lifespan. But several factors likely contribute. Those who are born with hearing loss or who lose hearing before learning language don’t experience the same negative health outcomes as those who lose hearing later in life, which suggests that health might suffer when the ability to communicate does, Choi says.

“Social isolation, depression, anxiety, and dementia have all been associated with hearing loss,” Choi says. While each of these factors has also been tied to mortality, the specific way that they add to the burden of hearing loss isn’t as well understood. A recent line of research has also tied hearing loss to structural changes, atrophy, and tissue loss in parts of the brain, particularly those related to auditory processing. In other words, people who had previously been able to hear well may suffer not only from the loss of their ability to communicate and engage with the world around them, but also from the loss of ambient noise more generally.

“The question we really don’t know is when you actually use hearing aids, does that have a protective impact, or does it actually restore any of these brain structures?” Choi says. By better understanding the benefits of hearing aids, she hopes that more people will become open to wearing them. This includes both younger people like herself—despite lifelong left-ear hearing loss, Choi didn’t begin wearing an aid until her 30s—and those who are a part of older populations already at risk for social isolation, like the quarter of adults over 60 who experience what the World Health Organization classifies as “disabling” hearing loss. 

“I really want to encourage any people experiencing hearing difficulties to seek care,” Choi says. “I’ve tried at least three different hearing aids. But when I found the one that really fit me and that I liked, I was surprised by the sound that I was missing.”

How Food Can Improve Your Mood, According to Nutritional Psychiatrists

You are what you eat, as the saying goes. But it might be more accurate to say you feel how you eat, since the burgeoning field of nutritional psychiatry suggests your diet plays an important role in your mental health. 

The right mix of foods and nutrients may serve as a buffer against stress, anxiety, depression, and a range of other psychological issues, research suggests. Studies have shown, for example, that people who follow a Mediterranean diet—one heavy on fruits, vegetables, legumes, olive oil, and fish—tend to have lower risks for depression than people who don’t. Piling your plate with foods like these may even be better for mental health than social support, a known psychological booster, according to a study from 2017.

There may not be an immediately obvious link between what goes into your belly and what happens in your brain. But “humans are one highly complex, highly integrated system,” says Felice Jacka, co-director of the Food and Mood Centre at Australia’s Deakin University and first author of the 2017 study. “The body and the brain…are in constant conversation.” 

Indeed, there’s a rich body of evidence to suggest that physical activity of nearly any type, duration, and intensity can improve mental health. And now, leading health authorities, including the World Health Organization, acknowledge that nutrition plays an important role, too. 

“Hippocrates was onto this eons ago. He made the connection between the gut and the brain,” says Dr. Uma Naidoo, director of nutritional and lifestyle psychiatry at Massachusetts General Hospital and author of Calm Your Mind with Food. Now, modern science is catching up.

Researchers are still learning about exactly how food influences mental health, but it seems that the gut microbiome plays a key part. Trillions of microbes live in your digestive system, working to break down components of the food you eat and interacting with numerous other parts of the body along the way, Jacka explains. Just as they nourish the physical body, nutrient-dense foods nurture the microbes in your gut, which translates to a range of benefits—including, research suggests, better mental health. One 2023 study in mice linked a type of bacteria found in foods like yogurt to lower levels of stress, and potentially lower risks for anxiety and depression, apparently due to its ability to regulate parts of the immune system.

The gut also has a direct line of communication to the brain via the vagus nerve, which runs from the brainstem to the large intestine. Mood-regulating neurotransmitters, including feel-good serotonin, are made in the gut. And once the gut pumps them out, the vagus nerve “acts like a two-way text messaging system that allows neurotransmitters to go back and forth, up and down, all the time,” Naidoo explains.

Although the science isn’t settled, some researchers have even posited that the mineral zinc, which is found in foods including oysters and nuts, may boost levels of a protein that promotes new growth in the brain, potentially leading to better cognitive function and mental health, says Dr. Drew Ramsey, an assistant clinical professor of psychiatry at Columbia University and author of Eat to Beat Depression and Anxiety. By eating well, “you’re giving your brain cells all the nutrients they need to grow and thrive,” he says.

Want to learn more about how we eat and drink now? Get guidance from experts:

9 Food Trends to Ditch in 2024

How to Reduce Food Waste and Save Money

The Food Trends to Get Excited About in 2024

How to Be a Healthier Drinker

Jacka isn’t overly prescriptive about what people should and shouldn’t eat for peak mental health. But, as a general rule of thumb, she suggests orienting your diet around a wide variety of plant-based foods, such as fruits, vegetables, nuts, beans, herbs, and whole grains, and limiting how often you eat ultra-processed foods, such as packaged chips, cookies, and snacks.

How and where you eat is also important, Ramsey says. Many people build their shopping lists primarily around what is cheapest and easiest to prepare. But developing an emotional connection to food, whether by purchasing it at farmer’s markets where you can meet the people who grow it or by slowing down and sharing meals with friends and family, can nourish the mind and soul as well as the body, he adds. “Through our relationship with food,” Ramsey says, “we can build community.”

If you’re looking for specific mood-boosting foods, here’s what science suggests you should put on your grocery list.

What to eat for better mental health

Omega-3 fatty acids: While much of the research is preliminary, there’s some evidence that eating foods rich in omega-3 fatty acids—including seafood, nuts, and plant oils—at least a few times per week can improve mood disorders like depression and bipolar disorder.

Cruciferous vegetables: Veggies including cauliflower, broccoli, cabbage, and arugula contain compounds that reduce inflammation, which is linked to a range of health issues including depression and anxiety. In one 2022 study, people who ate multiple servings of cruciferous veggies each day had significantly lower self-reported stress levels than people who ate less.

Fermented foods: Famous for feeding your gut microbes, fermented foods like plain yogurt, kimchi, and sauerkraut are powerhouses for enhancing the brain-body connection. Some research suggests that eating two to three servings per day is linked to measurable reductions in stress and depressive symptoms. 

Spices: Cinnamon, saffron, turmeric, black pepper, and other spices are rich in antioxidants, contain anti-inflammatory compounds, and improve metabolism, which may also boost mental health. Whenever you’re preparing food, Naidoo recommends reaching for spices to add flavor, rather than salt or sugar.

Beans and leafy greens: Some research suggests anxiety is related to magnesium deficiency—so eating foods that are rich in this mineral, such as beans, spinach, and Swiss chard, may help calm the mind.

Is Eating a Plant-Based Diet Better for You?

It’s no secret that fruits and veggies are good for you. But a new Netflix show, You Are What You Eat: A Twin Experiment, shows just how powerful—and fast-acting—they can be.

The show features pairs of adult identical twins who participated in a study published in November 2023. For eight weeks, everyone in the study ate a diet rich in fruits, vegetables, whole grains, and legumes and low in sugars and refined starches. But one twin from each pair was assigned to eat only these plant-based foods, while the other also ate animal products such as chicken, fish, eggs, and dairy.

Both groups saw improvements in their cholesterol levels and modest reductions in weight over the eight weeks, but those trends were more dramatic among twins who followed the vegan diet. Average fasting insulin levels—another marker of cardiometabolic health—also dropped among the vegan, but not omnivorous, twins.

“This suggests that anyone who chooses a vegan diet can improve their long-term health in two months,” Christopher Gardner, a Stanford University professor and senior author of the study, said in a statement. And, Gardner added, following a vegan diet may not be as difficult as many people imagine: 21 of the 22 twins assigned to that eating plan stuck with it for all eight weeks.

Another point for plants

The Stanford study is not the only recent evidence pointing to the promise of plant-rich diets. A study published December 2023 in JAMA Network Open found that people who eat low-carbohydrate diets rich in plant-based proteins and fats, as well as whole grains, tend to gain less weight over time than people who eat low-carb diets with a lot of animal products and refined starches.

“Having a diet that’s rich in fresh fruits, non-starchy vegetables, whole grains, nuts, legumes, and plant-based oils is advisable for maintaining or improving your overall health,” says Binkai Liu, a research assistant in the Harvard T.H. Chan School of Public Health’s nutrition department and first author of the JAMA Network Open study.

Two recent analyses of previously published studies also found benefits associated with plant-based diets. The first linked vegetarian diets to a lower risk of heart disease than omnivorous diets, while the second, like the twin study, found that vegan and vegetarian diets are associated with lower levels of cholesterol and other markers of potential heart problems.

Which is more important: more plants or less meat?

In addition to validating plant-based diets, studies have long shown that eating too much meat—particularly red and processed meat, such as sausage and bacon—is linked to health problems including heart disease and cancer. But is all meat consumption bad?

It’s debatable. Some studies and experts challenge the idea that vegan diets are automatically healthier than those that include meat. Becoming a vegan or vegetarian can make it difficult to get certain nutrients found in animal products, such as vitamins B12 and D, and people who eliminate meat often replace it with foods that may limit the nutritional benefits of a vegetarian lifestyle. Plus, numerous studies suggest that people who eat a Mediterranean diet—which includes fish—tend to live longer and report better health than people who follow other eating styles.

In the statement, Gardner said that cutting out all meat shouldn’t necessarily be everyone’s goal. “What’s more important than going strictly vegan,” Gardner said, “is including more plant-based foods into your diet.” Even the omnivores in his study, after all, saw some drops in cholesterol and body weight after eight weeks, likely in part because they ate plenty of fresh foods high in fiber and low in saturated fat.

A study from 2017 backs up that idea. Researchers tracked a group of people for more than a decade to see how dietary changes affected longevity. They estimated that even one small daily change—swapping a serving of red or processed meat for nuts or legumes—translated to an 8% to 17% drop in early death risk.

It’s hard to make one-size-fits-all statements when it comes to nutrition, since people’s bodies are unique and have different needs. Another twin study, this one from 2019, found that even people who share nearly all of their DNA can have different physiological responses to the same foods.

But if there’s any universal truth in nutrition science, it seems to be that loading up your plate with plants is always a good decision.

The Paradox of How We Treat Diabetes

Understanding diabetes today requires holding two conflicting realities in your head simultaneously.

First, diabetes therapy has been revolutionized by a world of new drugs that have become available since the turn of the century—most notably, drugs of the same class as Wegovy and Ozempic, which began their existence as diabetes medications and are now hailed as wonder drugs for treating obesity. These drugs do the best job yet of controlling blood sugar and, of course, body weight, which is critical for those with Type 2 diabetes, the common form of the disease that constitutes over 90 percent of cases and is associated with age and obesity. For Type 1 diabetes, the acute condition that typically strikes in childhood and adolescence, new devices—continuous blood sugar monitors and automated insulin delivery systems—make blood sugar control easier than ever. Still more advanced devices and better drugs are in the pipeline.

But then there’s the flip side—and it’s why the pharmaceutical industry has invested so heavily in new therapies: once a relatively rare condition, diabetes is now so common that drugstores dedicate entire aisles to it and television commercials for diabetes medications are common fare. In 1960, when the first concerted federal surveys were quantifying prevalence, two million Americans were living with a diabetes diagnosis. Today that number is 30 million; almost nine million more have diabetes but don’t yet know it. Each year, 1.4 million new cases are diagnosed, at ever younger ages.

Diabetes puts all of these individuals at increased risk of heart disease, strokes, cancer, blindness, kidney failure, nerve damage, gangrene, and lower limb amputation. It increases cognitive impairment and dementia risk as patients age. Living with diabetes still comes with a decrease in life expectancy of six years.

For those with Type 1 diabetes, despite the remarkable new drugs and devices, blood sugar control is seemingly getting worse, on average, not better. As of 2018, fewer than one in five individuals diagnosed with Type 1 diabetes were achieving even the relatively generous blood-sugar goals set by the American Diabetes Association (ADA); this was a smaller proportion than a decade earlier.

Despite the remarkable advances in therapy, both Type 1 and Type 2 diabetes are still considered progressive chronic diseases, meaning the patient’s condition is expected inevitably to deteriorate as they live with the disease. The greatest challenge to better therapy, as one recent analysis suggested, is the hesitation of physicians to continue prescribing more or newer drugs and increasing dosages as the diseases progress.

All of this comes with a staggering financial burden. In November, the ADA estimated that the total annual cost of diabetes in the U.S. is over $400 billion; over $300 billion of that is direct medical costs. This was up $80 billion from 2017, when an editorial commenting on a similar accounting characterized these costs as the “elephant in the room” of the diabetes epidemic. Patients with diabetes are likely to spend over $12,000 a year just for medical care, almost three times that of healthy individuals of equivalent age. It does not help that the drugs themselves—whether insulin or Ozempic and its ilk—are expensive, costing many thousands of dollars a year. One in every four health care dollars spent in America goes to treating diabetic patients.

And the U.S. is by no means unique. The World Health Organization estimates that diabetes prevalence worldwide increased four-fold between 1980 and 2014, from 108 million to over 400 million, with the greatest rise coming, paradoxically, in the poorest countries. In 2016, Margaret Chan, then WHO director general, described the situation as a “slow-motion disaster” and predicted with near absolute certainty that these numbers would only get worse. They have.  

So how do we reconcile these conflicting realities: unprecedented advances in medical therapies for an out-of-control disease epidemic in which patients, at least in general, are doing poorly and can expect to do worse as time goes on? Confronted with such a dismal state of affairs, shouldn’t we be asking how we got to this point? Were mistakes made in how we think about this disease? Were questionable assumptions treated as facts, and could those assumptions be wrong?

Asking the Right Questions

These are the kinds of questions you would hope health organizations worldwide would be asking, but surprisingly they have no mechanisms or protocols to do so. Diabetes associations like the ADA will regularly convene expert panels to address revisions in the latest standard of care guidelines to accommodate the latest research, but not whether the guiding principles underlying those guidelines should be rethought entirely. Independent investigators are not recruited to analyze and to provide an unbiased assessment of where progress might have gone off the rails. That job instead has been left to physicians in their clinics, those confronted with ever more diabetic patients and willing to take the risk of thinking independently, and to investigative journalists like myself, whose obligation when confronted with such conflicting realities is to ask just these kinds of questions.

Among the revolutions that changed medical practice over the past half century, one in particular is very relevant here. Beginning in the 1970s, health-care analysts began to confront just how little physicians really knew about the risks and benefits of what they were doing for their patients. Not only had clinical trials demonstrated that some standard medical practices resulted in far more harm than good—most infamously, the radical mastectomy for breast cancer—but researchers were documenting wide variations in medical practices from physician to physician, hospital to hospital, and state to state. This, in turn, resulted in wide variation in the benefits, harms, and costs to patients, depending on which physicians they visited, and so which treatments they received.

Read More: Should We End Obesity?

The revolution that followed became known as the Evidence-Based Medicine (EBM) movement, founded on the principle that medical interventions should be rigorously tested in clinical trials—double-blind, randomized, placebo-controlled—before they are used or prescribed. This would be necessary whenever physicians were faced with a choice between multiple options, and whenever the harms of an intervention might outweigh the benefits. David Sackett of McMaster University, a founder of the movement, described the EBM process as beginning with the fact that half of what aspiring doctors learn in medical school is “dead wrong,” and then trying to establish thoughtfully and critically which half that is. David Eddy of Duke University, another EBM pioneer, later described his motivation and that of his colleagues as the revelation that “medical decision making was not built on a bedrock of evidence or formal analysis, but was standing on Jell-O.”

It would be nice to think that this situation has been widely resolved by evidence-based guidelines, but that’s not the case. Journalists or physicians looking for the evidence base in decision making about diabetes therapies will likely find themselves, as I did, with the same revelation. Clearly it, too, was standing on Jell-O in the 1970s, but the problem neither began nor ended there. A remarkable history emerges, with three clear observations.

First, we’ve been here before. We have had miracle drugs for diabetes—most notably, the hormone insulin itself, which University of Toronto researchers led by Frederick Banting and Charles Best purified and put to use treating patients with severe diabetes in 1922. We then had better insulins, slower-acting and longer-lasting, and then, in the post-World War II years, drugs (oral hypoglycemic agents) that could lower blood sugar without having to be injected, as insulin did. We have had revolutionary advances in diabetes technology, beginning in the 1970s with devices that allowed patients to monitor their own blood sugar, and then insulin pumps that automated the process of insulin therapy. All contributed to easing the day-to-day burden of diabetes. None had any influence in controlling the epidemic, nor did they eradicate or meaningfully reduce the long-term complications of the disease. Put simply: diabetes plus drug therapy and devices, even the best drug therapy and devices, does not equate to health.

Second, diabetes researchers have not been averse to testing their fundamental assumptions. They’ve done so in ever more ambitious clinical trials. But a disconcerting proportion of those trials failed to confirm the assumptions, despite the fact that these assumptions constituted the rationale for therapeutic approaches. The $200 million Look AHEAD trial, for example, tested a foundational belief in the field: that weight loss in those with Type 2 diabetes would lengthen lives. The trial was ended for “futility” in 2012. “We have to have an adult conversation about this,” as David Nathan, a Harvard diabetes specialist, said to The New York Times. The 10,000-patient ACCORD trial had been ended prematurely just four years earlier. “Halted After Deaths,” in the words of The New York Times headline. “Medical experts were stunned,” the 2008 article said. ACCORD was one of three trials testing the assumption that intensive blood sugar control by medications would reduce the macrovascular complications of Type 2 diabetes—particularly heart disease—and premature death. All three trials failed to confirm it.

Third, the remarkable aspect of all these trials is that they all assumed an approach to dietary therapy that itself had never been tested. This is the “standing on Jell-O” problem. For well over a century, diabetes textbooks and chapters in medical texts invariably included some variation on the statement that diet is the cornerstone of treatment. The most recent guidelines from the ADA refer to dieting as “medical nutrition therapy” (MNT) and say MNT is “integral” to therapy.

But what constitutes MNT—the dietary advice given—has not been determined by any meaningful research comparing different dietary approaches. Rather, it has been assumed that individuals with diabetes should eat the same “healthful eating pattern” that health organizations recommend for all of us—“non-starchy vegetables, fruits, legumes, dairy, lean sources of protein… nuts, seeds, and whole grains”—albeit with the expectation, if weight control is necessary, that they eat fewer calories.

Read More: Are Weight Loss Drugs From Compounding Pharmacies Safe?

Controlling the symptoms and complications of the disease is left to insulin and the pharmacopeia of drugs that work to keep blood sugar levels close enough to normal that the specter of diabetic complications may be reduced as well. Diabetes associations have assumed that this approach is easiest on patients, allowing them to balance the burden of insulin injections or multi-drug therapy against the joy of eating as their non-diabetic friends and family do. But this assumption has never been tested to see whether it is true, nor whether a better approach exists that might truly minimize the disease burden of diabetes, extend lives, and make the trade-off of restrictive eating vs. health worthwhile.

History of Diet and Diabetes

This is where understanding the history of the diet-diabetes relationship becomes vitally important. What has been known for certain about diabetes since the 19th century is that it is characterized by the inability to safely metabolize the carbohydrates in our diet. This observation led to two divergent philosophies of dietary therapy. Beginning in 1797, when a British physician named John Rollo wrote about curing a diabetic patient with a diet of fatty (rancid) meat and green vegetables, and through the early 1900s, diabetes therapy was based on the assumption that since individuals with diabetes could not safely metabolize the sugary and starchy foods in their diet, they should abstain from eating them. In this pre-insulin era, the only meaningful advice physicians could give their patients was dietary, variations on Rollo’s approach: sugars, grains, starches, even legumes were prohibited because they are carbohydrate-rich; meats, ideally as fatty as possible, butter, and eggs, along with green leafy vegetables (boiled three times to remove the digestible carbohydrates), could be eaten to satiety.

Throughout Europe and America, this was known as “the animal diet,” endorsed by virtually every major diabetes specialist of the 19th century. Physicians believed that the more calories their diabetic patients consumed, and ideally the more fat (because protein is composed of amino acids, some of which the liver converts to carbohydrates), the healthier they would be. “Patients were always urged to take more fat,” is how this was described in 1930 by the Harvard physician Elliott Joslin, who was then, far and away, the most influential diabetes authority worldwide. “At one time my patients put fat in their soup, their coffee and matched their eggs with portions of fat of equal size. The carbohydrate was kept extraordinarily low….”

This thinking only changed in the years before World War I, when Joslin embraced and disseminated the idea promoted by a Harvard colleague, Frederick Allen, that diabetic patients, still without insulin, were best served if they were semi-starved—avoiding carbohydrates and fat. In short, patients suffering from a disease in which one characteristic symptom is ravenous hunger would be treated by making them go even hungrier than otherwise. The approach was unsurprisingly controversial. Joslin and others, though, came to believe they could keep their young Type 1 patients alive longer with Allen’s starvation therapy, even while the high fat, animal-based diet seemed more than adequate for their older Type 2 patients. Allen’s starvation therapy was in turn challenged between 1920 and 1923, when University of Michigan physicians Louis Newburgh and Robert Marsh reported in a series of articles that it was simply unnecessary, that even young patients with severe diabetes could thrive on the high-fat, carbohydrate-abstention approach if properly administered. By then, though, it was too late.

Insulin therapy had arrived in the winter of 1922. It launched what medical historians would call a “therapeutic revolution,” as close as medicine had ever come, and maybe ever has, to a miracle. Patients, often children, on the brink of death, horribly emaciated by the disease and the starvation therapy, would recover their health in weeks, if not days, on insulin therapy. They were resurrected, to use the biblical terminology, which physicians of the era often did.

Diabetes specialists realized that insulin therapy was not a cure for the disease, but it allowed their patients to metabolize carbohydrates and held the promise of allowing them to eat whatever and however they wanted. “Were I a diabetic patient,” wrote Frederick Banting in 1930, by then a Nobel Laureate, “I would go to the doctor and tell him what I was going to eat and relieve myself of the worry by demanding of him a proper dose of insulin.”

That thinking, for better or worse, has governed diabetes therapy ever since.

While diabetes specialists still had no conception of the long-term complications of living with diabetes—the damage to large and small blood vessels that results in heart disease, strokes, kidney disease, neuropathy, amputations, blindness, dementia—they would advocate for ever more liberal carbohydrate diets and ever higher insulin doses to cover them. Patients would be taught to count the carbohydrate content of each meal, but only so they could properly dose their insulin. Diets would be prescribed, and still are, to allow for the drugs to be used freely, not to minimize their use. Patients, in turn, were allowed to eat anything, which physicians assumed they would do anyway.

Whether the patients lived longer, healthier lives because of it would never be tested. As diabetes specialists began to understand the burden of the disease they were treating, the wave of microvascular and macrovascular complications that set in after 10 or 20 years, they would rarely, if ever, ask whether these complications were mitigated by their dietary approach or perhaps exacerbated by it. They would only test drug therapy.

In 1971, the American Diabetes Association institutionalized this philosophy with dietary guidelines that would commit the organization to this approach ever after: diabetic patients would be told to restrict dietary fat—by then thought to cause heart disease—rather than carbohydrates, the one macronutrient they could not metabolize safely without pharmaceutical help. “Medical Group, in a Major Change, Urges a Normal Carbohydrate Diet for Diabetics,” was the headline in The New York Times. By taking the ADA’s advice, diabetic patients would trade off blood sugar control for cholesterol, assuming this would prevent heart disease and lengthen their lives. While the guidelines explicitly acknowledged that the ADA authorities had no idea if this was the right thing to do, the advice would be given anyway.


By 1986, the ADA was recommending diabetic patients get “ideally up to 55-60% of total calories” from carbohydrates, while researchers led by the Stanford endocrinologist Gerald Reaven had established that such a diet was almost assuredly doing more harm than good. That same year, the NIH held a “consensus conference” on diet and exercise in Type 2 diabetes. The assembled authorities concluded that, at best, the nature of a healthy diet for diabetes remained unknown. The conference chairman, Robert Silverman of the NIH, summed the state of affairs up this way: “High protein levels can be bad for the kidneys. High fat is bad for your heart. Now Reaven is saying not to eat high carbohydrates. We have to eat something.” And then he added, “Sometimes we wish it would go away, because nobody knows how to deal with it.”

The modern era of the diabetes-diet relationship began 25 years ago, with the awareness that the nation was in the midst of an obesity epidemic. Physicians, confronted with ever more obese and diabetic patients and the apparent failure of conventional advice—eat less, exercise more—suggested instead the only obvious options, the approaches suggested by popular diet books. Many of these—Dr. Atkins’ Diet Revolution, Protein Power, Sugar Busters—were touting modern incarnations of Rollo’s animal diet.

The Diet Trials

The result was a series of small, independent clinical trials, comparing, for the first time, the conflicting dietary philosophies of a century before. Is it better for patients with Type 2 diabetes, specifically, to avoid dietary fat and, if they’re gaining weight, restrict total calories (both carbohydrates and fat), or will they do better by avoiding carbohydrate-rich foods alone, perhaps entirely? The earliest trials focused on treating obesity, but many of the participants also struggled with Type 2 diabetes. In 2003, physicians at the Philadelphia VA Medical Center published the results of the first such trial in the New England Journal of Medicine: patients with both obesity and diabetes counseled to eat as much food as they desired but to avoid carbohydrates became both leaner and healthier than patients counseled to eat the low-fat, carbohydrate-rich, calorie-restricted diet prescribed by both the American Heart Association and the ADA. The numerous trials since then have concluded much the same.

Among the profound assumptions about Type 2 diabetes that these trials have now challenged is that it is, indeed, a progressive, degenerative disorder. This may only be true in the context of the carbohydrate-rich diets that the ADA has recommended. In 2019, researchers led by the late Sarah Hallberg of Indiana University, working with a healthcare start-up called Virta Health, reported that more than half of the participants in their clinical trial were able to reverse their Type 2 diabetes by eating what amounts to a 21st-century version of Rollo’s animal diet or the Newburgh and Marsh approach. They were able to discontinue their insulin therapy and all but the most benign of their diabetes medications (metformin) while achieving healthy blood sugar control. A third of these patients remained in remission, with no sign of their disease, for the five years, so far, that their progress has been tracked.

As for Type 1 diabetes, in 2018, a collaboration led by the Harvard endocrinologists Belinda Lennerz and David Ludwig reported on a survey of members of a Facebook Group called TypeOneGrit dedicated to using the dietary therapy promoted by Dr. Richard Bernstein in his book Dr. Bernstein’s Diabetes Solution. Bernstein’s approach requires patients to self-experiment until they find the diet that provides stable healthy levels of blood sugar with the smallest doses of insulin. Such a diet, invariably, is very low in carbohydrates with more fat than either the ADA or AHA would deem healthy. Both youth and adults in the Harvard survey maintained near-normal blood sugar with surprisingly few signs of the kind of complications—including very low blood sugar, known as hypoglycemia—that make the life of a patient with Type 1 diabetes so burdensome. The TypeOneGrit survey, Lennerz said, revealed “a finding that was thought to not exist. No one thought it possible that people with type one diabetes could have [blood sugar levels] in the healthy range.” This does not mean that such diets are benign. They may still have the potential to cause significant harm, as Lennerz and Ludwig and their colleagues made clear. That, again, has never been tested.

One consequence of the diabetes associations embracing and prescribing a dietary philosophy in 1971 that has only recently been tested is that we’re back to the kind of situation that led to the evidence-based medicine movement to begin with: enormous variation in therapeutic options from physician to physician and clinic to clinic, with potentially enormous variations in benefits, harms, and costs.

Even the ADA advice itself varies from document to document and expert panel to expert panel. In 2019, for instance, the ADA published two consensus reports on lifestyle therapy for diabetes. The first was the association’s consensus report on the standard of care for patients with diabetes. The authors were physicians; their report repeated the conventional dietary wisdom about eating “vegetables, fruits, legumes, whole grains….” It emphasized “healthful eating patterns,” with “less focus on specific nutrients,” and singled out Mediterranean diets, Dietary Approaches to Stop Hypertension (known as the DASH diet), and plant-based diets as examples that could be offered to patients. This ADA report still argued for the benefits of low-fat (and therefore carbohydrate-rich) diets, while suggesting that the “challenges with long-term sustainability” of carbohydrate-restricted eating plans made them of limited use.

Three months later, the ADA released a five-year update on nutrition therapy. This was authored by a 14-member committee of physicians, dietitians, and nutritionists. Among the conclusions was that the diets recommended as examples of healthful eating patterns in the lifestyle management report—low-fat diets, Mediterranean diets, plant-based diets, and the DASH diet—were supported by surprisingly little evidence. In the few short-term clinical trials that had been done, the results had been inconsistent. As for carbohydrate-restricted, high-fat eating patterns, they were now “among the most studied eating patterns for Type 2 diabetes,” and the only diets for which the results had been consistent. “Reducing overall carbohydrate intake for individuals with diabetes,” this ADA report stated, “has demonstrated the most evidence for improving glycemia [high blood sugar] and may be applied in a variety of eating patterns that meet individual needs and preferences.”

Physician awareness of the potential benefits of carbohydrate restriction for Type 2 diabetes, meanwhile, still often comes from their patients, not their professional organizations. In the United Kingdom, for instance, David Unwin, a senior partner in a medium-sized practice, began suggesting carbohydrate-restricted, high-fat diets to his patients in 2011, after seeing the results in one such patient who chose to do it on her own and lost 50 pounds. When the results of her blood tests came back, says Unwin, they both realized that she was no longer suffering from diabetes. Both the weight loss and the reversal of diabetes were unique in Unwin’s experience. After reading up on the burgeoning literature on carbohydrate restriction, Unwin began counseling his diabetic patients to follow a very-low-carbohydrate, high-fat eating pattern. In 2017, the U.K.’s National Health Service awarded Unwin its “innovator of the year” award for applying a 200-year-old approach to diabetes therapy that, as Unwin says, “was routine until 1923.” Unwin has now published two papers documenting the experience in his medical practice. As of last year, 20 percent of the clinic’s diabetic patients—94 in total—had chosen to follow this restricted dietary approach and put their Type 2 diabetes into remission.

If the diabetes community is to solve the formidable problems confronting it, even as drug therapies get ever more sophisticated, it will have to accept that some of its fundamental preconceptions about diabetes and diet may indeed be wrong. As it does so, it will have to provide support for those living with diabetes who decide that what they have been doing is not working. Some patients, when confronted with the choice between following a restricted eating pattern that seemingly maximizes their health and wellbeing or eating whatever they want and treating the symptoms and complications with drug therapy, will prefer the former. For those who do, the informed guidance of their physicians and diabetes educators will be invaluable.

When I interviewed individuals living with Type 1 diabetes, among the most poignant comments I heard was from a nutrition consultant diagnosed in 1977 when she was eight years old. She told me that she finally had faith she could manage her blood sugar and live with her disease when she met a physician who said to her “What can I do to help you?” That’s what changed her life, as much as any technology or medical intervention. In the context of the dietary therapies we’re discussing, that requires practitioners who are themselves open-minded and willing to spend the necessary time and effort to truly understand an approach to controlling diabetes that is, by definition, unconventional and, in Type 1 diabetes, still lacking clinical trials that test (or testify to) its safety and efficacy. Easy as it is for physicians to continue believing that what they should be doing is what they have been doing, they do not serve their patients best by doing so.

Adapted from Gary Taubes’ new book Rethinking Diabetes: What Science Reveals About Diet, Insulin and Successful Treatments

The Paradox of How We Treat Diabetes

Even as new treatments become available for diabetes, the disease continues to spread. That’s because we’re not focused on root solutions.

Work Is the New Doctor’s Office

If you’re trying to improve your health, the first stop is likely to be your doctor’s office. But your own office may have nearly as much influence on well-being, according to a growing body of research that suggests your job can affect everything from mental health to risk of cardiovascular disease and how long you live.

“Health happens everywhere,” says Dr. Eduardo Sanchez, chief medical officer for prevention at the American Heart Association. Given that the average employed U.S. adult spends more of their waking hours working than doing just about anything else, that includes the workplace, he says.


Work-related stress is one culprit for health problems, since unmanaged stress can contribute to heart disease, insomnia, gastrointestinal issues, and other chronic conditions. Long hours on the job can also cut into time that would otherwise be spent sleeping, exercising, cooking, seeing loved ones, or doing other activities that can boost wellness. Such problems are most effectively fixed when employers change workplace conditions, rather than leaning on workplace wellness initiatives as a Band-Aid, says Laura Linnan, director of the University of North Carolina’s Collaborative for Research on Work and Health.

“We can provide all the coping strategies and stress-management programs possible,” she says. “But if we put employees back in an environment where the work pace is out of control, the staffing is wrong, there’s a toxic supervisor—no amount of stress management is going to save that.”

Here’s what the research says about how work affects health, and a few ways bosses and employees alike can make the workplace better for everyone.

Find control and meaning in work

Autonomy in the workplace is a powerful thing, Linnan says. Studies show that the level of control someone has over their work predicts how their job will affect their physical and mental health, sometimes more than workload alone. On the flip side, lacking autonomy is a known risk factor for burnout, a condition characterized by feeling exhausted by, disengaged from, and cynical about work.

Some workers will naturally have more say over their time and tasks than others, Linnan says. But even in a highly regimented setting, she says, bosses could ask, “What would make this job better for you?” and use that feedback to determine how shifts and breaks are scheduled, for example.

Studies also show that people who find their work meaningful may experience improved well-being, as long as they don’t work too much or become overly invested. So, if workplace culture allows, employees could consider proactively bringing ideas to their managers and asking for tasks that align with the work they’d like to be doing.

But, unfortunately, not all companies and managers are open to that kind of feedback. That, Linnan says, is where the “reawakening for unionization” in the U.S. comes in. “There are organizations that just haven’t moved the needle at all, and employees are not going to stand for it,” she says.

Acknowledge and reward good work

Fair pay is the most obvious and impactful form of workplace reward, and one with clear links to better health. But research suggests even verbal acknowledgement, such as bosses praising or thanking their direct reports for their work, can improve employee well-being.

In a recent study, men who felt they put forth a lot of effort on the job but were not adequately rewarded for it (as measured by whether they felt they were compensated fairly, had good promotion prospects, and got enough respect from peers and supervisors) had a 50% higher risk of heart disease than peers who felt fairly recognized. There was not as clear a link among women, but the study’s co-author noted in a statement that reducing stressors at work—including an imbalance between effort and reward—could have other health benefits for people of all genders, potentially including decreases in depression.

Create flexible work environments

Demanding workplaces can contribute to health problems. But some studies also show it’s not that difficult to make a meaningful shift. “You can change work, and actually in a relatively short time,” says Lisa Berkman, a social epidemiologist at the Harvard T.H. Chan School of Public Health.

For a paper published in 2023, Berkman and her colleagues studied two very different workplaces: an IT company and a long-term health care provider. In both, managers were trained on how to be more supportive of employee work-life balance, and supervisors and employees together looked for ways of streamlining work—such as by taking some meetings off the calendar, or minimizing time spent on administrative work. After these programs were put into place, workers saw measurable improvements in sleep quality, psychological wellness, and heart health, the researchers found.

Studies have also shown that four-day work weeks improve employees’ mental health, sleep, and physical activity levels, further underscoring the benefits of flexible working hours. True four-day work weeks may not be possible for every industry, but companies taking part in pilot programs have found workarounds, like assigning different departments within a company to work different days and letting employees take a couple of half days per week.

Foster social support in the workplace

Socializing at work may seem unimportant—or downright emotionally draining—but it can be surprisingly beneficial, experts say. Some research even suggests people who have strong social support at work have a reduced risk of premature death, in addition to better mental health and job satisfaction.

You don’t necessarily need to make close, personal friends at work. Even relatively small interactions, like chatting with coworkers after a meeting or checking in with them after a hard day, can go a long way, research suggests. It’s also up to managers to create environments in which employees feel free to build social connections, and to check in with their direct reports to see how they’re doing as whole people—not just workers.

That mentality, Linnan says, is key to workplace health more generally. She points to the National Institute for Occupational Safety and Health’s Total Worker Health Program as a good model. It seeks to improve all domains of employee health, from risk of on-the-job accidents and illnesses to psychological well-being—a marked contrast from classic workplace wellness initiatives, which tend to focus on narrow goals like boosting physical activity or encouraging smoking cessation. “Overall well-being is about mental, physical, spiritual, emotional, [and financial health],” Linnan says. “They all interrelate.”


The Food Trends to Get Excited About in 2024, According to Experts

A longing for authenticity. An urge to protect the planet and embrace nature. An itch to spice things up. These are the modern sentiments shaping what will show up on our plates and in our glasses in 2024, according to experts who forecast food trends.

We asked nearly a dozen industry insiders—from chefs to a food futurologist—what to expect in the year ahead for food and drink. Here’s what they said.


An emphasis on global flavors

Even if you don’t venture farther than a nearby restaurant in 2024, exciting new flavors from around the world will be at the other end of your fork. One of the defining trends of the year is expected to be third-culture cuisine, or dishes from a chef’s diverse background. Think: wafu Italian restaurants, which bridge Japanese and Italian cultures, and Filipino-British bakeries. “It’s very much derived from social changes and globalization and the meaning of identity today,” says Claire Lancaster, head of food and drink at the trend-forecasting company WGSN. In the past, she notes, someone might have “slapped something random on a pizza” and called it fusion, but more care goes into it now. “This new generation of chefs is creating products that reflect their unique, multi-layered cultural identities.”

More Asian ingredients

Expect Asian flavors and ingredients to have a moment. Black sesame, ube, and milk tea will follow the path of matcha and become more prevalent, predicts Denise Purcell, vice president of resource development with the Specialty Food Association, a trade group that hosts the Fancy Food Show. “We’re seeing milk tea-filled donuts and ube hot chocolate,” she says. “I was just someplace where they had black sesame cookies.” The flavors are also popping up in salty snacks, like black milk tea popcorn, Purcell notes.

Andrea Xu, co-founder and CEO of Umamicart, an online grocer that specializes in Asian groceries, anticipates more people will embrace Asian fruits, such as rambutan, pink guava, longan, mangosteen, and various types of dragon fruit. “If you go for the golden variety, it will be much sweeter and softer,” Xu says of dragon fruit. “The white and purple varieties are a little tangier. They make for really good smoothies.”

In Denver, Ni and Anna Nguyen—the married chefs behind popular Vietnamese restaurant Sap Sua—are excited about the emergence of first-generation Asian chefs diversifying what dining looks like. “A lot of people are starting to recognize that there’s a difference between the cuisines,” Ni says. “What makes Filipino cuisine special, and what makes Vietnamese cuisine special? It’s not just lumped into one category.”

Steps toward sustainability

One of the undercurrents driving food and drink trends is our collective desire to take care of the planet. More companies will prioritize sustainability in the coming months in surprising ways. Expect, for instance, the rise of alternative chocolates. As Lancaster points out, the demand for cocoa has led to deforestation worldwide; plus, access to it is becoming more difficult and expensive. Alternative chocolate is “made without cocoa,” but it still tastes remarkably similar to your standard bar, she says. “There’s a group of innovators who are creating alternatives that have the same taste, smell, and melt of original chocolate.” One U.S.-based company, Voyage Foods, uses ingredients like grape seeds, sunflower protein flour, and sunflower lecithin to make their alternative chocolate. In the U.K., WNWN Food Labs replaces cocoa beans with ingredients like cereals and legumes.

Other companies are responding to water scarcity, extreme heat, and droughts by creating products that minimize their water footprint. For example: waterless plant milks come in powder form, so you can mix in water at home. “The industry is realizing that we’re paying to ship water—that’s 90% of the product,” Lancaster says. “It’s a huge CO2 emitter, and it adds to the cost of the product.” Still others are utilizing drought-friendly crops like prickly pear cactus to make snacks like popcorn, trail mix, and candy.

Meanwhile, as we learn more about the climate impact of marine ingredients, expect innovators to start showcasing lesser-known ones, Lancaster says. That includes urchins and fish roe, both of which “create a really lovely, savory, umami depth of flavor, and they’re bringing it to a wider range of dishes.”

Fun with fungi

Todd Anderson, a chef and founder of the Turnip Vegan Recipe Club, gets mushy when talking about mushrooms. In 2024, more of us will embrace fungi, he predicts—and mushrooms will shine as a meat replacement. Anderson recently made mushroom meatballs and roasted lion’s mane, a mushroom that grows on woody tree trunks. He also enjoys dishes like shiitake bacon, mushroom roast beef, and maple sausage made out of mushrooms. Many mushrooms are easy to grow at home, he says, even for people in urban environments—and he’s looking forward to seeing more people grow and experiment with them in 2024.

A celebration of vegetables

Matty Matheson, a chef and restaurateur who starred in FX’s restaurant dramedy The Bear, doesn’t consider himself a big trends guy—but he’s excited about veggies. We’re about to see a surge in “vegetable-forward restaurants,” he says. “I think people are now understanding how to cook vegetables in a way that’s more profound and more exhilarating for their customers and for themselves.” Take broccoli, for instance. You might see it grilled or pureed; a chef might stew its leaves with collard greens. Another increasingly popular technique: cooking Brussels sprouts’ “beautiful, very robust” leaves as though they were collard greens, which Matheson describes as especially flavorful. “Having more vegetables on the forefront is going to be a big thing.”

Dinner in a drink

Lauren Paylor O’Brien, a mixologist who’s the winner of Drink Masters season 1 on Netflix, likes to use food as inspiration for the drinks she creates. In 2024, she predicts we’ll see more culinary integration with booze. During a recent event, she paired a scoop of honey ice cream with three drops of olive oil and a fizzy whiskey cocktail. It’s a “sensory experience,” she says. “There’s the visual appearance of ice cream in a glass, the carbonation from the drink as you’re pouring it, the aromatics from both the ice cream and the canned cocktail, the additional flavor profile of adding olive oil, and then also the aromatics that you’re getting from the olive oil.”

Mixologists worldwide are embracing meal profiles for drink flavors, Lancaster notes. She points to Double Chicken Please, a New York City bar, where patrons can order cocktails like the Cold Pizza (Don Fulano Blanco, Parmigiano-Reggiano, burnt toast, tomato, basil, honey, and egg white) or Mango Sticky Rice (Bacardi Reserva Ocho, mango, sticky rice pu’er tea, wakame, cold brew, coconut). At the Savory Project bar in Hong Kong, patrons can sip on drinks that utilize ingredients like beef, charred corn husks, leeks, and shiitake mushrooms. “Really unexpected flavor profiles” are going to be big, Lancaster says.

More mindful drinking

For years, Derek Brown was best known in Washington, D.C., for owning high-profile bars. But the longtime bartender’s attitude about alcohol has shifted, and he’s now an advocate for non-alcoholic cocktails (he wrote the Mindful Mixology recipe book in 2022).

In 2024, Brown expects we’ll see the continued rise of mindful drinking, vs. an either/or approach. “We still see a lot of polarization in discussions about alcohol,” he says. “They tend to revert to: drink or don’t drink.” Instead, we’ll start to hear more about what he calls “substituters,” or people who switch between “non-alcoholic and alcoholic adult sophisticated beverages based on the occasion.” That allows us to keep the best parts of drinking—being social and trying delicious drinks, Brown says—while leaving heavy consumption behind.

Another trend bubbling toward the surface is non-alcoholic wine, Brown predicts. Attention has largely centered on non-alcoholic beer until now, but companies like Leitz in Germany and Giesen in New Zealand are starting to offer dealcoholized wines. Many add teas and extracts to compensate for the body and flavor lost during the dealcoholization process—and Brown describes their taste as “amazing.”

Funky flavors, ingredients, and colors

During a conversation on a recent afternoon, Xu snacked on Lay’s “numb & spicy hot pot” flavored potato chips. “We’re starting to see people really going outside the typical snacks they’d been having,” she says. Enter: unique offerings like roasted cumin lamb skewer Lay’s, Sichuan Peppercorn Doritos, and Lay’s Stax potato chips flavored like jamon (Spanish ham).

At the more highbrow end of things, chef Michele Mazza of Il Mulino New York is looking forward to cooking with unique pasta flavors, like squid ink pasta—which “has a very salty flavor with some hints of the ocean”—and truffle-infused pasta, which “gives off a more earthy taste.” We’ll also likely see wider use of whimsical pasta shapes, he believes, such as orecchiette, farfalle, fusilli, and cavatappi.

Color-wise, blue will rule, predicts Morgaine Gaye, a food futurologist based in London. That’s a reflection of a broader trend: In 2024, we’ll continue to seek out nature—part of our ongoing quest to find solace in a divided, stressful world. Inspired by ocean and sky hues, more of our snacks and meals will incorporate blue: “We’ll see muffins, we’ll see cupcakes, we’ll see drinks” colored with butterfly pea protein—a powder made from the butterfly pea plant, a vine native to Thailand—or blue-green algae, Gaye says.

Gaye also foresees florals. Rose, lavender, and violet flavors will pop up in drinks, baked goods, ice cream, snacks, and more to delight us. In 2024, “we’re going to need comfort, kindness, and nature,” she says. “All of that stuff is key to mental well-being, as we try to hold ourselves, and hold one another, together.”
