
Missing Data in Diabetes Studies Could Skew Brain Health Results

You go to the doctor for a checkup. They ask you to complete a memory test. But you're tired, distracted, or just having a bad day. So you skip a few questions.

Now imagine that happens to hundreds of people in a research study. And the scientists simply ignore those missing answers.

That's exactly what's happening in many studies on type 2 diabetes and mild cognitive impairment (MCI). And it's a bigger problem than most people realize.

Why Missing Data Matters for Your Brain

Type 2 diabetes affects more than 500 million people worldwide. Many of them also develop mild cognitive impairment. That means small but noticeable problems with memory, focus, or decision-making.

Doctors want to understand how diabetes affects the brain over time. They run studies that track patients for months or years. But here's the problem.

People with memory issues are more likely to miss appointments. They forget to fill out forms. They skip test questions. This creates gaps in the data.

When researchers ignore those gaps, they may get the wrong answers.

The Old Way vs. What We Now Know

For years, scientists assumed that missing data was random. They thought a few skipped questions here and there wouldn't change the big picture.

But here's the twist. Missing data is rarely random. People who drop out of studies often have worse health. They may be sicker, more confused, or less able to follow instructions.

When researchers only look at the people who complete every test, they see a healthier group. This can make a treatment look better than it really is.

A New Review Reveals the Problem

Researchers from China and other countries recently reviewed 88 studies on diabetes and mild cognitive impairment. These studies were published between 2020 and 2025.

They wanted to know how often scientists reported missing data. And how they handled it.

The results were not encouraging.

Only 23 percent of studies even mentioned that data was missing. Among those, the average amount of missing data was about 9 percent. That means nearly 1 in 10 patient results were simply gone.

This doesn't mean the studies are worthless. But it does mean we should be careful about trusting their conclusions.

What Scientists Are Getting Wrong

When researchers did notice missing data, most of them used a simple fix. They just removed the incomplete cases and analyzed only the complete ones.

This method, called complete case analysis, was used in 93 percent of the studies.

Think of it like grading a test but only counting the students who showed up. The students who skipped class might have done worse. But you never find out.
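The bias from complete case analysis can be shown with a toy simulation (a hypothetical sketch, not data from the review): if the patients who drop out tend to have lower test scores, averaging only the completers makes the group look healthier than it really is.

```python
import random

random.seed(42)

# Hypothetical simulation: 1,000 patients take a 30-point memory test.
scores = [random.gauss(24, 4) for _ in range(1000)]

# Patients with lower scores are more likely to drop out of the study
# (illustrative probabilities, chosen only to make the pattern visible).
observed = []
for s in scores:
    p_dropout = 0.4 if s < 22 else 0.05
    if random.random() > p_dropout:
        observed.append(s)

true_mean = sum(scores) / len(scores)
complete_case_mean = sum(observed) / len(observed)

# Complete case analysis sees only the patients who stayed in,
# so it reports a higher (healthier-looking) average than the truth.
print(f"true mean:          {true_mean:.1f}")
print(f"complete-case mean: {complete_case_mean:.1f}")
```

Because dropout here depends on the very thing being measured, no amount of extra completers fixes the gap; the sample itself is skewed.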

Only one study out of 88 performed a sensitivity analysis. That's a fancy term for re-running the analysis under different assumptions about the missing data, to see whether the conclusions still hold.
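One simple form of sensitivity analysis is to bracket the result: recompute it once assuming the dropouts were typical, and once assuming they did much worse. The numbers below are made up for illustration only.

```python
# Scores we actually observed (hypothetical, on a 30-point memory test).
observed = [26.0, 28.0, 23.0, 25.0, 27.0, 24.0]
n_missing = 4  # patients who dropped out before testing

def mean_assuming(missing_value):
    """Average score if every missing patient had scored `missing_value`."""
    filled = observed + [missing_value] * n_missing
    return sum(filled) / len(filled)

# Optimistic scenario: dropouts scored like the observed average.
# Pessimistic scenario: dropouts scored much lower (say, 15 out of 30).
optimistic = mean_assuming(sum(observed) / len(observed))
pessimistic = mean_assuming(15.0)

print(f"if dropouts were typical: {optimistic:.1f}")  # 25.5
print(f"if dropouts scored 15:    {pessimistic:.1f}")  # 21.3
```

If a study's conclusion flips between the two scenarios, the missing data matters; if it survives both, readers can trust it more.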

None of the studies explained why they thought the data was missing in the first place. Was it because patients got sicker? Because they moved away? Because they lost interest?

Without those answers, the results are less reliable.

If you have type 2 diabetes or care for someone who does, this matters. Studies shape what doctors recommend. They influence which treatments get approved.

When studies have missing data problems, the advice you receive may be based on incomplete information.

Does this mean you should ignore all diabetes and brain health research? No. But it does mean you should ask questions.

Ask your doctor: "How strong is the evidence for this treatment?" Ask: "Were the studies large and well-designed?"

The Limits of This Review

This review looked at observational studies. That means researchers watched what happened to patients over time. They didn't control treatments or assign people to groups.

Observational studies are useful. But they are not as strong as clinical trials where patients are randomly assigned.

Also, the review only covered studies from 2020 to 2025. Older studies may have different patterns.

What Happens Next

The researchers who conducted this review want to change how scientists handle missing data. They recommend that all studies report how much data is missing and why.

Better guidelines already exist. The STROBE reporting checklist and guidance from Sterne and colleagues lay out clear rules. The problem is that many researchers don't follow them.

In the future, journal editors may require better reporting. Funding agencies may demand it. And patients may start asking for it.

For now, the takeaway is simple. When you read about a new diabetes treatment that helps memory, ask one question first: Did the study account for the people who didn't finish?
