
AI Rules Are Missing in Top Endocrinology Journals


Imagine submitting your life's work to a medical journal. You want your research published so patients can benefit. But what if the rules for using AI are unclear?

Many top endocrinology journals are letting researchers use AI tools without strict guidelines. This creates a risky gap in how we publish medical science today.

Endocrinology deals with hormones that control your body. Doctors study diabetes, thyroid issues, and more. These conditions affect millions of people worldwide.

Current treatments often leave patients frustrated. We need better ways to find cures and manage chronic diseases. Researchers rely on journals to share new findings.

But the way we write and publish is changing fast. AI tools can write text or create images in seconds. This speeds up work, but it also raises big questions.

The surprising shift

For years, scientists wrote every word themselves. They checked every number and every picture. This kept research honest and accurate.

But here is the twist. Most top journals now allow AI to help write papers. Some even let AI make images. Yet, very few require clear rules on how to use these tools.

What scientists didn't expect

Think of a research paper like a building. Every brick must be placed correctly. AI can lay bricks quickly, but it might not know the blueprint.

The study looked at the top 100 endocrinology journals. It found that 84% mention AI in their author rules. However, only 1% require a specific reporting guideline.

This means most journals are playing it safe. They say "you can use AI," but they don't say "how." This lack of detail could hurt the quality of published science.

AI tools act like a very fast assistant. They can draft sentences or generate graphs based on data you give them. It is like having a helper who never sleeps.

However, this helper might make mistakes. It could invent facts or copy old ideas without credit. If a journal does not check this work, bad science gets published.

Researchers checked the "Instructions for Authors" of 100 top journals. They looked at rules for writing, creating images, and claiming authorship.

The review ran from late 2024 to mid-2025. Two reviewers checked the data independently to ensure accuracy. They found that 79% of journals asked authors to disclose AI use.

But only 64% explicitly allowed AI for writing text. Just 22% allowed it for generating content, and half let it create images. Very few endorsed global standards for reporting AI use.

The most important result is about transparency. Seventy-nine percent of journals want you to tell them if you used AI. This is a good start.

But there is a problem. Only one journal out of 100 required a specific checklist for AI use. Most just say "be careful." This is not enough to ensure safety.

Another key finding is about authorship. No journal allowed AI to be listed as an author. This is smart. AI is a tool, not a person. It cannot take responsibility for the research.

But there's a catch


The study shows that rules are inconsistent. Some journals are strict, while others are vague. This confusion makes it hard for researchers to know what is allowed.

It also means a paper accepted by one journal might be rejected by another. This slows down the sharing of important medical knowledge.

Medical experts agree that clear rules are needed. Without them, the integrity of research is at risk. Trust is the foundation of medicine. If people doubt the data, they cannot trust the treatments.

The study suggests that publishers must step up. They need to create explicit guidelines that everyone follows. This will help promote reliable and reproducible research.

If you are a patient, this news might feel distant. But it affects the treatments you receive. Clear rules help ensure that new therapies are tested properly before they reach your doctor.

If you are a caregiver or researcher, talk to your team about AI use. Ask if your institution has specific policies. Do not assume that because a journal allows AI, it is safe to use.

Always verify the source of any medical information. Rely on doctors who follow strict ethical guidelines.

This study looked only at the top 100 journals. There are many smaller journals that might have different rules. Also, the study was done in a short time frame.

The landscape changes fast. New tools appear every month. Policies might shift before the next review is done.

What happens next? Publishers must update their guidelines soon. They need to decide if AI can generate images or text. They should also adopt global standards for reporting AI use.

Research takes time. We cannot rush this process. Safety must come first. As AI grows in medicine, our rules must grow with it.

We need a future where technology helps without confusing the science. Clear paths forward will protect patients and keep trust high.
