A student at Tec de Monterrey’s School of Medicine shares with CONECTA how she, fellow students, and teachers have incorporated artificial intelligence (AI) as a training tool.
Lianny Hernández highlighted how these emerging technologies, ranging from virtual patients to personalized study systems, have become valuable tools in her professional training.
“I started using AI because it’s the most common (tool) of all and I started with ChatGPT because everyone was already using it,” she said.
“It helps us a lot as students and as clinicians (...) Everything is very intuitive, so you can quickly link applications and customize them with the information you need.”

Simulate to learn: Using virtual patients
During one of her courses, the eighth-semester student had an unusual experience: interacting with a patient who did not exist.
This was an activity designed by Professor Nancy Segura, who “programmed” clinical cases in ChatGPT to simulate medical consultations.
Lianny said that in these exercises, the chatbot was modified to interact with her and her classmates as if it were a real patient.
“It would tell you its age, symptoms, history, everything. You had to ask it questions, propose studies, diagnose, and provide treatment,” recalls the Monterrey campus student.
At the end, the AI provided personalized feedback with specific comments about the interaction, helping students detect gaps and areas for improvement in their communication or diagnosis.
“The chatbot would tell you positive things like, ‘You explained the diagnosis to the patient very well. She didn’t have any outstanding questions, and you reassured her by explaining that her condition has a treatment or a cure.’
“But, it also told you what you needed to improve with comments like, ‘I feel the area you missed was explaining the treatment in more detail because the patient may have some unanswered questions or may be a little confused.’”
According to Hernández, that was especially useful because, as someone about to begin contact with real patients, it allowed her to practice parts of an actual consultation.

Linking platforms for more effective study
Another frequent use of AI in Lianny’s medical training is its integration with specialized platforms such as AMBOSS, a digital medical reference widely used among medical students.
Lianny and her classmates discovered that they could link ChatGPT with AMBOSS to create customized clinical scenarios.
“You could tell it to give you a clinical case with certain issues and then solve it. That’s how you practice thinking like a doctor,” she said.
These simulations, said Hernández, have also allowed her to reinforce diagnostic skills and get used to the question format of specialized and standardized exams in countries such as the United States.
“You give it your study objectives, and it gives you exam-type questions. It forces you to reason, not just memorize. That’s what helped me a lot and helped me focus a lot on specialty exams.
“So, you can ask it to ask you questions in the style of a specialty exam in the U.S., and it creates exams in that style, which is something that helps us prepare and get used to (the format).”
Despite the benefits, the student points out that using these tools requires constant critical judgment to avoid incorrect information or information that does not apply to the Mexican context.
“We have to confirm everything with other sources and, obviously, with what the professors teach us. Not everything that AI says applies to our context in Mexico, such as the type of drugs,” Lianny said.

A powerful ally with limitations and flaws
As students like Lianny embrace AI in their learning, concerns about its limitations have also emerged. The student says that while AI is extremely useful, it is not foolproof.
“ChatGPT can get things wrong. If you don’t know what is wrong and you study this way, you can repeat serious mistakes (...) Many times it could recommend medications that aren’t even available in hospitals here or that are very inaccessible,” she explains.
“I think that artificial intelligence today has both advantages and disadvantages, so it seems to me that it should be used more as an educational tool for those of us who are doctors in training and for doctors who want to inform themselves.”
With that in mind, the student highlighted that, beyond being a tool for practice and study, AI can be an important ally in academic and research work when used as an “assistant” for gathering information.
AI tools such as Consensus, Anara, Future House, Open Evidence, and AMBOSS are used in this area by Lianny and other students, such as Cristina Jiménez, a seventh-semester biosciences student.
However, Hernández stresses that, first and foremost, AI cannot replace clinical judgment or human sensitivity.
“At the end of the day, AI can’t touch, can’t hear, and can’t see how a patient reacts. That remains an essential part of the diagnosis.”

Reflecting on AI, patients, and ethical conduct
For students further along in their studies, as is the case with Lianny, one of the emerging challenges to address is the impact that AI has had on the doctor-patient relationship.
According to what the student has learned from mentors and professors in the field, physicians today face new challenges such as patients “coming in with a diagnosis” thanks to ChatGPT.
“Some patients say ‘I already looked on the Internet. I have this. Prescribe me such and such.’ And, if doctors tell them no, they don’t believe them. They trust the bot more than the professional who is looking after them.
“That’s why you have to understand that ChatGPT can’t ask for an X-ray, can’t send someone for tests, and can’t identify signs that a doctor can detect in seconds by looking at the person,” she said.
These types of situations, however, have led Lianny and her classmates to view this behavior with a certain degree of understanding and empathy, since it often comes from a place of desperation.
“If getting a medical appointment can take weeks, with the lack of quick access, people can turn to Dr. Google, Dr. ChatGPT, or Dr. TikTok.
“But then, we have to explain why it’s important to come in for a consultation, why a chatbot isn’t enough. It’s also important to be prepared for those challenges in our career.”

Gynecology and the new digital dilemmas
Lianny said that she is currently especially interested in gynecology and oncology as specialties, while recognizing that the use of AI in these fields may bring new challenges.
“Many women may stop going to the doctor because they think that what AI tells them is good enough. But there are check-ups, such as the pap smear, that cannot be done with a chatbot (...)
“Even if the symptoms aren’t alarming, bad advice can have serious consequences: if a woman has a lump and it doesn’t hurt, the AI might tell her it’s not serious. But, that could be early-stage cancer,” she said.
The central concern of students like Lianny is that AI can provide a false sense of security, and that the algorithms behind these tools are often trained on data from other countries.
“What applies to a white patient in the United States doesn’t always apply to a Mexican, indigenous, or migrant woman,” she said.
Because of this, Hernández considers it essential to continue promoting medical contact, periodic check-ups, and education with a focus on gender, context, and needs that only a human can provide.

Other uses and opportunities for innovation
With the start of her clinical internship, Lianny recognizes that she will face new challenges, such as long hours, rotations through different specialties, and a constant academic load, so she believes that AI could help her even more in the future.
“I’m going to be in the hospital six days a week. I’ll still have to hand in assignments, study, and interact with patients,” she said, explaining that, in that context, AI can serve as an organizational tool.
However, even with this myriad of uses, Lianny stresses that she cannot use AI for her academic deliverables because her papers are checked with plagiarism and AI-detection tools.
“Everything we hand in must be completely our own (...) The AI can guide you, help you not to get lost, or give you ideas. However, you have to build the real knowledge yourself,” she said.
Finally, Lianny mentioned that Tec students take part in challenges, such as Tec Week, that ask them to design innovative solutions to social problems, and said that AI could be part of these future projects.
“We can use AI to learn better, to practice, to organize ourselves. Later on, when we have clinical experience, we may be able to design something that works without putting anyone at risk,” she concludes.