
Imagine a nurse on duty in a general ward at a rural hospital in the early hours of the morning, watching over 30 patients single-handedly, most of them from remote corners of Assam.

A patient’s condition suddenly deteriorates. There is no alarm. By the time anyone notices, the window for intervention has already closed. This is not a problem of human error. It is a systemic failure.

The introduction of an artificial intelligence-based, contactless patient monitoring system at Gauhati Medical College and Hospital (GMCH) is meant to change this reality. It is designed to be always on, always watching, especially in crowded wards where human attention is stretched thin.

But without proper safeguards, the same system could create a new kind of failure—one that is harder to spot and easier to mistake for reliable performance. At GMCH, a flawed AI alert is not a technical glitch. It is a missed diagnosis in a hospital where time is everything.

The risks are immediate and specific. An untested system can trigger false alarms, pulling already overstretched staff in the wrong direction. It can miss critical warnings entirely, leaving behind a false sense of security.

Continuous monitoring produces vast amounts of sensitive patient data, and without clear legal safeguards, that data can be shared or misused without patients ever knowing.

Procurement without transparency can tie a public hospital to a single vendor for years, quietly draining resources that were never meant for that purpose. These are not hypothetical concerns. They are predictable outcomes when technology moves faster than the institutions meant to govern it.

The scale of GMCH makes all of this more urgent. With about one million outpatient visits and over 100,000 inpatient admissions every year, the hospital operates under constant pressure. Most patients travel long distances with few other options. The demand for smarter systems is real.

AI-assisted monitoring could genuinely help ensure that critical cases are caught in time and that no patient slips through unnoticed in a crowded ward. The question is no longer whether such systems are needed. It is whether they can be trusted.

Other countries have already been through this. The United Kingdom did what India is now attempting, rapidly pushing AI into public healthcare. It spent over £143 million doing so, but the results only came after strict oversight was built into every step.

One independent evaluation of the NHS AI Lab found that a Phase 4 AI Award project saved around £44 million and helped treat 150,000 patients—not because the technology was exceptional, but because it was tested rigorously before anyone depended on it. Singapore drew the same conclusion and made it policy.

The Ministry of Health, together with the Health Sciences Authority, developed the Artificial Intelligence in Healthcare Guidelines, a framework requiring accountability, transparency, and safety standards before any AI tool enters a clinical setting.

In 2025, the UK and Singapore went further and created a joint corridor to speed up access to safe health technologies.

The lesson is plain: AI in healthcare does not fail because it moves too slowly. It fails when it moves faster than governance.

India faces tighter constraints than either the UK or Singapore. Public health expenditure remains around 2% of GDP, leaving almost no room for expensive technological missteps.

A failed system here is not just a financial loss; it is a missed chance to improve patient care for people who have very few other options. Deploying AI without proper validation is not innovation. It is a transfer of risk from the system to the patient.

India is not starting from scratch. The Strategy for Artificial Intelligence in Healthcare for India (SAHI), released by the Ministry of Health and Family Welfare, lays out a national framework for governing how AI is deployed in clinical settings.

Alongside it, the Benchmarking Open Data Platform for Health AI (BODH) provides a system to test and validate AI tools against real-world data before they are used at scale. The problem is not the absence of frameworks.

It is the absence of compulsion. Unless hospitals like GMCH are required to use these safeguards, they remain optional, and risk remains inevitable.

The same principle applies to data. Patient monitoring creates a continuous digital trail of deeply sensitive information. Without strict compliance with the Digital Personal Data Protection Act, and clear rules on storage, access, and third-party use set from the start, that data becomes vulnerable.

Trust in public healthcare cannot survive ambiguity over who owns patient data and how it is used.

How these systems are introduced matters just as much as whether they are introduced.

Deploying AI in district hospitals with unreliable power or poor connectivity will not close the healthcare gap; it will widen it. What functions well in a large urban hospital can fail entirely in a rural setting.

A gradual rollout, tested across different real-world conditions, is not timidity. It is the only responsible approach.

Assam’s decision to bring AI into public healthcare shows a genuine willingness to innovate under difficult circumstances—rising disease burdens, staff shortages, and infrastructure under strain.

These are exactly the conditions where well-governed AI can make a real difference. But its impact will depend far less on how advanced the technology is and far more on how well it is managed.

The nurse in that ward at three in the morning does not need a sophisticated system in theory. She needs one that works in practice—one that sends timely alerts, protects her patients’ data, and remains reliable regardless of who supplies it.

If this system fails, it will not fail in code. It will fail in that ward, at that hour, when no alarm sounds, and a life slips through unnoticed.

Views expressed are those of the authors and do not reflect EastMojo’s stance on this or any other issue. The authors are trainees at Pahle India Foundation. Dr Urvashi Prasad (Senior Fellow) contributed to the article.
