
The race is on for the killer health app at UCSF

SF Public Press
 — April 5, 2011, 11:44 a.m.

In the future, you might not need to see a doctor for follow-up visits even if you suffer from a chronic disease. You could connect devices such as blood pressure or glucose meters to your phone, or enter their readings manually, as well as tell your device how you are feeling. The phone (or the application, to be exact) would tell you whether you need to adjust your habits, diet or medication – or whether you should visit your doctor.

Mobile technology is expected to give patients better access to care at lower cost while empowering them to take charge of their health. Pilot projects are under way in areas such as remote patient monitoring, but mHealth – mobile technology in health care – is still in its infancy. Applications already give health care professionals access to patient data and advanced medical research, but current applications for patients remain simple compared with their full potential.

At the University of California, San Francisco, Jeff Jorgenson and his mHealth development team are building next-generation patient apps. Those apps will be highly personal, getting smarter as they learn more about the user. The phone might, for example, urge you to walk past the bar you are approaching because it knows you are stressed out – from the mood information you have entered – and having a few drinks might make you relapse into smoking.

Aiming for behavior-changing applications

Applications already exist for exercising, quitting smoking and dieting, but UCSF experts think they can do much better.

“The App Store apps today are nowhere near what they are going to be,” said Larry Suarez, mobile application architect at UCSF. He came to the university late last summer after working 10 years in the medical industry. “Now the apps are the same for everyone. We really want the phone to service patients, to change according to the user and modify users' behavior.”

Jorgenson came to UCSF from Apple in January 2010 to serve as the assistant director of telemedicine. By late spring, he started to hear concerns that UCSF was falling behind in mobile health. But when his team investigated, it found that others in the field were not as advanced as they claimed.

“The medical world moves a lot slower than consumer electronics,” Jorgenson explained. The team visited industry conferences and what they saw did not impress them. “The commercial vendors were looking at it from commercial and technological standpoints. They haven't really worked out how the physicians fit in and what they want to do with mHealth.”

Jorgenson said his vision is to put the doctor in the phone and give the phone to the patient. “That's the big market,” he said.

UCSF creating an open-source app engine

As an organization, UCSF is in a good position to create future apps. It has the medical knowledge, physicians who are interested in mHealth, the technical support from information services and patients who would use the apps. But it is not easy to coordinate all this, especially when nobody has the money to make it happen. Until now, funding for mHealth projects has come from the information services unit budget and various grants.

The problem with grants is that applicants need to show what they are going to do, and it takes money to create a demo. This is where wStack, an app engine created by Suarez, helps. It provides one engine for creating demos and many kinds of applications. It also works on smartphones such as iPhones and Android devices, as well as on simpler phones.

Suarez said wStack will be open-sourced, probably within a month, meaning anyone will be able to develop the code and use it to create new applications. “We try to be as open as possible,” he said.

wStack collects information from several sources: electronic health records; phone sensors that can, for example, tell whether a user is standing or lying down; medical device sensors; the user's location from satellite navigation; and information supplied by the user, including his or her beliefs and desires. By combining this information, wStack can run programs that deliver relevant guidance to the user in the form that is most effective.
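To make that data-combining idea concrete, here is a minimal Python sketch of how an engine of this kind might merge those input sources into one patient "context" and apply a simple rule to it. Everything here is illustrative – the field names, thresholds and the relapse rule are assumptions for the sake of the example, not taken from the actual wStack code.

```python
# Hypothetical sketch of a context-aware health rule engine.
# Field names and thresholds are illustrative assumptions only.

def build_context(health_record, phone_sensors, device_readings,
                  location, self_report):
    """Merge the five kinds of input the article describes."""
    return {
        "conditions": health_record.get("conditions", []),
        "posture": phone_sensors.get("posture"),          # e.g. "standing"
        "blood_pressure": device_readings.get("blood_pressure"),
        "near_bar": location.get("near_bar", False),
        "mood": self_report.get("mood"),                  # e.g. "stressed"
        "goals": self_report.get("goals", []),
    }

def advise(context):
    """Return short, context-aware messages for the patient."""
    messages = []
    # Rule mirroring the article's example: stress + proximity to a bar
    # + a quit-smoking goal triggers a nudge.
    if (context["mood"] == "stressed" and context["near_bar"]
            and "quit_smoking" in context["goals"]):
        messages.append("You seem stressed and are near a bar; "
                        "drinking could trigger a smoking relapse.")
    bp = context["blood_pressure"]
    if bp and bp[0] >= 180:  # systolic cutoff, illustrative only
        messages.append("Your blood pressure is very high; "
                        "contact your doctor.")
    return messages
```

In a real system, the thresholds and rules would come from the published medical protocols Jorgenson mentions, developed and reviewed with physicians rather than hard-coded.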

Essentially, wStack applies artificial intelligence, drawing conclusions about what is wrong with a patient and what he or she should do. This could free up physicians' time to treat the patients who most urgently need care. But it also raises a question: Can a machine be relied on to make correct diagnoses?

“We believe so,” Jorgenson said. “There are published medical protocols that can give you the criteria. We have to obviously work together with the physicians.”

Selling the mHealth idea

Before mobile health can really experience a breakthrough, the industry needs to convince insurance companies and other stakeholders that apps really can be efficient in keeping people healthier and saving money. “If you can't understand its effectiveness, it's really hard to justify its existence,” Suarez said.

It's not easy to sell the idea of investing in mHealth to care providers either, as Jorgenson has noticed at UCSF. “Wellness doesn't pay – you only get paid when people come in,” he said. “Finding the right monitoring models has been tough.” He plans to create a collection of UCSF-branded apps that enable people to take care of themselves.

Branding is probably going to be vital to mHealth success. When it comes to your health, whom do you trust? A race is under way: who will be the one to create a set of intelligent, trustworthy and efficient mobile health applications first?

“What is mHealth? Whoever gets there first and defines the whole thing is going to drive that definition,” Jorgenson said. “I don't think anyone has done that yet.”
