For my first long tweet I am so excited to recognize Varun Shenoy from Cupertino HS -- one of the smart, smart High School students at #AMIA2017 evaluated recognition of 7-segment numerical display -- how cool is that??? @nlm_news congratulates you! Welcome to @AMIAinformatics!— Patti Brennan (@NLMdirector) November 8, 2017
About a year ago, I began developing Biosnap, an app that automatically reads data from seven-segment medical monitors using nothing but a photo and a bit of computer vision and machine learning magic. My mentor advised me to submit a research paper on this project to the American Medical Informatics Association (AMIA) Annual Symposium. After many late-night programming sessions, I finally finished both the paper and the application. The day of submission was the most hectic, and I remember it clearly: my mentor and I kept sending updated drafts of the paper back and forth, and we managed to submit it online just in the nick of time.
Anyway, I spent the next few weeks creating a poster for my research. I planned to compete at the Synopsys Science Fair, our local science competition, in the computer science category. I didn't expect to win anything, mainly because I didn't consider my project a technical feat. And I was correct. Although I was disappointed by the results of the science fair, I continued to work on other projects and hone my data science and machine learning skills.
A few months later, I received an email.
I was in. I was accepted to a professional conference for research I did mostly in my own house. Not only that, I was accepted for an oral presentation in a session. Generally, only the top papers are accepted for oral presentations; the rest are placed in poster sessions. I was honestly blown away.
Last week, I attended a single day of the conference (the day I was scheduled to present) and it was a blast. We took a red-eye flight to Washington DC and got about four hours of sleep before my presentation. My dad and I went to the session room early to make sure the cords and slideshow worked with the projector. We grabbed some breakfast (a single New York cheesecake donut from Krispy Kreme) and headed back to the session. Ten minutes before the session, the room was empty apart from a few individuals who had arrived early and were browsing their laptops and phones. By the time the session began, there were upwards of 70 people in the room. There weren't enough chairs, so many had to stand in the back. Everyone seemed either very young (graduate students) or middle-aged (practicing professionals).
I was the second presenter. My presentation was squeezed between a PhD student from Columbia University and a postdoc from UIUC in S101: Oral Presentations - mHealth Applications to Support Data Capture and Consumer Engagement. Although I was nervous at first, I quickly overcame my restlessness and gave my talk. I was asked many questions, ranging from "do you have any quantitative results that compare using the app to manually typing in the information" to "wouldn't a statistical overlay atop individual digits provide a simpler approach than your outlined random forest". It went very well, and many in the audience nodded as I answered questions, which gave me immense confidence.
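That second question stuck with me. The idea, as I understood it, is that a seven-segment display encodes each digit as a fixed on/off pattern of its seven segments, so once you've decided which segments are lit (say, by thresholding pixel intensity in each segment's region), the digit falls out of a lookup table, no trained classifier required. A minimal sketch of that decoding step, with an assumed segment ordering (this is my illustration of the questioner's suggestion, not the approach my paper actually used):

```python
# Hypothetical sketch: decode a seven-segment digit from on/off segment flags.
# Assumed segment order: (top, top-left, top-right, middle,
#                         bottom-left, bottom-right, bottom).
# The upstream vision step (locating each segment and thresholding it
# to 0/1) is assumed to have already happened.

SEGMENT_PATTERNS = {
    (1, 1, 1, 0, 1, 1, 1): 0,
    (0, 0, 1, 0, 0, 1, 0): 1,
    (1, 0, 1, 1, 1, 0, 1): 2,
    (1, 0, 1, 1, 0, 1, 1): 3,
    (0, 1, 1, 1, 0, 1, 0): 4,
    (1, 1, 0, 1, 0, 1, 1): 5,
    (1, 1, 0, 1, 1, 1, 1): 6,
    (1, 0, 1, 0, 0, 1, 0): 7,
    (1, 1, 1, 1, 1, 1, 1): 8,
    (1, 1, 1, 1, 0, 1, 1): 9,
}

def decode_digit(segments):
    """Return the digit for a 7-tuple of on/off segment flags, or None."""
    return SEGMENT_PATTERNS.get(tuple(segments))
```

For example, `decode_digit((0, 0, 1, 0, 0, 1, 0))` returns `1`. The catch, of course, is that the hard part is the upstream step: reliably segmenting the display and each digit under glare, skew, and varying lighting, which is where a learned model can earn its keep.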
#S101 #AMIA2017 Investigator Varun Shenoy trained a mobile app to do OCR on off-the-shelf health devices with 7-segment LED displays (like your old calculator or those inexpensive BP cuffs). BTW, he’s a high school junior! pic.twitter.com/dryUr9HnSU— Bimal Desai, MD, MBI (@origamidoc) November 8, 2017
Afterwards, my dad and I ate some bibimbap for lunch and took a stroll to the White House. It was a fantastic experience and I’m looking forward to many more research symposiums in the future.