Research Assistant for AI Assisted Microscopy [Closed]
- Post by: Naamii
- November 2, 2020
Application deadline: 15 November 2020. To apply, please prepare a detailed CV and cover letter as a single PDF and upload it.
We are looking for a Research Assistant to work on a collaborative project with the Kathmandu Institute of Applied Sciences (KIAS) building AI-assisted smartphone microscopes. The successful candidate will explore machine learning algorithms for small-object detection in microscopic images. The project will start by testing and comparing existing state-of-the-art object detection algorithms for identifying cysts and oocysts of parasites such as Giardia and Cryptosporidium in images from both a conventional bright-field microscope and a smartphone microscope. After assessing the efficacy, limitations, and challenges of existing models, and exploring how they can help humans detect the parasites more efficiently and accurately, we will explore novel algorithms in search of better methods. This may also include exploring uncertainty estimation to make the AI-assisted parasite-detection pipeline more robust.
The smartphone microscope has been built at KIAS [1], and we are looking to embed automated parasite detection into it. The current version of the system relies on manual identification and counting of cysts, which is tedious when testing a large number of samples, and it takes time to train a new user. The proposed AI-assisted automated cyst-detection system will allow us to reliably test a large number of samples in a short period of time without needing an experienced user.
The project is expected to yield high-quality research publications with both application novelty (new AI-assisted smartphone microscopes) and methodological novelty (small-object detection in noisy images from low-cost microscopes).
[1] Shrestha, R., Duwal, R., Wagle, S., Pokhrel, S., Giri, B., & Neupane, B. B. (2020). A smartphone microscopic method for simultaneous detection of (oo)cysts of Cryptosporidium and Giardia. PLOS Neglected Tropical Diseases, 14(9), e0008560.
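To make the automation goal concrete: the manual counting step described above can be approximated, in its very simplest classical form, by thresholding a frame and counting connected bright regions. The sketch below is purely illustrative (synthetic data, hypothetical function names; it is not project code and says nothing about the detectors the project will actually use).

```python
import numpy as np

def count_bright_blobs(frame, threshold=0.5):
    """Count 4-connected bright regions (candidate (oo)cysts) in a grayscale frame.

    A deliberately simple classical baseline: threshold the image, then
    flood-fill from each unvisited foreground pixel so that every connected
    component is counted exactly once.
    """
    mask = frame > threshold          # boolean foreground mask (new array)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j]:
                count += 1
                stack = [(i, j)]
                while stack:          # iterative flood fill clears the blob
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x]:
                        mask[y, x] = False
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

# Synthetic 64x64 frame with three bright spots standing in for cysts.
frame = np.zeros((64, 64))
frame[10:14, 10:14] = 1.0
frame[30:33, 40:43] = 1.0
frame[50:52, 20:22] = 1.0
print(count_bright_blobs(frame))  # 3
```

Real microscope frames are far noisier than this synthetic example, which is precisely why the project moves from such baselines to learned detectors.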
Required Skills
- Eagerness to learn, a hardworking attitude, curiosity, and sincerity.
- Fluency in Python and its scientific-computing and machine-learning ecosystem, particularly NumPy and scikit-learn.
- Fluency in one or more popular deep learning frameworks such as PyTorch or TensorFlow.
- Experience in training convolutional neural networks, with a reasonable understanding of basics such as why CNNs are preferred over fully connected networks for images, regularization, overfitting, and the bias–variance tradeoff.
- Good performance in relevant courses such as linear algebra, computer vision, machine learning, image processing, and statistics.
- Good proficiency in communicating the methods and results of experiments.
- Experience with state-of-the-art object detectors, including region proposal networks and single-stage vs. two-stage architectures.
- Knowledge of the importance of regularization and awareness of key challenges such as generalization and explainability.
- Good skills in scientific writing in English and in visualizing experimental results with graphs and figures.
- Experience with Git version control.
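As a quick self-check on the regularization basics mentioned above: in one-dimensional ridge regression the closed-form weight is w = Σxy / (Σx² + λ), so a larger penalty λ visibly shrinks the fitted weight toward zero — the same mechanism that weight decay applies inside a CNN. A toy sketch (illustrative only, unrelated to the project codebase):

```python
def ridge_weight(xs, ys, lam):
    """Closed-form 1-D ridge regression weight: w = sum(x*y) / (sum(x*x) + lam)."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]  # roughly y = 2x with noise

w_unregularized = ridge_weight(xs, ys, lam=0.0)   # ordinary least squares
w_regularized = ridge_weight(xs, ys, lam=10.0)    # penalized fit

# The penalty pulls the weight toward zero: w_regularized < w_unregularized.
print(w_unregularized, w_regularized)
```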
Key Responsibilities
- Implement existing state-of-the-art object detectors on our in-house microscopic image dataset.
- Identify and document key challenges in applying existing models to images from low-cost, smartphone-based microscopes.
- Draft a research paper, under the guidance of the supervisors, on the performance of deep learning object detectors and their role in improving the efficacy of smartphone-based microscopes.
- Explore novel and better methods for small-object detection that could be deployed in smartphone microscopes.
- Communicate your research results to the wider community through publications in international conferences and journals.
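Evaluating the detectors mentioned above typically hinges on intersection over union (IoU) between predicted and ground-truth boxes, and IoU is exactly where small objects like (oo)cysts are unforgiving: a localization error of a couple of pixels barely dents the score on a large object but collapses it on a small one. A minimal sketch (illustrative only; box coordinates are made up):

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)   # overlap area (0 if disjoint)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A 2-pixel shift on a large 100x100 box barely matters...
print(iou((0, 0, 100, 100), (2, 2, 102, 102)))  # ~0.92
# ...but the same 2-pixel shift on a small 6x6 box is severe.
print(iou((0, 0, 6, 6), (2, 2, 8, 8)))          # ~0.29
```

This asymmetry is one reason small-object detection in low-cost microscope images is a methodological challenge in its own right.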
Minimum Required Qualifications
Bachelor's degree in Engineering (final-year students about to graduate may be eligible if they have exceptional experience)
Employment Duration and Salary
- 6 months
- Full-time position with a salary of NRs 30,000 per month.
Contact
Dr. Bishesh Khanal (Email: [email protected])
Dr. Basant Giri (Email: [email protected])