AMIS

Microscope Smart Sensors

The AMIS project aims to develop a next-generation adaptive microscope control system with automated image analysis tools for the detection and tracking of living cells at high spatio-temporal resolution. Such high-throughput screening systems will replace the currently prevalent practice of visual inspection and manual analysis, delivering results with unprecedented efficiency, completeness and consistency and thereby making large-scale datasets amenable to thorough statistical evaluation.

Our research focuses on the following issues:

  • Development of effective, robust algorithms for cell detection and tracking in fluorescence microscopy
  • Exploitation of hardware parallelism in image preprocessing for high-throughput applications (a minimal sketch follows this list)
  • Investigation of effective data reduction and management strategies to cope with the immense volume of image data
  • Exploration of optimized configurations and integration strategies for heterogeneous microprocessors in embedded vision systems with high-throughput and high-content analysis requirements
  • Integration and interfacing with the image acquisition and control logic units of a next-generation microscope platform with self-adaptation capabilities
  • Streamlining of the hardware/software co-design methodology for embedded computer vision systems
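
To make the parallelism item above concrete, the sketch below splits the purely pointwise preprocessing of one frame (dark-frame subtraction, flat-field correction, thresholding) into independent strips that are processed concurrently. This is an illustrative Python/NumPy model only: the strip count, the calibration frames, the threshold value and the use of a CPU thread pool are assumptions that stand in for the pixel-parallel pipelines an FPGA provides.

    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def preprocess_strip(strip, dark, flat, threshold):
        # Pointwise dark-frame subtraction, flat-field correction and
        # thresholding: no neighbourhood dependencies, so every strip
        # (ultimately every pixel) can be processed independently.
        corrected = (strip.astype(np.float32) - dark) / flat
        return corrected > threshold

    def preprocess_frame(frame, dark, flat, threshold=100.0, n_strips=4):
        # Split the frame and its calibration images into horizontal strips,
        # preprocess them in parallel, and reassemble the binary mask.
        frame_s = np.array_split(frame, n_strips, axis=0)
        dark_s = np.array_split(dark, n_strips, axis=0)
        flat_s = np.array_split(flat, n_strips, axis=0)
        with ThreadPoolExecutor(max_workers=n_strips) as pool:
            masks = pool.map(preprocess_strip, frame_s, dark_s, flat_s,
                             [threshold] * n_strips)
        return np.vstack(list(masks))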

A powerful, versatile, yet efficient image analysis system is decisive for automating high-throughput imaging experiments with living cells. The processing pipeline outlined below is designed to handle the immense volume of data intelligently, extracting the relevant information on the fly for cell property analysis and for adaptive microscope control that maintains optimal experimental conditions.
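
As an illustration of the detection stage of such a pipeline, the following sketch reduces a raw fluorescence frame to a compact per-cell feature table (centroid, area, mean intensity) using standard SciPy operations. The smoothing width, the global threshold heuristic and all names are illustrative assumptions, not the algorithms actually deployed in the system.

    import numpy as np
    from scipy import ndimage

    def detect_cells(frame, sigma=2.0, threshold=None):
        # Reduce one fluorescence frame to a list of per-cell records:
        # (label, centroid_y, centroid_x, area_px, mean_intensity).
        smoothed = ndimage.gaussian_filter(frame.astype(np.float32), sigma=sigma)

        # Crude global threshold as a stand-in for a proper background model.
        if threshold is None:
            threshold = smoothed.mean() + 2.0 * smoothed.std()
        mask = smoothed > threshold

        # Connected-component labelling: one region per cell candidate.
        labels, n_cells = ndimage.label(mask)

        features = []
        for idx in range(1, n_cells + 1):
            region = labels == idx
            cy, cx = ndimage.center_of_mass(region)
            features.append((idx, float(cy), float(cx), int(region.sum()),
                             float(frame[region].mean())))
        return features

Keeping only such feature records instead of the raw pixel data is also one way to realize the data reduction mentioned in the list above.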

To meet the stringent real-time requirements of adaptive microscope control, the cell detection and tracking algorithms are implemented on a heterogeneous multiprocessor embedded system combining FPGAs and DSPs and are integrated into a next-generation microscope platform.
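
The frame-to-frame association step of such a tracker can be sketched as a gated nearest-neighbour assignment between the centroid lists of consecutive frames. The code below is a plain NumPy illustration of the algorithmic idea; the 20-pixel gate and the greedy matching strategy are assumptions, and in the actual system this class of computation is mapped onto the FPGA/DSP hardware.

    import numpy as np

    def associate(prev_centroids, curr_centroids, max_dist=20.0):
        # Greedy gated nearest-neighbour matching between two frames.
        # Returns (prev_index, curr_index) pairs; detections further apart
        # than max_dist pixels are left unmatched.
        prev = np.asarray(prev_centroids, dtype=float)
        curr = np.asarray(curr_centroids, dtype=float)
        if prev.size == 0 or curr.size == 0:
            return []

        # Pairwise Euclidean distances between old and new detections.
        dists = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)

        matches, used_prev, used_curr = [], set(), set()
        # Repeatedly take the globally closest remaining pair within the gate.
        for flat in np.argsort(dists, axis=None):
            i, j = np.unravel_index(flat, dists.shape)
            if dists[i, j] > max_dist:
                break
            if i in used_prev or j in used_curr:
                continue
            matches.append((int(i), int(j)))
            used_prev.add(i)
            used_curr.add(j)
        return matches

Unmatched current detections would start new tracks and unmatched previous tracks would be flagged as lost; a Hungarian assignment or a motion model could replace the greedy step without changing the interface.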

Videos

  • Tracking three distinct cells for 12 h
  • Tracking all visible cells simultaneously for 12 h

People

  • Yang Chen, M.Sc.
  • Dipl.-Ing. Matthias Geisbauer
  • Thorsten Röder, M.Sc.
  • Dr. Martin Wojtczyk

Acknowledgement

The project on which this report is based was funded by the German Federal Ministry of Economics and Technology (BMWi) under grant number 16IN0676, on the basis of a decision by the German Bundestag.

Publications

[1] Matthias Geisbauer, Thorsten Röder, Yang Chen, Alois Knoll, and Rainer Uhl. Adaptive platform for fluorescence microscopy-based high content screening. In Proceedings of SPIE Medical Imaging, San Diego, CA, 2010.
[2] Thorsten Röder, Matthias Geisbauer, Yang Chen, Alois Knoll, and Rainer Uhl. A system architecture for online data interpretation and reduction in fluorescence microscopy. In Proceedings of IS&T/SPIE Electronic Imaging, San Jose, CA, 2010.
[3] Giorgio Panin, Claus Lenz, Suraj Nair, Erwin Roth, Martin Wojtczyk, Thomas Friedlhuber, and Alois Knoll. A unifying software architecture for model-based visual tracking. In Proceedings of the IS&T/SPIE 20th Annual Symposium on Electronic Imaging, San Jose, CA, 2008.
[4] Martin Wojtczyk, Kushal Abhyankar, Suraj Nair, and Alois Knoll. A computer vision based approach for cell tracking to increase throughput in visual drug discovery. In Natural Product Discovery and Production II, Whistler, B.C., Canada, 2008.