Google recently released the Google Mobile Vision API libraries for Android and iOS, which can perform real-time offline face detection, barcode reading, and OCR. I decided to try the face tracking feature while creating something for Halloween. The face tracker can extract multiple faces from images or directly from a camera stream, and the API can also find landmarks such as the eyes and mouth, and even tell whether an eye is closed or the person is smiling. I only needed the face tracking, however: the app locates your face with the front camera and rotates a scary-looking eye to look straight at you. To seem more authentic, the eye glances around randomly when idle. The app could be used to make Halloween decorations look alive by placing the device behind a mask or inside a pumpkin.
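The core of the effect is simple 2D geometry: given the face center reported by the tracker, rotate the eye sprite toward it. A minimal Python sketch of that calculation (the actual app is an Android app; the function and parameter names here are illustrative, not from the app's code):

```python
import math

def eye_rotation_deg(eye_cx, eye_cy, face_cx, face_cy):
    """Angle in degrees to rotate the eye sprite so its pupil points
    at the detected face center. Screen coordinates: y grows downward."""
    dx = face_cx - eye_cx
    dy = face_cy - eye_cy
    return math.degrees(math.atan2(dy, dx))
```

For example, a face directly to the right of the eye gives 0 degrees, and a face directly below gives 90 degrees; when no face is detected, the app instead picks random idle angles.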
Watching Eye app on the Google Play Store
Fall 2017
I built and programmed a machine vision setup for automatic foam control as part of my master's thesis. A standard webcam monitored two lab-scale fermentors, and the camera feed was analyzed in my open-source fermentation control software, Biorec. An anti-foaming agent was added automatically when persistent foam was detected. The foam detection steps were:
- Calculate the positions, rotations, distances, and angles of the two fermentors from their red reflective tags (POSIT algorithm)
- Crop the reactor area (quadrilateral transformation)
- Preprocess the cropped image (brightness and contrast correction, Euclidean color filtering)
- Segment using Otsu's method
- Recognize foam objects with fuzzy logic
- Foam height was calculated to within ±1 cm, which was accurate enough for dosing the anti-foaming agent automatically
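The segmentation step above can be illustrated with a pure-Python version of Otsu's method, which picks the threshold that maximizes the between-class variance of the grayscale histogram. This is a generic sketch of the algorithm, not code from Biorec:

```python
def otsu_threshold(gray):
    """Otsu's method over a flat list of 8-bit grayscale values (0-255).
    Returns the threshold maximizing between-class variance."""
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total = len(gray)
    sum_all = sum(i * h for i, h in enumerate(hist))

    sum_bg = 0.0   # weighted sum of background intensities so far
    w_bg = 0       # background pixel count so far
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a bimodal image, such as dark foam against a bright reactor wall, the threshold lands between the two intensity peaks, yielding a binary mask for the fuzzy-logic object recognition step.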
Spring 2015
I designed and built a prototype of a pipette light guide that reduces pipetting errors by lighting up the target well. The light is moved manually with forward and back buttons or with a wired/wireless foot pedal, and it has two traversal modes: row by row and column by column. The device was built around a PIC microcontroller in an aluminum DIN-rail case.
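The two traversal modes amount to index arithmetic over the plate. A sketch in Python, assuming a standard 96-well plate (rows A-H, columns 1-12); the real firmware ran on the PIC, so these names and the plate size are illustrative assumptions:

```python
ROWS, COLS = 8, 12  # assumed 96-well plate layout

def next_well(index):
    """Advance the lit well one step, wrapping around the plate."""
    return (index + 1) % (ROWS * COLS)

def well_name(index, mode="row"):
    """Map a linear index to a well label under either traversal mode.
    'row' sweeps A1, A2, ..., A12, B1; 'column' sweeps A1, B1, ..., H1, A2."""
    if mode == "row":
        r, c = divmod(index, COLS)
    else:
        c, r = divmod(index, ROWS)
    return "{}{}".format("ABCDEFGH"[r], c + 1)
```

The forward/back buttons and the foot pedal would then simply increment or decrement the index, with the mode switch choosing the mapping.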
Fall 2015
I designed and built a home automation system using a Raspberry Pi B+, a webcam, a Tellstick Duo USB RF transceiver, and multiple RF power outlets. The system controls home electronics and can be monitored and accessed remotely through an Android app that I wrote. The network connection runs over SSH.
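Telldus hardware like the Tellstick is typically driven with the `tdtool` command-line tool from telldus-core. A sketch of how an RF outlet could be toggled over SSH from a client machine; the host name and device id are illustrative assumptions, not details from this project:

```python
import subprocess

def build_switch_cmd(host, device_id, on):
    """Command line that toggles a configured RF outlet by running
    Telldus's tdtool on the Pi over SSH."""
    action = "--on" if on else "--off"
    return ["ssh", host, "tdtool", action, str(device_id)]

def switch_outlet(host, device_id, on):
    """Run the command; returns the CompletedProcess for inspection."""
    return subprocess.run(build_switch_cmd(host, device_id, on),
                          capture_output=True, text=True)
```

An Android client can achieve the same by issuing the command over an SSH session instead of exposing an open HTTP endpoint, which keeps the remote access authenticated and encrypted.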
Spring 2015
I made this program to demonstrate augmented reality through color tracking and simple shape recognition. It was written in VB.NET using the AForge.NET framework.
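Color tracking of this kind usually starts with a Euclidean color filter (AForge.NET ships one as `EuclideanColorFiltering`): keep only the pixels within a fixed RGB distance of a target color, then track the resulting blob. A minimal pure-Python sketch of the filtering step, not the original VB.NET code:

```python
def color_mask(pixels, target, radius):
    """Euclidean color filter: True for each (r, g, b) pixel whose
    distance to `target` in RGB space is at most `radius`."""
    r0, g0, b0 = target
    mask = []
    for r, g, b in pixels:
        d2 = (r - r0) ** 2 + (g - g0) ** 2 + (b - b0) ** 2
        mask.append(d2 <= radius * radius)
    return mask
```

The centroid and extent of the surviving pixels then give the tracked object's position and approximate shape for the overlay.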
Fall 2014
I built several camera systems in 2014 and 2015 using Raspberry Pis as embedded systems; four of them are currently deployed. They feature:
- Motion detection
- Video recording, with storage for up to a month of footage
- Optional motion email alert with a picture
- IR lights and night vision
- Password-protected web stream
- Wi-Fi
- HD 1280×720 resolution
- Control software for Windows that communicates with the camera over SSH
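Motion detection on hardware this small is commonly done by frame differencing: count the pixels that changed noticeably between consecutive frames and flag motion when a large enough fraction of the image changed. A generic sketch of that idea, with illustrative thresholds rather than the deployed implementation:

```python
def motion_detected(prev_frame, frame, pixel_thresh=25, area_thresh=0.01):
    """Frame differencing over flat lists of grayscale values (0-255).
    Flags motion when more than area_thresh of the pixels changed by
    more than pixel_thresh since the previous frame."""
    changed = sum(
        1 for a, b in zip(prev_frame, frame) if abs(a - b) > pixel_thresh
    )
    return changed / len(frame) > area_thresh
```

A positive result would trigger the recording and, optionally, the email alert with a snapshot attached; the pixel threshold suppresses sensor noise and the area threshold ignores tiny changes such as flickering IR reflections.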
Fall 2014