Universal Seven-Segment Reader:
Enhancing Industrial Automation through Computer Vision and IoT
We are a maker club from Naresuan University, passionate about innovation and about collaboration between academia and industry through groundbreaking research and cutting-edge technology. Our team includes:
(from left to right)
Titipan Phetsrikran: OpenCV Developer
Panusorn Banlue: UI/UX & Web Developer
Wansuree Massagram: Maker Club Founder
Thanathorn Phoka: Tech Lead
As one of the finalists in NECTEC’s Maker Startup 2018 “Smart Factory” contest, we have developed IoT solutions deployed at Wuttisak Aesthetic Care CO., LTD., helping the company automate its data collection using image processing.
We believe computer vision and IoT together can improve quality and efficiency of industrial automation.
During the normal QC process at Wuttisak Aesthetic Care, the workers on the factory floor have to log the parameters displayed on a mixing machine.
The mixing machine shows its current status on a panel of several seven-segment displays, a common form of electronic display for decimal numbers. A worker reads the displays and writes the values onto a process tracking sheet, with only one record per mixing batch. These records are tracked and maintained manually (still on paper), and the logs are later verified by the R&D team.
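As background on the recognition target: each digit on a seven-segment display is a fixed on/off pattern of the seven segments (conventionally labeled a–g), so once a recognizer knows which segments are lit, mapping the pattern to a digit is a simple table lookup. The sketch below shows the standard encoding; it is a generic illustration, not the exact algorithm of our OpenCV code (which is covered in a separate post).

```python
# Decode a seven-segment on/off pattern into a digit.
# Segments are ordered (a, b, c, d, e, f, g); 1 = lit, 0 = dark.
# Standard encoding -- a generic sketch, not our production recognizer.
SEGMENT_PATTERNS = {
    (1, 1, 1, 1, 1, 1, 0): 0,
    (0, 1, 1, 0, 0, 0, 0): 1,
    (1, 1, 0, 1, 1, 0, 1): 2,
    (1, 1, 1, 1, 0, 0, 1): 3,
    (0, 1, 1, 0, 0, 1, 1): 4,
    (1, 0, 1, 1, 0, 1, 1): 5,
    (1, 0, 1, 1, 1, 1, 1): 6,
    (1, 1, 1, 0, 0, 0, 0): 7,
    (1, 1, 1, 1, 1, 1, 1): 8,
    (1, 1, 1, 1, 0, 1, 1): 9,
}

def decode_digit(segments):
    """Return the digit for a lit-segment tuple, or None if unrecognized."""
    return SEGMENT_PATTERNS.get(tuple(segments))
```

The hard part in practice is not the lookup but reliably detecting which segments are lit under factory lighting, which is what the image-processing pipeline handles.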
“Automatically collect data with the help of computer vision and NETPIE IoT”
Computer vision unit setup with NETPIE IoT
We are revolutionizing the company’s QC process using our computer vision IoT camera with onboard image processing to automatically read these measurements from seven-segment displays and organize the data using the NETPIE IoT platform.
Our computer vision unit consists of a Raspberry Pi with an LCD touch screen, a webcam, and a tripod. One of these units costs slightly more than 5,000 baht. With a wired or wireless connection, the unit can be placed in front of a target to collect the readouts from up to seven seven-segment display panels.
But why go through the trouble of using computer vision? Why not just plug in some sort of cable (RS232, RS485, MODBUS, CANBUS, EtherCAT, Ethernet, etc.) and read the data over a serial connection?
Our answer is that we aim to be unobtrusive and non-disruptive to the company’s normal workflow. Additionally, the setup of our computer vision unit is flexible and easily scalable: we can deploy a unit anywhere as needed.
- Flexibility and scalability
- Preserving the conventional method
- Offering unobtrusive and non-disruptive solutions with existing hardware
Our computer vision IoT solution involves several aspects ranging from pattern recognition, hardware optimization, to database organization. Here is our IoT network architecture:
The IoT Network Architecture for the Computer Vision Unit on NETPIE
NETPIE supports and connects all components within the system. It facilitates device management, handles hardware/software communication protocols, collects and analyzes data, and enhances the data flow and functionality of smart applications.
For our purposes, NETPIE transfers data between the computer vision unit and an HTML webpage. The computer vision unit on the Raspberry Pi can be controlled via this webpage, and the data from the Pi is published to any subscriber. We also save our data locally in a MongoDB database.
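The publish side of this flow can be sketched as building one JSON message per reading cycle and handing it to the NETPIE client. The field names and topic below are illustrative, not our exact production schema; the actual publish on the Pi goes through the NETPIE microgear client, which is kept in a comment so the sketch stays self-contained.

```python
import json
import time

# Pack the seven panel readouts into one JSON message for publishing.
# Field names here are illustrative, not our exact production schema.
def build_reading_message(temp, currents, rpms):
    """Return a JSON string with the batch temperature plus the
    homo/paddle/scraper currents and RPMs."""
    return json.dumps({
        "timestamp": time.time(),
        "batch_temperature": temp,
        "current": {"homo": currents[0], "paddle": currents[1], "scraper": currents[2]},
        "rpm": {"homo": rpms[0], "paddle": rpms[1], "scraper": rpms[2]},
    })

# On the Pi, the actual publish would use the NETPIE microgear client,
# roughly: microgear.chat("/mixer/readings", build_reading_message(...))
# (the topic name is hypothetical). Subscribers -- the HTML page and the
# MongoDB writer -- receive the same JSON and parse it with json.loads().
```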
Let’s talk about what we have to read from the mixer machine with our computer vision unit.
Parameters to be recorded from the mixing machine
Here we have the readouts of the batch temperature (1), the current for homo/paddle/scraper (2-4), and the RPM for homo/paddle/scraper (5-7).
The picture above shows the computer vision unit installed in front of the mixing machine. When a worker wants to record the measurements from the seven-segment displays, he can turn on the Raspberry Pi and run the shell script by simply double-clicking “open.sh”, as shown below.
Running the computer vision program with a shell script
A window showing the status of the system will appear. The color bar indicates whether the system is ready to take the measurement (red = still calibrating, yellow = ready, green = processing). Once the ON button is pressed, the data is published to NETPIE. As you can see below, all seven parameters of interest (batch temperature, current for homo/paddle/scraper, and RPM for homo/paddle/scraper) are shown in this window.
Status window on Raspberry Pi
So that is what happens on the factory floor.
Other workers, R&D scientists, or managers can remotely log into a web page (on their PCs or mobile devices) to access the same information at the same time. The measurement data are published through NETPIE; we send the numbers as well as images encoded in Base64.
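Embedding the camera image in the same message as the numbers works because Base64 turns binary JPEG data into an ASCII-safe string that can ride inside JSON. A minimal sketch of both directions is below; in the real unit the JPEG bytes would come from OpenCV (e.g. `cv2.imencode(".jpg", frame)[1].tobytes()`), while here placeholder bytes keep the sketch self-contained.

```python
import base64

def frame_to_base64(jpeg_bytes):
    """Encode JPEG bytes as an ASCII-safe Base64 string for a JSON payload."""
    return base64.b64encode(jpeg_bytes).decode("ascii")

def base64_to_frame(b64_string):
    """Recover the original JPEG bytes on the subscriber (webpage/DB) side."""
    return base64.b64decode(b64_string)
```

The webpage can display the decoded string directly via a `data:image/jpeg;base64,...` URI, which is why this encoding is a common choice for image-over-MQTT setups.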
Here is what our HTML page looks like:
As you can see from the picture above, an operator can fill in information about the mixing batch (batch, product code, lot no.) and record the data.
When the work is done for the day, the worker on the factory floor can turn off the computer vision unit by running the shutdown shell script. Just double click “shutdown.sh” and go home. 🙂
Turning off the unit with a shell script
If the factory would like to review previously recorded data, a worker can query the local database we set up and save the records into a .csv file for later inspection and analysis. Below is an example from a test batch.
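The query-and-export step boils down to fetching record documents and serializing them as CSV rows. In production the records would come from a pymongo query such as `collection.find({"batch": batch_no})`; the sketch below uses a plain list of dicts and hypothetical field names so it stays self-contained.

```python
import csv
import io

# Column order for the exported .csv file. These field names are
# illustrative, not our exact database schema.
FIELDS = ["timestamp", "batch", "temperature", "homo_rpm", "paddle_rpm", "scraper_rpm"]

def records_to_csv(records):
    """Serialize a list of record dicts (e.g. from a MongoDB query)
    into a CSV string, ignoring any extra fields such as MongoDB's _id."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        writer.writerow(rec)
    return buf.getvalue()
```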
For more information, you can visit our other blog posts explaining in detail each technique:
- All About Image illustrates how the seven-segment recognition algorithm works
- And the best camera goes to… walks through the steps of how to pick out the most suitable camera for the job
- Are we playing chess? describes the chessboard calibration with OpenCV
- Letters to the Skynet explains how the data are organized and transferred through NETPIE to a database
Since the deployment at Wuttisak Aesthetic Care on February 11, 2019, we have been monitoring our system closely. The only times the system fails to detect the correct numbers are when a worker is adjusting the knobs in front of the mixing machine panels (quickly changing the readouts as well as obscuring the camera’s view), which indicates how robust the system is. As for NETPIE, we have adjusted our sampling rate so that the data uplink does not strain the company’s network connection.
We have a cute story the Wuttisak Aesthetic Care manager shared with us. At the beginning of the deployment, she handled the Raspberry Pi and the server all by herself. But as our unit gained popularity among the factory floor workers, they asked her to teach them how to operate the system: how to turn it on and off, what it does, how it works, etc. We were extremely happy to hear how interested these workers were in our project.
We have also collected feedback and suggestions from the users as well as the NECTEC committee on how to improve our system. The following sections show our plans and actions for the next phase of this project.
Going Above and Beyond:
Understanding the capability of the technology and the ability of our team, we want to push our solutions for Wuttisak Aesthetic Care beyond merely automating and digitizing data collection. We have divided the possible improvements to our system into four aspects: preventive maintenance, error correction, data analysis, and hardware scalability.
1) Preventive Maintenance
One natural addition to our setup is a warning system. Each parameter has its own normal range of operation, as shown in the table below.
| Label | SI unit | Abnormal range | Action required |
| --- | --- | --- | --- |
| Main tank temperature | °C | | |
| Homo current meter | A | > 5.0 A | |
| Paddle current meter | A | > 1.0 A | |
| Scraper current meter | A | > 1.6 A | |
| Homo RPM meter | RPM | > 3512 RPM | |
| Paddle RPM meter | RPM | > 97 RPM | |
| Scraper RPM meter | RPM | > 50 RPM | |
The financial burden of failures in mixture batches or motors could set the company back from 10,000 baht up to 10,000,000 baht. Hence, preventive maintenance and an early warning system are imperative to the company operation.
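An out-of-range check against these thresholds is a straightforward comparison per parameter. The sketch below uses the current and RPM thresholds from the table; the dictionary layout and parameter names are illustrative, not our production code.

```python
# Abnormal-range thresholds, taken from the table above.
# Parameter names and the dict layout are an illustrative sketch.
THRESHOLDS = {
    "homo_current": 5.0,      # A
    "paddle_current": 1.0,    # A
    "scraper_current": 1.6,   # A
    "homo_rpm": 3512,         # RPM
    "paddle_rpm": 97,         # RPM
    "scraper_rpm": 50,        # RPM
}

def check_readings(readings):
    """Return the names of parameters whose values exceed their threshold."""
    return [name for name, value in readings.items()
            if name in THRESHOLDS and value > THRESHOLDS[name]]
```

Running this check on every published reading is what lets the system flag a problem within one sampling interval rather than at the end-of-batch paper review.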
Our current system can already flag abnormal readings in both the graphs and the recorded data. An example of a (simulated) warning from the ammeters is shown below in the yellow pop-up window.
A flag for the warning system
We have also added a Line notification feature to alert workers and managers in case of irregularities or emergencies. A notification is always sent when a new batch starts, and if any parameter goes out of range, a notification is sent as well, as shown below.
A Line notification for preventive maintenance
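Sending such an alert through LINE Notify amounts to formatting a short message and making one HTTP POST with an access token. The message wording below is illustrative, and the POST itself is shown only as a comment (with a placeholder token) so the sketch stays self-contained and offline.

```python
# Format the alert text sent when a parameter goes out of range.
# The wording is illustrative, not our exact production message.
def format_alert(parameter, value, threshold):
    """Build a human-readable out-of-range alert message."""
    return ("[Warning] {p} = {v} exceeds the normal limit of {t}."
            .format(p=parameter, v=value, t=threshold))

# Sending it is a single POST to the LINE Notify endpoint with the access
# token in the Authorization header, e.g. (token is a placeholder):
#   requests.post("https://notify-api.line.me/api/notify",
#                 headers={"Authorization": "Bearer <LINE_TOKEN>"},
#                 data={"message": format_alert("homo current", 5.4, 5.0)})
```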
2) Error Correction
We have found two major causes of data error in our system. The first occurs when our algorithm cannot detect any number on the panel; the second occurs when our system cannot establish a connection to NETPIE.
The simplest way to overcome these errors is data interpolation, a popular technique for dealing with missing data. Interpolation estimates each missing value from the last valid value before it and the first valid value after it. This feature should be straightforward to add to our data management.
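A minimal sketch of that idea, assuming missing readings are recorded as `None` and filled by linear interpolation between the surrounding valid values (gaps at either end of the series are left untouched, since one side has no anchor):

```python
def interpolate_gaps(values):
    """Return a copy of `values` with interior runs of None filled by
    linear interpolation between the last valid value before the gap
    and the first valid value after it."""
    filled = list(values)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            start = i - 1                      # last valid index before the gap
            j = i
            while j < len(filled) and filled[j] is None:
                j += 1                         # j = first valid index after the gap
            if start >= 0 and j < len(filled):
                step = (filled[j] - filled[start]) / (j - start)
                for k in range(i, j):
                    filled[k] = filled[start] + step * (k - start)
            i = j
        else:
            i += 1
    return filled
```

Because mixer parameters change slowly relative to the sampling rate, a short linear fill like this is a reasonable stopgap for one or two dropped readings, though longer outages would still need to be flagged rather than silently filled.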
3) Data Analysis
With more and more data aggregated for the factory, the R&D department could start spotting trends, discovering useful information, drawing conclusions, supporting decision-making for the company, and ultimately achieving more effective operation.
We are currently exploring the possibility of aiding Wuttisak Aesthetic Care in this aspect.
4) Hardware Scalability
In our current system, we can deploy a computer vision unit (a Raspberry Pi and a webcam) anywhere as needed. A bigger factory might need several more sets; a smaller one might not need as much capacity.
Another idea regarding the camera: instead of a small webcam, we could pick a reasonably priced IP camera with good resolution, as seen in the example picture below. The IP camera receives control data and sends image data over the network, and the pictures could then be processed on a much more capable computer. However, moving in that direction would mean moving away from edge computing on the Raspberry Pi. This is one of many tradeoffs we will have to weigh when designing IoT solutions.
Example picture from an IP camera (HIK Vision) captured with OpenCV
We have designed and implemented an IoT solution that can universally read seven-segment display panels. Our work combines computer vision techniques and IoT to address the following industrial problems:
- data collection
  - legacy data collection techniques are slow/expensive/limited
  - data is aggregated manually/visually/discretely (for R&D and QC)
- data access
  - data is manually transcribed into electronic format
  - retrieving records is slow and arduous
- capital cost
With our solution to automatically collect data with the help of computer vision and NETPIE IoT, we have demonstrated that we can
- expand scope (frequency) of data collection
- remove sources of human error
- reduce human workload (long-term financial/emotional costs)
Our system offers these competitive advantages:
- flexibility and scalability
- preserving the conventional method
- offering unobtrusive and non-disruptive solutions with existing hardware
and could potentially
- provide initial steps to real-time process control
- support future integration with the ERP system
Since the day we, Naresuan University Maker Club, decided to get involved with NECTEC’s Maker Startup 2018 “Smart Factory” Contest, each and every one of us has learned tremendously from this experience. We have worked tirelessly — fulfilling our academic obligations during the days and hacking away on this project during the nights.
As students and faculty members of a higher-education institution, we rarely have a chance to get out of our ivory tower and into the real world. We have learned valuable lessons in grit, compassion, and empathy through our attempts to help others.
We are grateful for the opportunity to foster IoT deployment with Wuttisak Aesthetic Care CO., LTD., and increase collaboration among other like-minded makers through the NETPIE platform.
We would like to thank everyone who has helped make this project possible. The Department of Computer Science and IT, Asst. Prof. Sanya Khruahong, Dr. Noah Hafner, and all maker club members — there is no word to express our gratitude.