| Author(s) | L. Bala Gangamma, T. Kaveri, V. Ahalya, H. Mounika, S. Mounika |
|---|---|
| Country | India |
| Abstract | Industrial machines are central to the modern factory, and when machinery is overworked it can overheat and suffer mechanical fatigue, increasing the risk of breakdowns and accidents. Traditional monitoring methods, such as periodic manual inspection or reliance on a single sensor type, are often slow, inefficient, and prone to error. To overcome these shortcomings, this project develops an intelligent real-time monitoring system that combines thermal and visual imaging. The thermal camera tracks temperature changes to identify abnormal heat distribution, while the visual camera captures surface and structural patterns associated with wear and fatigue. Fusing the data from the two cameras with image fusion techniques yields diagnostic output that is more informative and more precise. The analysis is performed with deep learning models such as MobileNetV3 and ResNet-50, which deliver high accuracy at a light computational cost. Because the system is fast, cost-effective, and easy to deploy, it is well suited to real-time use in factories, outdoor settings, and hazardous environments. It can detect problems at an early stage, helping to prevent equipment failures, reduce downtime, and improve workplace safety. |
| Keywords | MobileNetV3, ResNet-50, Heat risk detection. |
| Discipline | Engineering |
| Published In | Volume 4, Issue 2, March-April 2026 |
| Published On | 2026-04-04 |
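
As a rough illustration of the pipeline described in the abstract, the sketch below fuses a registered thermal frame with a visual frame and classifies the result with a MobileNetV3 backbone. The paper does not specify its fusion method or class set, so the details here are assumptions: thermal input is single-channel, fusion is channel-wise concatenation followed by a 1x1 convolution, and the task is a two-class decision (normal vs. at-risk). The class name `ThermalVisualFusionNet` is hypothetical.

```python
# Minimal sketch, assuming PyTorch/torchvision and an ImageNet-pretrained
# MobileNetV3-Small backbone; not the authors' exact architecture.
import torch
import torch.nn as nn
from torchvision import models


class ThermalVisualFusionNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Assumed fusion step: concatenate a 3-channel visual frame and a
        # 1-channel thermal frame, then map the 4 channels back to 3 so the
        # pretrained backbone can consume the fused tensor.
        self.fuse = nn.Conv2d(4, 3, kernel_size=1)
        self.backbone = models.mobilenet_v3_small(weights="IMAGENET1K_V1")
        in_features = self.backbone.classifier[-1].in_features
        # Replace the final layer for the assumed two-class heat-risk decision.
        self.backbone.classifier[-1] = nn.Linear(in_features, num_classes)

    def forward(self, visual: torch.Tensor, thermal: torch.Tensor) -> torch.Tensor:
        fused = self.fuse(torch.cat([visual, thermal], dim=1))
        return self.backbone(fused)


if __name__ == "__main__":
    model = ThermalVisualFusionNet()
    visual = torch.randn(1, 3, 224, 224)   # RGB camera frame
    thermal = torch.randn(1, 1, 224, 224)  # spatially registered thermal frame
    logits = model(visual, thermal)
    print(logits.shape)  # torch.Size([1, 2])
```

A ResNet-50 backbone could be swapped in the same way (replacing `backbone.fc` instead of `classifier[-1]`); MobileNetV3 is shown here because the abstract emphasizes lightweight computation for real-time deployment.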
