Reading and understanding house numbers for delivery robots using the “SVHN Dataset”

Date

2024-06-05

Advisors

Journal Title

Journal ISSN

Volume Title

Publisher

IEEE

Department

Type

Conference paper

ISSN

2641-0184

Citation

Pradhan O, Tang G, Makris C, Gudipati R. (2024) Reading and understanding house numbers for delivery robots using the “SVHN Dataset”. In: 2024 IEEE International Conference on Industrial Technology (ICIT), 25-27 March 2024, Bristol, UK

Abstract

Detecting street house numbers in complex environments is a challenging robotics and computer vision task that could be valuable in enhancing the accuracy of delivery robots' localisation. The development of this technology also has positive implications for address parsing and postal services. This project focuses on building a robust and efficient system that deals with the complexities associated with detecting house numbers in street scenes. The models in this system are trained on Stanford University's SVHN (Street View House Numbers) dataset. By fine-tuning YOLO's (You Only Look Once) nano model, the system achieved an effective detection range of 1.02 to 4.5 metres. The optimum allowance for angle of tilt was ±15°. The inference resolution was found to be 2160 × 1620, with an inference delay of 35 milliseconds.
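
As an illustration of the approach described in the abstract, the sketch below shows how a YOLO nano model could be fine-tuned on SVHN-style data and used for digit detection. The paper does not specify the exact toolchain; the Ultralytics YOLOv8 nano checkpoint "yolov8n.pt", the dataset config "svhn.yaml" (SVHN annotations converted to YOLO box format, one class per digit), and the image file "house_number.jpg" are assumptions made here for illustration only, not details taken from the paper.

# Minimal sketch, assuming the Ultralytics package and a hypothetical
# "svhn.yaml" dataset config with digit classes 0-9 in YOLO box format.
from ultralytics import YOLO

# Start from the pretrained nano checkpoint and fine-tune on SVHN-style data.
model = YOLO("yolov8n.pt")
model.train(data="svhn.yaml", epochs=100, imgsz=640)

# Run inference on a street-scene image; the paper reports an inference
# resolution of 2160 x 1620 and a delay of about 35 ms per frame.
results = model.predict(source="house_number.jpg", imgsz=2160)
for r in results:
    # Each detected digit has a class id, a confidence score and a bounding box.
    for box in r.boxes:
        print(int(box.cls), float(box.conf), box.xyxy.tolist())

Reading the per-digit boxes left to right would then yield the full house number; the example above only prints the raw detections.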

Description

Keywords

Artificial Intelligence, Character Recognition, Computer Vision, Object Detection, YOLO, SVHN

Rights

Attribution-NonCommercial 4.0 International
