A Smart Menu Using Video Processing for Restaurants

Authors

  • Baswaraj Baswaraj, Department of Electronics and Communication Engineering, East West Institute of Technology, Bangalore, India
  • Bhaskar Tigadi, Department of Electronics and Communication Engineering, East West Institute of Technology, Bangalore, India
  • Jayanth M S, Department of Electronics and Communication Engineering, East West Institute of Technology, Bangalore, India
  • Kallesh B S, Department of Electronics and Communication Engineering, East West Institute of Technology, Bangalore, India
  • Bhagya Bhagya, Department of Electronics and Communication Engineering, East West Institute of Technology, Bangalore, India
  • Anand M, Department of Electronics and Communication Engineering, East West Institute of Technology, Bangalore, India

Keywords

Touchscreen, Cost-effective, Video Processing

Abstract

Restaurants worldwide are trying different techniques to attract customers, whether through creative advertising or by providing high-tech interaction. Several restaurants have adopted table-sized touchscreens on which customers can place orders while using the same surface as a dining table. With this in mind, a tabletop-based, eco-friendly touch-sensing system is proposed in this project. Here, the screen is simply a sheet of paper placed on a glass table. Touch-based menu ordering allows customers to select any items of their choice from the menu. Video processing is performed in MATLAB and Simulink to extract the finger blob from the feed of a camera placed under the table. The list of selected items and the total amount are spoken to the customer for confirmation, after which the order is sent to the chef. The proposed technique thus provides an e-waste-free, cost-effective, eco-friendly touch-sensing system.
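As a concrete illustration of the blob-extraction step, the following is a minimal MATLAB sketch, assuming a USB camera under the glass table (via the MATLAB Support Package for USB Webcams), a pre-captured background frame of the empty menu sheet, and Image Processing Toolbox functions; the menuRegions rectangles and itemNames are hypothetical placeholders, not the paper's actual menu layout:

    % Finger-blob detection sketch (hypothetical, not the authors' exact code).
    cam        = webcam(1);                 % camera mounted under the glass table
    background = rgb2gray(snapshot(cam));   % reference frame: menu with no finger

    % Hypothetical menu layout: each row is an [x y width height] rectangle.
    menuRegions = [ 50  50 200 80;          % region for item 1
                    50 150 200 80];         % region for item 2
    itemNames   = {'Masala Dosa', 'Idli'};  % placeholder item names

    frame   = rgb2gray(snapshot(cam));      % current frame with a finger present
    diffImg = imabsdiff(frame, background); % change relative to the empty menu
    mask    = imbinarize(diffImg);          % threshold to a binary image
    mask    = bwareaopen(mask, 200);        % discard small noise blobs

    % Treat the largest remaining blob as the fingertip contact point.
    props = regionprops(mask, 'Area', 'Centroid');
    if ~isempty(props)
        [~, k] = max([props.Area]);
        c = props(k).Centroid;              % (x, y) of the finger blob
        for i = 1:size(menuRegions, 1)      % map centroid to a menu item
            r = menuRegions(i, :);
            if c(1) >= r(1) && c(1) <= r(1) + r(3) && ...
               c(2) >= r(2) && c(2) <= r(2) + r(4)
                fprintf('Selected: %s\n', itemNames{i});
            end
        end
    end
    clear cam;                              % release the camera

The spoken confirmation of the selected items and the total amount could then be produced by any text-to-speech back end; the abstract does not specify which mechanism the authors used.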

Published

2022-06-13

How to Cite

[1]
B. Baswaraj, B. Tigadi, J. M S, K. B S, B. Bhagya, and A. M, “A Smart Menu Using Video Processing for Restaurants”, pices, pp. 32-35, Jun. 2022.

Section

Articles