FACTS & FIGURES

JCR Impact Factor: 1.221
JCR 5-Year IF: 0.961
SCOPUS CiteScore: 2.5
Issues per year: 4
Current issue: Aug 2021
Next issue: Nov 2021
Avg review time: 88 days


PUBLISHER

Stefan cel Mare University of Suceava
Faculty of Electrical Engineering and Computer Science
13, Universitatii Street
Suceava - 720229
ROMANIA

Print ISSN: 1582-7445
Online ISSN: 1844-7600
WorldCat: 643243560
DOI: 10.4316/AECE


TRAFFIC STATS

1,748,458 unique visits
579,693 downloads
Since November 1, 2009









MOST RECENT ISSUES

Volume 21 (2021): Issue 3, Issue 2, Issue 1
Volume 20 (2020): Issue 4, Issue 3, Issue 2, Issue 1
Volume 19 (2019): Issue 4, Issue 3, Issue 2, Issue 1
Volume 18 (2018): Issue 4, Issue 3, Issue 2, Issue 1
Volume 17 (2017): Issue 4, Issue 3, Issue 2, Issue 1

View all issues








LATEST NEWS

2021-Jun-30
Clarivate Analytics published the InCites Journal Citation Reports for 2020. The InCites JCR Impact Factor of Advances in Electrical and Computer Engineering is 1.221 (1.053 without journal self-cites), and the InCites JCR 5-Year Impact Factor is 0.961.

2021-Jun-06
SCOPUS published the CiteScore for 2020, computed with an improved methodology: the citations received in 2017-2020 are divided by the number of papers published in the same time frame. The CiteScore of Advances in Electrical and Computer Engineering for 2020 is 2.5, better than all our previous results.
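The computation described above is a single ratio; a minimal sketch, using hypothetical placeholder counts (not SCOPUS data for this journal):

```python
# CiteScore methodology sketch: citations received in a 4-year window
# divided by documents published in the same window.
# Both counts below are hypothetical placeholders.
citations_2017_2020 = 1000   # citations received in 2017-2020 (placeholder)
documents_2017_2020 = 400    # papers published in 2017-2020 (placeholder)

citescore = citations_2017_2020 / documents_2017_2020
print(round(citescore, 1))  # 2.5
```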

2021-Apr-15
Release of the v3 version of the AECE Journal website. We moved to a new server and implemented the latest cryptographic protocols to ensure better compatibility with the most recent browsers. Our website now accepts only TLS 1.2 and TLS 1.3 secure connections.
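A minimum-version policy like the one described above can be expressed in a few lines; this is a generic sketch using Python's ssl module, not the journal's actual server configuration:

```python
# Sketch: restrict a server-side TLS context to TLS 1.2 and newer,
# so TLS 1.1 and older handshakes are rejected.
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything below TLS 1.2
```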

2020-Jun-29
Clarivate Analytics published the InCites Journal Citation Reports for 2019. The InCites JCR Impact Factor of Advances in Electrical and Computer Engineering is 1.102 (1.023 without journal self-cites), and the InCites JCR 5-Year Impact Factor is 0.734.

2020-Jun-11
Starting on the 15th of June 2020 we will introduce a new policy for reviewers. Reviewers who provide timely and substantial comments will receive a discount voucher entitling them to an APC reduction. Vouchers (worth 25 EUR or 50 EUR, depending on the review quality) will be assigned to reviewers after the final decision on the reviewed paper is given. Vouchers issued to specific individuals are not transferable.



    
 

Issue 3/2013, Article 12

3D Hand Gesture Recognition using the Hough Transform

OPRISESCU, S., BARTH, E.
 
View the paper record and citations in Google Scholar
Author profiles: SCOPUS, IEEE Xplore, Web of Science

Download PDF (715 KB) | Citation | Downloads: 632 | Views: 3,167

Author keywords
image processing, computer vision, gesture recognition, Kinect camera, Hough transform

References keywords
gesture(11), recognition(10)

About this article
Date of Publication: 2013-08-31
Volume 13, Issue 3, Year 2013, On page(s): 71 - 76
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2013.03012
Web of Science Accession Number: 000326321600012
SCOPUS ID: 84884965434

Abstract
This paper presents an automatic 3D dynamic hand gesture recognition algorithm relying on both the intensity and the depth information provided by a Kinect camera. Gesture classification uses a decision tree built on six parameters delivered by the Hough transform of the projected 3D points. The Hough transform is applied, for the first time, to the projected gesture trajectories to obtain a reliable decision. Experimental data from 300 video sequences with different subjects validate the proposed recognition method.
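As a rough illustration of the technique named in the abstract (not the authors' implementation), the sketch below runs a standard line Hough transform over a hypothetical projected 2D trajectory; the paper derives its six decision-tree parameters from transforms of this kind:

```python
# Minimal line Hough transform: each (x, y) point votes, for every angle
# theta, for the line at signed distance rho = x*cos(theta) + y*sin(theta).
import numpy as np

def hough_lines(points, n_theta=180, n_rho=100):
    """Vote each (x, y) point into a (theta, rho) accumulator."""
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    # |rho| can never exceed the largest point radius, so this bounds the bins.
    rho_max = np.hypot(np.abs(pts[:, 0]).max(), np.abs(pts[:, 1]).max())
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in pts:
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        # Map rho from [-rho_max, rho_max] onto bin indices [0, n_rho - 1].
        idx = np.round((rho / rho_max + 1.0) / 2.0 * (n_rho - 1)).astype(int)
        acc[np.arange(n_theta), np.clip(idx, 0, n_rho - 1)] += 1
    return acc

# A straight-line trajectory concentrates all votes in a single cell.
traj = [(t, 2 * t + 1) for t in range(20)]  # hypothetical trajectory on y = 2x + 1
acc = hough_lines(traj)
print(acc.max())  # 20: every trajectory point votes for the same dominant line
```

A gesture trajectory that is nearly straight produces one sharp accumulator peak, while a curved one spreads its votes; statistics of the peaks are what a classifier can then consume.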


References | Cited By

[1] A. Kolb, E. Barth, R. Koch, R. Larsen, "Time-of-Flight Cameras in Computer Graphics," Computer Graphics Forum, vol. 29, no. 1, pp. 141-159, 2010.
[CrossRef] [Web of Science Times Cited 233] [SCOPUS Times Cited 190]


[2] X. Liu and K. Fujimura, "Hand gesture recognition using depth data," Proc. of the 6th IEEE International Conf. on Automatic Face and Gesture Recognition (FGR'04), Washington, DC, USA, pp. 529-534, 2004.

[3] S. Mitra and T. Acharya, "Gesture Recognition: A Survey," IEEE Trans. on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 37, no. 3, pp. 311-324, May 2007.
[CrossRef] [Web of Science Times Cited 975] [SCOPUS Times Cited 1373]


[4] M. B. Holte, T. B. Moeslund, and P. Fihl, "View-invariant gesture recognition using 3D optical flow and harmonic motion context," Computer Vision and Image Understanding, vol. 114, no. 12, pp. 1353-1361, 2010.
[CrossRef] [Web of Science Times Cited 46] [SCOPUS Times Cited 61]


[5] P. Doliotis, A. Stefan, C. McMurrough, D. Eckhard, and V. Athitsos, "Comparing gesture recognition accuracy using color and depth information," in Proceedings of PETRA, pp. 20:1-20:7, 2011.
[CrossRef] [SCOPUS Times Cited 66]


[6] C. Keskin, A. T. Cemgil, and L. Akarun, "DTW Based Clustering to Improve Hand Gesture Recognition," in Proceedings of HBU'11, pp. 72-81, Amsterdam, 2011.
[CrossRef] [SCOPUS Times Cited 21]


[7] L. Gallo, A. P. Placitelli, and M. Ciampi, "Controller-free exploration of medical image data: experiencing the Kinect," Proc. of the 24th IEEE CMBS'11, Piscataway, NJ, USA, 2011.
[CrossRef] [Web of Science Times Cited 112] [SCOPUS Times Cited 172]


[8] S. Soutschek, J. Penne, and J. Hornegger, "3D gesture-based scene navigation in medical imaging applications using time-of-flight cameras," IEEE Conf. on Computer Vision and Pattern Recognition, Workshop on ToF-Camera based Computer Vision, 2008.
[CrossRef] [SCOPUS Times Cited 63]


[9] P. Yanik et al., "Use of Kinect Depth Data and Growing Neural Gas for Gesture Based Robot Control," in Proc. of PervaSense, pp. 283-290, 2012.
[CrossRef] [SCOPUS Times Cited 19]


[10] R.-C. Prodan, S.-G. Pentiuc, R.-D. Vatavu, "An Efficient Solution for Hand Gesture Recognition from Video Sequence," Advances in Electrical and Computer Engineering, vol. 12, no. 3, pp. 85-88, 2012.
[CrossRef] [Full Text] [Web of Science Times Cited 2] [SCOPUS Times Cited 4]


[11] K. Lai, J. Konrad, and P. Ishwar, "A gesture-driven computer interface using Kinect camera," in Proc. Southwest Symposium on Image Analysis and Interpretation, Apr. 2012.

[12] Q. Munib, M. Habeeb, B. Takruri, and H. A. Al-Malik, "American sign language (ASL) recognition based on Hough transform and neural networks," Expert Systems with Applications, vol. 32, no. 1, pp. 24-37, 2007.
[CrossRef] [Web of Science Times Cited 52] [SCOPUS Times Cited 82]


[13] O. Altun, S. Albayrak, "Turkish fingerspelling recognition system using Generalized Hough Transform, interest regions, and local descriptors," Pattern Recognition Letters, vol. 32, no. 13, pp. 1626-1632, 2011.
[CrossRef] [Web of Science Times Cited 10] [SCOPUS Times Cited 12]


[14] N. N. Bhat, "Real time robust hand gesture recognition and visual servoing," Annual IEEE India Conference (INDICON), pp. 1153-1157, 7-9 December 2012.
[CrossRef] [SCOPUS Times Cited 1]


References Weight

Web of Science® Citations for all references: 1,430 TCR
SCOPUS® Citations for all references: 2,064 TCR

Web of Science® Average Citations per reference: 102 ACR
SCOPUS® Average Citations per reference: 147 ACR

TCR = Total Citations for References / ACR = Average Citations per Reference
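The TCR and ACR figures can be recomputed from the per-reference citation counts listed above; dividing by the full reference count (all 14 entries, including those with no citation badge) is our assumption, but it reproduces the published values:

```python
# Per-reference citation counts for references [1]..[14], taken from the
# reference list above (0 where no count is listed).
wos    = [233, 0, 975, 46,  0,  0, 112,  0,  0, 2, 0, 52, 10, 0]
scopus = [190, 0, 1373, 61, 66, 21, 172, 63, 19, 4, 0, 82, 12, 1]

def tcr(cites):
    """Total Citations for References: plain sum over all references."""
    return sum(cites)

def acr(cites):
    """Average Citations per Reference, rounded to the nearest integer."""
    return round(tcr(cites) / len(cites))

print(tcr(wos), acr(wos))        # 1430 102
print(tcr(scopus), acr(scopus))  # 2064 147
```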

In 2010 we introduced, for the first time in scientific publishing, the term "References Weight" as a quantitative indication of the quality ...

Citations for references updated on 2021-10-13 16:48 in 82 seconds.




Note1: Web of Science® is a registered trademark of Clarivate Analytics.
Note2: SCOPUS® is a registered trademark of Elsevier B.V.
Disclaimer: All queries to the respective databases were made by using the DOI record of every reference (where available). Due to technical problems beyond our control, the information is not always accurate. Please use the CrossRef link to visit the respective publisher site.

Copyright ©2001-2021
Faculty of Electrical Engineering and Computer Science
Stefan cel Mare University of Suceava, Romania


All rights reserved: Advances in Electrical and Computer Engineering is a registered trademark of the Stefan cel Mare University of Suceava. No part of this publication may be reproduced, stored in a retrieval system, photocopied, recorded or archived, without the written permission from the Editor. When authors submit their papers for publication, they agree that the copyright for their article be transferred to the Faculty of Electrical Engineering and Computer Science, Stefan cel Mare University of Suceava, Romania, if and only if the articles are accepted for publication. The copyright covers the exclusive rights to reproduce and distribute the article, including reprints and translations.

Permission for other use: The copyright owner's consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific written permission must be obtained from the Editor for such copying. Direct linking to files hosted on this website is strictly prohibited.

Disclaimer: Whilst every effort is made by the publishers and editorial board to see that no inaccurate or misleading data, opinions or statements appear in this journal, they wish to make it clear that all information and opinions formulated in the articles, as well as linguistic accuracy, are the sole responsibility of the author.



