FACTS & FIGURES

JCR Impact Factor: 1.102
JCR 5-Year IF: 0.734
SCOPUS CiteScore: 2.5
Issues per year: 4
Current issue: May 2021
Next issue: Aug 2021
Avg review time: 73 days


PUBLISHER

Stefan cel Mare
University of Suceava
Faculty of Electrical Engineering and
Computer Science
13, Universitatii Street
Suceava - 720229
ROMANIA

Print ISSN: 1582-7445
Online ISSN: 1844-7600
WorldCat: 643243560
doi: 10.4316/AECE


TRAFFIC STATS

1,660,111 unique visits
534,269 downloads
Since November 1, 2009













LATEST NEWS

2021-Jun-06
SCOPUS published the CiteScore for 2020, computed using an improved methodology: the citations received in 2017-2020 are summed and divided by the number of papers published in the same time frame. The 2020 CiteScore of Advances in Electrical and Computer Engineering is 2.5, better than all our previous results.
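The CiteScore methodology described above reduces to a single division. A minimal sketch with made-up counts (the actual 2020 numerator and denominator are not given on this page):

```python
# Hypothetical illustration of the CiteScore formula: citations received in a
# four-year window divided by documents published in the same window.
def cite_score(citations_in_window: int, documents_in_window: int) -> float:
    """Return citations per published document."""
    return citations_in_window / documents_in_window

# Example with invented counts: 500 citations to 200 papers gives 2.5.
print(cite_score(500, 200))  # 2.5
```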

2021-Apr-15
Release of the v3 version of the AECE Journal website. We moved to a new server and implemented the latest cryptographic protocols to ensure better compatibility with the most recent browsers. Our website now accepts only TLS 1.2 and TLS 1.3 secure connections.
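A client can verify a minimum-TLS-version policy like the one above with Python's standard `ssl` module. This is a generic sketch, not the journal server's actual configuration; the connection helper takes any hostname as a placeholder.

```python
import socket
import ssl

# Build a client context that refuses anything older than TLS 1.2,
# mirroring a "TLS 1.2 and 1.3 only" server policy.
def strict_tls_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.1 and older
    return ctx

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect and report the negotiated protocol, e.g. 'TLSv1.2' or 'TLSv1.3'."""
    with socket.create_connection((host, port), timeout=10) as sock:
        with strict_tls_context().wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()
```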

2020-Jun-29
Clarivate Analytics published the InCites Journal Citations Report for 2019. The InCites JCR Impact Factor of Advances in Electrical and Computer Engineering is 1.102 (1.023 without Journal self-cites), and the InCites JCR 5-Year Impact Factor is 0.734.

2020-Jun-11
Starting on the 15th of June 2020 we will introduce a new policy for reviewers. Reviewers who provide timely and substantial comments will receive a discount voucher entitling them to an APC reduction. Vouchers (worth 25 EUR or 50 EUR, depending on the review quality) will be assigned to reviewers after the final decision on the reviewed paper is given. Vouchers issued to specific individuals are not transferable.

2019-Dec-16
Starting on the 15th of December 2019 all paper authors are required to enter their SCOPUS IDs. You may use the free SCOPUS ID lookup form to find yours in case you don't remember it.

Read More »


    
 

  3/2020 - 10

Edge-preserving Filtering and Fuzzy Image Enhancement in Depth Images Captured by Realsense Cameras in Robotic Applications

TADIC, V., ODRY, A., BURKUS, E., KECSKES, I., KIRALY, Z., ODRY, P.
 

Download PDF (1,857 KB) | Citation | Downloads: 258 | Views: 541

Author keywords
filtering algorithms, fuzzy logic, image enhancement, robots, stereo vision

References keywords
intel(21), depth(19), image(14), realsense(13), robot(11), odry(10), technologies(9), group(9), vision(8), processing(8)

About this article
Date of Publication: 2020-08-31
Volume 20, Issue 3, Year 2020, On page(s): 83 - 92
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2020.03010
Web of Science Accession Number: 000564453800010
SCOPUS ID: 85090336328

Abstract
This paper presents the use of depth cameras in robotic applications and the effects of post-processing on the captured depth images. The performance of the depth cameras and of the post-processing image enhancement is evaluated with the aim of improving depth-based object detection. First, the edge-preserving exponential moving average (EMA) filter and the fuzzy contrast enhancement procedure are briefly introduced. Then, the use of depth cameras with post-processing methods is demonstrated on the example of painting robots. The stereo depth camera is essential in such robotic applications, since it supplies the initial steps in a series of robotic operations whose goal is to detect and extract obstacles on walls that are not intended to be painted.
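The core idea of an edge-preserving EMA filter can be sketched in a few lines: smooth each depth pixel toward its neighbour only when the depth step is below a threshold, so large jumps (object edges) stay intact. This is an illustrative sketch, not the authors' implementation; the row-wise scan, `alpha`, and `delta` are assumptions chosen for clarity.

```python
import numpy as np

def ema_edge_preserving(depth: np.ndarray, alpha: float = 0.5,
                        delta: float = 20.0) -> np.ndarray:
    """Smooth a depth image left-to-right with an EMA, skipping edges.

    A pixel is blended with its left neighbour only when their difference
    is below `delta`, so depth discontinuities are preserved.
    """
    out = depth.astype(float).copy()
    rows, cols = out.shape
    for r in range(rows):
        for c in range(1, cols):
            if abs(out[r, c] - out[r, c - 1]) < delta:  # flat region, smooth
                out[r, c] = alpha * out[r, c] + (1 - alpha) * out[r, c - 1]
    return out

# The flat regions are smoothed while the 100 -> 500 edge survives untouched.
row = np.array([[100, 102, 101, 500, 502]])
print(ema_edge_preserving(row))  # [[100. 101. 101. 500. 501.]]
```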


References | Cited By

[1] M. Carfagni, R. Furferi, L. Governi, C. Santarelli, M. Servi, "Metrological and Critical Characterization of the Intel D415 Stereo Depth Camera", Sensors, 2019, 19, 489;
[CrossRef] [Web of Science Times Cited 24] [SCOPUS Times Cited 27]


[2] J. Hu, Y. Niu, Z. Wang, "Obstacle Avoidance Methods for Rotor UAVs Using RealSense", 2017, Chinese Automation Congress (CAC),
[CrossRef] [SCOPUS Times Cited 7]


[3] S. Giancola, M. Valenti, R. Sala, "A Survey on 3D Cameras: Metrological Comparison of Time-of-Flight, Structured-Light and Active Stereoscopy Technologies", SpringerBriefs in Computer Science, Springer, ISSN 2191-5768, 2018,
[CrossRef] [Web of Science Times Cited 26] [SCOPUS Times Cited 12]


[4] L. Keselman, J. I. Woodfill, A. Grunnet-Jepsen, A. Bhowmik, "Intel RealSense Stereoscopic Depth Cameras", 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops,
[CrossRef] [Web of Science Times Cited 29] [SCOPUS Times Cited 75]


[5] R. L. Lagendijk, R. E. H. Franich, E. A. Hendriks, "Stereoscopic Image Processing",
[CrossRef]


[6] F. L. Siena, B. Byrom, P. Watts, P. Breedon, "Utilising the Intel RealSense Camera for Measuring Health Outcomes in Clinical Research", Journal of Medical Systems (2018) 42: 53,
[CrossRef] [Web of Science Times Cited 17] [SCOPUS Times Cited 29]


[7] "Intel RealSense D400 Series Product Family Datasheet", New Technologies Group, Intel Corporation, 2019, Document Number: 337029-005.

[8] A. Grunnet-Jepsen, D. Tong, "Depth Post-Processing for Intel RealSense D400 Depth Cameras", New Technologies Group, Intel Corporation, 2018, Rev 1.0.2, Article ID 000028866

[9] "Evaluating Intel's RealSense SDK 2.0 for 3D Computer Vision Using the RealSense D415/D435 Depth Cameras", 2018, Berkeley Design Technology, Inc.

[10] "Intel RealSense Camera Depth Testing Methodology", New Technologies Group, Intel Corporation, 2018, Revision 1.0.

[11] A. Grunnet-Jepsen, J. N. Sweetser, J. Woodfill, "Best-Known-Methods for Tuning Intel RealSense D400 Depth Cameras for Best Performance", New Technologies Group, Intel Corporation, Rev 1.9, Article ID 000027833

[12] E. S. L. Gastal, M. M. Oliveira, "Domain Transform for Edge-Aware Image and Video Processing", ACM Transactions on Graphics, Volume 30 (2011), Number 4, Proceedings of SIGGRAPH 2011,
[CrossRef] [SCOPUS Times Cited 83]


[13] A. Grunnet-Jepsen, P. Winer, A. Takagi, J. Sweetser, K. Zhao, T. Khuong, D. Nie, J. Woodfill, "Using the Intel RealSense™ Depth Cameras D4xx in Multi-Camera Configurations", New Technologies Group, Intel Corporation, Rev 1.1, Article ID 000028140

[14] "Intel RealSense Depth Module D400 Series Custom Calibration", New Technologies Group, Intel Corporation, 2019, Revision 1.5.0, Article ID 000026725

[15] A. Grunnet-Jepsen, J. N. Sweetser, "Intel RealSense Depth Cameras for Mobile Phones", New Technologies Group, Intel Corporation, 2019, Article ID 000026983

[16] P. Krejov, A. Grunnet-Jepsen, "Intel RealSense Depth Camera over Ethernet", New Technologies Group, Intel Corporation, 2019

[17] J. Cunha, E. Pedrosa, C. Cruz, A. J. R. Neves, N. Lau, "Using a Depth Camera for Indoor Robot Localization and Navigation", Conference: RGB-D Advanced Reasoning with Depth Cameras Workshop, Robotics Science and Systems Conference (RSS), At LA, USA, 2011

[18] H. J. Hemmat, E. Bondarev, P. H. N. de With, "Real-time planar segmentation of depth images: from 3D edges to segmented planes", Journal of Electronic Imaging 24(5): 051008, 2015,
[CrossRef] [SCOPUS Times Cited 5]


[19] F. Flacco, T. Kroger, A. De Luca, O. Khatib, "A depth space approach to human-robot collision avoidance", 2012 IEEE International Conference on Robotics and Automation,
[CrossRef] [SCOPUS Times Cited 242]


[20] A. Saxena, S. H. Chung, A. Y. Ng, "3-D Depth Reconstruction from a Single Still Image", International Journal of Computer Vision, 2008, Volume 76, Issue 1, pp 53-69,
[CrossRef] [Web of Science Times Cited 287] [SCOPUS Times Cited 416]


[21] V. Sterzentsenko, A. Karakottas, A. Papachristou, N. Zioulis, A. Doumanoglou, D. Zarpalas, P. Daras, "A low-cost, flexible and portable volumetric capturing system", 14th International Conference on Signal-Image Technology & Internet-Based Systems, 2018,
[CrossRef] [Web of Science Times Cited 10] [SCOPUS Times Cited 12]


[22] N. Carey, R. Nagpal, J. Werfel, "Fast, accurate, small-scale 3D scene capture using a low-cost depth sensor", 2017 IEEE Winter Conference on Applications of Computer Vision (WACV),
[CrossRef] [Web of Science Times Cited 6] [SCOPUS Times Cited 5]


[23] C. Garnica, F. Boochs, M. Twardochlib, "A New Approach To Edge-Preserving Smoothing for Edge Extraction and Image Segmentation", International Archives of Photogrammetry and Remote Sensing. Vol. XXXIII, Part B3. Amsterdam 2000.

[24] S. Reich, F. Worgotter, B. Dellen, "A Real-Time Edge-Preserving Denoising Filter", in Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (2018), Vol. 4: VISAPP, pages 85-94,
[CrossRef] [SCOPUS Times Cited 4]


[25] R. Abiko, M. Ikehara, "Fast Edge Preserving 2D Smoothing Filter Using Indicator Function", ICASSP 2019, 978-1-5386-4658-8/18, IEEE,
[CrossRef] [SCOPUS Times Cited 1]


[26] J. Choi, H. Park, D. Seo, "Pansharpening Using Guided Filtering to Improve the Spatial Clarity of VHR Satellite Imagery", Remote Sensing, 2019, 11, 633;
[CrossRef] [Web of Science Times Cited 11] [SCOPUS Times Cited 11]


[27] N. Iqbal, S. Ali, I. Khan, B. M. Lee, "Adaptive Edge Preserving Weighted Mean Filter for Removing Random-Valued Impulse Noise", Symmetry 2019, 11, 395;
[CrossRef] [Web of Science Times Cited 5] [SCOPUS Times Cited 7]


[28] F. Zhu, Z. Liang, X. Jia, L. Zhang, Y. Yu, "A Benchmark for Edge-Preserving Image Smoothing",  IEEE Transactions on Image Processing 28(7): 3556-3570, 2019,
[CrossRef] [Web of Science Times Cited 10] [SCOPUS Times Cited 13]


[29] J. Sandeep, K. Samrudh, "Image contrast enhancement using fuzzy logic", arXiv: 1809.04529, 2018.

[30] D. Van De Ville, M. Nachtegael, D. Van der Weken, E. E. Kerre, W. Philips, I. Lemahieu, "Noise reduction by fuzzy image filtering", IEEE Transactions on Fuzzy Systems, 11(4), 429-436, 2003.
[CrossRef] [Web of Science Times Cited 146] [SCOPUS Times Cited 218]


[31] H. D. Cheng, H. Xu, "A novel fuzzy logic approach to contrast enhancement", Pattern Recognition Volume 33, Issue 5, May 2000, Pages 809-819,
[CrossRef] [Web of Science Times Cited 104] [SCOPUS Times Cited 147]


[32] A. S. Parihar, O. P. Verma, C. Khanna, "Fuzzy-Contextual Contrast Enhancement", IEEE Transactions on Image Processing, Volume: 26 , Issue: 4 , April 2017,
[CrossRef] [Web of Science Times Cited 44] [SCOPUS Times Cited 57]


[33] V. Tadic, M. Popovic, P. Odry, "Fuzzified Gabor filter for license plate detection", Engineering Applications of Artificial Intelligence, 2016, 48, 40-58,
[CrossRef] [Web of Science Times Cited 27] [SCOPUS Times Cited 32]


[34] A. A. M. Salih, K. Hasikin, N. A. M. Isa, "Adaptive Fuzzy Exposure Local Contrast Enhancement", IEEE Access, Volume: 6, 58794 - 58806,
[CrossRef] [Web of Science Times Cited 6] [SCOPUS Times Cited 8]


[35] A. S. Parihar, "Fuzzy adaptive gamma correction for contrast enhancement", 2017 International Conference on Intelligent Sustainable Systems (ICISS), IEEE 2018,
[CrossRef] [SCOPUS Times Cited 7]


[36] V. Tadic, A. Odry, I. Kecskes, E. Burkus, Z. Kiraly, P. Odry, "Application of Intel RealSense Cameras for Depth Image Generation in Robotics", WSEAS Transactions on Computers, E-ISSN: 2224-2872, Volume 18, 2019

[37] R. C. Gonzalez, R. E. Woods, S. L. Eddins, "Digital Image Processing Using MATLAB", pp. 150-156 and 486-534, 2nd Edition, Gatesmark, LLC, USA, 2009, ISBN-13: 978-0982085400

[38] V. Tadic, Z. Kiraly, P. Odry, Z. Trpovski, T. Loncar-Turukalo, "Comparison of Gabor Filter Bank and Fuzzified Gabor Filter for License Plate Detection", Acta Polytechnica Hungarica, 17(1): 61-81, 2020,
[CrossRef] [SCOPUS Times Cited 4]


[39] NIST/SEMATECH e-Handbook of Statistical Methods, http://www.itl.nist.gov/div898/handbook/, 2012.

[40] J. S. Hunter, "The Exponentially Weighted Moving Average", Journal of Quality Technology, 18:4, 203-210, 1986,
[CrossRef]


[41] T. Chaira, A. K. Ray, "Fuzzy Image Processing and Applications with MATLAB", pp. 47-49, CRC Press, Taylor & Francis Group, LLC, USA, 2009.

[42] P. M. Khandekar, S. S. Chiddarwar, A. Jha, "Programming of an Industrial Robot Using Demonstrations and Soft Computing Techniques", Journal of Scientific & Industrial Research, pp. 156-163, Vol. 77, 2018

[43] D. Ristic-Durrant, S. M. Grigorescu, A. Gräser, Ž. Ćojbašić, V. Nikolić, "Robust Stereo-Vision Based 3D Object Reconstruction for the Assistive Robot FRIEND", Advances in Electrical and Computer Engineering, Volume 11, Number 4, 2011,
[CrossRef] [Full Text] [Web of Science Times Cited 6] [SCOPUS Times Cited 9]


[44] X. Ning, G. Tian, Y. Wang, "Top-Down Approach to the Automatic Extraction of Individual Trees from Scanned Scene Point Cloud Data", Advances in Electrical and Computer Engineering, Volume 19, Number 3, 2019,
[CrossRef] [Full Text] [Web of Science Times Cited 3] [SCOPUS Times Cited 3]


[45] E. Asadi, B. Li, I. M. Chen, "Pictobot: A Cooperative Painting Robot for Interior Finishing of Industrial Developments with High Walls", IEEE Robotics & Automation Magazine, 2018,
[CrossRef] [Web of Science Times Cited 13] [SCOPUS Times Cited 21]


[46] I. M. Chen, E. Asadi, J. Nie, R. J. Yan, W. C. Law, E. Kayacan, S. H. Yeo, K. H. Low, G. Seet, R. Tiong, "Innovations in Infrastructure Service Robots", CISM International Centre for Mechanical Sciences 2016,
[CrossRef] [Web of Science Times Cited 5] [SCOPUS Times Cited 3]


[47] L. Somlyai, Z. Vamossy, "SLAM algorithm for mobile robot localization with RGB-D camera", Fluids, Heat and Mass Transfer, Mechanical and Civil Engineering, WSEAS, ISBN: 978-1-61804-358-0

[48] G. Kertesz, S. Szenasi, Z. Vamossy, "Multi-Directional Image Projections with Fixed Resolution for Object Matching", Acta Polytechnica Hungarica, Vol. 15, No. 2, 2018,
[CrossRef] [Web of Science Times Cited 8] [SCOPUS Times Cited 12]


[49] T. Haidegger, G. S. Virk, C. Herman, R. Bostelman, P. Galambos, Gy. Gyorok, I. J. Rudas, "Industrial and Medical Cyber-Physical Systems: Tackling User Requirements and Challenges in Robotics", Recent Advances in Intelligent Engineering, vol 14. Springer, Cham, 2020,
[CrossRef]


[50] A. R. Varkonyi-Koczy, A. Rovid, "Soft Computing Based Point Correspondence Matching for Automatic 3D Reconstruction", Acta Polytechnica Hungarica, Vol. 2, No. 1, 2005.

[51] E. Burkus, P. Odry, "Autonomous Hexapod Walker Robot "Szabad(ka)"", Acta Polytechnica Hungarica, Vol. 5, No. 1, 2008.

[52] R. Szabo, A. Gontean, "Robotic Arm Control Algorithm Based on Stereo Vision Using RoboRealm Vision", Advances in Electrical and Computer Engineering, Volume 15, Number 2, 2015,
[CrossRef] [SCOPUS Times Cited 19]


[53] I. Kecskes, E. Burkus, F. Bazso, P. Odry, "Model validation of a hexapod walker robot", Robotica 35, no. 2, 2017, pp. 419-462,
[CrossRef] [Web of Science Times Cited 13] [SCOPUS Times Cited 16]


[54] A. Koubaa, "Robot Operating System (ROS)", Springer International Publishing Switzerland 2016,
[CrossRef] [Web of Science Times Cited 4]


[55] J. Kramer, M. Scheutz, "Development environments for autonomous mobile robots: A survey", Autonomous Robots, vol. 22, no. 2, pp. 101-132, 2007.

[56] G. Bradski, A. Kaehler, "Learning OpenCV", pp. 115-124, O'Reilly Media, Inc., 1005, USA, ISBN: 978-0-596-51613-0, 2008

[57] A. Martinez, E. Fernandez, "Learning ROS for Robotics Programming", pp. 63-102, Published by Packt Publishing Ltd., UK., 2013, ISBN 978-1-78216-144-8

[58] J. Kerr, K. Nickels, "Robot operating systems: Bridging the gap between human and robot", Proceedings of the 44th Southeastern Symposium on System Theory (SSST), IBSN 978-1-4577-1493-1/12, 2012

[59] V. Tadic, E. Burkus, A. Odry, I. Kecskes, Z. Kiraly, P. Odry, "Effects of the Post-processing on Depth Value Accuracy of the Images Captured by RealSense Cameras", Contemporary Engineering Sciences, Vol. 13, 2020, no. 1, 149 - 156, HIKARI Ltd,
[CrossRef]


[60] M. Quigley, B. Gerkey, W. D. Smart, "Programming Robots with ROS", pp. 391-394, O'Reilly Media, Inc., 2015, ISBN: 978-1-4493-2389-9

[61] V. Tadic, A. Odry, A. Toth, Z. Vizvari, P. Odry, "Fuzzified Circular Gabor Filter for Circular and Near-Circular Object Detection", IEEE Access,
[CrossRef] [Web of Science Times Cited 4] [SCOPUS Times Cited 3]




References Weight

Web of Science® Citations for all references: 808 TCR
SCOPUS® Citations for all references: 1,508 TCR

Web of Science® Average Citations per reference: 13 ACR
SCOPUS® Average Citations per reference: 24 ACR

TCR = Total Citations for References / ACR = Average Citations per Reference
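The ACR figures above are consistent with dividing each TCR by the 61 references listed for this paper and truncating the result. A minimal sketch, assuming truncation rather than rounding and that all listed references are counted:

```python
# Reproduce the ACR values from the TCR totals shown on this page.
wos_tcr, scopus_tcr, num_refs = 808, 1508, 61

wos_acr = wos_tcr // num_refs     # 808 / 61 = 13.2..., truncated to 13
scopus_acr = scopus_tcr // num_refs  # 1508 / 61 = 24.7..., truncated to 24
print(wos_acr, scopus_acr)  # 13 24
```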

In 2010 we introduced, for the first time in scientific publishing, the term "References Weight" as a quantitative indication of the quality ... Read more

Citations for references updated on 2021-06-19 14:58 in 225 seconds.




Note1: Web of Science® is a registered trademark of Clarivate Analytics.
Note2: SCOPUS® is a registered trademark of Elsevier B.V.
Disclaimer: All queries to the respective databases were made by using the DOI record of every reference (where available). Due to technical problems beyond our control, the information is not always accurate. Please use the CrossRef link to visit the respective publisher site.

Copyright ©2001-2021
Faculty of Electrical Engineering and Computer Science
Stefan cel Mare University of Suceava, Romania


All rights reserved: Advances in Electrical and Computer Engineering is a registered trademark of the Stefan cel Mare University of Suceava. No part of this publication may be reproduced, stored in a retrieval system, photocopied, recorded or archived, without the written permission from the Editor. When authors submit their papers for publication, they agree that the copyright for their article be transferred to the Faculty of Electrical Engineering and Computer Science, Stefan cel Mare University of Suceava, Romania, if and only if the articles are accepted for publication. The copyright covers the exclusive rights to reproduce and distribute the article, including reprints and translations.

Permission for other use: The copyright owner's consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific written permission must be obtained from the Editor for such copying. Direct linking to files hosted on this website is strictly prohibited.

Disclaimer: Whilst every effort is made by the publishers and editorial board to see that no inaccurate or misleading data, opinions or statements appear in this journal, they wish to make it clear that all information and opinions formulated in the articles, as well as linguistic accuracy, are the sole responsibility of the author.



