Advances in Electrical and Computer Engineering (AECE) - Front Page

FACTS & FIGURES

JCR Impact Factor: 0.700
JCR 5-Year IF: 0.700
SCOPUS CiteScore: 1.8
Issues per year: 4
Current issue: Aug 2024
Next issue: Nov 2024
Avg review time: 54 days
Avg accept to publ: 60 days
APC: 300 EUR


PUBLISHER

Stefan cel Mare University of Suceava
Faculty of Electrical Engineering and Computer Science
13, Universitatii Street
Suceava - 720229
ROMANIA

Print ISSN: 1582-7445
Online ISSN: 1844-7600
WorldCat: 643243560
DOI: 10.4316/AECE


TRAFFIC STATS

2,826,371 unique visits
1,119,525 downloads
Since November 1, 2009









MOST RECENT ISSUES

Volume 24 (2024)
    » Issue 3 / 2024
    » Issue 2 / 2024
    » Issue 1 / 2024

Volume 23 (2023)
    » Issue 4 / 2023
    » Issue 3 / 2023
    » Issue 2 / 2023
    » Issue 1 / 2023

Volume 22 (2022)
    » Issue 4 / 2022
    » Issue 3 / 2022
    » Issue 2 / 2022
    » Issue 1 / 2022

Volume 21 (2021)
    » Issue 4 / 2021
    » Issue 3 / 2021
    » Issue 2 / 2021
    » Issue 1 / 2021

View all issues


FEATURED ARTICLE

Application of the Voltage Control Technique and MPPT of Stand-alone PV System with Storage, HIVZIEFENDIC, J., VUIC, L., LALE, S., SARIC, M.
Issue 1/2022







LATEST NEWS

2024-Jun-20
Clarivate Analytics published the InCites Journal Citation Reports for 2023. The InCites JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.700 (0.700 without Journal self-cites), and the InCites JCR 5-Year Impact Factor is 0.600.

2023-Jun-28
Clarivate Analytics published the InCites Journal Citation Reports for 2022. The InCites JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.800 (0.700 without Journal self-cites), and the InCites JCR 5-Year Impact Factor is 1.000.

2023-Jun-05
SCOPUS published the CiteScore for 2022, computed using an improved methodology: the citations received in 2019-2022 are counted and the sum is divided by the number of papers published in the same time frame (a short worked sketch of this calculation appears after the news list below). The CiteScore of Advances in Electrical and Computer Engineering for 2022 is 2.0. For "General Computer Science" we rank #134/233 and for "Electrical and Electronic Engineering" we rank #478/738.

2022-Jun-28
Clarivate Analytics published the InCites Journal Citation Reports for 2021. The InCites JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.825 (0.722 without Journal self-cites), and the InCites JCR 5-Year Impact Factor is 0.752.

2022-Jun-16
SCOPUS published the CiteScore for 2021, computed using an improved methodology, counting the citations received in 2018-2021 and dividing the sum by the number of papers published in the same time frame. The CiteScore of Advances in Electrical and Computer Engineering for 2021 is 2.5, the same as for 2020 and better than all our previous results.

Read More »
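As mentioned in the 2022 CiteScore item above, the metric reduces to a simple ratio over a four-year window. A minimal sketch of that calculation, using hypothetical placeholder counts rather than actual AECE figures:

    # Sketch of the SCOPUS CiteScore calculation described in the news above.
    # CiteScore 2022 = citations received in 2019-2022 to documents published
    # in 2019-2022, divided by the number of documents published in that window.
    # The counts below are hypothetical placeholders, not actual AECE data.
    citations_2019_2022 = 820   # citations received in the four-year window
    documents_2019_2022 = 410   # documents published in the same window

    citescore = citations_2019_2022 / documents_2019_2022
    print(f"CiteScore 2022: {citescore:.1f}")   # -> 2.0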


    
 


 HIGHLY CITED PAPER 

Training Neural Networks Using Input Data Characteristics

CERNAZANU, C.

View the paper record and citations in Google Scholar
Click to see the author's profile in SCOPUS, IEEE Xplore, and Web of Science

Download PDF (710 KB) | Citation | Downloads: 1,287 | Views: 5,012

Author keywords
neural networks, data mining, correlation-based feature subset selection method, data features extraction, training algorithm

References keywords
neural(8), networks(7), data(7), selection(6), learning(6), mining(5), machine(5), ijcnn(4), feature(4)
Blue keywords are present in both the references section and the paper title.
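The reference-keyword counts above are presumably simple word-frequency tallies over the titles in the References section below. A minimal sketch of such a tally follows; the tokenisation and stop-word list are illustrative assumptions, not the journal's documented method:

    # Sketch: tally keyword frequencies across reference titles.
    # The tokenisation and stop-word list are assumptions for illustration only.
    import re
    from collections import Counter

    reference_titles = [
        "Artificial Intelligence: A Guide to Intelligent Systems",
        "Training neural networks: backpropagation vs. genetic algorithms",
        "Feature selection in data mining",
        # ... remaining reference titles ...
    ]

    stop_words = {"a", "an", "the", "in", "on", "of", "to", "for", "and", "vs"}
    words = [w for title in reference_titles
             for w in re.findall(r"[a-z]+", title.lower())
             if w not in stop_words]

    for word, count in Counter(words).most_common(9):
        print(f"{word}({count})")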

About this article
Date of Publication: 2008-06-02
Volume 8, Issue 2, Year 2008, On page(s): 65 - 70
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2008.02012
Web of Science Accession Number: 000264815000012
SCOPUS ID: 77955635511

Abstract
Feature selection is often an essential data processing step prior to applying a learning algorithm. The aim of this paper is to discover whether removing irrelevant and redundant information improves the results of neural network training. The study describes a new method of training neural networks, namely training neural networks using input data features. For selecting the features, we used a filtering technique (borrowed from data mining) that selects the best features from a training set. The technique is made up of two components: a feature evaluation technique and a search algorithm for selecting the best features. When this filtering is applied as a data preprocessing step for a common neural network training algorithm, the results obtained from the network compare favorably with those of a classical neural network training algorithm, while requiring less computation.
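The abstract describes a filter-style preprocessing step (correlation-based feature subset selection, CFS) followed by ordinary neural network training. The sketch below illustrates that workflow only in outline: scikit-learn has no CFS implementation, so a univariate ANOVA filter (SelectKBest) stands in for it, and the dataset, feature counts, and network size are arbitrary placeholders rather than the paper's setup.

    # Sketch of the workflow from the abstract: filter-based feature selection
    # as a preprocessing step, then neural network training on the kept features.
    # NOTE: the paper uses correlation-based feature subset selection (CFS);
    # SelectKBest with an ANOVA F-score is used here only as a stand-in filter.
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import Pipeline

    X, y = make_classification(n_samples=1000, n_features=40,
                               n_informative=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = Pipeline([
        ("select", SelectKBest(score_func=f_classif, k=10)),      # filter step
        ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                              random_state=0)),                   # NN training
    ])
    model.fit(X_train, y_train)
    print("accuracy on selected features:", model.score(X_test, y_test))

Training on the reduced feature set is what the paper reports as the cheaper of the two approaches.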


References | Cited By

[1] Negnevitsky, M., "Artificial Intelligence: A Guide to Intelligent Systems", (2nd Edition), Addison Wesley, England, 2005.

[2] Luger, G., "Artificial Intelligence: Structures and Strategies for Complex Problem Solving", (Fifth Edition), Addison Wesley, 2005.

[3] Stergiou, C., Siganos, D., "Neural networks", [Online] Available: Temporary on-line reference link removed - see the PDF document, 1996

[4] Babii, S., Cretu, V., Petriu, E. M., "Performance Evaluation of Two Distributed BackPropagation Implementations", Neural Networks 2007, IJCNN 2007, pp. 1578-1583

[5] Zhongwen, L., Hongzhi, L., Xincai, W., "Artificial neural network computation on graphic process unit", Neural Networks, 2005, IJCNN 2005, pp. 622-626.

[6] Siddique, M. N. H., Tokhi, M.O., "Training neural networks: backpropagation vs. genetic algorithms", Neural Networks, 2001, IJCNN, 2001, pp. 2673-2678

[7] Nguyen, D., Widrow, B., "Improving the learning speed of 2-layer neural networks by choosing initial values of adaptive weights", Neural Networks 1990, IJCNN, 1990, pp. 21-26, Volume 3

[8] Gorea, D., "Dynamically Integrating Knowledge in Applications. An Online Scoring Engine Architecture", Advances in Electrical and Computer Engineering, Suceava, Romania, Volume 8, 2008, pp.44-49
[CrossRef] [Full Text] [Web of Science Times Cited 2] [SCOPUS Times Cited 4]


[9] Langley, P., "Selection of relevant features in machine learning", Proceedings of the AAAI Fall Symposium on Relevance, AAAI Press, 1994

[10] Jain, A., Zongker, D., "Feature selection: evaluation, application and small sample performance", IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 19, 1997, pp. 153-158

[11] Pudil, P., Novovicova, J., Kittler, J., "Floating search methods in feature selection", Pattern Recognition Letters, Volume 15, November 1994, pp. 1119-1125.
[CrossRef] [Web of Science Times Cited 2166] [SCOPUS Times Cited 2566]


[12] Kim, Y., Street, W.N., Menczer, F., Roussell, G.J., "Feature selection in data mining", J. Wang Editor, Data Mining: Opportunities and Challenges, Idea Group Publishing, 2003, pp. 80-105.

[13] Gigli, G., Bosse, I., Lampropoulos, G.A., "An optimized architecture for classification combining data fusion and data mining", Information Fusion, Volume 8, 2007, pp. 366-378
[CrossRef] [Web of Science Times Cited 8] [SCOPUS Times Cited 13]


[14] Hall, M., "Correlation-based Feature Selection for Machine Learning", Ph.D. dissertation, Hamilton, NZ: Waikato University, Department of Computer Science, 1998

[15] Boyan, J., Moore, A., "Learning evaluation functions to improve optimization by local search", Journal of Machine Learning Research, Volume 1, pp. 77-112, 2000

[16] Weka3, "Data mining Software in Java", The University of Waikato, [Online] Available: Temporary on-line reference link removed - see the PDF document, 2008

[17] Witten, I. H., Frank, E., "Data Mining: Practical Machine Learning Tools and Techniques", (Second Edition), Morgan Kaufmann, 2005.

[18] NIST Handprinted Forms and Characters Database, [Online] Available: Temporary on-line reference link removed - see the PDF document

[19] http://weka.sourceforge.net/wiki/index.php, Performing attribute selection, 2008

[20] Image Segmentation Data, Vision Group, University of Massachusetts, November, 1990.

References Weight

Web of Science® Citations for all references: 2,176 TCR
SCOPUS® Citations for all references: 2,583 TCR

Web of Science® Average Citations per reference: 109 ACR
SCOPUS® Average Citations per reference: 129 ACR

TCR = Total Citations for References / ACR = Average Citations per Reference
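As a quick check of the figures above, each ACR is simply the corresponding TCR divided by this paper's 20 references, rounded to the nearest integer:

    # Sketch of the References Weight arithmetic reported above.
    # TCR = total citations accumulated by all references of the paper;
    # ACR = TCR / number of references (rounding to an integer is assumed).
    num_references = 20

    wos_tcr, scopus_tcr = 2176, 2583
    print(round(wos_tcr / num_references))     # -> 109 (Web of Science ACR)
    print(round(scopus_tcr / num_references))  # -> 129 (SCOPUS ACR)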

We introduced in 2010, for the first time in scientific publishing, the term "References Weight" as a quantitative indication of the quality ... Read more

Citations for references updated on 2024-10-13 08:49 in 23 seconds.




Note1: Web of Science® is a registered trademark of Clarivate Analytics.
Note2: SCOPUS® is a registered trademark of Elsevier B.V.
Disclaimer: All queries to the respective databases were made by using the DOI record of every reference (where available). Due to technical problems beyond our control, the information is not always accurate. Please use the CrossRef link to visit the respective publisher site.
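Such a DOI-based lookup can be reproduced against the public CrossRef REST API. A minimal sketch follows; note that CrossRef's "is-referenced-by-count" is CrossRef's own tally and will not match the Web of Science or SCOPUS counts shown above:

    # Sketch: resolve a DOI via the public CrossRef REST API.
    # CrossRef's citation tally differs from Web of Science and SCOPUS counts;
    # it is printed here only for illustration.
    import json
    import urllib.request

    doi = "10.4316/AECE.2008.02012"   # DOI of the paper shown on this page
    with urllib.request.urlopen(f"https://api.crossref.org/works/{doi}") as resp:
        work = json.load(resp)["message"]

    print(work["title"][0])
    print("CrossRef citation count:", work.get("is-referenced-by-count"))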

Copyright ©2001-2024
Faculty of Electrical Engineering and Computer Science
Stefan cel Mare University of Suceava, Romania


All rights reserved: Advances in Electrical and Computer Engineering is a registered trademark of the Stefan cel Mare University of Suceava. No part of this publication may be reproduced, stored in a retrieval system, photocopied, recorded or archived, without the written permission from the Editor. When authors submit their papers for publication, they agree that the copyright for their article be transferred to the Faculty of Electrical Engineering and Computer Science, Stefan cel Mare University of Suceava, Romania, if and only if the articles are accepted for publication. The copyright covers the exclusive rights to reproduce and distribute the article, including reprints and translations.

Permission for other use: The copyright owner's consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific written permission must be obtained from the Editor for such copying. Direct linking to files hosted on this website is strictly prohibited.

Disclaimer: Whilst every effort is made by the publishers and editorial board to see that no inaccurate or misleading data, opinions or statements appear in this journal, they wish to make it clear that all information and opinions formulated in the articles, as well as linguistic accuracy, are the sole responsibility of the author.



