
Selective Ensemble Method of Extreme Learning Machine Based on Double-fault Measure


Pingfan XIA,
Zhiwei NI,
Xuhui ZHU,
Liping NI
1. School of Management, Hefei University of Technology, Hefei 230009, China
2. Key Laboratory of Process Optimization and Intelligent Decision-making, Ministry of Education, Hefei 230009, China
Funds: The National Natural Science Foundation of China (91546108, 71521001), the Anhui Provincial Natural Science Foundation (1908085QG298, 1908085MG232), the Open Research Fund Program of the Key Laboratory of Process Optimization and Intelligent Decision-making, Ministry of Education, and the Fundamental Research Funds for the Central Universities (JZ2019HGTA0053, JZ2019HGBZ0128)

Details
About the authors:
Pingfan XIA: Female, born in 1994, Ph.D. candidate. Her research interests include machine learning, artificial intelligence, and ensemble learning.
Zhiwei NI: Male, born in 1963, professor and Ph.D. supervisor. His research interests include artificial intelligence, machine learning, and cloud computing.
Xuhui ZHU: Male, born in 1991, lecturer and master's supervisor. His research interests include evolutionary computation and machine learning.
Liping NI: Female, born in 1981, associate professor and master's supervisor. Her research interests include fractal data mining, artificial intelligence, and machine learning.
Corresponding author: Xuhui ZHU, zhuxuhui@hfut.edu.cn
CLC number: TP391

Publication history
Received: 2019-08-12
Revised: 2020-06-21
Available online: 2020-07-17
Published: 2020-11-16



Abstract: Extreme Learning Machine (ELM) offers fast learning speed, simplicity of implementation, and strong generalization performance, but the classification performance of a single ELM is unstable. Ensemble learning can effectively improve the classification ability of single ELMs, yet memory usage and computational overhead grow rapidly as the data size and the number of base ELMs increase. To address this issue, a Selective Ensemble approach of ELM based on the Double-Fault measure (DFSEE) is proposed and analyzed both theoretically and experimentally. Firstly, multiple training subsets are drawn from the training set by bootstrap sampling, and an initial pool of base ELMs is built by independently training one ELM on each subset. Secondly, the double-fault measure of each base ELM is computed, and the ELMs in the pool are sorted by this measure in ascending order. Finally, starting from a single ELM, the ensemble is grown by adding base ELMs one at a time in that order under majority voting, and the sub-ensemble with the highest accuracy is retained as the final ensemble; the theoretical basis of DFSEE is also analyzed. Experimental results on 10 UCI benchmark classification tasks show that DFSEE achieves higher accuracy with fewer base ELMs than other approaches, demonstrating its validity and significance.
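
To make the first stage concrete, the following sketch (a minimal illustration, not the authors' implementation; the names SimpleELM and build_pool are hypothetical) trains a pool of base ELMs on independent bootstrap samples. Each ELM draws random input weights and biases and solves only for its output weights with a pseudo-inverse, which is what makes ELM training fast; class labels are assumed to be integers 0..K-1.

import numpy as np

class SimpleELM:
    # Minimal single-hidden-layer ELM classifier (illustrative).
    def __init__(self, n_classes, n_hidden=50, rng=None):
        self.n_classes = n_classes
        self.n_hidden = n_hidden
        self.rng = rng if rng is not None else np.random.default_rng()

    def fit(self, X, y):
        # Random input weights and biases stay fixed; only the output
        # weights are learned, via the Moore-Penrose pseudo-inverse.
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)      # hidden-layer outputs
        T = np.eye(self.n_classes)[y]         # one-hot targets
        self.beta = np.linalg.pinv(H) @ T     # output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)

def build_pool(X, y, n_elms=20, seed=0):
    # Train each base ELM on an independent bootstrap sample of (X, y),
    # so pool members differ both in training data and in random weights.
    rng = np.random.default_rng(seed)
    n_classes = int(y.max()) + 1
    pool = []
    for _ in range(n_elms):
        idx = rng.integers(0, len(X), size=len(X))  # sample with replacement
        pool.append(SimpleELM(n_classes, rng=rng).fit(X[idx], y[idx]))
    return pool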
Key words: Selective ensemble; Double-fault measure; Extreme Learning Machine (ELM)
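
Continuing the sketch, the selection stage below scores each base ELM by its average pairwise double-fault with the other pool members on a held-out validation set (one plausible reading of the per-ELM double-fault measure; the abstract does not pin down which data the measure is computed on), sorts the pool in ascending order, and grows the ensemble one ELM at a time under majority voting, keeping the prefix with the best accuracy. The function name dfsee_select is hypothetical, and the pool argument is the output of build_pool above.

import numpy as np

def dfsee_select(pool, X_val, y_val):
    # Ordered aggregation: double-fault ranking plus majority voting.
    preds = np.array([m.predict(X_val) for m in pool])  # (n_elms, n_samples)
    n = len(pool)
    wrong = preds != y_val                              # per-ELM error indicators

    # Pairwise double-fault: fraction of samples both ELMs misclassify.
    df = (wrong[:, None, :] & wrong[None, :, :]).mean(axis=2)
    np.fill_diagonal(df, 0.0)
    scores = df.sum(axis=1) / (n - 1)   # average over the other pool members
    order = np.argsort(scores)          # ascending: least coincident errors first

    best_acc, best_k = -1.0, 1
    for k in range(1, n + 1):
        sub = preds[order[:k]]
        # Majority vote per sample; ties break toward the smaller label.
        vote = np.array([np.bincount(sub[:, s]).argmax()
                         for s in range(sub.shape[1])])
        acc = (vote == y_val).mean()
        if acc > best_acc:              # keep the best-performing prefix
            best_acc, best_k = acc, k
    return [pool[i] for i in order[:best_k]], best_acc

A typical call is selected, acc = dfsee_select(build_pool(X_train, y_train), X_val, y_val). Because only a prefix of the sorted pool is kept, the final ensemble is usually much smaller than the initial pool, which is the source of the memory and computation savings claimed above.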



Full-text PDF download:

https://jeit.ac.cn/article/exportPdf?id=92f96ada-d6a5-4aab-9993-36c2f0259f20