Journal of Shanghai Jiaotong University (Science) ›› 2015, Vol. 20 ›› Issue (2): 143-148. DOI: 10.1007/s12204-015-1602-2
SU Bai-hua1 (苏柏桦), WANG Ying-lin2* (王英林)
Online: 2015-04-30
Published: 2015-04-02
Contact: WANG Ying-lin (王英林)
E-mail: yinglin.wang@outlook.com
CLC Number:
SU Bai-hua (苏柏桦), WANG Ying-lin (王英林). Genetic Algorithm Based Feature Selection and Parameter Optimization for Support Vector Regression Applied to Semantic Textual Similarity [J]. Journal of Shanghai Jiaotong University (Science), 2015, 20(2): 143-148.
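The title describes a genetic-algorithm wrapper that jointly selects input features and tunes the hyperparameters of a support vector regression model. The sketch below is a minimal illustration of that general pattern, not the authors' implementation: a chromosome encodes a binary feature mask plus log-scaled (C, epsilon, gamma) values, and fitness is estimated by cross-validation. The use of scikit-learn, R^2 scoring, and all population settings are assumptions made here for illustration; the paper's semantic textual similarity task is evaluated with Pearson correlation.

```python
# Illustrative sketch only: GA-based feature selection plus SVR hyperparameter
# search. Chromosome = binary feature mask + log10 of (C, epsilon, gamma).
# Fitness = mean 5-fold cross-validated R^2 (a placeholder scoring choice).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def random_chromosome(n_features):
    mask = rng.integers(0, 2, size=n_features)        # which features to keep
    params = rng.uniform([-2, -3, -3], [3, 0, 1])     # log10 of C, epsilon, gamma
    return mask, params

def fitness(chrom, X, y):
    mask, params = chrom
    if mask.sum() == 0:                               # empty subsets score worst
        return -np.inf
    C, eps, gamma = 10.0 ** params
    model = SVR(kernel="rbf", C=C, epsilon=eps, gamma=gamma)
    return cross_val_score(model, X[:, mask.astype(bool)], y, cv=5).mean()

def crossover(a, b):
    cut = rng.integers(1, len(a[0]))                  # one-point crossover on the mask
    mask = np.concatenate([a[0][:cut], b[0][cut:]])
    params = (a[1] + b[1]) / 2.0                      # blend the real-valued genes
    return mask, params

def mutate(chrom, p=0.05):
    mask, params = chrom
    flip = rng.random(mask.shape) < p                 # flip mask bits with prob. p
    return np.where(flip, 1 - mask, mask), params + rng.normal(0, 0.1, 3)

def ga_search(X, y, pop_size=20, generations=10):
    # X, y are NumPy arrays; all GA settings here are arbitrary illustrative choices.
    pop = [random_chromosome(X.shape[1]) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda c: fitness(c, X, y), reverse=True)
        parents = scored[: pop_size // 2]             # truncation selection
        children = [mutate(crossover(parents[i], parents[(i + 1) % len(parents)]))
                    for i in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=lambda c: fitness(c, X, y))
```

A call such as ga_search(X, y) returns the best feature mask and parameter vector found; in practice the fitness evaluations would be cached and the selection, crossover, and mutation operators tuned to the task.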
Viewed: Abstract 619 | Full text 225