J Shanghai Jiaotong Univ Sci ›› 2021, Vol. 26 ›› Issue (4): 503-510. DOI: 10.1007/s12204-020-2239-3
Objective Evaluation of Fabric Flatness Grade Based on Convolutional Neural Network
ZHAN Zhu (占竹), ZHANG Wenjun (张文俊), CHEN Xia (陈霞), WANG Jun* (汪军)
Online: 2021-08-28
Published: 2021-06-06
Contact: WANG Jun* (汪军), E-mail: junwang@dhu.edu.cn
Abstract: Fabric flatness, an important indicator of the appearance and intrinsic quality of textiles, directly affects their aesthetics and performance. In this paper, an objective evaluation system for fabric flatness based on a 3D scanner and a convolutional neural network (CNN) is constructed using the height data of the AATCC flatness templates. The 3D scanner collects the height values of each sample. The effects of different sub-sample cutting sizes, cutting offsets, and network model depths on the coincidence rate of the objective evaluation across multiple flatness levels were studied. The experimental results show that the coincidence rate of the system reaches 98.9% when the collected sample data are cut into 20 pixel × 20 pixel sub-samples with a 12-pixel cutting offset and an 11-layer network model is selected. Finally, the scheme is used to evaluate the flatness of four real fabrics with different colors and textures. The results show that all of the samples achieve a high coincidence rate, which further verifies the adaptability and stability of the objective evaluation system constructed in this paper for fabric flatness evaluation.
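A minimal sketch of the pipeline described above, assuming a Python/TensorFlow implementation: the height map from the 3D scanner is cut into overlapping 20 pixel × 20 pixel sub-samples with a 12-pixel offset and passed to a compact CNN classifier. The layer layout, the number of flatness grades (5), and the helper names (cut_subsamples, build_cnn) are illustrative assumptions; the abstract does not specify the 11-layer architecture in detail, and this is not the authors' code.

    # Illustrative sketch (not the authors' implementation): cut a scanned height
    # map into overlapping sub-samples and classify them with a small CNN.
    # Window size (20 x 20) and offset (12) follow the abstract; the layer layout
    # and the number of flatness grades (5) are hypothetical stand-ins.
    import numpy as np
    import tensorflow as tf

    def cut_subsamples(height_map: np.ndarray, size: int = 20, offset: int = 12) -> np.ndarray:
        """Slide a size x size window over the height map with the given offset."""
        rows, cols = height_map.shape
        patches = [
            height_map[r:r + size, c:c + size]
            for r in range(0, rows - size + 1, offset)
            for c in range(0, cols - size + 1, offset)
        ]
        return np.stack(patches)[..., np.newaxis]  # shape: (n, 20, 20, 1)

    def build_cnn(num_grades: int = 5) -> tf.keras.Model:
        """A compact CNN over 20 x 20 height patches; the architecture is illustrative."""
        return tf.keras.Sequential([
            tf.keras.layers.Input(shape=(20, 20, 1)),
            tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
            tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(num_grades, activation="softmax"),
        ])

    if __name__ == "__main__":
        scan = np.random.rand(200, 200)       # placeholder for 3D-scanner height data
        subs = cut_subsamples(scan)           # overlapping 20 x 20 patches
        model = build_cnn()
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
        grades = model.predict(subs, verbose=0).argmax(axis=1)
        print(subs.shape, grades[:10])

In such a scheme, the per-patch predictions would typically be aggregated (for example, by majority vote over all sub-samples of one specimen) to assign a single flatness grade to the fabric.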
ZHAN Zhu (占竹), ZHANG Wenjun (张文俊), CHEN Xia (陈霞), WANG Jun (汪军). Objective Evaluation of Fabric Flatness Grade Based on Convolutional Neural Network [J]. J Shanghai Jiaotong Univ Sci, 2021, 26(4): 503-510.