Image Data Augmentation of Cervical Cells Based on Generative Adversarial Networks
DOI:
Author:
Affiliation:

Author bio:

Corresponding author:

CLC number: TP391.41

Fund project:

Abstract:

To better train convolutional neural networks when the available dataset is small, this paper proposes augmenting the image data with new samples produced by a trained GAN (generative adversarial network). For the binary classification problem on the Herlev cervical cell dataset, the original training set was first used to train a GAN, which generated a large number of high-quality, high-resolution cell images and expanded each class of the training set to 24,000 examples. The expanded training set was then used to train a classification network. With ResNet transfer learning, the validation accuracy reached 97%, higher than the 93% obtained with a training set expanded by affine transformations, showing that the proposed method effectively augments image data. The method can also be applied to image data augmentation in other fields.

Cite this article:

LIN Zhi-peng, ZENG Li-bo, WU Qiong-shui. Image Data Augmentation of Cervical Cells Based on Generative Adversarial Networks[J]. Science Technology and Engineering, 2020, 20(28): 11672-11677.

History
  • Received: 2019-12-09
  • Revised: 2020-07-01
  • Accepted: 2020-04-13
  • Published online: 2020-11-03
  • Publication date: