To address the problem that weights cannot be updated because the ReLU function outputs zero in the negative region, a new activation function, S-ReLU, is proposed. S-ReLU exhibits soft saturation in the negative region and increases the attention paid to negative sample data. By assigning a small derivative to outputs in the negative region, it allows negative inputs to participate in back-propagation and improves the robustness of the model. Comparative experiments with other common activation functions, using the LeNet-5 model on the MNIST and CIFAR-10 datasets, are conducted to evaluate image classification with the S-ReLU activation function. The experimental results show that on both MNIST and CIFAR-10, S-ReLU improves the classification accuracy of the model compared with the other activation functions.
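The exact definition of S-ReLU is not given in this abstract; as a rough illustration of the behavior described above (an identity-like positive branch and a soft-saturating negative branch with a small nonzero derivative), a minimal sketch follows. The softsign-style negative branch used here is an assumption for illustration only, not the paper's formula.

```python
import numpy as np

def soft_saturating_relu(x):
    """Illustrative activation: identity for x >= 0, soft saturation for x < 0.
    The negative branch (softsign-like) is an assumed stand-in, not S-ReLU itself."""
    return np.where(x >= 0, x, x / (1.0 + np.abs(x)))

def soft_saturating_relu_grad(x):
    """Derivative: 1 for x >= 0, a small positive value for x < 0,
    so negative inputs still receive gradient during back-propagation."""
    return np.where(x >= 0, 1.0, 1.0 / (1.0 + np.abs(x)) ** 2)

x = np.array([-3.0, -1.0, -0.1, 0.0, 0.5, 2.0])
print(soft_saturating_relu(x))       # negative inputs map to small bounded negative values
print(soft_saturating_relu_grad(x))  # negative inputs keep a small nonzero gradient
```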