Abstract: Existing BERT-based aspect-based sentiment analysis models use only the output of BERT's last hidden layer and ignore the semantic knowledge in the intermediate layers, leading to insufficient information utilization. To address this, this paper proposes an aspect-level sentiment analysis model that incorporates BERT's intermediate hidden layers. First, the review and the aspect information are concatenated into a sentence pair and fed into the BERT model, where the connection between the review and the aspect is established through BERT's self-attention mechanism. Second, a gated convolutional network is constructed to extract features from the word-vector matrices output by all of BERT's hidden layers; the extracted features are passed through a max-pooling layer and concatenated into a feature sequence. The feature sequence is then fused with a BiGRU network to encode the information of BERT's different hidden layers. Finally, an attention mechanism is introduced to assign weights to the features according to their relevance to the aspect information. Experimental results on the public SemEval-2014 Task 4 review datasets show that the proposed model outperforms comparison models such as BERT, CapsBERT, BERT-PT, and BERT-LSTM in terms of accuracy and F1 score, demonstrating good sentiment classification performance.
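The pipeline described above (gated convolution over each hidden layer, max pooling, BiGRU fusion across layers, aspect-guided attention) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all module names, layer sizes, and the use of random tensors in place of real BERT hidden states and a real aspect embedding are assumptions for demonstration.

```python
import torch
import torch.nn as nn

class GatedConvPool(nn.Module):
    """Gated convolution + max pooling over one BERT hidden layer.
    Hypothetical sketch of the gated convolutional feature extractor."""
    def __init__(self, hidden=768, channels=256, kernel=3):
        super().__init__()
        self.conv = nn.Conv1d(hidden, channels, kernel, padding=kernel // 2)
        self.gate = nn.Conv1d(hidden, channels, kernel, padding=kernel // 2)

    def forward(self, h):                      # h: (batch, seq_len, hidden)
        x = h.transpose(1, 2)                  # (batch, hidden, seq_len)
        # GLU-style gating: candidate features modulated by a sigmoid gate
        feat = torch.tanh(self.conv(x)) * torch.sigmoid(self.gate(x))
        return feat.max(dim=2).values          # max pooling -> (batch, channels)

class LayerFusionClassifier(nn.Module):
    """Fuse per-layer features with a BiGRU, then weight the layer encodings
    by aspect-aware attention before classifying. Sizes are illustrative."""
    def __init__(self, num_layers=12, hidden=768, channels=256, classes=3):
        super().__init__()
        self.extractors = nn.ModuleList(
            GatedConvPool(hidden, channels) for _ in range(num_layers))
        self.bigru = nn.GRU(channels, channels, batch_first=True,
                            bidirectional=True)
        self.aspect_proj = nn.Linear(hidden, 2 * channels)
        self.out = nn.Linear(2 * channels, classes)

    def forward(self, hidden_states, aspect_vec):
        # hidden_states: list of num_layers tensors, each (batch, seq_len, hidden)
        feats = torch.stack([ext(h) for ext, h in
                             zip(self.extractors, hidden_states)], dim=1)
        enc, _ = self.bigru(feats)             # (batch, num_layers, 2*channels)
        # Attention weights from relevance to the aspect representation
        q = self.aspect_proj(aspect_vec).unsqueeze(2)   # (batch, 2*channels, 1)
        attn = torch.softmax(enc @ q, dim=1)   # (batch, num_layers, 1)
        fused = (attn * enc).sum(dim=1)        # (batch, 2*channels)
        return self.out(fused)                 # sentiment logits

# Toy forward pass: random tensors stand in for BERT hidden states and
# an aspect embedding (e.g. pooled aspect-token vectors).
batch, seq_len, layers = 2, 16, 12
states = [torch.randn(batch, seq_len, 768) for _ in range(layers)]
aspect = torch.randn(batch, 768)
logits = LayerFusionClassifier()(states, aspect)
print(tuple(logits.shape))
```

In a real setting, `hidden_states` would come from a BERT model run with all hidden layers exposed (e.g. `output_hidden_states=True` in Hugging Face Transformers), with the review and aspect joined as a sentence pair.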