%0 Journal Article
%T Chroma Intra Prediction with Lightweight Attention-Based Neural Networks
%A Chengyi Zou
%A Shuai Wan
%A Tiannan Ji
%A Marc Gorriz Blanch
%A Marta Mrak
%A Luis Herranz
%J IEEE Transactions on Circuits and Systems for Video Technology
%D 2023
%V 34
%N 1
%F Chengyi Zou2023
%O MACO; LAMP
%O exported from refbase (http://158.109.8.37/show.php?record=3875), last updated on Tue, 06 Feb 2024 15:17:39 +0100
%X Neural networks can be successfully used for cross-component prediction in video coding. In particular, attention-based architectures are suitable for chroma intra prediction using luma information because of their capability to model relations between different channels. However, the complexity of such methods is still very high and should be further reduced, especially for decoding. In this paper, a cost-effective attention-based neural network is designed for chroma intra prediction. Moreover, with the goal of further improving coding performance, a novel approach is introduced to utilize more boundary information effectively. In addition to improving prediction, a simplification methodology is also proposed to reduce inference complexity by simplifying convolutions. The proposed schemes are integrated into the H.266/Versatile Video Coding (VVC) pipeline, and only one additional binary block-level syntax flag is introduced to indicate whether a given block makes use of the proposed method. Experimental results demonstrate that the proposed scheme achieves up to −0.46%/−2.29%/−2.17% BD-rate reduction on the Y/Cb/Cr components, respectively, compared with the H.266/VVC anchor. Reductions in encoding and decoding complexity of up to 22% and 61%, respectively, are achieved by the proposed scheme with respect to the previous attention-based chroma intra prediction method while maintaining coding performance.
%U https://ieeexplore.ieee.org/document/10144394
%U http://dx.doi.org/10.1109/TCSVT.2023.3282980
%P 549-560