Quantum NLP is still in its infancy. There has been some early research by Bob Coecke and the team at Cambridge Quantum, for example:
arxiv.org/abs/1608.01406
arxiv.org/abs/2005.04147
arxiv.org/abs/2102.12846
arxiv.org/abs/2110.04236
Most of their effort has focused on the DisCoCat model for grammatical composition, sketched below.
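To make the idea concrete, here is a toy sketch of DisCoCat-style composition in plain NumPy: word meanings live in vector spaces, and the grammar dictates how they are contracted together. The vectors, the rank-3 verb tensor, and the one-dimensional sentence space below are all made-up illustrative choices, not anything taken from the papers above.

```python
import numpy as np

d = 2                          # dimension of the noun space N

alice = np.array([1.0, 0.0])   # noun: a vector in N
bob   = np.array([0.0, 1.0])   # noun: a vector in N

# A transitive verb has type N @ S @ N, i.e. a rank-3 tensor.
# Here the sentence space S is 1-dimensional, so a sentence's
# "meaning" is a single number (e.g. a plausibility score).
loves = np.zeros((d, 1, d))
loves[0, 0, 1] = 1.0           # make "Alice loves Bob" score highly

# Grammar reduction: contract the verb's noun wires with subject/object.
print(np.einsum('i,isj,j->s', alice, loves, bob))    # [1.] high score
print(np.einsum('i,isj,j->s', bob, loves, alice))    # [0.] reversed roles
```

In the quantum setting, these tensor contractions are what get mapped onto quantum circuits.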
This is, however, not how mainstream NLP is done, and it's not clear whether it can scale well enough to compete with the best available NLP models today, such as BERT and GPT-3.
The dominant NLP models today are transformers, which are built around a technique called attention.
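For reference, the core of attention is only a few lines. Here is a minimal NumPy sketch of single-head scaled dot-product attention; in a real transformer the queries, keys, and values come from learned projections of the token embeddings, which are omitted here for brevity.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted mix of values

# Toy example: 3 tokens with 4-dimensional embeddings,
# attending to themselves (self-attention without projections).
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)   # (3, 4): one contextualized vector per token
```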
This article, towardsdatascience.com/toward-a-quantum-transformer-a51566ed42c2, points out the need for a quantum transformer. This seems a natural extension of the classical transformer, but little research has been done in this direction so far.
In short, I think it's important to understand how NLP is done at the moment; the Coursera NLP specialization (www.coursera.org/specializations/natural-language-processing) offers an excellent introduction to the field. The attention mechanism is only covered in Course 4, so it will take a while to get there.
With that and the papers above, one should be in a very good place for some QNLP research.