Meta embedding [3] is the process of creating an embedding (representation) from one or more existing embeddings. The term is used specifically in connection with [word embeddings](https://en.wikipedia.org/wiki/Word_embedding), which the natural language processing community uses to represent the meanings of individual words.
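The simplest meta-embedding methods combine the source embeddings directly, for example by concatenation or by averaging the (normalized) source vectors, the latter being the approach studied in [4]. The following is a minimal sketch in Python with NumPy illustrating both; the toy dictionaries `source_a` and `source_b`, their three-dimensional vectors, and the function names are hypothetical placeholders, not code from any of the referenced implementations.

```python
import numpy as np

# Hypothetical source embeddings over a shared vocabulary; in practice
# these would come from pre-trained models such as GloVe or skip-gram.
source_a = {"cat": np.array([0.1, 0.3, 0.5]), "dog": np.array([0.2, 0.1, 0.4])}
source_b = {"cat": np.array([0.4, 0.2, 0.6]), "dog": np.array([0.3, 0.5, 0.1])}

def l2_normalize(v: np.ndarray) -> np.ndarray:
    """Scale a vector to unit length, a common preprocessing step."""
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

def concat_meta_embedding(word: str) -> np.ndarray:
    """Meta-embedding by concatenating the normalized source vectors."""
    return np.concatenate([l2_normalize(source_a[word]),
                           l2_normalize(source_b[word])])

def avg_meta_embedding(word: str) -> np.ndarray:
    """Meta-embedding by averaging the normalized source vectors;
    assumes both sources share the same dimensionality."""
    return (l2_normalize(source_a[word]) + l2_normalize(source_b[word])) / 2.0

print(concat_meta_embedding("cat"))  # 6-dimensional meta-embedding
print(avg_meta_embedding("cat"))     # 3-dimensional meta-embedding
```

Concatenation preserves all information from the sources at the cost of a higher-dimensional result, while averaging keeps the original dimensionality; more sophisticated methods, such as the autoencoding [2] and locally linear [3] approaches below, learn the combination instead.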



References

1. James O'Neill and Danushka Bollegala: *Semi-Supervised Multi-Task Word Embeddings*, arXiv, 2018. [arXiv](https://arxiv.org/abs/1809.05886)

2. Cong Bao and Danushka Bollegala: *Learning Word Meta-Embeddings by Autoencoding*, Proc. of the 27th International Conference on Computational Linguistics (COLING), pp. 1650-1661, 2018. [PDF](http://danushka.net/papers/aeme.pdf) [CODE](https://github.com/LivNLP/AEME)

3. Danushka Bollegala, Kohei Hayashi and Ken-ichi Kawarabayashi: *Think Globally, Embed Locally -- Locally Linear Meta-Embedding of Words*, Proc. of the International Joint Conference on Artificial Intelligence and European Conference on Artificial Intelligence (IJCAI-ECAI), pp. 3970-3976, 2018. [PDF](https://www.ijcai.org/proceedings/2018/0552.pdf) [CODE](https://github.com/LivNLP/LLE-MetaEmbed) [meta-embeddings](https://www.dropbox.com/s/xykbg65l3ir3rcj/glove%2Bsg%2BME.zip?dl=0)

4. Joshua Coates and Danushka Bollegala: *Frustratingly Easy Meta-Embedding -- Computing Meta-Embeddings by Averaging Source Word Embeddings*, Proc. of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT), pp. 194-198, 2018. [PDF](http://aclweb.org/anthology/N18-2031)