Meta embedding is the process of creating an embedding (representation) by combining one or more existing embeddings.[3] The term is used specifically in connection with [word embeddings](https://en.wikipedia.org/wiki/Word_embedding), which are used in the natural language processing community to represent the meanings of individual words.
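
Two simple ways to construct a meta-embedding from several source embeddings are concatenation and averaging of the source word vectors; averaging is the approach studied in reference 4. The following is a minimal illustrative sketch in Python with NumPy: the vocabularies and vector values are hypothetical toy data, and real source embeddings (e.g. GloVe or skip-gram vectors) would be loaded from pretrained files instead.

```python
import numpy as np

# Hypothetical source embeddings over the same vocabulary (toy values only).
source_a = {"cat": np.array([0.1, 0.3, 0.5]), "dog": np.array([0.2, 0.1, 0.4])}
source_b = {"cat": np.array([0.4, 0.2, 0.6]), "dog": np.array([0.3, 0.5, 0.1])}

def concat_meta_embedding(word, sources):
    """Meta-embedding by concatenating L2-normalised source vectors.
    Source embeddings may have different dimensionalities."""
    parts = [emb[word] / np.linalg.norm(emb[word]) for emb in sources]
    return np.concatenate(parts)

def average_meta_embedding(word, sources):
    """Meta-embedding by averaging L2-normalised source vectors.
    Requires all sources to share the same dimensionality."""
    vecs = [emb[word] / np.linalg.norm(emb[word]) for emb in sources]
    return np.mean(vecs, axis=0)

print(concat_meta_embedding("cat", [source_a, source_b]))   # 6-dimensional vector
print(average_meta_embedding("cat", [source_a, source_b]))  # 3-dimensional vector
```

More elaborate methods, such as locally linear or autoencoder-based meta-embedding (references 2 and 3), learn the combination rather than applying a fixed operation.
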
References
1. James O'Neill and Danushka Bollegala: *Semi-Supervised Multi-Task Word Embeddings*, arXiv preprint, 2018. [arXiv](https://arxiv.org/abs/1809.05886)
2. Cong Bao and Danushka Bollegala: *Learning Word Meta-Embeddings by Autoencoding*, Proc. of the 27th International Conference on Computational Linguistics (COLING), pp. 1650-1661, 2018. [PDF](http://danushka.net/papers/aeme.pdf) [Code](https://github.com/LivNLP/AEME)
3. Danushka Bollegala, Kohei Hayashi and Ken-ichi Kawarabayashi: *Think Globally, Embed Locally -- Locally Linear Meta-Embedding of Words*, Proc. of the International Joint Conference on Artificial Intelligence and European Conference on Artificial Intelligence (IJCAI-ECAI), pp. 3970-3976, 2018. [PDF](https://www.ijcai.org/proceedings/2018/0552.pdf) [Code](https://github.com/LivNLP/LLE-MetaEmbed) [Meta-embeddings](https://www.dropbox.com/s/xykbg65l3ir3rcj/glove%2Bsg%2BME.zip?dl=0)
4. Joshua Coates and Danushka Bollegala: *Frustratingly Easy Meta-Embedding -- Computing Meta-Embeddings by Averaging Source Word Embeddings*, Proc. of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT), pp. 194-198, 2018. [PDF](http://aclweb.org/anthology/N18-2031)