Word Embeddings in #nlp and its Applications
▻https://hackernoon.com/word-embeddings-in-nlp-and-its-applications-fab15eaf7430?source=rss----3
Word embeddings are a form of word representation that bridges human understanding of language with that of a machine: they are distributed representations of text in an n-dimensional space, and they are essential for solving most NLP problems.

Domain adaptation is a technique that allows machine learning and transfer learning models to adapt to niche datasets that are written in the same language but are still linguistically different. For example, legal documents, customer survey responses, and news articles are all distinct datasets that need to be analyzed differently. A common task in spam filtering involves adapting a model from one user (the source distribution) to a new user who receives significantly different emails (the target (...)
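The idea that embeddings place words as points in an n-dimensional space, with similar words close together, can be sketched with hand-picked toy vectors (these values are illustrative assumptions, not trained embeddings) and cosine similarity:

```python
import math

# Toy 4-dimensional embeddings (hand-picked illustrative values, NOT
# trained): each word is a point in an n-dimensional space, and
# semantically similar words should sit close together.
embeddings = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.9, 0.7, 0.2, 0.3],
    "apple": [0.1, 0.2, 0.9, 0.8],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words score high; unrelated words score low.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```

In a real system the vectors would come from a trained model such as Word2Vec or GloVe; the distance computation, however, works the same way.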
#artificial-intelligence #machine-learning #word-embeddings-nlp #word-embeddings