Neural embeddings are one of the most popular techniques in Natural Language Processing for representing word similarity. There are many variations and implementations of this concept, from Word2Vec's skip-gram architecture to GloVe. But embeddings are a far more powerful concept that can be applied well beyond NLP. This talk will present applications of embeddings in recommendation engines, item similarity detection, and market basket analysis through frequent item search.
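As a minimal sketch of the item-similarity idea mentioned above: once each item (a word, a product) is mapped to a dense vector, "similar" items are simply nearest neighbors under cosine similarity. The vectors below are hand-made toy values for illustration only, not trained embeddings; in practice they would come from a model such as Word2Vec or GloVe trained on baskets or text.

```python
import numpy as np

# Toy, hand-made embeddings for illustration only -- real vectors
# would come from a trained model (Word2Vec, GloVe, etc.).
embeddings = {
    "beer":    np.array([0.9, 0.1, 0.0]),
    "chips":   np.array([0.8, 0.2, 0.1]),
    "shampoo": np.array([0.0, 0.1, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: dot product of the normalized vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(item, k=2):
    # Rank every other item by cosine similarity to `item`.
    sims = [(other, cosine(embeddings[item], vec))
            for other, vec in embeddings.items() if other != item]
    return sorted(sims, key=lambda pair: pair[1], reverse=True)[:k]

print(most_similar("beer"))  # "chips" ranks above "shampoo"
```

The same nearest-neighbor lookup works whether the vectors represent words, products in a basket, or items in a catalog; only the training data changes.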