Embeddings are dense vectors that capture semantic meaning and relationships. Words with similar meanings get similar vector representations, which lets machines perform "math" on language.
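The "math on language" idea can be sketched with toy vectors. The numbers below are illustrative stand-ins, not values from a trained model; the classic analogy king − man + woman ≈ queen falls out of nearest-neighbor search by cosine similarity:

```python
import numpy as np

# Toy 4-dimensional "embeddings" (real ones have hundreds of dimensions);
# these values are hand-picked for illustration, not from a trained model.
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.2, 0.9, 0.1, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9, 0.1]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0 means unrelated.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Vector arithmetic on meaning: king - man + woman lands near queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max(emb, key=lambda w: cosine(emb[w], target))
print(best)  # → queen
```

With real Word2Vec or GloVe vectors the same arithmetic works over a vocabulary of hundreds of thousands of words.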
Real embeddings have hundreds of dimensions (Word2Vec is typically 300; BERT uses 768 or 1024). This 3D view is a dimensionality-reduced projection (e.g., via PCA) made for human visualization. In the full space, conceptual clusters sit amid thousands of other, seemingly unrelated words.
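The projection step can be sketched as PCA via SVD. The embeddings here are random stand-ins for a real trained model, used only to show the shape of the computation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake 300-dimensional embeddings for 50 words (random placeholders
# for vectors from a real model such as Word2Vec).
X = rng.normal(size=(50, 300))

# PCA via SVD: center the data, then project onto the top 3
# principal components to get 3-D coordinates for plotting.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X3 = Xc @ Vt[:3].T

print(X3.shape)  # → (50, 3)
```

The three axes of the resulting view are the directions of greatest variance, so nearby points in 3D were (usually) nearby in the original 300-dimensional space, though some distances are inevitably distorted by the projection.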