Embeddings are ubiquitous in machine learning, appearing in recommender systems, NLP, and many other applications. Indeed, in the context of TensorFlow, it’s natural to view tensors (or slices of tensors) as points in space, so almost any TensorFlow system will naturally give rise to various embeddings.
To learn more about embeddings and how to train them, see the Vector Representations of Words tutorial. If you are interested in embeddings of images, check out this article for interesting visualizations of MNIST images. On the other hand, if you are interested in word embeddings, this article gives a good introduction.
TensorBoard has a built-in visualizer, called the Embedding Projector, for interactive visualization and analysis of high-dimensional data like embeddings. It is meant to be useful for developers and researchers alike. It reads from the checkpoint files where you save your TensorFlow variables. Although it’s most useful for embeddings, it will load any 2D tensor, potentially including your training weights.
Source: TensorBoard: Embedding Visualization
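As a minimal sketch of that workflow, assuming TensorFlow 2 and the `tensorboard.plugins.projector` plugin (the variable name, vocabulary size, and log directory below are placeholders, not anything prescribed by the original article), you can save an embedding variable to a checkpoint and point the projector at it like this:

```python
import os
import tensorflow as tf
from tensorboard.plugins import projector

log_dir = "logs/embedding"  # placeholder directory; pass it to `tensorboard --logdir`
os.makedirs(log_dir, exist_ok=True)

# A 2D tensor: one 64-dimensional embedding per item in a 1000-item vocabulary.
embedding = tf.Variable(tf.random.normal([1000, 64]), name="item_embedding")

# Optional metadata file: one label per embedding row, same order as the tensor.
with open(os.path.join(log_dir, "metadata.tsv"), "w") as f:
    for i in range(1000):
        f.write(f"item_{i}\n")

# Save the variable to a checkpoint that TensorBoard can read.
checkpoint = tf.train.Checkpoint(embedding=embedding)
checkpoint.save(os.path.join(log_dir, "embedding.ckpt"))

# Tell the Embedding Projector which saved tensor to visualize.
config = projector.ProjectorConfig()
emb_conf = config.embeddings.add()
emb_conf.tensor_name = "embedding/.ATTRIBUTES/VARIABLE_VALUE"
emb_conf.metadata_path = "metadata.tsv"
projector.visualize_embeddings(log_dir, config)
```

Running `tensorboard --logdir logs/embedding` should then show the saved tensor under the Projector tab.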
There’s a standalone version of the Embedding Projector as well, which you can use separately from TensorFlow at projector.tensorflow.org.
You can use it to explore how your model represents the data it has learned.
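For the standalone projector, a rough sketch (the shapes, file names, and labels here are placeholders) is to dump your embedding matrix and its labels as tab-separated files, which the site can load directly:

```python
import numpy as np

# Placeholder embedding matrix and labels; substitute your own model's values.
vectors = np.random.normal(size=(1000, 64))
labels = [f"item_{i}" for i in range(1000)]

# vectors.tsv: one tab-separated row per embedding.
np.savetxt("vectors.tsv", vectors, delimiter="\t")

# metadata.tsv: one label per line, in the same row order as vectors.tsv.
with open("metadata.tsv", "w") as f:
    f.write("\n".join(labels) + "\n")
```

Upload the two files through the projector’s Load button to browse the embeddings interactively.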