Aim 3.3 — Audio & Text tracking, Plotly & Colab integrations

Audio & Text tracking

Hey community, Aim 3.3 is now available! 😊

We are on a mission to democratize AI dev tools. Huge thanks to the awesome Aim community for testing the early versions and for their contributions.

  • Audio & Text tracking
  • Scatter plot explorer
  • Colab integration
  • Plotly integration
  • Images visualization on run details page

Aim 3.3 is full of highly requested features, with lots of items crossed off the public Aim roadmap.

New Demo: play.aimstack.io:10004

Demo Code: github.com/osoblanco/FastSpeech2

Special thanks to Erik Arakelyan (osoblanco), Krystian Pavzkowski (krstp), Flaviu Vadan, SSamDav and AminSlk for their contributions and feedback.

Audio tracking and exploration

Now you can track audio files with Aim during speech-to-text and other audio experiments.

You can track generated audio with context, query it, and observe its evolution over the course of an experiment: inputs, outputs, and ground truth alike.

Just like metric tracking, audio tracking comes with a simple API.

from aim import Audio, Run

my_run = Run()  # Create a run to track against

for step in range(1000):
    my_run.track(
        Audio(arr),          # Pass an audio file or a numpy array
        name='outputs',      # The name of the audio sequence
        step=step,           # Step index (optional)
        epoch=0,             # Epoch (optional)
        context={            # Context (optional)
            'subset': 'train',
        },
    )
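If it helps, the `arr` above can be any array-like audio buffer. Here is a minimal, self-contained sketch of building one (a made-up one-second 440 Hz sine wave; the sample rate is an assumption for illustration):

```python
import numpy as np

# One second of a 440 Hz sine wave at a 22,050 Hz sample rate --
# the kind of numpy array you could pass to aim.Audio above.
sample_rate = 22050
t = np.linspace(0.0, 1.0, sample_rate, endpoint=False)
arr = 0.5 * np.sin(2 * np.pi * 440.0 * t)
```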

Here is how it looks in the UI.

Aim: Audio tracking and exploration

Text tracking and visualization

Use this to compare text inputs and outputs during training.

Similarly, you can track text with Aim during your NLP experiments.

Here is what the code looks like:

from aim import Run, Text

my_run = Run()  # Create a run to track against

for step in range(1000):
    my_run.track(
        Text(string),        # Pass the string you want to track
        name='outputs',      # The name of the text sequence
        step=step,           # Step index (optional)
        epoch=0,             # Epoch (optional)
        context={            # Context (optional)
            'subset': 'train',
        },
    )
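One common pattern for comparing inputs and outputs is to format each sample into a single readable string before tracking it. A small sketch (the helper and the sample values are hypothetical, not part of the Aim API):

```python
# Hypothetical helper: format one evaluation sample into the
# string you would pass to aim.Text above.
def format_sample(source: str, target: str, prediction: str) -> str:
    return (
        f"source:     {source}\n"
        f"target:     {target}\n"
        f"prediction: {prediction}"
    )

string = format_sample("bonjour", "hello", "hallo")
```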

This is the end result in the UI.

Aim: Text tracking and visualization
training info tracked as text

Colab integration

After we integrated Aim with Jupyter, there were many requests to enable Aim on Colab too. Now it has arrived! 😊

With the fully embedded Aim UI, you can now track and follow your experiments live without leaving your Colab environment!

Here is an example Colab notebook to get started with.

Note: please make sure to run all the cells so the UI works as well.
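For reference, a typical notebook setup looks roughly like this, per the Aim notebook extension docs (shown with notebook cell magics; adjust to your environment):

```
!pip install aim

%load_ext aim
%aim up
```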

This is how it looks in your browser:

Aim: Colab integration

Plotly integration

Now you can track your custom Plotly charts and visualize them on Aim with Plotly's native interactive capabilities baked in.

This is also a great way to track all your relevant Plotly visualizations per step and have them rendered and navigated in Aim alongside everything else already there.

from aim import Figure, Run

my_run = Run()  # Create a run to track against

for step in range(1000):
    my_run.track(
        Figure(fig_obj),     # Pass any plotly figure
        name='plotly_bars',  # The name of the figure sequence
        step=step,           # Step index (optional)
        epoch=0,             # Epoch (optional)
        context={            # Context (optional)
            'subset': 'train',
        },
    )

Here is the end result in the Aim web UI.

Aim: Plotly integration

Images visualization on run details page

When we launched image tracking and visualization in 3.1, images were not yet available on the single run page. Now, besides the explorer, you can observe and search through images on the single run page as well.

Here is how it looks in the UI:

Aim: Images visualization on run details page

Learn More

Aim is on a mission to democratize AI dev tools.

We have been incredibly lucky to get help and contributions from the amazing Aim community. It’s humbling and inspiring.

Try out Aim, join the Aim community, share your feedback, open issues for new features and bugs.

And don’t forget to leave Aim a star on GitHub for support 🙌.
