Notebook Gallery#

Here, you can find static renders of EncoderMap’s example notebooks. You can run them interactively on Google Colab or MyBinder. You can also run them on your local machine by cloning EncoderMap’s repository.

Starter Notebooks#

The starter notebooks help you with your first steps with EncoderMap.

  • Basic Cube Example (Getting started: Basic Cube): Get started with EncoderMap.
  • Advanced Asp7 (Advanced Usage: Asp 7): Advanced EncoderMap usage with MD data.
  • Your Data: Upload your own data and use this notebook.
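
If you only want a feel for the workflow these starter notebooks walk through, here is a minimal Python sketch modelled on the basic cube example. The random input, the short training run, and the parameter values are placeholders rather than the notebook's actual settings; the notebooks use the real cube dataset and explain each parameter choice.

    import numpy as np
    import encodermap as em

    # Placeholder input: the basic cube notebook uses a point cloud sampled
    # around the corners of a cube instead of random numbers.
    high_d_data = np.random.random((1000, 3)).astype("float32")

    # Select parameters (these values are only meant for a quick smoke test).
    parameters = em.Parameters()
    parameters.n_steps = 100
    parameters.periodicity = float("inf")  # the toy data is not periodic

    # Perform the dimensionality reduction.
    e_map = em.EncoderMap(parameters, high_d_data)
    e_map.train()

    # Project into the low-dimensional space and generate high-dimensional
    # data back from it, as the notebook does in its final sections.
    low_d = e_map.encode(high_d_data)
    generated = e_map.generate(low_d)

From here, low_d can be scatter-plotted and generated compared with the input, which mirrors the plotting and generation steps listed in the tables of contents below.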

Intermediate Notebooks#

The intermediate notebooks introduce more advanced techniques and explore newer features of EncoderMap.

  • Training diverse topologies (Intermediate EncoderMap: Different Topologies)

MD Notebooks#

The MD notebooks describe in more detail how EncoderMap deals with MD data. They help you save and load large MD datasets and use them to train EncoderMap, and they explain the terms feature space and collective variable.

  • Trajectory Ensembles (Working with trajectory ensembles)
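
Before opening the notebook, it can help to see what a collective variable (CV) is in the simplest terms: a function that maps atomic coordinates to a small number of values. The sketch below computes one classic CV, a dihedral (torsion) angle, from four made-up atom positions using plain NumPy. It is illustration only and not EncoderMap's featurizer; the notebook shows how EncoderMap's SingleTraj and TrajEnsemble classes and its featurizer compute CVs for whole trajectory ensembles.

    import numpy as np

    def dihedral(p0, p1, p2, p3):
        """Torsion angle in radians defined by four points, a typical backbone CV."""
        b0 = p0 - p1
        b1 = p2 - p1
        b2 = p3 - p2
        b1 = b1 / np.linalg.norm(b1)
        # Components of b0 and b2 perpendicular to the central bond b1.
        v = b0 - np.dot(b0, b1) * b1
        w = b2 - np.dot(b2, b1) * b1
        return np.arctan2(np.dot(np.cross(b1, v), w), np.dot(v, w))

    # Four atom positions from a single, made-up frame.
    atoms = np.array([[0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [1.0, 0.0, 1.0]])
    print(np.degrees(dihedral(*atoms)))  # one scalar CV for this frame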

Customization Notebooks#

These notebooks help you customize EncoderMap. They can assist you in understanding how EncoderMap trains on your data. Furthermore, you will learn how to implement new cost functions and vary the learning rate of the neural network.

  • Custom Scalars (Customize EncoderMap: Logging Custom Scalars): Monitor in TensorBoard.
  • Custom Loss (Customize EncoderMap: Custom loss functions): Add new loss functions.
  • Custom Images (Logging Custom Images): Write images to TensorBoard.
  • Learning Rate Scheduler (Learning Rate Schedulers): Adjust the learning rate.
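
A cost (loss) term can be surprisingly small. As a taste of the custom loss notebook, the sketch below expresses the idea behind its "unit circle cost" in plain TensorFlow: it penalizes 2D latent points whose distance from the origin deviates from one. This is only the standalone concept; how such a term is actually added to EncoderMap's losses and logged to TensorBoard is what the notebooks above demonstrate.

    import tensorflow as tf

    def unit_circle_loss(latent):
        # Mean squared deviation of the latent points' radii from 1,
        # which pulls the 2D projection onto the unit circle.
        radii = tf.norm(latent, axis=1)
        return tf.reduce_mean(tf.square(radii - 1.0))

    # Quick check on random 2D latent coordinates.
    latent = tf.random.normal((8, 2))
    print(unit_circle_loss(latent).numpy())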

Publication Notebooks#

These notebooks contain the analysis code of an upcoming publication, “EncoderMap III: A dimensionality reduction package for feature exploration in molecular simulations”, featuring the new version of the EncoderMap package. Trained network weights are available upon reasonable request by raising an issue on GitHub (AG-Peter/encodermap#issues) or by contacting the authors of the publication.

  • Publication Figure 1: Notebook to create Figure 1 of the publication.
  • Publication Figure 2: Notebook to create Figure 2 of the publication.
  • Publication Figure 3: Notebook to create Figure 3 of the publication.
  • Publication Figure 5: Notebook to create Figure 5 of the publication.

Starter Notebooks

  • Getting started: Basic Cube
    • Import Libraries
    • Load Data
    • Select Parameters
    • Get more info about parameters
    • Perform Dimensionality Reduction
    • Generate High-Dimensional Data
    • Conclusion
  • Advanced Usage: Asp 7
    • Primer
      • Imports and load data
      • Periodic variables
      • Parameter selection
    • Visualize Learning with TensorBoard
      • Running TensorBoard on Google Colab
      • Running TensorBoard locally
      • Save and Load
    • Generate Molecular Conformations
    • Conclusion
  • Your Data
    • Load Libraries
    • Load Your Data
    • Set Parameters
    • Run the Dimensionality Reduction
    • Plot the Results

MD Notebooks

  • Working with trajectory ensembles
    • For Google Colab only:
      • Primer
    • Collective Variables
    • Example CVs
    • Sharing MD data
      • Classes for working with MD data
    • The new SingleTraj class
    • The TrajEnsemble class contains multiple SingleTrajs
      • Loading CVs
    • From numpy
    • Slicing with CVs
    • Loading from files
    • Loading with EncoderMap’s featurizer
    • Writing custom features No 1
    • Writing custom features No 2
      • Saving trajectory and CVs into one file
      • Saving a complete trajectory Ensemble
      • Conclusion

Customization Notebooks

  • Customize EncoderMap: Logging Custom Scalars
    • For Google Colab only:
      • Import libraries
      • Adding custom scalars to TensorBoard by subclassing EncoderMapBaseMetric
      • Use the y_true and y_pred parameters in the update() function
      • Conclusion
    • Getting input data
    • Setting parameters
    • Subclassing the SequentialModel
    • Changing what happens in a training step
    • Running EncoderMap with the new model
  • Customize EncoderMap: Custom loss functions
    • Import Libraries
    • What are loss functions
    • Cost functions
      • Cost functions in TensorFlow
      • Cost functions in EncoderMap
    • Custom cost functions
      • Triplet cost (contrastive learning)
      • Adding a unit circle cost
      • Train
      • Output
    • Loading logs into jupyter notebook
    • Conclusion
    • References
  • Logging Custom Images
    • For Google Colab only:
      • Logging via a custom function
    • Provide this function to EncoderMap
    • Train EncoderMap with our new function.
    • Output
      • Writing custom callbacks
    • Polar coordinates
    • Subclassing EncoderMapBaseCallback
    • Adding the callback to EncoderMap
    • Output
      • Conclusion
  • Learning Rate Schedulers
    • For Google Colab only:
      • Import Libraries
      • Why learning rate schedulers? A linear regression example
      • Log the current learning rate to TensorBoard
    • Running TensorBoard on Google Colab
    • Running TensorBoard locally
    • Subclassing EncoderMap’s EncoderMapBaseCallback
      • Write a learning rate scheduler
      • Conclusion

Publication Notebooks

  • Publication Figure 1
    • Imports
    • Trained network weights
      • Load MD data from KonDATA
      • Add collective variables to MD data by featurization
      • Create a TensorFlow dataset
      • Plot images during training
      • Train / load trained network
      • Look at the images that have been saved during training
      • Final image
  • Publication Figure 2
    • Imports
    • Trained network weights
      • Load MD data from KonDATA
      • Featurize
      • Create EncoderMap
      • Train / Load trained network
      • Images
  • Publication Figure 3
    • Imports
    • Trained network weights
      • Get data
    • “Classical” methods
      • MDS
      • PCA
      • Network methods
      • Final image
  • Publication Figure 5
    • Imports
    • Trained network weights
      • Load MD Data from KonDATA
      • Add custom topologies
      • Load CVs
      • Train / Load trained network
      • Project all data
      • Images of low-dimensional projections
      • Cluster and paths

Static Code Examples

  • Static Code Examples
    • Cube Example
    • Cube Distance Analysis
    • Trp Cage
    • Dihedral to Cartesian diUbi

© Copyright 2024, Kevin Sawade, Tobias Lemke, University of Konstanz.
