We have hosted the application Bootstrap Your Own Latent (BYOL) so that you can run it on our online workstations with Wine or directly.
Quick description of Bootstrap Your Own Latent (BYOL):
Practical implementation of an astoundingly simple method for self-supervised learning that achieves a new state of the art (surpassing SimCLR) without contrastive learning and without having to designate negative pairs. This repository offers a module with which one can easily wrap any image-based neural network (residual network, discriminator, policy network) to immediately start benefiting from unlabelled image data. There is now new evidence that batch normalization is key to making this technique work well. A new paper has successfully replaced batch norm with group norm + weight standardization, refuting that batch statistics are needed for BYOL to work. Simply plug in your neural network, specifying (1) the image dimensions and (2) the name (or index) of the hidden layer whose output is used as the latent representation for self-supervised training (see the usage sketch after the feature list below).
Features:
- Practical implementation of an astoundingly simple method
- Group norm + weight standardization
- Simply plug in your neural network
- BYOL does not even need the target encoder to be an exponential moving average of the online encoder
- Fetch the embeddings or the projections (see the second sketch after this list)
- Without contrastive learning
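A minimal usage sketch, assuming the module follows the byol-pytorch interface (a BYOL wrapper class that takes the network, the image size, and the hidden layer name or index, plus an update_moving_average() call for the target encoder); the names below are illustrative and may differ in the actual module.

```python
import torch
from torchvision import models
from byol_pytorch import BYOL  # assumed package/class name

# any image-based backbone works; a torchvision ResNet is used here as an example
resnet = models.resnet50(pretrained=True)

learner = BYOL(
    resnet,
    image_size=256,          # (1) the image dimensions
    hidden_layer='avgpool'   # (2) name (or index) of the hidden layer whose output is the latent representation
)

optimizer = torch.optim.Adam(learner.parameters(), lr=3e-4)

def sample_unlabelled_images():
    # stand-in for a real loader of unlabelled images
    return torch.randn(20, 3, 256, 256)

for _ in range(100):
    images = sample_unlabelled_images()
    loss = learner(images)            # BYOL loss on two augmented views of the batch
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    learner.update_moving_average()   # update the exponential moving average of the target encoder

# save the improved backbone for downstream tasks
torch.save(resnet.state_dict(), './improved-net.pt')
```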
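A short sketch of fetching the embeddings or the projections for downstream use, again assuming a byol-pytorch-style forward flag (return_embedding); the exact argument name may vary between versions.

```python
# assumes `learner` and `torch` from the sketch above
imgs = torch.randn(2, 3, 256, 256)

# with return_embedding=True (assumed flag), the wrapper returns the projection
# and the raw hidden-layer embedding instead of the BYOL loss
projection, embedding = learner(imgs, return_embedding=True)
```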
Programming Language: Python.