VectorInstitute/atomgen
Library for handling atomistic graph datasets, focusing on transformer-based implementations, with utilities for training various models, experimenting with different pre-training tasks, and a suite of pre-trained models with Hugging Face integrations
This is a library for researchers and scientists working with atomistic graph datasets, especially those using transformer-based models. It simplifies collecting, standardizing, and using diverse datasets such as S2EF, Molecule3D, and PDB. The library provides tools for training models on these datasets and includes pre-trained models for tasks such as predicting energies and forces, helping accelerate materials discovery and drug design.
Use this if you are a machine learning researcher or computational chemist developing or experimenting with transformer models for atomistic structures and their properties.
Not ideal if you are an end-user needing a ready-to-use application for materials simulation without custom model development.
Stars
8
Forks
1
Language
Python
License
Apache-2.0
Category
Last pushed
Mar 12, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/VectorInstitute/atomgen"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
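The curl command above can also be issued from Python. A minimal sketch, assuming only the endpoint shown above; the helper function name and the shape of the JSON response are assumptions, not part of the documented API:

```python
# Sketch: fetch the quality data for a repository from the endpoint above.
# quality_url() is a hypothetical helper; the response schema is not documented here.
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

url = quality_url("VectorInstitute", "atomgen")
# Uncomment to hit the live endpoint (subject to the 100 requests/day limit):
# data = json.load(urlopen(url))
```

The anonymous tier allows 100 requests/day; a free key raises that to 1,000/day, so batch lookups should cache responses locally.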
Higher-rated alternatives
rxn4chemistry/rxn-onmt-models
Training of OpenNMT-based RXN models
CTCycle/ADSMOD-Adsorption-Modeling
Streamline adsorption modeling by automatically fitting theoretical adsorption models to...
sanjaradylov/smiles-gpt
Generative Pre-Training from Molecules
lamalab-org/MatText
Text-based modeling of materials.
mikemayuare/apetokenizer
Tokenizer for chemical SMILES and SELFIES for use in transformer models.