Instructions for using physical-intelligence/fast with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use physical-intelligence/fast with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("physical-intelligence/fast", dtype="auto")
```
Processor configuration:

```json
{
  "action_dim": null,
  "auto_map": {
    "AutoProcessor": "processing_action_tokenizer.UniversalActionProcessor"
  },
  "min_token": -354,
  "processor_class": "UniversalActionProcessor",
  "scale": 10,
  "time_horizon": null,
  "vocab_size": 2048
}
```
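Since the config's `auto_map` registers a custom `UniversalActionProcessor` under `AutoProcessor`, the tokenizer is typically loaded with `trust_remote_code=True` rather than through `AutoModel`. The sketch below is a hedged illustration, not the repository's documented API: the `tokenize_actions` helper and the `(50, 14)` example chunk are assumptions for illustration, and the `tokenizer(...)` / `tokenizer.decode(...)` calls assume the processor follows the usual Hugging Face callable/decode convention. Because `action_dim` and `time_horizon` are `null` in the config, the chunk shape is presumably unconstrained.

```python
import numpy as np

def tokenize_actions(action_chunk: np.ndarray):
    """Hypothetical helper: encode a chunk of continuous robot actions
    with the FAST action tokenizer, then decode them back.

    Requires network access on first call and trust_remote_code=True,
    since the processor class is loaded from the repo's auto_map entry.
    """
    from transformers import AutoProcessor

    tokenizer = AutoProcessor.from_pretrained(
        "physical-intelligence/fast", trust_remote_code=True
    )
    tokens = tokenizer(action_chunk)    # actions -> discrete token ids
    decoded = tokenizer.decode(tokens)  # approximate reconstruction
    return tokens, decoded

# Hypothetical input: 50 timesteps of a 14-dimensional action vector,
# normalized to roughly [-1, 1] as is common for robot action data.
example_chunk = np.random.uniform(-1.0, 1.0, size=(50, 14)).astype(np.float32)
```

Keeping the `AutoProcessor` import inside the helper means the module can be inspected without `transformers` installed; the download only happens when the helper is actually called.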