⚙️ Prerequisites
- If you haven’t already, install the Nexa SDK by following the installation guide.
- MLX models run only on macOS (Apple silicon). We recommend a device with at least 16 GB of RAM.
- Below are the MLX-compatible model types you can experiment with right away.
LLM - Language Models
📝 Language models in MLX format. Try out this quick example:
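A minimal sketch of the CLI call, assuming the `nexa infer` command; the model name below is a placeholder, so substitute any MLX model from the Supported Model List.

```bash
# Pull and run an MLX language model, then start an interactive chat session.
# The model name is a placeholder — pick one from the Supported Model List.
nexa infer NexaAI/Qwen3-4B-4bit-MLX
```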
⌨️ Once the model loads, type or paste multi-line text directly into the CLI to chat with the model.
LMM - Multimodal Models
🖼️ Language models that also accept vision and/or audio inputs, in MLX format. Try out this quick example:
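A hedged example along the same lines, again assuming the `nexa infer` command; the multimodal model name is a placeholder.

```bash
# Pull and run an MLX multimodal model; once it is running you can attach images or audio.
# The model name is a placeholder — pick a multimodal model from the Supported Model List.
nexa infer NexaAI/gemma-3n-E4B-it-4bit-MLX
```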
⌨️ Drag photos or audio clips directly into the CLI — you can even drop multiple images at once!
Supported Model List
We curated a list of top, high-quality models in MLX format. Many MLX models in the Hugging Face mlx-community have quality issues and may not run locally. We recommend using models from our collection for best results.
- Create an account at sdk.nexa.ai
- Generate a token: Go to Deployment → Create Token
- Activate your SDK: Run the following command in the terminal to set your license:
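A sketch of the activation step, assuming the `nexa config set license` subcommand; check the installation guide if your SDK version uses a different syntax.

```bash
# Store your access token so models from the Nexa collection can be pulled and run.
# Replace <your_token> with the token generated under Deployment → Create Token.
nexa config set license '<your_token>'
```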
🙋 Request New Models
Missing a model? Vote for it on the Nexa Wishlist; we build the most-voted models fast! You can also submit an issue on the nexa-sdk GitHub or request one in our Discord/Slack community.