What’s New in MAX 24.4? MAX on macOS, Fast Local Llama3, Native Quantization and GGUF Support

MAX
AI
LLM
Release
Exploring the new features in MAX 24.4, including macOS support and Llama3 - originally published on Modular's blog.
Author

Ehsan M. Kermani

Published

June 25, 2024

I wrote a blog post on the Modular blog announcing the new features in MAX 24.4.

This release brings exciting capabilities, including native macOS support, fast local Llama3 inference, native quantization, and GGUF format support.

Key topics covered:

- Native macOS support for MAX
- Fast local Llama3 inference
- Native quantization
- GGUF format support

Read the full article: What's New in MAX 24.4?