AI Roundup: OpenAI Buys Astral, Qwen 397B on MacBook, Cursor Gets Smarter
Big moves in the AI world this week: OpenAI acquires Python tooling powerhouse Astral, a researcher gets a 397B model running on a laptop, and Cursor drops Composer 2 powered by Kimi.
Tob
Backend Developer
The AI space keeps moving fast. This week brought some genuinely interesting developments that matter if you write code.
TL;DR: OpenAI just bought Astral (the company behind uv, ruff, and ty). A researcher got Qwen3.5-397B running on a MacBook using the "LLM in a Flash" technique from Apple's research. Cursor launched Composer 2, powered by Kimi K2.5.
OpenAI Acquires Astral
This is the biggest story of the week if you work with Python. Astral, the company behind three increasingly essential open source tools, is now joining OpenAI.
The tools in question are:
- uv - A blazing fast Python package installer and resolver
- ruff - An extremely fast linter written in Rust
- ty - The new type checker everyone's been watching
This acquisition raises obvious questions: will these tools stay open source, or will development move behind closed doors? The Python community has built real dependencies on them, and the uncertainty is uncomfortable.
On the positive side, OpenAI has the resources to push these tools forward faster. The maintainers (Charlie Marsh and others) will now have serious backing. The tradeoff is obvious: independence versus resources.
What matters most is what happens to the open source licenses. Watch this space.
Running Qwen 397B on a MacBook
Here's something wild. Dan Woods figured out how to run Qwen3.5-397B on a 48GB MacBook Pro M3 Max at 5.5+ tokens per second.
The trick? Using Apple's "LLM in a Flash" research from 2023. Qwen3.5-397B is a Mixture-of-Experts (MoE) model. Only about 10 experts activate for each token, meaning you don't need all 397B parameters in RAM at once.
He quantized the expert weights to 2-bit, keeping only 5.5GB resident in memory while streaming the rest from SSD. The technique involves reading data in larger, contiguous chunks optimized for flash memory characteristics.
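A quick back-of-envelope check, using only the numbers quoted above, shows why streaming from SSD is unavoidable: even quantized to 2-bit, the full 397B parameters don't come close to fitting in 48GB of RAM.

```python
# Rough memory arithmetic for the figures in the post. Only the parameter
# count (397B), bit-width (2-bit), and resident footprint (5.5GB) come from
# the article; everything else is plain unit conversion.

def model_gib(params: float, bits_per_param: float) -> float:
    """Storage needed for `params` weights at `bits_per_param` bits each, in GiB."""
    return params * bits_per_param / 8 / 1024**3

total = model_gib(397e9, 2)   # the whole model at 2-bit: ~92 GiB, well over 48 GB of RAM
resident = 5.5                # what the technique actually keeps in memory
streamed = total - resident   # everything else is paged in from SSD per token

print(f"full model at 2-bit: {total:.0f} GiB, streamed from SSD: {streamed:.0f} GiB")
```

That gap is what the MoE structure makes survivable: with only ~10 experts active per token, the working set per step is a small slice of those ~92 GiB, so the SSD reads stay manageable.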
The result isn't production-ready quality, but it shows where local AI is heading. Your laptop running 400B parameter models might not be far off.
Cursor Composer 2 and Kimi
Cursor dropped Composer 2 this week, and it turns out it's powered by Kimi K2.5 from Moonshot AI. The integration goes through Fireworks AI's hosted RL and inference platform.
This is interesting for a few reasons. Kimi K2.5 is an open model (from the Chinese AI scene), and it's now driving one of the most popular AI coding tools in the West. The open model ecosystem continues to surprise.
Composer 2 brings frontier-level coding performance at $0.50/M input tokens and $2.50/M output tokens (standard), or $1.50/M and $7.50/M for the fast option.
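To make those per-million-token rates concrete, here's a small cost sketch. The prices are the ones quoted above; the session token counts are made-up illustrative values, not Cursor figures.

```python
# Cost comparison for Composer 2's two pricing tiers (rates from the post).
# The session sizes below are hypothetical examples.

def cost_usd(in_tokens: int, out_tokens: int, in_per_m: float, out_per_m: float) -> float:
    """Total cost given token counts and per-million-token prices."""
    return in_tokens / 1e6 * in_per_m + out_tokens / 1e6 * out_per_m

# Assumed session: 2M input tokens (lots of context), 400K output tokens.
standard = cost_usd(2_000_000, 400_000, 0.50, 2.50)
fast = cost_usd(2_000_000, 400_000, 1.50, 7.50)

print(f"standard: ${standard:.2f}, fast: ${fast:.2f}")  # standard: $2.00, fast: $6.00
```

At these rates the fast tier is a flat 3x multiplier on both input and output, so the premium scales linearly with usage.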
The bigger picture: AI coding tools are consolidating around different model backends. Cursor went with Kimi. Others use Claude or GPT. The competition is healthy.
Sources: Simon Willison, Hacker News, Cursor Blog, Kimi.ai Twitter, GitHub