I just got an M1 Max. I searched for "battery life AI" in this and the LocalLLM subreddits and didn't find any relevant information.
When I run local models (Llama 70B 4-bit) on my M1 Max, the battery life plunges instantly. One of the reasons for getting this machine, and potentially moving to the M4/5 Max, was to be able to run local AI. I understand that large models fully load the GPU. But I am seeing 5% drops for any sustained session of even a few minutes of VS Code with local models.
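For what it's worth, a back-of-envelope check suggests that kind of drain is plausible, not a bad battery. The figures below are assumptions, not measurements: roughly a 100 Wh battery (the 16" M1 Max MacBook Pro is ~99.6 Wh) and ~35 W sustained package draw during 70B 4-bit inference.

```python
# Back-of-envelope battery drain estimate. Both constants are assumptions:
BATTERY_WH = 100.0      # 16" M1 Max MacBook Pro battery is ~99.6 Wh
SUSTAINED_WATTS = 35.0  # assumed package draw during 70B 4-bit inference

def minutes_per_percent(battery_wh: float, watts: float) -> float:
    """Minutes of runtime per 1% of battery at a constant power draw."""
    hours_total = battery_wh / watts  # full-battery runtime in hours
    return hours_total * 60 / 100     # minutes per 1% of capacity

drain_pct_in_10_min = 10 / minutes_per_percent(BATTERY_WH, SUSTAINED_WATTS)
print(f"{drain_pct_in_10_min:.1f}% drained in 10 minutes")  # ≈ 5.8%
```

So under those assumptions, ~5% in 10 minutes is about what constant GPU-bound inference would do to any battery of this size, regardless of battery health.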
So I wanted to ask here:
- Is this typical of the M1 Max, or do I have a bad battery (battery health is 90%)?
- If I get an M4/5 Max with 128GB to run larger models locally, will I see the same 5% drop for 10 minutes of work, or are the newer chips more efficient because of the neural accelerators?
- Is the expectation simply that you run these models on a MBP while plugged in and bypass the battery altogether?
- Am I hurting the long-term health of the battery with these local AI runs?
Thank you for any help you can provide.
u/Allenite — 9 days ago