I found it far too expensive to run against Anthropic: the entire context of every conversation is sent each time you type anything. I switched to a local model running under Ollama. Not quite as smart as Opus, but good enough for my needs.
My spend was similar, and I only spent one day with it. Pretty neat, but it's a security can of worms, and I already do similar automations with Apple Shortcuts, APIs, and the on-device Apple Foundation models. That's free and does a great job.
If I had the time and desire, I'd write a little Swift app and an OpenClaw plugin to use the Apple Foundation models locally by default. AFM still sucks for code, so proxy those requests out to Claude or Codex or whatever.
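For anyone curious, the on-device half of that is genuinely small. A rough sketch, assuming Apple's FoundationModels framework and its LanguageModelSession API; the proxyToClaude helper and the crude "does this look like code" check are hypothetical placeholders, not anything real:

    import FoundationModels

    // Hypothetical stand-in for a call out to Claude/Codex; not a real SDK.
    func proxyToClaude(_ prompt: String) async throws -> String {
        // e.g. POST to the Anthropic Messages API with your own key
        return "remote model response"
    }

    // Route a prompt: Apple Foundation Models on-device by default,
    // punt anything code-flavored out to a remote model.
    func answer(_ prompt: String) async throws -> String {
        // Crude heuristic, purely illustrative; real routing would be smarter.
        let looksLikeCode = prompt.contains("```") || prompt.lowercased().contains("func ")
        if looksLikeCode {
            return try await proxyToClaude(prompt)
        }

        // Only use AFM if the on-device model is actually available
        // (Apple Intelligence enabled, model downloaded); otherwise fall back.
        guard case .available = SystemLanguageModel.default.availability else {
            return try await proxyToClaude(prompt)
        }

        let session = LanguageModelSession(instructions: "You are a terse local assistant.")
        let response = try await session.respond(to: prompt)
        return response.content
    }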