## Auto-summary 2026-03-22 10:00 KST
- What happened: Investigated OpenClaw model/auth status after repeated `FailoverError: OAuth token refresh failed for openai-codex`, confirming that conversations are currently falling back to Gemini (`google-gemini-cli`) models due to Codex auth issues.
- What happened: Ran `openclaw models list`, `openclaw plugins list`, inspected `/home/lagoon3/.openclaw/openclaw.json`, and tailed `journalctl` logs around the failures to understand provider configuration and recent errors (including rate-limit FailoverErrors).
- Decisions / stable facts: Confirmed that the correct provider identifier for Codex auth is `openai-codex` (from both `models list` prefixes and `auth.profiles["openai-codex:default"].provider` in `openclaw.json`).
- Decisions / stable facts: Established that there is an OAuth profile configured for `openai-codex` in `openclaw.json`, even though there is no separate `openai-codex` plugin in the extensions list; Codex appears as a core/bundled provider.
- Next actions / blockers: To restore Codex as the primary model, run `openclaw models auth login --provider openai-codex` in an interactive terminal, open the printed auth URL, complete OAuth, and, if needed, paste back the `code` value when prompted.
- Next actions / blockers: If `openclaw models auth login --provider openai-codex` ever returns `Unknown provider`, investigate plugin/version mismatch or updated auth flow (e.g. check `openclaw` release notes or docs for Codex auth changes) before proceeding.
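The re-auth steps above can be sketched as a short shell sequence. The `openclaw` commands are the ones identified in this session; exact prompts and output may differ between versions, and the guard at the top simply makes the sketch a no-op on machines where `openclaw` is not installed.

```shell
# Hedged sketch of the planned Codex re-auth sequence (interactive shell).
if ! command -v openclaw >/dev/null 2>&1; then
  echo "openclaw not installed; run these steps on the target host"
  exit 0
fi

# 1. Confirm the provider prefix Codex models use.
openclaw models list | grep '^openai-codex/'

# 2. Start the OAuth flow: open the printed URL, complete OAuth,
#    and paste back the `code` value if prompted.
openclaw models auth login --provider openai-codex

# 3. Re-check models afterwards to confirm Codex is usable again.
openclaw models list
```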

## Auto-summary 2026-03-22 13:00 KST
- What happened: No new main-session work since the 10:00 summary beyond continued inspection of the same Codex OAuth/FailoverError logs and re-confirmation of the already-known provider/auth configuration.
- Decisions / stable facts: No new decisions or durable facts added in this window; `openai-codex` remains the correct provider id and OAuth remains the required path to restore Codex.
- Next actions / blockers: Still pending: run `openclaw models auth login --provider openai-codex` when convenient to re-establish Codex OAuth; otherwise continue operating on the Gemini fallback until re-auth is completed.
- Links/IDs: Config and model references unchanged — `~/.openclaw/openclaw.json` auth profile `openai-codex:default`, models `openai-codex/gpt-5.2`, `openai-codex/gpt-5.1`, and fallbacks as previously noted.
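The config references above can be spot-checked with `jq`. The `auth.profiles["openai-codex:default"].provider` path is the one observed in `~/.openclaw/openclaw.json`; the inline document below is a minimal stand-in for illustration, not the real file's full contents.

```shell
# Sketch: verify the provider id recorded for the Codex auth profile.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
{
  "auth": {
    "profiles": {
      "openai-codex:default": { "provider": "openai-codex" }
    }
  }
}
EOF
jq -r '.auth.profiles["openai-codex:default"].provider' "$cfg"
# Against the real file:
#   jq -r '.auth.profiles["openai-codex:default"].provider' ~/.openclaw/openclaw.json
rm -f "$cfg"
```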

## Auto-summary 2026-03-22 16:00 KST
- What happened: Main-session activity after 13:00 was minimal; the assistant retried several Gemini-backed tool calls (`sessions_history`, `exec`, `read`, etc.), many of which failed with `Cloud Code Assist API error (429): No capacity available for model gemini-3-pro-preview/flash`, indicating transient capacity limits on the fallback provider.
- Decisions / stable facts: No new configuration or workflow decisions were made in this window; it remains confirmed that `openai-codex` is the correct provider id and that the system is currently operating via Gemini fallback while Codex OAuth is broken.
- Next actions / blockers: Codex OAuth still needs to be re-established via `openclaw models auth login --provider openai-codex`; until then, occasional Gemini capacity errors (429) may block tool-heavy tasks and may require manual retries later when capacity is available.
- Links/IDs: No new identifiers created; relevant references remain `~/.openclaw/openclaw.json` (auth profile `openai-codex:default`) and the Codex models `openai-codex/gpt-5.2` / `openai-codex/gpt-5.1` already noted in earlier summaries.
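Manual retries against the transient 429s can be wrapped in a small helper. This is a hypothetical generic wrapper, not an `openclaw` feature: it retries a command with exponential backoff, which suits rate-limit/capacity errors that clear on their own.

```shell
# Hypothetical retry-with-backoff wrapper for commands hitting 429s.
retry() {
  local attempts=$1 delay=$2
  shift 2
  local i
  for i in $(seq 1 "$attempts"); do
    "$@" && return 0          # success: stop retrying
    if [ "$i" -lt "$attempts" ]; then
      sleep "$delay"          # back off before the next attempt
      delay=$((delay * 2))    # double the wait each round
    fi
  done
  return 1                    # all attempts exhausted
}

# Usage sketch (stand-in command; substitute the failing tool call):
retry 3 1 true && echo "succeeded"
```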

## Auto-summary 2026-03-22 20:00 KST
- What happened: Between 16:00 and 20:00, main-session activity stayed focused on the same Codex OAuth issue; the assistant and user continued to inspect `journalctl` logs and confirmed that recent `FailoverError` entries (including rate-limit and token-refresh failures) are still occurring for `openai-codex` while Gemini handles responses.
- Decisions / stable facts: No new configuration changes or workflow decisions were made; it remains confirmed that `openai-codex` is the correct provider id, Codex is still unauthenticated, and the system is relying on Gemini fallbacks (which sometimes hit 429 capacity errors).
- Next actions / blockers: Primary next step is still to run `openclaw models auth login --provider openai-codex` in an interactive shell to restore Codex OAuth; until that is done, expect occasional Gemini capacity/rate-limit errors and the need to retry affected operations later.
- Links/IDs: References unchanged — `~/.openclaw/openclaw.json` auth profile `openai-codex:default`, models `openai-codex/gpt-5.2` / `openai-codex/gpt-5.1`, and recent `journalctl` FailoverError entries for `openai-codex` and Gemini rate limits.
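The log inspection described above boils down to a filter over `journalctl` output. The exact systemd unit name is not recorded in these summaries, so the command is shown without `-u`; the log lines below are invented stand-ins that merely demonstrate the filter pattern against text shaped like the observed errors.

```shell
# Against the live journal (substitute `-u <unit>` once the unit is known):
#   journalctl --since "4 hours ago" --no-pager | grep -E 'FailoverError|429'

# The same filter, demonstrated on a stand-in log excerpt:
printf '%s\n' \
  'Mar 22 19:12:01 host openclaw[123]: FailoverError: OAuth token refresh failed for openai-codex' \
  'Mar 22 19:12:02 host openclaw[123]: switching to google-gemini-cli fallback' \
  'Mar 22 19:30:10 host openclaw[123]: Cloud Code Assist API error (429): No capacity available' \
  | grep -Ec 'FailoverError|429'
# prints 2
```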