
fix: abort cleanly in batch mode (--exit) when Ollama is unreachable#4977

Open
Frank-Schruefer wants to merge 1 commit into Aider-AI:main from Frank-Schruefer:fix/ollama-connection-error-batch-mode

Conversation

@Frank-Schruefer

Problem

When Ollama is unreachable (e.g. connection refused), aider treats the error as a non-fatal warning and continues with "sane defaults". In batch mode (--exit --message), this causes aider to open its interactive TUI anyway, dumping ANSI escape sequences to stdout. If stdout is redirected to a file, the file is filled with TUI garbage instead of the expected output.

Root cause

ModelInfoManager.get_model_info catches all exceptions from litellm.get_model_info and silently swallows them (printing the message but continuing). The Ollama connection error is raised by litellm as a plain Exception with a message starting with "OllamaError:".

Fix

  • models.py ModelInfoManager.get_model_info: re-raise exceptions whose message starts with "OllamaError:" instead of swallowing them
  • models.py Model.get_model_info: catch the re-raised error, store it on self.ollama_error, and return {} — this preserves the existing interactive-mode behaviour (sane defaults, warning shown)
  • main.py: after model setup, if main_model.ollama_error is set and --exit is active, emit a clear error message and return 1
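A minimal sketch of the error-propagation change described above. The class and method names mirror aider's models.py, but the bodies are illustrative stand-ins (the `_fetch_model_info` placeholder simulates litellm.get_model_info failing with a refused connection), not the actual source:

```python
# Sketch only: names follow the PR description; bodies are simplified.

class ModelInfoManager:
    def get_model_info(self, model_name):
        try:
            return self._fetch_model_info(model_name)
        except Exception as ex:
            if str(ex).startswith("OllamaError:"):
                raise  # new: let the caller decide how to handle it
            print(str(ex))  # previous behaviour: warn and continue
            return {}

    def _fetch_model_info(self, model_name):
        # Placeholder for litellm.get_model_info; simulate Ollama down.
        raise Exception("OllamaError: [Errno 111] Connection refused")


class Model:
    def __init__(self, name, manager):
        self.name = name
        self.ollama_error = None
        self.info = self.get_model_info(manager)

    def get_model_info(self, manager):
        try:
            return manager.get_model_info(self.name)
        except Exception as ex:
            if str(ex).startswith("OllamaError:"):
                # Remember the error for main.py, but return sane defaults
                # so interactive mode keeps working as before.
                self.ollama_error = str(ex)
                return {}
            raise
```

The key point is that the exception is no longer swallowed inside ModelInfoManager; it is caught one level up, where the error can be recorded for the batch-mode check without changing interactive behaviour.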

Behaviour

| Mode | Before | After |
| --- | --- | --- |
| Interactive | Warning + sane defaults + TUI opens | Unchanged |
| Batch (--exit) | TUI garbage written to stdout | Clear error message, exit code 1 |
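The batch-mode guard in main.py can be sketched as a small helper. The function name, the `exit_flag` parameter, and the wording of the error message are hypothetical; only the condition (ollama_error set and --exit active → return 1) comes from the PR description:

```python
# Hypothetical sketch of the batch-mode check; not aider's actual main().

def check_ollama_error(main_model, exit_flag, print_error=print):
    """Return exit code 1 when Ollama was unreachable and --exit is active."""
    if getattr(main_model, "ollama_error", None) and exit_flag:
        print_error(
            f"Unable to reach Ollama: {main_model.ollama_error}\n"
            "Aborting because --exit (batch mode) is active."
        )
        return 1
    # Interactive mode: keep the existing warning + sane-defaults behaviour.
    return 0
```

Running this after model setup means batch invocations fail fast with a readable message instead of opening the TUI against a redirected stdout.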

When Ollama returned a connection error during model info lookup, aider
previously treated it as a non-fatal warning and continued with "sane
defaults", opening the interactive TUI. In batch mode (--exit), this
caused the TUI output to be written to the redirected stdout instead of
aborting cleanly.

- ModelInfoManager.get_model_info: re-raise exceptions whose message
  starts with "OllamaError:" instead of silently swallowing them
- Model.get_model_info: catch OllamaError, store on self.ollama_error,
  return {} to allow interactive mode to continue with sane defaults
- main.py: if ollama_error is set and --exit is active, emit a clear
  error message and return 1

Interactive mode is unaffected: the warning and sane-defaults behaviour
remain unchanged.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@Frank-Schruefer force-pushed the fix/ollama-connection-error-batch-mode branch from 1ae6ba3 to 84df006 on March 31, 2026 at 14:26