I had a problem with SSH today that I don’t think I’ve experienced before.

Received disconnect from <ip address> port 22:2: Too many authentication failures

I am using keys created in AWS via Ansible, with the private identity key saved to my local machine. I had been tweaking various settings to experiment, repeatedly destroying and recreating instances, security groups, and AWS-created keys along the way.

I run Ubuntu 22.04 with ssh-agent, if you want a couple of clues to the solution.

So here is the situation:

  • Ubuntu 22.04
  • Repeated creation/destruction of AWS instances/keys
  • Use of Ansible to alter local ssh configuration files and add keys
  • Lots of setup tweaking

Debugging this was a little difficult!

I was also debugging some other issues and made many changes along the way. From my point of view, SSH was working, then randomly stopped working. Then it would work again. Intermittent bugs are the worst when you don’t know how to recreate the fault condition!

I did a search on the error and found this post on StackExchange, which explains that having too many keys can cause this error. The client offers each available key to the server in turn, and once the server has seen more failed attempts than it allows, it drops the connection.
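As an aside, the limit being hit is the server's MaxAuthTries setting, which defaults to 6 in OpenSSH. If you have access to the remote host and want to confirm what it is set to, something along these lines should work (sshd -T needs root):

# Any explicit setting in the config file
grep -i maxauthtries /etc/ssh/sshd_config

# The effective value sshd is actually running with
sudo sshd -T | grep -i maxauthtries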

I only had two keys, as I frequently put infrastructure keys in separate directories for separate projects, so this wasn’t exactly the problem.

However, that post also mentions ssh-agent. Many years ago, I added AddKeysToAgent yes in my ~/.ssh/config file. This was automatically adding my ssh keys to ssh-agent, so in my repeated creation/destroy cycle I had piled up a number of keys that I wasn’t even using!
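That option sits near the top of my config so it applies to every host; whether you put it under Host * or just at the top of the file, the effect is the same. Roughly:

Host *
   # Automatically add any key used for a connection to ssh-agent
   AddKeysToAgent yes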

SSH was cycling through these agent keys, and the server was hitting its authentication limit before the correct key was ever offered.
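If you want to watch this happening, running ssh with -v shows each key as it is offered, so you can see the unrelated agent keys go out before the one you actually need. Something like this (the user and hostname are placeholders, and the exact debug wording can vary between OpenSSH versions):

ssh -v someuser@example.com 2>&1 | grep -E 'Offering|Too many'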

On discovering this problem, I ran these commands:

ssh-add -l

This lists the keys in ssh-agent and showed me that I had too many.

ssh-add -D

This deleted all the keys from ssh-agent (they get re-added as they are used, thanks to AddKeysToAgent) and instantly fixed my SSH problem.
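If you would rather not wipe the agent completely, ssh-add can also remove a single identity with -d, given the path to the key. The path here is just an example:

ssh-add -d ~/.ssh/old-project-key   # example path; point at the key you want removed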

For a permanent solution, you can do this in your SSH config:

Host ubuntu
   Hostname example.com
   User someuser
   IdentityFile ~/path/to/private/key
   IdentitiesOnly yes
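IdentitiesOnly yes tells SSH to offer only the IdentityFile configured for that host, rather than everything ssh-agent happens to be holding. You can also apply it as a one-off on the command line while testing, using the same placeholders as the entry above:

ssh -o IdentitiesOnly=yes -i ~/path/to/private/key someuser@example.com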

Funnily enough, I could have unknowingly avoided this problem in my Ansible playbook, since all of the other entries in my SSH config file already look like the one above.

However, I was using Ansible to add and remove the AWS instance entry in my ~/.ssh/config file, and I had forgotten to include the IdentitiesOnly yes option in my playbook.
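For what it's worth, the fix in the playbook is just a case of including that option wherever the Host entry gets written. A minimal sketch using ansible.builtin.blockinfile, with placeholder host, user, and key path rather than my real values:

# Sketch only: adjust to match however your playbook manages ~/.ssh/config
- name: Add SSH config entry for the AWS instance
  ansible.builtin.blockinfile:
    path: ~/.ssh/config
    marker: "# {mark} ANSIBLE MANAGED: aws instance"
    block: |
      Host ubuntu
         Hostname example.com
         User someuser
         IdentityFile ~/path/to/private/key
         IdentitiesOnly yes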