Sometimes the LLM of your choice is down and you need to switch to an alternative. This is where a fallback LLM comes in.

With LangChain this is as simple as:

# GPT-3.5 only, no fallback
only_35 = prompt | openai_35
# Try GPT-3.5 first; if the call errors, retry it with GPT-4
fallback_4 = prompt | openai_35.with_fallbacks([openai_4])

https://python.langchain.com/v0.1/docs/guides/productionization/fallbacks/
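Under the hood the idea is simply "call the primary model, catch the error, call the backup". A minimal stdlib sketch of that pattern (the `ModelDown` exception and the two stub clients are hypothetical stand-ins, not LangChain APIs):

```python
class ModelDown(Exception):
    """Raised when a model endpoint is unavailable."""

def with_fallbacks(primary, fallbacks):
    """Return a callable that tries `primary`, then each fallback in order."""
    def run(prompt):
        last_err = None
        for model in (primary, *fallbacks):
            try:
                return model(prompt)
            except ModelDown as err:
                last_err = err
        raise last_err  # every model failed
    return run

# Hypothetical clients: the primary is down, the fallback answers.
def openai_35(prompt):
    raise ModelDown("gpt-3.5-turbo unavailable")

def openai_4(prompt):
    return f"gpt-4: {prompt}"

llm = with_fallbacks(openai_35, [openai_4])
print(llm("hello"))  # falls through to the gpt-4 stub
```

LangChain's `with_fallbacks` does the same thing on `Runnable` objects, which is why the composed chain keeps working when the first model raises.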