This model’s maximum context length is 4096 tokens. However, your messages resulted in 5660 tokens. Please reduce the length of the messages.
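This error means the combined token count of the conversation exceeded the model's context window, so the request must be shortened before retrying. Below is a minimal sketch of one common remedy: trimming the oldest turns until the history fits a token budget. The whitespace-based token estimate and the `trim_messages` helper are illustrative assumptions, not part of any real API; production code would count tokens with the model's actual tokenizer (e.g. `tiktoken` for OpenAI models).

```python
def estimate_tokens(message: dict) -> int:
    # Crude stand-in for a real tokenizer: whitespace word count.
    # Real token counts come from the model's tokenizer, not this.
    return len(message["content"].split())

def trim_messages(messages: list[dict], max_tokens: int) -> list[dict]:
    """Drop the oldest non-system messages until the total fits max_tokens."""
    trimmed = list(messages)
    while trimmed and sum(estimate_tokens(m) for m in trimmed) > max_tokens:
        # Keep a leading system prompt if present; drop the next-oldest turn.
        drop_index = 1 if trimmed[0].get("role") == "system" and len(trimmed) > 1 else 0
        trimmed.pop(drop_index)
    return trimmed

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "First question about a long topic."},
    {"role": "assistant", "content": "A long answer with many details."},
    {"role": "user", "content": "A follow-up question."},
]
fitted = trim_messages(history, max_tokens=12)
```

Alternatives to dropping turns include summarizing older messages into a single shorter one, or switching to a model with a larger context window; which is appropriate depends on how much of the earlier conversation later turns rely on.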