The bane of my existence:
InvalidRequestError: This model’s maximum context length is 16385 tokens. However, your messages resulted in 28737 tokens. Please reduce the length of the messages.
Somebody please give me GPT-4 32k access.
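Until then, the practical workaround is to trim the oldest messages before each request so the conversation stays under the 16,385-token window. Here's a rough sketch: it uses a crude ~4-characters-per-token heuristic rather than the model's real tokenizer (tiktoken gives exact counts), and the reply-budget constant is my own assumption.

```python
# Workaround sketch: drop the oldest messages until the conversation fits
# under the model's context window. The 4-chars-per-token ratio is a rough
# heuristic, not the real tokenizer; 16385 matches the error above.

MAX_CONTEXT_TOKENS = 16385
RESERVED_FOR_REPLY = 1024  # room left for the model's answer (assumption)

def estimate_tokens(message: dict) -> int:
    # ~4 characters per token is a common rule of thumb for English text;
    # +4 covers per-message role/formatting overhead.
    return len(message["content"]) // 4 + 4

def trim_messages(messages: list[dict]) -> list[dict]:
    """Keep the system prompt (first message) plus as many of the most
    recent messages as fit in the budget, dropping the oldest first."""
    budget = MAX_CONTEXT_TOKENS - RESERVED_FOR_REPLY
    system, rest = messages[0], messages[1:]
    budget -= estimate_tokens(system)
    kept = []
    for msg in reversed(rest):  # walk newest-to-oldest
        cost = estimate_tokens(msg)
        if cost > budget:
            break
        kept.append(msg)
        budget -= cost
    return [system] + list(reversed(kept))  # restore chronological order
```

Losing the oldest turns isn't great, but it beats an `InvalidRequestError` mid-conversation; summarizing the dropped turns into the system prompt is the fancier version of the same idea.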
Updates: See part 2.