The bane of my existence:
InvalidRequestError: This model’s maximum context length is 32768 tokens. However, you requested 33013 tokens (31413 in the messages, 1600 in the completion). Please reduce the length of the messages or completion.
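The numbers in the error add up exactly: the context window must hold both the prompt and the requested completion. A minimal sketch of that budget check, using the figures from the error message above:

```python
# Token budget check, using the numbers from the error message.
MAX_CONTEXT = 32768       # model's maximum context length (tokens)
prompt_tokens = 31413     # tokens in the messages
completion_tokens = 1600  # tokens requested for the completion

requested = prompt_tokens + completion_tokens
overflow = requested - MAX_CONTEXT
# The largest completion that would still fit with this prompt:
max_completion = MAX_CONTEXT - prompt_tokens

print(f"requested {requested}, over budget by {overflow}")
print(f"could request at most {max_completion} completion tokens")
```

So the request is 245 tokens over budget; either trim the messages or cap the completion at 1355 tokens.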
Somebody please give me GPT5-64k access.
See part 1.