context length pain pt2

The bane of my existence:

InvalidRequestError: This model’s maximum context length is 32768 tokens. However, you requested 33013 tokens (31413 in the messages, 1600 in the completion). Please reduce the length of the messages or completion.

Somebody please give me GPT5-64k access.
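Until that happens, the usual workaround is to drop the oldest turns until the prompt fits within the model's window minus the completion budget. A minimal sketch of that idea (the message shapes and the ~4 chars/token estimate in the usage example are assumptions; in practice something like tiktoken gives exact counts):

```python
def trim_messages(messages, count_tokens, max_context=32768, completion_budget=1600):
    """Drop the oldest non-system messages until the prompt fits.

    count_tokens: callable mapping one message dict to its token count
    (e.g. backed by a real tokenizer; taken as a parameter here so the
    sketch stays dependency-free).
    """
    budget = max_context - completion_budget
    trimmed = list(messages)
    # Keep the system message (assumed to be first) and delete the
    # oldest user/assistant turns until the total fits the budget.
    while len(trimmed) > 1 and sum(count_tokens(m) for m in trimmed) > budget:
        del trimmed[1]
    return trimmed


# Usage: a crude ~4 characters/token estimate stands in for a tokenizer.
msgs = [{"role": "system", "content": "x" * 100}] + [{"role": "user", "content": "y" * 1000}] * 200
approx = lambda m: len(m["content"]) // 4
fitted = trim_messages(msgs, approx)
```

Smarter variants summarize the dropped turns instead of discarding them, but plain truncation is enough to stop the InvalidRequestError above.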

See part 1.

context length pain

The bane of my existence:

InvalidRequestError: This model’s maximum context length is 16385 tokens. However, your messages resulted in 28737 tokens. Please reduce the length of the messages.

Somebody please give me GPT4-32k access.

Updates:

See part 2.