context length pain

ai

Author: Cody

Published: August 13, 2023

The bane of my existence:

InvalidRequestError: This model’s maximum context length is 16385 tokens. However, your messages resulted in 28737 tokens. Please reduce the length of the messages.

Somebody please give me GPT4-32k access.
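In the meantime, the usual workaround is to trim the oldest turns of the conversation until it fits the context window. Here's a minimal sketch of that idea (my own, not from this post): `count_tokens` is a stand-in whitespace heuristic so the example is self-contained; real code would count with something like tiktoken's `cl100k_base` encoding, and the `reserved` headroom for the reply is a hypothetical knob.

```python
def count_tokens(text: str) -> int:
    # Placeholder heuristic: real code would use a tokenizer (e.g. tiktoken).
    return len(text.split())

def trim_messages(messages: list[dict], max_tokens: int, reserved: int = 500) -> list[dict]:
    """Drop the oldest non-system messages until the conversation fits.

    `reserved` leaves room for the model's reply so the request itself
    stays under `max_tokens`.
    """
    budget = max_tokens - reserved
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    def total(msgs: list[dict]) -> int:
        return sum(count_tokens(m["content"]) for m in msgs)

    while rest and total(system + rest) > budget:
        rest.pop(0)  # discard the oldest turn first
    return system + rest
```

Then call `trim_messages(history, max_tokens=16385)` before each request instead of sending the whole history and hoping.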

Updates: See part 2.
