
error: This model's maximum context length





You're welcome. I'd also like to add that many users don't quite understand how the maximum-token limit of the OpenAI GPT models works. The limit applies to the total token count of a request: the sum of the completion length you ask the model to generate and the actual length of your prompt, including any original text you pass in.
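In other words, the prompt and the requested completion share one context window. A minimal sketch of that budgeting check (the function name and the 4096-token limit are illustrative assumptions; real limits vary by model):

```python
# Hypothetical helper: a request only fits if the prompt tokens PLUS the
# completion tokens you request together stay inside the context window.

def fits_context(prompt_tokens: int, max_tokens: int, context_limit: int = 4096) -> bool:
    """Return True if prompt + requested completion fit the model's window."""
    return prompt_tokens + max_tokens <= context_limit

# A 3,500-token prompt leaves room for at most 596 completion tokens
# in a 4,096-token window, so requesting 1,000 would trigger the error.
print(fits_context(3500, 1000))  # False
print(fits_context(3500, 500))   # True
```

So if you hit the "maximum context length" error, either shorten the source text or lower the requested completion length; the two sides trade off against each other.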