error: This model's maximum context length
You are welcome. I'd also like to add that many users don't quite understand how the OpenAI GPT models' maximum-token restriction works. The limit applies to the total: the number of tokens in your prompt (including the original text you submit) plus the maximum number of tokens you request for the generated completion. If that sum exceeds the model's context length, you get this error.
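As a minimal sketch of that arithmetic (the `request_fits` helper and the 4096-token limit are illustrative assumptions, not part of any API; actual limits vary per model):

```python
CONTEXT_LIMIT = 4096  # hypothetical example limit; each model has its own


def request_fits(prompt_tokens: int, max_tokens: int,
                 limit: int = CONTEXT_LIMIT) -> bool:
    """The context limit covers the prompt tokens PLUS the
    completion tokens you request, not the completion alone."""
    return prompt_tokens + max_tokens <= limit


print(request_fits(3000, 1000))  # True: 3000 + 1000 = 4000 <= 4096
print(request_fits(3500, 1000))  # False: 3500 + 1000 = 4500 > 4096
```

So if your prompt is long, you must lower the requested completion length (or shorten the prompt) until the sum fits under the model's context length.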