Hacker News

How do you decide whether an article is too long to fit in ChatGPT's context window?


ChatGPT's context window is 8192 tokens. A token is about 3-4 characters. OpenAI also publishes an open-source tokenizer you can download if you want the exact token count for a body of text.
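The heuristic above (3-4 characters per token against an 8192-token window) is easy to sketch in code. This is just an illustration of the rule of thumb, not the exact tokenizer; the function names and the reply budget are my own assumptions.

```python
# Rough token estimate using the ~4 characters/token heuristic
# mentioned above; the open-source tokenizer gives exact counts.
def estimate_tokens(text: str) -> int:
    return len(text) // 4

def fits_in_context(text: str, context_window: int = 8192,
                    reply_budget: int = 1024) -> bool:
    # reply_budget is an assumed reserve so the model has room to answer
    # within the same window.
    return estimate_tokens(text) <= context_window - reply_budget

article = "word " * 8000  # ~40,000 characters, ~10,000 estimated tokens
print(fits_in_context(article))  # too long for the 8192-token window
```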


Good question. I tested it manually with a few articles I could find. If you find a web page that's too large for ChatGPT, let me know; I can split it into multiple batches and ask ChatGPT to summarize once I'm done.
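The batching idea described here amounts to splitting the page at paragraph boundaries so each piece stays under the window. A minimal sketch, assuming a character budget of ~12,000 chars (~3,000 tokens at 4 chars/token); the function name and limits are mine, not the commenter's:

```python
def chunk_text(text: str, max_chars: int = 12000) -> list[str]:
    # Split on blank lines, then pack paragraphs into chunks that
    # stay under max_chars so each chunk fits in one request.
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = current + "\n\n" + para if current else para
    if current:
        chunks.append(current)
    return chunks
```

Each chunk would then be summarized separately, with a final pass asking the model to merge the per-chunk summaries.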



