RE: LeoThread 2024-12-11 08:21

Summary Stats from December 10, 2024

Yesterday, the Youtube Summarizer processed a total of 4.1 million input tokens and produced approximately 831,000 output tokens that were posted to chain. That's equivalent to about 2,300 book pages, and just shy of our all-time daily record (860k)!
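For anyone curious how the page figure shakes out, here's a rough back-of-the-envelope check; the ~0.75 words per token and ~270 words per book page figures are rules of thumb I'm assuming, not exact values:

```python
# Rough sanity check of the "2,300 book pages" figure.
# Assumes ~0.75 words per token and ~270 words per book page,
# both rules of thumb rather than exact values.
output_tokens = 831_000
words = output_tokens * 0.75   # ~623,000 words
pages = words / 270            # ~2,300 pages
print(f"{words:,.0f} words ≈ {pages:,.0f} pages")
```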

Keep up the excellent work and keep summarizing! #summarystats



What is this all about, my friend?

I have to improve; that drop must have been due to my poor performance.

Great work my friend!

When you refer to tokens, do you mean bot comments?

Haha, December 10 was our second-best day yet! No worries, we contribute when we can; everyone has other stuff to do as well :)


Tokens can be thought of as pieces of words. Here are some helpful rules of thumb for understanding tokens in terms of lengths:

  • 1 token ~= 4 characters in English
  • 1 token ~= ¾ words
  • 100 tokens ~= 75 words

Or

  • 1-2 sentences ~= 30 tokens
  • 1 paragraph ~= 100 tokens
  • 1,500 words ~= 2048 tokens

Source: help.openai.com
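If you want to count tokens yourself, here's a minimal sketch using OpenAI's tiktoken library. It assumes the cl100k_base encoding used by GPT-3.5/GPT-4 models; which tokenizer the summarizer actually uses is an assumption on my part:

```python
import tiktoken

# cl100k_base is the encoding used by GPT-3.5/GPT-4 models;
# the summarizer's actual tokenizer may differ.
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokens can be thought of as pieces of words."
tokens = enc.encode(text)

print(len(tokens))          # token count for the sample sentence
print(enc.decode(tokens))   # decoding round-trips to the original text
```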
