RE: LeoThread 2024-10-18 17:29

You are viewing a single comment's thread:

Price per token for output on LLMs has dropped by 100x over the last year.

This is how fast things are moving. It is also why I stated the LLM game is a race to the bottom.



14 comments

interesting.

can you elaborate on this?


Yeah. Tokens are a unit for measuring data. With something like Groq, it tells us how many tokens were output (the amount of data in the output window).

Let's say this (609 token) output cost 1 cent of inference compute. A year ago, it would have cost $1.
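To make the numbers concrete, here is a quick back-of-the-envelope check using the figures from this comment (609 tokens, 1 cent today vs. $1 a year ago), converted into the per-million-token pricing that LLM providers usually quote:

```python
# Figures from the comment above: 609 output tokens,
# $0.01 of inference compute today vs. $1.00 a year ago.
tokens = 609
cost_now = 0.01   # dollars today
cost_then = 1.00  # dollars a year ago

per_million_now = cost_now / tokens * 1_000_000
per_million_then = cost_then / tokens * 1_000_000

print(f"Today:      ${per_million_now:,.2f} per 1M output tokens")
print(f"A year ago: ${per_million_then:,.2f} per 1M output tokens")
print(f"Drop:       {cost_then / cost_now:.0f}x")
```

That works out to roughly $16 per million output tokens today versus about $1,600 a year ago, which is the 100x drop mentioned at the top of the thread.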


makes sense. Wild.


That is why anyone who isn't focused on AI is going to miss out. It is moving so rapidly, and costs are dropping faster than panties on prom night.


Nice that Musk is so open about this, very interesting to know!


Groq is not Musk's. That is Grok. They are two separate things: Groq is an AI inference provider, while Grok is Musk's chatbot.

Groq.com
