Inference Compute: More Important Than Neural Networks
There is a lot of focus on companies that are acquiring GPUs, such as Meta, xAI, and Tesla. These chips are obviously required for training neural networks.
In this video I discuss how, over the next few years, something else will outpace training: the need for inference compute is going to explode. This is something Tesla is working on. Musk understands the scale of this need and that Tesla has access to millions of computers in its fleet.
Here is the article mentioned in the video:
▶️ 3Speak
That article doesn't tackle the huge bandwidth problem here. Will Tesla vehicles all have SIM cards and be uploading data while driving around? Will they instead upload data at the owner's residence over the owner's home internet connection? Until I see an actual plan for the bandwidth issue, I can't help but think this idea only exists to boost the stock price and nothing more.
All Teslas come with SIM cards, so the question is connectivity. Some owners opt not to pay for the connection and rely on WiFi instead. For those who do connect, it is $9.99 per month, which provides over-the-air updates and the infotainment package while moving.
As for inference, it depends on the hardware, I would imagine. Each generation of hardware can handle more. Hardware 5 will likely be powerful enough to handle inference even while the vehicle is moving on FSD. I am not sure of the potential on earlier versions; there, it might have to be done when the vehicle is not being utilized, as in the sketch below.
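To make that idle-time idea concrete, here is a minimal sketch of the gating logic. Everything in it (the `HardwareGen` enum, the `can_run_inference` function, the assumption that Hardware 5 can serve inference while driving) is a hypothetical illustration, not Tesla's actual software.

```python
# Hypothetical sketch: when might a fleet vehicle accept background inference jobs?
# All names and the HW5 assumption are illustrative, not Tesla's real design.
from dataclasses import dataclass
from enum import Enum

class HardwareGen(Enum):
    HW3 = 3
    HW4 = 4
    HW5 = 5  # assumed (per the comment above) to have headroom even on FSD

@dataclass
class VehicleState:
    hardware: HardwareGen
    is_parked: bool
    is_charging: bool
    on_wifi: bool

def can_run_inference(v: VehicleState) -> bool:
    """Gate background inference on hardware generation and vehicle state."""
    if v.hardware is HardwareGen.HW5:
        return True  # assumption: HW5 can serve jobs even while driving
    # Older hardware only donates idle capacity: parked, charging,
    # and on WiFi, so driving and cellular data plans are unaffected.
    return v.is_parked and v.is_charging and v.on_wifi

# An HW4 car parked and charging on home WiFi qualifies; one in motion does not.
print(can_run_inference(VehicleState(HardwareGen.HW4, True, True, True)))    # True
print(can_run_inference(VehicleState(HardwareGen.HW4, False, False, False))) # False
```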
I totally understand that the hardware might be sufficient for inference... but as I understand it, the usable compute is potentially restricted by how much data can be uploaded to and downloaded from those computers. If data can't be transferred quickly, then the compute power of each device becomes far less important.
You can imagine that Amazon and xAI have incredible data connections, whereas Tesla might be relying on SIM cards and suboptimal WiFi connections. The back-of-envelope sketch below shows how hard that ceiling is.
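To put rough numbers on that worry, here is a back-of-envelope calculation. Every figure in it (20 Mbps for cellular, 100 Mbps for home WiFi, a 2 MB payload per request) is an assumed illustration, not a measurement; the point is only that the link, not the chip, sets the throughput ceiling.

```python
# Back-of-envelope: network bandwidth caps a node's inference throughput
# no matter how fast its chip is. All numbers below are assumptions.

def max_requests_per_sec(link_mbps: float, payload_mb: float) -> float:
    """Requests/sec the link can sustain, ignoring compute entirely."""
    return (link_mbps / 8) / payload_mb  # Mbps -> MB/s, then per-request

PAYLOAD_MB = 2.0  # assumed data in + out per inference request

for label, mbps in [("Cellular SIM (assumed 20 Mbps)", 20),
                    ("Home WiFi (assumed 100 Mbps)", 100),
                    ("Datacenter link (assumed 10 Gbps)", 10_000)]:
    print(f"{label}: ~{max_requests_per_sec(mbps, PAYLOAD_MB):.2f} requests/sec ceiling")
```

Under these assumptions a SIM-connected car tops out near one request per second while a datacenter node clears hundreds, which is exactly the sense in which per-device compute matters less than connectivity.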
It would be cool if Tesla could achieve fleet-scale AI inference, but I wonder how far away they are from reaching it. It would make their FSD so much better, and it could help the AI learn faster. The company that is able to crack it will certainly be the leader at that point.
The system design is actually easier than the buildout. They need a lot more computers in the field before they can think about that.
I would look to 2027-2029 on that one.
I am just waiting for Tesla to succeed in this. I heard about this same project some time ago. It will be huge to see both of them working hand in hand.
It often takes a while to build stuff out, especially when it comes to moonshot projects.
Yeah, it takes a lot of time to achieve something great.