OpenAI just announced its latest o3 model, and it's wicked smart. Some say it has achieved AGI.
One novel thing about the o1 and o3 models is that result quality scales with inference resources. Before these models, it was assumed that the hard, resource-intensive work was in training, which is a one-time cost. Inference was supposed to be cheap, easy, and fixed-cost. Not anymore. Now the marginal cost of running the models has turned into a variable people can tune.
Inference may not be the commodity people assumed it was. Very bullish for Nvidia.
The way people develop software will dramatically change. Into what, nobody knows yet. Many on Twitter predict doom for SWEs. I am not so sure. But future CS grads need to know more than just software. I suspect getting a dual degree in business will be beneficial. If you compete on coding alone, humans have already lost to AI.
Progress in AI shows no sign of slowing down. Earlier worries about hitting a scaling wall have proved extremely premature. Again, very bullish for hardware providers like Nvidia.
I think o3’s inference is significantly different from prior generations’ inference. Haven’t read up on the details yet. Just a hunch. It’s sorta like part of the training is now happening at the inference stage. So, like training, you now need a lot of compute and memory at inference time. Even though you are not modifying the weights, these resource-hungry workloads can’t be handled by today’s specialized inference ASICs.
Again, just a guess. Haven’t seen any good write-up on o3 inference yet.
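To make the “tunable inference compute” idea concrete: o3’s actual mechanism isn’t public, so this is just a toy sketch of one well-known flavor of inference-time scaling, best-of-N sampling with a verifier. The `noisy_solver` and `verifier` functions here are made-up stand-ins, not anything from OpenAI; the point is only that spending more compute at inference (more candidates scored) tends to improve the answer, without touching any weights.

```python
# Toy illustration of inference-time scaling via best-of-N sampling.
# Not o3's real mechanism (which isn't public) -- just the general idea
# that more candidates + a verifier buys quality with inference compute.
import random

def noisy_solver(x, y, rng):
    """Stand-in for a model: returns x + y, but with random error."""
    return x + y + rng.choice([-3, -2, -1, 0, 0, 1, 2, 3])

def verifier(x, y, answer):
    """Score a candidate answer; here, negative distance from the true sum."""
    return -abs((x + y) - answer)

def best_of_n(x, y, n, seed=0):
    """More samples (n) = more inference compute = better expected answer."""
    rng = random.Random(seed)
    candidates = [noisy_solver(x, y, rng) for _ in range(n)]
    return max(candidates, key=lambda a: verifier(x, y, a))

print(best_of_n(17, 25, 1))   # cheap: one sample, likely off
print(best_of_n(17, 25, 64))  # expensive: 64 samples, almost surely 42
```

The cost knob here is `n`: the marginal cost per query grows with it, which is exactly the shift from fixed-cost inference to a tunable spend.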
The future is hazy. Pretty sure SWEs will still be around. But their skill set needs to broaden. Can’t be mere code monkeys. Need to get closer to the decision-making core of the business.
Dec 26, 2024
A stock some bloggers call junk is leaving the hot stocks TSLA, META, and NVDA in the dust. I don’t understand why some bloggers can’t understand a simple ontology-based AI stock and even think it’s a junk stock headed for demise.
In addition to Cosmos, Nvidia debuted its Isaac GROOT Blueprint for training humanoid robots. The software, which connects to Apple’s Vision Pro headset, allows a developer to perform and record specific movements they want to teach a robot. Isaac GROOT Blueprint then takes those movements and synthesizes them, providing the robot with an enormous set of movements based on the developer’s original motions.