AI's Investment Implications

I have no clue on that either.

We are just discussing whether PLTR will be “king of AI”. I say no. My definition of being “king of AI” is not short-term stock performance. Pretty sure the people who commented on your post didn’t think being a king means stock outperformance in 12 months. You can certainly laugh at things nobody said and pat yourself on the back. It’s healthy to feel good about oneself.

It’s like debating whether Google will be the king of smartphones if its stock outperforms Apple over 12 months. Meaningless.


So… they should delete their posts or remove their likes.

Again, you are so sure of what people are thinking. They may have understood the context intuitively rather than mechanically interpreting the words.


@manch,
Learn how @marcus335 makes comments. He commented directly on the YouTuber’s views. He didn’t try to interpret what “king of AI” means. You presumed what “king of AI” meant and commented on that. Not wise. Remember…

Intelligence is knowing what to do (comment), wisdom is knowing what not to do (comment)

Don’t always interpret words literally.

:rofl:

Sure. AI king.

By that logic, why don’t we call PLTR the king of groceries if its stock price rises more than WMT’s?



This would be out of context. The YouTuber is opining that PLTR is the best AI stock to buy in 2024 and will appreciate more than NVDA in 2024.

That’s my point @hanera! PLTR and NVDA are just as different as PLTR and WMT.

Wishing you the best in your trades.


I don’t trade them. Buy n hold, hopefully forever.

I read both the bullish and bearish theses.

Many people, e.g. YTFinance, consider both PLTR and NVDA to be AI stocks… you can verify this by searching social media.

Ofc, both @manch and you reserve the right not to consider PLTR as an AI stock.

“AI stock”. Funny name. What is included in this AI stock universe, and why?

Is Snowflake an AI stock? Cloudflare? AMD? Intel? Microsoft?

Most tech companies have something to do with AI. If you ask Larry Ellison he will probably say Oracle is an AI company. Before that it was a cloud company. And before that an Internet company. So on and so forth.

If you think the question “What are AI stocks?” is simple, then the following questions should have simple answers:

When I shop on Amazon, it shows products I may be interested in. Is that AI? Facebook regularly shows ads according to users’ interests. Is that AI?

Are Amazon and Facebook AI stocks?
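
To make the question concrete: below is a minimal sketch of the kind of system behind “products you may be interested in”. The data is made up and this is nothing like Amazon’s or Meta’s actual (far more elaborate) systems, but item-based collaborative filtering is decades-old machine learning that predates the current “AI” branding.

```python
# Toy item-based collaborative filtering -- hypothetical data,
# purely to illustrate the kind of model behind "you may also like".
import numpy as np

# Rows = users, columns = items; 1.0 means the user bought/clicked the item.
interactions = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
], dtype=float)

# Cosine similarity between item columns.
norms = np.linalg.norm(interactions, axis=0, keepdims=True)
item_sim = (interactions.T @ interactions) / (norms.T @ norms + 1e-9)

def recommend(user: int, top_k: int = 2) -> np.ndarray:
    """Score unseen items by their similarity to items the user already has."""
    seen = interactions[user]
    scores = item_sim @ seen
    scores[seen > 0] = -np.inf  # never re-recommend what the user has seen
    return np.argsort(scores)[::-1][:top_k]

print(recommend(0))  # items to show user 0, most similar first
```

Whether you call that “AI” or just statistics is exactly the fuzzy line I’m pointing at.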

I am not a stock analyst, I don’t follow all stocks, and I am not an academic.

I know this dog is an obedient dog and is a Labrador. That doesn’t mean I know all dogs or how to classify them. Most importantly, why do I need to care about other dogs?

If you claim your dog is the king of dogs, I think it is a fair question to ask how many breeds of dogs you know, and how good your knowledge of dogs is.

Or even a more basic question. Can you tell whether some animal is a dog or not?


I didn’t. The YTFinance guy did. Go watch the YouTube video.


Shame on Google.


Lol, that’s not even close to what the demo led people to believe. They were hoping no one would pay attention and look behind the curtain. I guess they’ve learned from politicians.

Right. The guy showed Google is lying… by referring to Google’s own doc?

Cloudflare CEO Matthew Prince:

We also announced Workers AI to put powerful AI inference within milliseconds of every Internet user.

We believe inference is the biggest opportunity in AI and inference tasks will largely be run on end devices and connectivity clouds like Cloudflare.

Right now, there are members of the Cloudflare team traveling the world with suitcases full of GPUs, installing them throughout our network. We have inference-optimized GPUs running in 75 cities worldwide as of the end of October, and we are well on our way to hitting our goal of 100 by the end of 2023.

By the end of 2024, we expect to have inference-optimized GPUs running in nearly every location where Cloudflare operates worldwide, making us easily the most widely distributed cloud-AI inference platform.

Is Cloudflare an AI company?

I have been following along with this Berkeley machine learning class. It’s very cool in that they put nearly all the lecture videos online and posted all the homework problems, including coding exercises where you can download their partially written code and fill in the missing pieces, doing exactly what a Berkeley student is asked to do.

Lecture 26, linked below, is the Berkeley professor philosophizing on the current state of AI as of late November 2023. It clears up, for me at least, a lot of the hype surrounding LLMs and where the future of AI may lead.

He thinks language alone is not enough. We need vision, from which we deduce a “world model” of how things work in the physical world. Anyway, I am butchering this. Watch the 1 hr 15 min video if you are interested.


AMD Is Competing With Nvidia for Inference Workloads


By Jozef Soja
Research Associate

Last week, AMD officially launched its MI300X accelerator which, according to the company, outperforms Nvidia’s H100 by a factor of 1.4x in inferencing Llama 2 70B. While that trails Nvidia’s next-generation H200’s 1.9x improvement over the H100, AMD is likely to price the MI300X significantly lower than both the H100 and H200. If AMD were to price the MI300X at $18,000-$19,000 and if H100s and H200s were to sell in the $25,000-$40,000 range, then not only would the MI300X offer better performance per dollar of capex than both the H100 and H200, as shown below, but it also could be accretive to AMD’s margins.
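
A quick back-of-the-envelope check of that arithmetic, using only the relative performance figures and hypothetical price ranges quoted above (actual accelerator pricing is negotiated and not public):

```python
# Perf-per-dollar sketch using the figures quoted above.
# "perf" = relative Llama 2 70B inference throughput, H100 = 1.0.
# Prices are the article's hypothetical ranges, not actual list prices.
chips = {
    "H100":   (1.0, 25_000, 40_000),
    "H200":   (1.9, 25_000, 40_000),
    "MI300X": (1.4, 18_000, 19_000),
}

for name, (perf, price_lo, price_hi) in chips.items():
    # perf per $1,000 of capex at each end of the assumed price range
    best = perf / price_lo * 1_000
    worst = perf / price_hi * 1_000
    print(f"{name:7s} perf per $1k capex: {worst:.3f} to {best:.3f}")
```

At those assumed prices the MI300X (roughly 0.074-0.078 perf per $1k) beats the H100 (0.025-0.040) outright and edges the H200 (0.048-0.076) unless the H200 sells near the bottom of its assumed range.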

AMD also touted the performance improvement in its open-source software ecosystem with support from Meta, OpenAI, and others. While AMD’s ROCm software is gaining traction, Nvidia’s software moat remains formidable. Highlighted during Nvidia’s most recent earnings call, the TensorRT-LLM SDK (Software Development Kit) promises to boost LLM inferencing performance 2x on Nvidia hardware. Software can improve performance per dollar of capex on AI inferencing hardware significantly, as shown on the right side of the chart above. In our view, software will remain an important battleground for GPU providers as the applications built around LLMs drive demand for efficient compute.
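
The same arithmetic shows why software is such a battleground: a claimed 2x inference speedup raises performance per dollar on hardware the buyer has already paid for. A minimal sketch, with the H100 price again a hypothetical mid-range figure:

```python
# Effect of a software-level speedup on perf-per-dollar: a 2x
# inference boost doubles effective throughput at zero extra capex.
h100_price = 30_000   # hypothetical mid-range price in dollars
base_perf = 1.0       # H100 baseline throughput (relative units)
boost = 2.0           # Nvidia's claimed TensorRT-LLM speedup

print(f"before: {base_perf / h100_price * 1_000:.3f} perf per $1k")
print(f"after:  {base_perf * boost / h100_price * 1_000:.3f} perf per $1k")
```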
