It’s great to see the “cutting edge”, but as for the AI bubble and data center growth, we’re now at the point where models are reaching saturation. Soon you won’t need a powerful frontier model to do the task for you; all you’ll need is a small, distilled, quantized model. The majority of AI usage in the near future will involve smaller, efficient models given context to do very specific, repetitive tasks like digitizing text, doing price comparisons, or harvesting data.
Intel slapping the side of the fab: You can do so much AI with this baby… Wait, what do you mean we’re too late?? Damn. We should have just kept going with the +++++++ chip plan.