Chamath Palihapitiya
God is in the details.
Energy is the big bottleneck for AI.
In both physical and software AI, it’s not just that the ingredients need to change but the entire recipe does as well:
1) we need infinite and marginally costless energy. This will mean an ensemble of energy sources working together that can produce energy TODAY (i.e., nuclear is not really an option before 2032, and building a natural gas or coal plant, with a multi-year backlog for parts, also isn't a near-term option until 2030+), which means we will need Solar + Storage because it can go online 12-17 months from being greenlit. No way around it.
2) but for energy storage to scale economically in light of Foreign Entity of Concern (FEOC)/Prohibited Foreign Entity (PFE) rules, you will need to find domestic LFP cathode active material (CAM) providers for the energy storage system (ESS) supply chain. There are very few.
3) you will need to step down the overall power footprint of the data centers, which means HVAC needs to be rethought: an entirely new kind of heat pump must be invented. This new device, while having a superior profile, will also need to eliminate the forever chemicals which are now outlawed and must be sunset.
4) the chips themselves need to be re-architected for performant, power-efficient inference. Memory design, chip-to-chip (c2c) interconnect, cabling, and all manner of other design decisions that work well for training won't likely scale for inference if inference is 100x+ bigger than training.
5) in physical AI, after storage (see above), abundant rare earths (REs) are essential for any form of motion/actuation. But getting REs out of the ground, into an oxide, then into an alloy that can be made into permanent magnets is a huge exercise in energy.
And the list goes on and on…
.
.
.
My point is that if you are focused on AI, you should start paying attention to energy, as it will be the gatekeeper of progress/change over the next few years in AI.

Rihard Jarc · 29.7.2025
An interesting comment from a former $META employee. ENERGY is the biggest bottleneck right now.
Even if $META wants to spend $100-$150B on CapEx for AI infrastructure, they can't. It is not just $NVDA.
Transformers, power equipment, cooling equipment, and the availability of power: it is all limited right now. Schneider Electric is completely booked until 2030.
Even if you have the money, you can't spend it.

I’ve had every car imaginable:
Range Rover
Mercedes
Bentley
Toyota
Ferrari
Volkswagen
Honda
And my Model Y is the absolute best, with no close competitor.

Elon Musk · 29.7.2025
Model Y rated highest in safety
Tesla is a physical AI company + software AI company.
It is the only company, today, that is tying both branches of AI together at scale. It will not be the only one in the end, but it is the only one now.
See below:
“Samsung Electronics will manufacture artificial-intelligence chips for Tesla in Texas under a $16.5 billion multiyear deal, a major win for its U.S. foundry business that sent shares of South Korea’s largest company sharply higher.
Tesla Chief Executive Elon Musk confirmed the deal on X, saying Samsung’s new facilities in Texas will be dedicated to making the U.S. electric-vehicle company’s next-generation AI6 chip. “The strategic importance of this is hard to overstate,” he wrote.”
13 years ago, around when this video was taken, I wrote an op-ed in Bloomberg advocating that every person in the world put 1% of their then net worth into Bitcoin.
The price of Bitcoin then was $80.

Archiving Saylor · 25.7.2025
Chamath on #Bitcoin in 2013 👀