
Aug 4 at 06:46
I just launched 🤖 - a map of active Decentralized AI projects. I'd like to see this ecosystem grow 10X!
Privacy by itself doesn't drive adoption. We need to match and surpass centralized Big AI's UX.
Data source connectors and memory are the moats. AI apps' switching costs will skyrocket over the next 2 years.
Why? Better data sources → increased usage → richer history → more valuable memory that's completely lost when switching providers → higher moats.
That's why we must act fast.
Today at the DeAI Summit at @UCBerkeley I pitched the "Bartender Stack" to solve this, while keeping the same UX for users and developers (rough sketches of each piece follow the list below):
Inference abstraction: Developers should simply call a generic inference function, and the user's device decides which model to route it to, whether via remote TEEs or local inference. Currently, developers must bring their own model for local inference or provide their own privacy-preserving TEEs, which is crazy; that's why they default to centralized inference APIs. What I'm proposing is what we already do with general computation: you just ask the kernel to do the math, completely abstracting the processor, memory, etc. This lets developers focus on what makes their AI app distinctive: interface and prompt+context engineering.
Cross-app memory: Destroy the "memory moat" by automatically sharing context across apps, matching Big AI's UX. The same local router that abstracts inference "extends" the context. New startups building local AI apps benefit from all your memory from day one instead of being penalized.
Data wallets: Granular, local control of tools and data sharing across apps. Configure once, not per app. The memory is owned by the users and shared with AI apps, not owned by the AI apps.
Stateless TEEs: Prompts and contexts are generated locally and shared at inference time. TEEs store no data, which solves their key-management UX problem and reduces their trust assumptions.
Distributed reputation for Agents: Agent networks need entry points to access permissionless reputation for deciding which agents to trust. Without this, people will always default to Big AIs or famous brands. For agents, trust is the moat.
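
Here's a minimal TypeScript sketch of what the inference abstraction and cross-app memory could look like from an app developer's point of view. The `infer` function, the router logic, and the `sharedMemory` store are hypothetical names used only for illustration, not an existing SDK; the actual architecture is in the deck linked below.

```ts
// Hypothetical "Bartender Stack" router sketch. Every name here is illustrative;
// none of this is an existing SDK. The point is the shape of the developer call.

type Backend = "local-model" | "remote-tee";

interface InferenceRequest {
  prompt: string;
  app: string; // calling app, used only to attribute memory entries
}

interface InferenceResult {
  text: string;
  backend: Backend;
}

// Stand-in for the user-owned, cross-app memory store ("Cross-app memory").
const sharedMemory: string[] = [];

// Placeholder capability check; a real router would probe the device hardware.
function hasLocalModel(): boolean {
  return false;
}

// The local router: picks a backend on the user's device and extends the prompt
// with cross-app memory, so app code never touches models, keys, or TEE plumbing.
async function infer(req: InferenceRequest): Promise<InferenceResult> {
  const backend: Backend = hasLocalModel() ? "local-model" : "remote-tee";
  const contextualPrompt = [...sharedMemory, req.prompt].join("\n");
  // A real router would now call a local model or a stateless TEE endpoint;
  // this sketch just echoes the routed prompt.
  sharedMemory.push(`[${req.app}] ${req.prompt}`);
  return { text: `(${backend}) ${contextualPrompt}`, backend };
}

// App code: one generic call, exactly like asking the kernel to "do the math".
async function main(): Promise<void> {
  const res = await infer({ prompt: "Summarize my last meeting notes.", app: "notes-app" });
  console.log(res.backend, res.text);
}

main();
```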
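
For data wallets, a sketch of what a configure-once policy could look like; the schema, scope names, and tool names are assumptions for illustration:

```ts
// Hypothetical data-wallet policy, configured once by the user and enforced by
// the local router for every app. Field names are assumptions, not a real schema.

type MemoryScope = "chat-history" | "calendar" | "purchases";
type Tool = "file-read" | "browser" | "email-send";

interface DataWalletPolicy {
  memoryScopes: Record<string, MemoryScope[]>; // which memory each app may read
  toolGrants: Record<string, Tool[]>;          // which tools each app may invoke
}

const policy: DataWalletPolicy = {
  memoryScopes: {
    "notes-app": ["chat-history", "calendar"],
    "shopping-agent": ["purchases"],
  },
  toolGrants: {
    "notes-app": ["file-read"],
    "shopping-agent": ["browser"],
  },
};

// The router checks the policy before extending context or exposing a tool, so
// memory stays owned by the user rather than by any single AI app.
function canReadMemory(app: string, scope: MemoryScope): boolean {
  return policy.memoryScopes[app]?.includes(scope) ?? false;
}

console.log(canReadMemory("shopping-agent", "chat-history")); // false
```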
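
For stateless TEEs, a sketch of the per-request call; the endpoint URL and payload shapes are assumptions, and attestation verification is only noted in a comment:

```ts
// Stateless-TEE call sketch. The endpoint and request/response shapes are
// assumptions. The key property: the full prompt + context is assembled on the
// device and sent per request, so the enclave keeps no user state between calls.

interface TeeRequest {
  prompt: string; // locally assembled prompt + cross-app context
  model: string;  // model the enclave should run
}

interface TeeResponse {
  completion: string; // only generated text comes back; nothing is persisted
}

async function callStatelessTee(req: TeeRequest): Promise<TeeResponse> {
  // A real deployment would verify the enclave's remote-attestation report
  // before sending the prompt; that step is omitted here.
  const res = await fetch("https://tee.example.org/v1/inference", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(req),
  });
  return (await res.json()) as TeeResponse;
}
```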
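
And for distributed agent reputation, a sketch of how an entry point might pick an agent by verifiable score rather than brand; the registry fields and thresholds are assumptions:

```ts
// Hypothetical reputation lookup before delegating work to an agent. The score
// semantics, attestation counts, and thresholds are assumptions for illustration.

interface AgentReputation {
  agentId: string;
  score: number;        // aggregated feedback in [0, 1], e.g. from a public registry
  attestations: number; // number of independent raters backing the score
}

// Prefer well-attested, high-score agents instead of defaulting to a famous brand.
function pickTrustedAgent(candidates: AgentReputation[], minScore = 0.8): string | null {
  const trusted = candidates
    .filter((a) => a.score >= minScore && a.attestations >= 10)
    .sort((a, b) => b.score - a.score);
  return trusted.length > 0 ? trusted[0].agentId : null;
}

console.log(
  pickTrustedAgent([
    { agentId: "travel-agent.eth", score: 0.92, attestations: 34 },
    { agentId: "unknown-agent", score: 0.99, attestations: 1 },
  ])
); // "travel-agent.eth"
```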
I want AIs to be like bartenders! Local. Replaceable. Forgetful.
They listen when I need them, vanish when I leave, and never keep a record of my order history. Too professional to remember what I said after my second drink! 🍹
Full deck and architecture design here:
cc @BerkeleyRDI, 🦊@MetaMask
