Gami
Specialised generalist @gnars_dao
any engineer friends looked into ZERO by Layer Zero Labs? nice to see a level-headed take like this one. keen to hear your thoughts too

Campbell | existential penguin arc 🐧 · 15 hours ago
From the looks of it, ZERO is a mess.
The GTM yesterday was a masterclass, but it didn't have much substance.
I spent the morning going through the whitepapers & docs with @alranpe.
Let's break down what's actually here and what's missing:
What they do have is two really good quality engineering papers (one for execution, one for storage).
But they don't go into detail on the serious challenges they will face building ZERO.
Rn our team has more questions than answers.
Question #1: CONSENSUS
What's the consensus protocol? They mention "nano validators" but 1 GB/s DA throughput per nano validator implies serious bandwidth.
They don't specify hardware requirements. Without consensus + hardware specs, you can't answer: How many validators? How decentralized? How fast? How safe?
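To put that 1 GB/s figure in perspective, here's a quick back-of-envelope calc. The per-validator throughput number is the only input taken from their docs; everything else is plain unit conversion:

```python
# Back-of-envelope bandwidth math for a nano validator that must
# ingest 1 GB/s of DA data (the figure stated in the ZERO docs).

DA_THROUGHPUT_GB_PER_S = 1  # GB per second, from the docs

bits_per_second = DA_THROUGHPUT_GB_PER_S * 8  # 8 Gbps sustained
seconds_per_month = 60 * 60 * 24 * 30
monthly_transfer_tb = DA_THROUGHPUT_GB_PER_S * seconds_per_month / 1000  # TB/month

print(f"Sustained link speed: {bits_per_second} Gbps")
print(f"Monthly ingress: {monthly_transfer_tb:,.0f} TB")  # ~2,592 TB (~2.6 PB)
```

An 8 Gbps sustained link and multi-petabyte monthly ingress is datacenter territory, not consumer hardware, which is why the missing hardware spec matters.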
Question #2: CROSS-ZONE ATOMICITY
This is the problem that kills every sharding system.
DeFi composability breaks across zones. Liquidation requiring a swap in another zone? You're dead. No cross-zone atomic transactions = no composable DeFi.
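To make that failure mode concrete, here's a minimal sketch. The zone/transaction model is hypothetical (ZERO doesn't publish a cross-zone API), but it shows why two non-atomic legs strand a liquidation:

```python
# Hypothetical two-zone model: a liquidation on zone A must then swap
# collateral on zone B. Without cross-zone atomicity these are two
# independent transactions, and a failure in leg 2 strands the position.

class Zone:
    def __init__(self, name):
        self.name = name
        self.log = []

    def execute(self, tx, ok=True):
        if not ok:
            raise RuntimeError(f"{tx} reverted on zone {self.name}")
        self.log.append(tx)

zone_a, zone_b = Zone("A"), Zone("B")

# Leg 1: seize collateral on zone A (commits immediately).
zone_a.execute("seize_collateral")

# Leg 2: swap it on zone B -- suppose it reverts (e.g. slippage).
# There is no cross-zone rollback of leg 1.
try:
    zone_b.execute("swap_collateral", ok=False)
except RuntimeError:
    pass

stranded = ("seize_collateral" in zone_a.log
            and "swap_collateral" not in zone_b.log)
print("collateral seized but never swapped:", stranded)  # True
```

With atomicity, both legs commit or neither does; without it, the protocol holds seized collateral it can't liquidate.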
Question #3: GOVERNANCE & STATE SHARDING
Their governance-as-upgrade model sounds nice vs L2 multisigs.
But what stops governance capture? If an attacker accumulates enough delegated stake, they can modify any zone's state transition function (STF). That's arguably worse than a multisig.
Question #4: ZK ECONOMICS
Jolt Pro needs 64× RTX 5090 GPUs. That's extreme centralization at the block producer level.
They also don't disclose the latency from execution to proof, so you can't assess actual finality time, evaluate the infrastructure cost, or judge decentralization feasibility.
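A rough cost sketch for one prover rig, using the thread's 64-GPU figure. The per-card price is an assumption (roughly MSRP); real builds also need host machines, power, and cooling:

```python
# Rough hardware cost for one Jolt Pro prover, per the thread's
# 64x RTX 5090 figure. The ~$2,000-per-card price is an assumption
# (roughly MSRP); 575 W is the card's rated board power.

GPUS_PER_PROVER = 64        # from the thread
PRICE_PER_GPU_USD = 2_000   # assumed price per card
POWER_PER_GPU_W = 575       # rated board power per RTX 5090

gpu_capex_usd = GPUS_PER_PROVER * PRICE_PER_GPU_USD
power_draw_kw = GPUS_PER_PROVER * POWER_PER_GPU_W / 1000

print(f"GPU capex alone: ${gpu_capex_usd:,}")     # $128,000
print(f"GPU power draw: {power_draw_kw:.1f} kW")  # 36.8 kW
```

Six figures of capex and ~37 kW of draw per block producer is exactly the centralization pressure the thread is pointing at.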
Question #5: DATA AVAILABILITY
The "no slashing" argument doesn't cover DA. A single BP can withhold data and nothing happens.
What incentivizes validators to store and serve SVID chunks? They claim it's the right direction, but without a specific SVID proposal or the actual DA architecture, it's impossible to evaluate if they have anything novel.
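A toy illustration of the withholding problem (hypothetical chunking scheme, not the SVID design, which isn't published): if serving chunks is voluntary and unslashed, a producer can publish a valid commitment while withholding data, and nobody can reconstruct the block:

```python
# Toy DA-withholding model: a block is split into chunks and reconstruction
# needs all of them. A producer who withholds even one chunk makes the
# block unrecoverable, and with no slashing there is no penalty.

import hashlib

block = b"zone state transition payload"
CHUNK = 8
chunks = [block[i:i + CHUNK] for i in range(0, len(block), CHUNK)]
commitment = hashlib.sha256(block).hexdigest()  # what the BP publishes

# The BP "serves" everything except the last chunk.
served = chunks[:-1]

reconstructed = b"".join(served)
available = hashlib.sha256(reconstructed).hexdigest() == commitment
print("block recoverable from served chunks:", available)  # False
```

Real DA layers use erasure coding plus sampling so that partial withholding is detectable, but detection still needs an incentive or penalty attached, which is exactly the gap the thread flags.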
Summing up: 2 million TPS is a great claim. Rn we don't see how they'll get there without major tradeoffs.