Gradient launches distributed reinforcement learning platform 'Echo-2'

Bloomingbit Newsroom

Summary

  • Gradient said it has launched the distributed reinforcement learning framework Echo-2, aiming to cut inference-centric training costs for large language models to one-tenth of current levels.
  • Echo-2 can run post-training for 30B-class models at about one-tenth the cost of commercial clouds, expanding large-model training to developers and startups through cost democratization.
  • Echo-2 leverages worldwide idle GPUs and the Lattica P2P protocol to reduce training costs from $4,490 to $425, and Gradient said it has validated performance in domains with real financial stakes.

Photo=Gradient

AI infrastructure company Gradient said on the 12th (local time) that it has launched Echo-2, a distributed reinforcement learning (RL) framework. Echo-2 aims to cut inference-centric training costs for large language models to one-tenth of current levels.

Gradient assessed that the AI industry has reached the limits of the data-scaling–driven “scaling laws.” It said that simply throwing more text and GPUs at training is yielding diminishing gains in model intelligence, while “inference scaling”—where models validate their own logic and arrive at answers through trial and error—is emerging as a key competitive edge.

Echo-2 is a distributed RL infrastructure designed for this transition, enabling post-training for 30B (30 billion-parameter) models at roughly one-tenth the cost of conventional commercial cloud environments. Beyond simple cost cutting, this represents “cost democratization,” extending large-model training from a big-tech preserve to developers and startups.

A training session that cost about $4,490 on conventional cloud setups fell to about $425 in the Echo-2 environment, while training time was reduced to about 9.5 hours. The key was parallelizing the sampling process—which accounts for roughly 80% of RL compute—across idle GPUs worldwide.
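Echo-2's implementation is not public, but the core claim above (sampling is embarrassingly parallel because each rollout is independent) can be illustrated with a minimal sketch. `sample_rollout` here is a hypothetical stand-in for LLM generation on one remote GPU; in the real system each task would run on a different idle GPU rather than a local thread.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def sample_rollout(seed: int) -> list[int]:
    # Hypothetical stand-in for one RL sampling episode; in Echo-2
    # this would be LLM generation on a remote GPU.
    rng = random.Random(seed)
    return [rng.randint(0, 9) for _ in range(4)]

def parallel_sampling(num_rollouts: int, num_workers: int = 4) -> list[list[int]]:
    # Sampling accounts for ~80% of RL compute per the article, and each
    # rollout is independent, so the work fans out across workers with no
    # coordination beyond collecting results in order.
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(sample_rollout, range(num_rollouts)))
```

Because rollouts never communicate with each other, adding more workers (or more idle GPUs) scales the sampling phase almost linearly, which is what makes the claimed cost reduction plausible.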

Technically, the system adopts an asynchronous RL architecture based on “Bounded Staleness,” separating learners from actors and tightly controlling lag between model versions. This design is intended to maintain training stability even in distributed environments.
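Gradient has not published Echo-2's internals, but the "Bounded Staleness" idea described above — actors generate rollouts against possibly old model versions, and the learner only accepts data whose version lag stays within a fixed bound — can be sketched as follows. All class and parameter names here are illustrative assumptions, not Echo-2's actual API.

```python
from collections import deque

class BoundedStalenessBuffer:
    """Accepts rollouts only if the actor's model version lags the
    learner's current version by at most `max_staleness` updates."""

    def __init__(self, max_staleness: int = 2):
        self.max_staleness = max_staleness
        self.learner_version = 0
        self.buffer = deque()  # (actor_version, rollout) pairs

    def add_rollout(self, actor_version: int, rollout) -> bool:
        # Reject data that is too stale rather than let off-policy drift
        # destabilize training.
        if self.learner_version - actor_version <= self.max_staleness:
            self.buffer.append((actor_version, rollout))
            return True
        return False

    def learner_step(self) -> None:
        # After each gradient update the learner advances its version and
        # evicts buffered rollouts that have fallen outside the bound.
        self.learner_version += 1
        while self.buffer and (
            self.learner_version - self.buffer[0][0] > self.max_staleness
        ):
            self.buffer.popleft()
```

The design trade-off is the usual async-RL one: a larger staleness bound keeps distributed actors busy despite network lag, while a smaller bound keeps the training data closer to on-policy.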

It also applies Lattica, a P2P protocol that can distribute large model weights of 60GB or more across hundreds of nodes within minutes. Lattica uses a “decentralized weight propagation” structure that reduces reliance on central servers and helps minimize bottlenecks in large-scale distributed training.
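Lattica's protocol details are not public; the paragraph above describes the general BitTorrent-style pattern of splitting a large weight blob into checksummed chunks and seeding them across peers, so downloads come from many nodes instead of one central server. The sketch below illustrates that pattern only — chunk size, round-robin seeding, and function names are all assumptions.

```python
import hashlib

CHUNK_SIZE = 4  # bytes, for illustration; real systems use multi-MB chunks

def shard_weights(blob: bytes, num_peers: int) -> dict:
    """Split a weight blob into chunks and assign each to a seed peer,
    so peers can re-serve chunks to one another afterwards."""
    chunks = [blob[i:i + CHUNK_SIZE] for i in range(0, len(blob), CHUNK_SIZE)]
    assignment: dict[int, list] = {}
    for idx, chunk in enumerate(chunks):
        peer = idx % num_peers  # round-robin initial seeding (assumption)
        digest = hashlib.sha256(chunk).hexdigest()  # per-chunk integrity check
        assignment.setdefault(peer, []).append((idx, digest, chunk))
    return assignment

def reassemble(assignment: dict) -> bytes:
    # A downloader fetches chunks from whichever peers hold them,
    # verifies digests, and reorders by index.
    parts = sorted(
        (idx, chunk)
        for chunk_list in assignment.values()
        for idx, digest, chunk in chunk_list
        if hashlib.sha256(chunk).hexdigest() == digest
    )
    return b"".join(chunk for _, chunk in parts)
```

Because every peer that finishes downloading a chunk can immediately serve it onward, aggregate distribution bandwidth grows with the number of nodes — the property that lets 60GB-plus weight sets propagate to hundreds of nodes in minutes.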

According to research released by Gradient, training the Qwen3-8B model in a distributed RTX 5090 GPU environment delivered stable results without performance degradation while costing 36% less than a centralized A100 data-center setup.

Real-world use cases are also expanding. Echo-2 has completed performance validation in areas with real financial stakes, including mathematics-Olympiad-level reasoning, smart-contract security audits, and autonomous on-chain agents.

A Gradient official said, “We need to move away from a model of renting AI via APIs and shift to a structure in which companies directly own and evolve model weights,” adding, “Echo-2 will become the foundation that enables anyone to build inference infrastructure that can operate at internet scale.”

For news reports: news@bloomingbit.io