Attention Residuals: How Kimi Is Rethinking Transformer Depth


Source: DEV Community

Every transformer you've ever used stacks layers with a dead-simple formula: take the input, add the layer's output, move on. x + layer(x). Fixed weight of 1. No questions asked. The Kimi team at Moonshot AI just published a paper that asks: what if that's been wrong the whole time?

The Problem Nobody Talks About

Standard residual connections accumulate layer outputs with equal weight. Layer 1 contributes the same as layer 47. The hidden state grows without bound as you stack more layers, and each individual layer's contribution gets diluted into the noise. This is called PreNorm dilution, and it gets worse the deeper your model goes. At 100+ layers, the early layers are essentially screaming into a hurricane. Their signal is there, mathematically, but it's buried under the sum of everything that came after.

For most of transformer history, we've papered over this with normalization tricks: RMSNorm, LayerNorm, various pre-norm and post-norm arrangements. They help. They don't solve the underlying problem.
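To make the contrast concrete, here's a minimal PyTorch sketch. The article doesn't spell out Moonshot's actual mechanism, so the second block below is only an illustration of the general idea: instead of the fixed residual weight of 1, give the residual stream and the layer's output learnable per-layer scalars. The names WeightedResidualBlock, alpha, and beta are hypothetical, not from the paper. The little loop at the end just shows the dilution effect: with fixed-weight residuals, a single layer's contribution shrinks relative to the accumulated hidden state as depth grows.

```python
import torch
import torch.nn as nn


class PreNormBlock(nn.Module):
    """Standard pre-norm residual: x + sublayer(norm(x)), fixed weight of 1."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.sublayer = nn.Sequential(
            nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.sublayer(self.norm(x))


class WeightedResidualBlock(nn.Module):
    """Hypothetical variant: learnable scalars on the residual stream and the
    layer output instead of the fixed 1.0. Illustrative only."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.sublayer = nn.Sequential(
            nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim)
        )
        self.alpha = nn.Parameter(torch.ones(1))  # weight on the residual stream
        self.beta = nn.Parameter(torch.ones(1))   # weight on the layer's output

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.alpha * x + self.beta * self.sublayer(self.norm(x))


if __name__ == "__main__":
    # Dilution demo: track how small each layer's update is relative to the
    # growing hidden state when residual weights are fixed at 1.
    torch.manual_seed(0)
    dim, depth = 64, 48
    x = torch.randn(2, 16, dim)
    blocks = nn.ModuleList(PreNormBlock(dim, 4 * dim) for _ in range(depth))
    for i, block in enumerate(blocks):
        delta = block.sublayer(block.norm(x))  # this layer's contribution
        x = x + delta                          # fixed-weight residual add
        if i % 12 == 0 or i == depth - 1:
            ratio = (delta.norm() / x.norm()).item()
            print(f"layer {i:2d}: ||delta|| / ||x|| = {ratio:.3f}")
```

Run it and the printed ratio drifts downward with depth: the sum keeps growing, the per-layer update doesn't, and that's the dilution the post is describing.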
