<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Nested-Learning on Todd W. Bucy — Research Blog</title><link>https://toddwbucy.github.io/toddwbucy_research_blog/tags/nested-learning/</link><description>Recent content in Nested-Learning on Todd W. Bucy — Research Blog</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Thu, 29 Jan 2026 09:00:00 -0600</lastBuildDate><atom:link href="https://toddwbucy.github.io/toddwbucy_research_blog/tags/nested-learning/index.xml" rel="self" type="application/rss+xml"/><item><title>Google Didn't Go Silent. You Stopped Looking.</title><link>https://toddwbucy.github.io/toddwbucy_research_blog/blog/2026/01/google-didnt-go-silent-you-stopped-looking/</link><pubDate>Thu, 29 Jan 2026 09:00:00 -0600</pubDate><guid>https://toddwbucy.github.io/toddwbucy_research_blog/blog/2026/01/google-didnt-go-silent-you-stopped-looking/</guid><description>&lt;p&gt;&lt;img src="https://toddwbucy.github.io/toddwbucy_research_blog/images/blog/google-didnt-go-silent/image1.jpg" alt="" /&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;A response to Delanoe Pirard&amp;rsquo;s &amp;ldquo;Transformers Are Dead&amp;rdquo; and &amp;ldquo;A 170M Model Beats GPT-4&amp;rdquo;&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;In December 2025, Delanoe Pirard published two articles that crystallized a question the ML community had been asking for a year: If Titans is so good, why has Google gone silent?&lt;/p&gt;
&lt;p&gt;The answer: Titans is one component of a seven-paper research program, not a standalone architecture. Google didn&amp;rsquo;t go silent. The community stopped looking for the rest of the papers.&lt;/p&gt;</description></item></channel></rss>