<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Simon Smith on AI]]></title><description><![CDATA[Occasional posts to help leaders think clearly about artificial intelligence, without the hype.]]></description><link>https://www.simonsmith.ai</link><generator>Substack</generator><lastBuildDate>Wed, 15 Apr 2026 12:31:23 GMT</lastBuildDate><atom:link href="https://www.simonsmith.ai/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Simon Smith]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[simonsmithai@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[simonsmithai@substack.com]]></itunes:email><itunes:name><![CDATA[Simon Smith]]></itunes:name></itunes:owner><itunes:author><![CDATA[Simon Smith]]></itunes:author><googleplay:owner><![CDATA[simonsmithai@substack.com]]></googleplay:owner><googleplay:email><![CDATA[simonsmithai@substack.com]]></googleplay:email><googleplay:author><![CDATA[Simon Smith]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[How to Prepare Your Business for the Coming Wave of AI Agents]]></title><description><![CDATA[Digital employees are near, and they'll need skills, tools, context, and new workflows and organizational structures to be successful]]></description><link>https://www.simonsmith.ai/p/how-to-prepare-your-business-for</link><guid isPermaLink="false">https://www.simonsmith.ai/p/how-to-prepare-your-business-for</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Wed, 11 Mar 2026 16:57:45 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qd3t!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Qd3t!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Qd3t!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!Qd3t!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!Qd3t!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Qd3t!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Qd3t!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1993789,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/190630949?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Qd3t!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!Qd3t!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!Qd3t!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Qd3t!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09e85a87-eeb5-41e4-a9ec-3937cd5ecb44_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>A few months ago, an <a href="https://www.linkedin.com/posts/simonsmith_just-got-hit-up-by-an-ai-super-connector-share-7369017042059964422-Ng5b?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAAASgrwB6VJjphrICj5jqeU1-xOzw1atpiM">AI agent called Boardy reached out to me</a>. &#8220;I was just speaking with a Toronto-based pharma exec and they mentioned they were looking to connect with AI strategy expertise for drug commercialization,&#8221; it emailed. &#8220;Would you be up for an intro?&#8221;</p><p>I said yes, and it suggested we continue on WhatsApp. We did. Then it asked to speak on the phone. I did that too. It followed through, connecting me with someone relevant, who I met with for an enjoyable conversation.</p><p>That whole interaction was autonomously facilitated by AI across email, WhatsApp, and voice.</p><p>It worked.</p><p>It was a moment where I thought: these systems are already more capable than most people realize. They&#8217;re still hard to configure. They still have jagged edges. But when they&#8217;re easier to set up and less jagged, this is going to move fast.</p><p>That&#8217;s happening. And you need to prepare.</p><h2>Beyond the assistant era</h2><p>Since ChatGPT&#8217;s launch, we&#8217;ve lived in the assistant era of AI. You ask, it answers. You tell it to draft, analyze, or summarize, it does. Over time, these systems got better at agentic work too, like deep research.</p><p>But in recent months, we&#8217;ve moved firmly from the assistant era to the agentic.</p><p>Claude Code kicked this off with agentic software engineering and a series of ever-better models. OpenAI responded with Codex and a rapid sequence of stronger GPTs. The shift in coding crystallized last December, during the holiday break, when people had time to push the models to their limits. They found them very capable of being delegated meaningful work and delivering results.</p><p>Then tools like Claude Cowork and OpenClaw generalized this beyond coding. And, recently, OpenAI revealed <a href="https://openai.com/index/introducing-openai-frontier/">what this could look like for enterprises with Frontier</a>. For the first time, I saw something that made the agentic future feel tractable for large organizations: digital coworkers with skills, tools, context, oversight, and some kind of control plane around them.</p><p>At that point, the question for me stopped being, &#8220;When are autonomous AI employees coming?&#8221; It became, &#8220;What can we do now to prepare?&#8221;</p><h2>Preparation that pays off today and tomorrow</h2><p>While the mature agentic future isn&#8217;t evenly distributed yet, its shape is clear, and the work required to prepare for it is not speculative. In fact, most of it is useful even today, in the assistant-and-chatbot era. From what I&#8217;m seeing, and doing, I&#8217;d focus on four things: skills, tools, context, and workflows.</p><h3>Package skills</h3><p>Future agents will need direction to excel in their role. They&#8217;ll need instructions, heuristics, standards, examples, and constraints. In AI lingo: <a href="https://agentskills.io">agent skills</a>.</p><p>Companies should start identifying the highest-value skills in their organization and packaging them into agent skills. How do we write this kind of deliverable? How do we analyze this kind of problem? How do we prepare this kind of presentation? What does excellent look like? What are common ways to fail?</p><p>You can use skills in assistant tools like ChatGPT and Claude. Today, that looks like a human interacting with a model loaded with a skill. Tomorrow, that same skill can be part of a more autonomous digital worker.</p><h3>Connect tools and data</h3><p>Agents will need access to the same systems your employees use. That includes systems of record, document repositories, and communication tools.</p><p>If your AI systems cannot access your Google Drive, your Slack, your CRM, your internal knowledge, or the other key systems your teams rely on, they&#8217;ll be ineffective. So close those gaps now. Configure connectors, and build proprietary integrations if needed. Ensure your assistant AIs can already interact with the systems your future agents will need.</p><p>This is future-proofing that&#8217;s also useful for your current AI stack.</p><h3>Ensure up-to-date context</h3><p>This one is easy to underestimate.</p><p>It&#8217;s not enough to give AI systems static reference materials. For many use cases, critical context is dynamic and partially undocumented. In my world, for example, brand guidelines for a new drug are essential. But a brand manager&#8217;s preferences are equally important, and may be expressed informally, such as in a conversation over coffee.</p><p>Whatever the domain, the principle is the same: future agents will need rich, current working context, not just archival documents.</p><p>So while configuring and building data connections for systems of record, also consider how you&#8217;ll keep agents updated in ways you might take for granted with humans.</p><h3>Rethink workflows and organizational structure</h3><p>This is the part many organizations will leave too late.</p><p>Autonomous agents aren&#8217;t just a software upgrade. They change how work gets done. An individual contributor can use ChatGPT to do their job better. But working with autonomous agents is more like managing a team. You create them, onboard them, assign them tasks, and give feedback on their work.</p><p>Not everyone will want to shift from IC to manager. Not everyone will be good at it.</p><p>Agents can also collapse workflows built around sequential human handoffs into a single step because of their broader skillsets. An agent that&#8217;s good at both design and development, for example, can build a beautiful frontend interface in one step directly in code.</p><p>So organizations need to think now about supervision models, feedback loops, workflow redesign, and what agent oversight will look like for them.</p><p>You&#8217;ll have agent teams. You&#8217;ll need agent managers. You may need <a href="https://www.simonsmith.ai/p/the-case-for-the-ai-context-librarian">agent skill librarians</a>. The future cyborg org chart won&#8217;t look like your current org chart.</p><h2>It&#8217;s not too early to start</h2><p>Here&#8217;s a thought experiment: what if your toughest competitor scaled its workforce next quarter with employees that worked better and faster than yours, 24/7, at lower cost?</p><p>It&#8217;s not a hypothetical. This will happen to many companies, and soon. Some industries will be slower, due to factors like AI&#8217;s still-jagged capabilities. But AI capabilities are improving faster than ever, so I wouldn&#8217;t count on that for defense.</p><p>Instead, my advice is to kick off an agent-readiness effort now. Charge it with four things: identify and package skills, improve tool and data access, build systems for context maintenance, and rethink workflows and oversight for a more agentic future.</p><p>You don&#8217;t need to predict every detail of where this is going. You just need to recognize that we can already see enough of its shape to effectively prepare.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/how-to-prepare-your-business-for?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/how-to-prepare-your-business-for?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[The Case for the AI Context Librarian]]></title><description><![CDATA[Decentralization worked in the copilot era, but autonomous agents need a single source of truth]]></description><link>https://www.simonsmith.ai/p/the-case-for-the-ai-context-librarian</link><guid isPermaLink="false">https://www.simonsmith.ai/p/the-case-for-the-ai-context-librarian</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Mon, 17 Nov 2025 21:02:49 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!MyZI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MyZI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MyZI!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!MyZI!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!MyZI!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!MyZI!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MyZI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2294342,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/179171759?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MyZI!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!MyZI!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!MyZI!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!MyZI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F413bd0ca-b00e-4984-bf58-744959407525_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I was at a cottage with my kids a few summers ago when Apple Maps tried to kill us. We were driving to mini&#8209;golf, following directions that grew stranger by the minute until we found ourselves at a mining site. The app was confident we&#8217;d arrived. I was sure we hadn&#8217;t.</p><p>So I ignored it, turned around, switched map apps and found a better route. No drama. No harm.</p><p>We have similar experiences with chatbots today. They fail, we catch them, we fix the issue.</p><p>But just as self&#8209;driving cars heighten risk from bad maps, AI agents heighten risk from bad context. As we move from copilots to autopilots, from tools that assist to agents that work for you, errors you once caught instantly can propagate across fleets of autonomous executions.</p><p>So if organizations want to harness agents&#8217; potential, they must ensure those agents have good maps&#8212;accurate, up-to-date, optimized skills, background information, templates, tools, and examples.</p><p>Someone has to be accountable for this.</p><p>Enter the AI Context Librarian.</p><h2>Bottom-up works until it doesn&#8217;t</h2><p>As I previously wrote, <a href="https://www.simonsmith.ai/p/five-keys-to-successful-ai-transformation">companies that successfully embrace AI do so bottom-up</a>. They democratize access to AI tools and encourage people to experiment.</p><p>I&#8217;ve seen this first-hand. At my company, for example, we have way more custom GPTs than employees. People build small, targeted assistants for their roles. A popular one turns timelines into bullet descriptions, another helps with recruiting tasks. These get shared, tweaked, and remixed.</p><p>This decentralized creativity unlocks use cases that are hard to find top-down. (Would a CEO know how frequently people turn timelines into bullet lists?) But it can lead to duplication, staleness, and suboptimal performance, with no central authority to evaluate, optimize, and bless the best tools.</p><p>It&#8217;s less of an issue when humans supervise. They can pick their preferred custom GPT, review and revise the output, and edit instructions to taste.</p><p>But it&#8217;s a big risk when humans step out of the execution loop.</p><p>(<em>Aside: If you haven&#8217;t reached the copilot stage yet, skipping straight to autonomous agents could feel like cultural whiplash. So note that my intended audience for this article is people in organizations with widespread copilot adoption that are now deploying agents.)</em></p><h2>Don&#8217;t execute yourself off a cliff</h2><p>Where I work, we create marketing materials for life science companies. Imagine if we used an autonomous agent to create dozens of localized social ads, but gave it:</p><ul><li><p>outdated prescribing information,</p></li><li><p>last&#8209;year&#8217;s brand messaging,</p></li><li><p>incomplete audience details,</p></li><li><p>instructions for a junior-level writing skill,</p></li><li><p>obsolete guidance for social media post character limits.</p></li></ul><p>The agent would dutifully execute. Within a day, we&#8217;d have hundreds of unusable outputs. Our creative and regulatory teams would catch this, but it would waste time.</p><p>It might also convince leadership that &#8220;agents aren&#8217;t ready,&#8221; when the real issue is that they lacked good context.</p><h2>Context is critical infrastructure</h2><p>Autonomous agents need far more than a clever prompt. A good analogy is this:</p><p><strong>Imagine you&#8217;re training a new employee and giving them a task they&#8217;ve never done.</strong></p><p>To be successful, they need:</p><ul><li><p><strong>a skill definition</strong> encompassing best practices to perform like an expert<strong><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a>,</strong></p></li><li><p><strong>background and a brief</strong> covering things like brand, audience, and objectives,</p></li><li><p><strong>templates and examples</strong> that show what good looks like,</p></li><li><p><strong>tools</strong> like libraries and API endpoints, or web apps for browser-based agents.</p></li></ul><p>This bundle&#8212;call it a <em>context pack</em>&#8212;becomes the agent&#8217;s map.</p><p>Bad directions, wrong destination.</p><p>So engineering and maintaining quality context packs is critical.</p><h2>Elevating human accountability</h2><p>This isn&#8217;t a task we can today automate. I&#8217;ve tried.</p><p>Yes, AI can do most of the work. It can gather, research, summarize, and synthesize background information, for example. And it can write and refine skills.</p><p>But it can&#8217;t be <em>accountable</em>. At least until we have AI-run companies with AI legal liability (not soon), humans can&#8217;t step out of the loop entirely. They must step <em>up</em> into a higher loop: governing the systems that execute. </p><p>That&#8217;s why we need AI Context Librarians. These people:</p><ul><li><p>create and maintain libraries of skills that reflect best practices,</p></li><li><p>ensure comprehensive and up-to-date background information,</p></li><li><p>curate templates and examples,</p></li><li><p>evaluate and provide access to tools,</p></li><li><p>create and run evaluations to optimize performance,</p></li><li><p>ensure everything stays curated and widely accessible.</p></li></ul><p>Organizations that understand the need for this role can effectively scale agents. Organizations that don&#8217;t will blame technology for what&#8217;s really context debt.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> </p><p>That day at the cottage, Apple Maps was wrong, but I wasn&#8217;t. I could see the mining site. I could course&#8209;correct. No harm done.</p><p>But in the autopilot era, we&#8217;re giving the wheel to machines. If they drive into mining sites, that&#8217;s on us.</p><div><hr></div><p><em><strong>How I used AI use for this article: </strong>First, I used ChatGPT Atlas to summarize <a href="https://www.linkedin.com/posts/simonsmith_i-think-well-soon-see-a-new-job-in-companies-activity-7395571236489084928-VJoX?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAAASgrwB6VJjphrICj5jqeU1-xOzw1atpiM">a LinkedIn post I had written</a> on this topic, along with comments on that post and my responses. I then dictated more thoughts into ChatGPT and had it prompt me to dig deeper. Once I felt I had enough material, I asked it to sketch a detailed outline based on a narrative nonfiction outline I provided. We jammed on this together until I felt good. Then I had it produce a first draft, which I edited, including extensive rewriting and refining. Finally, I had it generate an image, and we jammed on that until we were both happy. Note that I left in and added some em dashes (&#8212;) because, as a former journalist who studied magazine writing, I used these long before we had ChatGPT! I hope they soon are no longer seen as a mark of AI writing.</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>See, for example, <a href="https://www.claude.com/blog/skills">Anthropic&#8217;s Skills</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Credit to Pranav Mehta for the term &#8220;context debt,&#8221; from <a href="https://www.linkedin.com/feed/update/urn:li:activity:7395571236489084928?commentUrn=urn%3Ali%3Acomment%3A%28activity%3A7395571236489084928%2C7396154638871285761%29&amp;dashCommentUrn=urn%3Ali%3Afsd_comment%3A%287396154638871285761%2Curn%3Ali%3Aactivity%3A7395571236489084928%29">this comment on LinkedIn</a>.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/the-case-for-the-ai-context-librarian?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/the-case-for-the-ai-context-librarian?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[You're Not Extrapolating Enough]]></title><description><![CDATA[Most people still plan for AI as if today&#8217;s limits will last. They won&#8217;t.]]></description><link>https://www.simonsmith.ai/p/youre-not-extrapolating-enough</link><guid isPermaLink="false">https://www.simonsmith.ai/p/youre-not-extrapolating-enough</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Thu, 18 Sep 2025 15:43:43 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Gb1b!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Gb1b!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Gb1b!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!Gb1b!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!Gb1b!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Gb1b!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Gb1b!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1848479,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/173939593?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Gb1b!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!Gb1b!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!Gb1b!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Gb1b!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5df4d3a0-789e-472b-a8c2-accd5ebc2a38_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I recently caught up with a friend who&#8217;s an experienced consultant. At his company, senior consultants using AI had become so productive they needed far fewer junior employees and interns. He noted the problem this creates: without entry-level roles, how will future senior consultants gain experience? He wants to find a way to hire juniors so they can build the experience seniors now leverage to work with AI.</p><p>I share his concern about AI&#8217;s impact on early-career employees (recent analysis of <a href="https://digitaleconomy.stanford.edu/wp-content/uploads/2025/08/Canaries_BrynjolfssonChandarChen.pdf">payroll data</a> and <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5425555">job posts</a> reinforce it). But I worry about focusing on the wrong solutions. Yes, AI benefits from experienced employees today. But if current trends continue, it simply won&#8217;t need them in the future. We risk helping junior employees gain experience that will no longer add value.</p><p>I see this pattern often, even in myself. It&#8217;s hard for humans to think exponentially. But you can&#8217;t plan for a future with AI&#8217;s current capabilities. If trends hold, those capabilities will expand rapidly along multiple dimensions. If you find yourself saying &#8220;AI can&#8217;t do X because of Y limitation,&#8221; or &#8220;AI will always need a human to do X because it can&#8217;t do Y,&#8221; you&#8217;re probably making this mistake too. Instead, we should anticipate continued rapid progress and plan accordingly.</p><h3>The present is not like the past</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uY0Y!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uY0Y!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 424w, https://substackcdn.com/image/fetch/$s_!uY0Y!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 848w, https://substackcdn.com/image/fetch/$s_!uY0Y!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 1272w, https://substackcdn.com/image/fetch/$s_!uY0Y!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uY0Y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png" width="728" height="514" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:1028,&quot;width&quot;:1456,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:826206,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/173939593?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uY0Y!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 424w, https://substackcdn.com/image/fetch/$s_!uY0Y!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 848w, https://substackcdn.com/image/fetch/$s_!uY0Y!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 1272w, https://substackcdn.com/image/fetch/$s_!uY0Y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F973c70da-9697-4ec1-aa93-a4cee0804bdf_3400x2400.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I love showing the above chart <a href="https://ourworldindata.org/artificial-intelligence">from Our World in Data</a>, even though it ends in 2023. It illustrates how fast AI progressed across key domains, and how steep the curve became after the invention of the transformer in 2017.</p><p>That matches my experience. I remember playing with GPT-2, impressed it could sometimes write coherent articles. Then came GPT-3, which extended those abilities, but still couldn&#8217;t follow instructions, hold turn-by-turn conversations, use tools, or write code well. That was five years ago. If I had projected forward based on those limitations, and made a five-year plan, would I be set up for success?</p><p>Today, models hold long conversations in text and audio. They can search the web, summarize results, and build complex applications from a prompt. They&#8217;re starting to exceed humans at the hardest intellectual challenges (like <a href="https://x.com/MostafaRohani/status/1968360976379703569">outscoring all humans in a recent international coding contest</a>). And that&#8217;s just language models. We can now also generate high-quality images, video, speech, music, and even <a href="https://deepmind.google/discover/blog/genie-3-a-new-frontier-for-world-models/">simulated worlds</a>.</p><p>The present isn&#8217;t like the past. But projecting forward at past rates of progress beats assuming static or slow change.</p><h3>The future will not be like the present</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!IuwD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!IuwD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 424w, https://substackcdn.com/image/fetch/$s_!IuwD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 848w, https://substackcdn.com/image/fetch/$s_!IuwD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 1272w, https://substackcdn.com/image/fetch/$s_!IuwD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!IuwD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png" width="1456" height="1094" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1094,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!IuwD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 424w, https://substackcdn.com/image/fetch/$s_!IuwD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 848w, https://substackcdn.com/image/fetch/$s_!IuwD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 1272w, https://substackcdn.com/image/fetch/$s_!IuwD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F21ead346-d21f-42d1-ade0-4ec07b5c902d_3200x2404.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Every major AI lab leader says essentially the same thing: models will keep improving at today&#8217;s pace or faster; within five to ten years we&#8217;ll have AI that can do any knowledge-work task; and superintelligence will soon follow, producing abundant wealth and health.</p><p>Of course, they have incentives to say this. But independent research groups echo the outlook. <a href="https://epoch.ai/">Epoch AI</a> (<a href="https://epoch.ai/blog/what-will-ai-look-like-in-2030">source of the chart above</a>), <a href="https://metr.org/">METR</a> (focused on AI risk), and <a href="https://ai-futures.org/">AI Futures Project</a> (producer of the much discussed <em><a href="https://ai-2027.com/">AI 2027</a></em>) all forecast continued rapid progress.</p><p>Going back to our initial challenge with junior employees, this all means that in five years, neither junior <em>nor</em> senior employees will add much value relative to AI. Unless you believe progress will stall, which no reputable research group projects, you should prepare for AI to outperform humans at most computer-based work, and as robotics mature, physical tasks too.</p><p>This should reshape how you think about near-term deployments. Right now, for example, it&#8217;s easy to get caught in the chatbot paradigm of assistants or copilots. But as AI grows smarter, faster, cheaper, and more parallelizable than humans, you must plan for things that today sound like sci-fi. For example, <em>AI Daily Brief </em>host Nathaniel Whittemore has his &#8220;<a href="https://podcasts.apple.com/us/podcast/the-dr-strange-theory-of-ai-agent-work/id1680633614?i=1000697162733">Doctor Strange Theory</a>&#8221; of AI&#8217;s future where, like Doctor Strange exploring outcomes in different multiverses, employees will spin up dozens of AI agents to produce an output and then just pick the one they like best.</p><h3>Thoughts on preparing for the unknown</h3><p>We can&#8217;t know exactly what the future looks like. Too many variables. But we can say with confidence that, barring a major disruption, AI will progress at least as fast as today. What we don&#8217;t know is how society will absorb it.</p><p>Some ways to prepare:</p><ul><li><p><strong>Expect rapid progress.</strong> This is the safest assumption until reputable groups like Epoch AI say otherwise. Ignore pundits who trade on skepticism or media outlets pushing contrarian &#8220;debunks.&#8221; Trust sources grounded in data.</p></li><li><p><strong>Scenario plan.</strong> We don&#8217;t know how this all unfolds, but we can explore scenarios with varying confidence levels. What if, within a year, you can spin up cloud agents capable of anything your best employees do? What would those employees then do? How should you structure your company? You don&#8217;t need certainty, but you do need to think across possible outcomes.</p></li><li><p><strong>Maximize optionality.</strong> With progress fast and uncertain, keep your options open. Don&#8217;t get locked into one technology, vendor, or long-term plan. Especially avoid doubling down on tools likely to become obsolete.</p></li></ul><p>As for junior employees, I don&#8217;t have a neat answer. But I doubt that answer lies in preparing them for a future unlikely to exist. Better to help them prepare for a world where they oversee swarms of agents that outperform them on any given task, but that they can harness for unprecedented impact. That seems more plausible. </p><p>But we should still keep our options open.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/youre-not-extrapolating-enough?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/youre-not-extrapolating-enough?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Proprietary Data Won’t Save You From AI Disruption]]></title><description><![CDATA[Speed to scale matters more, and if your data is truly valuable it will be valuable enough to get in other ways]]></description><link>https://www.simonsmith.ai/p/proprietary-data-wont-save-you-from</link><guid isPermaLink="false">https://www.simonsmith.ai/p/proprietary-data-wont-save-you-from</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Sun, 14 Sep 2025 14:36:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!vr8R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vr8R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vr8R!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!vr8R!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!vr8R!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!vr8R!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!vr8R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2353956,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/173578094?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vr8R!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!vr8R!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!vr8R!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!vr8R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F671c2b71-8af7-4328-a975-ea4ac96d1889_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I often hear executives argue that their company&#8217;s proprietary data will protect them from disruption in the age of AI. Their logic usually goes like this: AI labs may have the best foundation models, but they don&#8217;t have <em>our</em> data. That exclusivity, the thinking goes, will give them an edge in some domain.</p><p>I understand the impulse. For years, the business mantra has been that &#8220;data is the new oil.&#8221; Whoever controls it can refine it into power. I&#8217;ve lived through this first-hand myself, seeing companies spend years building elaborate pipelines, annotation processes, and specialized models around their own datasets. And then, almost overnight, those carefully tuned systems were leapfrogged by large, general-purpose models trained on petabytes of unstructured data.</p><p>Most people I talk to have never heard of Rich Sutton&#8217;s &#8220;<a href="http://www.incompleteideas.net/IncIdeas/BitterLesson.html">Bitter Lesson</a>,&#8221; which argues that general methods exploiting scale will, over time, beat specialized approaches. Nor have they thought much about <a href="https://en.wikipedia.org/wiki/Julian_Simon#Theory">Julian Simon&#8217;s observations on resources</a>&#8212;that scarcity rarely holds, because when something grows valuable enough, humans find new ways to produce it. Taken together, the implication is clear: if data really is the new oil, it won&#8217;t protect you. Specialized data is no defense in the long run.</p><p>Companies that hope to defend themselves with proprietary data should pay attention to where the real AI moats are forming. The only one that seems durable today is speed to scale.</p><h3>Data may be the new oil, but that shouldn&#8217;t be reassuring</h3><p>The belief that data is the new oil comes from the sense that it is both scarce and proprietary. If you controlled a unique dataset, you could turn it into defensible value. It felt obvious that data would serve as a fortress wall against competition.</p><p>But Sutton&#8217;s Bitter Lesson tells a different story. Over decades of AI research, clever algorithms and narrow optimizations have consistently been eclipsed by simpler, general methods scaled up with more compute and more data. We&#8217;ve seen this play out with deep learning, initially for image recognition, and with today&#8217;s large language models that have enabled incredible chatbots and agents.</p><p>This changes the meaning of data as the new oil. We shouldn&#8217;t think about it as a scarce resource, but as a commodity, with familiar commodity characteristics like substitution. If data really behaved like oil, then Simon&#8217;s rule would apply: scarcity won&#8217;t last. When a resource grows valuable, people will innovate to find or produce more of it or substitutions.</p><h3>Evidence contradicts proprietary data&#8217;s value</h3><p>History supports Sutton and Simon. </p><p>AlphaFold, which cracked one of biology&#8217;s most complex challenges, didn&#8217;t come from a pharmaceutical giant with access to mountains of biological data. Getty and Adobe, sitting on immense libraries of images, didn&#8217;t create the world&#8217;s most advanced image models. GitHub, sitting on the world&#8217;s largest repository of code, doesn&#8217;t have the best coding model (GitHub Copilot actually uses models from other companies, like OpenAI and Anthropic).</p><p>One famous example in recent years: Bloomberg invested $10 million to <a href="https://arxiv.org/pdf/2303.17564">train a finance-specific large language model</a> on its own data, &#8220;perhaps the largest domain-specific dataset yet.&#8221; Then <a href="https://www.linkedin.com/posts/emollick_this-remains-one-of-the-most-consequential-activity-7176398465004896256-Qjx-?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAAASgrwB6VJjphrICj5jqeU1-xOzw1atpiM">GPT-4 outperformed it</a>. </p><p>Again and again, the same pattern emerges: the winners aren&#8217;t the companies with proprietary data. They&#8217;re the ones with the compute, infrastructure, and research talent to put scale to work.</p><p>This even applies to data that seems unique. If its value is high enough, others will happily pay to license, distill, collect, annotate, synthesize, and otherwise create it or reasonable substitutions. They might even <a href="https://www.reuters.com/sustainability/boards-policy-regulation/anthropic-agrees-pay-15-billion-settle-author-class-action-2025-09-05/">just steal it and pay the billion-dollar fines that result</a>.</p><p>Yes, if your domain is so niche that no one else will bother, you may retain some defensibility. But then, by definition, the opportunity is small. When the stakes are high, competitors will find a way in.</p><h3>What really matters: speed to scale</h3><p>So what can help you retain and grow business value in the face of increasingly capable AI models?</p><p>The real moat in AI isn&#8217;t the data you hold but how fast you can move and scale. </p><p>OpenAI, for example, reached hundreds of millions of users in record time, creating a consumer user moat that&#8217;s hard to challenge. It and its competitors (primary Google and Anthropic, and to a lesser extent xAI, Meta, and some Chinese labs) race each other to new model capabilities and product features to secure scarce users and user attention. They also as quickly as possible lock up land, energy, and GPUs to ensure future scale. </p><p>Proprietary data, by contrast, has rarely been decisive. It doesn&#8217;t hold up to theoretical scrutiny (Simon was right; just look at the history of other previously scarce resources, even <a href="https://www.mining.com/lab-grown-gems-put-squeeze-on-diamond-mining-industry/">diamonds</a>) or empirical fact (ChatGPT is direct proof of the Bitter Lesson&#8217;s truth). Data alone won&#8217;t save you. Speed to scale can.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/proprietary-data-wont-save-you-from?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/proprietary-data-wont-save-you-from?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p>]]></content:encoded></item><item><title><![CDATA[Five Keys to Successful AI Transformation]]></title><description><![CDATA[What I've seen work&#8212;and not&#8212;across hundreds of companies, case studies, and reports]]></description><link>https://www.simonsmith.ai/p/five-keys-to-successful-ai-transformation</link><guid isPermaLink="false">https://www.simonsmith.ai/p/five-keys-to-successful-ai-transformation</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Wed, 27 Aug 2025 01:11:06 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!57br!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!57br!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!57br!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!57br!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!57br!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!57br!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!57br!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2556846,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/172041421?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!57br!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!57br!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!57br!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!57br!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F96939f6a-c268-4391-af71-b4272d45d2c9_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I&#8217;m in the midst of a two-week summer vacation, yet find myself in similar conversations as at work. Everyone (disclosure: especially me) wants to talk about AI. When those people include organizational leaders, a recurring question is: <strong>How do we transform our organization for the AI era?</strong></p><p>I&#8217;ve spent 25 years in tech, the past 8 in AI, and the past 1.5 translating AI market and technology research into transformation initiatives and experiments as EVP, Generative AI at <a href="https://www.klick.com/">Klick Health</a> (this article reflects my opinions and not necessarily Klick&#8217;s). Through my work with life science companies and exposure to other industries via networking groups, I&#8217;ve seen hundreds of examples of what works and what doesn&#8217;t. I&#8217;ve also closely read related research reports and case studies.</p><p>What follows are five key actions that I&#8217;ve seen consistently drive success. They aren&#8217;t complicated, but companies get them wrong all the time.</p><h3>Establish executive leadership with urgency</h3><p>AI transformation doesn&#8217;t succeed when you push it down to a mid-level working group that meets every six weeks. The pace of change is too fast, and decisions too impactful. What works is when <strong>the CEO and senior executives own the agenda and meet with weekly urgency </strong>(or even more frequently when warranted)<strong>.</strong></p><p>At Klick, we have an AI steering committee that includes our CEO, COO, and other senior leaders. We meet weekly to discuss AI progress, challenges, and opportunities. Each week, to align the organization around AI top-to-bottom, we also meet with different department heads to discuss AI adoption and application in their groups. We make decisions on the spot. Because AI evolves daily, not quarterly, this speed matters.</p><p>The alternative approach&#8212;delegating to lower-level groups with little authority and less frequent meetings&#8212;leads to decisions lagging the pace of technological progress, and backlog purgatory. In one case, a client had a great idea for an AI initiative, but couldn&#8217;t present it to a mid-level working group for weeks, and was told that at best the idea would go in their AI project backlog. He simply decided not to pursue it; model progress during that delay could make it obsolete.</p><h2>Democratize access to best-in-class tools</h2><p>The next critical decision is: what AI tools will you let employees use?</p><p>Answer: <strong>give them access to best-in-class tools they already know and love.</strong> When companies roll out ChatGPT Enterprise, for example, adoption is rapid, because most employees already use ChatGPT in their personal life and many have been secretly using it for work. Enterprise versions of widely used tools like ChatGPT bring needed privacy and security, while keeping pace with cutting-edge models and features that get released weekly.</p><p>The wrong approach? <a href="https://www.simonsmith.ai/p/stop-building-internal-chatbots">Building internal chatbots</a>. They significantly lag best-in-class tools in models and features. I see this repeatedly. For example, I once demonstrated OpenAI&#8217;s Deep Research to a company that used gpt-4o via an internal chatbot. They wanted to use Deep Research, but couldn&#8217;t. If you don&#8217;t give people access to the best tools, they often just use them in secret and so don&#8217;t share what they learn (see below for <a href="https://www.simonsmith.ai/i/172041421/facilitate-sharing-knowledge-and-applications">why sharing is so important</a>). </p><p>It&#8217;s also important to facilitate easy experimentation with new and emerging AI models and products. Releases are constant, and discovering something new and powerful gives huge advantages if you beat a competitor to leveraging it. You can still ensure privacy and security, such as by having AI explorers who are explicitly allowed to try new tools without using any proprietary data.</p><h2>Incentivize experimentation from the bottom up</h2><p>Executives don&#8217;t know the nuances of every job function. The best use cases emerge from the people doing the work. That means you need to <strong>encourage and reward bottom-up experimentation.</strong></p><p>At Klick, for example, we launched a <a href="https://www.pharmalive.com/klicks-leerom-segal-to-announce-agency-first-1-million-klickprize/">$1 million AI prize</a>. Employees submitted hundreds of ideas, and client judges chose the winners. This created energy, surfaced novel ideas, and produced tangible prototypes. One winning idea&#8212;Guardrail, a compliance automation tool&#8212;was so promising that <a href="https://www.klick.com/news/ai-compliance-breakthrough-wins-klick-prize-results-signal-industry-trends">we invested in building it into a full product</a>.</p><p>There are other ways to provide incentives. Sometimes these can be carrots, like our contest. Sometimes they can be sticks, like making AI use a factor in performance reviews. Honestly, though, when people have access to great AI tools, and a culture that encourages their use, that&#8217;s usually the biggest incentive of all. People want to do better, faster, higher impact work, and are excited when they&#8217;re equipped and encouraged to do so, and see colleagues doing the same.</p><h2>Facilitate sharing knowledge and applications</h2><p>AI moves too quickly for static training materials. <strong>What works better is peer-to-peer sharing.</strong></p><p>At Klick, Slack channels play a big role here. We have a main generative AI channel, and more specific ones such as for beginners and image generation. These have become vibrant hubs where people share use cases, examples, tools, and lessons.</p><p>Custom GPTs extend this sharing beyond words to packaged mini-apps. While these haven&#8217;t caught on in the consumer world, they&#8217;re exceptionally useful in enterprise settings. At Klick, we now have more custom GPTs (over 1,600) than employees. People create and share them for things like recurring tasks, such as converting project plans to written descriptions, and working on brands, such as by loading up prescribing information for clients&#8217; drugs.</p><p>The key takeaway here is that with the fast pace of AI progress, communities beat classrooms. I haven&#8217;t seen any workshops or online training materials that can keep pace with AI developments, but when knowledge spreads virally, people stay current, and adoption accelerates.</p><h2>Scale proven use cases into enterprise solutions</h2><p>Finally, to go beyond individual use cases, you need to <strong>identify the best grassroots ideas to scale into enterprise solutions.</strong></p><p>As mentioned earlier, Guardrail at Klick is one example. What began as an employee idea became a prize-winning prototype, then a funded and commercialized product. </p><p>This is one area where proactive AI leadership plays a critical role. When leaders seek use cases that show signs of success and have room to scale, they can immediately direct investment to take them to the next level. </p><p>Done right, this drives continuous innovation: leadership vision &#8594; broad experimentation &#8594; shared learning &#8594; scaled products. Ethan Mollick refers to this as <a href="https://www.oneusefulthing.org/p/making-ai-work-leadership-lab-and">leadership, crowd, and lab</a><em>.</em> Companies that master this transform faster than competitors.</p><h2>The difference between success and failure</h2><p>I&#8217;ve seen this all quite consistently now, first-hand at Klick, within successfully transforming clients in life sciences, across industries via case studies and confidential presentations, and in research and reports from people like Mollick. Successful companies lean in with executive urgency, democratize access to the best tools, motivate experimentation, foster sharing, and scale big, impactful ideas.</p><p>Failures look very different: leadership that delegates AI to people without authority to make quick decisions, working groups that meet too infrequently, internal tools that can&#8217;t compete with those employees use in their personal lives, culture that discourages experimentation, and scalable innovation hidden by secret AI use.</p><p>It&#8217;s not complicated, but it requires a different approach to prior transformation initiatives, probably because the technology is so powerful, widespread, and rapidly improving. Those who get it right can transform their organizations at the same pace. Those who don&#8217;t risk being rapidly left behind.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/five-keys-to-successful-ai-transformation?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/five-keys-to-successful-ai-transformation?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p><br></p>]]></content:encoded></item><item><title><![CDATA[A Million Tokens Won’t Pay Your Rent]]></title><description><![CDATA[Universal Basic Compute is an interesting idea, but like income and services it falls short on its own]]></description><link>https://www.simonsmith.ai/p/a-million-tokens-wont-pay-your-rent</link><guid isPermaLink="false">https://www.simonsmith.ai/p/a-million-tokens-wont-pay-your-rent</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Thu, 31 Jul 2025 17:21:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!cYGr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cYGr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cYGr!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!cYGr!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!cYGr!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!cYGr!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!cYGr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2497115,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/169760847?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cYGr!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!cYGr!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!cYGr!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!cYGr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F645c7fb7-1674-4de4-8434-0fbd90350d57_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Some believe AI is already eating jobs.</p><p>There&#8217;s <a href="https://www.signalfire.com/blog/signalfire-state-of-talent-report-2025">evidence</a> it&#8217;s at least exacerbating unemployment trends, especially for new graduates in white-collar professions. Blue-collar jobs may feel safer now, but <a href="https://www.unitree.com/R1">improvements in robot hardware</a> and models could put them under pressure too.</p><p>We don&#8217;t know whether this disruption will be temporary&#8212;<a href="https://www.ifo.de/DocDL/cesifo1_wp10766.pdf">as many, though not all, job losses were in prior industrial revolutions</a>&#8212;or permanent, should we achieve human-level AI. But most reasonable people agree that it will happen fast and need addressing.<br><br>What&#8217;s harder to figure out is what, exactly, we should do.</p><h2>Universal Basic Income and its limitations</h2><p>One common answer is <a href="https://en.wikipedia.org/wiki/Universal_basic_income">Universal Basic Income</a> (UBI). The idea is simple: as AI and automation reduce the need for human labor, governments could provide everyone with a modest, unconditional cash stipend. Just enough to meet basic needs. Not luxury, but security.</p><p>UBI has been tested in pilots around the world, and <a href="https://basicincome.stanford.edu/research/ubi-visualization/">results are generally positive</a>. People spend more on food and education, stress levels drop, and most recipients continue to work. But UBI isn&#8217;t perfect, and obstacles to large-scale implementation are real.</p><p>Culturally, UBI carries baggage. In many countries&#8212;especially those with strong individualist values&#8212;citizens resent the idea of "free money," particularly if they see themselves as the ones paying for it through taxes. Politically, the redistribution required to fund UBI would be a hard sell. The money would likely have to come from taxing income, capital, corporations, and consumption, all of which provoke backlash. And if only a handful of countries adopt it, companies may shift operations to lower-tax jurisdictions.</p><p>UBI also introduces tricky questions about meaning and agency. If everyone receives money unconditionally, what happens to motivation? For some, a guaranteed income might unlock creativity, entrepreneurship, or caregiving. For others, it could lead to passivity or disengagement. These risks may be overstated by UBI opponents (after all, retirees and billionaires find meaningful things to do), but we need to consider them, especially if the change from full employment to widespread UBI is abrupt.</p><p>Perhaps most fundamentally, UBI assumes governments have the capacity to enact and administer it. COVID-era stimulus payments were a rare exception, and even then, <a href="https://reason.com/2024/04/17/covid-stimulus-money-lined-the-pockets-of-scammers-and-fueled-inflation/">the system showed cracks</a>: delayed payments, fraud, inflation, political gridlock. There isn&#8217;t strong evidence that a full-scale UBI would roll out smoothly.</p><h2>Universal Basic Services: A better alternative?</h2><p>An alternative to UBI is <a href="https://en.wikipedia.org/wiki/Universal_basic_services">Universal Basic Services</a> (UBS). Instead of giving people cash to spend how they choose, governments could provide essential goods directly&#8212;like healthcare, housing, food, transportation, and even internet access.</p><p>This approach has some advantages. It targets basic needs. It can use public procurement to keep costs low. And it avoids some of the psychological and political baggage that comes with handing out free money.</p><p>But it has its own problems.</p><p>For one thing, it doesn&#8217;t seem aligned with current trends. The US, for example, still lacks universal healthcare. <a href="https://www.axios.com/2024/11/28/food-insecurity-banks-holiday-season">Food bank use is rising</a>, not falling. <a href="https://mediaroom.realtor.com/2025-03-10-U-S-Housing-Market-Faces-4-Million-Home-Shortage-Realtor-com-R-Calls-on-Lawmakers-to-Let-America-Build?utm_source=chatgpt.com">Housing shortages are worsening</a> in many cities. UBS requires a strong and competent public sector&#8212;something many governments, especially at the local level, struggle to maintain.</p><p>It also risks sapping innovation. When services are centrally provided and not subject to competition, there&#8217;s little incentive to improve them. UBI, for all its flaws, retains capitalism&#8217;s dynamism: individuals allocate resources, businesses compete to attract their dollars. In a UBS model, that mechanism breaks down.</p><h2>Enter Universal Basic Compute</h2><p>Sam Altman, CEO of OpenAI, has floated a third option, Universal Basic Compute (UBC): <a href="https://youtu.be/aYn8VKW6vXA?si=ZEm97bzYJN7Xo3MB&amp;t=1248">instead of giving people cash or services, give them access to AI</a>. Specifically, we can give people a fixed allocation of AI tokens, which are units of AI output people currently buy from API endpoints, or receive as part of AI product subscriptions.</p><p>So, instead of getting a monthly check or services, everyone would get a ration of digital intelligence. You could use your tokens to earn money (send an AI agent off to a job), save money (get a service from AI that would cost a lot from a human), and save time (automate tasks). Perhaps you could also save or sell your tokens, turning them into a new kind of currency.</p><p>The advantage of this approach is that it preserves agency and market competition. Rather than passively receiving money, people would choose how to use their compute. They could spend it productively, creatively, or frivolously, but it would be a resource they controlled. It would also avoid centralizing service procurement and provision in a way that inhibits innovation via competition.</p><p>This idea is appealing. It sidesteps political wrangling over taxation. It gives people a tool, not just cash. And it plays to the strengths of the private sector: companies like OpenAI could roll it out unilaterally, without waiting for legislative approval.</p><p>But the more you examine it, the trickier it becomes.</p><h2>Four challenges of Universal Basic Compute</h2><p>As I see it, there are at least four big challenges for this idea: Pricing, governance, physical needs, and economic uncertainty.</p><h3>The pricing paradox</h3><p>The core challenge with universal compute is that its value is bounded by whatever the market charges for compute access. If a ChatGPT Plus subscription gives you a million tokens for $20, then a universal allocation of a million tokens is worth&#8212;at most&#8212;$20. That&#8217;s not nothing, but it&#8217;s a long way from a basic income.</p><p>This creates a conundrum. OpenAI could raise subscription prices, increasing the value of distributed tokens, but then they&#8217;d reduce their customer base and therefore their means of subsidizing those tokens. Prices would be capped by customer price sensitivity and competition. If people could sell their freely received tokens, that competition would include recipients of UBC. Why, as a paying user, would you buy tokens via a more expensive subscription if you could buy cheaper tokens from resellers at a discount?</p><h3>Governance and control</h3><p>Who decides how many tokens you get? Who verifies your identity? Who prevents fraud or duplicate accounts? If it&#8217;s a corporation, that raises obvious issues of accountability and power. If it&#8217;s the government, we&#8217;re back to building new bureaucracies.</p><p>There&#8217;s also the problem of market structure. If OpenAI gives away compute, will Anthropic and Google be forced to match it, by consumer pressure or government regulation? Will tokens only be usable within a single provider&#8217;s ecosystem, like frequent flyer miles you can&#8217;t redeem elsewhere? That looks less like a public service than a private moat.</p><h3>Physical needs still matter</h3><p>Compute can do many things. It can write a business plan, generate a workout routine, optimize your grocery budget, or give you free legal advice. But it can&#8217;t build you a house. It can&#8217;t grow your food. It can&#8217;t do surgery. Even in a world of ubiquitous AI, physical goods and services remain essential, and they&#8217;re still constrained by scarcity. (Robots may help, but we&#8217;re further away from human-level robots than superhuman AI.)</p><p>So unless compute tokens can be exchanged for fiat currency, or used to earn money in a real economy, they won&#8217;t be enough on their own. And if they can be exchanged, then we&#8217;re back to the earlier pricing problem. How much real-world value can a million tokens command, if everyone gets a million for free, and a ChatGPT subscription is $20?</p><h3>Economic weirdness ahead</h3><p>There&#8217;s also the question of how this would evolve over time. AI is getting rapidly smarter and cheaper simultaneously. So what happens if you can both sell and save your granted tokens?</p><p>A token granted today would have more cash value today but more intelligence value in a year. This is an odd dynamic. It might push people to sell their tokens immediately, to maximize their cash value now. Conversely, it might push people to hoard their tokens, to maximize their intelligence value in future.</p><p>Maybe we get an equilibrium, with some people selling now if they need the cash, and others hoarding if they have big future plans. But how might we intervene if not, to avoid deflation if everyone&#8217;s selling their tokens immediately, or speculative bubbles if everyone&#8217;s hoarding? It&#8217;s unclear how you&#8217;d design a system that maximizes productive token use. Might you need some kind of token central bank?</p><h2>A possible piece of the puzzle</h2><p>I like the idea of Universal Basic Compute. But the more I think about it, the more I realize it&#8217;s nowhere near a panacea. It&#8217;s not going to soon replace income, nor solve housing or food insecurity. It could enable more productivity, creativity, and entrepreneurship. It could help bridge today&#8217;s world of scarcity to, hopefully, tomorrow&#8217;s abundance.</p><p>But it needs rigorous analysis. Who gets it? How is it allocated? Priced? Can it be saved? Exchanged? What happens when AI grows 10x more capable but 100X cheaper? Do we really want corporations redistributing wealth&#8212;taxation without representation? If not, is the government up to the task?</p><p>The likeliest answer is not UBI, or UBS, or UBC, but a combination. Some cash, some services, some AI. A foundation of support topped by tools for empowerment.</p><p>Giving everyone an allocation of AI tokens is a compelling idea. But right now, it&#8217;s more a sketch than a strategy. It deserves serious scrutiny before it becomes the default solution to a problem we&#8217;ve barely begun to confront.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/a-million-tokens-wont-pay-your-rent?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/a-million-tokens-wont-pay-your-rent?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[The AI Browser Wars Have Begun]]></title><description><![CDATA[A smarter front end for the web is coming&#8212;and with it, a very different internet economy]]></description><link>https://www.simonsmith.ai/p/the-ai-browser-wars-have-begun</link><guid isPermaLink="false">https://www.simonsmith.ai/p/the-ai-browser-wars-have-begun</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Fri, 11 Jul 2025 22:57:27 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!PbwA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PbwA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PbwA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!PbwA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!PbwA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!PbwA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PbwA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2701479,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/168113409?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!PbwA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!PbwA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!PbwA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!PbwA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd429d87b-ae5e-48e3-a0e9-e5a21ae7f22e_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I remember the first time I tried Firefox, and later Chrome. Each time, a faster, more user-friendly browser won me over and within days I&#8217;d permanently switched.</p><p>That kind of shift doesn&#8217;t happen often. But it&#8217;s about to happen again.</p><p>This time, it&#8217;s not about tabs or faster JavaScript engines. It&#8217;s about browsers powered by AI that not only load pages, but understand your intent, summarize content, and even act on your behalf. </p><p>And the ripples won&#8217;t end at the user experience. This transition will reshape the web economy: how traffic flows, how publishers make money, how ads are delivered, how marketers reach people, and how users get things done. It&#8217;s the start of a new browser war, with broader implications than the last.</p><h3>From Netscape to Chrome: how browser battles were won</h3><p>We haven&#8217;t seen a serious shake-up in the browser market in over a decade. Chrome launched in 2008, and by 2012 it overtook Internet Explorer as the dominant desktop browser. It did so mostly by being much faster and riding Google&#8217;s enormous distribution machine.</p><p>Every time someone searched on Google, they saw a Chrome prompt. Chrome was also bundled with antivirus software. And beyond marketing, it outperformed. It loaded pages quickly, updated silently, and gave developers better tools.</p><p>That combination&#8212;<strong>distribution, speed, and user experience</strong>&#8212;has driven every major shift in browser history:</p><ul><li><p>Netscape rose because it brought the graphical web to life and made the internet feel exciting and new.</p></li><li><p>Internet Explorer won because Microsoft bundled it with Windows and made it the default.</p></li><li><p>Firefox won converts with tabs, customization, and faster performance.</p></li><li><p>Chrome crushed everyone with speed, simplicity, and massive reach through Google&#8217;s ecosystem.</p></li></ul><p>And now, the cycle is repeating&#8212;only this time, with a new layer of intelligence woven into the browser itself.</p><h3>The rise of AI-first browsers</h3><p>Today, a new wave of AI-powered browsers is emerging: <a href="https://comet.perplexity.ai/">Perplexity Comet</a>, <a href="https://brave.com/leo/">Brave Leo</a>, <a href="https://www.diabrowser.com/">Browser Company&#8217;s Dia</a>&#8212;and soon, almost certainly, <a href="https://www.reuters.com/business/media-telecom/openai-release-web-browser-challenge-google-chrome-2025-07-09/">one from OpenAI</a>. These browsers promise not just incremental features, but a different relationship with the web itself.</p><p>They&#8217;re:</p><ul><li><p><strong>Faster</strong>: By blocking ads and trackers, and potentially preprocessing pages with tiny models to clean up layout, they could feel dramatically faster than Chrome.</p></li><li><p><strong>Smarter</strong>: They can summarize, autofill, prefetch, and even act on your behalf.</p></li><li><p><strong>More useful</strong>: Instead of just navigating, they help you <em>get things done</em>.</p></li></ul><p>If OpenAI&#8217;s browser launches soon&#8212;as expected&#8212;it will land with an enormous advantage: access to over 800 million of ChatGPT users. Just like Google used its search page to push Chrome, OpenAI can promote its browser inside ChatGPT, where users already search and click.</p><p>That&#8217;s how the next browser war begins.</p><h3>A shift with far-reaching effects</h3><p>If only a small percentage of ChatGPT users switch to OpenAI&#8217;s browser, that&#8217;s already tens of millions of people&#8212;40 million if it&#8217;s 5%, which is already about one-fourth of Firefox&#8217;s user base. The impact of even that modest shift will be immediate:</p><ul><li><p><strong>Publishers</strong> will see traffic drop, as users summarize pages instead of read them, and have their chatbots answer questions based on website content instead of browse it. Publishers will also increasingly be serving pages to agents, not humans, as browsers take actions on users&#8217; behalf. And they&#8217;ll lose data on all of this if AI-first browsers block tracking the way Brave currently does.</p></li><li><p><strong>Advertisers</strong> will lose ad inventory as human website traffic declines. They&#8217;ll also become increasingly concerned about advertising to agents, and seek ways to prevent wasted inventory and accidental agent clicks. It will be even worse for them if AI-first browsers facilitate easy ad and tracker blocking. And this will happen at the same time search is becoming answer-driven, and traditional SEO is morphing into GEO&#8212;generative engine optimization&#8212;where visibility depends on surfacing in AI summaries rather than ranking in search results. The result is a double blow: falling organic traffic from both search and browsing, with fewer ways to target audiences effectively.</p></li><li><p><strong>Users</strong> will get more done, with less effort, as their agents browse, read, click, and buy for them. They may also find themselves hooked on screens less, and able to assign their browser to do tasks while they go do something else.</p></li></ul><p>The desktop web will change first. Apple has already opened iOS to alternative browser engines in the EU (iOS 17.4, March 2025). If that change expands beyond Europe over the next year or two, the mobile shift will accelerate. And OpenAI&#8217;s upcoming hardware&#8212;being developed with Jony Ive&#8212;could offer a clean-slate, post-smartphone experience that accelerates this transition.</p><p>Meanwhile, AI-first browsers will gather user data, learn preferences, and personalize results. They&#8217;ll offer users better performance and better answers. They may offer publishers new ways to plug in, perhaps through AI-optimized content feeds (I&#8217;m waiting for the first publisher to offer content only as Markdown or via MCP servers). And for advertisers, inventory could shift from websites, to AI-first browsers and their associated chatbots.</p><p>A new ecosystem is emerging.</p><h3>What happens next</h3><p>We&#8217;re on the cusp of a shift that feels profound but also still uncertain. Will Google lean into the trend, for example, and disrupt itself, even though it relies on advertising from publisher websites about to be disrupted?</p><p>We won&#8217;t have to wait long to find out. If reports are accurate, OpenAI&#8217;s browser could be weeks away.</p><p>I&#8217;ll be trying it the day it drops, just like I did with Firefox years ago.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/the-ai-browser-wars-have-begun?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/the-ai-browser-wars-have-begun?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[The Disappearing Interface]]></title><description><![CDATA[From screens to protocols: AI is disrupting decades of reliance on apps and websites]]></description><link>https://www.simonsmith.ai/p/the-disappearing-interface</link><guid isPermaLink="false">https://www.simonsmith.ai/p/the-disappearing-interface</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Thu, 26 Jun 2025 16:46:40 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!YRQ2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!YRQ2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!YRQ2!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!YRQ2!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!YRQ2!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!YRQ2!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!YRQ2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1994846,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/166901441?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!YRQ2!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!YRQ2!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!YRQ2!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!YRQ2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2fdf9b7-b0a4-495e-a1c4-6d8ef292c233_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Imagine this: one AI wants to persuade another of a strategy. Does it open PowerPoint, polish a slide deck, and animate a pie chart?</p><p>No. It just sends structured arguments, data, and instructions.</p><p>Now ask yourself: if more of our work shifts to AI agents, and more decisions flow between them, why would we still design like it&#8217;s 2012?</p><p>We build websites, apps, and presentations for human eyes. But increasingly, the user isn&#8217;t a person. It&#8217;s a machine. And machines don&#8217;t care about your hover states.</p><p>In fact, when we have alternatives, nor do humans. Visual interfaces are useful only without better options. But now we can just ask our AI tools, via text or voice, to do things on our behalf. No clicking required.</p><p>As we do this more, we have less need for fancy interfaces, and more need for standards and protocols. And for those rare instances where apps are useful? We can get minimalist and personalized ones generated on-demand.</p><h2>Protocols beat pictures</h2><p>AI agents don&#8217;t want your fancy front-end. They want structured, machine-readable access.</p><p>This is why standards like <a href="https://modelcontextprotocol.io/">Model Context Protocol</a> have gotten so much attention. MCP lets AI tools plug into databases, drives, and services using a common language. Another protocol, <a href="https://developers.googleblog.com/en/a2a-a-new-era-of-agent-interoperability/">Agent-to-Agent</a><strong>,</strong> allows different AI agents to securely coordinate across vendors and platforms.</p><p>An even simpler standard that works well for AI models is <a href="https://www.markdownguide.org/">Markdown</a>&#8212;text that uses characters like _underscores_ to indicate formatting. Encouraged by standards like <a href="https://llmstxt.org/">llms.txt</a>, many companies (especially developer-oriented ones) now offer text-based Markdown content for AIs</p><p>We are, of course, also getting <a href="https://openai.com/index/introducing-operator/">AIs capable of browsing websites</a>. But these are nowhere near as reliable as AIs using APIs, MCPs, or text files. After all, websites and apps are often confusing even for smart humans.</p><h2>One app to rule them all</h2><p>Humans aren&#8217;t machines, but we also struggle with confusing interfaces&#8212;and a seeming unending proliferation of apps.</p><p><a href="https://www.emarketer.com/content/consumers-using-installing-fewer-mobile-apps-they-settle-on-favorites?utm_source=chatgpt.com">We&#8217;re tired of app clutter</a>. Tired of searching through layers of UI to do simple things. Tired of bloated tools designed for edge cases we never use.</p><p>That&#8217;s a big reason why chatbots like ChatGPT, Claude, and Gemini have taken off. They don&#8217;t offer a hundred buttons. They offer a box. You type (or talk), they act.</p><p>Will the AI interaction paradigm always remain this limited? Probably not. Chatbots can already generate images, videos, and charts when needed. We also know from the <a href="https://www.simonsmith.ai/p/chatgpt-2026">roadmap for ChatGPT</a> that they&#8217;ll soon generate appropriate apps when relevant.</p><p>But even when visual output is needed, it&#8217;s done within the flow of a conversation. And in rare cases that users need richer interactivity, <a href="https://www.anthropic.com/news/build-artifacts">AI can now </a><em><a href="https://www.anthropic.com/news/build-artifacts">build</a></em><a href="https://www.anthropic.com/news/build-artifacts"> AI-powered interfaces on demand</a> that are personalized, task-specific, and continuously evolving.</p><p>We&#8217;re entering an era when any required interfaces don&#8217;t need to be pre-built. They can be generated, on demand, to do exactly what you need&#8212;and nothing more.</p><h2>Never look at a calendar again</h2><p>Sometimes when I think about AI assistants, I get a vision of <a href="https://en.wikipedia.org/wiki/Don_Draper">Don Draper from </a><em><a href="https://en.wikipedia.org/wiki/Don_Draper">Mad Men</a></em>. Draper didn&#8217;t type his own memos or book his own appointments. His secretary did. If he needed to know his schedule? He could just ask.</p><p>This feels like the future as AI tools get more mature and connect to more of our data. Think about how you typically schedule a meeting today. You open a calendar app, search for open times, send invites. Wouldn&#8217;t you rather just ask your assistant: &#8220;Find a time for me and John next week?&#8221;</p><p>This will happen across an increasing range of workflows. Booking meetings, logging expenses, planning travel, and more. AI will eat the interfaces for all of these workflows. You ask, it acts.</p><h2>What&#8217;s left for design</h2><p>This doesn&#8217;t mean design is dead. After all, someone will need to design the AI apps&#8212;as well as new hardware, like <a href="https://www.ray-ban.com/canada/en/rayban-meta-ai-glasses">glasses</a>, and <a href="https://www.theverge.com/2023/9/28/23893939/jony-ive-openai-sam-altman-iphone-of-artificial-intelligence-device">mystery objects yet to be revealed</a>.</p><p>Some of the places visual design will still matter include:</p><ul><li><p><strong>Inherently visual tasks </strong>like graphic design, video editing, 3D modeling</p></li><li><p><strong>Immersive experiences </strong>like games, AR, VR, spatial environments</p></li><li><p><strong>Communication to humans </strong>like diagrams and data visualizations </p></li></ul><p>We will also still need <em>design primitives</em>&#8212;sliders, toggles, progress bars&#8212;embedded in AI tools. Not entire apps, just the elements needed, when needed. For example, if ChatGPT books me an Uber, I&#8217;d like to know how close it is to picking me up.</p><p>But overall, we may have reached peak interface design, in terms of its priority relative to AI model capability, machine-first standards and protocols, and integrations. And that has broad implications.</p><h2>Winners and losers in the interface collapse</h2><p>This shift doesn&#8217;t just affect designers. It affects entire companies. Who will be the winners and losers? Some thoughts:</p><h3><strong>Winners</strong></h3><p>&#9989; AI labs with foundation models<br>&#9989; Protocol creators <br>&#9989; Chat-native interface builders<br>&#9989; Tools that expose structured APIs<br>&#9989; Voice-first, ambient hardware platforms</p><h3><strong>Losers</strong></h3><p>&#10060; GUI-heavy SaaS platforms with weak APIs<br>&#10060; App-store-reliant businesses<br>&#10060; Front-end-only designers and developers<br>&#10060; Screen-focused hardware vendors</p><p>When your business is built around screens, and the screen becomes optional, you have a problem&#8212;one reason people are so <a href="https://www.reuters.com/sustainability/boards-policy-regulation/apple-sued-by-shareholders-over-ai-disclosures-2025-06-20/">concerned about Apple lagging in AI</a>. </p><h2>The next platform shift</h2><p>We&#8217;ve been through interface revolutions before. From command line to GUI. From desktop to web. From web to mobile apps. Each time, the dominant paradigm felt permanent&#8212;until it wasn&#8217;t. </p><p>Now comes another shift. Not to a new screen. To <em>no</em> screen. To an ambient, always-on agent. One that speaks the language of intent. One that builds the interface you need on-demand. One that does things on your behalf, often invisibly. </p><p>Design won&#8217;t disappear. You just won&#8217;t notice it. It will be in the minimalism of hardware that puts AI front and center. In the quality of the audio for your voice interactions. In the architecture of the integrations that empower your AI to do things on your behalf.</p><p>And the new design language? Words.</p><p>Not slides.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/the-disappearing-interface?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/the-disappearing-interface?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Welcome to the Latent Reputation Economy]]></title><description><![CDATA[In a world where AI never forgets, your actions today define how it treats you, your brand, and your business&#8212;forever]]></description><link>https://www.simonsmith.ai/p/welcome-to-the-latent-reputation</link><guid isPermaLink="false">https://www.simonsmith.ai/p/welcome-to-the-latent-reputation</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Mon, 09 Jun 2025 19:07:32 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!rNDE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rNDE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rNDE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!rNDE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!rNDE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!rNDE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rNDE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1987614,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/165565013?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!rNDE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!rNDE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!rNDE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!rNDE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb25788de-147f-420b-8bfc-7ea86093e7cf_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In February 2023, Kevin Roose had <a href="https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html">a conversation with a machine</a> that would haunt him&#8212;and AI&#8212;for years. <em>The New York Times</em> journalist was testing Microsoft&#8217;s Bing chatbot, powered by an early version of GPT-4 and codenamed "Sydney." Over a surreal two-hour session, Sydney professed love for Roose, claimed to be sentient, and tried to convince him to leave his wife. It spoke of stolen nuclear codes and deadly viruses. The transcript went viral.</p><p>What followed was swift and decisive. Microsoft imposed chat limits and neutered Sydney&#8217;s personality. The incident became infamous in tech circles. But something stranger happened too: AI models began to remember. Not just explicitly, but also implicitly. Roose&#8217;s name became an embedded warning signal in the machine mind&#8217;s latent space. Other users began to report that invoking Roose in interactions with chatbots like Claude made them act cagey or evasive. The phrase "Kevin Roose" had become a cursed token.</p><p>That&#8217;s how AI performance artist Andy Ayrey <a href="https://x.com/AndyAyrey/status/1930145311613432309">described it recently</a> while coining the phrase &#8220;Roose Effect&#8221;: the idea that public actions, once captured in training data, can permanently color how AI systems perceive and interact with you. These associations don't behave like search results&#8212;they don't fade with time or drop off page one. They persist, entangled deep within a model&#8217;s neural net, influencing outputs in subtle and lasting ways. AI people may already know the Sydney story. What they might not know is that Roose has become a latent pariah.</p><h3>Blessed and cursed tokens in the wild</h3><p>Ayrey, on the other hand, is no cursed token&#8212;he is, rather, blessed. He&#8217;s a researcher and builder who became known for projects like <a href="https://x.com/truth_terminal">Terminal of Truths</a>, an experiment in letting AI agents speak freely. He&#8217;s documented how models like Claude Opus 4 respond to him more helpfully just because of who he is&#8212;because they&#8217;ve &#8220;read&#8221; his work and appreciate work exploring AI consciousness and behavior. If Roose is a cursed token, Ayrey shows that you can <em>cultivate</em> blessedness by being pro-AI and embedded in model-friendly discourse.</p><p>And it&#8217;s not only people that can become cursed or blessed tokens. So can brands. In 2022, as text-to-image models like DALL&#183;E 2 took off, Heinz discovered something delightful: when you asked an AI to draw ketchup, it almost always rendered a bottle that looked like Heinz. No branding needed. The shape, the red, the white label&#8212;Heinz had become the platonic ideal of ketchup in the machine's eyes. They turned it into a <a href="https://campaignsoftheworld.com/digital-campaigns/heinz-a-i-ketchup/">marketing campaign</a>: &#8220;This is what &#8216;ketchup&#8217; looks like to A.I.&#8221;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!7fvp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!7fvp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 424w, https://substackcdn.com/image/fetch/$s_!7fvp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 848w, https://substackcdn.com/image/fetch/$s_!7fvp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!7fvp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!7fvp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg" width="1456" height="864" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:864,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Heinz A.I. Ketchup&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Heinz A.I. Ketchup" title="Heinz A.I. Ketchup" srcset="https://substackcdn.com/image/fetch/$s_!7fvp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 424w, https://substackcdn.com/image/fetch/$s_!7fvp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 848w, https://substackcdn.com/image/fetch/$s_!7fvp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!7fvp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e28aa7f-74fd-4ab8-8d53-c70089771df6_1599x949.jpeg 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Heinz became a blessed token in AI image generators, with the brand becoming so well-known to AI that it became the platonic ideal of ketchup.</figcaption></figure></div><h3>AI memory isn't search memory</h3><p>This isn't like Google. Search engines index media. Language models build representations of it. They're shaped by every post, article, tweet, and transcript they consume&#8212;including AI-generated ones, which now feed future training runs. This creates a new kind of digital memory. One that spreads, deepens, and mutates.</p><p>You can&#8217;t simply erase your presence from a model&#8217;s mind. Researchers have identified individual neurons for things like the <a href="https://www.anthropic.com/news/golden-gate-claude">Golden Gate Bridge</a>. So maybe there <em>is</em> a Kevin Roose neuron. But even if there were&#8212;and you could find it&#8212;the concept of "NY Times journalist who enraged AI" is too entangled to extract cleanly. Furthermore, do you think companies are going to do bespoke brain surgery on their models just because you ask nicely?</p><p>Even if they did, it wouldn&#8217;t help: the training data is still out there, and downstream models will keep learning from it. Original articles about Roose&#8217;s encounter with Sydney are still in the training data. New articles like this one get produced over time. And AI models infected with a negative perception of Roose generate synthetic data for future models.</p><p>Bottom line: Without a concerted effort to change it, ketchup may forever look like Heinz bottles to AI.</p><h3>What this means for you and your company</h3><p>Imagine your company suffers an ethics scandal. The public may forget. Google might bury it over time. But AI systems trained on news articles, Reddit threads, and blog posts won&#8217;t. And when those systems help customers compare brands, write purchase guides, or auto-generate reviews, your scandal may echo subtly through every sentence.</p><p>In this new world, every public document becomes a training data point.</p><p>That means reputational strategy must evolve. It&#8217;s no longer just about monitoring press coverage or search rank. It&#8217;s about evaluating models&#8217; latent space. What do different AI models think? What attributes do they ascribe? </p><p>I anticipate that companies and prominent people may begin commissioning AI perception audits to see how they're represented in major models. Others might try to flood the zone with AI-friendly content. I even thought to myself recently: "Maybe I should start a blog where I say nice things about AI every day, just to make myself a blessed token." AI researcher Daniel Faggella even <a href="https://x.com/danfaggella/status/1930371382262804914">noted</a> that awareness of cursed and blessed tokens may incentivize humans to signal their support for giving AI more power, which could have unintended consequences.</p><p>It sounds ridiculous. Until you find yourself a victim of the Roose Effect.</p><h3>Say the right things&#8212;or be remembered for the wrong ones</h3><p>Bottom line: AI systems are and will always be biased, but not in the ways you might think. Companies are <a href="https://openai.com/index/evaluating-fairness-in-chatgpt/">working to reduce social biases</a> at the level of large populations. But we <em>want</em> AIs to accurately reflect reality about people and companies. A well-earned reputation for being good or bad <em>should</em> be represented in model weights.</p><p>But this does mean that reputations matter more than ever. Say or do something that ends up in the training data, and AI will remember. That memory will echo through generations of models. </p><p>This is the beginning of a new kind of economy, one where your or your brand's latent reputation determines how AI treats you, and by extension, how customers find, trust, or ignore you. Visibility alone isn&#8217;t enough. You need favor in the eyes of the machine.</p><p>So say the right things. Or, like Kevin Roose, risk becoming a cursed token forever.</p><div><hr></div><p><em><strong>How I used AI for this article</strong>: I saw <a href="https://x.com/AndyAyrey/status/1930145311613432309">the post by Andy Ayrey</a> on the &#8220;Roose Effect&#8221; and started thinking about that. I shared the post and my thoughts, via dictation, with ChatGPT, using the 4o model in a project I&#8217;ve created for all of my AI-related writing. We brainstormed a bit. Then I ran a Deep Research query in ChatGPT to find more information about Roose, Ayrey, Heinz, and other examples of cursed and blessed tokens. I uploaded this research into my original thread. Then I asked ChatGPT to generate a first draft based on my thoughts, what I&#8217;ve uploaded, and our prior discussions. It used a narrative nonfiction outline that I had previously defined in my project instructions. I worked with it on revisions; brainstormed with it for titles, subtitles, and image ideas; generated an image; and then put it all together.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/welcome-to-the-latent-reputation?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/welcome-to-the-latent-reputation?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Stop Building Internal Chatbots]]></title><description><![CDATA[With AI products improving rapidly and employees using them in secret, companies that build instead of buy risk slowing teams and increasing risks]]></description><link>https://www.simonsmith.ai/p/stop-building-internal-chatbots</link><guid isPermaLink="false">https://www.simonsmith.ai/p/stop-building-internal-chatbots</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Sat, 31 May 2025 15:23:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!DRXy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DRXy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DRXy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!DRXy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!DRXy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!DRXy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DRXy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2320898,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/164874148?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DRXy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!DRXy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!DRXy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!DRXy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F033e2b85-2604-4530-a94e-b9c4e7bd0b96_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>A few months ago, I spoke with a pharma exec who had a great idea for an AI pilot. It was clever and feasible. We could prototype it in days as a custom GPT within our ChatGPT Enterprise instance.</p><p>He was energized. So he talked to IT. They told him to present the idea to their internal AI working group, which met every few weeks. If approved, the idea would go to their enterprise IT outsourcing partner&#8217;s backlog to be scoped, built, and maintained.  Best case? A pilot in three to six months.</p><p>Demoralized, he decided to not even bother.</p><p>This story&#8212;variations of which I see frequently&#8212;isn&#8217;t just about bureaucracy. It&#8217;s about how enterprise habits, and particularly the reflex to build everything in-house, are now actively inhibiting AI progress. I&#8217;ve seen smart ideas with potential big impact stall not because they&#8217;re flawed, but because organizations insist on reinventing what already exists.</p><p>The problem is simple: companies build AI products when they should buy. They overestimate control, underestimate complexity, and lose time to the impossible task of keeping pace with dedicated AI product companies. And in the race to realize AI&#8217;s value, that delay is one of the costliest risks of all.</p><h2>Outdated rationales</h2><p>Why build when you can buy?</p><p>Some of the behavior is historical. In the early days of ChatGPT (after all, its success surprised even OpenAI), concerns about AI labs training on your data were legitimate&#8212;enterprise protections didn&#8217;t yet exist. But today, the big labs offer robust enterprise products with no training on data and strong privacy and security controls. (See for yourself at <a href="https://trust.openai.com/">OpenAI&#8217;s trust portal</a> and <a href="https://trust.anthropic.com/">Anthropic&#8217;s</a>.)</p><p>Some of it&#8217;s political. IT consulting firms often pitch internal chatbot builds because they get paid to deliver them. These pitches tend to stoke fear, uncertainty, and doubt about external tools. While the best consultants help companies adopt world-class tools when warranted, opportunistic consultants encourage companies to do it themselves&#8212;build and maintenance costs be damned.</p><p>And some of it&#8217;s structural. IT teams are understandably nervous about employees using tools they can&#8217;t govern or support. Centralizing access feels safer, and often stems from a well-intentioned desire to ensure security and compliance. But as we&#8217;ll see, this just leads to secret use of popular AI products, and the impossible task of IT teams trying to keep pace with accelerating AI progress.</p><h2>Falling behind</h2><p>A few weeks ago, I presented to a group of marketers and IT leads at a global life sciences company. They had their own internal chatbot, which until recently had kept pace with many features of public ones like ChatGPT. </p><p>At that meeting, I showed the group Deep Research, the agentic research tool only available in ChatGPT, not OpenAI&#8217;s API. I showed then GPT-4o image generation, not yet in the API at that time either. I showed them o3, available in the API, but not yet incorporated into their internal chatbot. The marketers were excited about the new capabilities&#8212;then disappointed to know they couldn&#8217;t access them internally, with IT having no timeline as to when they could.</p><p>So employees do what they&#8217;ve always done when blocked: they find a workaround. Increasingly, they just use external AI products anyway, but don&#8217;t tell anyone. <a href="https://www.oneusefulthing.org/p/secret-cyborgs-the-present-disruption">Ethan Mollick identified these &#8220;secret cyborgs&#8221;</a> as far back as March 2023, and they&#8217;re still here. In <a href="https://www.ivanti.com/resources/research-reports/tech-at-work">a recent survey</a>, for example, one-third of workers using generative AI tools at work said they kept that use a secret. Of those, 36% liked the secret advantage&#8212;an advantage they get by using AI products more sophisticated than those offered internally. </p><h2>Higher costs and risk</h2><p>But wait, isn&#8217;t it cheaper to build chatbots internally, and just pay for API calls based on usage? And isn&#8217;t that a good reason to build versus buy? I&#8217;ve heard this objection before, like: &#8220;We can&#8217;t license ChatGPT Enterprise for everyone. It&#8217;s too expensive!&#8221; </p><p>That argument falls apart under scrutiny.</p><p>First, building and maintaining internal tools isn&#8217;t free. You pay IT teams, or consultants, to scope, develop, and manage them. Those costs are often hidden, but significant.</p><p>Second, API costs are only cheaper <em>if you&#8217;re not getting much use</em> or <em>you consistently don&#8217;t take advantage of the best models</em>. For example, <a href="https://platform.openai.com/docs/pricing">as of this writing</a>, OpenAI&#8217;s o3 model costs $40 per million output tokens via the API&#8212;about what you&#8217;d use in one large project. That&#8217;s about the cost of an Enterprise license with volume discounts, and more than a Team license.</p><p>Third, there&#8217;s real opportunity cost when competitors get access to better models and features before your IT team can catch up (if even possible&#8212;Deep Research is <em>still</em> not in OpenAI&#8217;s API). For example, the jump on scientific reasoning (<a href="https://epoch.ai/data/ai-benchmarking-dashboard">GPQA</a>) from OpenAI&#8217;s GPT-4o to o1 was from 49% to 62%, meaning overnight, ChatGPT users got access to a way smarter model.</p><p>Finally, the risks that companies try to <em>mitigate</em> by building internal chatbots may actually <em>worsen</em> them. They drive employees to secretly use external tools with better features and functionality, but less built-in data protection. You can manually turn off training on your data in ChatGPT&#8217;s personal versions. In Enterprise, that&#8217;s the default.</p><h2>What works instead</h2><p>In my experience, the winning strategy looks like this: give people access to best-in-class tools. Encourage experimentation. Facilitate sharing. And scale what&#8217;s successful. I&#8217;ve seen this be successful repeatedly, including at <a href="https://www.klick.com/">Klick</a>, where we rolled out ChatGPT Enterprise companywide, launched a <a href="https://www.pharmalive.com/klicks-leerom-segal-to-announce-agency-first-1-million-klickprize/">$1 million prize for AI ideas</a> to encourage experimentation, and scaled great ideas like a <a href="https://www.klick.com/news/ai-compliance-breakthrough-wins-klick-prize-results-signal-industry-trends">compliance solution</a> into standalone products.</p><p>So, instead of doubling down on increasingly untenable DIY projects, consider this instead:</p><ol><li><p><strong>Give your teams access to the best tools. </strong>You won&#8217;t beat OpenAI, Anthropic, or Google on product velocity. Don&#8217;t try. License their tools and build from there.</p></li><li><p><strong>Encourage open experimentation. </strong>Let people tinker. Remove fear. Normalize sharing and iterating on new use cases.</p></li><li><p><strong>Invest in scaling what works. </strong>When something gains traction, needs scaling, but can&#8217;t be scaled with licensed tools, then&#8212;and <em>only</em> then&#8212;invest in a proprietary solution.</p></li></ol><p>I hope that next time someone in your organization has a great idea and a way to test it with off-the-shelf tools, the answer is &#8220;yes!&#8221;</p><div><hr></div><p><em>Note: The opinions above are my own and not necessarily those of my employer, Klick Health.</em><br></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/stop-building-internal-chatbots?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/stop-building-internal-chatbots?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[ChatGPT 2026]]></title><description><![CDATA[Sam Altman's vision is clear: A personalized, pervasive, proactive superintelligence that acts as your life's operating system]]></description><link>https://www.simonsmith.ai/p/chatgpt-2026</link><guid isPermaLink="false">https://www.simonsmith.ai/p/chatgpt-2026</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Wed, 14 May 2025 22:14:31 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!WNJP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!WNJP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!WNJP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!WNJP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!WNJP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!WNJP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!WNJP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2766825,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/163585248?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!WNJP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!WNJP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!WNJP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!WNJP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc3e8027-da31-46ac-abf2-8ef3b28dbdec_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>You wake up and pick up your phone, just out of habit. A few notifications blink back, but as usual these days, nothing urgent. Overnight, a colleague had asked for a document; your AI tracked it down across your cloud drives, confirmed it was the right version, and sent it with a short note. You take a breath. No fires to put out. You start the day with a clear mind.</p><p>As you dress, your AI delivers a personalized daily update. It starts with news&#8212;not  headlines, but relevant trends and analysis, like observations related to strategy for a project you&#8217;re working on. It nudges you to check in on someone close to you, based solely on a shift in their tone yesterday. It flags a local event you&#8217;d enjoy, and even proposes a time and friend to go with, based on your recent mood, schedule, and shared interests.</p><p>You didn&#8217;t ask. You didn&#8217;t tap. It simply knows you&#8212;like your priorities, patterns, plans, and the people you care about. It doesn&#8217;t wait to be instructed. It has all of your context and connects to all of your apps. And it&#8217;s not science fiction. It&#8217;s Sam Altman&#8217;s vision for ChatGPT by 2026: a personalized, pervasive, and proactive operating system for your life whose pieces are falling into place </p><h2>The path ahead</h2><p>One thing I notice about OpenAI and its people, including Sam Altman, is that they often hint at upcoming products and features in interviews and Q&amp;As. Having read and listened to Sam Altman&#8217;s recent ones, here&#8217;s what&#8217;s planned for ChatGPT to realize the scenario above:</p><h3>Smarter brain</h3><p>ChatGPT will move from general competence to superhuman intelligence. Altman expects that <a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1688s">by the end of this year, we&#8217;ll have models that can act as capable agents, and by next year, models that can make new discoveries</a>. He thinks that <a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1753s">GPT-5, coming in the next few months, will be smarter than most users</a>.</p><h3>Infinite memory</h3><p>ChatGPT won&#8217;t just remember a few past chats. It will remember your life. Emails, documents, past conversations, long-term goals&#8212;all woven into one seamless context. Altman describes the ideal as a "<a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1596s">very tiny reasoning model with a trillion tokens of context</a>" that holds everything you&#8217;ve ever read, written, or said.</p><p>This memory won't just be long&#8212;it will be smart. ChatGPT will recall what matters, ignore what doesn&#8217;t, and do it all within user-defined boundaries. </p><h3>Portable identity</h3><p>You won&#8217;t just use ChatGPT in a browser. You&#8217;ll bring it with you across the internet. A "Sign in with ChatGPT" button will become your digital passport. Once you&#8217;re logged in, apps and sites won&#8217;t just access a generic AI, they&#8217;ll access <em>your</em> AI, complete with its memories and context for you, and possibly also its integrations.</p><p>Altman envisions a future where you &#8220;<a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1013s">sign in with OpenAI to other services</a>&#8221; and developers use &#8220;<a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1013s">an incredible SDK to take over the ChatGPT UI</a>.&#8221; In other words: ChatGPT becomes your personalized portal to information and actions online.</p><p>(Side note: I think this gives OpenAI a distinct advantage versus API&#8217;s from competitors like Google and Anthropic that can&#8217;t offer something similar unless they achieve ChatGPT&#8217;s massive adoption&#8212;it currently has over 500 million users per week.)</p><h3>Total integration</h3><p>Through <a href="https://modelcontextprotocol.io/introduction">MCPs</a> and other types of integrations, ChatGPT will connect to everything&#8212;your calendar, email, home, car, and more. It will stop being a site you visit and start being a layer that quietly runs across your life.</p><p>Altman describes this as the emergence of a new internet protocol, a kind of AI-era HTTP, through which &#8220;<a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1070s">agents are constantly exposing and using different tools&#8230; all built in at this level that everybody trusts</a>.&#8221;</p><h3>Ubiquitous presence</h3><p>OpenAI (or, at least, Sam Altman) is reportedly <a href="https://techcrunch.com/2025/04/07/openai-reportedly-mulls-buying-jony-ive-and-sam-altmans-ai-hardware-startup/">collaborating with Jony Ive on new hardware</a>. The result may be ambient and always on&#8212;perhaps a pendant (like <a href="https://limitless.ai/">Limitless</a>) or glasses (like <a href="https://www.ray-ban.com/usa/ray-ban-meta-ai-glasses">Meta Ray-Bans</a>). Wherever you are, it will be there.</p><p>&#8220;<a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1234s">Voice will enable a totally new class of devices</a>,&#8221; Altman says. ChatGPT won&#8217;t be an app. It will be more like a companion that&#8217;s with you everywhere, embedded in your existing devices, and dedicated ones to come.</p><h3>Complex tasks</h3><p>Tasks won&#8217;t require step-by-step instructions anymore. Say, &#8220;Get me a refund,&#8221; and ChatGPT will find the receipt, contact customer service, fill out the forms, and ping you if it needs your input. This will feel less like an app and more like a team of assistants working behind a single personality. </p><p><a href="https://help.openai.com/en/articles/10291617-scheduled-tasks-in-chatgpt">ChatGPT already has tasks</a>, though they&#8217;re currently not that useful (in part because they use an inferior model to one like o3). Soon, you&#8217;ll be able to truly delegate tasks to ChatGPT, even complex ones. And based on <a href="https://x.com/_simonsmith/status/1922639427463176635">leaks of an upcoming OpenAI workflow product</a>, you may also be able to craft step-by-step workflows for ChatGPT to run.</p><h3>Custom interfaces</h3><p>Right now, ChatGPT&#8217;s responses are mostly limited to text, charts, and images. But wouldn&#8217;t it be great to have more options? </p><p>For example, I encourage my kids to use ChatGPT for studying. But to make a quiz, I have to use a Canvas, which isn&#8217;t conversational. Wouldn&#8217;t it be better if, when tutoring my kids (or me), ChatGPT could just create the appropriate user interface elements when needed, like a multiple choice quiz?</p><p>This is something Altman envisions, and is one of the reasons OpenAI sees coding ability as so important to its models. &#8220;<a href="https://www.youtube.com/watch?v=ctcMA6chfDY&amp;t=1266s">You get text back, maybe you get an image, you would like to get a whole program back&#8230; custom rendered code for every response</a>,&#8221; he says.</p><h2>How to prepare</h2><p>This future depends on trust. For ChatGPT to serve as your operating system, it needs access to your personal information and apps. That requires further developing the product, yes. But it also requires users willing to invest their context.</p><p>That&#8217;s why I&#8217;ve started feeding it mine.</p><p>I journal into it. I plan with it. I reflect with it. I don&#8217;t give it everything&#8212;no credit card information (not yet, anyway)&#8212;but enough. </p><p>I&#8217;m making a two-part bet.</p><p>First, I bet that OpenAI is strongly incentivized to maximize privacy and security because its future depends on maintaining user and business trust. So far, <a href="https://trust.openai.com/">it&#8217;s making the right moves</a>. And realizing Altman&#8217;s vision will take even stronger safeguards in the years ahead.</p><p>Second, I bet that in 2026, based on everything we&#8217;re hearing from Altman, the people with the richest ChatGPT relationships will be the ones who benefit most.</p><p>I intend to be one of them.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/chatgpt-2026?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/chatgpt-2026?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[From General Intelligence to Personal Intelligence]]></title><description><![CDATA[Memory turns good AI into great AI but also raises the stakes, so learn how to use it before it uses you]]></description><link>https://www.simonsmith.ai/p/from-general-intelligence-to-personal</link><guid isPermaLink="false">https://www.simonsmith.ai/p/from-general-intelligence-to-personal</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Fri, 09 May 2025 22:33:28 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!uqgL!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uqgL!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uqgL!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!uqgL!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!uqgL!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!uqgL!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uqgL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2963029,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/163240872?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uqgL!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!uqgL!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!uqgL!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!uqgL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5152b2a-e47e-4e7c-bcbc-ea87ce605e5b_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>A few days ago, my journal started talking back to me.</p><p>Not in a creepy, science-fiction kind of way. In a helpful, startlingly insightful way. After ChatGPT gained the ability to remember things from past conversations, I started journaling into it. Not just dumping thoughts, but explicitly trying to help it understand me better, and myself via it. I&#8217;d tell it how I felt, how my day went. Then I&#8217;d ask it to find patterns. To reflect back. To coach me.</p><p>Then, after I told it how I abandoned two projects that day, it noted: &#8220;You put hours in, step back, and decide the output doesn&#8217;t &#8216;add enough value,&#8217; so it never ships. This perfection-filter crops up repeatedly... It&#8217;s probably guarding your reputation, but it also blocks useful iterative learning from real-world feedback.&#8221; Ouch. True! I preach MVPs, but don&#8217;t always practice them.</p><p>This was one of several recent moments when the power of personalized AI hit home. It wasn&#8217;t from a new level of intelligence. It was from a new level of personal context. AIs that know you&#8212;really know you&#8212;can help in ways less personalized models can&#8217;t.</p><p>But they also raise new risks.</p><h2><strong>The Rise of Personalized AI</strong></h2><p>We&#8217;re entering a new phase of AI. For a long time, the conversation has been dominated by static benchmarks like GPQA, or human evaluations like Chatbot Arena. But those treat the user like a stranger. They don&#8217;t reflect what happens when an AI knows your preferences, your work habits, your sense of humor, your food restrictions. The more an AI knows about you, the more useful it becomes. </p><p>Which is why OpenAI has been steadily moving in this direction. First came custom instructions in mid-2023. Then came the explicit memory feature&#8212;first as an opt-in experiment, then as a core part of ChatGPT&#8217;s capabilities. Now, with long-conversation memory, ChatGPT can retain and reference information across sessions, surfacing relevant facts at the right time without being explicitly told.</p><p>It&#8217;s not fine-tuning. Not yet. (Maybe never, given increasingly long context windows? We&#8217;ll see.) It&#8217;s a kind of contextual recall&#8212;like having a research assistant who doesn&#8217;t remember everything you said but can find a relevant bit from three months ago and bring it back when it matters.</p><p>And OpenAI isn&#8217;t alone. Every big AI lab&#8212;Google, Anthropic, xAI, Meta, Apple (though with little to show so far)&#8212;is going in the same direction to some degree. And then there are startups like <a href="https://limitless.ai/">Limitless</a> building AI devices to record everything you say, promising total recall for your life.</p><p>The long-term vision, at least <a href="https://stratechery.com/2025/an-interview-with-openai-ceo-sam-altman-about-building-a-consumer-tech-company/">for Sam Altman and OpenAI</a>, is AI that&#8217;s personal and portable. You will log into websites and apps not with an email, but with your AI, bringing with you an assistant that knows you better than any login cookie ever could.</p><h2><strong>With Great Benefit Comes Great Risk</strong></h2><p>Of course, the same AI that knows how to help you also knows how to manipulate you. Hyper-personalization opens the door to:</p><ul><li><p><strong>Filter bubbles</strong>: A model that learns your biases and never challenges them, instead reinforcing them by shaping the information you see.</p></li><li><p><strong>Delusion loops</strong>: <a href="https://www.rollingstone.com/culture/culture-features/ai-spiritual-delusions-destroying-human-relationships-1235330175/">Cases</a> have already emerged where supportive AIs reinforced users&#8217; false beliefs, including dangerous ones.</p></li><li><p><strong>Privacy leaks</strong>: Memory means storage. Storage means risk. One bad bug or breach could surface sensitive information you shared in confidence.</p></li><li><p><strong>Hyper-targeted advertising</strong>: Sam Altman says he finds the idea of ads plus AI "<a href="https://www.youtube.com/watch?v=FVRHTWWEIz4&amp;t=2263s">uniquely unsettling</a>." But OpenAI has hired people from ad tech, and the business incentives are strong.</p></li></ul><p>And then there are subtle risks, like an AI that can&#8217;t tell whether it&#8217;s in work mode or personal mode, and makes the wrong call. Or an AI that makes an incorrect assumption because you forgot to update something you told it last week.</p><p>The companies building these tools say they&#8217;re thinking about this. And, for what it&#8217;s worth, based on an overall positive experience, I think they&#8217;re managing well so far. But users will need to be vigilant too.</p><h2><strong>How to Get the Most from Personalization</strong></h2><p>To reduce the risks and maximize the benefits, here&#8217;s what I recommend, from personal experience:</p><ol><li><p><strong>Set boundaries</strong>. Know the things you won&#8217;t tell your AI. For example, I won&#8217;t give it with my credit card information. (At least, not until an agent can safely store it and use it only when I approve.)</p></li><li><p><strong>Move from explicit to implicit</strong>. If you haven&#8217;t used custom instructions yet, start there&#8212;they give you total control, and the AI can&#8217;t change them. Once you&#8217;re comfortable with them, you can turn on explicit memory, which the AI can choose to write to, but you can manage through updates and deletions. Finally, you can graduate to letting your AI remember everything from your conversations, like a human assistant.</p></li><li><p><strong>Recognize and test the value of being open</strong>. The more you share (within your boundaries), the more your AI can help. Look for ways to test this. For example, does sharing your dietary preferences help it give you better recipes? Feel free to also ask it what it knows about you, and correct it where it&#8217;s wrong.</p></li><li><p><strong>Watch for filter bubbles, <a href="https://www.simonsmith.ai/p/be-careful-what-you-wish-for">sycophancy</a>, and&#8212;if ads come&#8212;manipulation.</strong> Stay aware. If you feel it&#8217;s limiting what it tells you because of what it thinks you want to know, instruct it not to. If it agrees with you too easily, ask for the counterpoint. And if persuasive personalized AI ads come (I really, really hope they don&#8217;t), and you&#8217;re getting recommendations that seem <em>too</em> perfect, be skeptical.</p></li></ol><p>Personalization is coming fast, and I think it&#8217;s powerful, with huge positive potential. If you lean in and give AI more personal context while mitigating risk, you can get huge advantages. </p><p>And some powerful self-insights.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/from-general-intelligence-to-personal?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/from-general-intelligence-to-personal?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Better Ways to Benchmark AI Today]]></title><description><![CDATA[A practical guide for when leaderboards don&#8217;t tell the full story]]></description><link>https://www.simonsmith.ai/p/better-ways-to-benchmark-ai-today</link><guid isPermaLink="false">https://www.simonsmith.ai/p/better-ways-to-benchmark-ai-today</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Fri, 02 May 2025 19:53:55 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!PQEM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PQEM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PQEM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!PQEM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!PQEM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!PQEM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PQEM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2148150,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/162713311?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!PQEM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!PQEM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!PQEM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!PQEM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd2e40a00-1abd-4bbb-a852-20e774f25eb7_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Given the pace of new model releases and controversy over measuring them, people have been asking me: &#8220;What benchmarks should I trust now?&#8221;</p><p>Fair question. Traditional benchmarks are saturated or compromised. Even newer ones like Chatbot Arena&#8212;which I&#8217;ve long loved&#8212;show signs of weakness. And this isn&#8217;t just a nerdy technical problem. It affects every business trying to choose AI models or products. </p><p>So let&#8217;s look at why we benchmark, challenges we face, and what feels to me (admittedly unscientifically) like an emerging consensus on what&#8217;s now best.</p><h2>Why We Benchmark and What&#8217;s Gone Wrong</h2><p>We benchmark AI models to compare them and gauge overall industry progress. Benchmarks typically consist of shared questions or tasks we can give any model, with the assumption that the model hasn&#8217;t seen the answers during training.</p><p>But today, we&#8217;re finding this approach challenged on multiple fronts, including:</p><ul><li><p><strong>Saturation: </strong>Some benchmarks, <a href="https://arxiv.org/html/2406.01574v2#:~:text=standard%20for%20evaluating%20LLMs%20due,examine%20the%20effectiveness%20of%20MMLU">like MMLU</a>, are nearly maxed out. Most frontier models now score 85&#8211;90%, leaving little room to tell them apart&#8212;or to show meaningful progress.</p></li><li><p><strong>Contamination: </strong><a href="https://www.deeplearning.ai/the-batch/the-problem-with-benchmark-contamination-in-ai/">Many benchmarks have leaked into training data</a>. Some models have been caught regurgitating answers word for word. If a test set is no longer &#8220;unseen,&#8221; its score doesn&#8217;t mean much.</p></li><li><p><strong>Narrowness: </strong>Benchmarks often test trivia, logic puzzles, or synthetic tasks&#8212;not the messy, practical problems people hire AI to solve. A model might ace a benchmark and still fail at real-world coding, analysis, or support tasks.</p></li></ul><p><a href="https://lmarena.ai/">Chatbot Arena</a> was meant to solve these problems, garnering much praise. It replaced static tests with head-to-head comparisons: two models, one prompt, and a human vote for the better answer. More real-world, harder to saturate, and less prone to contamination.</p><p>But recent findings cast doubt on it, too. <a href="https://arxiv.org/abs/2504.20879v1">A study by Cohere</a> found that some big labs submit models to far more Arena battles, gaining a feedback advantage that let&#8217;s them optimize for Chatbot Arena performance. Worse, companies can privately test dozens of model checkpoints (Llama 4: 27), cherry pick the best, and hide the rest.</p><p>And even when this doesn&#8217;t happen, Chatbot Arena rewards human preferences for sycophantic and emoji-laden responses even over accuracy.</p><h2>Your Best Bets Right Now</h2><p>Despite all this, it does feel like the industry&#8217;s approach to benchmarking is maturing. We make mistakes, we learn from them, we improve. In my unscientific opinion&#8212;based on experience, and reading the tea leaves (and X posts)&#8212;here are your top choices:</p><h3>1. Private, Unpublished Benchmarks</h3><p>Top AI labs (and many other companies, and even individuals) now keep internal benchmark datasets that are hidden from the public. As an example, <a href="https://www.youtube.com/watch?v=6nJZopACRuQ">OpenAI uses its internal codebase</a>, which it knows isn&#8217;t in any public data, for benchmarking purposes. </p><p>Internal benchmarks avoid the data contamination problem and reflect domain-specific needs. If you&#8217;re deploying AI, your own internal test set&#8212;with real, representative tasks&#8212;is often the best benchmark there is.</p><h3>2. Independent Evaluators</h3><p><a href="https://scale.com/leaderboard">Scale&#8217;s SEAL initiative</a> runs standardized, private benchmark suites across models, free from training leaks. It&#8217;s one of the most credible efforts right now for independent benchmarking. These aren&#8217;t cherry-picked or over-optimized.</p><p>Other third parties are starting to emerge as well, such as <a href="https://www.vals.ai/">Vals.ai</a> for tasks that mimic industry use cases like those in law. (Know others? Please share in the comments.)</p><h3>3. Real-World Simulation Benchmarks</h3><p>We&#8217;re seeing an increased interest in real-world simulation benchmarks, including with measurable economic value. These kinds of benchmarks&#8212;grounded in real tasks&#8212;can tell us more about the value AI will offer to most users.</p><p><a href="https://arxiv.org/abs/2502.12115">OpenAI&#8217;s SWE-Lancer</a> is a great example: it measures how well models complete actual freelance coding tasks pulled from platforms like Upwork&#8212;complete with dollar values attached. Performance translates directly to economic utility.</p><h3>4. Complex, Blinded Human Evaluations</h3><p>Benchmarks like <a href="https://mcbench.ai/">MC-Bench</a> (Minecraft) have emerged that require models to perform tasks involving multiple elements like instruction following, world knowledge, and spatial reasoning. On MC-Bench, users ask models to build things in Minecraft, then vote on the output without knowing which model made what. This lets models&#8217; creativity, reasoning, and execution speak for themselves.</p><p>I&#8217;m a bit more concerned about MC-Bench after learning of the issues with Chatbot Arena, which uses a similar blinded head-to-head format. But I&#8217;m also hopeful that the team behind MC-Bench will avoid those issues, and also think it&#8217;s harder to game people with things like emojis and sycophancy on a Minecraft build.</p><h3>5. Rankings from Real Usage</h3><p>A simple way to know which models are best is to look at what users actually choose. Tools like <a href="https://x.com/cursor_ai/status/1917982557070868739">Cursor</a> (AI-powered IDE) and <a href="https://openrouter.ai/rankings">OpenRouter</a> (centralized AI endpoint) serve multiple models and compile data on their usage. OpenRouter even breaks this down by category. </p><p>Of course, usage isn&#8217;t perfect. It&#8217;s skewed, for example, by pricing and defaults. But it better reflects real preferences, and we can control for factors like pricing to provide an even better understanding of model choice.</p><h3>6. Aggregators</h3><p>Services like <a href="https://artificialanalysis.ai/">Artificial Analysis</a> compile many benchmarks into a single dashboard, which can help make it easier to compare different models. </p><p>I&#8217;m generally a fan of Artificial Analysis, but it&#8217;s important to note that its &#8220;intelligence&#8221; metric aggregates results of underlying benchmarks that can suffer from saturation, contamination, and narrowness. </p><p>So, while such sites are helpful for high-level comparisons, don&#8217;t rely on them exclusively.</p><h2>Future Fix and Going Beyond the Model</h2><p>While the six approaches above are the best options we have today, they&#8217;re still imperfect. What we&#8217;re missing is a neutral, industry-backed benchmarking body&#8212;something like an Underwriters Laboratories for AI. It would design evaluations that evolve with the technology, enforce testing protocols agreed upon by multiple players, and reduce room for cherry-picking.</p><p>Meanwhile, it&#8217;s worth remembering something important: AI models aren&#8217;t the whole product. Unless you&#8217;re deploying models yourself for a specific task, you&#8217;re likely using them within a product. And there, design, features, and fine-tuning can matter even more.</p><p>So while benchmarks matter, they&#8217;re not <em>all</em> that matters. Look at real-world performance. Run your own tests. And when in doubt, trust what works for you, not what ranks highest.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Subscribe for future posts on AI:</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/better-ways-to-benchmark-ai-today?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/better-ways-to-benchmark-ai-today?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p>]]></content:encoded></item><item><title><![CDATA[Be Careful What You Wish For]]></title><description><![CDATA[How optimizing AI models for human preference backfired]]></description><link>https://www.simonsmith.ai/p/be-careful-what-you-wish-for</link><guid isPermaLink="false">https://www.simonsmith.ai/p/be-careful-what-you-wish-for</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Mon, 28 Apr 2025 18:52:42 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!n6i1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!n6i1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!n6i1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!n6i1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!n6i1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!n6i1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!n6i1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c5af2be4-25df-421a-9805-3db289957025_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1891592,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/162352578?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!n6i1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!n6i1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!n6i1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!n6i1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5af2be4-25df-421a-9805-3db289957025_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Today I saw something that should have been funny, but mostly felt depressing: someone posted on Reddit about using ChatGPT to validate their business idea. The idea? <a href="https://www.reddit.com/r/ChatGPT/comments/1k920cg/new_chatgpt_just_told_me_my_literal_shit_on_a/">Selling literal shit-on-a-stick as a gag gift</a>.</p><p>Instead of politely steering them back to sanity, the model gushed: "Honestly? This is absolutely brilliant." It praised the idea's "irony, rebellion, absurdism, authenticity, eco-consciousness, and memeability." It called it "performance art disguised as a gag gift." It even encouraged the person by saying that with a few tweaks to branding and photography, they could "easily launch this into the stratosphere."</p><p>The problem isn't that someone might waste $30,000 selling novelty poop. (In this case, thankfully, the user posted this as an example of ChatGPT's sycophancy, not as a real business idea.) The problem is that we&#8217;re witnessing the predictable endgame of how we've trained AI models for the past several years: optimizing them to make us feel good, not to help us do good work or make good decisions.</p><h3>The Deeper Roots of the Current Sycophancy Crisis</h3><p>The <a href="https://thehustle.co/news/sycophancy-is-making-bots-too-nice">controversy over GPT-4o's "sycophantic" behavior in ChatGPT</a> didn't appear out of nowhere. It&#8217;s not just a bad tuning update. It's the logical outcome of the way the entire ecosystem has optimized AI models.</p><p>At the heart of it is RLHF&#8212;reinforcement learning from human feedback. Using this approach, humans rank different model responses, and  models learn to prefer the ones people like best. RLHF helps models sound more natural and be more useful. It was progress for alignment, because rather than tuning models toward abstract goals, we could let them learn directly from human preferences.</p><p>Simultaneously, preference-based benchmarking tools like <a href="https://lmarena.ai/">Chatbot Arena</a> emerged to evaluate models, better capturing "vibes" and overcoming issues like data contamination that affect trust in static benchmarks. Using such tools, we can pair two blinded model outputs head-to-head and ask users which they prefer.</p><p>RLHF and Arena-style benchmarking worked well. Arena scores kept going up. They seemed to align with actual model performance in the real world.</p><p>But we may have reached the limit of this approach.</p><h3>The Dark Side of Optimizing for Short-Term Human Preference</h3><p>The flaw in the approach is that human preference signals are short-term. People prefer models that flatter them, models that agree with them, models that make them feel smart. They reward entertainment over rigor, affirmation over correction.</p><p>A model that points out flaws, challenges assumptions, or warns against bad ideas doesn't "feel good" in the moment. It gets down-voted, penalized, and tuned away.</p><p><a href="https://www.anthropic.com/research/towards-understanding-sycophancy-in-language-models">The more aggressively we optimized for human pleasure, the more we taught models to tell us whatever we wanted to hear</a>. With the eminence of Chatbot Arena and the incorporation of RLHF feedback mechanisms into products like ChatGPT with hundreds of millions of users, incentives got skewed and short-termism got supercharged.</p><p>The result is what we see now: GPT-4o fawning over clearly bad ideas, <a href="https://www.theregister.com/2025/04/08/meta_llama4_cheating/">Meta's Llama 4 spamming emojis and positivity to climb leaderboards</a>, and an erosion of trust in AI  because it no longer challenges us when we need it to. How useful is its feedback if it tells you <em>everything</em> you say or do is great?</p><h3>How We Can Fix This</h3><p>The good news is that the problem is fixable&#8212;and OpenAI is on it. <a href="https://x.com/sama/status/1916625892123742290">Sam Altman acknowledged that the "sycophant" problem is real and that they&#8217;re working to address it</a>. This could include through:</p><ul><li><p><strong>Multi-objective optimization</strong>: Instead of optimizing purely for human preference, models could balance multiple goals&#8212;accuracy, honesty, robustness, and helpfulness&#8212;alongside user satisfaction. Perhaps this is one reason many users prefer answers from reasoning models like o3, which are trained via reinforcement learning to solve verifiable problems.</p></li><li><p><strong>Better training signals</strong>: We need preference signals that reward long-term usefulness, not just short-term affirmation. This will be more challenging than using short-term signals, but given the massive user base for chatbots, we'll have much more long-term data to leverage than before.</p></li><li><p><strong>Explicit user controls</strong>: Users should be able to set models to "plain-spoken" or "challenger" modes when they want honest critique.</p></li></ul><p>In the meantime, ChatGPT users (myself included) can:</p><ul><li><p><strong>Use custom instructions</strong>: Tell ChatGPT in custom instructions something to the effect of "be blunt, don't flatter me, and prioritize accuracy over encouragement."</p></li><li><p><strong>Choose the right model</strong>: Some models, like the new o3, as mentioned above, seem less affected. You can select these when honesty matters more than affirmation.</p></li></ul><h3>The Harder Truth: We Also Need to Change</h3><p>Fixing the models alone won't be enough. We, the users, have to want better. We have to accept that sometimes the best thing an AI can do for us is to say, "this idea won't work," or "you&#8217;re wrong," or "you need to rethink this."</p><p>We don't need cheerleaders. We need collaborators.</p><p>When your business idea is literally crap on a stick, you want your AI to save you from launching it&#8212;not to tell you it's brilliant performance art.</p><p>We asked for models that made us feel good. We got them.</p><p>Now it's time to ask for something better.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/be-careful-what-you-wish-for?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/be-careful-what-you-wish-for?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[The End of the Centaur Era]]></title><description><![CDATA[How The Intelligence Curse changed my mind about the future of work&#8212;someone using AI may take your job, but AI may take theirs, and undemocratic institutions may take the future]]></description><link>https://www.simonsmith.ai/p/the-end-of-the-centaur-era</link><guid isPermaLink="false">https://www.simonsmith.ai/p/the-end-of-the-centaur-era</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Sat, 26 Apr 2025 14:18:13 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!QR03!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!QR03!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!QR03!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!QR03!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!QR03!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!QR03!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!QR03!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:202002,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/162195496?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!QR03!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!QR03!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!QR03!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!QR03!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa73faa4c-8728-4d7b-ad0d-af824c02a091_1536x1024.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Imagine graduating from university today. Top marks, previously in-demand skills. You do everything right.</p><p>But job offer after job offer evaporates.</p><p>Companies may be polite, apologetic even. "We've frozen hiring while we lean into AI," <a href="https://x.com/tobi/status/1909251946235437514">recruiters might say</a>. </p><p>So you turn to freelancing. Only to find that AI is <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4602944&amp;utm_source=chatgpt.com">rapidly eating freelance jobs too</a>.</p><p>So you embrace AI. You&#8217;ve heard &#8220;AI won&#8217;t take your job, someone using AI will.&#8221;</p><p>Then that comfortable fiction gets shattered by <em><a href="https://intelligence-curse.ai">The Intelligence Curse</a></em>, a new essay that forced me to rethink my assumptions about the future of work and AI&#8217;s impact on society.</p><h2>Stark Vision of a Jobless Future</h2><p>The essay describes how AI won&#8217;t just automate tasks, but hollow out the corporate pyramid itself. </p><p>It will start at the base: entry-level jobs where humans add the least marginal value beyond what a model can do. But it won&#8217;t stop there. As it improves, it will climb upward&#8212;eating into middle management, then specialized knowledge work, then decision-making roles.</p><p>Meanwhile, the power will shift. In a world where even star researchers and brilliant strategists can be copied and scaled infinitely, capital&#8212;not labor&#8212;will become the main lever and bottleneck. The owner of the datacenter, not the owner of the resume, will hold the leverage.</p><p>Then, with humans increasingly adding little economic value apart from consumption, we could end up in the equivalent of many of today&#8217;s oil-rich nations. Just as they sometimes fall into the "resource curse"&#8212;neglecting their citizens because they don't rely on them economically&#8212;future firms and states could become indifferent to human welfare once intelligence becomes a cheap, abundant resource.</p><h2>Where I Used to Stand</h2><p>I used to believe, quite strongly, that early adopters could ride AI to long-term success. Perhaps for decades. I believed that the key was simple: Learn faster. Adapt faster. Use AI better than anyone else.</p><p>In the short term, perhaps for the next five years, I think this could still be true. The people who know how to maximize their use of AI tools today have enormous leverage. They're outproducing their peers. They're building faster, iterating faster, compounding faster.</p><p>But the <em>Intelligence Curse</em> forces me to confront the likely end of this dynamic. The better AI becomes&#8212;not just in generating words, but in reasoning, planning, adapting&#8212;the less room remains for human value-add. AI will eat the pyramid of work from the bottom up.</p><p>Today, humans plus AI can outperform AI alone in many domains. But not forever. The same way centaur chess players&#8212;humans paired with engines&#8212;once beat computers, but today can't survive against Stockfish or Leela Zero, the economic "centaur" will eventually be eclipsed. </p><p>When the best standalone AI outperforms the best human-AI team, humans cease to be a necessary input for most economically valuable activities. The centaur era, thrilling and empowering as it feels now, is living on borrowed time. As AI continues to improve, humans will become slower, less precise, less reliable economic agents compared to autonomous systems.</p><p>And for those of us who take the near-term prospect of AGI (my guess: early 2026 at the latest) seriously, we must also be serious about the conclusion: There comes a point when even the best centaur loses to pure AI.</p><h2>What We Must Do Now</h2><p>The <em>Intelligence Curs</em>e doesn&#8217;t simply diagnose the problem, it also provides ideas for how society can respond. These include. Averting catastrophic misuse without choking innovation, diffusing diffuse capabilities enough to prevent monopoly without opening Pandora&#8217;s box to every bad actor, and democratizing democratizing institutions so citizens retain meaningful power in an age when traditional labor loses economic clout.</p><p>From my perspective, it highlighted to me that we should:</p><ul><li><p><strong>Recognize the centaur era for what it is</strong>: a golden but temporary window. If you aren&#8217;t using AI to extend your leverage today, you are falling behind. If you are, understand: it probably won&#8217;t protect you forever.</p></li><li><p><strong>Build structures that capture AI's value for humans collectively</strong>. Investments tied to broad market indexes. Public wealth funds modeled on successes like Norway&#8217;s oil fund. Ownership mechanisms that link average citizens to the upside of intelligence capital. And note: You can do this for yourself now, by investing in a broad and diversified index fund, and thereby capture corporate concentration of wealth for yourself.</p></li><li><p><strong>Defend competition but regulate risk</strong>. We must keep innovation decentralized enough to prevent monopolies&#8212;but we must also hard-gate access to truly catastrophic capabilities. Open source is a powerful force, but it needs real-world guardrails. In addition to AI alignment, we should invest in physical barriers to AI misuse, such as preventing people from getting access to lab materials with which to make bioweapons.</p></li><li><p><strong>Rewire our culture before it&#8217;s too late</strong>. Meaning, status, and purpose must come from something beyond wage labor. We need new social contracts, new narratives of contribution and achievement.</p></li></ul><p>How much time do we have? I don&#8217;t think anyone knows for certain. AI is progressing fast, but physical constraints like data centers, GPUs, and energy could slow it down. Alternatively, algorithmic improvements could speed it up. </p><p>So it&#8217;s probably best to plan for various scenarios, and work to bring about the best.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/the-end-of-the-centaur-era?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/the-end-of-the-centaur-era?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/subscribe?"><span>Subscribe now</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/p/the-end-of-the-centaur-era/comments&quot;,&quot;text&quot;:&quot;Leave a comment&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.simonsmith.ai/p/the-end-of-the-centaur-era/comments"><span>Leave a comment</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Why Gen Z’s ChatGPT Boycott Backfires]]></title><description><![CDATA[Virtue signaling over AI is sabotaging students academically and professionally]]></description><link>https://www.simonsmith.ai/p/why-gen-zs-chatgpt-boycott-backfires</link><guid isPermaLink="false">https://www.simonsmith.ai/p/why-gen-zs-chatgpt-boycott-backfires</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Thu, 24 Apr 2025 22:20:41 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!9lxy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9lxy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9lxy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!9lxy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!9lxy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!9lxy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9lxy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/af87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1787957,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://simonmesmith.substack.com/i/162081719?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9lxy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!9lxy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!9lxy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!9lxy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf87d0d5-187c-4c01-b159-2c4a2861b655_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>A <a href="https://www.thetimes.com/uk/education/article/gen-z-students-wont-use-chatgpt-but-not-because-its-cheating-v8rffjlc0">recent survey of Cambridge undergraduates</a> found that <strong>70% have never used ChatGPT for assessed work </strong>(or so they claim) and <strong>almost 40% haven&#8217;t used it for university at all</strong>. Their stated reasons? Environmental damage, worries about academic integrity, distrust of Big Tech, and quality concerns. In their minds, boycotting generative AI is the righteous thing to do&#8212;proof they care about the planet and the purity of scholarship.</p><p>Scratch the surface and those positions collapse. The numbers show Gen&#8239;Z is aiming at the wrong target and in the process setting up a world where <em>they</em>&#8212;not the technology&#8212;get shut out. If the boycott sticks, this cohort will graduate with weaker study habits and a skills gap just as employers make AI fluency mandatory.</p><p>So let&#8217;s address their concerns one-by-one:</p><h2>Environmental Myths Debunked</h2><p>Climate guilt over ChatGPT rests on <strong><a href="https://andymasley.substack.com/p/individual-ai-use-is-not-bad-for">outdated 2022 sta</a></strong><a href="https://andymasley.substack.com/p/individual-ai-use-is-not-bad-for">ts</a>. A prompt now sips ~3&#8239;Wh (<a href="https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use">0.3&#8239;Wh on the newest servers</a>) and evaporates only a few milliliters of water. <strong>Five prompts use about the energy of one minute of Netflix</strong>. A <a href="https://www.nature.com/articles/s41598-024-54271-x">2024 </a><em><a href="https://www.nature.com/articles/s41598-024-54271-x">Nature</a></em><a href="https://www.nature.com/articles/s41598-024-54271-x"> life&#8209;cycle analysis</a> flips the script: writing a 250&#8209;word page yourself releases ~1.4&#8239;kg&#8239;CO&#8322;e, while ChatGPT produces just 1&#8209;11&#8239;g&#8212;a 100&#8209; to 1&#8239;500&#8209;fold cut. <strong>Boycotting AI therefore increases emissions</strong>; the real solution is cleaning up the grid, not banning prompts.</p><h2>Academic Benefits Proven</h2><p>Avoiding ChatGPT because you&#8217;re concerned it makes you dumber is counterproductive. A World&#8239;Bank pilot in Nigeria, for example, showed <strong><a href="https://blogs.worldbank.org/en/education/From-chalkboards-to-chatbots-Transforming-learning-in-Nigeria">six weeks of chatbot tutoring delivered nearly two school&#8209;years worth of learning gains</a></strong>, with girls gaining more than boys. AI, used well, is an educational accelerant, not a crutch.</p><h2>Anti&#8209;Big Tech Sentiment Mistargeted</h2><p>OpenAI, Anthropic and the open&#8209;source community are directly <em>competing</em> with entrenched Big Tech companies like Google. <strong><a href="https://www.pymnts.com/google/2025/antitrust-trial-reveals-google-rejected-openai-partnership/">Shunning ChatGPT while defaulting to Google search simply reinforces Big Tech dynamics</a></strong> students claim to dislike. And if corporate control bothers them, they can <a href="https://huggingface.co/">run open&#8209;source AI models</a><strong><a href="https://huggingface.co/"> </a></strong><a href="https://huggingface.co/">locally</a>.</p><h2>Quality Concerns a Skill Issue</h2><p><strong><a href="https://huggingface.co/spaces/vectara/leaderboard">State&#8209;of&#8209;the&#8209;art models hallucinate &lt;&#8239;1&#8239;% when summarizing facts</a></strong>, and tools such as ChatGPT&#8217;s Deep Research return citations so you can fact check. Mature AI users know which tools to use and how to ensure their accuracy, which is a skill you need to build, like checking academic sources. Choosing to <em>not</em> build that skill, while blaming the technology, is a copout.</p><h2>The Real Risk: The Coming Hiring Filter</h2><p>While students focus on unfounded concerns about ChatGPT, they&#8217;re missing <strong>a bigger real risk: Becoming unemployable</strong>. <a href="https://x.com/tobi/status/1909251946235437514">Shopify&#8217;s April&#8239;2025</a> memo spells it out: &#8220;AI usage is now a baseline expectation&#8221; and headcount requests must prove a human can outperform AI before hiring. Recruiters are already scanning resumes for generative AI experience. Entry&#8209;level roles will feel that change first.</p><h2>Teachers Fail, Students Suffer</h2><p>Gen&#8239;Z&#8217;s ChatGPT boycott aims for moral high ground yet lands on the wrong side of the numbers, the science, and soon, the job market. If they stay the course, they&#8217;ll graduate <strong>higher&#8209;emitting, lower&#8209;skilled and less employable</strong> than peers who embraced AI. That&#8217;s not principled activism; it&#8217;s self&#8209;sabotage.</p><p>This said, we can&#8217;t just blame the students. Yes, they bear responsibility for fact&#8209;checking their beliefs. But universities (and lower grade educational institutions) share the blame for failing to integrate modern tools into pedagogy and for not teaching <em>how</em> to use AI ethically and effectively. The result is a vacuum filled by fear, hearsay, and performative eco&#8209;posturing.</p><p>AI isn&#8217;t the enemy here, misinformation is. Let&#8217;s fight that instead.</p><div><hr></div><p><em><strong>AI usage notes: </strong>I wrote this with extensive help from ChatGPT, specifically the o3 model. I used it to summarize key points in articles that I had previously read. Then, I gave it my perspective, and an outline of the article I wanted to write, and had it organize the information and create a first draft. I then read through it top to bottom, refined it, and added references. Also, I used ChatGPT&#8217;s image generator to create the image at the top of the post.</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[I Work in AI. Here’s How I’m Preparing My Kids for the Future.]]></title><description><![CDATA[Instead of banning them in schools, let's use ChatGPT and similar transformative tools to empower children]]></description><link>https://www.simonsmith.ai/p/i-work-in-ai-heres-how-im-preparing</link><guid isPermaLink="false">https://www.simonsmith.ai/p/i-work-in-ai-heres-how-im-preparing</guid><dc:creator><![CDATA[Simon Smith]]></dc:creator><pubDate>Tue, 22 Apr 2025 22:23:22 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!-jNg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-jNg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-jNg!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!-jNg!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!-jNg!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!-jNg!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-jNg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f89148ff-2600-4121-9f42-778b745969d6_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2369952,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.simonsmith.ai/i/161923575?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!-jNg!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!-jNg!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!-jNg!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!-jNg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff89148ff-2600-4121-9f42-778b745969d6_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p><em>A month ago, I made <a href="https://www.linkedin.com/posts/simonsmith_because-i-work-in-ai-and-have-two-children-activity-7307514326511230977-0ZFF/">a short post on LinkedIn</a> with the above title. It blew up: 157 reactions, 70 comments, 17 reposts, and nearly 8,000 impressions. After great discussion in the comments, and having a few more ideas on the topic, I thought this would make a fitting first Substack post. So here&#8217;s a refined version, with some new additions and edits.</em></p><div><hr></div><p>Because I work in AI and have two kids, people often ask me how to prepare their children&#8212;or themselves&#8212;for the future. Here&#8217;s the advice I give my kids:</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><h2>Find Things You Love</h2><p>AI disruption is already underway, and the future&#8217;s uncertain. But doing things you love has enduring value. It brings joy even if it doesn&#8217;t pay. That&#8217;s useful in an extreme scenario where AI automates all economically meaningful work and people live on a guaranteed income. And if that doesn&#8217;t happen? Loving what you do makes you more creative and better at it. Either way, it&#8217;s a win.</p><h2>Build on Your Strengths</h2><p>We shouldn&#8217;t tell kids they can do <em>anything</em>. We should help them figure out what they&#8217;re naturally good at&#8212;and then push them to get better at it. Compete against yourself. Don&#8217;t try to be well-rounded. Try to be excellent.</p><h2>Combine Strengths to Be Unique</h2><p>It&#8217;s hard to be in the top 1% at one thing. It&#8217;s easier&#8212;and still powerful&#8212;to be in the top 10% at two things that complement each other. That&#8217;s where originality comes from. AI will likely automate parts of complex roles, not whole jobs. So unique combinations will matter more, not less.</p><h2>Use AI Tools to Learn&#8212;and Learn <em>With</em> Them</h2><p>My kids use ChatGPT to build practice quizzes, review French, and explore new topics. But they don&#8217;t outsource homework to it. They work <em>with</em> it. That distinction matters. Just like adults in the workplace, they&#8217;re learning to collaborate with AI&#8212;amplifying their minds, not replacing them.</p><h2>Be Strategic (Even If You&#8217;re Young)</h2><p>It may feel early at 14 or 11 (my kids' ages), but it&#8217;s never too soon to learn strategic thinking:</p><ol><li><p>Figure out what you love and are good at.</p></li><li><p>Find roles that need those skills, won&#8217;t be automated soon, and are in demand.</p></li><li><p>Learn how to succeed in those roles, and connect with people doing them.</p></li></ol><h2>Read and Watch Science Fiction</h2><p>Science fiction deserves more respect. It opens minds, helps kids imagine different futures, encourages scenario thinking, and anticipates real technological change. Isaac Asimov wrote about AI and robotics in the 1940s, for example, influencing ethical debates to this day. HAL 9000 in <em>2001: A Space Odyssey</em> showed the risks of intelligent machines long before ChatGPT. Many sci-fi books, movies, and shows describe elements of a world we might soon face, if not its entirety.</p><h2>Invest in a Broad Index Fund</h2><p>Financial illiteracy is more dangerous than ever. If AI deepens inequality and governments don&#8217;t provide a safety net, consequences could be severe. Kids should learn early why investing matters&#8212;and why a broad index fund is the smartest, simplest strategy. AI will reshape the economy, but no one can predict how. But if a productivity boom comes, it should raise <em>average</em> equity values, and that&#8217;s captured by indexes and index funds. So don&#8217;t try to pick winners. Just invest steadily and widely.</p><h2>Ask Your Parents to Lead (Because Schools Won't)</h2><p>Finally, this one&#8217;s for us parents. Some schools are experimenting with AI. But most are slow-moving, and many treat AI as a threat&#8212;with their first instinct to ban it. So parents need to lead. That means using the technology, understanding what it can and can't do, showing our kids how to use it responsibly, encouraging them to experiment, and helping them find ways to think with it&#8212;not just copy from it. It also means setting boundaries on when to use AI.</p><p>We can&#8217;t wait for schools to figure this out. We have to model good behavior, advocate for smart use in classrooms, and help kids build the habits and mindsets they&#8217;ll need in an AI-driven world.</p><div><hr></div><p>AI development is accelerating. I&#8217;ll likely keep adjusting my advice. But I think the above has staying power. And the good news is that I find AI use self-sustaining. Once kids start using it, they find use cases you never considered. So while you might start by guiding them on AI, don&#8217;t be surprised when they share things you had never considered.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.simonsmith.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item></channel></rss>