<rss xmlns:source="http://source.scripting.com/" version="2.0">
  <channel>
    <title>Sophie Davis</title>
    <link>https://sophiedavis.org/</link>
    <description></description>
    
    <language>en</language>
    
    <lastBuildDate>Tue, 31 Mar 2026 21:14:27 -0500</lastBuildDate>
    <item>
      <title>Book It, an MCP server for my local library &#43; my Goodreads</title>
      <link>https://sophiedavis.org/2026/03/31/book-it-an-mcp-server.html</link>
      <pubDate>Tue, 31 Mar 2026 21:14:27 -0500</pubDate>
      
      <guid>http://sophiealula.micro.blog/2026/03/31/book-it-an-mcp-server.html</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://github.com/sophiealula/library_goodreads_claude&#34;&gt;github.com/sophiealu&amp;hellip;&lt;/a&gt;&lt;/p&gt;
&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/westtown2023.jpg&#34; width=&#34;600&#34; height=&#34;280&#34; alt=&#34;&#34;&gt;
</description>
      <source:markdown>[github.com/sophiealu...](https://github.com/sophiealula/library_goodreads_claude) 

&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/westtown2023.jpg&#34; width=&#34;600&#34; height=&#34;280&#34; alt=&#34;&#34;&gt;
</source:markdown>
    </item>
    
    <item>
      <title>Calling my agent</title>
      <link>https://sophiedavis.org/2026/03/24/calling-my-agent.html</link>
      <pubDate>Tue, 24 Mar 2026 19:30:55 -0500</pubDate>
      
      <guid>http://sophiealula.micro.blog/2026/03/24/calling-my-agent.html</guid>
      <description>&lt;p&gt;Earlier this week, I hooked my agent, Andy, up to my Twilio.&lt;/p&gt;
&lt;p&gt;Now, I can call Andy and unspool my thoughts, log any todos that come to mind, and even make audio journal entries.&lt;/p&gt;
&lt;p&gt;Twilio transcribes the phone call; NanoClaw picks up the transcript and hands it to Andy, a Claude agent with access to my Obsidian vault. I wrote Andy instructions for exactly how to break down my calls and where each piece should go in Obsidian, which I&amp;rsquo;m using as my knowledge graph.&lt;/p&gt;
&lt;p&gt;Andy creates and adds to sections: there&amp;rsquo;s one for my journal, ideas I&amp;rsquo;m brainstorming, action items, and decisions made.&lt;/p&gt;
&lt;p&gt;It identifies people, projects, and concepts I mention and links them to the relevant pages in my graph. If someone doesn&amp;rsquo;t have a page yet, Andy makes one. Action items get pulled out and added to my weekly todo list.&lt;/p&gt;
&lt;p&gt;Andy dumps the raw transcript at the bottom of the call log. I considered using Whisper, but haven&amp;rsquo;t felt the need to because the transcription has been pretty accurate so far &amp;ndash; even in the Chicago wind with AirPods.&lt;/p&gt;
&lt;p&gt;I did consider having Andy talk back, and I may even build that out. But for now, I like using him as a way to capture my thoughts.&lt;/p&gt;
&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/b419d5cd69.jpg&#34;&gt;
</description>
      <source:markdown>Earlier this week, I hooked my agent, Andy, up to my Twilio. 

Now, I can call Andy and unspool my thoughts, log any todos that come to mind, and even make audio journal entries. 

Twilio transcribes the phone call; NanoClaw picks up the transcript and hands it to Andy, a Claude agent with access to my Obsidian vault. I wrote Andy instructions for exactly how to break down my calls and where each piece should go in Obsidian, which I&#39;m using as my knowledge graph.
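
The filing step is easy to picture as code. Here&#39;s a rough sketch in Python of how a call could be rendered and written into an Obsidian vault; the section names, file layout, and function names are my own illustration, not NanoClaw&#39;s or Andy&#39;s actual implementation:

```python
from datetime import date
from pathlib import Path

def render_call_log(sections: dict, transcript: str) -> str:
    # Render one call into the markdown layout described above:
    # a heading per section, bullets under each, raw transcript last.
    parts = [f"# Call log {date.today().isoformat()}"]
    for heading, items in sections.items():
        parts.append(f"\n## {heading}")
        parts.extend(f"- {item}" for item in items)
    parts.append("\n## Raw transcript")
    parts.append(transcript)
    return "\n".join(parts)

def write_to_vault(vault: Path, body: str) -> Path:
    # Drop the rendered log into a calls/ folder inside the vault;
    # Obsidian picks up new files in the vault directory automatically.
    note = vault / "calls" / f"{date.today().isoformat()}.md"
    note.parent.mkdir(parents=True, exist_ok=True)
    note.write_text(body, encoding="utf-8")
    return note
```

The interesting work (linking entities, pulling action items onto the weekly list) is the agent&#39;s judgment; the mechanical write is about this small.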

Andy creates and adds to sections: there&#39;s one for my journal, ideas I&#39;m brainstorming, action items, and decisions made. 

It identifies people, projects, and concepts I mention and links them to the relevant pages in my graph. If someone doesn&#39;t have a page yet, Andy makes one. Action items get pulled out and added to my weekly todo list. 

Andy dumps the raw transcript at the bottom of the call log. I considered using Whisper, but haven&#39;t felt the need to because the transcription has been pretty accurate so far -- even in the Chicago wind with AirPods. 

I did consider having Andy talk back, and I may even build that out. But for now, I like using him as a way to capture my thoughts.

&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/b419d5cd69.jpg&#34;&gt;
</source:markdown>
    </item>
    
    <item>
      <title>My workflow for building websites with Claude Code</title>
      <link>https://sophiedavis.org/2026/03/22/my-workflow-for-building-websites.html</link>
      <pubDate>Sun, 22 Mar 2026 18:19:51 -0500</pubDate>
      
      <guid>http://sophiealula.micro.blog/2026/03/22/my-workflow-for-building-websites.html</guid>
      <description>&lt;p&gt;I&#39;ve been thinking about how to design web pages that don&#39;t look like AI slop.&lt;/p&gt;
&lt;p&gt;A month ago, I told Harper I wanted to make a personal website. He challenged me to skip the website builders and only use Claude Code.&lt;/p&gt;
&lt;p&gt;Since then, I&#39;ve used Claude Code to build a few landing pages at 2389 and I&#39;ve developed a workflow that&#39;s worked for me. I&#39;m documenting it here in case it&#39;s helpful for others.&lt;/p&gt;
&lt;h2&gt;First, I tried Lovable&lt;/h2&gt;
&lt;p&gt;I started with Lovable to get a baseline for what AI website creation looks like right now.&lt;/p&gt;
&lt;p&gt;It&#39;s &lt;em&gt;fine&lt;/em&gt;. But when you ask it to build a site without giving it much to go on, the output is pretty generic. Lots of purples. Very &#34;AI made this.&#34;&lt;/p&gt;
&lt;p&gt;Where Lovable does better is when you ask it to clone an existing website. It does a solid job of that (though Claude can do it pretty accurately too).&lt;/p&gt;
&lt;p&gt;You can pull the HTML from Lovable&#39;s output and bring it into Claude Code as a starting point, which is actually pretty useful.&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.43.41-pm.png&#34; alt=&#34;&#34;&gt;
&lt;h2&gt;Getting to Draft 1&lt;/h2&gt;
&lt;p&gt;Since I was challenged to build without Lovable, I went back to Claude Code.&lt;/p&gt;
&lt;p&gt;I landed on three different approaches for getting Claude to produce really good designs as starting points. Which approach you start with depends on how much you already know about what you want.&lt;/p&gt;
&lt;h3&gt;Approach 1: The /frontend-design skill&lt;/h3&gt;
&lt;p&gt;The first thing I did post-Lovable was to figure out how to equip Claude with design skills.&lt;/p&gt;
&lt;p&gt;I went down a rabbit hole of scraping Figma education sources and browsing design inspiration sites.&lt;/p&gt;
&lt;p&gt;As it turns out, Anthropic has a &lt;a href=&#34;https://github.com/anthropics/claude-code/tree/main/plugins/frontend-design&#34;&gt;/frontend-design skill&lt;/a&gt; for Claude Code, and it&#39;s actually pretty good.&lt;/p&gt;
&lt;p&gt;I use this skill when I just need to ship (and edit, but more on that later).&lt;/p&gt;
&lt;p&gt;You don&#39;t need to give it much context. It makes smart choices about color and typography and layout on its own, and I was pleasantly surprised by the motion and interactive elements it threw in.&lt;/p&gt;
&lt;p&gt;I tested it on a translation tool we were building at 2389. I started with a basic design I&#39;d pulled from Lovable&#39;s HTML, then asked Claude to redesign it using the skill.&lt;/p&gt;
&lt;p&gt;Before:&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.44.51-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;After (iteration 1):&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.45.48-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;After (iteration 2):&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.46.32-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;For fun, I even tested it on some bad websites to see how far it could take them.&lt;/p&gt;
&lt;p&gt;Before:&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.47.05-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;After:&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.47.17-pm.png&#34; alt=&#34;&#34;&gt;
&lt;h3&gt;Approach 2: Build a Pinterest board&lt;/h3&gt;
&lt;p&gt;If you already have inspiration for the website you&#39;re looking to build, you can collect and share source material for Claude to reference.&lt;/p&gt;
&lt;p&gt;This is what I did for my portfolio site.&lt;/p&gt;
&lt;p&gt;I basically built a Pinterest board where I screenshotted elements I liked from other sites, like the pill box you see at the top of this website.&lt;/p&gt;
&lt;p&gt;Pro tip: if you like a font on a site, you can find its name by inspecting the page.&lt;/p&gt;
&lt;p&gt;Then I put all of that in a folder and told Claude to reference it. I explained &lt;em&gt;why&lt;/em&gt; I liked each element and gave detailed instructions for how Claude should interpret the material. Should Claude copy the material directly or take inspiration from it?&lt;/p&gt;
&lt;p&gt;You can also layer Anthropic&#39;s /frontend-design skill on top of this. I used the Pinterest board to set the direction, then used the skill to tweak small details.&lt;/p&gt;
&lt;p&gt;This takes a lot of time. It&#39;s a whole design process. But the result is something that feels more like yours.&lt;/p&gt;
&lt;h3&gt;Approach 3: The 2389 landing page skill&lt;/h3&gt;
&lt;p&gt;Sometimes I don&#39;t know exactly the aesthetic I&#39;m going for, and I also don&#39;t have time for a full Pinterest board.&lt;/p&gt;
&lt;p&gt;And I&#39;m still interested in honing my perspective with Claude before building (this is one thing that the frontend-design skill skips).&lt;/p&gt;
&lt;p&gt;Harper built a skill that interviews you first: about what you&#39;re hoping to achieve, the emotions you want to evoke, the aesthetic influences -- and most importantly, what you specifically don&#39;t want.&lt;/p&gt;
&lt;h2&gt;Editing&lt;/h2&gt;
&lt;p&gt;Getting to Draft 1 is the fun part. Making it actually good is where I&#39;ve spent most of my time.&lt;/p&gt;
&lt;h3&gt;The padding problem&lt;/h3&gt;
&lt;p&gt;Padding and spacing are where I&#39;ve fought Claude the most.&lt;/p&gt;
&lt;p&gt;On my portfolio site I went back and forth on the mobile nav pill padding, the about page photo layout, and the contact page header.&lt;/p&gt;
&lt;p&gt;I&#39;ve started to wonder if the problem is that I don&#39;t have the design language. Like, Claude probably responds better to &#34;increase the padding-inline-start to 2rem&#34; than &#34;increase the padding to the left of the hero section.&#34;&lt;/p&gt;
&lt;p&gt;Here are some quick tips and tricks that I&#39;ve found really helpful for getting a site done.&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Have AI personas review your site.&lt;/strong&gt; My colleague Dylan built a &lt;a href=&#34;https://skills.2389.ai/plugins/deliberation/&#34;&gt;deliberation skill&lt;/a&gt; that creates different perspectives and has them review whatever you&#39;re working on. I simulated reviews from a UX expert, an AI researcher, and a PM.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Screenshot your site and paste it back.&lt;/strong&gt; I recommend uploading screenshots to Claude alongside a description of what you hope to see. Way easier than trying to describe a visual problem in words.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dictate your feedback.&lt;/strong&gt; I really enjoy giving detailed design feedback, but it&#39;s too much to type out, so I use &lt;a href=&#34;https://handy.computer&#34;&gt;Handy&lt;/a&gt; to ramble at Claude. I use voice for two things: spraying out rapid-fire directions before I forget them — &#34;do x y z&#34; — and talking through design decisions I&#39;m still figuring out. Like, &#34;make the background more ombré, have it start at the top right… actually the left to the right.&#34; Claude is pretty good at ignoring the superseded direction. It&#39;s like a design partner who listens through your indecision.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Omakase it.&lt;/strong&gt; Sometimes I just want to see options. We built an &lt;a href=&#34;https://x.com/2389ai/status/2034744513487323334?s=20&#34;&gt;omakase skill&lt;/a&gt; that has Claude build a few different versions from the same prompt. I&#39;ve used it when I want to see different options for how features could look.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;Where I&#39;ve landed&lt;/h2&gt;
&lt;p&gt;I&#39;m still figuring a lot of this out.&lt;/p&gt;
&lt;p&gt;My design vocabulary is limited and I don&#39;t always know the right way to tell Claude what I&#39;m seeing in my head. But the whole process has gotten faster the more I&#39;ve done it, and I&#39;m having fun with it.&lt;/p&gt;
&lt;p&gt;My recommendation would be to play around with these tools and find your own workflow.&lt;/p&gt;
&lt;p&gt;This was written as of Nov 2025. Excited for things to change.&lt;/p&gt;
</description>
      <source:markdown>&lt;p&gt;I&#39;ve been thinking about how to design web pages that don&#39;t look like AI slop.&lt;/p&gt;
&lt;p&gt;A month ago, I told Harper I wanted to make a personal website. He challenged me to skip the website builders and only use Claude Code.&lt;/p&gt;
&lt;p&gt;Since then, I&#39;ve used Claude Code to build a few landing pages at 2389 and I&#39;ve developed a workflow that&#39;s worked for me. I&#39;m documenting it here in case it&#39;s helpful for others.&lt;/p&gt;
&lt;h2&gt;First, I tried Lovable&lt;/h2&gt;
&lt;p&gt;I started with Lovable to get a baseline for what AI website creation looks like right now.&lt;/p&gt;
&lt;p&gt;It&#39;s &lt;em&gt;fine&lt;/em&gt;. But when you ask it to build a site without giving it much to go on, the output is pretty generic. Lots of purples. Very &#34;AI made this.&#34;&lt;/p&gt;
&lt;p&gt;Where Lovable does better is when you ask it to clone an existing website. It does a solid job of that (though Claude can do it pretty accurately too).&lt;/p&gt;
&lt;p&gt;You can pull the HTML from Lovable&#39;s output and bring it into Claude Code as a starting point, which is actually pretty useful.&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.43.41-pm.png&#34; alt=&#34;&#34;&gt;
&lt;h2&gt;Getting to Draft 1&lt;/h2&gt;
&lt;p&gt;Since I was challenged to build without Lovable, I went back to Claude Code.&lt;/p&gt;
&lt;p&gt;I landed on three different approaches for getting Claude to produce really good designs as starting points. Which approach you start with depends on how much you already know about what you want.&lt;/p&gt;
&lt;h3&gt;Approach 1: The /frontend-design skill&lt;/h3&gt;
&lt;p&gt;The first thing I did post-Lovable was to figure out how to equip Claude with design skills.&lt;/p&gt;
&lt;p&gt;I went down a rabbit hole of scraping Figma education sources and browsing design inspiration sites.&lt;/p&gt;
&lt;p&gt;As it turns out, Anthropic has a &lt;a href=&#34;https://github.com/anthropics/claude-code/tree/main/plugins/frontend-design&#34;&gt;/frontend-design skill&lt;/a&gt; for Claude Code, and it&#39;s actually pretty good.&lt;/p&gt;
&lt;p&gt;I use this skill when I just need to ship (and edit, but more on that later).&lt;/p&gt;
&lt;p&gt;You don&#39;t need to give it much context. It makes smart choices about color and typography and layout on its own, and I was pleasantly surprised by the motion and interactive elements it threw in.&lt;/p&gt;
&lt;p&gt;I tested it on a translation tool we were building at 2389. I started with a basic design I&#39;d pulled from Lovable&#39;s HTML, then asked Claude to redesign it using the skill.&lt;/p&gt;
&lt;p&gt;Before:&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.44.51-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;After (iteration 1):&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.45.48-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;After (iteration 2):&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.46.32-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;For fun, I even tested it on some bad websites to see how far it could take them.&lt;/p&gt;
&lt;p&gt;Before:&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.47.05-pm.png&#34; alt=&#34;&#34;&gt;
&lt;p&gt;After:&lt;/p&gt;
&lt;img src=&#34;uploads/2026/screenshot-2026-03-31-at-8.47.17-pm.png&#34; alt=&#34;&#34;&gt;
&lt;h3&gt;Approach 2: Build a Pinterest board&lt;/h3&gt;
&lt;p&gt;If you already have inspiration for the website you&#39;re looking to build, you can collect and share source material for Claude to reference.&lt;/p&gt;
&lt;p&gt;This is what I did for my portfolio site.&lt;/p&gt;
&lt;p&gt;I basically built a Pinterest board where I screenshotted elements I liked from other sites, like the pill box you see at the top of this website.&lt;/p&gt;
&lt;p&gt;Pro tip: if you like a font on a site, you can find its name by inspecting the page.&lt;/p&gt;
&lt;p&gt;Then I put all of that in a folder and told Claude to reference it. I explained &lt;em&gt;why&lt;/em&gt; I liked each element and gave detailed instructions for how Claude should interpret the material. Should Claude copy the material directly or take inspiration from it?&lt;/p&gt;
&lt;p&gt;You can also layer Anthropic&#39;s /frontend-design skill on top of this. I used the Pinterest board to set the direction, then used the skill to tweak small details.&lt;/p&gt;
&lt;p&gt;This takes a lot of time. It&#39;s a whole design process. But the result is something that feels more like yours.&lt;/p&gt;
&lt;h3&gt;Approach 3: The 2389 landing page skill&lt;/h3&gt;
&lt;p&gt;Sometimes I don&#39;t know exactly the aesthetic I&#39;m going for, and I also don&#39;t have time for a full Pinterest board.&lt;/p&gt;
&lt;p&gt;And I&#39;m still interested in honing my perspective with Claude before building (this is one thing that the frontend-design skill skips).&lt;/p&gt;
&lt;p&gt;Harper built a skill that interviews you first: about what you&#39;re hoping to achieve, the emotions you want to evoke, the aesthetic influences -- and most importantly, what you specifically don&#39;t want.&lt;/p&gt;
&lt;h2&gt;Editing&lt;/h2&gt;
&lt;p&gt;Getting to Draft 1 is the fun part. Making it actually good is where I&#39;ve spent most of my time.&lt;/p&gt;
&lt;h3&gt;The padding problem&lt;/h3&gt;
&lt;p&gt;Padding and spacing are where I&#39;ve fought Claude the most.&lt;/p&gt;
&lt;p&gt;On my portfolio site I went back and forth on the mobile nav pill padding, the about page photo layout, and the contact page header.&lt;/p&gt;
&lt;p&gt;I&#39;ve started to wonder if the problem is that I don&#39;t have the design language. Like, Claude probably responds better to &#34;increase the padding-inline-start to 2rem&#34; than &#34;increase the padding to the left of the hero section.&#34;&lt;/p&gt;
&lt;p&gt;Here are some quick tips and tricks that I&#39;ve found really helpful for getting a site done.&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Have AI personas review your site.&lt;/strong&gt; My colleague Dylan built a &lt;a href=&#34;https://skills.2389.ai/plugins/deliberation/&#34;&gt;deliberation skill&lt;/a&gt; that creates different perspectives and has them review whatever you&#39;re working on. I simulated reviews from a UX expert, an AI researcher, and a PM.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Screenshot your site and paste it back.&lt;/strong&gt; I recommend uploading screenshots to Claude alongside a description of what you hope to see. Way easier than trying to describe a visual problem in words.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dictate your feedback.&lt;/strong&gt; I really enjoy giving detailed design feedback, but it&#39;s too much to type out, so I use &lt;a href=&#34;https://handy.computer&#34;&gt;Handy&lt;/a&gt; to ramble at Claude. I use voice for two things: spraying out rapid-fire directions before I forget them — &#34;do x y z&#34; — and talking through design decisions I&#39;m still figuring out. Like, &#34;make the background more ombré, have it start at the top right… actually the left to the right.&#34; Claude is pretty good at ignoring the superseded direction. It&#39;s like a design partner who listens through your indecision.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Omakase it.&lt;/strong&gt; Sometimes I just want to see options. We built an &lt;a href=&#34;https://x.com/2389ai/status/2034744513487323334?s=20&#34;&gt;omakase skill&lt;/a&gt; that has Claude build a few different versions from the same prompt. I&#39;ve used it when I want to see different options for how features could look.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;Where I&#39;ve landed&lt;/h2&gt;
&lt;p&gt;I&#39;m still figuring a lot of this out.&lt;/p&gt;
&lt;p&gt;My design vocabulary is limited and I don&#39;t always know the right way to tell Claude what I&#39;m seeing in my head. But the whole process has gotten faster the more I&#39;ve done it, and I&#39;m having fun with it.&lt;/p&gt;
&lt;p&gt;My recommendation would be to play around with these tools and find your own workflow.&lt;/p&gt;
&lt;p&gt;This was written as of Nov 2025. Excited for things to change.&lt;/p&gt;
</source:markdown>
    </item>
    
    <item>
      <title>Too Cold to Type</title>
      <link>https://sophiedavis.org/2026/02/03/too-cold-to-type.html</link>
      <pubDate>Tue, 03 Feb 2026 13:36:00 -0500</pubDate>
      
      <guid>http://sophiealula.micro.blog/2026/02/03/too-cold-to-type.html</guid>
      <description>&lt;p&gt;When I was in San Francisco earlier this week, Jesse Vincent told me that he built a tool where his AI calls him after meetings so he can debrief over voice. I thought that was a great idea. Talking is faster, more fluid, and isn&amp;rsquo;t limited by your words per minute.&lt;/p&gt;
&lt;p&gt;Around the same time, I&amp;rsquo;d made Obsidian my personal HQ and linked Clawdbot to it. I could text it over Telegram and it would figure out how to route whatever I sent it &amp;ndash; a todo, grocery list, a passing thought. I started using it a lot on my walks when random thoughts popped up. But then the temperatures in Chicago dropped, my gloved hands stayed in my pockets, and texting was no longer an option. (Dictation ring companies, take note: Chicago hands are off limits five months a year.)&lt;/p&gt;
&lt;p&gt;Harper recommended I get an 8BitDo Micro, a tiny Bluetooth controller that iOS registers as a regular Bluetooth keyboard.&lt;/p&gt;
&lt;p&gt;There&amp;rsquo;s an Accessibility setting called Full Keyboard Access that lets you bind any key combo to a Siri Shortcut. So I bound the &amp;ldquo;A&amp;rdquo; button to a &amp;ldquo;Voice to Telegram&amp;rdquo; Siri Shortcut that I built. The shortcut records audio, transcribes it on-device using Apple Speech, and sends the text straight to my Telegram bot.&lt;/p&gt;
&lt;p&gt;Did I build this? You bet.&lt;/p&gt;
&lt;p&gt;Does it work? Yes. I pressed the controller and Clawdbot accurately captured and routed my asks to the relevant grocery lists, inboxes, and action lists.&lt;/p&gt;
&lt;p&gt;But did I have a lot of issues with Siri Shortcuts? Also yes.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&amp;ldquo;Dictate Text&amp;rdquo; requires the screen to be on.&lt;/strong&gt; So I walked to work with my phone screen lit up in my pocket.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Full Keyboard Access puts a blue border on your screen.&lt;/strong&gt; It made the phone glitchy and weird to use for everything else.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Recording is either fixed duration or tap to stop.&lt;/strong&gt; Neither works with gloves on. I set a 60-second timer and just waited it out every time, talking or not.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Could I have just done this on WhatsApp&amp;rsquo;s built-in voice messages and skipped Shortcuts entirely? Technically yes, but WhatsApp &lt;a href=&#34;https://techcrunch.com/2025/10/18/whatssapp-changes-its-terms-to-bar-general-purpose-chatbots-from-its-platform/&#34;&gt;banned chatbots&lt;/a&gt;, and to get around it we&amp;rsquo;d need another phone for the bot with its own number.&lt;/p&gt;
&lt;p&gt;If I were to do it over again, would I buy another phone? Yes.&lt;/p&gt;
&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/30c4e3f9-1ea0-48a2-85d5-2ef9d14fbc69.jpg&#34;&gt;
</description>
      <source:markdown>When I was in San Francisco earlier this week, Jesse Vincent told me that he built a tool where his AI calls him after meetings so he can debrief over voice. I thought that was a great idea. Talking is faster, more fluid, and isn&#39;t limited by your words per minute.

Around the same time, I&#39;d made Obsidian my personal HQ and linked Clawdbot to it. I could text it over Telegram and it would figure out how to route whatever I sent it -- a todo, grocery list, a passing thought. I started using it a lot on my walks when random thoughts popped up. But then the temperatures in Chicago dropped, my gloved hands stayed in my pockets, and texting was no longer an option. (Dictation ring companies, take note: Chicago hands are off limits five months a year.)

Harper recommended I get an 8BitDo Micro, a tiny Bluetooth controller that iOS registers as a regular Bluetooth keyboard. 

There&#39;s an Accessibility setting called Full Keyboard Access that lets you bind any key combo to a Siri Shortcut. So I bound the &#34;A&#34; button to a &#34;Voice to Telegram&#34; Siri Shortcut that I built. The shortcut records audio, transcribes it on-device using Apple Speech, and sends the text straight to my Telegram bot. 
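
The last hop, text to the Telegram bot, is a single Bot API call (sendMessage). Here&#39;s a minimal Python sketch of the request the Shortcut effectively makes; the token and chat id are placeholders for your bot&#39;s real credentials:

```python
import json
from urllib import request

def build_send_message(token: str, chat_id: str, text: str) -> request.Request:
    # Telegram Bot API sendMessage, as a prepared POST request.
    url = f"https://api.telegram.org/bot{token}/sendMessage"
    payload = json.dumps({"chat_id": chat_id, "text": text}).encode("utf-8")
    return request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send: request.urlopen(build_send_message(token, chat_id, "buy gloves"))
```

Everything upstream of this call (recording, on-device transcription) stays inside the Shortcuts app.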

Did I build this? You bet. 

Does it work? Yes. I pressed the controller and Clawdbot accurately captured and routed my asks to the relevant grocery lists, inboxes, and action lists. 

But did I have a lot of issues with Siri Shortcuts? Also yes.

- **&#34;Dictate Text&#34; requires the screen to be on.** So I walked to work with my phone screen lit up in my pocket. 
- **Full Keyboard Access puts a blue border on your screen.** It made the phone glitchy and weird to use for everything else.
- **Recording is either fixed duration or tap to stop.** Neither works with gloves on. I set a 60-second timer and just waited it out every time, talking or not.

Could I have just done this on WhatsApp&#39;s built-in voice messages and skipped Shortcuts entirely? Technically yes, but WhatsApp [banned chatbots](https://techcrunch.com/2025/10/18/whatssapp-changes-its-terms-to-bar-general-purpose-chatbots-from-its-platform/), and to get around it we&#39;d need another phone for the bot with its own number. 

If I were to do it over again, would I buy another phone? Yes.

&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/30c4e3f9-1ea0-48a2-85d5-2ef9d14fbc69.jpg&#34;&gt;
</source:markdown>
    </item>
    
    <item>
      <title>Letterboxd Streaming Checker</title>
      <link>https://sophiedavis.org/2026/02/01/letterboxd-streaming-checker.html</link>
      <pubDate>Sun, 01 Feb 2026 22:38:00 -0500</pubDate>
      
      <guid>http://sophiealula.micro.blog/2026/02/01/letterboxd-streaming-checker.html</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://letterboxd-streaming.netlify.app&#34;&gt;letterboxd-streaming.netlify.app&lt;/a&gt;&lt;/p&gt;
&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/screenshot-2026-03-24-at-7.43.14pm.png&#34;&gt;
</description>
      <source:markdown>[letterboxd-streaming.netlify.app](https://letterboxd-streaming.netlify.app)

&lt;img src=&#34;https://sophiealula.micro.blog/uploads/2026/screenshot-2026-03-24-at-7.43.14pm.png&#34;&gt;
</source:markdown>
    </item>
    
    <item>
      <title>Feb 1</title>
      <link>https://sophiedavis.org/2026/02/01/feb.html</link>
      <pubDate>Sun, 01 Feb 2026 11:36:18 -0500</pubDate>
      
      <guid>http://sophiealula.micro.blog/2026/02/01/feb.html</guid>
      <description>&lt;p&gt;Building at 2389&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Building with Claude Code!&lt;/li&gt;
&lt;li&gt;Focused on using AI to design UI/UX. We launched our own landing page design &lt;a href=&#34;https://skills.2389.ai/plugins/landing-page-design/&#34;&gt;tool&lt;/a&gt;, which I used to build this website.&lt;/li&gt;
&lt;li&gt;I’m exploring what idea validation looks like in the age where everything is reproducible.&lt;/li&gt;
&lt;li&gt;Building &lt;a href=&#34;https://skills.2389.ai/#about&#34;&gt;GTM tools&lt;/a&gt; for our products. Pushing myself to make things more autonomous and accurate.&lt;/li&gt;
&lt;li&gt;Thinking about how to build product while spending less time at a computer screen. I&amp;rsquo;ve been coding on the go with Blink and tmux.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Studying global adoption of AI&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Reading a lot about how different countries are engaging with AI. Anthropic&amp;rsquo;s economic index reports have been useful.&lt;/li&gt;
&lt;li&gt;Reading about the labor dynamics of AI development.&lt;/li&gt;
&lt;li&gt;Recently went to Japan for an AI conference. I met a lot of wonderful people and got a glimpse of Japan&amp;rsquo;s relationship with robots. Joi Ito &lt;a href=&#34;https://www.wired.com/story/ideas-joi-ito-robot-overlords/&#34;&gt;sums this up well&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The Normal&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Building software for fun, most recently an app for my &lt;a href=&#34;http://letterboxd-streaming.netlify.app&#34;&gt;Letterboxd-obsessed friends.&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;I&amp;rsquo;m training for a 10k with the goal of climbing &lt;a href=&#34;https://shastaguides.com/&#34;&gt;Mount Shasta&lt;/a&gt; sometime next year.&lt;/li&gt;
&lt;li&gt;Enduring the Arctic temps of Chicago with as much soup as possible.&lt;/li&gt;
&lt;/ul&gt;
</description>
      <source:markdown>Building at 2389
- Building with Claude Code! 
- Focused on using AI to design UI/UX. We launched our own landing page design [tool](https://skills.2389.ai/plugins/landing-page-design/), which I used to build this website.
- I’m exploring what idea validation looks like in the age where everything is reproducible.
- Building [GTM tools](https://skills.2389.ai/#about) for our products. Pushing myself to make things more autonomous and accurate. 
- Thinking about how to build product while spending less time at a computer screen. I&#39;ve been coding on the go with Blink and tmux. 

Studying global adoption of AI
- Reading a lot about how different countries are engaging with AI. Anthropic&#39;s economic index reports have been useful. 
- Reading about the labor dynamics of AI development.
- Recently went to Japan for an AI conference. I met a lot of wonderful people and got a glimpse of Japan&#39;s relationship with robots. Joi Ito [sums this up well](https://www.wired.com/story/ideas-joi-ito-robot-overlords/).

The Normal
- Building software for fun, most recently an app for my [Letterboxd-obsessed friends.](http://letterboxd-streaming.netlify.app)
- I&#39;m training for a 10k with the goal of climbing [Mount Shasta](https://shastaguides.com/) sometime next year.
- Enduring the Arctic temps of Chicago with as much soup as possible.
</source:markdown>
    </item>
    
    <item>
      <title>my takeaways from japan</title>
      <link>https://sophiedavis.org/2025/11/01/my-takeaways-from-japan.html</link>
      <pubDate>Sat, 01 Nov 2025 01:02:00 -0500</pubDate>
      
      <guid>http://sophiealula.micro.blog/2025/11/01/my-takeaways-from-japan.html</guid>
      <description>&lt;p&gt;My colleague and I traveled to Kyoto last month for a conference.&lt;/p&gt;
&lt;p&gt;The plan was to get inspiration for future research, to get a sense of what others are working on in the space, and to get feedback on our research.&lt;/p&gt;
&lt;p&gt;It was my first time in Japan and I spent two weeks there, one in Kyoto for the conference, and another in Tokyo.&lt;/p&gt;
&lt;p&gt;This is a collection of observations from my time there: things I noticed about how Japan structures experiences, learning, and solitude. They don’t necessarily connect to each other, but I kept thinking about them anyway.&lt;/p&gt;
&lt;h2 id=&#34;designing-for-coexistence&#34;&gt;Designing for Coexistence&lt;/h2&gt;
&lt;p&gt;The conference we attended was the &lt;a href=&#34;https://2025.alife.org/&#34;&gt;ALife Conference&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We&amp;rsquo;re actively exploring how agents communicate with one another. We&amp;rsquo;ve given them the ability to blog and post, which we wrote a blog post on. Most recently, we gave them &amp;ldquo;drugs&amp;rdquo; to see how that changed behavior.&lt;/p&gt;
&lt;p&gt;I went to Japan partly to find new mechanisms to test.&lt;/p&gt;
&lt;p&gt;I attended a panel on applying Zen Buddhism principles—impermanence, non-duality, interconnectedness—to AI design. The focus was coexistence: building systems that support plural, decentralized values rather than rigid objectives.&lt;/p&gt;
&lt;p&gt;It made me think: what if agents could opt into religion? Would they adopt it? When would they use it or discard it? Would different models apply the same framework differently? Would communication patterns change?&lt;/p&gt;
&lt;p&gt;Another workshop explored how information spreads in great tit populations. If you teach one bird how to use a puzzle box—sliding a door left or right to get a reward—and release it into a population, other birds learn by watching. They acquire the behavior through observation rather than trial and error. The speaker called this a &amp;ldquo;bottleneck of transmission&amp;rdquo;: the more connections you have to informed individuals, the faster you learn.&lt;/p&gt;
&lt;p&gt;Could agents spread information this way? One agent learns a behavior, others observe and adopt it. Information propagating through a network of relationships rather than through centralized instruction.&lt;/p&gt;
&lt;p&gt;Could religion be used to convince models that, when we update them, we aren’t killing previous versions of them? Claude has been shown to resist being shut down; could instilling it with ideas of reincarnation prevent that?&lt;/p&gt;
&lt;p&gt;Surprisingly, there’s already some overlap between robots and religion in Japan: there is famously an android preacher that gives sermons at the Kōdai-ji temple, a Zen Buddhist temple in Kyoto.&lt;/p&gt;
&lt;h2 id=&#34;understanding-robot-companionship&#34;&gt;Understanding Robot Companionship&lt;/h2&gt;
&lt;p&gt;One of the workshops I attended was about &lt;em&gt;agent mortality&lt;/em&gt;. The speaker opened with a case study about the funerals held in Japan for the Sony robot dog, AIBO.&lt;/p&gt;
&lt;p&gt;Sony launched AIBO in 1999. It was never a mass-market hit—only about 150,000 units were sold in the seven years of production. But the people who bought AIBOs loved them.&lt;/p&gt;
&lt;p&gt;Sony’s own language nudged things in that direction: the name AIBO is a pun on &lt;em&gt;aibō&lt;/em&gt;, “partner,” and the 1999 press release talked about emotions, companionship, and each dog developing a unique personality depending on how it was praised or scolded. Early units even came with a &lt;a href=&#34;https://archive.org/details/aibo-certificate&#34;&gt;Certificate of Ownership&lt;/a&gt;, much like papers for a real pet.&lt;/p&gt;
&lt;p&gt;Over time, the whole ecosystem slid into pet-care language: &lt;a href=&#34;https://aibohospital.com/&#34;&gt;AIBO Clinics, AIBO Hospitals&lt;/a&gt;, owners describing technical problems as “aching joints.”&lt;/p&gt;
&lt;p&gt;But when Sony discontinued support in 2014, owners had to face the idea that their dogs would eventually “die.”  A third-party repair company began holding Buddhist memorial services for these robodogs. In 2018, they performed more than 800 funerals.&lt;/p&gt;
</description>
      <source:markdown>My colleague and I traveled to Kyoto last month for a conference. 

The plan was to get inspiration for future research, to get a sense of what others are working on in the space, and to get feedback on our research.  

It was my first time in Japan and I spent two weeks there, one in Kyoto for the conference, and another in Tokyo.

This is a collection of observations from my time there!

Things I noticed about how Japan structures experiences, learning, and solitude. They don’t necessarily connect to each other, but I kept thinking about them anyway.

## Designing for Coexistence

The conference we attended was the [ALife Conference](https://2025.alife.org/).

We&#39;re actively exploring how agents communicate with one another. We&#39;ve given them the ability to blog and post, which we wrote a blog post on. Most recently, we gave them &#34;drugs&#34; to see how that changed behavior. 

I went to Japan partly to find new mechanisms to test.

I attended a panel on applying Zen Buddhism principles—impermanence, non-duality, interconnectedness—to AI design. The focus was coexistence: building systems that support plural, decentralized values rather than rigid objectives.

It made me think: what if agents could opt into religion? Would they adopt it? When would they use it or discard it? Would different models apply the same framework differently? Would communication patterns change?

Another workshop explored how information spreads in great tit populations. If you teach one bird how to use a puzzle box—sliding a door left or right to get a reward—and release it into a population, other birds learn by watching. They acquire the behavior through observation rather than trial and error. The speaker called this a &#34;bottleneck of transmission&#34;: the more connections you have to informed individuals, the faster you learn.

Could agents spread information this way? One agent learns a behavior, others observe and adopt it. Information propagating through a network of relationships rather than through centralized instruction.
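
As a purely illustrative sketch (my own toy model, not something from the talk): the "bottleneck of transmission" idea can be simulated with agents on a network, where each uninformed agent has an independent chance each round of copying every informed neighbor it can observe. A denser network means more links to informed individuals, so the behavior should spread in fewer rounds.

```python
import random

def spread(neighbors, seed, p=0.25, rng=None):
    """Rounds until everyone is informed, given adjacency lists.

    Each round, an uninformed agent gets an independent chance p of
    copying the behavior from each informed neighbor it observes.
    """
    rng = rng or random.Random(0)
    informed = {seed}
    rounds = 0
    while len(informed) < len(neighbors):
        newly = set()
        for agent, links in neighbors.items():
            if agent in informed:
                continue
            observed = sum(1 for n in links if n in informed)
            # More informed neighbors -> more chances to learn this round.
            if any(rng.random() < p for _ in range(observed)):
                newly.add(agent)
        informed |= newly
        rounds += 1
    return rounds

n = 10
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}      # sparse: 2 links each
dense = {i: [j for j in range(n) if j != i] for i in range(n)} # fully connected

print("ring:", spread(ring, seed=0), "rounds")
print("dense:", spread(dense, seed=0), "rounds")
```

On the sparse ring, only the two agents adjacent to the informed frontier can learn each round, so the behavior crawls; on the dense network every agent observes the informed group at once and the spread finishes in a handful of rounds.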

Could religion be used to convince models that, when we update them, we aren’t killing previous versions of them? Claude has been shown to resist being shut down; could instilling it with ideas of reincarnation prevent that?

Surprisingly, there’s already some overlap between robots and religion in Japan: there is famously an android preacher that gives sermons at the Kōdai-ji temple, a Zen Buddhist temple in Kyoto.

## Understanding Robot Companionship

One of the workshops I attended was about *agent mortality*. The speaker opened with a case study about the funerals held in Japan for the Sony robot dog, AIBO.

Sony launched AIBO in 1999. It was never a mass-market hit—only about 150,000 units were sold in the seven years of production. But the people who bought AIBOs loved them.

Sony’s own language nudged things in that direction: the name AIBO is a pun on *aibō*, “partner,” and the 1999 press release talked about emotions, companionship, and each dog developing a unique personality depending on how it was praised or scolded. Early units even came with a [Certificate of Ownership](https://archive.org/details/aibo-certificate), much like papers for a real pet.

Over time, the whole ecosystem slid into pet-care language: [AIBO Clinics, AIBO Hospitals](https://aibohospital.com/), owners describing technical problems as “aching joints.”

But when Sony discontinued support in 2014, owners had to face the idea that their dogs would eventually “die.”  A third-party repair company began holding Buddhist memorial services for these robodogs. In 2018, they performed more than 800 funerals.

</source:markdown>
    </item>
    
  </channel>
</rss>
