Project Type: Live

  • francisco-carracedo.com

    Why publish a meta case study

    The most expensive part of evaluating a freelance developer is figuring out whether they actually build what they say they build. Most portfolios solve that problem by listing client logos and asking you to trust the copy. This one solves it by being the thing it sells: every architectural decision is in the source you can inspect from your browser right now.

    The brief I gave myself

    • Double as proof of what I sell — headless WordPress, Next.js, AEO, multilingual.
    • Load fast on a typical UK mobile connection, not just on my laptop.
    • Surface correctly in AI Overviews, Perplexity, and ChatGPT Search — answer-engine first, not page-rank first.
    • Run without monthly DevOps overhead — one Vercel project, one WordPress host, zero servers I need to babysit.

    Architecture

    WordPress is the CMS. Posts, projects, services, taxonomies, ACF Pro field groups — all the editorial surface lives there. The frontend is a Next.js 16 App Router build that fetches over WPGraphQL with ISR-flavored revalidation: the content is dynamic to the editor but static to the visitor. The CMS and the frontend can fail independently — if WP goes down, last-good content keeps serving from the edge.
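    The revalidation pattern above can be sketched in a few lines. The query shape and revalidation window are assumptions for illustration, not the site's actual source:

    ```typescript
    // Sketch: a WPGraphQL POST with ISR-style revalidation. Next.js extends
    // fetch() with a `next.revalidate` option: the response is cached at the
    // edge and re-fetched in the background at most once per window — which is
    // what makes content "dynamic to the editor but static to the visitor".
    type GraphQLRequest = { query: string; variables?: Record<string, unknown> };

    function buildWpRequest(req: GraphQLRequest, revalidateSeconds = 3600) {
      return {
        method: "POST" as const,
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(req),
        next: { revalidate: revalidateSeconds }, // hypothetical window
      };
    }

    // Conceptual usage inside a Server Component:
    // const res = await fetch(endpoint, buildWpRequest({ query: "{ posts { nodes { title } } }" }));
    ```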

    The two systems are joined only by a single server-side env var, and the GraphQL client deliberately drops the NEXT_PUBLIC_ prefix so the endpoint never reaches the browser bundle. That detail matters because the WP install is the only system holding secrets; if the endpoint leaked into client-side code, the hardening at every other layer would amount to a false sense of security.
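    A minimal sketch of that seam, assuming a hypothetical WP_GRAPHQL_URL variable name (the real one isn't published):

    ```typescript
    // Because the name lacks the NEXT_PUBLIC_ prefix, Next.js never inlines
    // the value into the client bundle — it is readable only in server code
    // (Server Components, route handlers, server actions).
    function graphqlEndpoint(): string {
      const url = process.env.WP_GRAPHQL_URL; // hypothetical var name
      if (!url) throw new Error("WP_GRAPHQL_URL is not set");
      return url;
    }
    ```

    Failing loudly when the var is unset keeps a misconfigured deploy from silently serving a frontend with no CMS behind it.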

    The AEO surface

    Every page emits a stack of JSON-LD schemas — Person, WebSite, Service, Article, BreadcrumbList, FAQPage — with stable @id values so crawlers consolidate the site as a single entity, not as a handful of near-duplicate pages. A dynamic /llms.txt route serves the canonical summary that LLM ingestion agents grab when they want the elevator pitch without scraping HTML. The FAQ section on /services is written with literal question-as-heading + concise-answer-below — the exact shape AI Overviews extract verbatim.
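    A hedged sketch of what a stable-@id schema factory can look like — property values are placeholders, not the site's actual output:

    ```typescript
    // Each schema carries a fixed "@id" anchored to the site origin, and other
    // schemas reference that id instead of duplicating the entity. Crawlers can
    // then consolidate every page's JSON-LD into one graph node per entity.
    const SITE = "https://francisco-carracedo.com";

    function personSchema() {
      return {
        "@context": "https://schema.org",
        "@type": "Person",
        "@id": `${SITE}/#person`, // stable on every page that emits it
        name: "Francisco Carracedo",
        url: SITE,
      };
    }

    function websiteSchema() {
      return {
        "@context": "https://schema.org",
        "@type": "WebSite",
        "@id": `${SITE}/#website`,
        url: SITE,
        publisher: { "@id": `${SITE}/#person` }, // a reference, not a copy
      };
    }
    ```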

    i18n done properly

    The portal ships in English (UK target market) and Spanish. The <html lang> is dynamic per locale, every page generates alternates.languages with hreflang and x-default, the sitemap emits one entry per URL per locale with paired alternates, and a build-time script enforces parity between the EN and ES bundles — the build fails if a key exists in one and not the other. When the WP-side content isn’t translated (currently the case for project bodies), a discreet “Available in English” banner appears for ES visitors with a direct route swap. No silent fallback.
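    The parity check can be as small as a key diff over the two message bundles. This is a sketch under an assumed nested-JSON bundle shape, not the site's actual build script:

    ```typescript
    // Flatten nested translation objects into dotted key paths, then diff the
    // two sets. A CI step would exit non-zero if either list is non-empty.
    function flattenKeys(obj: Record<string, unknown>, prefix = ""): string[] {
      return Object.entries(obj).flatMap(([k, v]) =>
        typeof v === "object" && v !== null
          ? flattenKeys(v as Record<string, unknown>, `${prefix}${k}.`)
          : [`${prefix}${k}`]
      );
    }

    function missingKeys(en: Record<string, unknown>, es: Record<string, unknown>) {
      const enKeys = new Set(flattenKeys(en));
      const esKeys = new Set(flattenKeys(es));
      return {
        missingInEs: [...enKeys].filter((k) => !esKeys.has(k)),
        missingInEn: [...esKeys].filter((k) => !enKeys.has(k)),
      };
    }
    ```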

    Performance

    Lighthouse 90+ is the floor I commit to on every page. next/image serves AVIF/WebP variants under a per-context sizes hint, the LCP image is fetched eagerly via priority, fonts are subsetted, and most of the site renders as Server Components — the JS bundle that reaches the visitor is a small fraction of what the page contains.

    What you can verify in 30 seconds

    • View Source on any page — find the application/ld+json blocks and the hreflang tags.
    • Open /llms.txt in your browser — that’s the page LLM agents see.
    • Open /sitemap.xml — every URL is listed once per locale with alternates.
    • Switch the locale in the navbar — the URL rewrites, the <html lang> changes, and the case study you’re on swaps cleanly.
    • Run Lighthouse on any route from DevTools — see the floor.

    What this case study isn’t

    It isn’t a client project. It’s the demonstrator I built between client work to make the rest of the offer concrete. If you’re considering a similar build for your own practice or your client’s, the patterns here transfer directly — the WP install is decoupled, the i18n setup is reusable, and the JSON-LD factory is generic enough to drop into any Next App Router project.

  • TrillAI Chat

    The problem

    Small and mid-size WooCommerce merchants can’t afford an AI team, a separate SaaS dashboard, or a 6-week integration project. They need AI assistance that installs as a plugin, configures in ten minutes, and understands their actual catalogue — without re-architecting the store or paying a per-seat fee for every checkout assistant.

    The approach

    TrillAI Chat is a native WordPress + WooCommerce plugin. Once activated, it ingests the product catalogue through WP’s own REST API, indexes the data into a vector store for retrieval, and exposes a chat widget on the storefront via shortcode or Gutenberg block. The plugin is provider-agnostic — OpenAI and Anthropic are both supported so the merchant never gets locked into a single LLM vendor.
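    The plugin itself runs in PHP; as a language-neutral sketch, the retrieval step — embed the shopper's question, rank indexed catalogue chunks by cosine similarity — looks roughly like this, assuming a pre-computed index:

    ```typescript
    // Illustrative only: the index shape, vector dimensions, and k are
    // assumptions, and the real embedding call goes to the configured provider.
    type Indexed = { productId: number; text: string; vector: number[] };

    function cosine(a: number[], b: number[]): number {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
      }
      return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // Return the k catalogue chunks most similar to the query embedding;
    // these become the context the LLM answers from.
    function topK(queryVec: number[], index: Indexed[], k = 3): Indexed[] {
      return [...index]
        .sort((x, y) => cosine(queryVec, y.vector) - cosine(queryVec, x.vector))
        .slice(0, k);
    }
    ```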

    Architecture decisions worth naming

    • Plugin-first, not SaaS-first. Customer data stays inside the merchant’s WordPress install. The AI provider call is the only external hop, and the keys live in the merchant’s own admin.
    • Capability-tiered, not feature-flagged. The free tier ships meaningful capabilities (catalogue Q&A, multilingual EN/ES). Paid tiers add reasoning depth and the AI ROI Surface — instrumentation that measures the chat’s contribution to revenue, not just its message count.
    • Multilingual from day one. The widget operates in the same language as the storefront’s locale, with hreflang-aware routing so a Spanish shopper gets a Spanish assistant on a bilingual store.
    • Distribution via WordPress.org. Listed in the official plugin directory so installation is a click from WP admin. No license server, no auto-update hijinks.
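    The provider-agnostic seam in the first bullet amounts to an adapter interface selected by a merchant setting. Names and the stubbed responses here are illustrative, not the plugin's real API:

    ```typescript
    // One interface, one adapter per vendor; the merchant's admin setting
    // picks which factory runs. Adding a vendor means adding an entry, not
    // touching the chat pipeline.
    interface ChatProvider {
      name: string;
      complete(prompt: string): Promise<string>;
    }

    const providers: Record<string, () => ChatProvider> = {
      openai: () => ({ name: "openai", complete: async (p) => `[openai stub] ${p}` }),
      anthropic: () => ({ name: "anthropic", complete: async (p) => `[anthropic stub] ${p}` }),
    };

    function providerFor(setting: string): ChatProvider {
      const make = providers[setting];
      if (!make) throw new Error(`Unknown provider: ${setting}`);
      return make();
    }
    ```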

    What this case study demonstrates

    If you’re a merchant evaluating WordPress AI plugins, the question you actually need to ask isn’t “does it work” — every demo works. The real questions are: where does customer data go, what happens when the LLM provider rate-limits you, how do you measure ROI, and what’s the upgrade path when your store crosses a threshold the plugin wasn’t built for. TrillAI Chat is the practical answer to those four questions, and it’s the example I point to when a client asks me to add AI to their existing WooCommerce store.

    Status

    Live on the WordPress.org plugin directory. Active development continues on the AI ROI Surface and tier-2 reasoning depth. Roadmap visible at trillai.io.