kurtisjustin.co.uk
A rebuild + structured refactor of my personal site so I can ship updates without sweating, keep SEO predictable, and make publishing case studies feel like editing content — not wrangling routes.
Personal Website · 2025 - 2026 · Digital Project Manager / Developer
Goals
- Create a content-driven case-study model so new entries don’t require touching routing code
- Make routing and canonical behaviour boringly consistent (no trailing slashes, no surprises)
- Implement a safe staging setup (noindex + no analytics leakage)
- Improve SEO reliability across metadata, schema and sitemap inclusion
- Keep the UI clean while improving accessibility and long-term maintainability
Constraints
- Preserve live routes and avoid regressions during refactor
- No invented business metrics (keep outcomes qualitative and honest)
- Static hosting constraints on DigitalOcean App Platform
- Staging and production had to remain operationally separated
Problem
The site grew in that classic side-project way: a few quick improvements here and there, and suddenly you’ve got a bunch of ‘how did this end up like that?’ moments. Routing, SEO output, analytics behaviour, and staging guardrails all needed tightening so the site stayed easy to evolve (and I stopped having to debug my own past decisions).
Approach
Phase 1: Inventory and guardrails
- Audited route structure, redirects, metadata, sitemap and analytics touchpoints
- Locked in a strict no-trailing-slash URL policy and ‘safe by default’ environment behaviour
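As a rough sketch of what a no-trailing-slash policy looks like in practice (the function name and details here are illustrative, not the site's actual code), a single normaliser can feed redirects, canonical tags and the sitemap so they all agree on one URL form:

```typescript
// Hypothetical helper: normalise every path once, so redirects,
// canonical URLs and sitemap entries can't drift apart.
function normalizePath(path: string): string {
  // Strip any trailing slashes, but keep the bare root as "/"
  const stripped = path.replace(/\/+$/, "");
  return stripped === "" ? "/" : stripped;
}
```

The point isn't the one-liner itself; it's that the policy lives in exactly one place instead of being re-implemented per route.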
Phase 2: Structure and content model
- Introduced typed content schemas (Zod) and moved case studies into content-first data
- Kept rendering component-driven, but made publishing content-owned
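The real schemas are defined with Zod; as a dependency-free sketch of the same ‘validate at the boundary’ idea (field names here are assumptions, not the actual content model), a case-study entry and its guard might look like:

```typescript
// Hypothetical content shape - the site defines this with Zod,
// but the contract is the same: content is data, checked on the way in.
interface CaseStudyEntry {
  slug: string;    // must be URL-safe, since routes derive from it
  title: string;
  summary: string;
  tags: string[];
}

function isCaseStudyEntry(value: unknown): value is CaseStudyEntry {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.slug === "string" &&
    /^[a-z0-9-]+$/.test(v.slug) &&
    typeof v.title === "string" &&
    typeof v.summary === "string" &&
    Array.isArray(v.tags) &&
    v.tags.every((t) => typeof t === "string")
  );
}
```

Because the slug is validated as part of the content, publishing a new case study can't produce an invalid route.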
Phase 3: Router and URL cleanup
- Consolidated route behaviour and legacy alias handling
- Ensured canonical output always matches the route policy
Phase 4: Consent + analytics gating
- Implemented localStorage-backed consent behaviour
- Added environment gating so staging doesn’t send analytics or get indexed
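Conceptually the gate is an AND of two independent checks, so a failure in either one fails safe. A minimal sketch (function and key names are illustrative):

```typescript
// Hypothetical double gate: analytics load only when the visitor has
// opted in AND the environment is production. Staging never tracks.
function shouldLoadAnalytics(
  env: string,                 // e.g. an injected build-time environment name
  consent: string | null,      // e.g. localStorage.getItem("analytics-consent")
): boolean {
  return env === "production" && consent === "granted";
}
```

Passing the consent value in (rather than reading localStorage inside the function) keeps the decision pure and easy to test.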
Phase 5: SEO and production hardening
- Standardised title/description/canonical/JSON-LD output
- Ensured case-study routes and sitemap entries are generated from the content model
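The key move is deriving canonical URL and JSON-LD from the same content record, so metadata can't drift from the route policy. A rough sketch, assuming a simplified entry shape (the origin constant and field names are assumptions):

```typescript
// Illustrative only: one content record feeds both the canonical URL
// and the schema.org JSON-LD, so they always agree.
const SITE_ORIGIN = "https://kurtisjustin.co.uk";

function caseStudyJsonLd(entry: { slug: string; title: string; summary: string }) {
  // Canonical follows the route policy: no trailing slash
  const canonical = `${SITE_ORIGIN}/case-studies/${entry.slug}`;
  return {
    "@context": "https://schema.org",
    "@type": "Article",
    headline: entry.title,
    description: entry.summary,
    url: canonical,
  };
}
```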
Implementation highlights
- Dynamic /case-studies/:slug routing driven by content data (not hardcoded pages)
- Filtering, sorting and search on the case-study list page via query params
- Environment-aware robots policy for staging (noindex by default)
- Analytics behind two gates: consent + environment
- Sitemap inclusion generated from case-study content
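To show how the list-page filtering above can stay URL-driven, here's a sketch that reads tag and search term straight from query params (e.g. `?tag=seo&q=routing`); the field names and param keys are illustrative, not the site's actual API:

```typescript
// Hypothetical list-page filter: state lives in the URL, so filtered
// views are shareable and survive reloads.
interface CaseStudySummary {
  slug: string;
  title: string;
  tags: string[];
}

function filterCaseStudies(
  items: CaseStudySummary[],
  params: URLSearchParams,
): CaseStudySummary[] {
  const tag = params.get("tag");
  const q = params.get("q")?.toLowerCase() ?? "";
  return items
    .filter((item) => (tag ? item.tags.includes(tag) : true))
    .filter((item) => item.title.toLowerCase().includes(q));
}
```

Keeping filter state in query params (rather than component state) also means the filtered pages are crawlable and linkable for free.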
Challenges
- Refactoring while keeping the live site stable (and not accidentally changing public URLs)
- Balancing content flexibility with strict type safety (without making authoring a pain)
- Preventing staging indexing and analytics leakage reliably
- Keeping the UX clean while adding more structure under the hood
Results
- Cleaner architecture and lower friction when making updates (less ‘where does this live?’ energy)
- More reliable SEO output and canonical consistency across pages
- Safer staging workflow with explicit non-indexing and tracking suppression
- Case studies are publishable via content updates rather than route code changes
What I'd do next
- Add richer media support and per-case-study OG images
- Expand automated link validation across case-study entries (catch broken links before they ship)
- Add a lightweight editorial checklist so case studies stay consistent over time
How AI helped
ChatGPT helped me turn messy ‘I’ll just tweak it’ work into a phased plan: audit → guardrails → structure → cleanup → hardening. Codex was most useful for the implementation passes: scanning the repo, applying targeted changes, and tightening the rough edges (routes, content model wiring, SEO consistency). AI sped up the execution, but the architectural choices and release decisions were still manual and review-driven — basically: faster hands, same brain.