danluu.com — AI Search Visibility Report
Overall score: 48/100
AI search visibility analysis for danluu.com. LLMao scored danluu.com 48/100 across eight LLM-readiness categories: readability, schema markup, authority & trust, citation sources, content freshness, content structure, entity definition, and technical accessibility.
Analyzed URL: danluu.com
Category breakdown
- readability: 85/100 — Excellent clarity and jargon handling, though sentence length varies significantly.
- schema_markup: 0/100 — Completely absent. No JSON-LD or Microdata found in the source.
- authority_trust: 45/100 — Lacks formal credentials, contact pages, or trust documents, though the 'Corrections' page is a strong trust signal.
- citation_sources: 70/100 — High quality primary sources and data-driven claims, though formal inline citations are inconsistent.
- content_freshness: 75/100 — Regularly updated with clear publication dates, though lacks 'modified' metadata.
- content_structure: 30/100 — Extremely minimalist. Lacks standard heading tags (H1-H6) and semantic HTML landmarks.
- entity_definition: 25/100 — No dedicated About page or Person schema to define the 'Dan Luu' entity for LLMs.
- technical_accessibility: 40/100 — No meta descriptions or social meta tags. Content is fully readable without JavaScript, which is a plus.
Top recommendations
- Add Structured Data (JSON-LD) (Schema Markup): Implement JSON-LD BlogPosting and Person schema for every article to help LLMs identify entities, authors, and publication dates.
- Implement Meta Descriptions (Technical Accessibility): Add a meta description tag to the homepage and all subpages to provide a concise summary for LLM snippets.
- Improve Heading Hierarchy (Content Structure): Use H1 and H2 tags for section headers (e.g., 'Public Posts', 'Patreon Posts') instead of plain text or CSS-styled elements.
- Establish Entity Authority (About Page) (Authority & Trust): Create a dedicated 'About' page with a professional bio, links to social profiles, and contact information to establish E-E-A-T.
- Optimize AI Crawler Access (Technical Accessibility): Create a robots.txt file that explicitly allows AI crawlers like GPTBot and ClaudeBot to ensure full indexing.
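
The first three recommendations above can be sketched as a single page skeleton. This is a hypothetical example: the headline, date, and description text are placeholders, not values taken from danluu.com.

```html
<!-- Placeholder values throughout; adapt per article -->
<head>
  <!-- Recommendation 2: concise summary for LLM snippets -->
  <meta name="description" content="One-sentence summary of the post.">

  <!-- Recommendation 1: BlogPosting + Person schema as JSON-LD -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "Example post title",
    "datePublished": "2024-01-01",
    "author": {
      "@type": "Person",
      "name": "Dan Luu",
      "url": "https://danluu.com"
    }
  }
  </script>
</head>
<body>
  <!-- Recommendation 3: real heading tags instead of styled plain text -->
  <h1>Example post title</h1>
  <h2>Public Posts</h2>
  <h2>Patreon Posts</h2>
</body>
```

With this in place, an LLM crawler can tie the page to a named author entity and a publication date without inferring them from prose.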
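
For the crawler-access recommendation, a minimal robots.txt might look like the following. GPTBot and ClaudeBot are real user-agent tokens; the Sitemap line assumes a sitemap exists at that path, which this report does not verify.

```text
# Hypothetical robots.txt sketch: explicitly allow AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: *
Allow: /

# Assumption: a sitemap is published at this URL
Sitemap: https://danluu.com/sitemap.xml
```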