arkhamintelligence.com — AI Search Visibility Report
Overall score: 59/100
LLMao scored arkhamintelligence.com 59/100 across eight LLM-readiness categories, including crawlability, semantic content, structured data, authority signals, and answer-engine clarity.
Analyzed URL
https://arkhamintelligence.com
Category breakdown
- readability: 90/100 — Excellent clarity and jargon handling for a technical crypto product.
- schema_markup: 0/100 — No JSON-LD schema detected in the provided HTML.
- authority_trust: 60/100 — Strong social proof and contact info, but lacks clear publication dates and editorial standards.
- citation_sources: 40/100 — Claims are made regarding reserves and analytics but lack external primary source citations.
- content_freshness: 0/100 — No visible publication or modification dates found in text or metadata.
- content_structure: 70/100 — Good use of H3s and sections, but the page lacks a clear H1-H2 hierarchy and semantic HTML tags like <article>.
- entity_definition: 60/100 — Brand consistency is high, but lacks a dedicated About page and Person schema for authors.
- technical_accessibility: 80/100 — Strong meta descriptions and social meta tags, though AI-specific crawler directives are unverified.
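The unverified crawler directives noted above could be resolved with explicit allow rules in robots.txt. A minimal sketch follows; GPTBot (OpenAI), ClaudeBot (Anthropic), and PerplexityBot are real crawler user-agent tokens, but the sitemap URL is an assumed path, not one verified on the live site:

```text
# robots.txt — explicitly allow AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Assumed sitemap location — verify before deploying
Sitemap: https://arkhamintelligence.com/sitemap.xml
```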
Top recommendations
- Implement JSON-LD Schema Markup (Schema Markup): Add FAQPage schema for the existing FAQ section and Organization schema to define the entity.
- Add Content Freshness Signals (Content Freshness): Add a visible 'Last Updated' date or publication date to the homepage and tutorials to signal freshness to LLMs.
- Strengthen Entity Definition (Entity Definition): Create a dedicated 'About Us' page with team bios and clear entity relationships between Arkham Intel and Arkham Exchange.
- Improve Claim Verification with Citations (Citation & Source Quality): Add outbound links to primary data sources or regulatory documentation to verify claims about 'Proof of Reserves'.
- Optimize AI Crawler Access (Technical Accessibility): Ensure the robots.txt explicitly allows AI crawlers like GPTBot and ClaudeBot so the site is fully indexed.
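The schema and freshness recommendations above can be combined in one JSON-LD block placed in the page <head>. This is a sketch using standard schema.org types; the FAQ question and answer text, the sameAs URL, and the dateModified value are placeholders to be replaced with the site's actual content:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "name": "Arkham Intelligence",
      "url": "https://arkhamintelligence.com",
      "sameAs": ["https://example.com/placeholder-social-profile"]
    },
    {
      "@type": "WebPage",
      "url": "https://arkhamintelligence.com",
      "dateModified": "2024-01-01"
    },
    {
      "@type": "FAQPage",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "Placeholder question from the existing FAQ section",
          "acceptedAnswer": {
            "@type": "Answer",
            "text": "Placeholder answer copied verbatim from the visible FAQ."
          }
        }
      ]
    }
  ]
}
</script>
```

The @graph wrapper lets the Organization, WebPage, and FAQPage nodes share one script tag; they can equally be emitted as separate blocks if the site's templating makes that easier.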