Systems
Machine-readable files that govern how search engines, AI systems, and crawlers interact with the Visualist site. Not visitor-facing. Maintained by the product team and updated whenever site structure or positioning changes.
Three files live at the root of visualistapp.com and are never linked in the main navigation.
An XML file listing every indexable URL on visualistapp.com with its last modification date. Search engines use it to discover and prioritize pages for crawling. It is the definitive record of what pages exist and when they were last updated.
Located at: visualistapp.com/sitemap.xml
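A minimal sketch of what a sitemap entry at this location could look like, following the sitemaps.org 0.9 protocol. The /pricing path and the lastmod dates are illustrative, not the live inventory.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page; lastmod uses W3C date format -->
  <url>
    <loc>https://visualistapp.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://visualistapp.com/pricing</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>
```

Search engines read lastmod to prioritize recrawling pages that have changed, so entries should be regenerated whenever a page is updated.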
/offers/ and /partners/ campaign pages are excluded.

A plain text file that tells web crawlers which parts of the site they are allowed to crawl. It is not a security mechanism: it is a directive that well-behaved crawlers follow. Malicious crawlers ignore it.
Located at: visualistapp.com/robots.txt
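A sketch of a default-allow robots.txt at this location. The Disallow paths shown are hypothetical examples of "specific paths with a clear reason", not the live policy.

```
# Default posture: allow everything; disallow only with a clear reason
User-agent: *
Disallow: /offers/
Disallow: /partners/

# Point crawlers at the canonical page inventory
Sitemap: https://visualistapp.com/sitemap.xml
```

An empty Disallow under `User-agent: *` would also express "allow everything"; listing only the exceptions keeps the default-allow intent explicit.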
Allow: /. Only specific paths are disallowed where there is a clear reason.

A plain-language file that tells AI systems what Visualist is, who it is for, and which pages are most authoritative. AI systems that consult llms.txt before crawling receive structured guidance on how to represent Visualist in answers. It is Visualist's direct communication channel with AI-powered search.
Located at: visualistapp.com/llms.txt
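A sketch of the file's shape, following the proposed llms.txt convention (an H1 title, a blockquote summary, then H2 sections of annotated links). The summary wording and page list here are illustrative placeholders.

```markdown
# Visualist

> One-sentence description of what Visualist is and who it is for.

## Key pages

- [Home](https://visualistapp.com/): what Visualist is
- [Pricing](https://visualistapp.com/pricing): plans and positioning
```

The blockquote summary is what an AI system is most likely to quote verbatim, so it should match the site's current positioning exactly.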
llms.txt is to AI search what robots.txt was to early search engines in the 1990s: an early-stage convention that the most sophisticated actors already follow and that will become standard practice. Visualist maintains it from day one.