Bots, crawlers, and linking

Robots and website crawler documentation


Web taxonomy

Web taxonomy is the system used to classify and organise the content on a website — the set of categories, subcategories, tags, and labels that group pages by topic, type, or attribute. It's the structural layer that turns a collection of individual [...]

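The relationship described above — categories and tags grouping pages — can be sketched as a small data structure. The category names, tags, and URL paths below are hypothetical, purely for illustration:

```python
# A minimal sketch of a shop taxonomy: hypothetical categories and tags,
# each mapping to the URL paths of the pages they group.
taxonomy = {
    "categories": {
        "shoes": ["/shop/shoes/running/", "/shop/shoes/hiking/"],
        "accessories": ["/shop/accessories/socks/"],
    },
    "tags": {
        "waterproof": ["/shop/shoes/hiking/"],
    },
}

# A simple consistency check: every tagged page should also
# live under some category in the structural layer.
category_pages = {p for pages in taxonomy["categories"].values() for p in pages}
tagged_pages = {p for pages in taxonomy["tags"].values() for p in pages}
print(tagged_pages <= category_pages)
```

A check like this is useful when a taxonomy grows organically and tags start pointing at pages that no category claims.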

Sitemaps

A sitemap is a file that lists the URLs on a website to help search engines discover and crawl them efficiently. The most common format is the XML sitemap, which follows the sitemaps.org protocol and is read by Google, Bing, and other crawlers. What [...]

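For reference, a minimal XML sitemap in the sitemaps.org format looks like the fragment below. The URLs and date are placeholders, not part of any real site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/shop/</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>` and other optional tags are hints that crawlers may or may not use.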

Internal linking

Internal linking refers to links between pages on the same website/domain, e.g. https://www.example.com > https://www.example.com/shop/. While this is a basic function of any website, internal linking is one of the most common missed SEO [...]

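The internal/external distinction above comes down to whether a link's host matches the site's own host (relative links have no host and are internal by definition). A small sketch, using a hypothetical `is_internal` helper and the example domain from the text:

```python
from urllib.parse import urlparse

def is_internal(link, site_host="www.example.com"):
    """Return True if `link` points at a page on the same site.

    Relative links (no host component) are internal by definition;
    absolute links are internal only when their host matches site_host.
    """
    host = urlparse(link).netloc
    return host == "" or host == site_host

# The pair from the text: a link from the homepage to the shop page.
print(is_internal("https://www.example.com/shop/"))   # internal
print(is_internal("/shop/shoes/"))                    # relative, internal
print(is_internal("https://other.example/page"))      # external
```

A crawler or audit script can apply a check like this to every `<a href>` it finds to map a site's internal link graph.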