- The Disavow file only takes effect after the disavowed links have been recrawled by Google. This can take up to 9 months if it is done wrong or left incomplete. more »
- Domain should be sufficient: Some voices claim that despite a domain-wide disavow, a recrawl of EVERY single page is necessary. In our opinion this sounds crazy, but then who knows; we’re not Google, you’re not Google, and I want to make sure you know… more »
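For reference, a minimal disavow file combines URL-level entries with domain-wide ones; Google ignores lines starting with `#`. The hostnames below are placeholders:

```text
# URLs disavowed individually:
http://spam.example.com/bad-page.html
http://spam.example.com/another-page.html

# A domain-wide disavow covers every page of that domain:
domain:spammy-directory.example.com
```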
- You can plan your future use of Link Crawl Budget in the Link Crawl Budget management dashboard. There you define how often each of your projects is recrawled. more »
- For every page that cannot be crawled successfully, a network error code is stored; evaluating the other link metrics is only possible if there are no network errors during crawling. Source vs. Target Network Error: a network error can occur on both ends of a link, the source… more »
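As an illustration only (this is not the tool's actual implementation, and the URLs are placeholders), a crawler might record a network error code for each end of a link like this:

```python
# Sketch: record an HTTP status, or a network error label if the
# request never completed (DNS failure, timeout, refused connection).
import requests

def fetch_status(url: str) -> str:
    try:
        response = requests.get(url, timeout=10)
        return str(response.status_code)
    except requests.exceptions.Timeout:
        return "NETWORK_ERROR: timeout"
    except requests.exceptions.ConnectionError:
        return "NETWORK_ERROR: connection failed"

# A network error can occur on either end of a link:
source = "https://linking-site.example.com/page.html"  # page carrying the link
target = "https://your-site.example.com/landing.html"  # page being linked to
print("source:", fetch_status(source))
print("target:", fetch_status(target))
```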
- KwDomain indicates the total number of ranking keywords for the whole domain, whereas KwPage counts the ranking keywords of a single page (according to SEMRush). Both are indicators of the strength of a page or a domain. The KwDomain metric… more »
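To make the distinction concrete, here is a toy sketch with made-up data: KwPage counts the ranking keywords of one page, while KwDomain counts them across the whole domain:

```python
# Illustrative only; the rankings data is invented for the example.
from collections import Counter

rankings = [
    ("https://example.com/shoes", "buy shoes"),
    ("https://example.com/shoes", "cheap shoes"),
    ("https://example.com/boots", "winter boots"),
]

# KwPage: number of ranking keywords for one specific page
kw_page = Counter(url for url, _ in rankings)

# KwDomain: total number of ranking keywords across the whole domain
kw_domain = len(rankings)

print(kw_page["https://example.com/shoes"])  # 2
print(kw_domain)                             # 3
```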
- The parameters of this package look at the links on a page and tell you how many links of each kind the page has. For example, the MetaRobots metric analyzes a site's robots meta tag and tells you if a page uses one or a combination of the values… more »
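A rough sketch of what such link counting could look like, assuming `requests` and BeautifulSoup; this is illustrative, not the product's code:

```python
# Count follow vs. nofollow links on a page and read its robots meta tag.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

follow, nofollow = 0, 0
for a in soup.find_all("a", href=True):
    rel = a.get("rel") or []  # rel is a multi-valued attribute
    if "nofollow" in rel:
        nofollow += 1
    else:
        follow += 1

# The robots meta tag can combine values, e.g. "noindex, nofollow"
meta = soup.find("meta", attrs={"name": "robots"})
meta_robots = meta.get("content", "(not set)") if meta else "(not set)"

print(f"follow links: {follow}, nofollow links: {nofollow}")
print(f"robots meta tag: {meta_robots}")
```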
- Link Status captures the result of crawling a link and of interpreting the tags in the HEAD and in the Robots.txt (Follow, NoFollow, Mention, Redirect, Unverified). more »
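One plausible way to model these status values in code; the decision rules below are simplified assumptions, not the tool's actual logic:

```python
from enum import Enum

class LinkStatus(Enum):
    FOLLOW = "Follow"          # link found, passes link equity
    NOFOLLOW = "NoFollow"      # rel="nofollow" or blocked via meta/robots
    MENTION = "Mention"        # URL appears in the text but not as a link
    REDIRECT = "Redirect"      # target answers with a 3xx redirect
    UNVERIFIED = "Unverified"  # page could not be crawled, status unknown

def classify(found_as_link: bool, rel_nofollow: bool,
             http_status: int | None) -> LinkStatus:
    if http_status is None:
        return LinkStatus.UNVERIFIED
    if 300 <= http_status < 400:
        return LinkStatus.REDIRECT
    if not found_as_link:
        return LinkStatus.MENTION
    return LinkStatus.NOFOLLOW if rel_nofollow else LinkStatus.FOLLOW

print(classify(found_as_link=True, rel_nofollow=False, http_status=200))
```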
- The IP addresses of webhosts are useful for identifying various footprints. Keep in mind, though, that the IP address shown here is often the IP of a CDN, not the real webhost. Because of this, the impact of the hosting IP is often over-weighted nowadays. Metric… more »
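A minimal sketch of resolving host IPs for footprint checks (the hostnames are placeholders); as noted above, a CDN-fronted site resolves to the CDN edge, not the real webhost:

```python
import socket

hosts = ["example.com", "example.org"]
for host in hosts:
    try:
        ip = socket.gethostbyname(host)
    except socket.gaierror:
        ip = "(unresolved)"
    print(f"{host} -> {ip}")

# Hosts sharing the same IP (or the same subnet) can indicate a
# hosting footprint, e.g. a link network run from one server.
```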