Tag: "Source Metrics"
- For every page that cannot be crawled successfully, a network error code is stored; analyzing a page is only possible if there are no network errors during crawling. Source vs. Target Network Error: a network error can occur on both ends of a link, the source… (see the crawler sketch after this list)
- KwDomain indicates the total number of ranking keywords for the whole domain, whereas KwPage evaluates the number of ranking keywords of the page (according to SEMRush). Both are indicators of the strength of a page or a domain. The KwDomain Metric…
- The parameters of this package look at the links on a page and tell you how many links of each kind a page has. Metric: MetaRobots. Description: analyzes a site's robots meta tag and tells you whether a page uses one or a combination of the values… (see the meta-tag parsing sketch after this list)
- The IP addresses of webhosts are useful for identifying various footprints. Keep in mind, though, that the IP address shown here is often the IP of a CDN, not the real webhost. Because of this, the impact of the hosting IP is often over-weighted nowadays. Metric…
- Power and Trust metrics for the source or target page
- Power and Trust metrics for the domains and top-domains (root-domains)
- Look at the link velocity trends and understand what natural or unnatural backlink growth looks like for any given linking domain. Link Velocity Trends describe the change in growth or decline; they do not relate directly to the number of won or lost links. Hence the trend can also be a negative number if a website still earns links, just far fewer than it used to. (see the velocity sketch after this list)
- Depending on how META tags and robots.txt are interpreted for different types of links and redirects, LRT displays different status icons next to each link in your reports.
- To help you quickly detect whether certain URLs are blocked by robots.txt, we provide detailed robots.txt metrics for every source page. Metric: Robots.txt Bots. Description: robots.txt allows general bots for the current URL. Metric: Robots.txt Googlebot. Description: robots.txt allows Googlebot for… (see the robots.txt sketch after this list)
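The first item mentions that a network error code is stored for every page that cannot be crawled successfully. Below is a minimal sketch of that idea using Python's `requests` library; the error labels and the function name `fetch_with_error_code` are illustrative, not LRT's actual codes.

```python
import requests

def fetch_with_error_code(url: str, timeout: float = 10.0):
    """Return (http_status, error_code); exactly one of the two is set."""
    try:
        response = requests.get(url, timeout=timeout, allow_redirects=True)
        return response.status_code, None
    except requests.exceptions.ConnectionError:
        return None, "CONNECTION_ERROR"  # covers DNS failures and refused connections
    except requests.exceptions.Timeout:
        return None, "TIMEOUT"
    except requests.exceptions.RequestException:
        return None, "OTHER_NETWORK_ERROR"

status, error = fetch_with_error_code("https://example.com/")
print(status if error is None else error)
```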
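The MetaRobots metric analyzes a page's robots meta tag. A minimal sketch of how such a tag could be read with Python's standard HTML parser; the class name `MetaRobotsParser` and the sample HTML are illustrative, not part of LRT.

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collects the values of a page's <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.values = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # values such as "noindex", "nofollow" may appear alone or combined
            self.values = [v.strip().lower() for v in attrs.get("content", "").split(",")]

parser = MetaRobotsParser()
parser.feed('<html><head><meta name="robots" content="noindex, follow"></head></html>')
print(parser.values)  # ['noindex', 'follow']
```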
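Link Velocity Trends describe the change in growth, not the raw number of won or lost links. A minimal sketch of that distinction, assuming a simple period-over-period difference and made-up monthly link counts:

```python
new_links_per_month = [120, 110, 90, 60, 40, 30]  # the domain still gains links every month

# trend = change in growth from one period to the next
velocity_trend = [
    current - previous
    for previous, current in zip(new_links_per_month, new_links_per_month[1:])
]
print(velocity_trend)  # [-10, -20, -30, -20, -10] -> negative trend despite ongoing link growth
```

The trend stays negative even though the site keeps earning links, which is exactly the case described above.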
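The robots.txt metrics report whether general bots and Googlebot are allowed to fetch a given URL. A minimal sketch using Python's standard robots.txt parser; the example.com URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # downloads and parses the robots.txt file

url = "https://example.com/some-page"
print("Robots.txt Bots:     ", parser.can_fetch("*", url))          # may any general bot fetch this URL?
print("Robots.txt Googlebot:", parser.can_fetch("Googlebot", url))  # may Googlebot fetch this URL?
```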