Website Analyzer Crawl Does Not Find All URLs

What to do if you only see a few links after a website crawl by the website analyzer program.

Remember: To see all options in A1 Website Analyzer, you will have to switch off easy mode.

When Website Analyzer Program Finds Too Few or Odd Page URLs

First read how A1 Website Analyzer helps find website linking problems. Then go through this checklist:

  • Using a firewall program? You may need to configure it if the website scan returns few URLs and they all have response code -4 : CommError.

  • Are you mixing www and non-www usage in website links and redirects? Check the externals tab to find out.

  • Does your website use cloaking, i.e. serve different content depending on the user agent string sent by the crawler? If so, change the user agent string the crawler uses to identify itself: General options | Internet crawler | User agent ID.
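
    A quick way to check for cloaking yourself is to fetch the same URL with two different user agent strings and compare the responses. Below is a minimal sketch using Python's standard library; the URL and user agent strings are placeholders, not values used by A1 Website Analyzer:

```python
import urllib.request

def build_request(url, user_agent):
    """Build a request that identifies itself with the given User-Agent."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

def fetch(url, user_agent):
    """Download a URL while posing as the given user agent."""
    with urllib.request.urlopen(build_request(url, user_agent), timeout=10) as resp:
        return resp.read()

# Example (not run here): compare the body served to a browser-like agent
# with the body served to the crawler's configured ID string.
#
#   as_browser = fetch("http://example.com/", "Mozilla/5.0 (Windows NT 10.0)")
#   as_crawler = fetch("http://example.com/", "my-crawler/1.0")
#   cloaking_suspected = (as_browser != as_crawler)

req = build_request("http://example.com/", "my-crawler/1.0")
print(req.get_header("User-agent"))  # my-crawler/1.0
```

    If the two bodies differ, the site is serving crawler-specific content, and changing the crawler's user agent ID as described above may change what the scan finds.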

  • Does your website, or pages in it, redirect to or pull content from another domain (e.g. through <frame> or <iframe>)? Check the externals tab to find out.

  • Is an entire section of pages hidden and not linked at all from the other parts of the website? In this case, cross-linking the hidden pages among themselves is no help! To solve this, you can use multiple start search paths.

  • Does the website rely on JavaScript or uncommon types of HTML link tags for website navigation, e.g. <iframe>, <form> and <button>? Solution: Enable checking these elements for links in Scan website | Crawler options.
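
    Extracting links from such tags can be illustrated with a small parser. This is only a sketch of the idea, not how A1 Website Analyzer is implemented; note that links generated purely by JavaScript require executing the script and are not found this way:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect link targets from tags beyond the usual <a href>."""

    # Tags and the attribute that carries their link target.
    LINK_ATTRS = {"a": "href", "frame": "src", "iframe": "src", "form": "action"}

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        wanted = self.LINK_ATTRS.get(tag)
        if wanted:
            for name, value in attrs:
                if name == wanted and value:
                    self.links.append(value)

parser = LinkExtractor()
parser.feed('<a href="/page.html">x</a>'
            '<iframe src="/embedded.html"></iframe>'
            '<form action="/search.php"><button>Go</button></form>')
print(parser.links)  # ['/page.html', '/embedded.html', '/search.php']
```

    A crawler that only reads <a href> would miss the last two URLs above, which is why enabling the extra checks matters on sites built this way.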

  • Does the website use // instead of / in links, and does the webserver not respond with an error or redirect in such cases? The problem can cascade when page URLs are linked using relative paths. Solution: Configure Scan website | Crawler options to handle this situation.

  • Does the website have a dynamic page that generates unique links based on GET query string data? This can sometimes cause an endless loop of unique URLs!
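
    One frequent cause is the same page being linked with its query parameters in different orders, so each ordering looks like a new URL to a crawler. The sketch below shows the idea of canonicalizing such URLs before comparing them; it illustrates the problem and is not A1 Website Analyzer's internal logic:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def canonicalize(url):
    """Normalize a URL by sorting its GET query parameters."""
    parts = urlparse(url)
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunparse(parts._replace(query=query))

# The same dynamic page, linked with parameters in different orders:
a = canonicalize("http://example.com/page.php?id=7&sort=asc")
b = canonicalize("http://example.com/page.php?sort=asc&id=7")
print(a == b)  # True: both normalize to the same URL
```

    Without some normalization or filtering, every parameter permutation (or an ever-changing parameter such as a session ID) counts as a new URL, which is how the endless loop arises.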

  • Webmaster crawl filters:
    Besides URL filtering support, you can also configure when filtered URLs are removed:
    • Website scan results: Scan website | Crawler options | Apply "webmaster" and "output filters" after website scan stops.

  • Are you scanning a website subdirectory that contains no links to pages within that directory? Check the externals tab to find out.

  • Consider whether your website is using non-standard file extensions. If you know which ones, you can add them:

    [screenshot: crawl file extensions options]

    [screenshot: list file extensions options]

    Alternatively, clear all file extensions in the analysis and output filters, but keep the default MIME filters in both places. Then try the scan again.
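
    The reason MIME filters still work when extensions are unusual is that the decision is made from the Content-Type header the server sends back, not from the URL. A simplified sketch of that idea; the allowed list and header values here are examples:

```python
# MIME types to treat as crawlable HTML pages (example list, adjust to taste).
HTML_MIME_TYPES = {"text/html", "application/xhtml+xml"}

def is_crawlable(content_type_header):
    """Decide from a Content-Type header whether a URL is an HTML page.

    Works regardless of the file extension in the URL itself.
    """
    # Strip parameters such as "; charset=utf-8" and normalize case.
    mime = content_type_header.split(";")[0].strip().lower()
    return mime in HTML_MIME_TYPES

print(is_crawlable("text/html; charset=utf-8"))  # True
print(is_crawlable("application/pdf"))           # False
```

    A page served as /download.xyz with Content-Type: text/html would therefore still be crawled when extension filters are cleared and MIME filters are kept.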

  • Do you have directories with response code 0 : VirtualItem in scan results? Check the information about internal website linking.

  • Are there many URLs with errors in the website scan results? If the webserver is causing some URLs to return error response codes, e.g. because of server bandwidth throttling, you can try resuming the scan until all errors are gone. This will most likely lead to more links and pages being found.

    Another approach to solving URLs with error responses is to experiment with the options found in Scan website | Crawler engine | Advanced engine settings. Some common settings that often help: increasing timeout values, using GET requests only, and enabling/disabling GZip/deflate support.

    [screenshot: advanced crawler engine settings for unstable servers]
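
    Resuming a scan until the errors disappear is essentially retrying failed URLs after a delay. The idea can be sketched like this; flaky_fetch is a stand-in simulating a throttling server, not part of A1 Website Analyzer:

```python
import time

def fetch_with_retries(fetch, url, attempts=3, base_delay=1.0):
    """Retry a flaky download, waiting longer after each failure."""
    for attempt in range(attempts):
        try:
            return fetch(url)
        except IOError:
            if attempt == attempts - 1:
                raise  # out of attempts: report the error upward
            time.sleep(base_delay * 2 ** attempt)  # e.g. 1s, 2s, 4s, ...

# Simulated throttling server: fails the first two requests, then succeeds.
calls = {"n": 0}
def flaky_fetch(url):
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("throttled")
    return "<html>ok</html>"

print(fetch_with_retries(flaky_fetch, "http://example.com/", base_delay=0.01))
```

    The same principle explains why resuming the scan helps: URLs that failed because of temporary throttling often succeed on a later attempt.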

Help page primarily maintained and written by

As one of the lead developers at Microsys, his hands have touched almost all the code in the software available on this website. If you email any questions, chances are he will be the one answering them.
About A1 Website Analyzer

SEO website crawler tool that can find broken links, analyze internal link juice flow, show duplicate titles, perform custom code/text search and much more.
 © Copyright 1997-2014 Microsys
Usage of this website constitutes acceptance of our legal, privacy and cookies information.