When you set up an advanced scan, you can choose how many pages you want to scan at a time. This option gives you the flexibility to focus on a single page or get a more holistic view of your website by scanning multiple pages.
Note that you can scan webpages only from the domains associated with your website.
You can't run a scan on pages protected by a VPN. To test VPN-protected pages, use our browser extension or developer tools.
On this page:
- Scan a single page
- Scan a list of pages
- Crawl a website
- Exclude pages from a crawl
Scan a single page
To scan a single webpage:
- Go to Websites/apps.
- Select the website you'd like to scan.
- Choose Scans.
- Select Run scan.
- Choose the Advanced tab.
- Under How many webpages would you like to scan? select Single page.
- Enter the URL of the single page you want to scan.
- Fill out the remaining fields.
- Select Run scan.
Scan a list of pages
To scan a specific list of webpages:
- Go to Websites/apps.
- Select the website you'd like to scan.
- Choose Scans.
- Select Run scan.
- Choose the Advanced tab.
- Under How many webpages would you like to scan? select List of pages.
- Add a typed list of URLs or upload a list of URLs from an .xlsx file.
  - To add a typed list of URLs:
    - Select Paste webpages.
    - Enter or paste a list of the webpages you want to scan, with one URL per line.
  - To upload a list of URLs from an .xlsx file (see the sketch after these steps):
    - Select Upload a file.
    - Select Browse and open the file. Make sure there’s one URL per row in the .xlsx sheet. No headings are needed.
- Fill out the remaining fields.
- Select Run scan.
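If you build the upload file with a script, a minimal sketch using Python with the openpyxl library (an assumption; any tool that writes .xlsx files works) could look like the following. The URLs are placeholders:

```python
# Build an .xlsx upload file with one URL per row and no heading row.
from openpyxl import Workbook

urls = [  # placeholder URLs; replace with the pages you want scanned
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/contact",
]

wb = Workbook()
ws = wb.active
for url in urls:
    ws.append([url])  # one URL per row, column A only

wb.save("pages-to-scan.xlsx")
```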
Crawl a website
This option automatically compiles and scans multiple pages from your website based on custom parameters. This is helpful when you don’t know which pages to scan or want a larger sample of pages from across your site. Using the crawl option, you can get a general idea of your website’s accessibility without providing specific pages to scan.
The website crawler scans subpages in the order they are listed on the website, left to right and top to bottom. There is no limit on crawl depth, and you can scan up to 100 pages at a time.
To crawl a website:
- Go to Websites/apps.
- Select the website you'd like to scan.
- Choose Scans.
- Select Run scan.
- Choose the Advanced tab.
- Under How many webpages would you like to scan? select Crawl a website.
- Enter the:
- URL of the webpage where you want to start the crawl. All webpages scanned during the crawl will stem from this page.
- Maximum number of pages you want scanned.
- Crawl depth, to indicate how many sub-levels of your website you want to scan. Crawl depth corresponds to the number of forward slashes in a URL's path after the domain. For example, https://www.levelaccess.com/earesources/improved-efficiency-reduced-tech-debt-why-and-how-platform-provider-socure-adopted-agile-accessibility has a crawl depth of two (see the sketch after these steps).
- If you want to exclude pages from your crawl, choose one or more exclusion methods (see Exclude pages from a crawl below).
- Fill out the remaining fields.
- Select Run scan.
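As a rough illustration of the crawl-depth rule above, here is a short Python sketch, assuming depth equals the number of "/"-separated path segments after the domain:

```python
# Illustrative only: count path segments after the domain to estimate crawl depth.
from urllib.parse import urlparse

def crawl_depth(url: str) -> int:
    path = urlparse(url).path
    # Ignore empty segments produced by leading or trailing slashes.
    return len([segment for segment in path.split("/") if segment])

url = ("https://www.levelaccess.com/earesources/"
       "improved-efficiency-reduced-tech-debt-why-and-how-"
       "platform-provider-socure-adopted-agile-accessibility")
print(crawl_depth(url))  # 2
```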
Exclude pages from a crawl
If you want to exclude specific pages from a website crawl, use one or more of the following options:
- Select the checkbox to Add webpages to exclude, then enter the URLs of any pages you want to exclude from the crawl. This field accepts wildcards, so you don't need to list every unique URL. A wildcard is an asterisk (*) that represents any number of unknown characters in a URL, for example: subdomain.*.domain.com (see the sketch after this list).
- Select the checkbox to Skip URL # endings to avoid scanning similar pages differentiated by number signs (#).
- Select the checkbox to Skip URL ? endings to avoid scanning similar pages differentiated by question marks (?).
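As an illustration of how an asterisk wildcard matches URLs, here is a Python sketch using fnmatch; it mirrors the wildcard behavior described above but is not the product's actual matching engine:

```python
# Illustrative wildcard matching: * stands in for any run of characters.
from fnmatch import fnmatch

pattern = "subdomain.*.domain.com"

print(fnmatch("subdomain.blog.domain.com", pattern))  # True: * matches "blog"
print(fnmatch("subdomain.shop.domain.com", pattern))  # True: * matches "shop"
print(fnmatch("other.blog.domain.com", pattern))      # False: prefix differs
```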