Get a List of All the URLs from a Website

Find and create a list of all the URLs of a particular website.

You might need to do this if you’re moving to a new permalink structure and need to 301 redirect the old pages. For large sites, you can save a lot of time by making good use of free online sitemap generators and Excel.

Many of the online sitemap generator tools either ask for your email or cap the number of pages they index. My favorite is www.xsitemap.com: it works well, doesn’t ask for an email, and, as far as I can tell, has no limit on the number of pages. I just ran it on a site with 2,500 pages, and it took about half an hour.
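If you’d rather not depend on an online tool at all, a short script can build the same list by crawling the site itself. Below is a minimal sketch in Python, assuming the requests and beautifulsoup4 packages are installed; the starting URL is a placeholder you’d swap for your own domain.

    # Minimal same-domain crawler: collects every internal URL it can reach.
    # Assumes requests and beautifulsoup4 are installed
    # (pip install requests beautifulsoup4).
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START = "http://www.example.com/"  # placeholder: your site's root URL
    DOMAIN = urlparse(START).netloc

    seen = set()
    queue = [START]

    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip pages that time out or error
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # only parse HTML pages
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]  # resolve relative links, drop fragments
            if urlparse(link).netloc == DOMAIN:  # stay on the same site
                queue.append(link)

    for url in sorted(seen):
        print(url)

Either way, you end up with one URL per line, ready to paste into Excel.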


After the generator finishes, you have two options for grabbing the data: copy and paste from the on-screen readout, or download the file provided.

Once you’ve pasted the data into Excel, use the find/replace function to eliminate the data you don’t need. For 301 redirects, strip everything before the path, including the protocol and root domain, so that http://www.you.com/old/old.htm becomes /old/old.htm.
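If you’d rather script that cleanup than click through Excel’s find/replace, Python’s standard library can do the same stripping. A sketch, assuming a hypothetical urls.txt file holding one full URL per line:

    # Strip the scheme and root domain, keeping only the path of each URL.
    # Assumes a hypothetical urls.txt with one full URL per line.
    from urllib.parse import urlparse

    with open("urls.txt") as f:
        for line in f:
            url = line.strip()
            if url:
                # e.g. http://www.you.com/old/old.htm -> /old/old.htm
                print(urlparse(url).path)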

A 301 redirect rule (in an Apache .htaccess file, for example) looks like this:

Redirect 301 /old/old.htm http://www.you.com/new.htm
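If you’ve lined up the old paths and new URLs in two columns, you can also generate all the redirect lines in bulk rather than typing them by hand. A sketch, assuming a hypothetical redirects.csv exported from Excel with one old_path,new_url pair per row:

    # Emit one Apache "Redirect 301" line per old-path/new-URL pair.
    # Assumes a hypothetical redirects.csv with rows like:
    #   /old/old.htm,http://www.you.com/new.htm
    import csv

    with open("redirects.csv", newline="") as f:
        for row in csv.reader(f):
            if len(row) == 2:  # skip blank or malformed rows
                old_path, new_url = row
                print(f"Redirect 301 {old_path} {new_url}")

Paste the output into your .htaccess file and you’re done.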

17 Replies to “Get a List of All the URLs from a Website”

  1. Marker says:

    Thank you for the solid writeup. If truth be told, I may not use this for all white-hat means. (=

  2. Simon G says:

    Hello, neat post. A good, solid webmaster strategy.

  3. David says:

    Great post! Thanks for the sitemap site. Saved me hours!

  4. Peter Donalds says:

    This is a really interesting strategy, although I’m still trying to get my head around it to understand it better.

  5. Raheel Farooq says:

    Thank you, Ryan. I needed the list of my website URLs for SEO purposes. Your trick worked.
    Have a nice day!

  6. Desktop Backgrounds says:

    Thank you

  7. Kim says:

    Thanks for the resource. This is the best one I’ve ever used. I love the fact that it will do more than 500 URLs.

  8. baadshah says:

    I’m looking for an advanced version of this feature: a filter to show only pages with the extension I need, and a way to view links only up to a certain level, so I can give criteria to exclude some of the sub-links.

  9. Ajit says:

    I have a website which is password protected. When I enter the URL as http://Loginname:Password@URL, it gives only 23 pages, when in reality there are more than 3,000 pages.

    What is the solution for this problem?

    1. Ryan Howard says:

      Screaming Frog has a password access feature that might work for you. Check their FAQ and do a find-in-page search for “Password” to jump to the correct section: Screaming Frog FAQ

  10. Bala says:

    Thanks for this post!! You have saved me time.

  11. Umar Sofiyaan says:

    What about a local server website, one which is hosted from a computer on a local network?

    1. Ryan Howard says:

      Screaming Frog will do the trick.

  12. Yash Rastogi says:

    Saved me ample hours

  13. Albert says:

    Thank you for this awesome writeup!

    1. Kyle Sanders says:

      You’re welcome!

  14. Muhammad says:

    What about subdomains? How do I probe them?

Comments are closed.