Is there a way to list out all the webpages stored or linked to on a domain of the site? For example listing out all the pages that belong to MakeUseOf?
You can use a sitemap generator to at least get the page list into XML format: http://www.xml-sitemaps.com/crawl.html
Then use Notepad++ to strip out the surrounding XML markup and keep just the URLs.
Hope that helps.
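If you'd rather not clean the XML by hand in Notepad++, a short script can pull just the URLs out of the generated sitemap. A minimal sketch with Python's standard library (the filename sitemap.xml is an assumption; use whatever the generator saved):

```python
# Minimal sketch: extract the <loc> URLs from a generated sitemap.xml,
# as an alternative to stripping the XML markup by hand.
# "sitemap.xml" is a placeholder filename.
import os
import xml.etree.ElementTree as ET

def sitemap_urls(path):
    # Standard sitemap namespace defined by the sitemaps.org protocol.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    tree = ET.parse(path)
    # Each page is a <url> element containing its address in <loc>.
    return [loc.text.strip() for loc in tree.getroot().findall("sm:url/sm:loc", ns)]

if __name__ == "__main__" and os.path.exists("sitemap.xml"):
    for url in sitemap_urls("sitemap.xml"):
        print(url)
```

This leaves you with one plain URL per line, ready to paste anywhere.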
If you want, you can use BackStreet Browser to download the website in question.
You mean this will download all the webpages hosted on the website?
That's even cooler.
Thanks, I'll check it out.
If you're going to try this method, keep the link depth at 3 or less and don't use it on large websites, or it may take ages to download the site. :-)
That can get messy with active blogs, but you can use Google with a search operator: search for "site:example.com" (without the quotes), replacing example.com with the domain in question. This brings up everything Google has indexed for the domain, though you may need to tell it to repeat the search including omitted results. Some pages can show up more than once because of how sites generate archives, categories, tags, etc.
Thank you so much!!
Your method worked! The domain didn't have many webpages, so I looked through all of them. I was trying to find the main one.
In case you are wondering, the site is:
Have you tried any Whois tools?
I tried Who.is, but it only shows the number of links and doesn't list them.
Whois.net requires signup, so I didn't use it.