Archive.org allows you to check the history of sites and pages, but a service most are not aware of is one that allows you to get a list of every page it has archived for a given domain. This is great for enumerating a web application; many times you’ll find parts of web apps that have long been forgotten (and are usually vulnerable).

This module doesn’t make any requests to the targeted domain; it simply outputs to the screen (or a file) a list of all the pages Archive.org has found for it.
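If you want to experiment with the same idea outside Metasploit, Archive.org exposes a public CDX API that returns every archived URL for a domain. Below is a minimal Python 3 sketch; the example domain is a placeholder:

```python
#!/usr/bin/env python
# Sketch: build a query against Archive.org's public CDX API that
# lists every URL the Wayback Machine has captured for a domain.
# The example domain is a placeholder.
import urllib.parse

CDX = "http://web.archive.org/cdx/search/cdx"

def cdx_url(domain):
    # fl=original returns only the archived URL itself;
    # collapse=urlkey removes duplicate captures of the same page.
    params = urllib.parse.urlencode({
        "url": domain + "/*",
        "output": "text",
        "fl": "original",
        "collapse": "urlkey",
    })
    return CDX + "?" + params

# To actually fetch the list (network access required):
#   import urllib.request
#   print(urllib.request.urlopen(cdx_url("example.com")).read().decode())
```

The result is one archived URL per line, which is the same kind of list the module produces.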

msf auxiliary(enum_wayback) > info
       Name: Pull stored URLs for a domain
    Version: 10394
    License: Metasploit Framework License (BSD)
       Rank: Normal

Provided by:
  Rob Fuller 

Basic options:
  Name     Current Setting  Required  Description
  ----     ---------------  --------  -----------
  DOMAIN                    yes       Domain to request URLS for
  OUTFILE                   no        Where to output the list for use

  This module pulls and parses the URLs stored by Archive.org for the 
  purpose of replaying during a web assessment. Finding unlinked and 
  old pages.

msf auxiliary(enum_wayback) > run

[*] Pulling urls from Archive.org
[*] Located 289 addresses for
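To write the list to a file instead of the screen, set OUTFILE before running; a session might look like this (the domain and path are placeholders):

msf auxiliary(enum_wayback) > set DOMAIN example.com
DOMAIN => example.com
msf auxiliary(enum_wayback) > set OUTFILE /tmp/waybacklist.txt
OUTFILE => /tmp/waybacklist.txt
msf auxiliary(enum_wayback) > run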


You can set the OUTFILE so that you can parse the list a bit and import it into Burp, or use a quick script to make the queries yourself. Here is one I wrote in Python:

# cat

#!/usr/bin/env python
import urllib

# Proxy to push each request through (e.g. Burp's listener);
# adjust the address to your own setup.
proxies = {'http': 'http://127.0.0.1:8080'}
filename = "/tmp/waybacklist.txt"

fl = open(filename, 'r')
for line in fl:
	url = line.strip()
	if len(url) < 4:
		print "Skipping blank line"
		continue
	print "Requesting " + url
	temp = urllib.urlopen(url, proxies=proxies).read()
fl.close()