Crawlic is a web reconnaissance tool.

Crawlic is a web recon tool: it finds temporary files, parses robots.txt, searches for common folders, runs Google dorks, and lists other domains hosted on the same server.
It automatically clones exposed GIT/SVN repositories (using dvcs-ripper) if a .git or .svn folder is found.
Latest change 21/12/2015 : crawlic.py — fix path errors

Crawlic Helper

Requirements :
+ Python 2.7.x
+ Perl with the LWP module, required by the bundled git/svn ripper (original dvcs-ripper: https://github.com/kost/dvcs-ripper)
+ The pholcidae crawling library

Configuration :
- Change the user agent : edit user_agent.lst, one user agent per line
- Change the folders to search for : edit folders.lst, one directory per line
- Change the files to scan : edit extensions.lst, one file extension per line
- Change the dorks list : edit dorks.lst, one dork per line
- Change the Google dorks list : edit google_dorks, one dork per line, using %s as a placeholder for the target URL
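To illustrate the list-file format described above, here is a minimal sketch of how one-entry-per-line files and the %s placeholder could be consumed. The function names and inline sample data are hypothetical; this is not Crawlic's actual source code.

```python
def load_list(path):
    # One entry per line; blank lines are skipped.
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]


def build_google_dorks(templates, target):
    # Substitute the target URL into each '%s' placeholder,
    # as google_dorks entries are described in the README.
    return [template % target for template in templates]


if __name__ == "__main__":
    # Inline sample data standing in for a real google_dorks file.
    templates = ["site:%s ext:sql", "site:%s intitle:index.of"]
    for dork in build_google_dorks(templates, "example.com"):
        print(dork)
```

With the sample data above, the script prints `site:example.com ext:sql` and `site:example.com intitle:index.of`, one per line.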

Installation:

Source: https://github.com/Ganapati
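Since the installation steps are not spelled out, here is a minimal setup sketch assuming the repository is named `Crawlic` under the linked account and that pholcidae is installable from PyPI (both unverified assumptions):

```shell
# Clone the tool (repository name assumed)
git clone https://github.com/Ganapati/Crawlic.git
cd Crawlic

# Python dependency (under Python 2.7.x)
pip install pholcidae

# Perl LWP module, needed by the bundled dvcs-ripper scripts
cpan LWP
```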