Gathering Information Before Hacking a Website
Spidering the website:
Spidering the target can reveal a lot of important information.
Screenshot of Burp Suite:
Burp Suite spidered some important links that we need for later attacks (directories, the login page, password-reset pages, robots.txt, etc.).
Configuring Burp Suite to spider the website:
1. Open Burp Suite.
2. Configure your browser to use Burp Suite as its proxy. Firefox: Preferences >> Advanced >> Network >> Settings >> Manual proxy configuration, then enter host: localhost and port: 8080.
3. Now browse your target website. You will see the target address in Burp Suite's Target tab.
4. Right-click on your target host in Burp, then click "Spider this host".
Burp will now spider the website. Note: spend more time experimenting with Burp Suite.
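The core of what a spider does — fetching pages and extracting the links on them — can be sketched with Python's standard-library html.parser. The HTML string below is a made-up stand-in for a fetched page; a real spider would download each page over HTTP and recurse into every link it finds.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects every href found in <a> tags: the core of spidering."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in page; a real spider would fetch this over HTTP and recurse.
page = '<a href="/login.php">Login</a> <a href="/forgot.php">Forgot password?</a>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/login.php', '/forgot.php']
```

Burp's spider does the same thing at scale, while also recording forms, parameters, and robots.txt entries along the way.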
Now we know how to configure the browser for Burp Suite and spider the target host, so let's continue gathering information.
wget -r www.target.com
This will download the full website. Now browse all the pages, read the source code, comments, etc., and see if you can find any useful information.
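Once the mirror is on disk, checking every page's source for comments can be scripted instead of done by eye. A minimal sketch using Python's re module; the page source below is a made-up stand-in, and in practice you would loop over the files wget saved:

```python
import re

# Stand-in for a page saved by `wget -r`; real input comes from the mirror directory.
source = """<html><!-- TODO: remove test account admin/test123 -->
<body><!-- old login at /old/login.php --></body></html>"""

# Non-greedy match so each comment is captured separately.
comments = [c.strip() for c in re.findall(r"<!--(.*?)-->", source, re.DOTALL)]
for c in comments:
    print(c)
```

Developers often leave credentials, old paths, and debugging notes in exactly these comments.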
Information Gathering with Google:
If we search on Google with the 'site' operator, we get many results:
Click on the links and examine the results.
More examples:
You will find many Google dorks at: http://www.exploit-db.com/google-dorks/
Don’t be lazy if you are serious.
There are some tools for automated searching, but I always prefer doing it manually.
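Building the query strings yourself is simple. The sketch below assembles a few common dork patterns for a hypothetical target.com; these particular patterns are illustrative examples, not taken from the exploit-db list above.

```python
# Hypothetical dork patterns combining the 'site' operator with other operators.
target = "target.com"
dorks = [
    f"site:{target}",                      # everything Google indexed for the domain
    f"site:{target} inurl:login",          # likely login pages
    f"site:{target} filetype:pdf",         # indexed documents
    f'site:{target} intitle:"index of"',   # open directory listings
]
for d in dorks:
    print(d)
```

Paste each query into Google by hand; that keeps you within normal searching rather than automated scraping.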
Finding hidden files, content, and default files:
You should browse all pages manually and review the behavior of each page. Here are some points you can follow:
1. Brute-force/dictionary attack for hidden directories. You can use Burp Suite or OWASP DirBuster (I will post tutorials about all these tools later).
2. If you find a link like www.target.com/login.php, there may also be a logout.php; and if www.target.com/adduser.php exists, then www.target.com/deleteuser.php may also exist. So try.
3. Check the comments in the page source for any interesting information.
4. Find the login pages (admin and users).
5. Collect all URLs and save them in a file for later use.
6. Look for default files and content (what about www.target.com/phpinfo.php?).
7. You should also run Nikto against the site. Nikto is a powerful tool for discovering default content.
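Points 1 and 2 above can be sketched in a few lines of Python. The wordlist and the "counterpart page" pairs below are hypothetical examples; a real run would load a DirBuster-style wordlist and actually request each candidate URL to see which ones return 200 instead of 404.

```python
# Point 1: build candidate URLs from a wordlist (tiny inline list for illustration).
base = "http://www.target.com"
wordlist = ["admin", "backup", "login.php", "phpinfo.php", "robots.txt"]
candidates = [f"{base}/{w}" for w in wordlist]

# Point 2: guess counterpart pages from ones already discovered.
pairs = {"login": "logout", "adduser": "deleteuser"}
found = ["login.php", "adduser.php"]
guesses = [page.replace(old, new) for page in found
           for old, new in pairs.items() if old in page]

print(candidates)
print(guesses)  # ['logout.php', 'deleteuser.php']
```

DirBuster and Burp Intruder do the same thing with much larger wordlists plus the actual HTTP requests.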
Finding other information:
What other information?
1. Email addresses (social-engineering attacks).
2. Phone numbers (social engineering).
3. User and employee names (social engineering).
4. Find out the web server version. What version of Apache or IIS are they running? If it is old, you may be lucky and find a known vulnerability against that software on exploit-db or SecurityFocus.
5. What web software are they using? Joomla, MyBB, phpBB, vBulletin, or something else? Do you know the version? If it is old, you can search for vulnerabilities that have already been discovered.
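Pulling emails, phone numbers, and server banners out of the pages you have already saved can be scripted. A rough sketch with Python regular expressions; the sample text and the patterns are illustrative, not production-grade:

```python
import re

# Stand-in text; in practice, run this over the pages mirrored with wget.
text = """Contact us at support@target.com or hr@target.com.
Call +1-555-0100. Powered by Apache/2.2.3."""

# Simple patterns: good enough for a first pass, not RFC-complete.
emails = sorted(set(re.findall(r"[\w.+-]+@[\w.-]+\.\w+", text)))
phones = re.findall(r"\+?\d[\d\s().-]{7,}\d", text)
server = re.search(r"Apache/\d+(?:\.\d+)*|IIS/\d+(?:\.\d+)*", text)

print(emails)           # ['hr@target.com', 'support@target.com']
print(phones)           # ['+1-555-0100']
print(server.group(0))  # Apache/2.2.3
```

In real life the server banner comes from the HTTP Server response header (or a Nikto scan), not the page body; this just shows the extraction step.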
DO NOT GO AHEAD WITHOUT THIS INFORMATION.