List:       nmap-dev
Subject:    [NSE] http-stored-xss.nse
From:       George Chatzisofroniou <sophron () latthi ! com>
Date:       2013-06-29 19:15:15
Message-ID: 20130629191515.GF2704 () gmail ! com

The attached script will identify stored XSS vulnerabilities. Stored (or
persistent) XSS occurs when data provided by the attacker is saved by the
server and then permanently displayed on pages returned to other users in the
course of regular browsing, without proper HTML escaping.

To identify these vulnerabilities, the script will POST specially crafted
strings to every form it encounters and then search through the website
for those strings to check whether the payloads were successful.

To make this work, I had to make another change to the httpspider library:
turning off HTTP caching while crawling. I don't know why there wasn't a
parameter for this before. So, you also need to apply the attached patch for
the script to work properly.

The script will, by default, crawl the target website twice: first to find
any forms and POST the malicious strings, and a second time to search for the
strings it previously POSTed. You can save some time by passing the specific
pages you are interested in via the formpaths and uploadspaths arguments.
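
For instance (the paths here are hypothetical, shown only to illustrate the
argument syntax):

```shell
# Restrict both phases to a single known page: phase 1 only POSTs to
# forms found on /guestbook.php, and phase 2 only searches that same
# page for the payloads, so no crawling is needed.
nmap -p80 --script http-stored-xss.nse \
  --script-args 'http-stored-xss.formpaths={/guestbook.php},http-stored-xss.uploadspaths={/guestbook.php}' \
  <target>
```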

There is also an option called fieldvalues to bypass a form's restrictions by
manually setting its fields.
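
For instance (field names taken from the example in the script's own
documentation):

```shell
# Pin restricted fields to valid values; the XSS vectors then go into
# the remaining text/textarea/radio/checkbox fields of the form.
nmap -p80 --script http-stored-xss.nse \
  --script-args 'http-stored-xss.fieldvalues={gender="male",email="foo@bar.com"}' \
  <target>
```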

Finally, there is an option to use your own XSS vectors: simply write them in
a file, one per line, and pass its path to the dbfile argument.
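
A minimal sketch (the file path and vectors below are made up for
illustration; dbfile is the argument name the script reads):

```shell
# Hypothetical vector file: one XSS vector per line.
printf '%s\n' 'ghz>hzx' '<xss>zxc' > /tmp/xss-vectors.txt

# Then point the script at it:
#   nmap -p80 --script http-stored-xss.nse \
#     --script-args 'http-stored-xss.dbfile=/tmp/xss-vectors.txt' <target>
```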

The output looks like this:

 PORT   STATE SERVICE REASON
 80/tcp open  http    syn-ack
 | http-stored-xss: 
 |   Found the following stored XSS vulnerabilities: 
 |   
 |      Payload: ghz>hzx
 |    Uploaded on: /guestbook.php
 |    Description: Unfiltered '>' (greater than sign). An indication of potential XSS vulnerability.
 |      Payload: zxc'xcv
 |    Uploaded on: /guestbook.php
 |    Description: Unfiltered ' (apostrophe). An indication of potential XSS vulnerability.
 |   
 |      Payload: ghz>hzx
 |    Uploaded on: /posts.php
 |    Description: Unfiltered '>' (greater than sign). An indication of potential XSS vulnerability.
 |      Payload: hzx"zxc
 |    Uploaded on: /posts.php
 |_   Description: Unfiltered " (double quotation mark). An indication of potential XSS vulnerability.


While this is our third XSS script, there is still room for improvement in
the XSS area of NSE. For example, we still lack a DOM-based XSS script.
DOM-based XSS is the third category of XSS, after reflected XSS (which is
covered by unsafe-output-escaping.nse) and stored XSS (which is covered by this
script). Also, it would be great if Nmap held a database of XSS vectors (we
could borrow XSSer's [1]), so it could produce more descriptive output about
the possible XSS attacks that could be executed against the target. I'm
writing these down so we can keep them in mind.

[1]: https://n-1.cc/pages/view/16105/

-- 
George Chatzisofroniou
http://sophron.latthi.com

["added_caching_option_httpspider.diff" (text/x-diff)]

Index: nselib/httpspider.lua
===================================================================
--- nselib/httpspider.lua	(revision 31156)
+++ nselib/httpspider.lua	(working copy)
@@ -692,14 +692,14 @@
         end
         if is_web_file then
           stdnse.print_debug(2, "%s: Using GET: %s", LIBRARY_NAME, file)
-          response = http.get(url:getHost(), url:getPort(), url:getFile(), { timeout = self.options.timeout, redirect_ok = self.options.redirect_ok } )
+          response = http.get(url:getHost(), url:getPort(), url:getFile(), { timeout = self.options.timeout, redirect_ok = self.options.redirect_ok, no_cache = self.options.no_cache } )
         else
           stdnse.print_debug(2, "%s: Using HEAD: %s", LIBRARY_NAME, file)
           response = http.head(url:getHost(), url:getPort(), url:getFile())
         end
       else
         -- fetch the url, and then push it to the processed table
-        response = http.get(url:getHost(), url:getPort(), url:getFile(), { timeout = self.options.timeout, redirect_ok = self.options.redirect_ok } )
+        response = http.get(url:getHost(), url:getPort(), url:getFile(), { timeout = self.options.timeout, redirect_ok = self.options.redirect_ok, no_cache = self.options.no_cache } )
       end

       self.processed[tostring(url)] = true


["http-stored-xss.nse" (text/plain)]

description = [[
This script will POST specially crafted strings to every form it
encounters and then search through the website for those strings
to check whether the payloads were successful.
]]

---
-- @usage nmap -p80 --script http-stored-xss.nse <target>
-- 
-- This script works in two phases.
-- 1) Posts specially crafted strings to every form it encounters.
-- 2) Crawls through the page searching for these strings.
--
-- If any string is reflected on some page without any proper 
-- HTML escaping, it's a sign for potential XSS vulnerability.
--
-- @args http-stored-xss.formpaths The pages that contain
--       the forms to exploit. For example, {/upload.php, /login.php}.
--       Default: nil (crawler mode on)
-- @args http-stored-xss.uploadspaths The pages that reflect
--       back POSTed data. For example, {/comments.php, /guestbook.php}.
--       Default: nil (crawler mode on)
-- @args http-stored-xss.fieldvalues The script will try to
--       fill every field found in the form, but that may fail due to
--       the fields' restrictions. You can manually fill those fields using
--       this table. For example, {gender = "male", email = "foo@bar.com"}.
--       Default: {}
-- @args http-stored-xss.dbfile The path of a plain text file
--       that contains one XSS vector per line. Default: nil
--      
-- @output
-- PORT   STATE SERVICE REASON
-- 80/tcp open  http    syn-ack
-- | http-stored-xss: 
-- |   Found the following stored XSS vulnerabilities: 
-- |   
-- |      Payload: ghz>hzx
-- |    Uploaded on: /guestbook.php
-- |    Description: Unfiltered '>' (greater than sign). An indication of potential XSS vulnerability.
-- |      Payload: zxc'xcv
-- |    Uploaded on: /guestbook.php
-- |    Description: Unfiltered ' (apostrophe). An indication of potential XSS vulnerability.
-- |   
-- |      Payload: ghz>hzx
-- |    Uploaded on: /posts.php
-- |    Description: Unfiltered '>' (greater than sign). An indication of potential XSS vulnerability.
-- |      Payload: hzx"zxc
-- |    Uploaded on: /posts.php
-- |_   Description: Unfiltered " (double quotation mark). An indication of potential XSS vulnerability.
--
-- 
--
---

categories = {"intrusive", "exploit", "vuln"}
author = "George Chatzisofroniou"
license = "Same as Nmap--See http://nmap.org/book/man-legal.html"

local http = require "http"
local string = require "string"
local httpspider = require "httpspider"
local shortport = require "shortport"
local stdnse = require "stdnse"
local table = require "table"

portrule = shortport.port_or_service( {80, 443}, {"http", "https"}, "tcp", "open")


-- A list of payloads.
--
-- You can manually add / remove your own payloads, but make sure you
-- don't mess up, otherwise the script may report false positives.
--
-- Note that more payloads will slow down your scan.
payloads = {

    -- Basic vectors. Each one is an indication of potential XSS vulnerability.
    { vector = 'ghz>hzx', description = "Unfiltered '>' (greater than sign). An indication of potential XSS vulnerability." },
    { vector = 'hzx"zxc', description = "Unfiltered \" (double quotation mark). An indication of potential XSS vulnerability." },
    { vector = 'zxc\'xcv', description = "Unfiltered ' (apostrophe). An indication of potential XSS vulnerability." },
}
 

-- Create customized requests for all of our payloads.
local makeRequests = function(host, port, submission, fields, fieldvalues)

    local postdata = {}
    for _, p in ipairs(payloads) do
        for __, field in ipairs(fields) do
            if field["type"] == "text" or field["type"] == "textarea" or field["type"] == "radio" or field["type"] == "checkbox" then
                local value
                if fieldvalues[field["name"]] ~= nil then
                    value = fieldvalues[field["name"]]
                else
                    value = p.vector
                end

                postdata[field["name"]] = value
            end
        end

        stdnse.print_debug(2, "Making a POST request to " .. submission .. ": ")
        for i, content in pairs(postdata) do
            stdnse.print_debug(2, i .. ": " .. content)
        end
        http.post(host, port, submission, { no_cache = true }, nil, postdata)
    end

end

local checkPayload = function(body, p)

    -- Use a plain-text find so that magic characters in a payload
    -- (e.g. '-' or '%') are not interpreted as a Lua pattern.
    if string.find(body, p, 1, true) then
        return true
    end

end

-- Check if the payloads were successful by checking the content of pages in
-- the uploadspaths array.
local checkRequests = function(body, target)

    local output = {}
    for _, p in ipairs(payloads) do
        if checkPayload(body, p.vector) then
            local report = " Payload: " .. p.vector .. "\n\t Uploaded on: " .. target
            if p.description then
                report = report .. "\n\t Description: " .. p.description
            end
            table.insert(output, report)
        end
    end
    return output
end

local readFromFile = function(filename)
    for l in io.lines(filename) do
        table.insert(payloads, { vector = l })
    end
end

action = function(host, port)

    local formpaths = stdnse.get_script_args("http-stored-xss.formpaths")
    local uploadspaths = stdnse.get_script_args("http-stored-xss.uploadspaths")
    local fieldvalues = stdnse.get_script_args("http-stored-xss.fieldvalues") or {}
    local dbfile = stdnse.get_script_args("http-stored-xss.dbfile") 

    if dbfile then
        readFromFile(dbfile)
    end

    local returntable = {}
    local result

    local crawler = httpspider.Crawler:new( host, port, '/', { scriptname = SCRIPT_NAME, no_cache = true } )

    if (not(crawler)) then
        return
    end

    crawler:set_timeout(10000)
 
    local index, k, target, response

    -- Phase 1. Crawls through the website and POSTs malicious payloads.
    while (true) do

        if formpaths then

            k, target = next(formpaths, index)
            if (k == nil) then
                break
            end 
            response = http.get(host, port, target, { no_cache = true }) 
            target = host.name .. target
        else
                        
            local status, r = crawler:crawl()
            -- if the crawler fails it can be due to a number of different reasons
            -- most of them are "legitimate" and should not be reason to abort
            if ( not(status) ) then
                if ( r.err ) then
                    return stdnse.format_output(true, ("ERROR: %s"):format(r.reason))
                else
                    break
                end
            end

            target = tostring(r.url)
            response = r.response
       
        end

        if response.body then 
        
            local forms = http.grab_forms(response.body)

            for i, form in ipairs(forms) do 
                
                form = http.parse_form(form)

                if form then
                
                    local action_absolute = string.find(form["action"], "https*://")
              
                    -- Determine the path where the form needs to be submitted.
                    if action_absolute then
                        submission = form["action"]
                    else    
                        local path_cropped = string.match(target, "(.*/).*")
                        path_cropped = path_cropped and path_cropped or ""
                        submission = path_cropped..form["action"]
                    end

                    makeRequests(host, port, submission, form["fields"], fieldvalues)

                end 
            end
        end
        if (index) then
            index = index + 1
        else 
            index = 1
        end

    end
        
    local crawler = httpspider.Crawler:new( host, port, '/', { scriptname = SCRIPT_NAME } )
    local index

    -- Phase 2. Crawls through the website and searches for the specially
    -- crafted strings that were POSTed before.
    while true do
        if uploadspaths then
            k, target = next(uploadspaths, index)
            if (k == nil) then
                break
            end
            response = http.get(host, port, target)
        else
            
            local status, r = crawler:crawl()
            -- if the crawler fails it can be due to a number of different reasons
            -- most of them are "legitimate" and should not be reason to abort
            if ( not(status) ) then
                if ( r.err ) then
                    return stdnse.format_output(true, ("ERROR: %s"):format(r.reason))
                else
                    break
                end
            end

            target = tostring(r.url)
            response = r.response
       
        end

        if response.body then 

            result = checkRequests(response.body, target)
                    
            if next(result) then
                table.insert(returntable, result)
            end
        end 
        if (index) then
            index = index + 1
        else 
            index = 1
        end
    end

    if next(returntable) then 
        table.insert(returntable, 1, "Found the following stored XSS vulnerabilities: ")
        return returntable
    else
        return "Couldn't find any stored XSS vulnerabilities."
    end
end


_______________________________________________
Sent through the dev mailing list
http://nmap.org/mailman/listinfo/dev
Archived at http://seclists.org/nmap-dev/
