
List:       full-disclosure
Subject:    [FD] The continuing problem of third party resources in web applications.
From:       x ksi <s3810@pjwstk.edu.pl>
Date:       2016-08-20 6:11:40
Message-ID: CAN10O-YHxrz_mogR9dtw=nyvZZWxrAXwx8e+OB8vjXwmat9mYg@mail.gmail.com

$ cat ./3rdparty.txt


Release   (08.06.2015): Writeup with PHP PoC released
Update #1 (04.02.2016): JavaScript PoC created
Update #2 (16.06.2016): W3C SRI information added
Update #3 (01.08.2016): Added a reference about the AdGholas campaign
Update #4 (20.08.2016): Added a reference about the D. Trump site


--


[ Subresource Integrity ]


I'm slightly disappointed in my poor web crawling skills. Apparently,
work on Subresource Integrity[1] started in January 2014, yet I was
unable to stumble upon the project when I was researching the subject
back in mid-2015. However, it was not until June 2015 for Chrome (45.0)
and September 2015 for Firefox (43) that browser versions with SRI
support were released. For that reason I feel excused to some extent.


Currently only Chrome, Firefox and Opera support this mechanism on
desktop computers. On mobile platforms it looks like it's supported only
by Chrome and Firefox. Microsoft is considering implementing it in
Internet Explorer but doesn't provide any timeline for that[2].


SRI can be applied to <script> and <link> tags, which seems obvious and
reasonable. It could be used for other tags as well (e.g. <img>,
<media>, <object> etc.) but unfortunately this idea has been abandoned
by the Chrome devs[3] (i.e. the spec doesn't cover other tags). I
presume the Firefox devs' decision was based on the same grounds. It's a
pity the spec hasn't been extended in that regard instead.
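
For illustration, an SRI-protected include looks roughly like this (the
digest is a placeholder, not the hash of any real file; one can be
generated with e.g. "openssl dgst -sha384 -binary script.js | openssl
base64 -A"):

    <!-- The browser fetches the resource, hashes it and refuses to
         execute it if the digest doesn't match the integrity value. -->
    <script src="https://cdn.example.com/script.js"
            integrity="sha384-...base64-digest-of-the-file..."
            crossorigin="anonymous"></script>

Note that crossorigin="anonymous" is required for cross-origin
resources, otherwise the browser cannot perform the check.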


From the above it is clear that there is still room for improvement
(when isn't there?) but honestly, it looks like it's currently the best
solution to the problem of fetching remote resources and verifying their
integrity.


Thanks to Edwin Kwan, who brought[4] SRI to my attention.



References:
[1] https://www.w3.org/TR/SRI/
[2] https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity
[3] https://codereview.chromium.org/1151773007
[4] https://www.linkedin.com/pulse/trust-your-cdn-verify-sri-edwin-kwan?trk=hp-feed-article-title-like




[ Third Party JS Script Verification ]


A few years ago there was a fuss in the Polish part of the net. It was
all about a security breach that happened to a controversial security
news blog[1][2][3]. A story like many others, but the attack vector was
quite interesting: get into the third party's server and modify the JS
script used by the targeted blog. Shortly after, some discussions took
place about possible preventive measures against rogue third party
scripts and/or malicious CDNs. Unfortunately, I couldn't attend the
OWASP meeting[4] during which the potential mitigations and solutions to
the problem were meant to be discussed. It's also a pity the meeting was
not recorded. I'm not aware of any new technical solution which
addresses the problem and is a direct outcome of this OWASP
brainstorming (feel free to correct me if you think I'm wrong). On the
other hand, that doesn't mean there is no such solution already out
there, so please keep on reading. I'm pretty sure that our domestic case
was neither the first nor the last one that happened this
way[5][6][7][15][16]. Therefore, I thought there is a need to talk more
about this problem and to help address it.


Over the past years I have seen a few penetration test reports written
by various companies. It is no secret that pentesters use the Burp proxy
extensively when conducting web app assessments. Its scanner module
picks up JS scripts that are included by applications from remote
locations/other domains[8]. Well, at least in the simplest scenarios: it
seems to have difficulties properly detecting cases where a script is
included by creating a new script element with the createElement()
function and then setting its src attribute to a URL in a different
domain, as in the sketch below.
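
For reference, a dynamic include of that kind looks roughly like this
(the domain and filename are made up):

    // No static <script src=...> tag appears in the page source, so
    // tools that only parse the served markup can easily overlook it.
    var s = document.createElement('script');
    s.src = 'https://cdn.example.com/widget.js'; // hypothetical remote script
    document.head.appendChild(s);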

That said, it comes as no surprise that this finding is quite commonly
reported. Exactly, "Another one got caught today". Most sane people
would say that it's enough to copy the contents of the remote script and
serve it locally, which concurs with Burp's recommendation too. A very
simple and elegant solution. However, one can imagine scenarios where
this approach is not applicable (e.g. the script content is not static).
Other recommendations include those from OWASP[9][10], which IMO are not
promoted adequately to the people that are interested in the subject or
affected by the problem. It would be cool to see this problem described
in the OWASP testing guide or somewhere in their wiki.

Client side solutions like NoScript[13] (which allows enabling JS
scripts selectively) are not an answer, for obvious reasons. What we
need is a simple server-side solution that is platform independent and
can be used even for already deployed applications.


If you follow the references in the text as you read, you may notice
that some of them are really old, eight years and older. One might say
that the subject has already been profoundly studied and the problem was
fixed a long time ago. I'm sorry to say it, but that's only wishful
thinking.


Probably the most promising solution meant to solve the problem was the
js.js API[11][12]. However, it never managed to become popular enough
and wasn't used on a larger scale. This could be due to its
overcomplexity, but that's only my suspicion. What we know for a fact is
that the project is now effectively dead. What do I mean exactly by the
above? It's enough to say that:


 - the last commit was submitted 4 years ago[14]
 - web developers can be unfamiliar with compiler optimizations, LLVM,
   JavaScript engines etc.
 - new JavaScript engine versions required recompilation of the project
 - its deployment and maintenance seemed to be overkill if it was only
   going to be used for small scripts


Are you still here? Don't worry, it's almost over.


If, for whatever reason, a third party script can't be copied and served
locally, my suggestion would be to keep hash checksums of the third
party scripts locally and, before including a script for execution,
simply verify the checksum of the downloaded copy. If the checksum
differs from the known good one, then you may have a problem. It could
be a legitimate change on the side of the third party that you weren't
aware of (e.g. a new ad campaign, new features, bugfixes etc.), but it
could also be a malicious change (watering hole, drive-by download
etc.).

I wrote a proof of concept for such checksum verification of third party
scripts. It is available at:


http://s1m0n.dft-labs.eu/files/3rdparty/


The very concept can be implemented relatively quickly and easily in

various languages/frameworks.
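
As a taste of how little is needed, here is a minimal Node.js sketch of
the idea (it is not the PoC linked above; the URL and the known-good
digest are placeholders):

    // Fetch a third party script and verify its SHA-256 digest against
    // a locally stored known-good value before including/serving it.
    const https = require('https');
    const crypto = require('crypto');

    const SCRIPT_URL = 'https://cdn.example.com/widget.js'; // placeholder
    const KNOWN_GOOD_SHA256 =    // placeholder (digest of an empty file)
      'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855';

    https.get(SCRIPT_URL, (res) => {
      const chunks = [];
      res.on('data', (c) => chunks.push(c));
      res.on('end', () => {
        const digest = crypto.createHash('sha256')
                             .update(Buffer.concat(chunks)).digest('hex');
        if (digest === KNOWN_GOOD_SHA256) {
          console.log('checksum OK, safe to include the script');
        } else {
          // Either a legit upstream change or tampering; check first.
          console.error('checksum MISMATCH: ' + digest);
        }
      });
    }).on('error', (e) => console.error('fetch failed: ' + e.message));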


In the first version of this writeup I wrote:


"Most people probably would like to customize it anyway thus I don't see a

point to attempt and implement some generic PoC implementation".


After some time I must disagree with myself. Surely I wrote that because
I'm lazy slash short on time. A JavaScript PoC implementation should
provide platform independence, and I suspect it will be more appealing
than PHP. This is why you can now also check the JS version of the PoC
at the same location.


Yes, I'm perfectly aware this is nothing groundbreaking and it's far
from a perfect solution. Needless to say, it's rather a sad compromise
between doing nothing and a potentially good but dead solution.


I'm under no illusion that developers will start implementing this or
any other solution anytime soon. Probably some epic fail caused by this
attack vector would break the status quo, but why sit and wait for it? I
would presume it's more beneficial to be proactive rather than reactive.
Time will tell.



References:
[1] http://zaufanatrzeciastrona.pl/post/niebezpiecznik-hacked-czyli-przepowiednie-sie-spelniaja/ (PL)
[2] http://blog.pi3.com.pl/?p=411 (PL)
[3] http://wampir.mroczna-zaloga.org/archives/1183-sie-porobilo.html (PL)
[4] https://www.owasp.org/index.php/Poland#tab=Past_Events (PL)
[5] http://blog.sucuri.net/2014/11/the-dangers-of-hosted-scripts-hacked-jquery-timers.html
[6] http://www.clickz.com/clickz/news/1705296/doubleclick-admits-servers-were-hacked
[7] http://www.theregister.co.uk/2008/11/22/google_analytics_as_security_risk/
[8] http://portswigger.net/burp/help/scanner_issuetypes.html (Type ID 5244160)
[9] http://owasp.blogspot.co.uk/2011/06/question-on-3rd-party-js.html
[10] https://www.owasp.org/images/6/6d/OWASP-WASCAppSec2007SanJose_Dangers_of3rdPartyContent.ppt
[11] https://www.usenix.org/system/files/conference/webapps12/webapps12-final6.pdf
[12] https://github.com/jterrace/js.js/
[13] https://noscript.net/
[14] https://github.com/jterrace/js.js/commit/87cdca631515d97fa155f48fc76ca02732b19075
[15] https://www.proofpoint.com/threat-insight/post/massive-adgholas-malvertising-campaigns-use-steganography-and-file-whitelisting-to-hide-in-plain-sight
[16] https://blog.chibicode.com/you-can-submit-a-pull-request-to-inject-arbitrary-js-code-into-donald-trumps-site-here-s-how-782aa6a17a56#.3kfo3dy2p



Filip Palian

_______________________________________________
Sent through the Full Disclosure mailing list
https://nmap.org/mailman/listinfo/fulldisclosure
Web Archives & RSS: http://seclists.org/fulldisclosure/


