Friday, August 31, 2007

Query HTTP status codes and headers with PowerShell


I wrote this after posting it in the PowerShell newsgroup. There are three ways to approach the task:


1. Xml Http (com object)

$url = "http://www.cnn.com"
$xHTTP = new-object -com msxml2.xmlhttp;
$xHTTP.open("GET",$url,$false);
$xHTTP.send();
$xHTTP.ResponseText; # returns the HTML document text, similar to WebClient's DownloadString
$xHTTP.status # returns the status code

 

PS C:\Scripts> $xHTTP.getAllResponseHeaders()
Date: Fri, 31 Aug 2007 19:05:39 GMT
Server: Apache
Accept-Ranges: bytes
Cache-Control: max-age=60, private
Expires: Fri, 31 Aug 2007 19:06:31 GMT
Vary: Accept-Encoding,User-Agent
Content-Encoding: gzip
Content-Length: 30296
Content-Type: text/html
Keep-Alive: timeout=5, max=64
Connection: Keep-Alive

PS C:\Scripts> $xHTTP.getResponseHeader("Content-Length")
30296
PS C:\Scripts> $xHTTP.status
200
PS C:\Scripts> $xHTTP.statusText
OK

# $xHTTP.responseText will return the document text.
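The calls above can be rolled into a small reusable function (the function name is my own, not part of any module):

```powershell
# Minimal wrapper around the XmlHttp approach
function Get-HttpStatus {
    param([string]$Url)
    $xHTTP = New-Object -ComObject msxml2.xmlhttp
    $xHTTP.open("GET", $Url, $false)
    $xHTTP.send()
    # Return the numeric status code together with its text
    "{0} {1}" -f $xHTTP.status, $xHTTP.statusText
}

Get-HttpStatus "http://www.cnn.com"
```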

 

 

2. System.Net.HttpWebRequest

$url = "http://www.cnn.com"
$req=[system.Net.HttpWebRequest]::Create($url);
$res = $req.getresponse();
$stat = $res.statuscode;
$res.Close();

 

PS C:\Scripts> $res

IsMutuallyAuthenticated : False
Cookies : {}
Headers : {Vary, X-Pad, Keep-Alive, Connection...}
ContentLength : 135531
ContentEncoding :
ContentType : text/html
CharacterSet : ISO-8859-1
Server : Apache
LastModified : 8/31/2007 7:01:46 PM
StatusCode : OK
StatusDescription : OK
ProtocolVersion : 1.1
ResponseUri : http://www.cnn.com/
Method : GET
IsFromCache : False
PS C:\Scripts> $res.Headers
Vary
X-Pad
Keep-Alive
Connection
Accept-Ranges
Content-Length
Cache-Control
Content-Type
Date
Expires
Server

PS C:\Scripts> $res.GetResponseHeader("Content-Length")
135531

Alternatively, access a header directly as a property on the response object:

PS C:\Scripts> $res.ContentLength
135531

PS C:\Scripts> $res.StatusCode
OK
PS C:\Scripts> $res.StatusDescription
OK
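One caveat with HttpWebRequest: GetResponse() throws a WebException for 4xx/5xx responses, so a plain call never hands you a 404. On PowerShell v2 or later you can trap the exception and still read the status code (a sketch; the URL is just an example):

```powershell
$url = "http://www.cnn.com/no-such-page"
$req = [System.Net.HttpWebRequest]::Create($url)
try {
    $res = $req.GetResponse()
    $stat = [int]$res.StatusCode
    $res.Close()
}
catch [System.Net.WebException] {
    # GetResponse() threw, but the response object is still attached
    $stat = [int]$_.Exception.Response.StatusCode
}
$stat
```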

 

3. System.Net.WebClient

$url="http://www.cnn.com"
$wc = new-object net.webclient
$html = $wc.DownloadString($url)
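WebClient also exposes the server's headers after a download, through its ResponseHeaders collection:

```powershell
$url = "http://www.cnn.com"
$wc = New-Object Net.WebClient
$html = $wc.DownloadString($url)

# ResponseHeaders is populated once a request has completed
$wc.ResponseHeaders["Content-Type"]
$wc.ResponseHeaders["Server"]
```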

 

There is still a lot to investigate in these classes and their methods, and this post is only the tip of the iceberg. Personally, I prefer the System.Net.WebClient DownloadString method. It's quick and short to type, and I don't want to rely on status codes alone when checking responses.
In my scripts I choose to get the document text and query it for a matching string.
I do this because if the target web server redirects you (which happens in most cases) to another page using the Server.Transfer method (e.g., a custom "404 - page not found" page), you
won't get a 404 back; you'll get 200 (OK).
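The string-matching approach described above can be sketched like this (the URL and the marker text are just examples; pick a string you know appears on the healthy page):

```powershell
# Download the page and look for text that should be there when the site is healthy
$html = (New-Object Net.WebClient).DownloadString("http://www.cnn.com")

if ($html -match "Breaking News") {
    "Page looks OK"
} else {
    "Expected content not found"
}
```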

 

Here is another example. Frequently, I need to check whether one of my web servers on the Internet is listed in CBL's (Composite Blocking List) database as a spam source. CBL is one of the most important spam databases. If your server is listed there, it is likely also listed in numerous other spam databases on the web, since most of them simply query CBL.

 

$ip = "<WebServerIP>";
$url = "http://cbl.abuseat.org/lookup.cgi?ip=$ip";
$notListed = "IP Address $ip was not found";
$page = (new-object net.webclient).DownloadString($url);

# CBL's lookup page says the address "was not found" when it is clean
if($page -match $notListed){
    "Not Listed";
} else{
    "Listed";
}
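The same check can be extended to run over several servers in one pass (the addresses below are examples; note that CBL's page says the address "was not found" when it is NOT listed):

```powershell
# Example addresses only - substitute your own server IPs
$ips = "192.0.2.10", "192.0.2.20"

foreach ($ip in $ips) {
    $url  = "http://cbl.abuseat.org/lookup.cgi?ip=$ip"
    $page = (New-Object Net.WebClient).DownloadString($url)

    if ($page -match "not found") {
        "$ip : Not Listed"
    } else {
        "$ip : Listed"
    }
}
```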

 

Enjoy,

Shay

13 comments:

Anonymous said...

If you are only interested in the headers then it is more efficient to use the HEAD method instead of GET. HEAD returns just the headers.
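The commenter's suggestion maps directly onto method #2: setting the Method property to HEAD fetches only the headers, with no body (a sketch):

```powershell
# HEAD request: the server returns headers only, no document body
$req = [System.Net.HttpWebRequest]::Create("http://www.cnn.com")
$req.Method = "HEAD"
$res = $req.GetResponse()
[int]$res.StatusCode
$res.Headers
$res.Close()
```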

$hay@Israel said...

Good point. Anyway, I wanted to show a broad range of uses along with other methods and properties, but I should have mentioned it. Thank you.

halr9000 said...

Thanks for the examples Shay, #1 is perfect for my needs.

towens said...

Any ideas on using a similar method to use HEAD request on a webserver that is using SSL with ClientAuth=required? I don't want to pass a cert and bother with the ssl negotiation overhead. I'd like to just do a quick check to make sure site is listening on 443.

$hay@Israel said...

Try a different approach, see here:
http://halr9000.com/article/418

Does this help?
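For towens' case (just confirming something is listening on 443 without negotiating SSL), a raw TCP connect is one approach (a sketch; the server name is a placeholder):

```powershell
# Test that port 443 accepts connections, without any SSL handshake
$client = New-Object Net.Sockets.TcpClient
try {
    $client.Connect("www.example.com", 443)
    if ($client.Connected) { "Port 443 is open" }
}
catch {
    "Port 443 is closed or unreachable"
}
$client.Close()
```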

Anonymous said...

Would you please let me know which command I should send to get the list of users from the active directory of a remote machine?

-Jahedur Rahman

$hay@Israel said...

This should get you your AD domain users:

$domain = New-Object System.DirectoryServices.DirectoryEntry
$searcher = New-Object System.DirectoryServices.DirectorySearcher
$searcher.Filter = '(&(objectCategory=person)(objectClass=user))'
$searcher.FindAll()
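FindAll() returns SearchResult objects; to get readable names out of them you can pull individual attributes from each result's Properties collection (sAMAccountName as an example attribute):

```powershell
# Extract the account name from each search result
$searcher = New-Object System.DirectoryServices.DirectorySearcher
$searcher.Filter = '(&(objectCategory=person)(objectClass=user))'
$searcher.FindAll() | ForEach-Object {
    $_.Properties["samaccountname"][0]
}
```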

Brian said...

Another way to get AD users:

[adsi]'WinNT://your_domain_name' |select -exp Children |? {$_.Class -eq 'User'} |select FullName

Prakash Sundaramoorthy said...

Hi Shay - This is a really cool function and it has certainly improved my work. I need your help (please): is there a way to view a SAML header (something like IEHTTPHeaders)?

Thank you.

Pra4ash.

$hay@Israel said...

Thanks Prakash, shoot me an email: scriptolog at gmail dot com

NewEmployee said...

Since you are familiar with IIS7 Web Site administration, I was wondering if you could help me out.

Our team is building a C# project with a Silverlight module. We deploy to Windows 2008 with IIS 7. I'm trying to programmatically expire, immediately, the HTTP response headers associated with a folder called ClientBin. I know how to do it manually through IIS Manager. (Basically, I go to the HTTP Response Headers section of the folder or file of interest, and then use "Set Common Headers..." to expire immediately.) However, we will be redeploying to IIS a number of times, and I want to ensure it is done programmatically, because it's a headache to keep reconfiguring all the time.

Should I do it from the C# code of my project or is it better practice to do it using WMI scripting and/or PowerShell?

Could someone please assist?
Thanks,

Web Developer

Anonymous said...

Nice post! It fixed my problem. Thanks. :)

novelitasty said...
This comment has been removed by the author.