The .shell command in WinDbg allows you to pipe the output of a debugger command to an external process and automatically print that process's output back inside the debugger window; a useful example is piping to the FIND command. For example, if we want to walk the stack of every thread and find every call where the word "isapi" is involved:

0:036> .shell -ci "~*kpL1000" find /i "isapi"
013cff78 1004f94e ISAPI_Rewrite!TerminateFilter+0x3cef
013cffb0 1004f9f3 ISAPI_Rewrite!TerminateFilter+0x4462e
013cffec 00000000 ISAPI_Rewrite!TerminateFilter+0x446d3
0144ff78 1004f94e ISAPI_Rewrite+0x7763
0144ffb0 1004f9f3 ISAPI_Rewrite!TerminateFilter+0x4462e
0144ffec 00000000 ISAPI_Rewrite!TerminateFilter+0x446d3
.shell: Process exited

Incidentally, LogParser (one of my favorite debugging tools) can also accept the data to be parsed from the input stream using the STDIN keyword, so, for example, refactoring a script I posted some time ago, we can find out if there are any duplicated assemblies in our application pool that should be moved to the GAC:

0:000> .shell -ci "!peb" logparser "select extract_filename(text) as Duplicated_Assemblies, count(Duplicated_Assemblies) as Hits from STDIN where index_of(text, 'temporary asp.net files') > 0 group by Duplicated_Assemblies having count(Duplicated_Assemblies) > 1" -i:textline -o:nat -rtp:-1

Duplicated_Assemblies        Hits
---------------------------- ----
errormanager.dll             2
winformsui.dll               2
externallibraryinterface.dll 2
ptshopengine.dll             2
schemas.dll                  2
dbengine.dll                 2
flowservice.dll              2

Statistics:
-----------
Elements processed: 182
Elements output:    7
Execution time:     0.02 seconds

.shell: Process exited

Following the same principle, we can find out if there are strong named assemblies in our…
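The same trick works with any debugger command whose output you want to filter through an external tool; as a minimal sketch (the search string below is just a hypothetical example), this greps the loaded module list for anything containing "rewrite":

0:000> .shell -ci "lm" find /i "rewrite"

Whatever the external process writes to its standard output is echoed back in the command window, followed by the usual ".shell: Process exited" message.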
-
-
IIS logs around midnight
Recently I had an interesting discussion with a friend of mine about IIS logging: he configured his web server to create a new log file every day and then wanted to use a third-party tool to extract some statistics and graphs from the logs. Unfortunately the tool often had trouble and produced weird results, because sometimes at the end of the old log file (the one for the day that is ending) there are records whose date and time fall into the new day. Something like the following:

#Software: Microsoft Internet Information Services 7.0
#Version: 1.0
#Date: 2009-03-06 23:59:53
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status time-taken
2009-03-06 23:59:59 ::1 GET /MidnightTest/Never+doubt+thy+debugger.htm - 80 - ::1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+InfoPath.2;+FDM;+.NET+CLR+3.5.21022;+.NET+CLR+1.1.4322;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+OfficeLiveConnector.1.3;+OfficeLivePatch.0.0;+MS-RTC+EA+2;+MS-RTC+LM+8) 200 0 0 15
2009-03-06 23:59:59 ::1 GET /MidnightTest/Never+doubt+thy+debugger_files/global.js - 80 - ::1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+InfoPath.2;+FDM;+.NET+CLR+3.5.21022;+.NET+CLR+1.1.4322;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+OfficeLiveConnector.1.3;+OfficeLivePatch.0.0;+MS-RTC+EA+2;+MS-RTC+LM+8) 200 0 0 15
2009-03-06 23:59:59 ::1 GET /MidnightTest/Never+doubt+thy+debugger_files/light_sidebar_left_wide.css - 80 - ::1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+InfoPath.2;+FDM;+.NET+CLR+3.5.21022;+.NET+CLR+1.1.4322;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+OfficeLiveConnector.1.3;+OfficeLivePatch.0.0;+MS-RTC+EA+2;+MS-RTC+LM+8) 200 0 0 15
2009-03-06 23:59:59 ::1 GET /MidnightTest/Never+doubt+thy+debugger_files/customcss.css - 80 - ::1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+InfoPath.2;+FDM;+.NET+CLR+3.5.21022;+.NET+CLR+1.1.4322;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+OfficeLiveConnector.1.3;+OfficeLivePatch.0.0;+MS-RTC+EA+2;+MS-RTC+LM+8) 200 0 0 15
2009-03-06 23:59:59 ::1 GET /MidnightTest/Never+doubt+thy+debugger_files/vjr6phmypy.cache - 80 - ::1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+InfoPath.2;+FDM;+.NET+CLR+3.5.21022;+.NET+CLR+1.1.4322;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+OfficeLiveConnector.1.3;+OfficeLivePatch.0.0;+MS-RTC+EA+2;+MS-RTC+LM+8) 404 3 50 0
2009-03-06 23:59:59 ::1 GET /MidnightTest/Never+doubt+thy+debugger_files/msAnalytics.cache - 80 - ::1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+InfoPath.2;+FDM;+.NET+CLR+3.5.21022;+.NET+CLR+1.1.4322;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+OfficeLiveConnector.1.3;+OfficeLivePatch.0.0;+MS-RTC+EA+2;+MS-RTC+LM+8) 404 3 50 0
2009-03-06 23:59:59…
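If the downstream tool chokes on those spilled-over records, one option is to re-slice the raw logs by day with LogParser before feeding it, so that each record is selected on its own date regardless of which physical file it was written to. A rough sketch, assuming the W3C logs are named ex*.log and sit in the current folder:

logparser "SELECT * INTO 2009-03-06.log FROM ex*.log WHERE date = TIMESTAMP('2009-03-06', 'yyyy-MM-dd')" -i:IISW3C -o:W3C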
-
ViewState validation troubles?
If you work with web sites in a complex/multi-server environment you might be familiar with this error and, as the error message itself suggests, the first thing to check is the machineKey value, which must be the same across all of the servers involved. Anyway, every now and then you might get this same error even if you are not in an NLB/cluster environment (Tess and Tom have blogged about this here, here and here) or if you are absolutely sure you have properly configured the machineKey value. So what else can lead you to this state? If you want some quick background have a look at my post here; basically we have the following (very raw) steps:

Begin processing the new request
Load the “old” state (ViewState and ControlState) received with the posted values
Process the posted data (the actual form values) and raise the “change” events to change control state and do the real page work
Save the new state (again, ViewState and ControlState)
Send the result back to the client

It is important to note that if you dynamically add controls at runtime on your page (i.e. you do not only rely…
-
Charting with LogParser
A picture is worth a thousand words, and this is also true when you have to analyze IIS logs to get some statistics on your site; as you already know, LogParser is extremely flexible and very useful in such situations, and its chart output format is just what's needed. The first thing you need is a copy of Office Web Components installed on the machine; if you don't have it, LogParser will complain and return this error:

Error creating output format “chart”: This output format requires a licensed Microsoft Office Chart Web Component to be installed on the local machine

From the LogParser help:

chartType
Values: name of chart type
Default: Line
Description: Chart type
Details: The set of available chart types depends on the version of the Microsoft Office Web Components installed on the local computer. For a list of the available chart types, type the following help command from the command-line shell: LogParser -h -o:CHART

Currently the latest Office Web Components version available (the newest I've been able to find on the Internet) is version 11, which was released for Office 2003.

Changes in the 2007 Office System
OWC11
Description: OWC11 is removed from the 2007 Office system and now ships only…
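Just to give an idea of what the chart output looks like in practice, a command along these lines plots the hits per hour taken from the IIS logs (a sketch only: the log mask, output file name and chart options are assumptions to adapt to your case):

logparser "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, COUNT(*) AS Hits INTO HitsPerHour.gif FROM ex*.log GROUP BY Hour ORDER BY Hour" -i:IISW3C -o:CHART -chartType:Line -chartTitle:"Hits per hour" -view:on

The -view:on switch opens the generated chart as soon as LogParser is done, which is handy while experimenting with the various chartType values.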
-
LogParser did it again: application pool recycle
This started as a “standard” w3wp.exe crash, but this time IIS was also randomly disabling one of the application pools on the server. This is a protection mechanism we have had since IIS 6, called Rapid Fail Protection, which disables an application pool (for security and stability reasons) if it fails (encounters severe errors) too often: how many failures and over how much time are configurable (see Configuring Rapid-Fail Protection in IIS 6.0), the default being 5 failures in 5 minutes. Looking at the System event log there were plenty of messages like:

A worker process with process id of ‘xxx’ serving application pool ‘MyAppPool’ has requested a recycle because it reached its virtual memory limit.

And also:

A process serving application pool ‘MyAppPool’ exceeded time limits during shut down. The process id was ‘xxx’.

How often? As usual, LogParser comes in handy.

logparser "select quantize(TimeGenerated, 300) as TimeGen, count(*) as NumberOfrecycles into memory_limit_recycles.log from system.evt where EventID = 1077 and Message like '%\'MyAppPool%' group by TimeGen order by TimeGen" -i:evt -o:nat -rtp:-1

Note the “quantize” function: the LogParser help states it “Truncates a value to the nearest multiple of another value”; what it actually does in the command above is count the number of…
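The second message can be tracked the same way; if you prefer not to rely on a specific event ID, you can filter on a fragment of the message text instead (a sketch following the same pattern as above, with the application pool name from this scenario):

logparser "select quantize(TimeGenerated, 300) as TimeGen, count(*) as ShutdownTimeouts into shutdown_timeouts.log from system.evt where Message like '%exceeded time limits during shut down%' and Message like '%MyAppPool%' group by TimeGen order by TimeGen" -i:evt -o:nat -rtp:-1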
-
Time-taken and LogParser for web site statistics
The time-taken field is quite handy if your web application or IIS web server is performing poorly (or slower than you expect), or even for simple statistics and monitoring purposes, if you want to keep track of how long it takes for every web request to be processed and the page sent back to the client. In earlier versions of IIS the time-taken field was not selected by default, so I often had to ask customers to first enable it and then wait some more time for the problem to reoccur (to give IIS the chance to create a significant amount of logs with the new value) before being able to start troubleshooting. Well, at least before being able to extract some statistics involving the duration of web requests. Luckily, in IIS 7 time-taken is now selected by default, so we already have it logged and ready to use. Taken from the IIS online docs, here is the meaning of each field in IISW3C logs:

Date (date) Logs the date on which the request occurred. Selected by default
Time (time) Logs the time, in Coordinated Universal Time (UTC), at which the request occurred. Selected by default
Client IP Address (c-ip) Logs the IP address of the…
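As an example of the statistics the field enables, a query along these lines lists the pages with the highest average duration (just a sketch, assuming IISW3C logs named ex*.log in the current folder; time-taken is expressed in milliseconds):

logparser "SELECT TOP 20 cs-uri-stem, COUNT(*) AS Hits, AVG(time-taken) AS AvgMs, MAX(time-taken) AS MaxMs FROM ex*.log GROUP BY cs-uri-stem ORDER BY AvgMs DESC" -i:IISW3C -o:NAT -rtp:-1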
-
LogParser & event logs on Vista/Windows 2008
I was discussing this with a customer this morning, so here's a useful reminder (and a pointer to this post) if you need to deal with “legacy” .evt logs on Vista/Win2008:

wevtutil epl application.evt application.evtx /lf:true

Carlo

Quote of the day: That’s the secret to life… replace one worry with another…. – Charles M. Schulz
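Once converted, the .evtx file can be opened in the Vista/Windows 2008 Event Viewer or queried directly from the command line; for instance (a quick sketch, the record count is arbitrary):

wevtutil qe application.evtx /lf:true /c:10 /f:text

The /lf:true switch tells wevtutil that the first argument is a path to a log file rather than the name of a live event log channel.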
-
Select date ranges with LogParser
Here’s an addendum to my previous LogParser collection of scripts: how can we filter our IIS logs folder to extract only the events that happened within a certain time range?

logparser "select * into 27.5.log from med*.log where to_timestamp(date,time) between timestamp('2008/05/27 10:11:00', 'yyyy/MM/dd hh:mm:ss') and timestamp('2008/05/27 10:13:00', 'yyyy/MM/dd hh:mm:ss')" -i:iisw3c -o:nat -rtp:-1

Note I’m using the to_timestamp function to combine the date and time fields from the IIS log and compare the result to the date/time range I’m interested in.

Carlo

Quote of the day: The important thing in science is not so much to obtain new facts as to discover new ways of thinking about them. – Sir William Bragg
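A natural follow-up, once the range is isolated, is to see how the requests distribute inside it; here is a sketch combining the same filter with quantize (the 60-second bucket is an arbitrary choice):

logparser "select quantize(to_timestamp(date,time), 60) as Interval, count(*) as Hits from med*.log where to_timestamp(date,time) between timestamp('2008/05/27 10:11:00', 'yyyy/MM/dd hh:mm:ss') and timestamp('2008/05/27 10:13:00', 'yyyy/MM/dd hh:mm:ss') group by Interval order by Interval" -i:iisw3c -o:nat -rtp:-1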
-
LogParser scripts for various occasions…
I’ve been working again with LogParser lately to extract some statistics for an IIS server which was facing suspicious activity from the outside world: they were getting literally thousands of requests, from a bunch of IP addresses, to the page used to request a “forgot password” for their online services. For this post it is not important how we resolved the problem, but rather the occasion it gave me to create a few LogParser scripts to extract some statistics from the IIS logs, so I thought those might be useful for other people too… Of course you’re free to change them and adapt them to your needs. Before you proceed, a couple of words on the scripts: they are meant to be “generic” and run from a batch file which accepts some input arguments, but you can run them from a command prompt directly, replacing the “%x” placeholders; also, I print them on multiple lines to make them easier to read, but you must run them on a single line. This one counts how many requests you got to a specific page with a specific value in the query string (we were extracting data for the “change password” page with a…
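Along the same lines, a quick way to see which client IPs are hammering a given page is something like the following (a sketch only: 'forgotpassword' is a hypothetical page name, so replace it and the log mask with your own):

logparser "SELECT TOP 25 c-ip, COUNT(*) AS Hits FROM ex*.log WHERE cs-uri-stem LIKE '%forgotpassword%' GROUP BY c-ip ORDER BY Hits DESC" -i:IISW3C -o:NAT -rtp:-1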
-
Find folder size (conditionally) with LogParser
I was reading this post from Rahul a few days ago, and wondered if something similar could be done (hopefully with less effort) with LogParser… Well, it turned out that LogParser accepts a “-FS” switch, which stands for File System as the input type, and if you run “logparser -h examples” you’ll find a few sample commands you can use as the basis for your scripts. And guess what? Here’s one of them:

Print the 10 largest files on the C: drive:
LogParser "SELECT TOP 10 * FROM C:\*.* ORDER BY Size DESC" -i:FS

I created a similar structure and put in there some sample images, just to have some files to count. So… here’s the script:

logparser "SELECT SUM(Size) FROM c:\temp\Rahul\*.* WHERE INDEX_OF(Path, '_wow') > 0" -i:FS

And the output: 5 images, 35.931 bytes each = 179.655 bytes… here we are!

Carlo

Quote of the Day: It is a great thing to start life with a small number of really good books which are your very own. – Sir Arthur Conan Doyle
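And if you want to go one step further and rank whole folders instead of filtering on a name fragment, something along these lines should do (a sketch; EXTRACT_PATH returns the directory portion of each file's path, and -recurse:-1 tells the FS input format to walk all subfolders):

logparser "SELECT TOP 10 EXTRACT_PATH(Path) AS Folder, SUM(Size) AS TotalBytes, COUNT(*) AS Files FROM C:\temp\*.* GROUP BY Folder ORDER BY TotalBytes DESC" -i:FS -recurse:-1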