Saturday, December 26, 2009

Blogger Migration

I'm moving my blog, formerly hosted in a custom ASP.NET app on a server at home, to Blogger. I want to get out of the business of running a server at home, and I've migrated that box to Windows Home Server (I'll rave about that in another blog post).

There's only one problem with migrating via the Blogger APIs: they rate-limit you to fifty posts per day (the failure is an HTTP 400 with the plain-text body "Blog has exceeded rate limit or otherwise requires word verification for new posts"). I'm not sure whether that limit includes comments (which I'm importing, too). Either way, it will take somewhere between one and two weeks until I'm done (the code's all written; I just need to run it until it stops, every day, as sketched below).
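For the curious, here's a hedged sketch of that daily loop. PostEntry and the post queue are placeholders for my migration code, which I'm not posting here; the only Blogger-specific behavior it relies on is the HTTP 400 response described above.

using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

static void ImportUntilRateLimited(Queue<string> remainingPosts)
{
    while (remainingPosts.Count > 0)
    {
        try
        {
            PostEntry(remainingPosts.Peek()); // POST one entry to Blogger
            remainingPosts.Dequeue();
        }
        catch (WebException ex)
        {
            HttpWebResponse response = ex.Response as HttpWebResponse;
            if (response == null) throw;

            string body;
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                body = reader.ReadToEnd();
            }

            // The daily cap shows up as HTTP 400 plus the message quoted above.
            if (response.StatusCode == HttpStatusCode.BadRequest &&
                body.Contains("exceeded rate limit"))
            {
                Console.WriteLine("Rate limited; run me again tomorrow.");
                return;
            }

            throw; // anything else is a real failure
        }
    }
}

// Placeholder for the actual migration call (not shown in this post).
static void PostEntry(string entryAtomXml)
{
    throw new NotImplementedException();
}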


Home, sweet home.

Wednesday, December 16, 2009

Cookie Gotchas in ASP.NET

As a result of a recent security audit, we were asked to implement a more secure session identifier to help make session hijacking harder. Specifically, our requirements were:

  • Ensure that client-specified session IDs are system-generated
  • Generate a new session ID on login
  • Invalidate anonymous session IDs on login

We started following a strategy very similar to this solution on MSDN. If you don't click the link, the summary is: implement an HttpModule; in BeginRequest, inspect the request cookie, extract and verify a hash from it, and overwrite the session ID cookie in the Request so that the underlying session implementation is unaware of any cookie mucking; then, on EndRequest, write a new session cookie to the Response with a hash appended. (A rough sketch is below.) The primary difference between that solution and ours is that ours encrypted the session ID and authentication information into the token so that we could meet the latter two requirements. (I'll blog about the specific solution at a later date.)
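Here's a minimal sketch of that approach. The module name, the token format ("<id>|<hash>"), and the key handling are my own stand-ins rather than the MSDN code, and, as the rest of this post explains, the Request.Cookies overwrite in BeginRequest doesn't actually stick.

using System;
using System.Security.Cryptography;
using System.Text;
using System.Web;

public class SessionTokenModule : IHttpModule
{
    private const string CookieName = "ASP.NET_SessionId";

    // Stand-in key; the real one would come from configuration.
    private static readonly byte[] Key = Encoding.UTF8.GetBytes("replace-me");

    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate { OnBeginRequest(app.Context); };
        app.EndRequest += delegate { OnEndRequest(app.Context); };
    }

    public void Dispose() { }

    private static void OnBeginRequest(HttpContext ctx)
    {
        HttpCookie cookie = ctx.Request.Cookies[CookieName];
        if (cookie == null) return;

        // Token format: "<sessionId>|<hash>". Verify and strip the hash
        // so the session provider only ever sees the bare ID.
        string[] parts = cookie.Value.Split('|');
        cookie.Value = (parts.Length == 2 && parts[1] == Hash(parts[0]))
            ? parts[0]       // hand the bare ID to the session provider
            : string.Empty;  // tampered: force a new, system-generated ID
    }

    private static void OnEndRequest(HttpContext ctx)
    {
        if (ctx.Session == null) return;

        // Re-issue the cookie with the hash appended.
        string id = ctx.Session.SessionID;
        ctx.Response.Cookies.Set(new HttpCookie(CookieName, id + "|" + Hash(id)));
    }

    private static string Hash(string value)
    {
        using (HMACSHA256 hmac = new HMACSHA256(Key))
        {
            return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(value)));
        }
    }
}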

There's only one problem: it doesn't work. At all. You can't overwrite a Request cookie.

Sure, it works in trivial solutions (including my POC), but if you modify the Response.Cookies collection, you lose your modifications to the Request.Cookies collection.

I cracked open Reflector to understand why. Take a look at System.Web.HttpCookieCollection.Remove(string):

public void Remove(string name)
{
    if (this._response != null)
    {
        this._response.BeforeCookieCollectionChange();
    }
    this.RemoveCookie(name);
    if (this._response != null)
    {
        this._response.OnCookieCollectionChange();
    }
}

And then System.Web.HttpResponse.OnCookieCollectionChange():

internal void OnCookieCollectionChange()
{
    this.Request.ResetCookies();
}

And, as if any of this were necessary, System.Web.HttpRequest.ResetCookies():

internal void ResetCookies()
{
    if (this._cookies != null)
    {
        this._cookies.Reset();
        this.FillInCookiesCollection(this._cookies, true);
    }
    if (this._params != null)
    {
        this._params.MakeReadWrite();
        this._params.Reset();
        this.FillInParamsCollection();
        this._params.MakeReadOnly();
    }
}

Digging through Reflector's Analyzer, the cookies in the HttpRequest are reset by the following seemingly innocuous methods:

  • System.Web.HttpCookieCollection.Remove(String)
  • System.Web.HttpCookieCollection.Set(HttpCookie)
  • System.Web.HttpResponse.SetCookie(HttpCookie)

So if you ever try to overwrite a cookie in the HttpRequest, you had better not call any of those three methods; otherwise, the cookies will get reloaded by re-parsing the original values from the raw request.
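To make the gotcha concrete, here's a contrived repro (the method and cookie value are mine); imagine it sitting in an HttpModule's BeginRequest handler:

using System.Web;

static void OverwriteThatWontStick(HttpContext ctx)
{
    // Try to overwrite the inbound session cookie...
    ctx.Request.Cookies.Set(new HttpCookie("ASP.NET_SessionId", "rewritten"));

    // ...but Set(HttpCookie) is one of the three methods above, so it
    // triggers ResetCookies(), which re-parses the raw Cookie header.
    // The overwrite undoes itself:
    string value = ctx.Request.Cookies["ASP.NET_SessionId"].Value;
    // value is the original raw value again, not "rewritten"
}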

Final thoughts:

- You can use reflection to set the internal HttpCookieCollection._response field to null, which avoids the OnCookieCollectionChange call, but if you do, it will break other things, like the automatic adding of cookies when you access them, as in System.Web.HttpCookieCollection.Get(string) (called by the this[string] indexer); a sketch of the hack follows the snippet:

public HttpCookie Get(string name)
{
    HttpCookie cookie = (HttpCookie) base.BaseGet(name);
    if ((cookie == null) && (this._response != null))
    {
        cookie = new HttpCookie(name);
        this.AddCookie(cookie, true);
        this._response.OnCookieAdd(cookie);
    }
    return cookie;
}
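For completeness, the reflection hack looks something like this. A sketch only: _response is an internal implementation detail that could change in any framework update, and, as noted above, nulling it breaks Get's auto-add behavior.

using System.Reflection;
using System.Web;

static void DetachResponse(HttpCookieCollection cookies)
{
    // Null the internal _response field so cookie changes no longer
    // trigger OnCookieCollectionChange / Request.ResetCookies().
    FieldInfo field = typeof(HttpCookieCollection).GetField(
        "_response", BindingFlags.Instance | BindingFlags.NonPublic);
    field.SetValue(cookies, null);
}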

- I could only find one other person on the Internet who has experienced this problem.

- There's some other weirdness where adding a Response cookie resets the Request.Cookies collection and merges the Response.Cookies into the Request.Cookies collection. You might be able to benefit from that: add a cookie to the Response.Cookies collection, call one of the methods that triggers resetting the Request cookies, and now your Response cookie will be one of the Request cookies (sketched below). That didn't work in our case due to later cookie manipulation by the Session provider.
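In sketch form (my names, and relying only on the reset behavior described above; I make no promises about when exactly the merge happens in other framework versions):

using System.Web;

static void SmuggleResponseCookieIntoRequest(HttpContext ctx)
{
    // SetCookie(HttpCookie) is one of the three "reset" methods, so this
    // both queues the cookie on the Response and triggers
    // Request.ResetCookies()...
    ctx.Response.SetCookie(new HttpCookie("ASP.NET_SessionId", "new-id"));

    // ...which re-parses the raw header and then merges the Response
    // cookies in, so the new value is now visible on the Request side:
    string value = ctx.Request.Cookies["ASP.NET_SessionId"].Value; // "new-id"
}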

Wednesday, November 25, 2009

Troubleshooting a Custom Resource Provider in ASP.NET

My current project is a large multi-lingual eCommerce engagement. Due to business requirements (on-the-fly updates), we externalized resources into a database (following this MSDN article). Furthermore, due to business requirements (uptime for a very high-revenue site with regular mid-day content updates), they need to be able to publish markup (aspx & ascx) on the fly without restarting IIS.

We were seeing a sporadic issue where resources would disappear from the page. For example, there was a custom control that looked like this:

<custom:CustomHyperLink ID="TL" runat="server" ImageUrl="[Header Image]" SiteMapNodeId="Basket" ToolTip="[Header]" meta:resourcekey="HeaderImage" />

When everything was working (which was 99% of the time), the ImageUrl would be looked up at runtime (via the control's HeaderImage.ImageUrl resource) and substituted in as ~/images/en/header_en.gif. But when it didn't work, we would see an <img src="[Header Image]" />, which obviously is a problem.

Adding to the confusion, only some of the resources showed this problem. Also, it was seemingly random: everything would be working one minute and broken the next, in the middle of the day. Putting together steps to reproduce was nearly impossible. It had something to do with loading resources and/or compiling the site and/or restarting IIS and/or the planets aligning.

Here's what we didn't understand: resources in ASP.NET can be declared either explicitly or implicitly. Explicit means code: GetGlobalResourceObject("className", "keyName") or GetLocalResourceObject("keyName"). Implicit means markup: meta:resourcekey="keyName".

Explicit calls are evaluated at runtime, every time, because they are regular code that gets executed in the regular way.
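For example (resource class and key names invented), in a page's code-behind:

protected void Page_Load(object sender, EventArgs e)
{
    // Explicit lookups are plain method calls, resolved on every
    // request, so they always reflect the current resource store.
    string header = (string)GetGlobalResourceObject("Labels", "Header");
    string toolTip = (string)GetLocalResourceObject("HeaderImage.ToolTip");
}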

Implicit calls are evaluated once, at compile time, and the compiler (essentially) hooks up a binding between the appropriate property and the equivalent explicit / GetLocalResourceObject()* call. To walk through the control sample above:

  • At ASP.NET compile time, that line is parsed. The compiler looks at what’s specified in meta:resourcekey.
  • That key is passed into IImplicitResourceProvider.GetImplicitResourceKeys. This method queries the resource provider for all valid keys starting with HeaderImage, and returns all of those keys in an ICollection. Our example thus returns {"HeaderImage.ImageUrl", "HeaderImage.ToolTip"}.
  • The compiler inspects all of the keys, parses them, and performs databinding. Our example matches HeaderImage.ImageUrl from our resource provider and hooks up a binding between the property and IImplicitResourceProvider.GetObject("HeaderImage.ImageUrl") ... *which I believe calls GetLocalResourceObject() to complete the circle.

If the resource datastore, whatever it may be, is unavailable (or empty!) at ASP.NET compile time, the call to GetImplicitResourceKeys will return nothing and no databinding will occur.
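To illustrate, here's roughly what the compile-time side looks like. This is a sketch against the real System.Web.Compilation.IImplicitResourceProvider interface, but the provider class, LoadKeysFromDatabase, and the key store are hypothetical stand-ins for our database-backed provider.

using System.Collections;
using System.Collections.Generic;
using System.Globalization;
using System.Web.Compilation;

public class DatabaseImplicitResourceProvider : IImplicitResourceProvider
{
    // Called by the ASP.NET compiler for each meta:resourcekey prefix.
    public ICollection GetImplicitResourceKeys(string keyPrefix)
    {
        List<ImplicitResourceKey> keys = new List<ImplicitResourceKey>();
        foreach (string fullKey in LoadKeysFromDatabase()) // hypothetical
        {
            // "HeaderImage" matches "HeaderImage.ImageUrl", etc.
            if (fullKey.StartsWith(keyPrefix + "."))
            {
                string property = fullKey.Substring(keyPrefix.Length + 1);
                keys.Add(new ImplicitResourceKey(null, keyPrefix, property));
            }
        }
        // If the database is empty at compile time, this collection is
        // empty, the compiler generates no bindings, and the control
        // keeps its default text (our "[Header Image]").
        return keys;
    }

    public object GetObject(ImplicitResourceKey key, CultureInfo culture)
    {
        // Runtime resolution of "HeaderImage.ImageUrl" and friends;
        // omitted here.
        return null;
    }

    private static IEnumerable<string> LoadKeysFromDatabase()
    {
        // Stand-in for querying the resource table.
        return new string[0];
    }
}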

That's what was happening to our resources. We would push new markup and immediately afterwards import new resources. But our import process was naïve; it deleted all of the resources from the database, scanned the filesystem for .resx files, loaded them into memory, and then inserted them into the database. And because it was a live site, many of these controls would be hit by the ASP.NET compiler while the database was empty. This meant the compiler thought it had nothing to do because GetImplicitResourceKeys returned nothing and so would let the resources fall back to their default text, in our case, [Header Image].

The resources that never exhibited a problem were of course retrieved explicitly.

I want to give credit to Rick Strahl, whose blogged frustrations about this topic helped point me in the right direction. Unlike him, I did not have to implement IImplicitResourceProvider (which agrees with MSDN); I just had to track down a race condition where it was being called at an inopportune time.

In summary, pushing new markup and then immediately emptying the resource datastore on a live site will break implicit (meta:resourcekey) resources due to how the ASP.NET compiler resolves implicit bindings.

Tuesday, October 20, 2009

Windows 7 Media Center WTV to iPhone / iPod Video Transcode

Below is my first pass at a PowerShell utility that monitors the Recorded TV folder from Windows 7 Media Center and automatically trims commercials from new recordings and transcodes them into iPhone format.

Some notes:

  • The version of ffmpeg.exe comes from iPodifier; I want to move to a more recent stock build.
  • Requires comskip and dvrcut. Built with Comskip 80_025.
  • Next version will also output Podcast XML, which will greatly clean up the experience on the iPhone.
  • Keep an eye on the transcode tag here for future updates.

$sourceFilesUnfiltered = "C:\Users\Public\Recorded TV\*.wtv";
$binDir = "C:\Users\Media\Downloads\comskip80_025";
$outputDir = "C:\iPodVideos2";
$keepWindowDays = 14;
$sleepInterval = 600;

while(1)
{
    # Filter the input files to the keep window, oldest first.
    $sourceFiles = Get-ChildItem $sourceFilesUnfiltered |
        Where-Object { $_.CreationTime -gt (Get-Date).AddDays(0 - $keepWindowDays) } |
        Sort-Object CreationTime;

    Write-Output ("Setting process priority to Idle.");
    [System.Diagnostics.Process]::GetCurrentProcess().PriorityClass = [System.Diagnostics.ProcessPriorityClass]::Idle;

    # Convert WTV to DVR-MS, analyze for commercials, trim commercials, and transcode to iPhone format.
    foreach($file in $sourceFiles)
    {
        # We don't want to process a currently recording show, so try to open it for writing.
        $file.Refresh();
        $handle = $null;
        try
        {
            $handle = $file.OpenWrite();
        }
        catch
        {
            Write-Output ("Skipping $file because it is in use.");
            continue;
        }
        finally
        {
            if($handle -ne $null) { $handle.Close(); }
        }

        $baseName = $file.BaseName;
        $dvrmsFile = "$outputDir\$baseName.dvr-ms";
        $cleanName = "$outputDir\$baseName" + "_clean.dvr-ms";
        $m4vName = "$outputDir\$baseName" + ".m4v";
        $trimFile = "$outputDir\$baseName" + "_dvrcut.bat";

        # Skip shows that already have an iPod version.
        if(!(Test-Path $m4vName))
        {
            Write-Output "Beginning $file";

            Write-Output ("Converting to DVR-MS.");
            & "C:\Windows\ehome\WTVConverter.exe" $file $dvrmsFile | Wait-Process;

            Write-Output ("Running ComSkip.");
            & "$binDir\comskip.exe" -q --ini="$binDir\comskip.ini" $dvrmsFile $outputDir | Wait-Process;

            Write-Output ("Trimming Commercials.");
            Start-Process -Wait $trimFile;

            Write-Output ("Compressing video.");
            & "$binDir\ffmpeg.exe" -y -i $cleanName -f mp4 -s 480x320 -acodec libfaac -async 4800 -dts_delta_threshold 1 -threads auto -vcodec libx264 -b 512k -level 21 -r 30000/1001 -bufsize 2000k -maxrate 768k -g 250 -coder 0 $m4vName | Wait-Process;

            # Clean up intermediate files.
            del "$outputDir\$baseName.log", "$outputDir\$baseName.logo.txt", "$outputDir\$baseName.txt", $trimFile, $dvrmsFile, $cleanName;
        }
        else
        {
            Write-Output "Nothing to do for $file";
        }
    }

    # Report extra (outside of the keep window) output files.
    foreach($m4vFile in Get-ChildItem "$outputDir\*.m4v")
    {
        $matchFound = $false;
        foreach($wtvFile in $sourceFiles)
        {
            if($wtvFile.BaseName -eq $m4vFile.BaseName)
            {
                $matchFound = $true;
            }
        }
        if(!$matchFound)
        {
            Write-Output ("Extra file found: $m4vFile");
        }
    }

    Start-Sleep $sleepInterval;
}

Sunday, September 06, 2009

Whoops

Had a hard drive die on the RAID5 and was down for a few weeks. Everything is back up, now with the web and SQL servers running on a Hyper-V VM. Next time the OS drive dies, all I have to do is remount the VHD and go.

Sunday, January 25, 2009

Bootable USB Flash Drive

Reposting these instructions on making a bootable flash drive.

  • On a command prompt, run diskpart.
  • list disk
  • select disk 2 (Replacing 2 with the correct number!)
  • clean
  • create partition primary
  • select partition 1
  • active
  • format fs=fat32 quick (Note that fs=ntfs will NOT be bootable!)
  • assign
  • exit
  • robocopy /MIR [source drive] [destination flash drive]

In short, you create an active, primary partition on the drive (and you have to do this in diskpart, because the Windows UI doesn't support it) and then you copy the files from your Windows installation media to the drive. And here I always thought that this was something difficult!

Windows 7 Media Center

I upgraded my Media Center box to Windows 7 Beta build 7000. Most everything is working; the native Clear-QAM support is most welcome! My channels actually say 5.1 instead of 1868. Good stuff. A little polish on the UI is also welcome. (I did have to manually change a registry flag that didn't upgrade automatically.)

The only issue I am having is with—surprise, surprise—the NVidia drivers. HDCP isn't working (solved with AnyDVD HD), switching resolutions is flaky, and resizing the desktop (to correct overscan) doesn't work. Oh, and that's with the Vista drivers, because the Windows 7 drivers crashed the system and had to be rolled back. But...I'm excited enough about testing it to deal with it.

Saturday, January 17, 2009

ShowMeCables.com

I have to post a rave about the customer service at ShowMeCables.com.

A while back, I decided to make a Y-adapter so that I could use an iPhone headset with my computer, which has the traditional separate mic and headphone jacks.

A few weeks ago, I got an email through my web contact form from John at ShowMeCables.com. He asked me if I knew why he was seeing so much web traffic from my domain. I told him about the adapter I had put together and shared links, and said that a post I had made on a forum had really led to it being noticed.

He replied and said he understood what I was trying to accomplish. He said that he would have one of the guys in his custom cable division whip me up a prototype and he'd send it to me. All he asked in return was that I let him know if it met my expectations.

A few days later I got a FedEx package with the connector. It was well made, with durable-feeling construction. The best part was the handwritten-in-Sharpie "mic" and "headphone" labels on the connectors. They sent me the very first prototype! I love it!

So to anyone looking for any custom cable connectors, do me a favor and check out ShowMeCables.com. If you have interest in the specific adapter, you can order it directly here. In fact, the adapter pictured there (at least as of now) is the actual unit they sent to me.

Is that customer service worth raving about, or what??

Sunday, January 04, 2009

Anthony Bourdain

Quotes from Kitchen Confidential:

"Saving for well-done" is a time-honored tradition dating back to cuisine's earliest days. ... What happens when the chef finds a tough, slightly skanky end-cut of sirloin that's been pushed repeatedly to the back of the pile? He can throw it out, but that's a total loss. He can feed it to the family, which is the same as throwing it out. Or he can "save for well-done"—serve it to some rube who prefers his meat or fish incinerated into a flavorless, leathery hunk of carbon, who won't be able to tell if what he's eating is food or flotsam. Ordinarily, a proud chef would hate this customer, hold him in contempt for destroying his fine food. But not in this case. The dumb bastard is paying for the privilege of eating his garbage! What's not to like?

Vegetarians, and their Hezbollah-like splinter faction, the vegans, are a persistent irritant to any chef worth a damn. To me, life without veal stock, pork fat, sausage, organ meat, demi-glace or even stinky cheese is a life not worth living. Vegetarians are the enemy of everything good and decent in the human spirit, an affront to all I stand for, the pure enjoyment of food.

Saturday, January 03, 2009

Packing Peanuts

I was cleaning up some old packaging (as well as taking down the Christmas tree) when I noticed that this box full of packing peanuts had a note in it that the peanuts were made of cornstarch. Intrigued, I got one wet and it melted away. So I dumped the whole box in the sink and stirred it in hot water until they dissolved. They washed right away. I wonder if I could thicken a sauce with them, too?

The quest for the perfect cup of coffee

I enjoy coffee, and am stuck on the quest for the perfect cup. There are so many variables: the quality and temperature of the water, the origin and processing of the coffee beans, the freshness of the coffee roast, the freshness of the coffee grind, the evenness of the grind, the time the grounds spend in the hot water. And of course, how those variables are combined: in an espresso machine, in a French press, in an automatic drip machine or—in my latest and best experiment—a vacuum pot.

French presses have two main flaws: they make too muddy a cup, and they are prone to over-extraction. If the coffee and hot water are in contact for too long, the grounds are over-extracted and bitter compounds are released (most cheap coffee we drink in America is made from too little coffee brewed for too long, resulting in a weak and bitter cup), and with a press you tend to push the plunger and then drink off of the still-extracting, slowly-bittering brew as you go. (You can fix that by pouring the pot into a thermos immediately after plunging, but who does that?) That just leaves the muddy-cup issue.

Most automatic drip machines (the typical American way of drinking coffee) don't get the water to the proper temperature and/or dump the water unevenly on the grounds and/or let the water and grounds commingle for the wrong amount of time and/or lose some of the flavorful oils to the paper filter. The fact is, once you surrender control of so many variables to an "automatic" process, you also surrender the ultimate quality of the final product.

To fix the drawbacks of the auto-drip process, I have been, for the past year or two, using a manual pour-over process. With a filter cone, a permanent gold filter and a tea kettle, you can make a damn good cup of coffee. But the gold filter does leave a little mud in the very last cup.

And then of course there is espresso—the magnifying-glass-on-your-flaws method of brewing coffee. With a very fine grind, densely packed, extracted by hot water forced through it under high pressure, the slightest flaw (such as an uneven tamp of the grounds) can produce a worthless shot. But with such high demands comes a high reward: a perfect shot of espresso is a dense and intense bit of coffee flavor. Like a beautiful piece of dark chocolate, there is nothing quite like it.

Today I tried out the latest method of making coffee: a manual vacuum pot. (I have since stopped using the automatic vacuum pot as I didn't like surrendering control to the automatic process.) My latest and greatest coffee gadget is a Yama 20oz stove-top coffee siphon. I also picked up a used Cory glass rod filter (search on eBay for it; it's a contraption from 1933 that'll run you $5) so that I would never have to replace the cloth filter that came with the Yama.

Vacuum pots are probably the "coolest" way to brew coffee. As boiling water builds steam pressure in the bottom chamber, water is forced up the tube into the top chamber, where it sits slightly below boiling temperature; when removed from heat, the cooling bottom chamber creates a vacuum, pulling the coffee back down through the filter (search YouTube for an example video). Vacuum pots produce an exceptionally clean cup of coffee—even though the "filter" is just the friction of a glass rod resting in a tube holding back the grounds. The pressure from the vacuum actually packs the grounds together and pulls the liquid through them, further filtering out the fine sediment.

All things considered, this new brewer has amazing potential. I need some practice to tweak the variables that I control (size of grind, amount of grind, and extraction time), but the beautifully clean cup of coffee and the simplicity of operation will keep me happy for a while...at least until the next cool gadget comes along!