At this year's GDC, I went to a talk by John Hable of Naughty Dog, and he showed a slide very much like this. I subsequently reproduced it in our game engine here at Sucker Punch. It's a real eye-opener about the importance of gamma correction.
Both images here use a bog-standard ambient + Lambert diffuse + Blinn specular lighting model. The difference is that on the left we ignore gamma altogether, effectively doing the lighting computation in gamma space, which is physically incorrect. On the right, we apply inverse gamma to all the light levels and colors to convert them into linear space, do the lighting there, and then convert the result back into gamma space.
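To make the pipeline concrete, here's a minimal single-channel sketch of the idea. It uses a plain 2.2 power curve as a stand-in for the display gamma (the exact sRGB curve differs slightly), and all the function names and parameters are mine, not our engine's actual code:

```cpp
#include <cassert>
#include <cmath>

// Approximate gamma decode/encode using a simple 2.2 power curve.
// (Illustrative only; the real sRGB curve is piecewise.)
static float gammaToLinear(float c) { return std::pow(c, 2.2f); }
static float linearToGamma(float c) { return std::pow(c, 1.0f / 2.2f); }

// Ambient + Lambert diffuse + Blinn specular for one color channel.
// All inputs are assumed to already be in linear space.
static float shadeChannel(
    float ambient, float albedo, float lightColor,
    float nDotL, float nDotH, float specPower)
{
    float diffuse = albedo * lightColor * std::fmax(nDotL, 0.0f);
    float specular = lightColor * std::pow(std::fmax(nDotH, 0.0f), specPower);
    return ambient * albedo + diffuse + specular;
}

// Gamma-correct version: decode the gamma-space inputs to linear,
// light in linear space, clamp, and re-encode for display.
static float shadeGammaCorrect(
    float ambient, float albedo, float lightColor,
    float nDotL, float nDotH, float specPower)
{
    float result = shadeChannel(
        gammaToLinear(ambient), gammaToLinear(albedo),
        gammaToLinear(lightColor), nDotL, nDotH, specPower);
    return linearToGamma(std::fmin(result, 1.0f));
}
```

The "wrong" left-hand image corresponds to calling `shadeChannel` directly on the gamma-space values and skipping both conversions.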
As you can see, it makes an enormous difference for the specular highlight. Lighting in gamma space (the left) causes the specular highlight to take on the color of the underlying surface somewhat, creating the odd-looking ring where the green channel saturates but the red and blue channels are still catching up. In linear space (the right) the green interpolates to white much more cleanly. The diffuse shadow line is also crisper in the gamma-correct version.
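A toy example (mine, not from the slide) shows why the red and blue channels lag behind. Blend a saturated green surface halfway toward white, once naively in gamma space and once correctly in linear space, again using a 2.2 power curve for illustration:

```cpp
#include <cassert>
#include <cmath>

static float toLinear(float c) { return std::pow(c, 2.2f); }
static float toGamma(float c)  { return std::pow(c, 1.0f / 2.2f); }

// Wrong: lerp toward white directly on the gamma-encoded value.
static float blendInGammaSpace(float surface, float t)
{
    return surface + t * (1.0f - surface);
}

// Right: decode, lerp toward white in linear space, re-encode.
static float blendInLinearSpace(float surface, float t)
{
    float lin = toLinear(surface);
    return toGamma(lin + t * (1.0f - lin));
}
```

For the red and blue channels of a green surface (say 0.1), a halfway blend gives 0.55 in gamma space but roughly 0.73 when done in linear space: the gamma-space highlight stays visibly green much longer, which is exactly the ring in the left image.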
These images used the sRGB gamma curve, which is implemented in hardware on many GPUs, including the PS3's RSX. (The X360's Xenos GPU also supports hardware gamma encoding/decoding, but with a slightly different curve.) The hardware gamma correction is quite fast, so lighting in linear space needn't be a performance hit even in a real game.
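For reference, the sRGB curve those hardware units implement is not a pure power function: it has a short linear segment near black and an exponent of 2.4 elsewhere. These are the standard reference formulas:

```cpp
#include <cassert>
#include <cmath>

// sRGB gamma-encoded value -> linear light, per the sRGB standard.
static float srgbToLinear(float c)
{
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

// Linear light -> sRGB gamma-encoded value.
static float linearToSrgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}
```

The linear toe exists so the curve is invertible without a huge derivative at zero; the net effect is close to a pure 2.2 power curve, which is why the approximation above is common.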
If I'm not mistaken, many renderers out there ignore gamma issues altogether, or perhaps just provide a single gamma slider to adjust things at the end, after all rendering is performed. This screenshot is, I hope, motivation enough to start paying attention to gamma in your own renderer! While I can't show any screenshots, switching to gamma-correct lighting has certainly had a positive effect on the overall lighting quality of our upcoming game.