I came across an article on Wired.com whose title promised a new tool. Exciting!
Not really. It is a new tool, but it isn't a new methodology. I did the same thing years ago at a company that had run into a problem with its use of a web service: the application was repeatedly calling the same web service with the same requests. Because the application had already gone through the QA/UAT cycle (which was very expensive), they didn't want to incur the regression costs of changing it.
My solution was to build a ServiceHost application exposing the same interface, move the real web service to a different port, and have the ServiceHost take over the original port. The ServiceHost then created and maintained a cache in front of the web service.
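The original was a WCF ServiceHost, but the idea is language-agnostic. Here is a minimal sketch of it in Python; the names (`CachingProxy`, `backend`) are illustrative, not the original code:

```python
class CachingProxy:
    """Expose the same call interface as the real service, but answer
    repeated identical requests from a local cache instead of hitting
    the backend every time."""

    def __init__(self, backend):
        self._backend = backend  # the real web-service call
        self._cache = {}

    def call(self, *args):
        if args not in self._cache:            # first request: fetch and remember
            self._cache[args] = self._backend(*args)
        return self._cache[args]               # repeats: served from the cache


# Hypothetical usage: a stub backend that records how often it is hit.
calls = []

def backend(x):
    calls.append(x)
    return x * 2

proxy = CachingProxy(backend)
print(proxy.call(21))   # → 42, fetched from the backend
print(proxy.call(21))   # → 42, served from the cache
print(len(calls))       # → 1, backend was only hit once
```

Because callers talk to the proxy through the exact same interface they used before, nothing upstream has to change; only the caching layer itself needs new testing.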
Since the interface didn't change, the only QA/UAT cost was proving that the ServiceHost cached and fetched appropriately.