A Google Webmaster Help thread has a webmaster who is looking to lower the number of blocked URLs reported in the Google Webmaster Tools Index Status report.
To make a long story short, they used robots.txt to block hundreds of thousands of pages. Eventually they removed the pages themselves and deleted the blocking rules from their robots.txt. But Google still shows the URLs as blocked in the Index Status report.
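For illustration, the blocking rules would have looked something like this in the site's robots.txt (the paths here are hypothetical; the thread does not show the actual file):

    User-agent: *
    Disallow: /archive/
    Disallow: /old-listings/

Deleting those Disallow lines lifts the block, but Googlebot only picks up the change after it re-fetches robots.txt, and the Index Status chart only reflects it after the affected URLs have been recrawled.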
Google’s John Mueller explained that it can take a long time for Google to recrawl and notice the pages are no longer there. He wrote:
We also know the index status report is delayed by about a week or so.
Forum discussion at Google Webmaster Help.