Bob Bentley of Intel told the story of formal verification within the company last month at the Jasper User Group Meeting. His basic philosophy is that proving correctness is much better than testing for correctness. As Dijkstra said in the context of software development, "testing shows the presence of bugs, not their absence." Bob started off by stating Intel's policy of not endorsing vendors, so nothing he said should be taken as an endorsement. In fact, Intel uses a mixture of internal tools and commercial tools. It is a reasonable bet that they have at least one of everything. Intel got into formal verification very early due to a well-known, and very expensive, error that was not caught by simulation.
Formal approaches suddenly gained a lot of traction after the 1994 Pentium floating-point divide bug. This caused Intel to take a $475M charge against earnings, and management decreed "don't ever let this happen again". In 1996 they started proving properties of the Pentium processor FPU.
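For anyone who never ran it at the time, here is a minimal sketch in C of the widely circulated FDIV test case (the constants 4195835 and 3145727 are the famous pair; on a correct FPU the expression is essentially zero, while a flawed Pentium returned a result of about 256):

```c
#include <stdio.h>

int main(void) {
    /* volatile keeps the compiler from folding the arithmetic at compile time,
       so the division actually runs on the FPU under test */
    volatile double x = 4195835.0;
    volatile double y = 3145727.0;

    /* On a correct FPU this is zero (give or take a tiny rounding error);
       the flawed Pentium's divider produced a quotient off in the 5th
       decimal place, making the result come out around 256. */
    double r = x - (x / y) * y;
    printf("%g\n", r);
    return 0;
}
```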
Then in 1997 a bug was discovered in the FIST instruction (which converts floating-point numbers to integers) in the Pentium Pro FPU that had been formally verified. It was a protocol mismatch between two blocks that had not been accounted for in the informal arguments. Another escape.
So they went back to square one, and during 1997-98 they verified the entire FPU against high-level specs so that mismatches like the FIST bug could no longer escape. During 1997-99 the Pentium 4 processor was verified, and there were no escapes.
That formed the basis of Intel's work in the 2000s, as they generalized the approach, scaled it out to other design teams, and simplified it so that it was usable "by mere mortals" rather than only by formal verification gods.
They also extended the work to less datapath-dominated parts of designs, such as out-of-order instruction logic and clock gating for power reduction.
Going forward they want to replace 50% of unit-level simulation with formal approaches by 2015. This is a big challenge, of course, but it will spread the word, democratize formal as an established part of the verification landscape, and systematize it.
They also want to extend the work to formally verifying security features and firmware, improving test coverage while reducing vector counts, doing symbolic analysis of analog designs (including formally handling variation), and pre-validating IP.