Really, trying to debug an FPGA in a more complex system with just a scope can be very frustrating.
As opposed to simulating a USB controller, DDR SDRAM, and numerous complex proprietary peripherals for which you don't have models?
Simulation has its value in education, but you'll never convince me that it's a win in real-world complex designs. The effort required to keep the simulation faithful to the peripheral hardware, including aspects that aren't documented, will generally be higher than the effort needed to debug the hardware itself.
Those are valid points, but there is another benefit to simulation: rerunning previous validation and verification tests when something changes.
Changes can be any of:
- the next incremental implementation of part of a design. The software world has triumphantly reinvented this concept as TDD!
- a peripheral change
- changed requirements, either from the customer or based on improved understanding
- changed device, within limits!
and there are many others, of course.
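
To make the "rerunnable tests" point concrete, here is a minimal sketch of the kind of self-checking simulation test that can be rerun after any of the changes above. It assumes a cocotb-style Python testbench against a trivial registered adder; the signal names (clk, a, b, sum) are hypothetical placeholders, not any particular project's interface.

```python
# Minimal self-checking cocotb test sketch.
# Signal names (clk, a, b, sum) are hypothetical; they would match
# whatever the RTL top level actually exposes.
import cocotb
from cocotb.clock import Clock
from cocotb.triggers import RisingEdge


@cocotb.test()
async def adder_regression(dut):
    """Drive a few known operand pairs and check the registered result."""
    cocotb.start_soon(Clock(dut.clk, 10, units="ns").start())

    for a, b in [(0, 0), (3, 4), (255, 1)]:
        dut.a.value = a
        dut.b.value = b
        await RisingEdge(dut.clk)  # operands sampled
        await RisingEdge(dut.clk)  # result registered
        assert dut.sum.value == a + b, f"{a} + {b} gave {dut.sum.value}"
```

The value is less in any single test than in rerunning the whole suite after every change: a failure then points at either the change itself or a stale assumption baked into the test.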
Of course, the temptation is to use TDD as a crutch: "it passes the simulation tests, therefore it works." Inappropriate use of any tool is possible, and probable with some people.

Normally, in large designs, developing and maintaining tests takes at least as much time as developing the implementation itself. The benefits can and should be shorter integration times, coupled with the ability to point fingers at other people/companies.
