Some people advocate defensive programming, thinking it's better that a system carries on working, logs the fault, and merrily continues. That's fine for any programming where performance isn't of the utmost importance and you don't mind shipping software riddled with bugs that have merely been caged. What it's not good for is software that needs to be really safe, or really fast.
Why does it slow stuff down? The first reason is that it's usually code that checks return values from getter functions for null, or that tries to handle illegal or irregular arguments.
The if-not-null pattern is bad because it is inherently an indirect load (to fetch the value) followed by a branch (the null check), which is probably predicted but still costs. Constantly fetching pointers to things and checking them for null is just going to thrash your branch predictor and memory to death. It's offensive to the cache, and offensive to in-order processors in general.
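Here's a minimal C++ sketch of the defensive style being described (the `Player`/`Weapon` names and the getter are hypothetical, just for illustration): every access pays for an indirect load plus a branch, and a broken pointer is quietly logged instead of stopping anything.

```cpp
#include <cstdio>

// Hypothetical game objects, purely for illustration.
struct Weapon { int damage; };
struct Player { Weapon* weapon; };

// Defensive style: every pointer fetch is followed by a null check,
// so each call costs an indirect load plus a (usually predicted) branch,
// and a genuine bug is logged and then ignored.
int GetDamageDefensive(const Player* player) {
    if (player == nullptr) {
        std::fprintf(stderr, "warning: null player\n");
        return 0; // cage the bug and carry on
    }
    if (player->weapon == nullptr) {
        std::fprintf(stderr, "warning: null weapon\n");
        return 0; // carry on merrily
    }
    return player->weapon->damage;
}
```

Multiply that by every getter in a frame and you can see where the branch predictor and cache pressure come from.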
What to do instead? Use asserts. Assume things are not null and carry on regardless. Make your game break when things are actually going wrong. What's wrong with finding out it's all broken a year before you release rather than a day after?