I’m in the middle of this argument. I agree with Wictor Wilén regarding the dangers of rogue scripters… let’s be honest, a LOT of people copy code off the internet, have little to no idea what it does or how it works, but because it sounds like it will solve their problem, they decide to use it. Because, as Marc D Anderson says, they really, really just want to get the job done, and they have no resources, time, or money to go the route of bringing in a professional (either internally or externally sourced). My concern lies more with unintentionally putting code on a page (yes, one the CEO could hit) that could be genuinely malicious. I have significantly less concern about putting poorly performing code on pages because, to Marc D Anderson’s point, scripts can be fixed.
For me, the real challenge here is finding a way to give users a sandbox to get work done, while keeping governance and security around those sites and locations that hold information that actually needs to be secured. (Wow, I’m clearly channeling Sue Hanley right now.) I have seen it countless times in my consulting life: when IT puts heavy regulation and control around what users can and cannot do to deploy their solutions, even in areas where there really doesn’t need to be that level of stringency, they find ways around it… Water always finds a path. If, by limiting the deployment of “web parts,” the admin creates a barrier that feels insurmountable to the users trying to get things done, they will find a way around it. It’s a balancing act, and not an easy one.