Which is a better choice?
- Use a technology that reduces the ability to introduce bugs (e.g. by restricting the domain) but takes more time to use and maintain.
- Use a rawer technology that makes changes much easier but requires more care to ensure that bugs are not introduced.
There is no clear-cut answer to this: “it depends”.
Let’s throw out some numbers to attempt to make sense of this. Suppose it takes 5 man-days to implement a change using technology #1, which has an average bug rate of 1 bug per week. It takes 0.2 man-days to implement a change using technology #2, but it has a bug rate of 10 bugs per week. This means that technology #2 is 10x more error-prone but takes 25x less effort to use. On these numbers alone, it appears better to use the more bug-prone technology than the restrictive one.
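To see why the answer still "depends", the numbers above can be folded into a rough cost model. The sketch below introduces one hypothetical parameter not in the original scenario: the man-days needed to fix each bug. Depending on that fix cost, either technology can come out ahead.

```python
def cost_per_change(impl_days, bugs_per_week, fix_days_per_bug, changes_per_week=1):
    """Expected man-days per change: implementation plus bug-fixing overhead."""
    return impl_days + (bugs_per_week / changes_per_week) * fix_days_per_bug

# Technology #1: 5 man-days per change, 1 bug per week.
# Technology #2: 0.2 man-days per change, 10 bugs per week.
# The per-bug fix cost is a hypothetical knob, not part of the original numbers.
for fix_cost in (0.1, 0.5, 1.0):
    t1 = cost_per_change(5.0, 1, fix_cost)
    t2 = cost_per_change(0.2, 10, fix_cost)
    print(f"fix cost {fix_cost} days/bug: tech #1 = {t1:.1f}, tech #2 = {t2:.1f}")
```

Under this model the break-even fix cost is about 0.53 man-days per bug: below that, the cheap-but-buggy technology wins; above it, the restrictive technology does.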
This is obviously a contrived case. The gedankenexperiment behind all of this is to determine when one should introduce a technology into a project in order to reduce risk (in this case, bugs). If a technology has high costs associated with its use (e.g. time, training, personnel) then it may not reduce the overall project risk.
Bottom line: understand all the risks associated with a new technology and factor them appropriately into the overall risk of the project. If the risk increases, it may not be worthwhile to use the technology. If the risk decreases, then the technology will likely provide the desired returns. If there is no appreciable change in risk, then look at other factors such as long-term benefits, project duration, and cost.