Once upon a time, at a nonprofit organization that will remain nameless, a volunteer sysadmin set up a PC to serve as the nonprofit's public server, hosting a few Web sites, managing mailing lists, and providing some other services.
The guy was clever and configured the PC's OS (Linux) in a nonstandard but highly secure way.
A few years later, other volunteers took over the PC. They preferred not to bother learning how the system was configured and how to administer it. Instead, they preferred to reconfigure the PC into a more conventional and familiar configuration.
End of story.
The arguments that erupted over this preference led me to ponder a general question: when and why do software professionals prefer to reinvent the wheel?
On one hand, operating systems and computer languages are not, as a rule, reinvented all the time. Most people are content to learn an existing environment, become experts in it, and stick with it. Only very few venture forth and write a new OS or a development framework for a new programming language.
On the other hand, when confronted with legacy software or an existing installation, many people prefer to discard the existing work and start afresh.
What differentiates between those two extremes? I tried to build a list of the relevant variables:
- How well the framework is designed for extensibility and for building upon it.
- Quality and thoroughness of documentation - especially instructions for making changes to the system.
- Amount of wisdom invested in the basic system design, which is worth learning for its own sake.