I am a sysadmin who also handles software deployment. My job is to look after network traffic, software deployments, tweaks, and scripts. Scale and breadth of impact are always a concern. Furthermore, unlike O-notation, I do care about the multiplier: in the real world, 1000000n and 5n are not the same to me!
If you move log files for all your users to a network share instead of a local drive, your network traffic might be fine, but how will disk access on the file server cope? If you write a script to move the old logs, what happens when you deploy it to everyone at the same time? How are you deploying it? What algorithm are you using to find the files you want to move?
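To make that concrete, here's a minimal sketch of what I mean. The paths, the jitter window, and the 30-day cutoff are all hypothetical assumptions, not a real deployment; the point is that the script staggers its own start and scans cheaply, so a whole fleet running it at once doesn't hammer the file server in the same second:

```python
import os
import random
import shutil
import time

# Hypothetical paths and policy -- adjust for your environment.
LOCAL_LOG_DIR = "/var/log/myapp"         # assumption: where logs live locally
NETWORK_LOG_DIR = "/mnt/logshare/myapp"  # assumption: the network share mount
MAX_AGE_SECONDS = 30 * 24 * 60 * 60      # assumption: "old" = more than 30 days

# Stagger start times so a fleet-wide rollout doesn't hit the
# file server all at once (thundering-herd avoidance).
time.sleep(random.uniform(0, 600))  # up to 10 minutes of random jitter

cutoff = time.time() - MAX_AGE_SECONDS
with os.scandir(LOCAL_LOG_DIR) as entries:
    for entry in entries:
        # scandir yields stat info cheaply per entry; a recursive
        # glob over a deep tree would cost far more disk I/O per machine.
        if entry.is_file() and entry.stat().st_mtime < cutoff:
            shutil.move(entry.path, os.path.join(NETWORK_LOG_DIR, entry.name))
```

Even that small jitter line is the difference between n machines trickling in over ten minutes and n machines multiplied into one synchronized burst, which is exactly where the multiplier bites.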
Do you have software that is frequently broadcasting itself across your network? (insert stink-eye for Dropbox here)
If you write a script that you are somewhat ashamed of but that works, maybe you choose to run it after 5pm (because solving a stupid problem TODAY is sometimes more important than the elegance of the solution). A sketch of that guard follows.
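Here's one crude way to enforce that, as a sketch; the 5pm threshold is an assumption, and a cron entry like `0 18 * * 1-5` would do the same job without any in-script check:

```python
import datetime
import sys

# Crude guard: refuse to run during business hours so the
# ugly-but-working script only ever fires after 5pm.
if datetime.datetime.now().hour < 17:  # assumption: 17:00 marks "after hours"
    sys.exit("Refusing to run before 5pm; try again after hours.")

# ... the shameful-but-effective cleanup goes here ...
```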
Still have a hub with multiple devices on it? How much does it matter?