Moore's Law was a mistake.

i don't mean it was a mistake in the sense of being an inaccurate observation. i mean the effect it describes has done serious damage.

a long time ago, computer science, programming, and so on were largely the purview of enthusiastic supernerds. there was a desire to make good, elegant software. hackers pursued the same kind of simple beauty that mathematicians strive for in their proofs. shaving a few lines out of your code earned bragging rights and was proof of your skill and passion.

eventually though, old corporate dinosaurs sank their claws into the computer science community, and their interests and motivations came to dominate. why spend the time making your code clever and elegant when you could leave a clunky mess and implement some new, marketable garbage? why write a function when you can include a 3MB library that already has it? i mean, by the time release rolls around next year, computers will be able to run it anyways.

in ages past, 'good' software meant simple, clever, elegant algorithms, and using no more than necessary for anything, so you could squeeze as much as possible out of outrageously simple (by today's standards) machines. nowadays the standard has shifted from the artistic to the financial. 'good' is now whatever you can make money with, and why bother writing elegant hacks when you can put together marketable garbage that'll run anyways on next year's computers, because of moore's law.