Programming has become more complex, and simpler to do, in tandem as it has evolved. I see no reason why this should change.
In the beginning (are you sitting comfortably? Then I'll begin...), when you programmed, you had to know exactly how the hardware worked and what you could do with it. You had to write pretty much everything from scratch, starting with the routine to read a key from the user, debounce it, handle autorepeat, and convert it into a character your program could understand. Then you could write the routine to collect a number of these characters, then the routine to convert some of those into numbers, and so on. Anyone who doesn't think this was complex didn't do it!
There was not a lot of code reuse in the early days. When it arrived, in the form of libraries to do that basic stuff, it made our lives a whole lot simpler, so we could move on to more complex ways of using those libraries. Databases, for instance: in the early days, a database was a stack of cards with (if you were lucky) a machine to spit them into piles...
As we have moved on, we have made programming simpler and easier to get into - look at Q&A if you don't believe me - but at the same time we have increased the complexity of what we do with the computer and the software. The simpler we make the tools, the more complex the end result we can produce.
Will this end? No, I don't think so. Computers are more complex, but simpler to use. The tools we use are more complex, but simpler to use. And the same goes for the end result!
Real men don't use instructions. They are only the manufacturer's opinion on how to put the thing together.