|
I got the impression one person was definitely HR and one probably a developer. The third person I honestly couldn't place as he didn't say much beyond the greeting. I was just kinda dumbfounded. If I was hiring a carpenter to build a house I wouldn't ask him "Do you know what a hammer is? What about wood? Alright, that's all I need." 
|
|
|
|
|
Debugging and testing are the most valuable skills, and they're seldom taught.
|
|
|
|
|
dandy72 wrote: These days there's too many people in this field who'd have to resort to calling their IT support department because you disconnected their keyboard while they were away at lunch time.
With such a vast array of business technology needs, people specialize. Long ago the person who built the log cabin could also dig the outhouse latrine, but today I do not expect the cable guy to fix my toilet.
|
|
|
|
|
While I agree with your assertion in the general sense, are you saying it's ok for people to never try to do anything, ever, that deviates from the only script they've learned to follow? If that's the case, then the automation revolution can't get here fast enough, because clearly nothing of value will be lost.
|
|
|
|
|
dandy72 wrote: are you saying it's ok for people to never try to do anything, ever
No.
|
|
|
|
|
Jeremy Falcon wrote: you learn X, Y, and Z.
I wasn't there either, but if I understand correctly, you didn't learn all three.
You picked your career path and then learned COBOL or FORTRAN or ASSEMBLY.
Or, you learned Pascal and BASIC and hoped to get a job teaching.
|
|
|
|
|
Or Algol. There's a good chance that if you were a programmer in the 60s you'd have been exposed to Algol.
|
|
|
|
|
You say that like it is some form of harmful radiation. I like. 
|
|
|
|
|
In the end, it's all about breaking down and solving problems. I'll gladly learn a new stack/framework if it solves a problem at hand. (makes or saves $) That said, I usually don't (aside from maybe reading articles) invest in learning something new just to add a feather to my cap.
On another topic, with the answers to the universe at our fingertips these days, getting by on your wits is much easier than it used to be. Either I've done it (or something like it) and can re-use the code/logic, or I can usually find something useful in less than 10 seconds using Google. This is why I haven't bought a real programming book/manual in more than 5 years. These days the only mastery required is in phrasing search terms.
"Go forth into the source" - Neal Morse
|
|
|
|
|
One cannot simply learn everything the industry uses, and that applies to other fields too. Considering the remarkable impact that computing and engineering have made on other disciplines, and considering the modern trends in the industry, older systems and technologies get replaced by newer systems and programming languages... On a long enough timeline, the survival rate for everyone drops to zero...
Caveat Emptor.
"Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
|
|
|
|
|
To digress a little... Time was, an intelligent and educated person could know just about everything there was to know. Literally. And from that grew the stereotype of the lone scientist in his lab coming up with some new invention to change the world... For a while, such people could exist, but not any longer. No one can know everything, not even within one subject area - the most anyone can be is a master of one or two (or more, maybe) disciplines within a subject; there is simply that much knowledge out there.
So science now, and in the future, is and will be a collaborative affair. The big advances now - take nuclear fusion (if it ever happens), quantum computing, or a myriad of medical advances - these aren't and won't be made by our stereotypical white-coated lone scientist in a lab, but by the collaborative efforts of different research groups around the world.
We all have to stand now on the giant collective shoulders of those around us in order to see anything.
|
|
|
|
|
Makes me think of James Burke's Connections series. The path to any discovery is usually weird, and builds on what came before.
|
|
|
|
|
The code I am trying to maintain began with a definable message based API. It was a little slow on the under-powered embedded system we started with, so "optimizations" were made. This is my current contract interface: API[^]
I now have logic in some key processing that depends on the string content of a global value. Rather than just look at the interface, I am reduced to searching through all project files for all references of said global string variable. I have code from multiple targets in the wrong files, etc. I've never seen entropy attack so fast.
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
|
|
|
|
Don't try to learn anything new until you need to. For instance, if knowing A, B, and C gets the job done where you're at now, then become proficient in A, B, and C.
When you're considering trying to find a new job (or have to because you got fired/laid off), look at what's currently in use, and learn enough to at least be familiar with the general territory, and when you do get a job do the new tech, become proficient in it.
Little secret - all programming languages are essentially the same. The primary stuff you have to learn is the topology of the framework(s) du jour. THAT is where the steep learning curves exist.
I've been doing this crap for 40 years, and that's the way I've been doing it the entire time.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
I'd agree with that - about the time you figure out the framework, it's just the same thing with different sauce. I pondered Xaml for a while, then I learned it was repackaged UIL from days past...
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
|
|
|
|
I was a little too young to learn programming in the 60s, but had my university education around 1980. That was an age where a lot of exciting things happened in the language world - every man had his own language. I can't count the languages I have been programming in.
So when people asked me how many languages I was familiar with, I said: Maybe three or four.
First comes the algorithmic group. That covers Fortran, Basic, Algol and Pascal, including variants like Concurrent Pascal. Then come plain C and C++ and Objective-C, and lesser-known languages like CHILL and the vendor-specific Planc, and later I have been using Java and JavaScript ... and let's throw in COBOL, for good measure. Python, of course, and PHP. Those are all samples of Language One - the way of thinking is very much the same in all of these, and syntax differences are not that significant.
Second comes the "workspace" model, which is like a sandpit where you throw in and take out functions and objects over time. Smalltalk comes in this class; my main experience is with APL, which is also quite different with its extremely array-based data structures. Yet the most essential difference from the algorithmic group is the workspace model.
Third come the predicate languages, of which I have only used three variants: SQL, Prolog and SNOBOL. You could say that SNOBOL is a semi-algorithmic language; it has flow control and at the top level is rather sequential, but the more familiar you are with the language, the more you leave to predicate logic. I never became friendly with XSLT, and am happy that I managed to sneak out of it; regex I still have to do every now and then.
Fourth language: Essentially Lisp, which I have only used for programming Emacs. I can see that some people like it, but it doesn't give you much help in offloading the conceptual model into the saved file; essentially you must free up space inside your brain to hold the model.
For a couple of projects, we peeked at functional programming, but I wouldn't count that as number five. We just studied the syntax of Erlang (and peeked at others from a distance), but I never really tried it in practice.
I would rather like to add as number five: Data modelling - not a complete language by itself, as it usually won't provide definitions of operations, only the interface to them. I think this language (group) is the most underrated one! A good data model, whether you use ER or ASN.1 or an XML schema, is essential to understanding the problem domain. You may say that if you use the whole of the OSI communication protocol stack toolbox, you do operations modelling very much in the same spirit as you do data modelling.
We may add yet another language group: State/event programming. Look at how the OSI session layer is programmed: tables with current state along one axis, event along the other, each square stating predicates and actions. No other language could possibly come close to that state/event description of the logic, when it comes to clarity, conciseness and freedom from ambiguity.
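For anyone who hasn't worked in that style, here is a minimal sketch of table-driven state/event logic in C. It is not the OSI session layer tables themselves - the states, events and actions are invented for illustration only - but it shows how the whole control flow lives in one table rather than in nested ifs:

```c
#include <stdio.h>

/* Invented states and events for a toy connection protocol. */
typedef enum { ST_IDLE, ST_CONNECTED, ST_CLOSING, NUM_STATES } state_t;
typedef enum { EV_OPEN, EV_DATA, EV_CLOSE, NUM_EVENTS } event_t;

typedef struct {
    void    (*action)(void);  /* what to do for this state/event pair */
    state_t next;             /* state to enter afterwards            */
} cell_t;

static void do_open (void) { puts("opening connection"); }
static void do_send (void) { puts("handling data");      }
static void do_close(void) { puts("closing connection"); }
static void ignore  (void) { /* event not valid here: no action */    }

/* The entire protocol logic is this one table. */
static const cell_t table[NUM_STATES][NUM_EVENTS] = {
    /*                 EV_OPEN                   EV_DATA                  EV_CLOSE                */
    /* ST_IDLE      */ {{do_open, ST_CONNECTED}, {ignore,  ST_IDLE},      {ignore,   ST_IDLE}},
    /* ST_CONNECTED */ {{ignore,  ST_CONNECTED}, {do_send, ST_CONNECTED}, {do_close, ST_CLOSING}},
    /* ST_CLOSING   */ {{ignore,  ST_CLOSING},   {ignore,  ST_CLOSING},   {ignore,   ST_IDLE}},
};

int main(void)
{
    state_t state = ST_IDLE;
    event_t input[] = { EV_OPEN, EV_DATA, EV_CLOSE };

    for (size_t i = 0; i < sizeof input / sizeof input[0]; i++) {
        const cell_t *cell = &table[state][input[i]];
        cell->action();
        state = cell->next;
    }
    return 0;
}
```

The real session-layer tables also carry predicates per cell; adding a predicate function pointer to the struct would be the obvious extension.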
- - -
Gee, did we have a plethora of languages in those days! Now, we have chopped off eighty percent of the algorithmic group and 100% of the rest. State/event is squeezed into a switch. Data modelling is squeezed into a C struct. The workspace concept is squeezed into C malloc, or by its modern name: new. List processing is squeezed into C linked structures ... And so on. One bad thing is that today's young programmers never ever question the () around if/while conditions (whether they program in a C variant, Python, Java, ...). When they see Pascal code, they wonder: Didn't you forget the parentheses? Talk to them about workspace models, and they worry how efficient the malloc will be in such an environment, and when a COBOL decimal field is 6 digits wide they figure that a long long is needed to hold the value range.
Really, today's programming world is a lot more primitive than in the 80s.
And you could say the same about OSes. About file systems. About communication protocols. About user interfaces. Maybe the 80s were chaos, but they certainly weren't the primitive monoculture of today, where 95+% of all programming is done in C-like languages, 95+% of all data traffic is limited by the Internet Protocol family at least at some level or stage, 95% of all data is stored in file systems that provide no high-level support, and for 95% of process/thread modelling the entire world uses one of two models - and those two certainly do not differ very much.
True: We have got dozens of variations on C. Dozens of variations of U*ix. Dozens of variations of sh. Dozens of variations on fork. But those are essentially small ripples on the surface. Fundamentally different ways of thinking and of doing things have weathered away. I think that is just as big a loss as the ripples are a problem.
|
|
|
|
|
I do half my work in Fortran and half in C#. So, on average, it feels like the 80's to me.
I feel blessed to be avoiding the new stuff. I do suffer vicariously through all of you CPers though.
|
|
|
|
|
Wait. I'm not the only person using Fortran on the forums? BLASPHEMER!
|
|
|
|
|
Maybe we're the only 2 that will admit it.
|
|
|
|
|
As stated, it's been like this for decades. However, the tools that exist and that people want are far greater in number than they were even 10 or 20 years ago.
As a dev approaching 15+ years of work and 20+ years of programming experience, you basically have two semi-smart choices to make as you grow up in this field. One is to position yourself to grow from dev to middle management and beyond. Not everyone is managerial material, or wants to be, so this isn't always the best option.
The other smart path is, for the first 5-10 years of your career, to look for the BIG tech tools that every big company uses, work with them for 5-10 years, and become an expert amongst experts. After that tech has been dead for 10+ years, dig up all of your old resources, load up some old VMs and re-familiarize yourself with the "legacy" tech.
There are a lot of major corporations that are not in IT that still have old COBOL/RPG mainframes. There are also a ton of VB6/very-old Microsoft Access databases that are in use in even BIG companies. These companies often get to the point where this stuff breaks or they realize they have to throw a TON of money to upgrade their systems. Software specs tend to not exist within these companies (or were long lost) so they need legacy gurus to come in, dissect the existing system(s) and then fix it or spec-out a new system from old code. This is "consultation" work and can pay very well, if you are familiar with the right tech.
This isn't "sexy", but by golly, if you get bored with the "rat race" of being perpetually obsolete, let your most obsolete expertise start to work for you. I'm not there yet, but I'm probably going to start pushing "WinForms" development around 2021. I plan to work hard at staying on top of what is relevant today, but in 2021 I'll be 40 and that's a good age to start pulling out the old familiar tech that none of the "new kids", fresh out of school, have even heard of or will laugh at when they still see it in use. Suckers... it's the desperate corps that pay the best, the ones scrambling to find the grey-beard experts.
|
|
|
|
|
Yup. I believe the latest buzzword today is "full stack" developer. Sorry, I don't buy that designation AT ALL. You could re-brand that "jack of all trades, master of none". I'm sorry, but the designation is pure B.S.
I've been developing code for 40 years and I think I've developed some good proficiency in that time and know a few good technologies to use in my development. My code gets answers and it runs FAST. (I've had more than one employer ask me in an incredulous tone "why does your stuff run so fast"?). Er, maybe it's because I don't haul in a couple of gigs of library code to run my executables...
I don't even apply for positions that are looking for "full stack" developers because, IMHO, those companies are completely deluded as to what software development is really about. I believe that would be ... solving problems? Full stack ... seriously?
If you think hiring a professional is expensive, wait until you hire an amateur! - Red Adair
|
|
|
|
|
Back in the day, the progression was:
Jr. Programmer
Programmer
Programmer/Analyst
Programmer/Analyst II
Sr. Programmer/Analyst
Sr. Analyst (Or Business Analyst)
<some level of management>
I always preferred adding Analyst. First you learn the syntax, and the environment.
As a Jr. Programmer, you often took someone's scribbled code and punched it onto cards.
The person reviewed them. One programmer could keep a few Jr. Programmers busy. (things changed).
Usually it was teams of both...
After the language/syntax and environment were learned, you moved up.
The real interplay is in taking business needs and getting to computer solutions.
My favorite job interview was one where I was competing with someone with 5 years of Clipper experience for a Clipper job. I had SEEN Clipper code, and had done a little dBase code. But I had great analytical skills.
The guy interviewing me for a part-time position was convinced he would hire the "Pro", and not me, but already had my interview scheduled.
I simply explained that it is the Analysis where all the failures begin. The syntax of the language is easy enough to learn, if you are solving the right problem. I asked him to think about the "fixes" he had to have the previous guy make. What percentage were:
- Did not understand the goal properly
- Logic Error (Did not express the goal properly)
- Lack of testing
- Lack of User Sign off
- User Error/User Confusion
- Bad Syntax/Failure to use the programming language correctly?
I explained to him, that if he hired me, I would drive the first few items to ZERO occurrences, and that my biggest fear was programming myself out of a job, because the current guy was constantly fixing his own mistakes. He laughed. He thought... He Hired...
One year later, he apologized that he ran out of work for me to do. Wrote me a 2 page letter of recommendation, and gave me a minimum number of hours each week to do whatever I wanted.
I want to hire creative problem solvers who know how to solve problems and express them in code.
Then the importance of the language is reduced. The rework is reduced.
Nobody wants to help that person by giving them a little time to learn a technology they may need.
That's crazy. Good problem solvers are hard to find. Great programmer/analysts are hard to find.
So old companies would make them!
|
|
|
|
|
Bare foot, in the snow, up hill, both ways! That's the way it was, and we loved it.
|
|
|
|
|
Heck, even into the early '90s you could "get by" with just a few good skills. I think retraining hell is companies' revenge for having to pay us so well. I have a Despair Inc coffee mug that says "Just because you're necessary, doesn't mean you're important." That sums it up nicely.
|
|
|
|
|
Jeremy Falcon wrote: Who cares if it's a 90% copy of Y... Z is so shiny
Unfortunately the reality is that no one can figure out if new idioms are worthwhile until a lot of people use them.
In the 60s there were no options. Not to mention that programmers had to wear suits.
Not to mention you can still get a job programming Cobol if you want to.
|
|
|
|
|