|
Impressive!
Get me coffee and no one gets hurt!
|
|
|
|
|
I grew up with Open Systems Interconnection - communication protocols that were very explicitly designed to be completely independent of any given tool used to generate the protocol elements. One very fundamental protocol principle is that the application, and all lower layers, should relate to the protocol specification, and nothing else.
As OSI was pushed into the darkness, it felt like a rope being tightened around my neck: the protocols replacing the OSI ones more or less demanded that you use specific libraries, specific OS conventions, specific languages - or else you would have to emulate those specific libraries, OSs and languages. But that is troublesome, because new extensions were added all the time, and keeping up with the updates to the libraries, OSs and languages in your emulation was almost impossible.
Sometimes, the binding to specific tools is not immediately apparent. Take the RFC 4506 serialization format (a.k.a. Sun XDR): in my archives from the years of the network wars is a "benchmark" of it against OSI BER (Basic Encoding Rules). XDR beats BER by a large factor. What the benchmark documents keep perfectly quiet about is that XDR is more or less a direct streaming of the binary layout of a C struct on a Sun machine; there is hardly any conversion at all. After generating the red tape, you set the byte pointer to the start of the struct and send the following sizeof(struct) bytes out on the line. (Needless to say, this benchmark was run on a Sun.)
I never thought BER would be anywhere near as fast as XDR (BER has a lot more flexibility, which can't be realized at zero extra cost). But if you set up a similar benchmark on a non-Sun machine, the serialization cost of XDR might easily rise by an order of magnitude. Say, if the machine had a different float format. Different byte order. Maybe an old machine wouldn't even be byte addressable - in my first three university years, I was programming non byte addressable machines exclusively; the Univac even used ones'-complement integers and its proprietary 6-bit character set. (The Decsystem series used 7-bit ASCII, packed 5 to a 36-bit word.) Or, if your language has proper string handling, you will have to explicitly add a C-style terminating NUL at the end (a use of NUL which is in direct conflict with the ASCII / ISO 646 standard). The specification of how to encode a structure is given by how a C declaration of it looks.
And so on. This is typical: As long as you use the "proper" machine, the "proper" language, the "proper" OS, you find it very easy to read the specifications and implement the standards. If you come from a different environment: Not so.
There is so much of "You can have the Model T in any color you want, as long as you want it in black".
This is my major objection against Java: It is not just another language that fits in with other languages, under an arbitrary OS. It creates its own world, telling others that you sure can join in -- if you adapt to the Java way. It is your responsibility to adapt to Java, not the other way around. And Java will be the master of the Java environment.
(I have the same objections against Python: It makes a world of its own, trying to monopolize software development completely. Can you compile a Python library, making it callable through a standard calling convention from an arbitrary compiled language? Dream on...)
I like .NET as a language-independent, OS-independent platform. Java aficionados may claim that the JVM is not in principle Java/*nix bound, but with .NET, you see it in practice as well. (But it cannot be hidden that a lot of Windows GUI stuff is based on P/Invoke, which really should have been avoided!)
When MFC came on the scene, I rejected it because I felt that it tried to bind me hand and foot (the philosophy of multi-tier software design wasn't well developed at the time, and MFC most definitely was not designed for such ideas). Java by itself is not that restrictive, but the Java ecosystem creates a similar tie-in that I do not want to live with.
|
|
|
|
|
I saw that tendency in MFC and I never bought into that aspect of it. I try my best to avoid all of its collections and OS synchronization object wrappers; I use the STL and write my own wrappers instead. That leaves only the UI for MFC, and I actually like how it handles most of that. It's pretty much on life support now, so soon I am going to leave it and move on.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Nothing really, as long as you remain inside its walled garden.
I had a few interactions with Java, all of them ending in pain and tears, because at some point I needed to step out of the Virtual Machine.
I fondly remember discovering that the byte type is signed (why, really why ???) and spending a few days debugging my hardware to figure out in the end that Java was to blame.
Or the magical moment when one of the gazillion DLLs needed by an over-engineered project had a bug. I simply fixed the bug, recompiled the source and built the DLL again. Something none of the other Java experts were even aware was possible.
And of course, how can I forget when I relied on the Java standard String library, only to find out that the target where the program ran had an incomplete (but still announced as 100% compatible) implementation of that library. What can be more fun than writing your own standard library functions?
To be a bit more serious, there is nothing wrong with Java. It is widely used, and in most cases it is good enough. I was just an unfortunate victim of an attempt to use Java in the embedded world, where it most definitely is not an appropriate tool.
|
|
|
|
|
Quote: I fondly remember discovering that the byte type is signed (why, really why ???)
I know the answer to this one: it's because the representation of signed integers is specified (as two's complement), so there is no problem with bit-shifting them (or doing other bit operations on them), as there is with bit-shifting signed numbers in C++ (which, until C++20, didn't specify the integer representation).
In C++, for example, you cannot write this (to check if variable 'a' overflowed):
if (a + 1 < a) { ... }
and expect it to work on all conforming compilers, because signed overflow is undefined behaviour there, and there wasn't even a guarantee that the underlying representation used two's complement. In Java the representation is guaranteed to be two's complement and overflow wraps around.
In C++, the result of:
a = b >> 3;
is implementation-defined if 'b' is negative (only C++20 finally pins it down as an arithmetic shift). In Java it has always been defined: >> is an arithmetic right shift on the two's-complement value, and >>> is the logical, zero-filling variant.
Sure, those are sh*tty reasons to leave out unsigned types, but they're still reasons. The only place you're likely to run into problems is when trying to add two signed 64-bit numbers that overflow the signed 64-bit type.
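If it helps, here's a minimal Java sketch of the point (any recent JDK); every line below has exactly one defined result on every conforming JVM:

public class SignedBits {
    public static void main(String[] args) {
        int a = Integer.MAX_VALUE;
        // Overflow wraps around in two's complement, so this check is reliable in Java
        // (the equivalent in C++ is undefined behaviour):
        System.out.println(a + 1 < a);   // true

        int b = -16;
        System.out.println(b >> 3);      // -2: arithmetic shift, sign bit preserved
        System.out.println(b >>> 3);     // 536870910: logical shift, zero fill
    }
}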
|
|
|
|
|
Thanks for the explanation.
Member 13301679 wrote: Sure, those are sh*tty reasons to leave out unsigned types, but they're still reasons. The only place you're likely to run into problems is when trying to add two signed 64-bit numbers that overflow the signed 64-bit type.
I am OK with the design decisions made, the issue there was conflicting expectations. The Java language designers chose to treat bytes as signed integers, while in the embedded world a byte is simply a string of 8 bits (plus an endianness once you assemble bytes into larger values).
In my case, the problem wasn't math but reading data from a serial port. I was getting "garbage" until my colleagues told me that in Java bytes are signed.
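For anyone who hits the same thing: the usual workaround is a one-line mask before treating the byte as raw data. A minimal sketch:

public class UnsignedByteDemo {
    public static void main(String[] args) {
        byte raw = (byte) 0xF0;        // a byte as it might arrive from a serial port
        System.out.println(raw);       // -16: Java insists it is a signed integer
        int unsigned = raw & 0xFF;     // mask back to 0..255 before interpreting it
        System.out.println(unsigned);  // 240
    }
}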
|
|
|
|
|
Quote: I am OK with the design decisions made, the issue there was conflicting expectations.
Yes, that is the problem: if you're going to have a type called 'byte' in your language, it had better represent a pattern of bits (no matter the size of the byte itself).
As I get older, I get more and more irritated that languages are not keeping to the Law of Least Astonishment; they have rules, then exceptions to those rules, then overloaded meanings for keywords, then overloaded meanings for characters ...
I keep meaning to design a language that is easier to read than to write: most modern languages right now make the reading very difficult unless you have unreasonable context memorised. C++ is a great example of producing "easy to write, but impossible to read" compound statements - most lines in C++ require knowing complex rules, some part of the standard and some part of the program.
In particular, I'd like to shoot whoever thought that pass-by-reference must only be indicated in the declaration, and "helpfully" filled in by the compiler at the point of call. This means that looking at any function or method call is meaningless even if you just want to ignore it while looking for something else.
|
|
|
|
|
NelsonGoncalves wrote: I fondly remember discovering that the byte type is signed (why, really why ???)
Since my student days (long ago!) I have been fighting this concept that "inside a computer, everything is a number". No, it isn't! Data are bit patterns - just bit patterns, not "zeroes and ones". A type defines which bit patterns are used (on a given machine) to represent various values, such as 'x' or 'y'. They are letters, da**it, not any sort of 'numbers'. Similarly, colors are colors. Dog breeds are dog breeds. Weekdays are weekdays. Seasons are seasons.
One problem is that computer professionals are among the fiercest defenders of this 'number' concept, arguing that 'A' really is 65 (or, most would prefer, 0x41 - but still a 'number'). They think it perfectly natural that dividing 'o' by two gives not 'c' (as you might think from the graphical image), but '7' - and that this is a perfectly valid operation, because 'o' is really not a letter but the numeric value 111, and '7' is really 55.
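A tiny Java illustration of that mindset, since this thread is about Java: the letter silently becomes a number the moment you do arithmetic on it.

public class CharAsNumber {
    public static void main(String[] args) {
        System.out.println((int) 'A');         // 65: the letter presented as a number
        System.out.println('o' / 2);           // 55: 'o' is quietly treated as 111
        System.out.println((char) ('o' / 2));  // '7': 55 happens to be the code for the digit seven
    }
}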
Even programmers who have worked with objects and abstractions and abstractions of abstractions still are unable to see a bit pattern as directly representing something that is not numerical. They cannot relate to the bit pattern as a representation of abstract information of arbitrary type, but must go via a numeric interpretation.
So we get this idea that an uninterpreted octet (the ISO term, partially accepted even outside ISO), a.k.a. an 8-bit 'byte', in spite of its uninterpretedness does have a numeric interpretation, by being signed.
I shake my head: how much has the IT world progressed, in the last three to four decades (i.e. since high-level languages took over), in the direction of a 'scientific discipline'? When we can't even manage abstractions at the octet level, but insist on a numeric interpretation where there is none, then I think we are quite remote from a science on a solid academic foundation.
The bad thing is that we are not making very fast progress. 40+ years ago, you could, in Pascal, declare 'type season = (winter, spring, summer, fall)', and the values are not numeric: you cannot divide summer by two to get spring (the way you can in C and lots of its derivatives). There is no strong movement among software developers for a proper enumeration (discrete value) concept: we have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.
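For comparison, a small Java sketch (since this thread is about Java): an enum there is a proper discrete type too, and the spring+2 arithmetic does not compile unless you explicitly ask for the ordinal view.

enum Season { WINTER, SPRING, SUMMER, FALL }

public class SeasonDemo {
    public static void main(String[] args) {
        Season s = Season.SPRING;
        // Season t = s + 2;   // does not compile: no arithmetic on enum values
        Season t = Season.values()[s.ordinal() + 2];  // the integer view must be requested explicitly
        System.out.println(t);  // FALL
    }
}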
The entire software field really should run a complete set of regression tests every month, and if any regression (say, from the Pascal enumeration concept to that of C derivate languages) is detected, actions should be taken immediately to remedy it and bring the profession back to a proper state!
|
|
|
|
|
trønderen wrote: We have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.
I actually think it is a good thing that we can do 'fall = spring + 2'. First, because it makes sense intuitively. Second, because although computers hold bits, you need to do operations on those bits, otherwise computers are not useful. And (some sort of) math seems to be a good enough common denominator that mostly everybody can rely on and build on.
Personally I tend to view software development more as art, and less as science.
|
|
|
|
|
It is your view because you are a programmer - ask Ops.
As mentioned above: versioning. Tell me of another piece of software with so many different version numbers referring to the same thing.
Versioning 2.0: Have you found your version number? Then here is another error message for you: the class cannot be loaded because it is version 52, but your JVM supports only up to version 51.
Well, disk space was cheap THEN, so maybe it was not an issue that a 4th-digit upgrade installed the full thing in a new directory. Or at least it was not a problem until someone set JAVA_HOME to one of them, and yum removed the other one...
The usual Java software is customizable to an amazing degree via command line switches and config files, but very slow to start up, because it has to read all those small XML files.
Config 2.0: The main selling point of one commercial Tomcat clone is that you don't have to hunt tens of config files to set up the thing.
|
|
|
|
|
Quote: I don't understand the snarky comments one sees about Java.
Because language-hating on $OLD_AND_STABLE_LANGUAGE is currently fashionable. FCOL, 3 years ago I was reading commenters on reddit asking "Why would anyone in their right mind start a new project in C when Rust is available?".
In that short time, however, Rust has changed so much that code from back then is already slightly incompatible.
My observation is that, feature-wise, all languages converge towards Lisp, while syntax-wise, all languages converge towards C++ (which itself is, syntax-wise, already past the madness event-horizon and still accelerating).
Java (and C#) when initially released were fairly easy to read for anyone coming from almost any existing language (C, PHP, Perl, etc). Each added feature extended the syntax instead of replacing existing grammar (so it didn't break existing programs), leading to the situation now where a symbol may mean almost anything.
I look forward to a future where source code looks more like BF than like Pascal /s ...
|
|
|
|
|
i personally dislike javans because they look down on other people. that is something you haven't noticed because you were on "the winning" team, java and oop. haven't you heard jokes about perl, c and javascript in the last 20 years?
it's a personal feeling you get that you cannot describe to others. it's subjective. let me try to shed some light on it by using this totally unrelated article:
Linux Mint users are surprisingly irresponsible regarding updates
who is this guy telling other people how to use their computers?
now imagine this fairy tale rewritten to fit java propaganda from 10 years ago (now javans start to taste the truth that perl and javascript were vastly more powerful than java since inception, and the only thing java's got is the story of oop): everyone is irresponsible for using languages like c, tcl, javascript... any language other than java is irresponsible. who needs all those obsolete languages, and why are they not facing the inevitable, that there should and will be only java and oop.
all this without any real reason for javans to look down upon others.
java was never a better c++ as it was advertised. it may be better for some people, but not for others.
"Java owes much of its initial popularity to the most intense marketing campaign ever mounted for a programming language." - Bjarne Stroustrup
and in that campaign advocates of java were visiting corporations telling people bad things about c++ and what not about c or other languages.
every now and then an article would pop up, like "why does the c language refuse to die?!", so why is it now difficult to understand that some people want revenge? even if we have to replace java with something more bizarre, like a purely functional, statically typed dictatorship of a language that looks down on java and everybody else at the same time. that's so lame.
here is an article from java oop programmer Execution in the Kingdom of Nouns
here is a lisp programmer Why Lisp? | Hacker News
notice the: "A Lisp programmer who notices a common pattern in their code can write a macro to give themselves a source-level abstraction of that pattern. A Java programmer who notices the same pattern has to convince Sun that this particular abstraction is worth adding to the language"
same works for c, "These languages [Java] solve problems by adding more language features. A language like C solves problems by writing more C code."
all in all, i dislike java because i was walking along my way when i heard shouting from the other side of the street, 'hey, you are no good. i'm better', for which i could not find a reasonable explanation.
i disliked having to answer the job interview question "what are the benefits of oop" for 15 years straight, and the bs articles posted by people who tried to stay relevant and prove the benefits of oop by showing a small and elegant oop example vs a grotesque, usually procedural, one that resembles someone learning basic with line numbers. but it was never the other way around. they never show you the flip side.
Alan Kay (1997) “I invented the term object-oriented, and I can tell you I did not have C++ in mind.” and “Java and C++ make you think that the new ideas are like the old ones. Java is the most distressing thing to happen to computing since MS-DOS.”
sadly i can go on forever, which would be a waste of time. i have a particular bookmarks folder for every anti-class-based-oop post i have stumbled upon that was written by people who in my opinion have competent or expert skills at programming, including oop.
|
|
|
|
|
Quote: i have a particular bookmarks folder for every anti-class-based-oop post i have stumbled upon that was written by people who in my opinion have competent or expert skills at programming, including oop.
Maybe you should share that 
|
|
|
|
|
|
From the C# side, Java feels clunky and verbose; despite having added a few modern features like lambdas, it overall feels like being stuck in C# 1.x.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
Going to Java after C# is really tedious, especially for multi-threaded applications.
|
|
|
|
|
I tried writing an accounting program in Java, but gave up when I got tired of trying to outwit UI classes that didn't do the job, and of having to write factory classes to construct other classes to do the simplest things in the most complicated way.
|
|
|
|
|
Were you using Swing or JavaFX?
Get me coffee and no one gets hurt!
|
|
|
|
|
It was a very long time ago, back when Swing was still being developed. So later versions may have fixed some of the formatting issues. I wanted an HTML-like table with wrapped text in the cells. I think I succeeded, but the low-level code I had to write to wrap text was not pretty. I still have a problem with all of those factory classes.
|
|
|
|
|
Well, today the task would probably be a lot simpler: Create a TableView with a TextArea in the cells. That should do it, but you would probably need JavaFX. I have created a number of TableViews with controls like CheckBoxes and Rectangles in the cells.
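Roughly along these lines - a sketch from memory rather than tested code, and it assumes a JavaFX runtime is on the class path (or module path):

import javafx.application.Application;
import javafx.beans.property.SimpleStringProperty;
import javafx.scene.Scene;
import javafx.scene.control.TableCell;
import javafx.scene.control.TableColumn;
import javafx.scene.control.TableView;
import javafx.scene.control.TextArea;
import javafx.stage.Stage;

public class WrappedTableDemo extends Application {
    @Override
    public void start(Stage stage) {
        TableView<String> table = new TableView<>();
        table.getItems().addAll(
                "A long line of text that should wrap inside its cell.",
                "Another long line, just to show a second row.");

        TableColumn<String, String> col = new TableColumn<>("Notes");
        col.setPrefWidth(300);
        // Each row's item is the string itself.
        col.setCellValueFactory(row -> new SimpleStringProperty(row.getValue()));
        // Put a read-only, wrapping TextArea into every non-empty cell.
        col.setCellFactory(c -> new TableCell<String, String>() {
            private final TextArea area = new TextArea();
            {
                area.setWrapText(true);
                area.setEditable(false);
                area.setPrefRowCount(2);
            }
            @Override
            protected void updateItem(String item, boolean empty) {
                super.updateItem(item, empty);
                if (empty || item == null) {
                    setGraphic(null);
                } else {
                    area.setText(item);
                    setGraphic(area);
                }
            }
        });
        table.getColumns().add(col);

        stage.setScene(new Scene(table, 340, 240));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}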
Get me coffee and no one gets hurt!
|
|
|
|
|
That's nice to hear! Just about 20 years too late for me, but still nice to hear, in case I ever use Java again. 
|
|
|
|
|
As an embedded engineer, I've only worked on one project that had enough memory to support Java. The only snark I'd have for Java is the lack of unsigned integers! I will say that my favorite editor is Java-based (jEdit).
|
|
|
|
|
Depends on what domain you are working in. I wrote a DAB (Digital Audio Broadcasting) decoder in C++, and simplified versions - just as a programming exercise - in Ada and in Java.
That type of decoder requires extensive interaction with libraries written in C (to name a few: device handling, FFT transforms, and AAC decoding).
In my personal opinion, binding Java structures to C libraries is a crime.
Btw, the Ada binding is simpler, since I was using the GNAT compiler system, but even then ...
Java is just a language; it is not my choice, but for many applications it seems more or less OK.
Personally I do not like the GUI handling, but that is probably a matter of taste.
The misery of binding to non-Java (read: C) libraries is such that I would not recommend Java for applications that depend on that kind of library.
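To make that misery concrete, this is roughly what just the Java side of such a binding looks like (a hedged sketch: the library and function names here are made up, and the matching C implementation compiled into the shared library is not shown):

public class AacDecoder {
    static {
        // Loads libdabdecoder.so (Linux) or dabdecoder.dll (Windows) from java.library.path.
        System.loadLibrary("dabdecoder");
    }

    // Implemented in C; the C function must be named following the
    // Java_<package>_<class>_<method> JNI convention, and every array
    // argument has to be converted through JNIEnv calls before the C
    // code can touch the data.
    public static native int decodeFrame(byte[] encodedInput, short[] pcmOutput);
}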
(I dislike all kinds of so-called integrated environments such as IntelliJ or whatever; right now I am writing some stuff where I more or less have to use VS as the development environment. It is probably my ignorance, but I absolutely dislike the destruction of the formats I use in my coding, and the error messages are a horror. For me the command line tools such as vim, qmake, make and the GCC suite - with gdb as debugger under Linux - are the ideal development tools.)
|
|
|
|
|
What's wrong with Java is that it's 30 years old or something, and its architecture is a prisoner of what was available at the time: it no longer makes any sense now that richer and more capable systems exist. And it doesn't have properties, which is ridiculous.
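To illustrate the last point, a minimal sketch of what passes for a "property" in Java - hand-written (or IDE-generated) getter/setter boilerplate for every field, where a language with real properties needs a single declaration:

public class Account {
    private long balanceCents;

    public long getBalanceCents() { return balanceCents; }

    public void setBalanceCents(long balanceCents) { this.balanceCents = balanceCents; }
}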
|
|
|
|
|
Haters are going to Hate! No matter what.
The VB guys have been living with B.S. for years.
Wear a mask! Wash your hands too. The life you save might be your own.
|
|
|
|
|