|
"I didn't mention the bats - he'd see them soon enough" - Hunter S Thompson - RIP
|
|
|
|
|
cough cough Pentax cough cough...
I'd rather be phishing!
|
|
|
|
|
Sounds like you are about to throw up, is that right? Maybe you should put that Pentax away, then 
|
|
|
|
|
enforcement of byte order.
clunky base runtimes.
a template engine that is source level, so it *could* be as powerful as C++'s and better than C#'s, but sadly isn't.
And personally, it just feels stifling somehow. I find myself getting into "the zone" in C# much more quickly than in Java, and staying there longer. I think part of it is the tools. Visual Studio is just great, though I've never used IntelliJ. Eclipse is garbage, IMO. It always crashes on me if I try to use extensions, and it feels open source - designed by 100 different people.
So I think a big part of it for me is the tools.
If it weren't for all that, I'd probably prefer Java to C#, simply because of the amount of "cool code" and other libraries I could have found very useful but that were Java only.
Real programmers use butterflies
|
|
|
|
|
Quote: so i think a big part for me is the tools
I started out using Eclipse, but then I noticed many Java developers were switching to IntelliJ, so I switched too. I will never go back to Eclipse. Try IntelliJ if you ever again need to do some JavaFX.
Get me coffee and no one gets hurt!
|
|
|
|
|
Yes, the IDE being Eclipse also made Java a troublesome uptake. Very good point.
I started doing Android early on, when it was Eclipse-based, and it drained all of the happy new energy of a new development platform (Android), so I ran away.
Then they went to Android Studio (IntelliJ) and it was YESS!!!!!!!
|
|
|
|
|
I agree, I work in both Java and C# and don't have a problem with either. I think Java gained a stigma from ye olde Java applets, but for modern-day coding I enjoy using Java and Spring for building REST APIs.
|
|
|
|
|
There are three main issues for me:
1) Versioning -- it's difficult to know which version to run and what functionality I will have, especially after Oracle took over and the ecosystem split even further with OpenJDK and all that nonsense. Along with versioning, it is difficult to find tools that feel "official". For instance, I am attempting to use JCov (the Java coverage tool), which is supposed to be the "official" one but is very poorly documented, or not documented at all.
2) UI frameworks -- Oh boy. I remember the original was AWT, then Swing, then JavaFX (which never fully caught on). Plus 3rd-party stuff, and controls that were instantly recognizable as not being Windows controls. It was all so confusing, and there were better options (C#, Visual Studio, MFC, etc.).
3) Java applets -- they used applets to introduce Java, and it was supposed to be gee-whiz. I was like, "a plugin...?? that fails a lot in my browser...?? and needs to be updated constantly...??? which MS doesn't like to support???" That intro to Java kind of killed it.
After that, Java felt like a slow, cumbersome thing with no direct line to components without lots of management. So, over to C#, which was easy.
Much of this isn't "fair" to Java, but it is the perception.
|
|
|
|
|
Following on from raddevus' points:
4) Support -- ongoing support is difficult as the Java versions keep changing. Applications that run in "version X update Y" may not run on "update Y+1". And if multiple installed applications require different versions? Getting each to use the right installed version is its own version of DLL Hell.
5) Hiring -- finding people with experience in the needed versions of specific libraries can be tough. If not using the latest and greatest, finding people with experience and a willingness to work in an older version can be all but impossible. Besides, knowing one version of a library may not mean anything in a different version.
Java the language? It's just another language. It's got its pluses and minuses, same as every other language. IMO, the serious problems are everything except the language itself.
C# has its own issues, but post-deployment it's MUCH easier to support.
|
|
|
|
|
|
I like the blue. It is more bitter, but with a small piece of dark chocolate it is perfect. Consider throwing in a couple of cardamom seeds with the beans when grinding them.
For my computer, I prefer a somewhat sharper variant.
|
|
|
|
|
The snarky comments are coming from C++ (the world of fast and free memory management), not C# people (the Java people with a Microsoft sticker on the forehead).
Was that a snarky comment?
|
|
|
|
|
It may be fast, but it sure as hell ain't free.
|
|
|
|
|
What was wrong was: Struts, Swing, NetBeans, JBoss, Apache ... and whatever else you needed to get an app going.
.NET didn't require shopping around.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
I worked with a customer who implemented a serious, corporate-wide Manufacturing Control System using Java. I worked with them on systems at four sites - San Jose, CA; Shenzhen, China; Mainz, Germany; and a place in Thailand whose name I have forgotten. The MCS was used with Windows, AIX, Linux, MacOS, and a few others I can't remember. We used sockets as our interface and there were no problems with it at all. I don't even know what OS the systems we directly interfaced with ran on, because we didn't need to know. That was a few employers ago, so I have forgotten some details by now. The systems were used in the manufacturing of disk drives and dealt with making the disks themselves. Assembly happened at other sites, and we did a few of those systems too.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Impressive!
Get me coffee and no one gets hurt!
|
|
|
|
|
I grew up with Open Systems Interconnection - communication protocols that are very explicitly designed for being completely independent of any given tool used to generate the protocol elements. One very fundamental protocol principle is that the application, and all lower layers, should relate to the protocol specification, and nothing else.
As OSI was pushed into the darkness, it felt like a rope being tightened around my neck: the protocols replacing the OSI ones more or less demanded that you use specific libraries, specific OS conventions, specific languages - or else you would have to emulate those specific libraries, OSs and languages. But that is troublesome, because new extensions were added all the time, and keeping up with the updates to the libraries, OSs and languages in your emulation was almost impossible.
Sometimes, the binding to specific tools is not immediately apparent. Take the RFC 4506 serialization format (a.k.a. Sun XDR): in my archives from the years of the network wars is a "benchmark" of it against OSI BER (Basic Encoding Rules). XDR beats BER by a high factor. What the benchmark documents keep perfectly quiet about is that XDR is more or less a direct streaming of the binary layout of a C struct on a Sun machine; there is hardly any conversion at all. After generating the red tape, you set the byte pointer to the start of the struct and send the following sizeof(struct) bytes out on the line. (Needless to say, this benchmark was run on a Sun.)
I never thought BER would be anywhere near as fast as XDR (BER has a lot more flexibility, which can't be realized at zero extra cost). But if you set up a similar benchmark on a non-Sun machine, the serialization cost of XDR might easily rise by an order of magnitude. Say, if the machine had a different float format. Different byte order. Maybe an old machine wouldn't even be byte-addressable - for my first three university years, I was programming non-byte-addressable machines exclusively; the Univac even used ones'-complement integers and its own proprietary 6-bit character set. (The DECsystem series used 7-bit ASCII, packed 5 to a 36-bit word.) Or say your language has proper string handling: you will have to explicitly add a C-style terminating NUL at the end (a use of NUL that is in direct conflict with the ASCII / ISO 646 standard). The specification of how to encode a structure is given by how a C declaration of it looks.
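(For contrast, a sketch of what it looks like when the platform at least pins the wire format down independently of the host: Java's DataOutputStream always writes big-endian network order and IEEE 754 floats, whatever the machine underneath. The field names and values here are invented for illustration:)
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WireEncode {
    // Encode each field explicitly, in a defined order and byte order,
    // instead of streaming a struct's in-memory layout the way XDR does.
    static byte[] encode(int id, double value) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(id);        // always 4 bytes, big-endian, on every host
        out.writeDouble(value);  // always IEEE 754 binary64, never the native float format
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(encode(42, 3.14).length); // 12 bytes, the same on every machine
    }
}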
And so on. This is typical: As long as you use the "proper" machine, the "proper" language, the "proper" OS, you find it very easy to read the specifications and implement the standards. If you come from a different environment: Not so.
There is so much of "You can have the Model T Ford in any color you want, as long as you want it in black".
This is my major objection against Java: It is not just another language that fits in with other languages, under an arbitrary OS. It creates its own world, telling others that you sure can join in -- if you adapt to the Java way. It is your responsibility to adapt to Java, not the other way around. And Java will be the master of the Java environment.
(I have the same objections against Python: It makes a world of its own, trying to monopolize software development completely. Can you compile a Python library, making it callable through a standard calling convention from an arbitrary compiled language? Dream on...)
I like .NET as a language-independent, OS-independent platform. Java aficionados may claim that the JVM is not in principle Java/*nix bound, but with .NET you see it in practice as well. (Though it cannot be hidden that a lot of the Windows GUI stuff is based on P/Invoke, which really should have been avoided!)
When MFC came on the scene, I rejected it because I felt that it tried to bind me hand and foot (the philosophy of multi-tier software design wasn't well developed at the time, and MFC most definitely was not designed for such ideas). Java by itself is not that restrictive, but the Java ecosystem creates a similar tie-in that I do not want to live under.
|
|
|
|
|
I saw that tendency in MFC and I never bought into that aspect of it. I try my best to avoid all of its collections and OS synchronization object wrappers and I use STL and write my own wrappers. That lets me leave only the UI for MFC and I actually like how it handles most of that. It's pretty much on life support now so soon I am going to leave it and move on.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Nothing really, as long as you remain inside its walled garden.
I had a few interactions with Java, all of them ending in pain and tears, because at some point I needed to step out of the Virtual Machine.
I fondly remember discovering that the byte type is signed (why, really why ???) and spending a few days debugging my hardware to figure out in the end that Java was to blame.
Or the magical moment when one of the gazillion DLLs needed by an over-engineered project had a bug. I simply fixed the bug, recompiled the source, and built the DLL again - something none of the other Java experts were even aware was possible.
And of course, how can I forget when I relied on the standard Java String library, only to find out that the target where the program ran had an incomplete (but still announced as 100% compatible) implementation of that library. What can be more fun than writing your own standard library functions?
To be a bit more serious: there is nothing wrong with Java. It is widely used, and in most cases it is good enough. I was just an unfortunate victim of the attempt to use Java in the embedded world, where it is most definitely not an appropriate tool.
|
|
|
|
|
Quote: I fondly remember discovering that the byte type is signed (why, really why ???)
I know the answer to this one: it's because the representation of signed integers is specified (as two's complement), so there is no problem with bit-shifting them (or doing other bit operations), as there is with signed numbers in C++ (which, until C++20, didn't even specify the integer representation).
In C++, for example, you cannot write this (to check whether variable 'a' overflowed):
if (a + 1 < a) { ... }
and expect it to work on all conforming compilers, because signed overflow is undefined behaviour - the compiler is free to assume it never happens and optimize the check away.
In C++, the result of:
a = b >> 3;
is implementation-defined (before C++20) if 'b' is negative. In Java it's fully defined: '>>' is an arithmetic (sign-extending) shift, and '>>>' is the logical shift.
Sure, those are sh*tty reasons to leave out unsigned types, but they're still reasons. The only place you're likely to run into problems is when trying to add two signed 64-bit numbers that overflow the signed 64-bit type.
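To make that concrete, a minimal sketch of what the guaranteed representation buys you in Java (plain java.lang, nothing else; the class name is invented):
public class TwosComplement {
    public static void main(String[] args) {
        // Signed overflow is defined in Java: it wraps, on every conforming JVM.
        System.out.println(Integer.MAX_VALUE + 1 == Integer.MIN_VALUE); // true
        // >> is the arithmetic (sign-extending) shift; >>> is the logical shift.
        System.out.println(-8 >> 3);   // -1
        System.out.println(-8 >>> 3);  // 536870911
        // Where wraparound would be a bug, Math.addExact throws instead of wrapping.
        try { Math.addExact(Integer.MAX_VALUE, 1); }
        catch (ArithmeticException e) { System.out.println("overflow detected"); }
    }
}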
|
|
|
|
|
Thanks for the explanation.
Member 13301679 wrote: Sure, those are sh*tty reasons to leave out unsigned types, but they're still reasons. The only place you're likely to run into problems is when trying to add two signed 64-bit numbers that overflow the signed 64-bit type.
I am OK with the design decisions made; the issue there was conflicting expectations. The Java language designers chose to treat bytes as signed integers, while in the embedded world a byte is simply a string of 8 bits, plus an endianness.
In my case, the problem wasn't math but reading data from a serial port. I was getting "garbage" until my colleagues told me that in Java bytes are signed.
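For anyone hitting the same thing: the usual fix is to mask each incoming byte back to its unsigned value before interpreting it (a two-line sketch; the variable names are invented):
byte raw = (byte) 0xF8;  // as delivered by the serial port; Java prints it as -8
int value = raw & 0xFF;  // the mask promotes to int and drops the sign extension: 248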
|
|
|
|
|
Quote: I am OK with the design decisions made, the issue there was conflicting expectations.
Yes, that is the problem: if you're going to have a type called 'byte' in your language, it had better represent a pattern of bits (no matter the size of the byte itself).
As I get older, I get more and more irritated that languages don't keep to the Law of Least Astonishment: they have rules, then exceptions to those rules, then overloaded meanings for keywords, then overloaded meanings for characters ...
I keep meaning to design a language that is easier to read than to write: most modern languages right now make the reading very difficult unless you have unreasonable context memorised. C++ is a great example of producing "easy to write, but impossible to read" compound statements - most lines in C++ require knowing complex rules, some part of the standard and some part of the program.
In particular, I'd like to shoot whoever thought that pass-by-reference must only be indicated in the declaration, and "helpfully" filled in by the compiler at the point of call. This means that looking at any function or method call is meaningless even if you just want to ignore it while looking for something else.
|
|
|
|
|
NelsonGoncalves wrote: I fondly remember discovering that the byte type is signed (why, really why ???)
Since my student days (long ago!) I have been fighting this concept that "inside a computer, everything is a number". No, it isn't! Data are bit patterns - just bit patterns, not "zeroes and ones". A type defines which bit patterns are used (on a given machine) to represent various values, such as 'x' or 'y'. They are letters, da**it, not any sort of 'numbers'. Similarly, colors are colors. Dog breeds are dog breeds. Weekdays are weekdays. Seasons are seasons.
One problem is that computer professionals are among the fiercest defenders of this 'number' concept, arguing that 'A' really is 65 (or 0x41, as most would prefer - but still a 'number'). They think it perfectly natural that dividing 'o' by two gives not 'c' (as you might guess from the graphical image) but '7' - and that this is a perfectly valid operation, because 'o' is really not a letter but the numeric value 111, and '7' is really 55.
Even programmers who have worked with objects and abstractions and abstractions of abstractions still are unable to see a bit pattern as directly representing something that is not numerical. They cannot relate to the bit pattern as a representation of abstract information of arbitrary type, but must go via a numeric interpretation.
So we get this idea that an uninterpreted octet (the ISO term, partially accepted even outside ISO), a.k.a an 8 bit 'byte', in spite of its uninterpretedness does have a numeric interpretation, by being signed.
I shake my head: how much has the IT world really progressed toward being a 'scientific discipline' in the last three to four decades (i.e. since high-level languages took over)? When we can't even manage abstractions at the octet level, but insist on a numeric interpretation where there is none, I think we are quite remote from a science on a solid academic foundation.
The bad thing is that we are not making very fast progress. 40+ years ago, in Pascal, you could declare 'type season = (winter, spring, summer, fall)', and the values are not numeric: you cannot divide summer by two to get spring (the way you can in C and lots of its derivatives). There is no strong movement among software developers for a proper enumeration (discrete value) concept: we have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.
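(To be fair, Java's enum eventually adopted the Pascal view: the values are objects, not integers, and any arithmetic has to go through an explicit ordinal(). A sketch:)
enum Season { WINTER, SPRING, SUMMER, FALL }

public class Seasons {
    public static void main(String[] args) {
        // Season s = Season.SPRING + 2;  // does not compile: a Season is not a number
        Season s = Season.values()[Season.SPRING.ordinal() + 2];
        System.out.println(s);            // FALL, but only via an explicit ordinal()
    }
}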
The entire software field really should run a complete set of regression tests every month, and if any regression (say, from the Pascal enumeration concept to that of C-derived languages) is detected, actions should be taken immediately to remedy it and bring the profession back to a proper state!
|
|
|
|
|
trønderen wrote: We have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.
I actually think it is a good thing that we can do 'fall = spring + 2'. First, because it makes sense intuitively. Second, because although computers hold bits, you need to do operations on those bits, otherwise computers are not useful. And (some sort of) math seems to be a good enough common denominator that mostly everybody can rely on and build on.
Personally I tend to view software development more as art, and less as science.
|
|
|
|
|
That is your view because you are a programmer - ask Ops.
As mentioned above, versioning. Tell me of another piece of software with so many different version numbers referring to the same thing.
Versioning 2.0: Have you found your version number? Then here is another error message for you: the class cannot be loaded because it is version 52, but your JVM supports only up to version 51.
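(That number, for the record, is the class file's major version - 52 is Java 8, 51 is Java 7 - and it sits right after the 0xCAFEBABE magic at the start of every .class file, so you can check it without starting a JVM. A sketch; the file name is invented:)
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassVersion {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream("Some.class"))) {
            int magic = in.readInt();           // 0xCAFEBABE
            int minor = in.readUnsignedShort(); // minor version comes first
            int major = in.readUnsignedShort(); // 52 = Java 8, 51 = Java 7
            System.out.printf("magic %x, class file version %d.%d%n", magic, major, minor);
        }
    }
}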
Well, disk space was cheap THEN, so maybe it was not an issue that a fourth-digit upgrade installed the full thing in a new directory. Or at least it was not a problem until someone set JAVA_HOME to one of them, and yum removed the other one...
The usual Java software is customizable to an amazing degree via command-line switches and config files, but very slow to start up, because it has to read all those small XML files.
Config 2.0: The main selling point of one commercial Tomcat clone is that you don't have to hunt through tens of config files to set up the thing.
|
|
|
|
|