|
I bought it in 1985 and it's a marvelous work of engineering.
"
The need for one lens able to do everything, or at least as much as possible, was an influence on lens design in the last quarter century. The Kino Precision Kiron 28-210mm f/4-5.6 (Japan) of 1985 was the first very large ratio focal length zoom lens for still cameras (most 35mm SLRs). The fourteen element/eleven group Kiron was the first 35mm SLR zoom lens to extend from standard wide angle to long telephoto (sometimes referred to as "superzoom"),[191] able to replace 28, 35, 50, 85, 105, 135 and 200mm prime lenses, albeit restricted to a small variable maximum aperture to keep size, weight and cost within reason (129×75 mm, 840 g, 72mm filter, US$359 list).[192][193][194]
" -- History of photographic lens design - Wikipedia[^]
|
|
|
|
|
What's a 'camera'?
Isn't that what smartphones are for?
How else to instantly upload to that other essential invention ... social media? 
|
|
|
|
|
The problem with smartphones is that they suck out the subject's soul. 
|
|
|
|
|
Leo56 wrote: What's a 'camera'?
Isn't that what smartphones are for?
Obligatory Geek and Poke[^]
|
|
|
|
|
Hmmph! https://www.flickr.com/photos/awrose/103252765/in/pool-camerawiki ain't it pretty?
Story - 35 yrs ago - guy had done some beautiful work and was very proud that the grain was so fine.
I said so? I get that smooth out of Tri-X all the time. He goes uh - OH! you're the one with THAT camera!
|
|
|
|
|
Beautiful camera!
I love looking at old photos taken with those types of cameras.
|
|
|
|
|
Ah yes, one of far too many "need to spend time with thats". ( And the F-1, A-1... the D2s ( which are 5-7s, the Gundlach is a 4-5 ) and the Bronica - which I got paid for using _almost_ what I paid for it. )
|
|
|
|
|
I am very much of two minds about grain. On the one hand, those super-smooth tones from old 4×5" films (or even larger), when the emulsion was stuffed with silver, and not a trace of grain, can be a pleasure to study for their tonal qualities alone.
Then, in significant parts of modern B&W photography, graininess is used as an artistic expression, not unlike the 'dottiness' of some impressionist painters. In journalism and sports photography, graininess and hard, 'graphics style' contrast have been a style of expression for at least 50 years. Even in some landscape photography, grain can add structure to a surface that would otherwise be boring (e.g. a misty landscape).
My photography books fill two meters of shelf space. In them, I can "in no time" find a hundred photos, from internationally recognized photographers, that would have lost some of their qualities if they were absolutely grain-less and with a smooth, 'natural' tone scale. Fair enough for 'scientific style' documentary photos, but if you want your photo to tell a story, you may need something beyond a simple and boring 'This is exactly how it looks'. It may be true, but so what?
Who said "A picture shouldn't show something, it should be something"? In the process of making it "be" something, grain can be a great tool.
|
|
|
|
|
"I didn't mention the bats - he'd see them soon enough" - Hunter S Thompson - RIP
|
|
|
|
|
cough cough Pentax cough cough...
I'd rather be phishing!
|
|
|
|
|
Sounds like you are about to throw up, is that right? Maybe you should put that Pentax away, then 
|
|
|
|
|
enforcement of byte order.
clunky base runtimes.
a template engine that is source level, so it *could* be as powerful as C++'s and better than C#'s, but sadly isn't.
and personally, it just feels stifling somehow. i find myself getting into "the zone" in C# much more quickly than in Java, and staying there longer. I think part of it is the tools. Visual Studio is just great, though i've never used IntelliJ. Eclipse is garbage, IMO. it always crashes on me if i try to use extensions, and it feels open source - designed by 100 different people.
so i think a big part for me is the tools.
If it weren't for all that, I'd probably prefer it to C#, simply because of the amount of "cool code" and other libraries I could have found very useful but that were Java-only.
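The template-engine gripe above can be made concrete with a minimal sketch (class name hypothetical): Java generics are erased at compile time, so overloads that differ only in their type parameter collide, and at runtime two differently parameterized lists share one class -- something C++ templates and C#'s reified generics each handle differently.

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    // In C++ (templates) or C# (reified generics) these two overloads are
    // distinct; in Java they both erase to process(List) and will NOT
    // compile together, so they appear here only as comments:
    //
    //   static void process(List<String> xs)  { ... }
    //   static void process(List<Integer> xs) { ... }

    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        // After erasure both are plain ArrayList; the type parameter is gone.
        System.out.println(strings.getClass() == ints.getClass()); // prints: true
    }
}
```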
Real programmers use butterflies
|
|
|
|
|
Quote: so i think a big part for me is the tools
I started out using Eclipse, but then I noticed many Java developers were switching to IntelliJ. Then I switched to IntelliJ. I will never go back to Eclipse. Try IntelliJ if you ever again need to do some JavaFX.
Get me coffee and no one gets hurt!
|
|
|
|
|
Yes, the IDE being Eclipse also made Java a troublesome uptake. Very good point.
I started doing Android early on and it was Eclipse-based and it drained all of the happy-happy new energy of a new development platform (Android) and I ran away.
Then they went to Android Studio (IntelliJ-based) and it was YESS!!!!!!!
|
|
|
|
|
I agree, I work in both Java and C# and don't have a problem with either. I think Java gained a stigma from ye olde Java applets, but for modern-day coding I enjoy using Java and Spring for building REST APIs.
|
|
|
|
|
There are three main issues for me:
1) Versioning -- difficult to know which version to run and what functionality I will have -- especially after Oracle took over and then it split even more with the OpenJDK and all that nonsense. It's quite difficult. Along with versioning, it is difficult to find tools that feel "official". For instance, I am attempting to use JCov (the Java coverage tool), which is supposed to be the "official" one but is poorly documented, or not documented at all.
2) UI Framework -- Oh boy. I remember the original was something like AWT, right? Then JavaFX (which never fully caught on). 3rd-party stuff, and controls that were instantly recognizable as not being Windows controls. It was all so confusing, and there were better options (C#, Visual Studio and MFC, etc.).
3) Java Applets -- they used applets to introduce Java and it was supposed to be gee-whiz. I was like, "a plugin...?? that fails a lot in my browser...?? and needs to be updated constantly...??? which MS doesn't like to support???" That intro to Java kind of killed it.
After that it felt like a slow cumbersome thing with no direct line to components without lots of management. So, over to C#, which was easy.
Much of this isn't "fair" to Java, but it is the perception.
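The versioning churn described above even shows up in the version string itself: the format of the `java.version` property changed between Java 8 ("1.8.0_292") and Java 9+ ("11.0.2", "17"), so code that just wants to know which major version it is running on must handle both schemes. A small sketch, with a hypothetical class name:

```java
public class JavaVersionProbe {
    // Parse the major version out of a java.version string. The format
    // itself changed over time: "1.8.0_292" (Java 8 and earlier) versus
    // "11.0.2" or just "17" (Java 9 and later).
    static int majorVersion(String version) {
        if (version.startsWith("1.")) {
            return Integer.parseInt(version.substring(2, 3)); // "1.8.0_292" -> 8
        }
        int dot = version.indexOf('.');
        return Integer.parseInt(dot == -1 ? version : version.substring(0, dot));
    }

    public static void main(String[] args) {
        System.out.println("Running on major version "
                + majorVersion(System.getProperty("java.version")));
    }
}
```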
|
|
|
|
|
Following on from raddevus' points:
4) Support -- ongoing support is difficult as the Java versions keep changing. Applications that run in "version X update Y" may not run on "update Y+1". And if multiple installed applications require different versions? Getting each to use the right installed version is its own version of DLL Hell.
5) Hiring -- finding people with experience in the needed versions of specific libraries can be tough. If not using the latest and greatest, finding people with experience and a willingness to work in an older version can be all but impossible. Besides, knowing one version of a library may not mean anything in a different version.
Java the language? It's just another language. It's got its pluses and minuses, same as every other language. IMO, the serious problems are everything except the language itself.
C# has its own issues, but post-deployment it's MUCH easier to support.
|
|
|
|
|
|
I like the blue. It is more bitter, but with a small piece of dark chocolate it is perfect. Consider throwing in a couple of cardamom seed with the beans when grinding them.
For my computer, I prefer a somewhat sharper variant.
|
|
|
|
|
The snarky comments are coming from C++ (the world of fast and free memory management), not C# people (the Java people with a Microsoft sticker on the forehead).
Was that a snarky comment ?
|
|
|
|
|
It may be fast, but it sure as hell ain't free.
|
|
|
|
|
What was wrong was: Struts, Swing, NetBeans, JBoss, Apache ... and whatever else you needed to get an app going.
.NET didn't require shopping around.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
I worked with a customer who implemented a serious, corporate-wide Manufacturing Control System using Java. I worked with them on systems at four sites - San Jose, CA; Shenzhen, China; Mainz, Germany; and a place in Thailand whose name I have forgotten. The MCS was used with Windows, AIX, Linux, MacOS, and a few others I can't remember. We used sockets as our interface and there were no problems with it at all. I don't even know what OS the systems we directly interfaced with ran on, because we didn't need to. That was a few employers ago, so I have forgotten some details now. The systems were used in the manufacturing of disk drives and dealt with making the disks themselves. Assembly happened at other sites and we did a few of those systems too.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Impressive!
Get me coffee and no one gets hurt!
|
|
|
|
|
I grew up with Open Systems Interconnection - communication protocols that were very explicitly designed to be completely independent of any given tool used to generate the protocol elements. One very fundamental protocol principle is that the application, and all lower layers, should relate to the protocol specification, and nothing else.
As OSI was pushed into the darkness, it felt like a rope being tightened around my neck: the protocols replacing the OSI ones more or less demanded that you use specific libraries, specific OS conventions, specific languages - or else you would have to emulate those specific libraries, OSs and languages. But that is troublesome, because new extensions were added all the time, and keeping up with the updates to the libraries, OSs and languages in your emulation was almost impossible.
Sometimes, the binding to specific tools is not immediately apparent. Take the RFC 4506 serialization format (a.k.a. SUN XDR): In my archives from the years of the network wars is a "benchmark" of it against OSI BER (Basic Encoding Rules). XDR beats BER by a high factor. What the benchmark documents keep perfectly quiet about is that XDR is more or less a direct streaming of the binary layout of a C struct on a SUN machine; there is hardly any conversion at all. After generating the red tape, you set a byte pointer to the start of the struct and send the following sizeof(struct) bytes out on the line. (Needless to say, this benchmark was run on a SUN.)
I never thought BER would be anywhere near as fast as XDR (BER has a lot more flexibility, which can't be realized at zero extra cost). But if you set up a similar benchmark on a non-SUN machine, the serialization cost of XDR might easily rise by an order of magnitude. Say, if the machine had a different float format. Different byte order. Maybe an old machine wouldn't be byte addressable - for my first three university years, I was programming non-byte-addressable machines exclusively; the Univac even used one's-complement integers and its proprietary 6-bit character set. (The Decsystem series used 7-bit ASCII, packed 5 to a 36-bit word.) Or say your language has proper string handling: you will have to explicitly add a C-style terminating NUL at the end (a use of NUL which is in direct conflict with the ASCII / ISO 646 standard). The specification of how to encode a structure is given by how a C declaration of it looks.
And so on. This is typical: As long as you use the "proper" machine, the "proper" language, the "proper" OS, you find it very easy to read the specifications and implement the standards. If you come from a different environment: Not so.
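The byte-order point can be sketched in a few lines of Java (class name hypothetical): a wire format like XDR mandates big-endian integers, so on a big-endian machine the encoding is nearly a straight memory copy, while on a little-endian machine every word has to be swapped -- the hidden cost the benchmark above glossed over.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ByteOrderDemo {
    // Serialize an int in a given byte order, as a wire format would.
    // XDR (RFC 4506) fixes the order to big-endian.
    static byte[] encode(int value, ByteOrder order) {
        return ByteBuffer.allocate(4).order(order).putInt(value).array();
    }

    public static void main(String[] args) {
        int v = 0x0A0B0C0D;
        byte[] big = encode(v, ByteOrder.BIG_ENDIAN);       // 0A 0B 0C 0D
        byte[] little = encode(v, ByteOrder.LITTLE_ENDIAN); // 0D 0C 0B 0A
        // If nativeOrder() is already big-endian, the XDR encoding is a
        // plain copy; otherwise each word must be reversed first.
        System.out.println("native order: " + ByteOrder.nativeOrder());
        System.out.printf("big-endian bytes: %02X %02X %02X %02X%n",
                big[0], big[1], big[2], big[3]);
    }
}
```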
There is so much of "You can have the Model T Ford in any color you want, as long as you want it in black".
This is my major objection against Java: It is not just another language that fits in with other languages, under an arbitrary OS. It creates its own world, telling others that you sure can join in -- if you adapt to the Java way. It is your responsibility to adapt to Java, not the other way around. And Java will be the master of the Java environment.
(I have the same objections against Python: It makes a world of its own, trying to monopolize software development completely. Can you compile a Python library, making it callable through a standard calling convention from an arbitrary compiled language? Dream on...)
I like .NET as a language-independent, OS-independent platform. Java aficionados may claim that the JVM is not in principle Java/*nix bound, but with .NET, you see it in practice as well. (But it cannot be hidden that a lot of Windows GUI stuff is based on p/invoke, which really should have been avoided!)
When MFC came on the scene, I rejected it because I felt that it tried to bind me hand and foot - the philosophy of multi-tier software design wasn't well developed at the time, and MFC most definitely was not designed for such ideas. Java by itself is not that restrictive, but the Java ecosystem creates a similar tie-in that I do not want to live by.
|
|
|
|
|