|
Thank you so much for your response, David.
My critique was probably more about the programmer's use of C++ than about the language itself.
I took a quick look at your article and found it well organized. Good job. I made a PDF copy of your article to read offline. BTW, your complaints about C are also some of mine.
"A little time, a little trouble, your better day"
Badfinger
|
I hope it shines a light on some new stuff for you!
Best wishes!
|
BTW I use Codeblocks for my development environment.
It has both a C and C++ environment and works for me.
I also have Visual Studio 2022 installed, but it's not my choice yet.
Thanks again. "Light" is the appropriate term, as lighting simulation was my line of work for many years.
"A little time, a little trouble, your better day"
Badfinger
|
David O'Neil wrote: First is the typedefs that surround all structs in the MS headers. I still don't understand why that was done,
A quirk of the old C standard: when you declare a struct, the type's name is actually struct type, not plain type. With ANSI C you would have to declare every variable of that type as
struct type variable;
rather than the more sensible
type variable;
A way to get around that is a typedef.
GCS/GE d--(d) s-/+ a C+++ U+++ P-- L+@ E-- W+++ N+ o+ K- w+++ O? M-- V? PS+ PE Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++* Weapons extension: ma- k++ F+2 X
|
Thank you. That makes sense. I hate it!
|
I hate it so much. And I like Visual Studio because, at least since 2008 (maybe before, but after Visual Studio 6), it allowed mixing C++ syntax in C source, so we could do without a lot of old C absurdities. C99 also fixed a lot of these inconsistencies, but the MS headers were written long before it.
GCS/GE d--(d) s-/+ a C+++ U+++ P-- L+@ E-- W+++ N+ o+ K- w+++ O? M-- V? PS+ PE Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++* Weapons extension: ma- k++ F+2 X
|
Quote: The second is that everything has to be casted on both sides of the '='. That got old quick.
You must have first seen C back in the 80s?
Since C99, anytime a cast is done in C it's a code-smell; casting should almost never be required.
|
Yeah, my first exposure was in college, around '88. Every time I've come across C-like code, even within the past 10 years, it has been full of casts. But I haven't programmed in C, so I'm not too familiar with the current idiosyncrasies.
|
I'm much the same: I started with C in 1992, graduated to C++ within a few years, and I loved it. However, my last project included over 400,000 lines of MFC/C++ code and used a lot of memory. So much memory (CStrings!) that it would crash after two weeks for no apparent reason. I finally figured out that the CStrings were eating up memory, so I replaced most of them with plain C variables (static arrays, chars, etc.). More work, but it runs smoothly.
|
Well, to be clear, I started with C++, not C, but yeah. =) CString, std::string and the like are a mess.
To err is human. Fortune favors the monsters.
|
Another good argument for using C++ templates is their capability to "inject" code into algorithms. What I mean is that C is forced to use callbacks to extend the functionality of an algorithm; with C++ you can pass the "portion" of the algorithm that extends the base algorithm as a template argument to a template parameter. Functors (and lambdas, which are the same beast) have their code declared inline, so it is generally injected straight into the template of the base algorithm.
For example, the classic C library qsort, ignoring the fact that it forces you to play dangerously with casts to connect arguments to the callback's parameters, produces a function call for each comparison. If the algorithm (which is never a pure quicksort but a modified version, e.g. introsort) has to deal with many elements and the compare function is inexpensive, it turns out that the biggest waste of time is the invocation of the callback. In fact, especially on processors without caches or sophisticated branch prediction, which is almost always the case with small MCUs, a function call can cost a lot.
This means that, apart from code bloat (which with modern MCUs is no longer a big problem, since they have a lot of Flash), C++ is often faster than C, unless a C programmer writes everything by hand and doesn't use any library facility.
This gives you the possibility of using ready-made STL or Boost algorithms that do not need exceptions or dynamic allocation, customizing them efficiently. I want to remind everyone that such code is usually written better than we could manage ourselves and, in general, it is already debugged. And yes, using custom allocators and the placement-new operator, C++ can do what C does, and many times does it even better: with a little care, C++ can be happily used on small embedded systems.
Cheers
|
Absolutely, although I would not use STL or Boost on most IoT-class MCUs, and the reason is heap fragmentation.
When you're dealing with, say, 360KB of usable SRAM (the amount available for use on an ESP32) or less, the struggle is real. STL just takes a big steamer all over your heap.
To err is human. Fortune favors the monsters.
|
honey the codewitch wrote: Absolutely, although I would not use STL or Boost on most IoT-class MCUs, and the reason is heap fragmentation.
I understand, but I wasn't talking about STL containers: I was talking about "algorithms". Many ready-to-use algorithms work on data in place, and you can use STL algorithms on standard C arrays (pointers are "random access iterators" in the iterator-category sense, aren't they?). If you like templates, you can always use std::array, which has zero overhead with respect to a C array.
honey the codewitch wrote: When you're dealing with say 360KB of usable SRAM (the amount available for use on an ESP32) or less, the struggle is real. STL just takes a big steamer all over your heap.
I started programming MCUs like the 6805 in pure assembly, up to the MC68000 in "pure C". So I know well what you are talking about.
Regards.
|
Fair enough. When you said algorithms I thought you were including things like hashtables.
To err is human. Fortune favors the monsters.
|
"Also, using the preprocessor freely is kind of liberating"
C will set you free. the oldest dogmas i've heard were: goto is evil and macros are evil
i remember you once said (probably here on The Lounge) that you prefer C++ and even that you work with it like it is C# (translating C++ to C# in your head and vice versa)
what i don't like about C is the standards. there are nice things in every standard, but they push Undefined Behavior as a way to shame and discipline the coder. what the creators of C didn't put in the language, they try to force through the standards
the spirit of C is kind of hippie, uncertain. that made me search for the first edition of The C Programming Language:
"C is a general-purpose programming language which features economy of expression...", "its absence of restrictions and its generality make it more convenient and effective for many tasks than supposedly more powerful languages."
"...the run-time library required to implement self-contained programs is tiny.", "...efficient enough that there is no compulsion to write assembly language instead." - this seems like something that is not important now, but let's think of the energy consumption.
"Existing compilers provide no run-time checking of array subscripts, argument types, etc." - wooow, you just put an int where the function takes a float, sizeof(float) bytes are copied from the address of the integer object to the stack frame, and the function treats the data as a float.
the most astonishing for me has always been "The arguments to functions are passed by copying the value of the argument, and it is impossible for the called function to change the actual argument in the caller", which i interpret as Ritchie's intention towards pure functions. yes, you can pass a pointer to a Person to a function, but the default is passing a copy of the Person.
i chose C because it doesn't change, although C++ was my first choice. C++ now changes every 3 years? i cannot even recognize the language. anyway, i'm not a competent programmer.
|
Martin ISDN wrote: C will set you free. the oldest dogmas i've heard were: goto is evil and macros are evil
There's a time and a place for nearly everything (except Python ). Macros make certain otherwise impossible things possible in C, like conditionally compiling code for different platforms, or setting compile-time configuration parameters in the absence of templates. Gotos are pretty handy for state machines, where higher-level constructs don't work because you can't jump in and out of while loops.
Martin ISDN wrote: the most astonishing for me has always been "The arguments to functions are passed by copying the value of the argument, and it is impossible for the called function to change the actual argument in the caller"
I'm surprised this astonished you, as it's the default in most any programming language including asm, where the most natural way to call is to put a *copy* of a value in a register or onto the stack. Indeed to pass by reference you need to put the *address* of the object in a register or on the stack. BASIC facilitates this using the Byref keyword, C# with the ref and out keywords, but it's pretty much always extra typing. The exception is arrays including strings, because you *reference* them (in C by referencing the first element), and while in theory you could push each element onto the stack in practice that's prohibitive.
To err is human. Fortune favors the monsters.
|
honey the codewitch wrote:
I'm surprised this astonished you, as it's the default in most any programming language including asm, where the most natural way to call is to put a copy of a value in a register or onto the stack. Indeed to pass by reference you need to put the address of the object in a register or on the stack. BASIC facilitates this using the Byref keyword, C# with the ref and out keywords, but it's pretty much always extra typing. The exception is arrays including strings, because you reference them (in C by referencing the first element), and while in theory you could push each element onto the stack in practice that's prohibitive.
probably i wanted to give Dennis more credit than he deserves. once an idea like "i have underrated C, Dennis was more clever and foreseeing than i thought. he made the right compromises" appeared in my mind, it is constantly working in the background trying to find new proof of greatness.
i cannot test it, but the BASICs on the home computers may have defaulted to pass by reference. that set the intuition, at a very young age, that a function changes the caller's arguments
though, he made copy by value the only way to pass. except for that array!
i often wonder at length why he did so. the array defaults to being passed by reference, i suppose for economical reasons. there is a way to pass it by copy if you put it inside a struct. or simply cast the array as a struct.
i wish i could find some paper written by Ritchie about this or "the other things, not because they are easy, but because they are hard"
|
I should add, I agree with you about C++ changes being frustrating, but I lean on the -std=c++17 compiler option, as it gives me a good mix of features without overwhelming me with language features I'm not familiar with, or restricting me to toolchains that support the newer standards.
The saving grace of C++ is the standards are stamped every few years with compilers allowing you to choose between them. It helps a lot.
To err is human. Fortune favors the monsters.
|
Back in the day, working for a NASA contractor, we were working on graphic representation of data from the engineering (FORTRAN) programs, which had to be done in C on massive UNIX machines. OOP was still a new concept, but we used the flexibility of C to create object-like arrays of data (okay, just the parameter part of objects, but with a suite of functions to support each). OOP was the natural progression, but I never did learn C++. Perhaps I should. The freedom of C is both scary and appealing to the megalomaniac in me; I always delighted in writing in it.
|
My main language is C, although I've coded in C++ far longer. I don't use classes unless I have a very good reason to, and on only one very specific occasion have I used templates (I hate generics). That said, I have to confess I come from the opposite end of programming. C is a very nice abstraction of assembler, with the added benefit of being portable, and I have emulated objects in C using function pointers within structs.
Despite the supposed (I've never seen them) security implications of void pointers, they are about as close as I can imagine to generic data types in your functions (I do videogames, so performance and efficiency always trump readability... and security will always be an afterthought)
Anyways welcome 
|
I spent my career as an embedded developer (retired now) and I've only had one project that had enough RAM to be able to use something like STL. The project was basically done when I got it so all I did to it was add/fix miscellaneous features.
For most of my projects in the last 15 years of my career, I was using C++ but stayed away from dynamic object creation (all objects instantiated at startup) and inheritance. Since compiler tech had gotten so good, I did use templates for common things like FIFOs, queues, and components for some DSP (filters, tone generators, etc.). Don't be afraid to use C++ features; just make sure you know the memory and time cost.
|
Sounds like we're very similar in how we approach C++ on embedded and IoT.
In projects like htcw_gfx[^] I rarely allocate memory for you. Temporary allocations are sometimes necessary for performance, but they are few and far between, and they let you specify custom allocators.
I don't use STL in such projects. In fact, I've gone out of my way to avoid it, even implementing my own streams and basic data structures, like a hashtable and a vector you can't remove items from except by clearing the whole thing. The reason is flash size and memory usage - primarily heap fragmentation.
I keep my constructors inlineable by my compiler and generally use template based "interfaces" at the source level rather than inheritance based interfaces at the binary level. The idea being flash size is at less of a premium than CPU cycles.
To err is human. Fortune favors the monsters.
|
As a late-comer to this thread, all I can say is that somewhere inside C++ is a beautiful language waiting to get out.
C/C++ both suffer from readability issues. When I look at a piece of code, I want to be able to "grok" it in a few seconds.
My most serious complaint about both is that the very things that make them usable, flexibility and conciseness, also work against them.
I trip through a lot of third-party code, and it takes waaaay too much time to understand what the original author wanted to do and how he went about it. Then try to find the error, or where the code needs to be tweaked or extended... days turn into weeks, then months.
Simply put, I want to look at the code and comprehend its intent and organization in minutes.
Bottom line: C/C++ allow a programmer too many variations to accomplish a task. Good and bad.
|
I think you have to be more careful to write readable C++ code, but it's totally doable. It's just that a lot of people (sadly, including myself) don't bother.
On the other hand, reading C++ is a bit like listening to Aesop Rock. Absolutely unintelligible at first, each for similar reasons, actually, but you start to get an organic feel for the various ways it tends to get put together and then it clicks.
To err is human. Fortune favors the monsters.
|
I agree on both counts: It's doable, ...don't bother.
|