Unicode is a vast specification covering many things, including several representations of a very broad character set spanning many natural languages.
There are also associated encodings for those character sets.
Some encodings have a fixed size and some are variable.
In a variable-size encoding, some byte sequences represent specific characters directly, while other byte sequences act as flags indicating that additional bytes are needed to determine the actual character.
The term "multi-byte character set" can refer to a fixed-size representation of a character set, but it normally means a variable-size encoding whose sequences start from a single initial byte. UTF-8 is a variable-size encoding and can thus also be considered a multi-byte character set.
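To make this concrete, here is a minimal C++ sketch (just an illustration, not from any particular library) that walks the raw bytes of a hard-coded UTF-8 string and classifies each one, showing how single-byte characters, lead bytes, and continuation bytes coexist in the same encoding:

#include <cstdio>

int main() {
    // "Aé€": 'A' is 1 byte, 'é' (U+00E9) is 2 bytes, '€' (U+20AC) is 3 bytes
    const unsigned char utf8[] = { 0x41, 0xC3, 0xA9, 0xE2, 0x82, 0xAC, 0x00 };

    for (const unsigned char* p = utf8; *p != 0; ++p) {
        if ((*p & 0x80) == 0x00)
            printf("%02X : single-byte character\n", *p);
        else if ((*p & 0xC0) == 0xC0)
            printf("%02X : lead byte, continuation bytes follow\n", *p);
        else
            printf("%02X : continuation byte\n", *p);
    }
    return 0;
}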
For coverage of this specific point, see the following:
I have to develop an application that captures a portion of the desktop every 30 seconds; after displaying the captured image in the application's window, it should find (and then highlight) which regions of the new image have changed compared to the previous image.
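One common way to approach the "which regions changed" part is a block-by-block comparison of the two captures. Here is a rough C++ sketch, assuming you already have both frames as raw 32-bit pixel buffers (the capture itself, e.g. via BitBlt on Windows, is omitted); diffFrames and Block are names I made up for the illustration:

#include <cstdint>
#include <cstring>
#include <vector>

struct Block { int x, y, w, h; };  // a changed region, in pixels

std::vector<Block> diffFrames(const uint32_t* prev, const uint32_t* curr,
                              int width, int height, int blockSize = 16)
{
    std::vector<Block> changed;
    for (int by = 0; by < height; by += blockSize) {
        for (int bx = 0; bx < width; bx += blockSize) {
            // clip the block at the right/bottom edges
            int bw = (bx + blockSize > width)  ? width  - bx : blockSize;
            int bh = (by + blockSize > height) ? height - by : blockSize;
            bool differs = false;
            for (int row = 0; row < bh && !differs; ++row) {
                const uint32_t* p = prev + (by + row) * width + bx;
                const uint32_t* c = curr + (by + row) * width + bx;
                differs = std::memcmp(p, c, bw * sizeof(uint32_t)) != 0;
            }
            if (differs)
                changed.push_back({bx, by, bw, bh});
        }
    }
    return changed;  // draw rectangles around these to highlight the changes
}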
I'm always a bit suspicious about forms of array initialization, so I automatically assumed that was what you were getting at.
Sure, the code is missing a const there, which makes me wonder if it has been compiled as C code rather than C++. But I wonder whether it's even possible to compile the invocation of a socket object as C code - I don't think it is...
In fact it should not, as it would require the compiler to analyze the semantics of the code in order to make sure that the variable STRLEN is not being changed. Even though this could be done in an easy example like this, the syntax is just as wrong as a missing ';'. Have you ever seen a compiler insert a missing ';' for you, no matter how obvious the fix?
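For what it's worth, a quick C++ illustration of the point (the variable names mirror the thread, but the snippet itself is just an example):

int strlenVar = 200;
// char bad[strlenVar];      // ill-formed in standard C++: the array bound
                             // is not a constant expression (some compilers,
                             // e.g. g++, accept it only as a VLA extension)
const int STRLEN = 200;      // const (or constexpr) makes it a constant
char recMessage[STRLEN];     // OK: the bound is known at compile time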
I'm not that familiar with the ClientSocket class but I have a few observations / questions.
1) Clearly you expect "recMessage" to be a null terminated string (you "cout" it)
2) Who puts the NULL character at the end of each message?
3) Your buffer is "STRLEN" characters long (assuming you fix the declaration), yet you tell RecvData to read "STRLEN" characters. Who accounts for the NULL? Either the buffer needs to be one character bigger or the RecvData call should ask for one character fewer.
4) There is nothing in that API to indicate the length of the received message, so how do you know or verify that all messages are < 200 characters, as you stated in another reply?
5) The memset at the end of the routine does nothing useful. While it looks OK, are you sure that statement is not causing the problem?
6) Other folks have commented on the declaration "char recMessage[STRLEN]", where STRLEN is not a #define constant but a variable. I suspect this is not what is used in the actual compiled code.
Regarding 3) and the code you linked: you need to pass STRLEN-1 to the sub. RecvData takes a size (STRLEN in this case) and passes it directly to recv(), which returns the number of bytes read in the variable i. Then buffer[i] is set to '\0'. This means that in the second-worst case recv() reads size bytes, so the 0-termination writes to index STRLEN, which means the buffer would need to be at least STRLEN+1 in size. In the worst case, recv() returns SOCKET_ERROR and you end up trying to write to buffer[-1]. So... please tell me that the code at the site is not what you are using... please.
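For what it's worth, here is the pattern I would expect, sketched with a raw Winsock recv() call since I don't know the exact ClientSocket internals (the readOneMessage wrapper is just a name for the example): the buffer is one byte larger than the maximum read, the return value is checked before it is used as an index, and the terminator goes at exactly the number of bytes received:

#include <winsock2.h>   // SOCKET, recv, SOCKET_ERROR (link with ws2_32)
#include <iostream>

void readOneMessage(SOCKET sock)
{
    const int STRLEN = 200;
    char recMessage[STRLEN + 1];                // +1 leaves room for the '\0'

    int n = recv(sock, recMessage, STRLEN, 0);  // read at most STRLEN bytes
    if (n == SOCKET_ERROR || n == 0) {
        // error or connection closed: never index with n here, since n
        // may be -1 (SOCKET_ERROR) and recMessage[-1] is out of bounds
        return;
    }
    recMessage[n] = '\0';                       // safe: 1 <= n <= STRLEN
    std::cout << recMessage << std::endl;
}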