Wordle 651 5/6
🟨⬛⬛🟨⬛
⬛🟨🟨⬛⬛
⬛🟩🟩⬛⬛
⬛🟩🟩🟨⬛
🟩🟩🟩🟩🟩
Get me coffee and no one gets hurt!
Wordle 651 5/6
🟨⬜⬜⬜⬜
⬜🟩⬜⬜🟨
🟨🟩🟩⬜⬜
⬜🟩🟩🟩🟩
🟩🟩🟩🟩🟩
Wordle 651 4/6
⬜⬜⬜⬜⬜
⬜🟨⬜⬜⬜
🟨⬜🟩⬜⬜
🟩🟩🟩🟩🟩
Jeremy Falcon
Wordle 651 4/6
⬛⬛🟨⬛⬛
⬛🟩⬛⬛🟨
⬛🟩🟨⬛⬛
🟩🟩🟩🟩🟩
Wordle 651 3/6
⬜⬜⬜⬜⬜
⬜🟨⬜🟨🟨
🟩🟩🟩🟩🟩
not too bad
"A little time, a little trouble, your better day"
Badfinger
#Worldle #434 1/6 (100%)
🟩🟩🟩🟩🟩🎉
https://worldle.teuteuf.fr
easy one
"A little time, a little trouble, your better day"
Badfinger
Since that's the gist of about 80% of the responses to questions in QA, I'm not sure you're making the point you think you are.
Keep Calm and Carry On
Or it emphasizes his point -- there is zero "intelligence" in AI; it's just a monkey learning colors and passing matching blocks.
When I run a search, I still have to review the answers, because some are obviously wrong (for my need), so I have to exert myself to locate the best answer(s) I can.
The biggest danger of so-called AI is that people will begin to trust it, when it's just a monkey doing color matching. "Training" reduces the number of obviously wrong answers, and possibly increases the number of "may be correct" answers. Intelligence and experience are required to determine validity, and AI has neither.
There is a tremendous amount of intelligence in a monkey learning colors and passing matching blocks. It's not enough intelligence to write an epic poem or wonder what's beyond the horizon perhaps, but might be enough to take over the world if left unsupervised around just the right kind of blocks.
An AGI that gets the answer wrong sometimes could still be as successful as, say, a politician or CEO.
I think there is far more to be concerned about than you may realize. Of course, YMMV.
Member 14968771 wrote: what does AI know about "GIGO"?
Since "G" is exactly what they are largely trained on, I'd say they are intimately acquainted with GIGO ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
Should anybody worry about AI "taking over the world" in the next few years? No.
Should anybody be concerned about the future of their jobs specifically and society generally as AI becomes more robust in the future? Yes.
Not sure why so many struggle with this... like everything else in technology, AI will continue to get more capable with every advancement.
fgs1963 wrote: Should anybody be concerned about the future of their jobs specifically and society generally as AI becomes more robust in the future? Yes.
Future being what exactly?
Development of "AI" started in the 1950s.
Myself, I am still waiting for that self-driving car, and for the endless parade of software that claims it will make developers obsolete to become more "robust".
jschell wrote: Future being what exactly?
Sometime after now.
jschell wrote: Development of "AI" started in the 1950s.
If you haven't noticed, human technological advancement a) sometimes happens in leaps and b) is escalating at a geometric rate, not a linear one. In the 1950s there was severely limited hardware* and only a handful of developers working on AI. Now we have exaflop supercomputers and thousands of very well financed developers working on AI.
If you don't see AI becoming radically better over the short term (i.e. the next few decades), then you might want to open your eyes.
*The 1960 CDC 1604, designed by Seymour Cray, was the fastest computer ever made at the time. It used 48-bit words, had 192 KB of memory, and operated at about 0.1 MIPS.
fgs1963 wrote: sometimes happens in leaps
No, I haven't noticed that. Actually, now that I am more aware of it, I have come to realize that it never happens that way. It just seems like that because people are not looking at the full process.
Technology is not built on the shoulders of giants. It is built on the shoulders of very normal-sized people who each incrementally improved perhaps just one thing.
Moreover, technology is never what drives that. Businesses do.
Ford (the person) did not invent anything. He put together a bunch of bits and pieces (some at least decades old) and then publicized it for very specific business reasons. Same for Edison.
Cell phones have a very specific progression from the utility of satellite phones.
Computers had a very smooth transition from larger computers, driven by the very real need and demand from businesses as they saw the benefits.
Applying technology to agriculture has been going on for as long as agriculture has existed.
The Wright brothers were absolutely not even close to being the first to 'fly'. They were not even the first to put a human on a motor-propelled flying machine (that happened in France). For close to a hundred years, though, that is how it was depicted.
Look even at the following, which specifically mentions that they used a wind tunnel -- but in fact that was invented by someone else.
1903 Wright Flyer | National Air and Space Museum[^]
Cell phones have had so much impact not because of the technology but because it is just so cheap to put up the towers. Businesses worldwide could see the need and desire to communicate, and providing it was so cheap and profitable that they did. The technology allows it, but it does not do it. If it were just a matter of technology, the first Mars-Earth war would have already happened.
fgs1963 wrote: In the 1950's there was severely limited hardware*
And yet there were significant investments in that hardware made just so AI research could go on. It wasn't faltering due to a lack of hardware.
No, but I still worry about the stupid people that continue to take over the world. It's not the AI I'm worried about, it's the NS - Natural Stupidity.
I wonder if there's a stupid gene or does it take practice?
Mike Hankey wrote: I wonder if there's a stupid gene or does it take practice? As a serious reply, I think neither - it's a complex problem of culture, education, media (news and social), environmental factors, prenatal care, nutrition, economic situation, and so forth, in no particular order. The sad thing is, all of those factors could be addressed at least to the extent that infants => children => adults could at least have a decent opportunity in life. But I'm biased in my views of the sad situation most kids are dealing with, so take all that with the due amount of sodium intake.
I totally agree.
I won't go into any more, it would probably start a big hullabaloo!
The reply above was tongue in cheek... so to speak!
Some are born stupid, some achieve stupidity, and some have stupidity thrust upon them.
(With apologies to William Shakespeare)
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
To be fair, there is stupidity, there is ignorance, and there is foolishness.
And then those are compounded by other factors like arrogance, elitism, laziness, etc.
Since I posted the message I started reading "How the Mind Works" by Steven Pinker, and it goes into a lot of the reasons why people act the way they do.
Interesting read.
It was asked recently whether the need for software developers would ever go away. After some debate, two schools of thought emerged. The first group thought that artificial intelligence would supplant developers and that project managers and sales & marketing would become the more important fields in software. The second group (consisting mostly of software developers) couldn't articulate a coherent response -- they were laughing too hard.
Neither group is entirely right but Group 1 is nearest the mark.
My prediction: Within 50 years (probably a bit less) a LARGE percentage of today's software development jobs will be gone. A few "wizards" will remain to mind the AI but the mundane stuff that the vast majority of today's devs do will be gone. That being said - project managers and sales/marketing drones will also be gone. The CEOs will deal directly with the wizards.
Logically, that would make me a witch.
I can live with that.🧙♀️
fgs1963 wrote: My prediction: Within 50 years (probably a bit less) a LARGE percentage of today's software development jobs will be gone
Will that be before or after self driving cars actually work?