|
bespoke polo catering stimulated the senses (12)
|
|
|
|
|
ORGANOLEPTIC
Bespoke = anagram indicator, acting on "polo catering".
Stimulated the senses = "Relating to perception by a sensory organ."
|
|
|
|
|
|
@GregUtas
Where's the CCC?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
I've been looking for information on the chip lithography processes used to make SoCs for IoT devices like the ESP32.
The ESP32 is at 40nm, which seems huge today given that my AMD APU is at 7nm.
Even when it was released, 40nm was kind of big.
Now, these devices are known for low power, but even so I struggle to write software that uses the ESP32's Tensilica CPU in a way that extends its operating life on a charge.
If it's a cost issue, I don't think it's a tenable one:
I would pay 4x the price for ultra low power versions of the ESP32 widgets I have.
I don't know as much about other offerings like the ARM-based parts, and so far googling isn't turning up much real-world info on the lithography used across the range of IoT SoC offerings.
But my takeaway is these little chips have some catching up to do in an area where they could sorely use it.
Real programmers use butterflies
|
|
|
|
|
But a thing to remember: SoCs are intended for embedded devices, which are more prone to static damage.
Stick your 7nm AMD in the environment I work in and it will be dead within a matter of hours from static or seawater ingress; a 40nm part will last longer because there is more material to be damaged before it pops. While that's not an issue for most things, if you are mounting it on the sea bed, where downtime costs a very large telephone number, one of the key design requirements is how hard it is to break, not how efficient it is. My company looked at getting Siemens to restart production of an old chip design because it was 32-bit, in a big flat pack with hand-solderable leads, so we could supply spares.
|
|
|
|
|
That's true, but given how many people *aren't* using these things in hardcore industrial applications where that is an issue, one has to wonder about the dearth of ultra-low-power versions, even if they'd be a bit more fragile and expensive. For something like the device I'm currently building it would be a win. For a lot of the things I've seen other people build (non-commercial in that case, though it still makes $$ for Espressif) it would be a win as well. Just my opinion.
Real programmers use butterflies
|
|
|
|
|
There is definitely a cost factor involved. Those little chips are intended for sale at very low prices. The latest, cutting-edge processor designs are very, very expensive, in part because of the cost to fabricate devices with the latest equipment. That does not make very much sense if you want to sell a low-cost device. I remember that when I worked on some systems used to make PICs, they used rather old technology, being fabbed on 4- and 5-inch wafers when we had recently installed systems at Intel, TSMC, and Samsung that used 12-inch wafers. FWIW, most of our systems were for 6- and 8-inch wafers. Of the over three hundred companies we worked with, the three listed above were the only ones that used 12s.
I think the real answer to your question is they don't use the latest technology because they do not need to.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
There is a need, though. On the ESP32 forums (Reddit, Espressif's, anywhere) you'll find people struggling with battery life for their devices.
Plus, at the current cost of $5, even charging $20 for a ULP version wouldn't be unreasonable.
Real programmers use butterflies
|
|
|
|
|
Yes, but then that will cause other limitations, like complicating the interface with external devices. As the song goes, one thing leads to another.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
How would shrinking the die do that?
They wouldn't even have to change the chip's form factor to shrink the die, much less change the actual functionality.
Real programmers use butterflies
|
|
|
|
|
A physically smaller die has less room to attach IO pins, and higher-density processes are much more expensive per mm². Down to about 14nm, the increased transistor density, allowing more parts per wafer, is able to keep marginal prices for denser parts down. (Following all the additional, more complex design rules for e.g. 14nm vs 40nm, along with the other fixed setup costs, still makes smaller processes a lot more expensive if you're only making a small volume of parts, though.)
One of the major drivers 10-20 years ago of Intel cramming more and more stuff into the chipset (most notably a basic GPU) was that the overall size of the die was being set by the number of IO pins it needed, and they had "free space" to put anything that didn't need much more IO into.
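Purely to illustrate the per-wafer arithmetic (every number below is made up for the example, not real foundry pricing), a quick back-of-envelope sketch:
```c
#include <stdio.h>

/* Back-of-envelope wafer economics. All numbers here are hypothetical,
 * chosen only to illustrate the trade-off described above. */
int main(void)
{
    const double wafer_area_mm2  = 70686.0; /* ~300mm wafer: pi * 150^2 */
    const double usable_fraction = 0.9;     /* edge loss, defects, etc. (assumed) */

    /* Hypothetical: shrinking from 40nm to 14nm roughly quarters the die area,
     * but the processed wafer costs several times more. */
    const double die_area_40nm = 25.0, wafer_cost_40nm = 2000.0;
    const double die_area_14nm =  7.0, wafer_cost_14nm = 7000.0;

    double dies_40 = wafer_area_mm2 * usable_fraction / die_area_40nm;
    double dies_14 = wafer_area_mm2 * usable_fraction / die_area_14nm;

    printf("40nm: ~%.0f dies/wafer, ~$%.2f per die\n", dies_40, wafer_cost_40nm / dies_40);
    printf("14nm: ~%.0f dies/wafer, ~$%.2f per die\n", dies_14, wafer_cost_14nm / dies_14);
    /* Marginal cost per die can stay comparable; the fixed design/mask costs
     * (not modelled here) are what kill small-volume parts. */
    return 0;
}
```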
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
Yes. Also, shrinking pads and traces will generally reduce their current carrying capacity and that's usually not so good with interfaces.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
If you are contractually bound to the ESP32, then you are contractually bound to the ESP32. Case closed.
If you are free to switch to another chip, why don't you state your requirements for power consumption, on-chip functionality and maximum cost (for development kit, raw chip or whatever), and other requirements / restrictions?
I don't expect anyone around here to be able to reduce the power consumption of the ESP32, except by trivial measures such as reducing the clock frequency (if that is possible on the ESP32) or being more clever about turning off peripherals when not needed (if that is possible / relevant on the ESP32).
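To be concrete about what I'd call trivial measures, dynamic frequency scaling plus automatic light sleep looks roughly like this under ESP-IDF (an untested sketch; the exact config struct name and Kconfig options depend on your IDF version):
```c
#include "esp_err.h"
#include "esp_pm.h"

/* Sketch only: enable dynamic frequency scaling plus automatic light sleep
 * on an ESP32 under ESP-IDF. Recent IDF releases call the struct
 * esp_pm_config_t; older ones use esp_pm_config_esp32_t. Requires
 * CONFIG_PM_ENABLE (and CONFIG_FREERTOS_USE_TICKLESS_IDLE for light sleep)
 * in sdkconfig. */
void configure_low_power(void)
{
    esp_pm_config_esp32_t cfg = {
        .max_freq_mhz = 80,         /* drop from the default 240 MHz      */
        .min_freq_mhz = 10,         /* scale down when idle               */
        .light_sleep_enable = true, /* sleep automatically between tasks  */
    };
    ESP_ERROR_CHECK(esp_pm_configure(&cfg));
}
```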
Knowing nothing about your functional requirements (so this may not be relevant for you): If what you need is Bluetooth, the nRF52 / nRF53 IoT SoC chips are recognized for their low power consumption. If going to another chip is an option, take a look at nRF[^]. These chips also support some older wireless technologies such as ANT and Zigbee. If what you need is 4G IoT, there is the nRF91 chip, but I believe that is a more expensive chip.
Offhand, I can't tell which process technology these chips use, but the proof of the pudding is the actual power consumption, not the nanometers. If you have the option to go to another chip, and your target is either Bluetooth or LTE, you could pick up the specifications of the nRF chips from the website for consideration.
|
|
|
|
|
We're already roped in.
The ESP32's strengths are the amount of I/O it has and the built-in array of radio comms it supports.
I'm using pretty much all of that.
More importantly, we've already built our supply chain around using these chips, so even if I changed all my code and went out and bought one of the chips you mentioned, it would be a non-starter.
I'm using BLE on the ESP32 already. The BLE portion isn't taking up much juice according to my USB amp meter, but the ESP32 itself is kind of hungry.
Real programmers use butterflies
|
|
|
|
|
A couple of questions: is it correct to assume that the primary issue you have is battery life? Does your system compute continuously, or are there long periods when power could be saved by shutting the processor down to, say, 1µA and then restarting it when processing is required?
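For reference, the pattern I have in mind looks roughly like this under ESP-IDF (a sketch only; the workload function is a hypothetical placeholder, not your code):
```c
#include "esp_attr.h"
#include "esp_sleep.h"

/* Sketch of the duty cycle: do the work, then drop the ESP32 into deep sleep
 * (a few µA for the RTC domain) and let a timer wake it up. The chip reboots
 * on wake-up, so anything that must survive goes into RTC_DATA_ATTR memory
 * or flash. */
RTC_DATA_ATTR static int wake_count;            /* survives deep sleep */

static void do_measurement_and_transmit(void)
{
    /* Hypothetical placeholder: read the sensor, push the result over BLE. */
}

void app_main(void)
{
    wake_count++;

    do_measurement_and_transmit();

    esp_sleep_enable_timer_wakeup(60ULL * 1000 * 1000); /* wake again in 60 s */
    esp_deep_sleep_start();                             /* does not return    */
}
```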
|
|
|
|
|
Yes, my primary concern is that I think these chips (especially the ESP32) could be much less power hungry.
In one of my commercial projects I've got an ESP32 that drives a screen, and the screen is the biggest pig, but it's continuous during a session. There are "wait screens" where I could probably blank the display and suspend the CPU with just enough state to bring it back, provided the radio and screen survive the suspend.
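Something along these lines (an untested sketch; whether the BLE link and the display contents survive is exactly the part I'd have to verify) is what I have in mind for those wait screens:
```c
#include "esp_sleep.h"

/* Sketch: light sleep keeps RAM and CPU state (unlike deep sleep there is no
 * reboot), so the session can resume where it left off. Whether the radio
 * connection and the display survive depends on the peripherals and drivers.
 * The 5-second timeout is arbitrary; a GPIO/touch wake source could be added. */
static void wait_screen_idle(void)
{
    esp_sleep_enable_timer_wakeup(5ULL * 1000 * 1000); /* wake after 5 s */
    esp_light_sleep_start();  /* returns here after wake-up, state intact */
}
```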
But my concern is more of a general one. The thing uses 60mA at idle with nothing connected except a voltage regulator and some pullups, and that jumps rapidly as soon as you start computing with it. It doesn't seem like much, but in practice, unless your device goes to sleep, wakes up, senses and transmits, then sleeps again (in which case 4 AA batteries would last you a day), your batteries last mere hours.
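To put rough numbers on the "mere hours" (the capacity and the non-idle currents below are assumptions for illustration, not measurements of my board):
```c
#include <stdio.h>

/* Back-of-envelope battery life: hours = capacity / average draw.
 * Only the 60 mA idle figure is measured; everything else is assumed. */
int main(void)
{
    const double capacity_mah = 2500.0;  /* ~one alkaline AA; 4 in series share one cell's capacity */
    const double draws_ma[]   = { 60.0, 250.0, 450.0 };
    const char  *labels[]     = { "idle", "computing", "screen+radio" };

    for (int i = 0; i < 3; i++)
        printf("%-13s ~%5.1f h\n", labels[i], capacity_mah / draws_ma[i]);
    /* Regulator losses and the display cut these further in practice. */
    return 0;
}
```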
The ARM Cortex-M based STM32 parts that run at 400MHz are on 40nm as well, so I expect them to fare just as poorly.
Lithium-ion is a little better life-wise, but for wearables and the like this is just too hungry in general.
Real programmers use butterflies
|
|
|
|
|
I am a curious amateur, knowing "nothing" about advanced biology. But I am curious.
I still remember the days when there was a race to be the first research institute to do a complete sequencing of human DNA. It took weeks, or maybe it was months.
Nowadays, DNA sequencing seems to be as simple as taking a breath. "Everyone" seems to do it in a snap. Covid-19 RNA (it is RNA, isn't it? or is it DNA?) is obviously orders of magnitude smaller than human DNA, but even human DNA sequencing seems to be a piece of cake today: to determine your ancestry, or your disposition to various diseases, or in criminal cases, or for a whole lot of other purposes.
Most of this change has come in less than twenty years. What happened? Computers haven't become that much faster! (I got the impression that they are essential as a tool.) Is the speedup in other, non-computer analysis hardware? Or have the scientists developed a completely new methodology that is a magnitude or two faster? Or are those companies offering info about ancestry or disease risk doing only a quick, partial analysis rather than a full sequencing?
To phrase it differently: If, twenty years ago, we had had all the knowledge about methodologies that we have today, would the hardware of the day have been able to sequence the human genome in hours rather than months, or are today's methods fully dependent on recent hardware development? (I assume that computer hardware development is only part of it!)
|
|
|
|
|
trønderen wrote: It took weeks, or maybe it was months
Years. FTFY.
trønderen wrote: it is RNA, isn't it? or is it DNA?
It can be either one; it depends on the virus group. RNA is only used for replication, which is what most viruses are targeting.
trønderen wrote: What happened?
It is like learning how to ride a bike: once you know how it goes, you can ride all bikes, not only yours.
trønderen wrote: would the hardware of the day have been able to sequence the human genome in hours rather than months
No.
|
|
|
|
|
trønderen wrote: it is RNA, isn't it? or is it DNA?
It's a single-stranded RNA virus.
|
|
|
|
|
Covid is an RNA virus.
Back in the day (starting in 1990) the Human Genome Project was labouring to sequence the human genome. However, the technologies they (and others) developed advanced, and in 1998 Craig Venter started a parallel private effort. Depending on what counts as "complete", he actually published the sequence first. I've seen estimates of the Human Genome Project costing USD 2.7B, but now you can get a full sequence done for under USD 1,000.
So, yeah. If we had today's technology then, stuff would have completed faster. Something short (~30,000 bases) like Covid doesn't take them long anymore - they had the full sequence last January, and they know exactly where the variants differ as well.
Ancestry, 23andMe, and the others aren't doing full sequencing though. They're doing a partial method (as you guessed) that involves trying to find specific differences. (They chop up the DNA at specific sequence types, then separate out the remaining chains to find differences).
TTFN - Kent
|
|
|
|
|
trønderen wrote: Nowadays, DNA sequencing seems to be as simple as taking a breath. "Everyone" seems to do it in a snap. Covid-19 RNA (it is RNA, isn't it? or is it DNA?) is obviously orders of magnitude smaller than human DNA, but even human DNA sequencing seems to be a piece of cake today: to determine your ancestry, or your disposition to various diseases, or in criminal cases, or for a whole lot of other purposes.
Google "RNA vs DNA".
trønderen wrote: Most of this change has come in less than twenty years. What happened? Computers haven't become that much faster! (I got the impression that they are essential as a tool.) Is the speedup in other, non-computer analysis hardware? Or have the scientists developed a completely new methodology that is a magnitude or two faster? Or are those companies offering info about ancestry or disease risk doing only a quick, partial analysis rather than a full sequencing?
Computers did get faster, and so did our knowledge of biology.
trønderen wrote: To phrase it differently: If, twenty years ago, we had had all the knowledge about methodologies that we have today, would the hardware of the day have been able to sequence the human genome in hours rather than months, or are today's methods fully dependent on recent hardware development? (I assume that computer hardware development is only part of it!)
Twenty years ago is 2001. I'm from 1977. We'd have dealt with it another way: stricter lockdowns, quarantines for entire cities.
Let me paint you a (realistic) picture: I'll be vaccinated this month, but the effectiveness of the vaccine is 70%. Nothing like measles. I won't be allowed outside during lockdown, vaccinated or not.
For every human it infects, it makes millions of copies (and evolves). If it infects someone vaccinated who falls in that 30%, it may become resistant to the vaccine. So even after my shot, I still need to stay indoors.
I am old enough to remember when every new vaccine was celebrated. We have seen the effects of polio. Suddenly, half of our country refuses vaccines without any good reason. This "minor disease", as it's called here, may turn hostile soon if we don't cut down the number of people it infects.
We need to stop the spread. We need to stop the idiots.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Eddy Vluggen wrote: We need to stop idiots.
That's what people from both sides of the issue say.

|
|
|
|
|
Imagine the same with polio. Look up what the disease did, and how vaccines killed it.
There's no discussion here.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I say we wear masks forever, and stay in lockdown forever.
|
|
|
|
|