Discussion:
Neural Networks (MNIST inference) on the “3-cent” Microcontroller
D. Ray
2024-10-21 20:06:28 UTC
Buoyed by the surprisingly good performance of neural networks with
quantization-aware training on the CH32V003, I wondered how far this can be
pushed. How much can we compress a neural network while still achieving
good test accuracy on the MNIST dataset? When it comes to absolutely
low-end microcontrollers, there is hardly a more compelling target than the
Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
simplest and lowest-cost applications there are. The smallest device of the
portfolio, the PMS150C, sports 1024 words of 13-bit one-time-programmable
memory and 64 bytes of RAM, more than an order of magnitude less than
the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
architecture, as opposed to the much more powerful RISC-V instruction set.

Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?
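For readers who don't follow the link: once the flexible inference engine is stripped away, what remains on such a part is essentially a weight table plus a couple of integer loops. The sketch below is a hypothetical illustration, not the blog's actual code — the layer sizes, names, and the 8-bit weight format are my assumptions (the post itself explores much more aggressive quantization).

```c
/* Hypothetical sketch, not the blog's actual code: the core of a
 * stripped-down quantized inference engine.  Layer sizes and the 8-bit
 * weight format are illustrative assumptions.  Everything is integer
 * arithmetic, which suits an accumulator-based 8-bit MCU. */
#include <stdint.h>

#define N_IN  64   /* e.g. an 8x8 downscaled MNIST digit */
#define N_OUT 10   /* one score per digit class */

/* One fully-connected layer: 8-bit inputs times 8-bit weights,
 * accumulated in a wider integer so nothing overflows. */
void fc_layer(const uint8_t in[N_IN],
              const int8_t weights[N_OUT][N_IN],
              int32_t out[N_OUT])
{
    for (uint8_t o = 0; o < N_OUT; o++) {
        int32_t acc = 0;
        for (uint8_t i = 0; i < N_IN; i++)
            acc += (int32_t)weights[o][i] * in[i];
        out[o] = acc;
    }
}

/* The classification result is simply the index of the largest score. */
uint8_t predict(const int32_t scores[N_OUT])
{
    uint8_t best = 0;
    for (uint8_t o = 1; o < N_OUT; o++)
        if (scores[o] > scores[best]) best = o;
    return best;
}
```

On the real part the weight table would live in the OTP program memory and would presumably be quantized well below 8 bits to fit; the point here is only the shape of the computation.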

<https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>

<https://archive.md/DzqzL>
Don Y
2024-10-21 22:09:10 UTC
Post by D. Ray
Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?
Wouldn't it be smarter to come up with an approach that *can*
rather than trying to force some approach to "fit"?
George Neuner
2024-10-22 19:39:42 UTC
Post by D. Ray
Buoyed by the surprisingly good performance of neural networks with
quantization-aware training on the CH32V003, I wondered how far this can be
pushed. How much can we compress a neural network while still achieving
good test accuracy on the MNIST dataset? When it comes to absolutely
low-end microcontrollers, there is hardly a more compelling target than the
Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
simplest and lowest-cost applications there are. The smallest device of the
portfolio, the PMS150C, sports 1024 words of 13-bit one-time-programmable
memory and 64 bytes of RAM, more than an order of magnitude less than
the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
architecture, as opposed to the much more powerful RISC-V instruction set.
Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?
<https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
<https://archive.md/DzqzL>
Depends on whether you mean implementing /their/ recognizer, or just
implementing a recognizer that could be trained using their data set.

Any 8-bitter can easily handle the computations ... FP is not required
- fixed point fractions will do fine. The issue is how much memory is
needed and what your target chip brings to the party.
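To make the fixed-point remark concrete, here is a small illustration of my own (not from the thread): treat an int8_t as a Q0.7 fraction, where the stored value v represents v/128, covering [-1, 1). Multiplying two such fractions needs only an 8x8 -> 16-bit integer multiply and a shift.

```c
/* Q0.7 fixed-point "fractions": an int8_t value v represents v / 128,
 * covering [-1, 1).  Multiplication needs only an 8x8 -> 16-bit integer
 * multiply and a shift -- well within reach of any 8-bit accumulator
 * machine, so floating point is indeed unnecessary. */
#include <stdint.h>

#define Q7(x) ((int8_t)((x) * 128))   /* build a constant; |x| must be < 1 */

int8_t q7_mul(int8_t a, int8_t b)
{
    int16_t p = (int16_t)a * (int16_t)b;  /* exact product in Q0.14 */
    return (int8_t)(p >> 7);              /* rescale back to Q0.7 */
}
```

For example, q7_mul(Q7(0.5), Q7(0.5)) yields Q7(0.25), i.e. 32. With weights and activations in such a format, every intermediate fits in a 16-bit accumulator, which is exactly the memory-versus-capability trade-off being discussed.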
D. Ray
2024-10-28 15:42:42 UTC
Post by George Neuner
Depends on whether you mean
Perhaps you misunderstood me. I’m not the author; I just posted the
beginning of a blog post and provided a link to the rest of it because it
seemed interesting. The reason I didn’t post the whole thing is that there
are quite a few illustrations.

Blog post ends with:

“It is indeed possible to implement MNIST inference with good accuracy
using one of the cheapest and simplest microcontrollers on the market. A
lot of memory footprint and processing overhead is usually spent on
implementing flexible inference engines that can accommodate a wide range
of operators and model structures. Cutting this overhead away and reducing
the functionality to its core allows for astonishing simplification at this
very low end.

This hack demonstrates that there truly is no fundamental lower limit to
applying machine learning and edge inference. However, the feasibility of
implementing useful applications at this level is somewhat doubtful.”
David Brown
2024-10-28 16:50:12 UTC
Post by D. Ray
Post by George Neuner
Depends on whether you mean
Perhaps you misunderstood me. I’m not the author; I just posted the
beginning of a blog post and provided a link to the rest of it because it
seemed interesting. The reason I didn’t post the whole thing is that there
are quite a few illustrations.
“It is indeed possible to implement MNIST inference with good accuracy
using one of the cheapest and simplest microcontrollers on the market. A
lot of memory footprint and processing overhead is usually spent on
implementing flexible inference engines that can accommodate a wide range
of operators and model structures. Cutting this overhead away and reducing
the functionality to its core allows for astonishing simplification at this
very low end.
This hack demonstrates that there truly is no fundamental lower limit to
applying machine learning and edge inference. However, the feasibility of
implementing useful applications at this level is somewhat doubtful.”
It's fine to quote from a blog post or other such sources, as long as
you make it clear that that is what you are doing (and you are not
quoting so much that it becomes copyright infringement). Your first post
in this thread was formatted in a way that made it look as though the
words were your own, written for the Usenet post - but apparently that
was not the case. Remember, no one reading Usenet is going to click on
random links in a post - we need a very good reason to do so. So please,
next time write some introductory or explanatory text yourself and make
the whole thing clearer.

I think it is quite cool to hear that it is possible to do something
like this on these 3-cent microcontrollers, but I would not expect
anyone to use them in practice.

olcott
2024-10-27 01:43:01 UTC
Post by D. Ray
Buoyed by the surprisingly good performance of neural networks with
quantization-aware training on the CH32V003, I wondered how far this can be
pushed. How much can we compress a neural network while still achieving
good test accuracy on the MNIST dataset? When it comes to absolutely
low-end microcontrollers, there is hardly a more compelling target than the
Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
simplest and lowest-cost applications there are. The smallest device of the
portfolio, the PMS150C, sports 1024 words of 13-bit one-time-programmable
memory and 64 bytes of RAM, more than an order of magnitude less than
the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
architecture, as opposed to the much more powerful RISC-V instruction set.
Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?


<https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
<https://archive.md/DzqzL>
test to see if this posts or I should dump this paid provider.
--
Copyright 2024 Olcott

"Talent hits a target no one else can hit;
Genius hits a target no one else can see."
Arthur Schopenhauer
George Neuner
2024-10-27 20:41:31 UTC
Post by olcott
test to see if this posts or I should dump this paid provider.
Eternal September is a good, no cost Usenet provider.

http://www.eternal-september.org/
D. Ray
2024-10-28 15:42:41 UTC
Post by olcott
Post by D. Ray
Buoyed by the surprisingly good performance of neural networks with
quantization-aware training on the CH32V003, I wondered how far this can be
pushed. How much can we compress a neural network while still achieving
good test accuracy on the MNIST dataset? When it comes to absolutely
low-end microcontrollers, there is hardly a more compelling target than the
Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
simplest and lowest-cost applications there are. The smallest device of the
portfolio, the PMS150C, sports 1024 words of 13-bit one-time-programmable
memory and 64 bytes of RAM, more than an order of magnitude less than
the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
architecture, as opposed to the much more powerful RISC-V instruction set.
Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?


<https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
<https://archive.md/DzqzL>
test to see if this posts or I should dump this paid provider.
It worked.