Re: Neural Networks (MNIST inference)

    From D. Ray@21:1/5 to George Neuner on Mon Oct 28 15:42:42 2024
    George Neuner <gneuner2@comcast.net> wrote:

    Depends on whether you mean

    Perhaps you misunderstood me. I’m not the author; I just posted the beginning
    of a blog post and provided the link to the rest of it because it seemed interesting. The reason I didn’t post the whole thing is that there are quite a few illustrations.

    Blog post ends with:

    “It is indeed possible to implement MNIST inference with good accuracy
    using one of the cheapest and simplest microcontrollers on the market. A
    lot of memory footprint and processing overhead is usually spent on implementing flexible inference engines that can accommodate a wide range
    of operators and model structures. Cutting this overhead away and reducing
    the functionality to its core allows for astonishing simplification at this very low end.

    This hack demonstrates that there truly is no fundamental lower limit to applying machine learning and edge inference. However, the feasibility of implementing useful applications at this level is somewhat doubtful.”
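    To make the “cutting this overhead away” point concrete, here is a rough
    sketch of my own (not code from the blog post; the sizes and weight values
    are placeholders): instead of a generic runtime that walks an operator
    graph, the whole model is baked into the firmware as const arrays, and
    inference is just a couple of fixed loops. A single quantized
    fully-connected layer over a downscaled 8x8 input would look roughly
    like this:

        #include <stdint.h>

        #define N_IN   64              /* 8x8 input image, flattened   */
        #define N_OUT  10              /* one score per digit class    */

        /* Placeholder weights/biases; a real build would paste in the
           trained, quantized values here so they live in flash. */
        static const int8_t  weights[N_OUT][N_IN] = { {0} };
        static const int32_t biases[N_OUT]        = { 0 };

        /* Returns the index of the highest-scoring class (0-9). */
        int classify(const int8_t img[N_IN])
        {
            int32_t best_score = INT32_MIN;
            int     best_class = 0;

            for (int c = 0; c < N_OUT; c++) {
                int32_t acc = biases[c];
                for (int i = 0; i < N_IN; i++)
                    acc += (int32_t)weights[c][i] * img[i];
                if (acc > best_score) {
                    best_score = acc;
                    best_class = c;
                }
            }
            return best_class;
        }

    Everything declared const sits in flash, the only RAM needed is a handful
    of 32-bit accumulators, and there is no operator dispatch at all, which
    is presumably where the simplification on such a tiny part comes from.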

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)