Trying to get an Edge Impulse Vision model onto the web.

First as a WASM

This is the normal way I would show an Edge Impulse tinyML model, but on the web, using my version of an Edge Impulse web viewer. Note: it also works with sounds. Kind of cool
  • index-wasm.html
    The example detects heart-shaped rocks, rather badly. How well or badly it works is not the point.
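
    Roughly what the WASM route looks like (a sketch only, not the exact code in
    index-wasm.html): the Edge Impulse WebAssembly export ships an Emscripten glue
    script plus a .wasm file, and the classifier is handed a flat array of features.
    The edge-impulse-standalone.js file name and the Module.run_classifier /
    _malloc / HEAPF32 names below are assumptions based on the Edge Impulse browser
    example and may not match the current export exactly.

       <script>
         // Emscripten reads this global before the glue script below runs.
         var Module = {
           onRuntimeInitialized: () => {
             // Copy raw features (e.g. flattened pixel values) into the WASM heap,
             // run the classifier, then free the heap memory again.
             window.eiClassify = (features) => {
               const ptr = Module._malloc(features.length * 4);
               Module.HEAPF32.set(new Float32Array(features), ptr / 4);
               const result = Module.run_classifier(ptr, features.length, false);
               Module._free(ptr);
               return result; // per-label scores, e.g. "heart Rock" vs "unknown"
             };
           }
         };
       </script>
       <!-- Edge Impulse WebAssembly deployment files assumed to sit next to this page -->
       <script src="edge-impulse-standalone.js"></script>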

    Then with TensorFlow.js

    Converting the saved TensorFlow model to a TensorFlow.js model here
    Main work-in-progress GitHub repo here
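
    A conversion like this is typically done with the tensorflowjs_converter CLI
    (from the tensorflowjs pip package), and the demo pages then load the resulting
    forweb/model.json in the browser. A minimal loading sketch, assuming the
    converter produced a graph model and the impulse uses 96x96 RGB input (swap in
    tf.loadLayersModel and your own input size if not):

       <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
       <script>
       async function loadModel() {
         // Graph-model output from the converter; a layers export would use
         // tf.loadLayersModel('forweb/model.json') instead.
         const model = await tf.loadGraphModel('forweb/model.json');

         // Warm up once so the first real prediction is not slow.
         const warmup = model.predict(tf.zeros([1, 96, 96, 3]));
         warmup.dispose();
         return model;
       }
       </script>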
    Best so far
  • forweb/index.html Final should go here
  • forweb/index-cool.html Just working on this one
    Best working backup
  • forweb/index-last-working.html
    Best simplifications
  • forweb/index-edge-impulse-to-tfjs-basic-graph-demo.html
  • forweb/index-edge-impulse-to-tfjs-basic-layers-demo.html

    Raw data test
  • forweb/index-raw-test.html The results should be: unknown 0%, heart Rock 100%
    Crashing after about 20 seconds and ~10,000 un-disposed tensors.
    This issue was fixed by wrapping the per-frame tensor work in an engine scope
    (a fuller sketch follows the backup item below):

       tf.engine().startScope()

           // Tensor stuff

       tf.engine().endScope()


  • forweb/backups/index-crashing.html
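    For reference, the same leak can also be avoided with tf.tidy(), which disposes
    every intermediate tensor created inside its callback. A sketch of a per-frame
    classify call (the 96x96 input size and divide-by-255 preprocessing are
    assumptions; match whatever the impulse actually expects):

       function classifyFrame(model, videoEl) {
         // Everything created inside tf.tidy is disposed when the callback returns,
         // so tf.memory().numTensors stays flat instead of climbing toward ~10,000.
         return tf.tidy(() => {
           const x = tf.image.resizeBilinear(tf.browser.fromPixels(videoEl).toFloat(), [96, 96])
             .div(255)        // assumed 0..1 scaling
             .expandDims(0);  // add batch dimension
           return model.predict(x); // the returned tensor survives the tidy
         });
       }
       // Note: the caller is still responsible for disposing the returned prediction.

       // Quick leak check while debugging:
       // console.log('tensors:', tf.memory().numTensors);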
    Research possible code to adapt
  • forweb/model.json The current model.json, converted from a saved TensorFlow model exported from Edge Impulse
  • index-just-load.html Just load any Layers or Graph vision saved TFJS model.json (see the loader sketch after this list)
  • forweb/index-squeezenet2.html Might be the best starting point, but the webcam still needs to work on Apple phones (see the note after this list)
  • index-squeezenet.html
  • index-mobilenet.html
  • index-mnist.html
  • index-from-image-mobilenet.html
  • index-codelab.html
  • index-blaize.html
  • index-big-image.html
  • index-basic.html
  • Index-touch.html
  • index-load.html Try to just load the model and test it against an image
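
    For index-just-load.html and index-load.html, a loader that copes with either
    format and tests against a still image could look like the sketch below; the
    try-graph-then-fall-back-to-layers order and the 'testImage' element id are
    assumptions. For the Apple-phone webcam problem above, the usual fix is to give
    the <video> element the playsinline, autoplay and muted attributes and serve the
    page over HTTPS, otherwise iOS Safari will not show the getUserMedia stream inline.

       async function loadAnyModel(url) {
         // model.json from the converter can describe either a Graph or a Layers
         // model; try the graph loader first and fall back to the layers loader.
         try {
           return await tf.loadGraphModel(url);
         } catch (e) {
           return await tf.loadLayersModel(url);
         }
       }

       async function testOnImage(model) {
         const img = document.getElementById('testImage'); // hypothetical <img> id
         const scores = tf.tidy(() => {
           const x = tf.image.resizeBilinear(tf.browser.fromPixels(img).toFloat(), [96, 96])
             .div(255)
             .expandDims(0);
           return model.predict(x);
         });
         console.log(await scores.data()); // e.g. [unknown, heart Rock] probabilities
         scores.dispose();
       }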