
Bottoms-up Approach to Building Extremely Small Models
Speaker: Blair Newman, Chief Technology Officer @ Neuton

Neural networks created today contain ever more coefficients and neurons and require ever-increasing processing power. Most neural networks are built on a predetermined architecture: during training, only the neuron parameters are optimized, while the architecture itself remains fixed. This is the main cause of unnecessary growth in network size.
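For contrast, the conventional fixed-architecture approach described above can be sketched as follows. This is a minimal, hypothetical illustration (not Neuton's method): the layer sizes are chosen up front and never change, and gradient descent adjusts only the weight values.

```python
# Minimal sketch of fixed-architecture training (illustrative only):
# the 2 -> 8 -> 1 topology is decided before training and never changes;
# optimization touches only the weight values.
import numpy as np

rng = np.random.default_rng(0)

# Architecture fixed up front: 2 inputs, 8 hidden units, 1 output.
W1 = rng.normal(0, 1, (2, 8))
W2 = rng.normal(0, 1, (8, 1))

# Tiny toy dataset (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def forward(X, W1, W2):
    h = np.tanh(X @ W1)      # hidden activations
    return h, h @ W2         # network output

losses = []
lr = 0.1
for _ in range(500):
    h, out = forward(X, W1, W2)
    err = out - y
    losses.append(float((err ** 2).mean()))
    # Backpropagation updates weight values only; layer sizes stay fixed,
    # so the model cannot shrink (or grow) no matter what the data needs.
    gW2 = h.T @ err / len(X)
    gh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ gh / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2
```

The loss falls while the parameter count stays constant, which is exactly the limitation the talk addresses: the architecture itself is never part of the optimization.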

During this tech talk, we will show you how Neuton takes a completely different approach, resulting in models that are 1000x smaller and 1000x faster than those built with other frameworks such as TensorFlow Lite and PyTorch.

This talk is part of the bi-weekly AI Virtual Tech Talk Series: https://developer.arm.com/solutions/machine-learning-on-arm/ai-virtual-tech-talks

Arm will process your information in accordance with our Privacy Policy:
https://www.arm.com/company/policies/privacy.

Sep 7, 2021 04:00 PM in London

The webinar is over; registration is closed. If you have any questions, please contact the webinar host: Tobias McBride.