Inference on the Edge
Much has been said about AI and the cloud, but the latest buzz is all about doing machine learning on the edge, i.e. on a small, inexpensive device with limited computing power, like a cellphone or an embedded board such as the Raspberry Pi. Running machine learning models on such small devices is called Inference on the Edge, or simply Edge AI. Machine learning models are becoming smaller and edge devices are getting faster, so we can finally build useful machine learning apps on these devices.

We spend much of our time building small machine learning models that are robust yet can run on our customers' phones. With the marriage of capable edge devices and smaller, faster machine learning models, what we can do is constrained only by our imagination. Of course, we also build AI solutions that need the computing power of the cloud when a customer's requirements warrant it: we have expertise in building solutions on Google Cloud, Amazon AWS, Microsoft Azure, and IBM Watson. Please contact us if this sounds interesting.
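To make the "smaller models" idea concrete, here is a minimal sketch of post-training quantization, one common technique for shrinking a model enough to fit on an edge device. It stores float32 weights as 8-bit integers with a shared scale, cutting storage roughly 4x at the cost of a small rounding error. The `quantize` and `dequantize` helpers are illustrative names for this sketch, not part of any specific library:

```python
# Illustrative sketch of post-training quantization: map float weights
# to signed 8-bit integers sharing one scale factor, then recover
# approximate floats. Real toolchains (e.g. TensorFlow Lite) do this
# per-layer with calibration data; this just shows the core idea.

def quantize(weights, num_bits=8):
    """Map float weights to signed integers with a shared scale."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the quantized values."""
    return [v * scale for v in q]

weights = [0.82, -1.27, 0.05, 0.33, -0.61]
q, scale = quantize(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q)        # small integers, storable in 1 byte each
print(max_err)  # rounding error bounded by about scale / 2
```

Each weight now fits in one byte instead of four, which is why quantized models both download faster and run faster on the limited memory bandwidth of a phone or Raspberry Pi.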
Interested? Let's take it forward
Fill out the form below and our experts will get in touch for free consulting!