Abstract:
"
Smartphones are becoming more complex, sophisticated, and powerful than ever before.
Similarly, mobile applications are becoming more advanced in order to offer users a
better, more personalized experience. Machine learning and augmented reality are
among the cutting-edge technologies now popular in mobile applications. On-device AI,
a technique that runs machine learning tasks directly on the device to provide better
privacy and higher reliability while minimizing latency, is another trending topic.
However, this transformation is making mobile applications more power-hungry, which is
a pressing problem, especially for low-end smartphones. Since mobile devices are often
resource-constrained, running complex applications on them is difficult. The same
problem applies to applications based on on-device inference.
This research addresses the above problem by proposing an offloading mechanism that
lets mobile devices delegate their on-device machine learning inference tasks. The
research focused on developing a library that can be integrated into new or existing
on-device-AI-based Android applications, allowing these applications to offload their
machine learning tasks to nearby resource-rich devices. Further, the system is designed
to provide an automated decision-making process that decides when to offload based on
different factors."
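
The abstract does not specify the library's interface, but the described architecture
(a local backend, a remote backend on a nearby device, and an automated offloading
decision) suggests a shape like the following minimal Kotlin sketch. All names
(InferenceBackend, OffloadDecisionMaker, InferenceOffloader) and the example decision
factors are hypothetical assumptions, not the library's actual API.

// Hypothetical sketch only; all names and thresholds are illustrative assumptions.

interface InferenceBackend {
    fun runInference(input: ByteArray): ByteArray
}

// Decides between local execution and offloading from simple, measurable factors.
class OffloadDecisionMaker(
    private val batteryLevel: () -> Int,      // 0..100, from the platform battery API
    private val networkLatencyMs: () -> Long  // measured RTT to the nearby device
) {
    fun shouldOffload(): Boolean =
        batteryLevel() < 30 || networkLatencyMs() < 50
}

class InferenceOffloader(
    private val local: InferenceBackend,   // e.g., a wrapper around an on-device model
    private val remote: InferenceBackend,  // proxy to a nearby resource-rich device
    private val decider: OffloadDecisionMaker
) : InferenceBackend {
    // Transparently routes each request; falls back to local execution on failure,
    // which preserves the reliability benefit of on-device inference.
    override fun runInference(input: ByteArray): ByteArray =
        if (decider.shouldOffload()) {
            try {
                remote.runInference(input)
            } catch (e: Exception) {
                local.runInference(input)
            }
        } else {
            local.runInference(input)
        }
}

An application integrating such a library would keep calling a single
InferenceBackend, so the offload/local choice stays invisible to application code.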