Detecting faces in an image with the power of mobile machine learning

Photo by Kyle Glenn on Unsplash

Creating accurate machine learning models capable of identifying multiple faces (or other target objects) in a single image remains a core challenge in computer vision, but now with the advancement of deep learning and computer vision models for mobile, you can much more easily detect faces in an image.

In this article, we’ll do just that on Android with the help of Google ML Kit’s Face Detection API.

What Is Face Detection?

Face detection is a computer vision task that lets you detect faces in an image; it also detects the various parts of the face, known as landmarks. …
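
Before diving in, here is a minimal Kotlin sketch of what the API usage looks like, assuming the ML Kit face detection dependency (com.google.mlkit:face-detection) is already in the project and `bitmap` holds the image to analyze:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

fun detectFaces(bitmap: Bitmap) {
    // Ask the detector for landmarks (eyes, ears, nose, mouth) as well as faces.
    val options = FaceDetectorOptions.Builder()
        .setLandmarkMode(FaceDetectorOptions.LANDMARK_MODE_ALL)
        .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_ACCURATE)
        .build()

    val detector = FaceDetection.getClient(options)
    val image = InputImage.fromBitmap(bitmap, 0) // 0 = image rotation in degrees

    detector.process(image)
        .addOnSuccessListener { faces ->
            // Each detected face exposes a bounding box and the landmarks found on it.
            faces.forEach { face ->
                println("Face at ${face.boundingBox}, landmarks: ${face.allLandmarks.size}")
            }
        }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```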


Fix bugs up to 50x faster


Introduction

Shake is a bug reporting SDK for iOS and Android apps. It’s a powerful tool that easily compares to other bug reporting tools like Instabug or Crashlytics. In this article, I’ll explore some of the main features of the Shake SDK, its benefits, and how you can integrate it into your Android project.

With the Shake SDK added to your app, your testers can report any bug in under 5 seconds. It captures all the extra info you need as a developer and sends it to the Shake web dashboard.

Shake SDK automatically captures the following information:

  • Tester’s device model


Identify the language of a text with the power of mobile machine learning

Photo by Hannah Wright on Unsplash

Nowadays, language detection (especially with machine learning) is everywhere, and the mobile apps that use it serve users in every part of the world, speaking many different languages. Language identification can help you understand your users’ languages and personalize your app for them.

Language detection is essentially a technique that automatically identifies the language of a given text, be it English, Chinese, or many others. We can use machine learning for this kind of identification, and we can even do it inside mobile apps!

In this article, we’ll do just that, with…
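
To make that concrete, here is a minimal Kotlin sketch using ML Kit’s on-device language identification client, assuming the com.google.mlkit:language-id dependency is already added:

```kotlin
import com.google.mlkit.nl.languageid.LanguageIdentification

fun identifyLanguage(text: String) {
    val languageIdentifier = LanguageIdentification.getClient()

    languageIdentifier.identifyLanguage(text)
        .addOnSuccessListener { languageCode ->
            if (languageCode == "und") {
                // "und" means the language could not be determined.
                println("Couldn't identify the language")
            } else {
                println("Language: $languageCode") // e.g. "en", "zh", "fr"
            }
        }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```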


Photo by Ricky Kharawala on Unsplash

Firebase Crashlytics is a real-time crash reporting tool. It helps by automatically collecting, analyzing, and organizing your crash reports.

It also helps you understand which issues are most important, so you can prioritize those first and keep your users happy. It combines a real-time dashboard and alerts to keep you aware of your newest issues and any sudden changes in stability.

Create a Firebase Project and Add to Android

The first step we need to take is to create a Firebase project and add it to our Android app. At this point, I’ve already created a Firebase project, added it to a new Android project, and also included the Google…
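
Once the project is connected and the Crashlytics SDK and Gradle plugin are applied, calling it from Kotlin is straightforward. Here is a small sketch; the function and its parameters are just illustrative:

```kotlin
import com.google.firebase.crashlytics.FirebaseCrashlytics

fun reportCaughtError(e: Exception, userId: String) {
    val crashlytics = FirebaseCrashlytics.getInstance()

    // Breadcrumb log lines are attached to the next crash or non-fatal report.
    crashlytics.log("Something went wrong while syncing")

    // Custom keys help you slice and filter issues in the dashboard.
    crashlytics.setCustomKey("user_id", userId)

    // Record a non-fatal exception so it shows up in the Crashlytics console.
    crashlytics.recordException(e)
}
```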


Photo by Shahadat Rahman on Unsplash

In the world of computer science, there are many programming languages, and no single language is superior to another. In other words, each language is best suited to solve certain problems, and in fact, there is often no one best language to choose for a given programming project. For this reason, it is important for students who wish to develop software or to solve interesting problems through code to have strong computer science fundamentals that will apply across any programming language.

Note: This article originally appeared on junilearning.com

Programming languages tend to share certain characteristics in how they function, for…


Identify and locate objects in images with the power of mobile machine learning


Creating accurate machine learning models capable of identifying multiple objects in a single image remains a core challenge in computer vision, but now with the advancement of deep learning and computer vision models for mobile, you can easily detect target objects from an image on Android.

In this article, we’ll do just that with the help of Firebase ML Kit’s Object Detection API.

What is Object Detection?

Object detection is a computer vision technique for identifying and locating instances of objects within images or videos. It provides a much better understanding of an image as a whole when compared to visual recognition (i.e. …
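
As a rough sketch of the call flow, here is the standalone ML Kit object detection client in Kotlin (the article uses the Firebase ML Kit flavor, whose class names differ slightly); `bitmap` is assumed to be the input image and the object-detection dependency is assumed to be added:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.objects.ObjectDetection
import com.google.mlkit.vision.objects.defaults.ObjectDetectorOptions

fun detectObjects(bitmap: Bitmap) {
    // Single-image mode, detecting multiple objects and classifying each one.
    val options = ObjectDetectorOptions.Builder()
        .setDetectorMode(ObjectDetectorOptions.SINGLE_IMAGE_MODE)
        .enableMultipleObjects()
        .enableClassification()
        .build()

    val detector = ObjectDetection.getClient(options)
    val image = InputImage.fromBitmap(bitmap, 0)

    detector.process(image)
        .addOnSuccessListener { objects ->
            objects.forEach { obj ->
                val label = obj.labels.firstOrNull()?.text ?: "Unknown"
                println("$label at ${obj.boundingBox}")
            }
        }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```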


Customize toolbar font easily

Photo by Alexander Andrews on Unsplash

Nowadays, Android applications often use custom fonts throughout the app: a custom font in detail views, a custom font in the toolbar, a custom font just about everywhere. This makes the UI more polished and appealing.

In this article, we will learn how to customize the toolbar font, specifically how to change the Toolbar’s font family with the help of a theme.

Originally published at https://danishamjad.com on June 2, 2020.

Change Font for TextView

Before jumping straight to the custom toolbar font, let’s first see how we can easily change the font family for a TextView in Android.

We just need to add the font-family
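
The article’s approach is theme- and XML-based; purely as a hedged alternative sketch, here is how you could apply a bundled font to a TextView programmatically in Kotlin (R.font.app_font is a hypothetical font resource under res/font):

```kotlin
import android.widget.TextView
import androidx.core.content.res.ResourcesCompat

fun applyCustomFont(textView: TextView) {
    // Load a font placed under res/font (hypothetical resource name).
    ResourcesCompat.getFont(textView.context, R.font.app_font)?.let { typeface ->
        textView.typeface = typeface
    }
}
```

For the Toolbar title itself, the theme-based approach described in the article is usually the cleaner route, since the title TextView isn’t exposed directly.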


3x your build speed by following these simple best practices

Photo by Kelly Sikkema on Unsplash

Slow build speed is a huge momentum buster. It’s kind of like driving down the road and constantly hitting speed bump after speed bump. Build speed is critical to your productivity, and over the past few years, the Android team has consistently focused on improving build speed performance.

After implementing the tips in this article, we’ll be able to speed up development build times by 3x, 4x, and sometimes even up to 10x, all by simply applying a set of best practices.

To help contextualize and show the impact of these tips, I’m going to use Google’s Santa Tracker Android project as…
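
As one illustrative (and hedged) example of this kind of best practice, a module-level build.gradle.kts can switch off Android build features the project doesn’t use, so their Gradle tasks never run. Everything below (feature names, SDK levels, package name) is an assumption, not taken from the article:

```kotlin
// Module-level build.gradle.kts (sketch; assumes a recent Android Gradle Plugin, adjust versions to your setup)
plugins {
    id("com.android.application")
    kotlin("android")
}

android {
    namespace = "com.example.app" // hypothetical package
    compileSdk = 34

    defaultConfig {
        minSdk = 24
    }

    buildFeatures {
        // Disable build features this module doesn't use so their tasks are skipped.
        aidl = false
        renderScript = false
        shaders = false
    }
}
```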


Photo by Miss Zhang on Unsplash

Nowadays, people check their watches for important notifications as much as they do the time. Wearable devices, like Android-based watches, are gaining more momentum as Google continues to improve its Android Wear platform for developers.

A couple of months ago, I was working on a simple application called Location-Based Tasks that uses geofencing. If you don’t know what geofencing is, check out the link below:

https://developer.android.com/training/location/geofencing

In the application I was building, I needed to notify users about tasks they’d added, which were connected to particular locations. I also needed to send those notifications to Android Wear. …
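
Helpfully, notifications built with NotificationCompat are bridged to a paired Wear device automatically. Here is a minimal Kotlin sketch; the channel id, icon, and notification id are placeholders, the channel must be created separately on API 26+, and POST_NOTIFICATIONS must be granted on Android 13+:

```kotlin
import android.content.Context
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat

fun notifyTaskNearby(context: Context, taskName: String) {
    // "tasks" is a hypothetical notification channel id created elsewhere.
    val notification = NotificationCompat.Builder(context, "tasks")
        .setSmallIcon(android.R.drawable.ic_dialog_info)
        .setContentTitle("Task nearby")
        .setContentText("You're close to: $taskName")
        .setPriority(NotificationCompat.PRIORITY_DEFAULT)
        .build()

    // Notifications posted this way also appear on a paired Android Wear device.
    NotificationManagerCompat.from(context).notify(1001, notification)
}
```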


Debug your application code more efficiently

Photo by Suzy Brooks on Unsplash

In this article, we’ll explore a range of debugging techniques that can be used while developing in Android Studio. As developers, we all know that there are days where you spend way more time in your debugger than you do in your code editor.

If you don’t know what debugging actually is, check out the link below:

In this post, we’ll cover some of the essential tips and tricks that will be easy to incorporate into your debugging flow—and will hopefully make it more efficient.

Here are some of the tips we’ll cover:

  • Log filtering
  • Folding…
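
As a tiny taste of the log-filtering tip, giving related log statements a consistent tag makes them easy to isolate in Logcat. A quick sketch (the tag name is just an example):

```kotlin
import android.util.Log

private const val TAG = "CheckoutFlow" // example tag to filter on in Logcat

fun logCheckoutStep(step: String) {
    // In Logcat, filter by this tag to see only these messages.
    Log.d(TAG, "Reached step: $step")
}
```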

Danish Amjad

Senior Software Engineer (Android). Open Source Contributor, Technical Writer. Email: Dani.amjad12@gmail.com. Check out my website at https://danishamjad.com/
