Face Mask Detection App In Flutter With TensorFlow Lite

The COVID-19 crisis has drastically affected our lives. The WHO has issued many guidelines to prevent the spread of the virus, which is highly dangerous and infectious. In today’s scenario, wearing a mask in public places has become mandatory, and in many countries people are even fined for not wearing one. So it is our moral duty to wear a mask in public places, yet some people still do not follow the guidelines, and no system can manually identify everyone who is not wearing a mask. It has therefore become important to build a tool that identifies whether a person is wearing a mask or not. With the help of Machine Learning and Deep Learning, we can easily build a model, train it on a dataset, and use it to help prevent the spread of the COVID-19 virus.

In this blog, we will learn how to build a Face Mask Detection app in Flutter using the tflite package to identify whether a person is wearing a mask or not.


Table of contents:

Install Packages

Configure Project

Download training dataset & train our model

Initializing Camera

Load Model

Run Model

Camera Preview

Full Code


Demo Module:

Install Packages:

To build this app we will need two packages:

camera | Flutter Package
A Flutter plugin for iOS and Android that allows access to the device cameras. (pub.dev)

tflite | Flutter Package
A Flutter plugin for accessing the TensorFlow Lite API. Supports image classification and object detection (SSD and YOLO). (pub.dev)

  • The camera package is used to get the streaming image buffers.
  • The tflite package is used to run our trained model.
  • Add both packages under dependencies in your project’s pubspec.yaml and run flutter pub get to install them.

Configure Project:

  • For Android

In android/app/build.gradle, add the following setting inside the android block:

aaptOptions {
    noCompress 'tflite'
    noCompress 'lite'
}
  • Change minSdkVersion to 21 in the defaultConfig block of the same file (the camera plugin requires Android API level 21 or higher).

Download training dataset & train our model:

  • To download the dataset, visit kaggle.com and search for “Face mask detection”.
  • Download the dataset.

COVID Face Mask Detection Dataset
This dataset contains about 1,006 equally distributed images of 2 distinct types. (www.kaggle.com)

  • Upload the images of masked people to the With Mask class and the unmasked images to the Without Mask class in an image-classification training tool such as Google’s Teachable Machine.
  • Then click on Train Model and do not change the default settings.
  • Export the model.
  • Click on TensorFlow Lite and download the model.
  • Click on OK.
  • Place the exported model.tflite and labels.txt files in the assets folder of your project, and declare the assets folder under the flutter: assets: section of pubspec.yaml.

Let’s start coding:

Initializing Camera:

Inside the main method, fetch the list of available cameras using availableCameras().

List<CameraDescription> cameras;

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  cameras = await availableCameras();
  runApp(MyApp());
}

The camera package provides support for live image streaming. First, create a CameraController object; its constructor takes two arguments, a CameraDescription and a ResolutionPreset. Initialize the cameraController and then start image streaming using the startImageStream method. startImageStream delivers the camera frames; we store each frame in cameraImage and then run our model on it.

CameraImage cameraImage;
CameraController cameraController;
String result = "";

initCamera() {
  cameraController = CameraController(cameras[0], ResolutionPreset.medium);
  cameraController.initialize().then((value) {
    if (!mounted) return;
    setState(() {
      cameraController.startImageStream((imageStream) {
        cameraImage = imageStream;
        runModel();
      });
    });
  });
}

Load Model:

Tflite provides the loadModel method to load our model. It takes two arguments: the model file path and the labels file path.

loadModel() async {
  await Tflite.loadModel(
      model: "assets/model.tflite", labels: "assets/labels.txt");
}
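
The plugin also accepts a few optional parameters that can be useful in practice. Below is a minimal sketch (not from the original article), assuming the tflite package version you are using exposes numThreads and isAsset; check the package documentation for the version in your pubspec.yaml.

loadModelTuned() async {
  // numThreads and isAsset are assumptions based on the tflite plugin's
  // documented loadModel signature; verify them for your package version.
  String res = await Tflite.loadModel(
    model: "assets/model.tflite",
    labels: "assets/labels.txt",
    numThreads: 2, // run inference on two threads
    isAsset: true, // the model is bundled in the assets folder
  );
  print(res); // prints "success" when the model loads correctly
}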

Run Model:

In this method, we will run the model using Tflite. Since we are working with a live image stream, we have to use the runModelOnFrame method to run the model on each frame.

runModel() async {
  if (cameraImage != null) {
    var recognitions = await Tflite.runModelOnFrame(
        bytesList: cameraImage.planes.map((plane) {
          return plane.bytes;
        }).toList(),
        imageHeight: cameraImage.height,
        imageWidth: cameraImage.width,
        imageMean: 127.5,
        imageStd: 127.5,
        rotation: 90,
        numResults: 2,
        threshold: 0.1,
        asynch: true);
    recognitions.forEach((element) {
      setState(() {
        result = element["label"];
        print(result);
      });
    });
  }
}

recognitions is a list of predictions; for each element, we take its label text and assign it to the result variable.
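
Because the loop assigns every element’s label, the last element always wins. Below is a minimal alternative sketch (not from the original article) that keeps only the most confident prediction; it assumes each recognition is a map with "label" and "confidence" keys, which is the output format the tflite plugin documents for image classification.

runModelTopResult() async {
  if (cameraImage == null) return;
  var recognitions = await Tflite.runModelOnFrame(
      bytesList: cameraImage.planes.map((plane) => plane.bytes).toList(),
      imageHeight: cameraImage.height,
      imageWidth: cameraImage.width,
      imageMean: 127.5,
      imageStd: 127.5,
      rotation: 90,
      numResults: 2,
      threshold: 0.1,
      asynch: true);
  if (recognitions != null && recognitions.isNotEmpty) {
    // Sort by confidence (highest first) and keep only the top label,
    // e.g. {"index": 0, "label": "With Mask", "confidence": 0.97}.
    recognitions.sort((a, b) =>
        (b["confidence"] as double).compareTo(a["confidence"] as double));
    setState(() {
      result = recognitions.first["label"];
    });
  }
}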

initState method:

@override
void initState() {
  super.initState();
  initCamera();
  loadModel();
}
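
The article does not show any cleanup, but it is good practice to stop the image stream and release the camera and the interpreter when the widget is removed. A minimal sketch, assuming the same fields as above:

@override
void dispose() {
  // Stop streaming frames and release the camera controller.
  cameraController?.stopImageStream();
  cameraController?.dispose();
  // Release the TensorFlow Lite interpreter.
  Tflite.close();
  super.dispose();
}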

Camera Preview:

The camera package provides the CameraPreview widget to preview the camera feed on the app screen; it takes the cameraController.

AspectRatio(
  aspectRatio: cameraController.value.aspectRatio,
  child: CameraPreview(cameraController),
),

Displaying the Result:

Text(
  result,
  style: TextStyle(fontWeight: FontWeight.bold, fontSize: 25),
)

Full Code:

import 'package:camera/camera.dart';
import 'package:flutter/material.dart';
import 'package:tflite/tflite.dart';

List<CameraDescription> cameras;

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  cameras = await availableCameras();
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      theme: ThemeData.dark(),
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatefulWidget {
  @override
  _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  CameraImage cameraImage;
  CameraController cameraController;
  String result = "";

  initCamera() {
    cameraController = CameraController(cameras[0], ResolutionPreset.medium);
    cameraController.initialize().then((value) {
      if (!mounted) return;
      setState(() {
        cameraController.startImageStream((imageStream) {
          cameraImage = imageStream;
          runModel();
        });
      });
    });
  }

  loadModel() async {
    await Tflite.loadModel(
        model: "assets/model.tflite", labels: "assets/labels.txt");
  }

  runModel() async {
    if (cameraImage != null) {
      var recognitions = await Tflite.runModelOnFrame(
          bytesList: cameraImage.planes.map((plane) {
            return plane.bytes;
          }).toList(),
          imageHeight: cameraImage.height,
          imageWidth: cameraImage.width,
          imageMean: 127.5,
          imageStd: 127.5,
          rotation: 90,
          numResults: 2,
          threshold: 0.1,
          asynch: true);
      recognitions.forEach((element) {
        setState(() {
          result = element["label"];
          print(result);
        });
      });
    }
  }

  @override
  void initState() {
    super.initState();
    initCamera();
    loadModel();
  }

  @override
  Widget build(BuildContext context) {
    return SafeArea(
      child: Scaffold(
        appBar: AppBar(
          title: Text("Face Mask Detector"),
        ),
        body: Column(
          children: [
            Padding(
              padding: const EdgeInsets.all(20),
              child: Container(
                height: MediaQuery.of(context).size.height - 170,
                width: MediaQuery.of(context).size.width,
                child: !cameraController.value.isInitialized
                    ? Container()
                    : AspectRatio(
                        aspectRatio: cameraController.value.aspectRatio,
                        child: CameraPreview(cameraController),
                      ),
              ),
            ),
            Text(
              result,
              style: TextStyle(fontWeight: FontWeight.bold, fontSize: 25),
            )
          ],
        ),
      ),
    );
  }
}

You can find the full source code in the flutter-devs/flutter_tflite repository on github.com.


🌸🌼🌸 Thank you for reading. 🌸🌼🌸

Feel free to connect with us:
And read more articles from FlutterDevs.com.

FlutterDevs is a team of Flutter developers that builds high-quality and functionally rich apps. Hire a Flutter developer for your cross-platform Flutter mobile app project on an hourly or full-time basis as per your requirement! You can connect with us on Facebook, GitHub, Twitter, and LinkedIn for any Flutter-related queries.

We welcome feedback and hope that you share what you’re working on using #FlutterDevs. We truly enjoy seeing how you use Flutter to build beautiful, interactive web experiences.
