Face Detection in Flutter using Firebase ML Kit


Face detection is a technique for locating human faces in a given image. It is used in many places nowadays, especially on websites hosting images such as Picasa, Photobucket, and Facebook, where the automatic tagging feature adds a new dimension to sharing pictures among the people who appear in them and also tells other viewers who is in the image. In this article, we will not create our own face detection machine learning algorithm. Instead, we will use the Firebase ML Kit, which lets us use a ready-made face detection algorithm in Flutter.

Step-by-Step Implementation:

First of all, create an empty project in Flutter. You may refer to this article for the same. Before moving further, let's add some dependencies to the pubspec.yaml file:

firebase_ml_vision: ^17.0.2
image_picker: ^0.8.5+3
firebase_core: ^1.19.2
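In pubspec.yaml these entries go under the dependencies section. A minimal sketch (using the version numbers listed above, which may differ from the latest releases):

```yaml
dependencies:
  flutter:
    sdk: flutter
  firebase_ml_vision: ^17.0.2
  image_picker: ^0.8.5+3
  firebase_core: ^1.19.2
```

After saving the file, run `flutter pub get` to fetch the packages.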

We are using three dependencies in this project.

firebase_ml_vision and firebase_core are used for detecting faces in an image, while image_picker is used to pick the image from the camera, the gallery, or any other source. Our aim is to pick an image from the gallery (or the camera) using a button, and then show that image as output with the human faces detected.

Note: By the time you are reading this, the package versions may have changed.

Now import the following additional packages, which let us use the built-in Firebase functions in our Flutter project.

import 'package:firebase_ml_vision/firebase_ml_vision.dart';
import 'package:image_picker/image_picker.dart';
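One detail the snippets below do not show: firebase_core generally requires Firebase to be initialized before any ML Kit call is made. A minimal sketch of an async main, assuming your app is already registered with Firebase and the platform config files (e.g. google-services.json) are in place:

```dart
import 'package:firebase_core/firebase_core.dart';
import 'package:flutter/material.dart';

void main() async {
  // Bindings must exist before calling async platform code.
  WidgetsFlutterBinding.ensureInitialized();
  // Initialize the default Firebase app.
  await Firebase.initializeApp();
  runApp(MaterialApp(home: MyApp()));
}
```

If initialization is skipped, the FirebaseVision calls later in the article may fail at run time.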

Now, in the void main() function, call the runApp() method with the MyApp() class. Then create a new class named MyApp(); this will be a stateful widget because our application changes its state at run time. From its build method, return a Scaffold(). In the Scaffold, we have an AppBar with the title text "Face Detection". Also add a floating action button for taking the image; in this project we are taking it from the gallery.

Dart




// runapp
void main() => runApp( 
      // MaterialApp with debugbanner false
      MaterialApp(
        debugShowCheckedModeBanner: false,
        theme: ThemeData(primarySwatch: Colors.teal),
        // this will going to our main class.
        home: MyApp(), 
      ),
    );
class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}
 
class _MyAppState extends State<MyApp> {
 
  @override
  Widget build(BuildContext context) {
    // scaffold with appbar
    return Scaffold(
      appBar: AppBar(
        automaticallyImplyLeading: true,
        title: Text('Face Detection'),
      ),
      // floating action button
      floatingActionButton: FloatingActionButton(
        onPressed: _getImagegallery,
        tooltip: 'camera',
        child: Icon(Icons.add_a_photo),
      ),
      );
  }
}


Our UI screen looks like this:

 

Now let's create the _getImagegallery method for picking an image. It calls setState to set the isLoading variable to true, and then detects faces in the image using FirebaseVision.

Dart




  _getImagegallery() async {
    // picking image from the gallery
    imageFile = await picker.getImage(source: ImageSource.gallery);
    // user cancelled the picker, nothing to do
    if (imageFile == null) return;
    setState(() {
      // set isLoading to true
      isLoading = true;
    });
        
    // detecting faces in the images
    final image = FirebaseVisionImage.fromFile(File(imageFile.path));
    final faceDetector = FirebaseVision.instance.faceDetector();
    List<Face> faces = await faceDetector.processImage(image);
 
    if (mounted) {
      setState(() {
        _imageFile = File(imageFile.path);
        _faces = faces;
        _loadImage(File(imageFile.path));
      });
    }
  }
 
  _loadImage(File file) async {
    final data = await file.readAsBytes();
    await decodeImageFromList(data).then((value) => setState(() {
          _image = value;
          isLoading = false;
        }));
  }
}
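One detail worth noting: the detector wraps native ML Kit resources, and the firebase_ml_vision API exposes a close() method to release them once you are done detecting, for example:

```dart
// release native detector resources when finished
await faceDetector.close();
```

The article's sample keeps the detector for the lifetime of the screen, so this is optional here, but it matters in apps that create detectors repeatedly.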


Now we use a canvas to paint a colored rectangle around each detected face, and show the result in the body of the Scaffold.

Complete Code

Dart




// @dart=2.9
import 'package:flutter/material.dart';
import 'package:firebase_ml_vision/firebase_ml_vision.dart';
import 'package:image_picker/image_picker.dart';
import 'dart:io';
import 'dart:ui' as ui;
 
// main method that runs the our main app
void main() => runApp(
      MaterialApp(
        debugShowCheckedModeBanner: false,
        theme: ThemeData(primarySwatch: Colors.teal),
        home: MyApp(),
      ),
    );
 
class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}
 
class _MyAppState extends State<MyApp> {
  File _imageFile;
  List<Face> _faces;
  bool isLoading = false;
  ui.Image _image;
  final picker = ImagePicker();
  var imageFile;
  @override
  Widget build(BuildContext context) {
    // scaffold with appbar
    return Scaffold(
      appBar: AppBar(
        automaticallyImplyLeading: true,
        // title of the appbar
        title: Text('Face Detection'),
      ),
      // floating button for picking image
      floatingActionButton: FloatingActionButton(
        onPressed: _getImagegallery,
        tooltip: 'camera',
        child: Icon(Icons.add_a_photo),
      ),
      // if file is null print no image
      // selected others wise show image
      body: isLoading
          ? Center(child: CircularProgressIndicator())
          : (_imageFile == null) 
              ? Center(child: Text('no image selected'))
              : Center(
                  child: FittedBox(
                    child: SizedBox(
                      width: _image.width.toDouble(),
                      height: _image.height.toDouble(),
                      child: CustomPaint(
                        painter: FacePainter(_image, _faces),
                      ),
                    ),
                  ),
                ),
    );
  }
     
  // function for pick the image
  // and detect faces in the image
  _getImagegallery() async {
    imageFile = await picker.getImage(source: ImageSource.gallery);
    // user cancelled the picker, nothing to do
    if (imageFile == null) return;
    setState(() {
      isLoading = true;
    });
 
    final image = FirebaseVisionImage.fromFile(File(imageFile.path));
    final faceDetector = FirebaseVision.instance.faceDetector();
    List<Face> faces = await faceDetector.processImage(image);
 
    if (mounted) {
      setState(() {
        _imageFile = File(imageFile.path);
        _faces = faces;
        _loadImage(File(imageFile.path));
      });
    }
  }
 
  _loadImage(File file) async {
    final data = await file.readAsBytes();
    await decodeImageFromList(data).then((value) => setState(() {
          _image = value;
          isLoading = false;
        }));
  }
}
 
// paint the face
class FacePainter extends CustomPainter {
  final ui.Image image;
  final List<Face> faces;
  final List<Rect> rects = [];
 
  FacePainter(this.image, this.faces) {
    for (var i = 0; i < faces.length; i++) {
      rects.add(faces[i].boundingBox);
    }
  }
 
  @override
  void paint(ui.Canvas canvas, ui.Size size) {
    final Paint paint = Paint()
      ..style = PaintingStyle.stroke
      ..strokeWidth = 2.0
      ..color = Colors.red;
 
    canvas.drawImage(image, Offset.zero, Paint());
    for (var i = 0; i < faces.length; i++) {
      canvas.drawRect(rects[i], paint);
    }
  }
 
  @override
  bool shouldRepaint(FacePainter old) {
    return image != old.image || faces != old.faces;
  }
}
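By default, the detector only returns bounding boxes. If you also want facial landmarks or classification results (such as smiling probability), firebase_ml_vision lets you pass a FaceDetectorOptions object when creating the detector. A sketch of the relevant change (option values here are illustrative):

```dart
// create a detector configured for richer results
final faceDetector = FirebaseVision.instance.faceDetector(
  FaceDetectorOptions(
    // report facial landmarks (eyes, nose, mouth, ...)
    enableLandmarks: true,
    // report smiling / eyes-open probabilities
    enableClassification: true,
    // favour accuracy over speed
    mode: FaceDetectorMode.accurate,
  ),
);
```

With classification enabled, each Face in the result additionally exposes probabilities you could draw next to the bounding rectangle.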


Output:



Last Updated : 28 Feb, 2023