Super Simple Integration of HMS ML Kit Face Detection to Implement Cute Stickers
2022-07-29 05:49:00 【Quantify NPC】
Preface
In this era of beauty filters and mass entertainment, cute and fun face stickers are already widely used in beauty software, and the demand is no longer limited to camera apps: social and entertainment apps also have a broad need for face stickers and AR stickers. This article describes in detail how to integrate 2D stickers based on Huawei HMS ML Kit face detection. A follow-up article will cover the 3D sticker development process, so stay tuned.
Scenarios
Beauty cameras and beauty apps, social apps with photo features (such as TikTok, Weibo, and WeChat), and photo-editing apps all build their own distinctive sticker features.
Preparation before development
Add the Huawei Maven repository to the project-level gradle
Open the project-level build.gradle file in Android Studio and incrementally add the following Maven repository address:
buildscript {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
Add the SDK dependencies to the app-level build.gradle
// Face detection SDK.
implementation 'com.huawei.hms:ml-computer-vision-face:2.0.1.300'
// Face detection model.
implementation 'com.huawei.hms:ml-computer-vision-face-shape-point-model:2.0.1.300'
Apply for camera, network access, and storage permissions in the AndroidManifest.xml file
<!-- Camera permissions -->
<uses-feature android:name="android.hardware.camera" />
<uses-permission android:name="android.permission.CAMERA" />
<!-- Write permissions -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Read permission -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
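On Android 6.0 and later, the camera and storage permissions declared above also have to be requested at runtime before the camera is opened. A minimal sketch of that request (the request code and where exactly it is invoked are illustrative assumptions, not part of the demo):
// Illustrative sketch: request the dangerous permissions at runtime (requires androidx.core).
private static final int PERMISSION_REQUEST_CODE = 0x01;   // arbitrary request code

private void requestRuntimePermissions(Activity activity) {
    String[] permissions = {
            Manifest.permission.CAMERA,
            Manifest.permission.WRITE_EXTERNAL_STORAGE,
            Manifest.permission.READ_EXTERNAL_STORAGE
    };
    List<String> notGranted = new ArrayList<>();
    for (String permission : permissions) {
        if (ContextCompat.checkSelfPermission(activity, permission)
                != PackageManager.PERMISSION_GRANTED) {
            notGranted.add(permission);
        }
    }
    if (!notGranted.isEmpty()) {
        ActivityCompat.requestPermissions(activity,
                notGranted.toArray(new String[0]), PERMISSION_REQUEST_CODE);
    }
}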
Key steps in code development
Set up the face detector
MLFaceAnalyzerSetting detectorOptions;
detectorOptions = new MLFaceAnalyzerSetting.Factory()
.setFeatureType(MLFaceAnalyzerSetting.TYPE_UNSUPPORT_FEATURES)
.setShapeType(MLFaceAnalyzerSetting.TYPE_SHAPES)
.allowTracing(MLFaceAnalyzerSetting.MODE_TRACING_FAST)
.create();
detector = MLAnalyzerFactory.getInstance().getFaceAnalyzer(detectorOptions);
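The demo below uses the synchronous analyseFrame call inside the camera callback. The SDK also provides an asynchronous variant that returns a Task; a minimal sketch of that alternative (the logging in the listeners is illustrative):
// Asynchronous alternative to analyseFrame: results are delivered to the listeners.
MLFrame frame = MLFrame.fromBitmap(bitmap);   // or build an MLFrame from camera data as shown below
detector.asyncAnalyseFrame(frame)
        .addOnSuccessListener(faces -> Log.d("TAG", "Detected faces: " + faces.size()))
        .addOnFailureListener(e -> Log.e("TAG", "Face detection failed", e));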
Here we obtain the camera frame data through the camera preview callback, call the face detector, and write the face contour points into FacePointEngine for the sticker filter to use.
@Override
public void onPreviewFrame(final byte[] imgData, final Camera camera) {
    int width = mPreviewWidth;
    int height = mPreviewHeight;
    long startTime = System.currentTimeMillis();
    // Set the frame rotation according to the camera facing
    if (isFrontCamera()) {
        mOrientation = 0;
    } else {
        mOrientation = 2;
    }
    MLFrame.Property property =
            new MLFrame.Property.Creator()
                    .setFormatType(ImageFormat.NV21)
                    .setWidth(width)
                    .setHeight(height)
                    .setQuadrant(mOrientation)
                    .create();
    ByteBuffer data = ByteBuffer.wrap(imgData);
    // Call the face detection interface
    SparseArray<MLFace> faces = detector.analyseFrame(MLFrame.fromByteBuffer(data, property));
    // Check whether any face information was obtained
    if (faces.size() > 0) {
        MLFace mLFace = faces.get(0);
        EGLFace EGLFace = FacePointEngine.getInstance().getOneFace(0);
        EGLFace.pitch = mLFace.getRotationAngleX();
        EGLFace.yaw = mLFace.getRotationAngleY();
        EGLFace.roll = mLFace.getRotationAngleZ() - 90;
        if (isFrontCamera())
            EGLFace.roll = -EGLFace.roll;
        if (EGLFace.vertexPoints == null) {
            EGLFace.vertexPoints = new PointF[131];
        }
        int index = 0;
        // Get the face contour points and convert them to normalized OpenGL coordinates
        for (MLFaceShape contour : mLFace.getFaceShapeList()) {
            if (contour == null) {
                continue;
            }
            List<MLPosition> points = contour.getPoints();
            for (int i = 0; i < points.size(); i++) {
                MLPosition point = points.get(i);
                float x = (point.getY() / height) * 2 - 1;
                float y = (point.getX() / width) * 2 - 1;
                if (isFrontCamera())
                    x = -x;
                PointF Point = new PointF(x, y);
                EGLFace.vertexPoints[index] = Point;
                index++;
            }
        }
        // Insert the face object
        FacePointEngine.getInstance().putOneFace(0, EGLFace);
        // Set the number of faces
        FacePointEngine.getInstance().setFaceSize(faces != null ? faces.size() : 0);
    } else {
        FacePointEngine.getInstance().clearAll();
    }
    long endTime = System.currentTimeMillis();
    Log.d("TAG", "Face detect time: " + String.valueOf(endTime - startTime));
}
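For the callback above to fire, the class implementing Camera.PreviewCallback has to be registered when the camera is opened. In the demo this is wrapped inside mEGLCamera, but a minimal sketch with the legacy Camera API looks roughly like this (the field names are assumptions):
// Open the front camera and register the preview callback (legacy android.hardware.Camera API).
mCamera = Camera.open(Camera.CameraInfo.CAMERA_FACING_FRONT);
Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
mPreviewWidth = previewSize.width;
mPreviewHeight = previewSize.height;
mCamera.setPreviewCallback(this);   // "this" implements Camera.PreviewCallback -> onPreviewFrame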
The face contour points returned by the ML Kit interface are shown in the figure below.
Next, let's look at how a sticker is designed, starting with the JSON data definition of a sticker:
public class FaceStickerJson {
    public int[] centerIndexList;   // List of center-point indexes; the center may be computed from several key points
    public float offsetX;           // X-axis offset of the sticker relative to the center point, in pixels
    public float offsetY;           // Y-axis offset of the sticker relative to the center point, in pixels
    public float baseScale;         // Base scale factor of the sticker
    public int startIndex;          // Face start index, used to calculate the face width
    public int endIndex;            // Face end index, used to calculate the face width
    public int width;               // Sticker width
    public int height;              // Sticker height
    public int frames;              // Number of sticker frames
    public int action;              // Action; 0 means displayed by default, used for handling sticker actions
    public String stickerName;      // Sticker name, used to mark the folder containing the sticker and its PNG files
    public int duration;            // Display interval between sticker frames
    public boolean stickerLooping;  // Whether the sticker is rendered in a loop
    public int maxCount;            // Maximum number of times the sticker is rendered
    ...
}
We create the JSON file for the cat stickers, using face index point 84 (between the eyebrows) and point 85 (nose tip) to position the ears and nose respectively, and then place the file together with the images under the assets directory.
{
    "stickerList": [{
        "type": "sticker",
        "centerIndexList": [84],
        "offsetX": 0.0,
        "offsetY": 0.0,
        "baseScale": 1.3024,
        "startIndex": 11,
        "endIndex": 28,
        "width": 495,
        "height": 120,
        "frames": 2,
        "action": 0,
        "stickerName": "nose",
        "duration": 100,
        "stickerLooping": 1,
        "maxcount": 5
    }, {
        "type": "sticker",
        "centerIndexList": [83],
        "offsetX": 0.0,
        "offsetY": -1.1834,
        "baseScale": 1.3453,
        "startIndex": 11,
        "endIndex": 28,
        "width": 454,
        "height": 150,
        "frames": 2,
        "action": 0,
        "stickerName": "ear",
        "duration": 100,
        "stickerLooping": 1,
        "maxcount": 5
    }]
}
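The sticker JSON then needs to be read from assets and deserialized into FaceStickerJson objects. In the demo this is handled inside FaceStickerFilter, but a minimal sketch with Gson could look like the following (the FaceStickerList wrapper, the cat/cat.json file name, and the Gson dependency are assumptions for illustration):
// Illustrative helper: parse the sticker JSON in the assets folder with Gson.
public class FaceStickerList {
    public List<FaceStickerJson> stickerList;
}

private FaceStickerList loadStickerJson(Context context, String folderPath) {
    try (InputStream is = context.getAssets().open(folderPath + "/" + folderPath + ".json");
         Reader reader = new InputStreamReader(is, StandardCharsets.UTF_8)) {
        return new Gson().fromJson(reader, FaceStickerList.class);
    } catch (IOException e) {
        Log.e("TAG", "Failed to load sticker json", e);
        return null;
    }
}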
To render the sticker texture we use GLSurfaceView, which is simpler than TextureView. First, in onSurfaceCreated we instantiate the sticker filter, pass in the sticker path, and open the camera.
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    GLES30.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    mTextures = new int[1];
    mTextures[0] = OpenGLUtils.createOESTexture();
    mSurfaceTexture = new SurfaceTexture(mTextures[0]);
    mSurfaceTexture.setOnFrameAvailableListener(this);
    // Feed the samplerExternalOES camera input into the texture
    cameraFilter = new CameraFilter(this.context);
    // Set the face sticker path under the assets directory
    String folderPath = "cat";
    stickerFilter = new FaceStickerFilter(this.context, folderPath);
    // Create the screen filter object
    screenFilter = new BaseFilter(this.context);
    facePointsFilter = new FacePointsFilter(this.context);
    mEGLCamera.openCamera();
}
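These Renderer callbacks only run after the renderer has been attached to the GLSurfaceView, typically in the Activity. A minimal setup sketch (mGLSurfaceView is an assumed view reference), where each new camera frame triggers a redraw through the SurfaceTexture listener registered above:
// Attach the renderer and render only when a new frame arrives.
mGLSurfaceView.setEGLContextClientVersion(3);                       // GLES 3.0, matching the GLES30 calls
mGLSurfaceView.setRenderer(this);                                   // "this" implements GLSurfaceView.Renderer
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);  // draw on demand

// SurfaceTexture.OnFrameAvailableListener registered in onSurfaceCreated:
@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    mGLSurfaceView.requestRender();   // schedule onDrawFrame for the new camera frame
}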
Then initialize the sticker filter and the other filters in onSurfaceChanged.
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
    Log.d(TAG, "onSurfaceChanged. width: " + width + ", height: " + height);
    int previewWidth = mEGLCamera.getPreviewWidth();
    int previewHeight = mEGLCamera.getPreviewHeight();
    if (width > height) {
        setAspectRatio(previewWidth, previewHeight);
    } else {
        setAspectRatio(previewHeight, previewWidth);
    }
    // Set the input size of each filter, create the FrameBuffer, and set the display size
    cameraFilter.onInputSizeChanged(previewWidth, previewHeight);
    cameraFilter.initFrameBuffer(previewWidth, previewHeight);
    cameraFilter.onDisplaySizeChanged(width, height);
    stickerFilter.onInputSizeChanged(previewHeight, previewWidth);
    stickerFilter.initFrameBuffer(previewHeight, previewWidth);
    stickerFilter.onDisplaySizeChanged(width, height);
    screenFilter.onInputSizeChanged(previewWidth, previewHeight);
    screenFilter.initFrameBuffer(previewWidth, previewHeight);
    screenFilter.onDisplaySizeChanged(width, height);
    facePointsFilter.onInputSizeChanged(previewHeight, previewWidth);
    facePointsFilter.onDisplaySizeChanged(width, height);
    mEGLCamera.startPreview(mSurfaceTexture);
}
Finally, draw the sticker to the screen in onDrawFrame.
@Override
public void onDrawFrame(GL10 gl) {
    int textureId;
    // Clear the color and depth buffers
    GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT | GLES30.GL_DEPTH_BUFFER_BIT);
    // Update the SurfaceTexture to fetch the latest camera frame
    mSurfaceTexture.updateTexImage();
    // Get the SurfaceTexture transform matrix
    mSurfaceTexture.getTransformMatrix(mMatrix);
    // Set the camera display transform matrix
    cameraFilter.setTextureTransformMatrix(mMatrix);
    // Draw the camera texture
    textureId = cameraFilter.drawFrameBuffer(mTextures[0], mVertexBuffer, mTextureBuffer);
    // Draw the sticker texture
    textureId = stickerFilter.drawFrameBuffer(textureId, mVertexBuffer, mTextureBuffer);
    // Draw to the screen
    screenFilter.drawFrame(textureId, mDisplayVertexBuffer, mDisplayTextureBuffer);
    if (drawFacePoints) {
        facePointsFilter.drawFrame(textureId, mDisplayVertexBuffer, mDisplayTextureBuffer);
    }
}
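mVertexBuffer and mTextureBuffer above hold the full-screen quad that the filters draw. A minimal sketch of how such buffers are typically created (the coordinate values here are the standard full-screen quad, not copied from the demo source):
// Standard full-screen quad packed into direct float buffers for OpenGL.
private static final float[] QUAD_VERTICES = {
        -1f, -1f,   1f, -1f,   -1f, 1f,   1f, 1f
};
private static final float[] QUAD_TEX_COORDS = {
        0f, 0f,   1f, 0f,   0f, 1f,   1f, 1f
};

private FloatBuffer createFloatBuffer(float[] data) {
    FloatBuffer buffer = ByteBuffer.allocateDirect(data.length * 4)   // 4 bytes per float
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer();
    buffer.put(data).position(0);
    return buffer;
}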
With that, the stickers are drawn onto the face.
Demo effect