diff --git a/Codelabs/MLKit/.gitignore b/Codelabs/MLKit/.gitignore index d75c03b..603b140 100644 --- a/Codelabs/MLKit/.gitignore +++ b/Codelabs/MLKit/.gitignore @@ -12,4 +12,3 @@ /captures .externalNativeBuild .cxx -/gradle diff --git a/Codelabs/MLKit/README.md b/Codelabs/MLKit/README.md deleted file mode 100644 index c2ebd12..0000000 --- a/Codelabs/MLKit/README.md +++ /dev/null @@ -1,124 +0,0 @@ -## mlkit-example - - -## Table of Contents - - * [Introduction](#introduction) - * [Preparation](#preparation) - * [Installation](#installation) - * [Experience Different Functions](#experience-different-functions) - * [Supported Environments](#supported-environments) - * [License](#license) - - -## Introduction - The sample code mainly shows the use of Huawei Machine Learning SDK. - - Including face recognition, text recognition(bankcard recognition, ID card recognition, general card recognition), image classification, landmark recognition, object detection and tracking, translation, language detection, product visual search, image segmentation. - - It includes both camera capture video for real-time detection and still image recognition. - - Ability called by the sample: - 1. Face Recognition - a. MLAnalyzerFactory.getInstance().GetFaceAnalyzer (MLFaceAnalyzerSetting): Create a face recognizer. This is the most core class of face recognition. - b. MLFaceAnalyzer.setTransactor(): Set the face recognition result processor for subsequent processing of the recognition result. - c. MLFaceAnalyzerSetting.Factory().SetFeatureType (MLFaceAnalyzerSetting.TYPE_FEATURES): Turn on facial expression and feature detection, including smile, eyes open, beard and age. - d. MLFaceAnalyzerSetting.Factory().AllowTracing (): Whether to start face tracking mode - e. LensEngine: camera source that generates continuous image data for detection. - 2. Text Recognition - a. MLAnalyzerFactory.getInstance().getLocalTextAnalyzer():Create a device text recognizer. - b. MLAnalyzerFactory.getInstance().getRemoteTextAnalyzer():Create a cloud text recognizer. - c. MLAnalyzerFactory.getInstance().getRemoteDocumentAnalyzer():Create a cloud document recognizer. - d. MLTextAnalyzer.asyncAnalyseFrame(frame): Parse text information in pictures. - e. MLDocumentAnalyzer.asyncAnalyseFrame(frame): Parse document information in pictures. - f. MLText.getBlocks(): Get text blocks. Generally, a text block represents one line. There is also a case where a text block corresponds to multiple lines. - g. MLText.Block.getContents(): Get list of text lines(MLText.TextLine). - h. MLText.TextLine.getContents(): Get the text content of each line(MLText.Word, The device text analyzer returns contains spaced, the cloud text analyzer does not). - i. MLText.Word.getStringValue(): Gets the word of each line. - j. MLDocument.getBlocks(): Get document blocks. Generally, a document block represents multiple paragraphs(MLDocument.Block). - k. MLDocument.getSections(): Get list of document paragraphs(MLDocument.Section). - l. MLDocument.getLineList(): Get list of document lines(MLDocument.Line). - m. MLDocument.getWordList(): Get list of document words(MLDocument.Word). - 3. Image Classification - a. MLAnalyzerFactory.getInstance().getLocalImageClassificationAnalyzer(setting):Create a device image classifier. - b. MLAnalyzerFactory.getInstance().getRemoteImageClassificationAnalyzer():Create a cloud image classifier. - c. 
MLImageClassificationAnalyzer.asyncAnalyseFrame(frame): Classify images and generate a MLImageClassification collection, which indicates the category to which the image belongs. - d. MLImageClassification.getName():Get the name of the image category, such as pen, phone, computer, etc. - 4. Object Detection And Tracking - a. MLAnalyzerFactory.getInstance().getLocalObjectAnalyzer(setting):Creating an object analyzer. - b. MLObjectAnalyzerSetting.Factory.setAnalyzerType(MLObjectAnalyzerSetting.TYPE_VIDEO): Set the recognition mode. - c. MLOject.getTypePossibility: Get the category name of the object. - d. MLOject.getTypeIdentity():Get the ID number of the object. - e. LensEngine:camera source that generates continuous image data for detection. - 5. Landmark Detection - a. MLAnalyzerFactory.getInstance().getRemoteLandmarkAnalyzer(settings):Create a landmark analyzer. - b. MLRemoteLandmarkAnalyzerSetting.Factory.setLargestNumOfReturns():Set the maximum number of detection results. - c. MLRemoteLandmarkAnalyzerSetting.Factory.setPatternType():Set detection mode. - d. MLRemoteLandmarkAnalyzer.asyncAnalyseFrame(frame): Parse out all landmark information contained in the picture. - 6. Translation - a. MLTranslatorFactory.getInstance().getRemoteTranslator(settings):Create a translator. - b. MLRemoteTranslateSetting.Factory.setSourceLangId():Set source language ID. - c. MLRemoteTranslateSetting.Factory.setTargetLangId():Set target language ID. - d. MLRemoteTranslator.asyncTranslate(sourceText): Parse out text from source language to target language, sourceText indicates the language to be detected. - 7. Language Detection - a. MLLangDetectorFactory.getInstance().getRemoteLangDetector(settings):Create a language detector. - b. MLRemoteLangDetectorSetting.Factory.setTrustedThreshold():Set the minimum confidence threshold for language detection. - c. MLRemoteLangDetector.firstBestDetect(sourceText): - d. MLRemoteLangDetector.probabilityDetect(sourceText): Returns the language code with the highest confidence, sourceText represents the language to be detected. - 8. Product Visual Search - a. MLAnalyzerFactory.getInstance().getRemoteProductVisionSearchAnalyzer(settings):Create a product visual search analyzer. - b. MLRemoteProductVisionSearchAnalyzerSetting.Factory.setLargestNumOfReturns():Set the maximum number of detection results. - c. MLRemoteProductVisionSearchAnalyzer.asyncAnalyseFrame(frame): Parse out all product information contained in the picture. - 9. Image Segmentation - a. MLAnalyzerFactory.getInstance().getImageSegmentationAnalyzer(settings):Create a image segment analyzer. - b. MLImageSegmentationSetting.Factory.setExact():Set detection mode, true is fine detection mode, false is speed priority detection mode. - c. MLImageSegmentationAnalyzer.asyncAnalyseFrame(frame): Parse out all target contained in the picture. - d. LensEngine:camera source that generates continuous image data for detection. - 10. ID card recognition - a. new MLCnIcrCapture.Callback() { }: Create the recognition result callback function and reload the onSuccess, onCanceled, onFailure, and onDenied functions. - b. public void onSuccess(MLCnIcrCaptureResult idCardResult){ }: Get notification of recognition results, where you can process the results. - c. new MLCnIcrCaptureConfig.Factory().setFront(isFront).setRemote(isRemote).create(): Set the recognition parameters for calling the capture API of the recognizer. - d. 
MLCnIcrCaptureFactory.getInstance().getIcrCapture(this.config): Create a new detector, pass in the detector configuration. - e. icrCapture.capture(idCallBack ,this): Call the detection interface to get the ID card information. - f. MLCardAnalyzerFactory.getInstance().getIcrAnalyzer(): Create a ID card recognition analyzer. - g. MLIcrAnalyzerSetting.Factory().setSideType(MLIcrAnalyzerSetting.FRONT): Set the front or back side of an ID card. - h. MLRemoteIcrAnalyzer.asyncAnalyseFrame(frame): Parse out all target on cloud contained in the picture. - i. MLIcrAnalyzer.asyncAnalyseFrame(frame): Parse out all target on device contained in the picture. - 11. Bankcard recognition - a. new MLBcrCapture.Callback() { }: Create the recognition result callback function and reload the onSuccess, onCanceled, onFailure, and onDenied functions. - b. public void onSuccess(MLBcrCaptureResult cardResult){ }: Get notification of recognition results, where you can process the results. - c. new MLBcrCaptureConfig.Factory().setFront(isFront).setRemote(isRemote).create(): Set the recognition parameters for calling the capture API of the recognizer. - d. MLBcrCaptureFactory.getInstance().getBcrCapture(this.config): Create a new detector, pass in the detector configuration. - e. MLBcrCapture.capture(callback ,this): Call the detection interface to get the bank card information. - f. MLCardAnalyzerFactory.getInstance().getBcrAnalyzer(): Create a bank card recognition analyzer. - g. MLBcrAnalyzerSetting.Factory().setLangType("cn"): Set the language code of bank card. - h. MLBcrAnalyzer.asyncAnalyseFrame(frame): Parse out all target contained in the picture. - 12. General card recognition - a. new MLGcrCapture.Callback() { }: Create the recognition result callback function and reload the onResult, onCanceled, onFailure, and onDenied functions. - b. public int onResult(MLGcrCaptureResult result, Object object) { }: Get notification of recognition results, where you can process the results. - c. MLGcrCaptureConfig.Factory().create(): Set the recognition parameters for calling the capture API of the recognizer. - d. MLGcrCaptureUIConfig.Factory().setScanBoxCornerColor(Color.BLUE): Set the scan UI color of general card recognition. - e. MLGcrCaptureUIConfig.Factory().setTipText("Taking picture, align edges"): Set the scan tips of general card recognition. - f. MLGcrCaptureUIConfig.Factory().setOrientation(MLGcrCaptureUIConfig.ORIENTATION_AUTO): Set the scan screen rotation of general card recognition. - g. MLGcrCaptureFactory.getInstance().getUcrCapture(cardConfig, uiConfig): Create a general card recognition analyzer. - h. MLGcrCapture.capturePhoto(this, object, callback): Call the detection interface to get the general card information. - -## Preparation -### 1. Register as a developer. - Register a [HUAWEI account](https://developer.huawei.com/consumer/en/). -### 2. Create an app and apply for a agconnect-services.json. - Create an app and set Package type to APK (Android app). Apply for an agconnect-services.json file in the developer alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4). -### 3. Build - To build this demo, please first import the demo in the Android Studio (3.x+). Then download the file "agconnect-services.json" of the app on AGC, and add the file to the app root directory(\app) of the demo. 
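The face-recognition calls listed under ability 1 above fit together roughly as follows. This is a minimal illustrative sketch rather than code taken from the sample (the FaceSketch class name and log tag are invented), and it assumes the ML Kit face SDK and model dependencies from the build script are on the classpath.

```java
import android.graphics.Bitmap;
import android.util.Log;

import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hmf.tasks.OnSuccessListener;
import com.huawei.hmf.tasks.Task;
import com.huawei.hms.mlsdk.MLAnalyzerFactory;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.face.MLFace;
import com.huawei.hms.mlsdk.face.MLFaceAnalyzer;
import com.huawei.hms.mlsdk.face.MLFaceAnalyzerSetting;

import java.util.List;

public class FaceSketch {
    private static final String TAG = "FaceSketch";

    // Still-image path: configure the analyzer, wrap the bitmap in an MLFrame, run async detection.
    public static void detectFaces(Bitmap bitmap) {
        // Enable feature detection (smile, eyes open, beard, age) and face tracking.
        MLFaceAnalyzerSetting setting = new MLFaceAnalyzerSetting.Factory()
                .setFeatureType(MLFaceAnalyzerSetting.TYPE_FEATURES)
                .allowTracing()
                .create();
        // Create the face analyzer from the factory.
        MLFaceAnalyzer analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer(setting);
        MLFrame frame = MLFrame.fromBitmap(bitmap);
        Task<List<MLFace>> task = analyzer.asyncAnalyseFrame(frame);
        task.addOnSuccessListener(new OnSuccessListener<List<MLFace>>() {
            @Override
            public void onSuccess(List<MLFace> faces) {
                // Each MLFace carries the detected feature and expression values.
                Log.i(TAG, "Detected faces: " + faces.size());
            }
        }).addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(Exception e) {
                Log.e(TAG, "Face detection failed", e);
            }
        });
        // For camera streams, a LensEngine would feed frames to the same analyzer instead.
    }
}
```
For live detection, the sample binds the analyzer to the LensEngine camera source described in item 1e; the still-image flow above is the simpler variant.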
- -## Installation - Download the sample code and open in android Studio - -## Experience Different Functions - You can change the main activity in the manifest to experience the different features provided by MLKit - -## Supported Environments - android 4.4 or a later version is recommended. - -## License - ML Kit example is licensed under the [Apache License, version 2.0](http://www.apache.org/licenses/LICENSE-2.0). \ No newline at end of file diff --git a/Codelabs/MLKit/app/LICENSE b/Codelabs/MLKit/app/LICENSE deleted file mode 100644 index 490b5c7..0000000 --- a/Codelabs/MLKit/app/LICENSE +++ /dev/null @@ -1,53 +0,0 @@ -Apache License - -Version 2.0, January 2004 - -http://www.apache.org/licenses/ - -TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION - -1. Definitions. - -"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. - -"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. - -"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. - -"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. - -"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. - -"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. - -"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). - -"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. - -"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. 
For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." - -"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. - -2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. - -3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. - -4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: - -You must give any other recipients of the Work or Derivative Works a copy of this License; and -You must cause any modified files to carry prominent notices stating that You changed the files; and -You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and -If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. 
You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. - -You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. -5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. - -6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. - -7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. - -8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. - -9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. 
- -END OF TERMS AND CONDITIONS \ No newline at end of file diff --git a/Codelabs/MLKit/app/README.md b/Codelabs/MLKit/app/README.md new file mode 100644 index 0000000..6ee91a1 --- /dev/null +++ b/Codelabs/MLKit/app/README.md @@ -0,0 +1,31 @@ +## face-demo + + +## Table of Contents + + * [Introduction](#introduction) + * [Installation](#installation) + * [Supported Environments](#supported-environments) + * [License](#license) + + +## Introduction + The sample code describes how to use the face detection service provided by the HMS Core ML SDK to recognize facial features and facial expressions. + + Ability called by the demo: + + 1. MLAnalyzerFactory.getInstance().getFaceAnalyzer(MLFaceAnalyzerSetting setting): Creates a face analyzer. + 2. MLFaceAnalyzer.setTransactor(): Sets a face detection result processor for subsequent processing of the result. + 3. MLFaceAnalyzerSetting.Factory().setFeatureType(MLFaceAnalyzerSetting.TYPE_FEATURES): Enables facial expression and feature detection, including smiling, the possibility of opening the eyes, the possibility of wearing a beard, and age. + 4. MLFaceAnalyzerSetting.Factory().allowTracing(): Indicates whether to enable the face tracking mode. + 5. LensEngine: Camera source used to generate continuous image data for detection. + +## Installation + Download the HUAWEI-HMS-MLKit-Sample code and open it in Android Studio. Ensure that your device is connected to the Internet, then build the project to obtain the APK. + +## Supported Environments + Devices with Android 4.4 or later are recommended. + +## License + The face detection sample of HUAWEI ML Kit is licensed under the [Apache License, version 2.0](http://www.apache.org/licenses/LICENSE-2.0). + diff --git a/Codelabs/MLKit/app/Third Party Open Source Software Notice.docx b/Codelabs/MLKit/app/Third Party Open Source Software Notice.docx deleted file mode 100644 index cd62ad5..0000000 Binary files a/Codelabs/MLKit/app/Third Party Open Source Software Notice.docx and /dev/null differ diff --git a/Codelabs/MLKit/app/build.gradle b/Codelabs/MLKit/app/build.gradle index 491fe55..d0c41f2 100644 --- a/Codelabs/MLKit/app/build.gradle +++ b/Codelabs/MLKit/app/build.gradle @@ -1,15 +1,14 @@ apply plugin: 'com.android.application' + android { compileSdkVersion 29 - buildToolsVersion "29.0.2" + buildToolsVersion "28.0.3" defaultConfig { - // "Replace the app package name with your own app package name on AppGallery Connect (AGC). - // Keep the same as the package_name in agconnect-services.json."
- applicationId "com.huawei.mlkit.example" + applicationId "com.huawei.mlkit.face.demo" minSdkVersion 19 targetSdkVersion 29 - versionCode 1 - versionName "1.0" + versionCode rootProject.ext.mlVersionCode as int + versionName rootProject.ext.mlVersionName testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner" } buildTypes { @@ -18,48 +17,30 @@ android { proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro' } } + aaptOptions { noCompress "tflite", "mnn", "cambricon" cruncherEnabled false useNewCruncher false } + + repositories { + flatDir { + dirs 'libs' + } + } } dependencies { implementation fileTree(dir: 'libs', include: ['*.aar']) implementation 'androidx.appcompat:appcompat:1.1.0' - - implementation 'com.huawei.hms:ml-computer-vision-ocr:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-ocr-cn-model:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-ocr-jk-model:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-ocr-latin-model:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-card-gcr-plugin:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-vision-segmentation:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-image-segmentation-body-model:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-image-segmentation-multiclass-model:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-vision-classification:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-image-classification-model:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-vision-object:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-object-detection-model:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-vision-face:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-vision-face-recognition-model:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-translate:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-language-detection:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-vision-bcr:1.0.3.303' - implementation 'com.huawei.hms:ml-computer-card-bcr-model:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-card-bcr-plugin:1.0.3.300' - - implementation 'com.huawei.hms:ml-computer-vision-icr:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-card-icr-cn-model:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-card-icr-cn-plugin:1.0.3.300' - implementation 'com.huawei.hms:ml-computer-card-qa-plugin:1.0.3.300' + implementation 'androidx.constraintlayout:constraintlayout:1.1.3' + // Face detection SDK. + implementation 'com.huawei.hms:ml-computer-vision-face:1.0.4.300' + // Face detection model. + implementation 'com.huawei.hms:ml-computer-vision-face-emotion-model:1.0.4.300' + implementation 'com.huawei.hms:ml-computer-vision-face-feature-model:1.0.4.300' + implementation 'com.huawei.hms:ml-computer-vision-face-shape-point-model:1.0.4.300' } -apply plugin: 'com.huawei.agconnect' // HUAWEI agconnect Gradle plugin \ No newline at end of file +apply plugin: 'com.huawei.agconnect' // HUAWEI agconnect Gradle plugin diff --git a/Codelabs/MLKit/app/proguard-rules.pro b/Codelabs/MLKit/app/proguard-rules.pro index 973f0e1..f1b4245 100644 --- a/Codelabs/MLKit/app/proguard-rules.pro +++ b/Codelabs/MLKit/app/proguard-rules.pro @@ -19,12 +19,3 @@ # If you keep the line number information, uncomment this to # hide the original source file name. 
#-renamesourcefileattribute SourceFile --ignorewarnings --keepattributes *Annotation* --keepattributes Exceptions --keepattributes InnerClasses --keepattributes Signature --keepattributes SourceFile,LineNumberTable --keep class com.hianalytics.android.**{*;} --keep class com.huawei.updatesdk.**{*;} --keep class com.huawei.hms.**{*;} \ No newline at end of file diff --git a/Codelabs/MLKit/app/sample-agconnect-services.json b/Codelabs/MLKit/app/sample-agconnect-services.json deleted file mode 100644 index 2dc7c19..0000000 --- a/Codelabs/MLKit/app/sample-agconnect-services.json +++ /dev/null @@ -1 +0,0 @@ -// For details, please refer the 'Preparation' in the README.md. \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/AndroidManifest.xml b/Codelabs/MLKit/app/src/main/AndroidManifest.xml index 855e770..7432dea 100644 --- a/Codelabs/MLKit/app/src/main/AndroidManifest.xml +++ b/Codelabs/MLKit/app/src/main/AndroidManifest.xml @@ -1,53 +1,26 @@ + package="com.huawei.mlkit.face.demo"> - - - + - - - - - - + - - - - - - - - - - - - - - - + tools:replace="android:allowBackup" + android:theme="@style/Theme.AppCompat"> + + + \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/assets/Country_pair_new.txt b/Codelabs/MLKit/app/src/main/assets/Country_pair_new.txt deleted file mode 100644 index 76441ab..0000000 --- a/Codelabs/MLKit/app/src/main/assets/Country_pair_new.txt +++ /dev/null @@ -1,49 +0,0 @@ -Afrikaans af -Arabic ar -Bulgarian bg -Bengali bn -Catalan ca -Czech cs -Welsh cy -Danish da -German de -Greek el -English en -Spanish es -Estonian et -Finnish fi -French fr -Gujarati gu -Hebrew he -Hindi hi -Croatian hr -Hungarian hu -Indonesian id -Italian it -Japanese ja -Kannada kn -Korean ko -Lithuanian lt -Macedonian mk -Malayalam ml -Marathi mr -Dutch nl -Punjabi pa -Polish pl -Portuguese pt -Romanian ro -Russian ru -Slovak sk -Slovenian sl -Somali so -Albanian sq -Swedish sv -Tamil ta -Telugu te -Thai th -Tagalog tl -Turkish tr -Ukrainian uk -Urdu ur -Vietnamese vi -Chinese zh \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/IDCard/IcrAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/IDCard/IcrAnalyseActivity.java deleted file mode 100644 index 650d221..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/IDCard/IcrAnalyseActivity.java +++ /dev/null @@ -1,301 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package com.huawei.mlkit.example.IDCard; - -import android.Manifest; -import android.content.pm.PackageManager; -import android.graphics.Bitmap; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.ImageView; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; -import androidx.core.app.ActivityCompat; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlplugin.card.icr.cn.MLCnIcrCapture; -import com.huawei.hms.mlplugin.card.icr.cn.MLCnIcrCaptureConfig; -import com.huawei.hms.mlplugin.card.icr.cn.MLCnIcrCaptureFactory; -import com.huawei.hms.mlplugin.card.icr.cn.MLCnIcrCaptureResult; -import com.huawei.hms.mlsdk.card.MLCardAnalyzerFactory; -import com.huawei.hms.mlsdk.card.icr.MLIcrAnalyzer; -import com.huawei.hms.mlsdk.card.icr.MLIcrAnalyzerSetting; -import com.huawei.hms.mlsdk.card.icr.MLIdCard; -import com.huawei.hms.mlsdk.card.icr.cloud.MLRemoteIcrAnalyzer; -import com.huawei.hms.mlsdk.card.icr.cloud.MLRemoteIcrAnalyzerSetting; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.mlkit.example.R; - -import java.io.IOException; - -/** - * It provides the identification function of the second-generation ID card of Chinese residents, - * and recognizes formatted text information from the images with ID card information. - * ID Card identification provides on-cloud and on-device API. - */ -public class IcrAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "IcrAnalyse"; - private int CAMERA_PERMISSION_CODE = 100; - private TextView mTextView; - - private boolean isFront; - - private ImageView previewImageFront; - - private ImageView previewImageBack; - - private MLIcrAnalyzer localAnalyzer; - - private MLRemoteIcrAnalyzer remoteIcrAnalyzer; - - private Bitmap cardFront; - - private Bitmap cardBack; - - private String cardResultFront = ""; - - private String cardResultBack = ""; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_icr_analyse); - this.mTextView = this.findViewById(R.id.text_result); - this.previewImageFront = this.findViewById(R.id.IDCard_image_front); - this.previewImageBack = this.findViewById(R.id.IDCard_image_back); - this.previewImageFront.setScaleType(ImageView.ScaleType.FIT_XY); - this.previewImageBack.setScaleType(ImageView.ScaleType.FIT_XY); - this.findViewById(R.id.detect).setOnClickListener(this); - this.previewImageFront.setOnClickListener(this); - this.previewImageBack.setOnClickListener(this); - if (!(ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED)) { - this.requestCameraPermission(); - } - } - - private void requestCameraPermission() { - final String[] permissions = new String[] {Manifest.permission.CAMERA}; - - if (!ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CAMERA)) { - ActivityCompat.requestPermissions(this, permissions, this.CAMERA_PERMISSION_CODE); - return; - } - } - - @Override - public void onClick(View v) { - switch (v.getId()) { - case R.id.detect: - this.localAnalyzer(); - break; - case R.id.IDCard_image_front: - this.isFront = true; - this.mTextView.setText(""); - this.startCaptureActivity(this.idCallBack, true, false); - break; - case R.id.IDCard_image_back: - this.isFront = false; - 
this.mTextView.setText(""); - this.startCaptureActivity(this.idCallBack, false, false); - break; - default: - break; - } - } - - /** - * Icr analyse on the cloud. If you want to use product search analyzer, - * you need to apply for an agconnect-services.json file in the developer - * alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - * replacing the sample-agconnect-services.json in the project. - */ - private void remoteAnalyzer() { - if (this.cardFront == null) { - this.mTextView.setText("Please take the front photo of IDCard."); - return; - } - // Use customized parameter settings for cloud-based recognition. - MLRemoteIcrAnalyzerSetting setting = - new MLRemoteIcrAnalyzerSetting.Factory() - .setSideType(MLRemoteIcrAnalyzerSetting.FRONT) - .create(); - this.remoteIcrAnalyzer = MLCardAnalyzerFactory.getInstance().getRemoteIcrAnalyzer(setting); - // Create an MLFrame by using the bitmap. Recommended image size: large than 512*512. - Bitmap bitmap = this.cardFront; - MLFrame frame = MLFrame.fromBitmap(bitmap); - Task task = this.remoteIcrAnalyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(MLIdCard mlIdCard) { - // Recognition success. - IcrAnalyseActivity.this.displaySuccess(mlIdCard, true); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - IcrAnalyseActivity.this.displayFailure(); - } - }); - } - - private void localAnalyzer() { - if (this.cardFront == null) { - this.mTextView.setText("Please take the front photo of IDCard."); - return; - } - // Use customized parameter settings for device-based recognition. - MLIcrAnalyzerSetting setting = new MLIcrAnalyzerSetting.Factory() - .setSideType(MLIcrAnalyzerSetting.FRONT) - .create(); - this.localAnalyzer = MLCardAnalyzerFactory.getInstance().getIcrAnalyzer(setting); - // Create an MLFrame by using the bitmap. Recommended image size: large than 512*512. - Bitmap bitmap = this.cardFront; - MLFrame frame = MLFrame.fromBitmap(bitmap); - Task task = this.localAnalyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(MLIdCard mlIdCard) { - // Recognition success. - IcrAnalyseActivity.this.displaySuccess(mlIdCard, true); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. 
- IcrAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displaySuccess(MLIdCard mlIdCard, boolean isFront) { - StringBuilder resultBuilder = new StringBuilder(); - if (isFront) { - resultBuilder.append("Name:" + mlIdCard.getName() + "\r\n"); - resultBuilder.append("Sex:" + mlIdCard.getSex() + "\r\n"); - resultBuilder.append("IDNum: " + mlIdCard.getIdNum() + "\r\n"); - } else { - resultBuilder.append("ValidDate: " + mlIdCard.getValidDate() + "\r\n"); - } - this.mTextView.setText(resultBuilder.toString()); - } - - private String formatIdCardResult(MLCnIcrCaptureResult idCardResult, boolean isFront) { - StringBuilder resultBuilder = new StringBuilder(); - if (isFront) { - resultBuilder.append("Name:" + idCardResult.name + "\r\n"); - resultBuilder.append("Sex:" + idCardResult.sex + "\r\n"); - resultBuilder.append("IDNum: " + idCardResult.idNum + "\r\n"); - } else { - resultBuilder.append("ValidDate: " + idCardResult.validDate + "\r\n"); - } - return resultBuilder.toString(); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.remoteIcrAnalyzer != null) { - try { - this.remoteIcrAnalyzer.stop(); - } catch (IOException e) { - Log.d(TAG, "stop exception."); - } - } - if (this.localAnalyzer != null) { - try { - this.localAnalyzer.stop(); - } catch (IOException e) { - Log.d(IcrAnalyseActivity.TAG, "Stop failed:" + e.getMessage()); - } - } - } - - /** - * Use the Chinese second-generation ID card pre-processing plug-in to identify video stream ID cards. - * Create a recognition result callback function to process the identification result of the ID card. - */ - private MLCnIcrCapture.CallBack idCallBack = new MLCnIcrCapture.CallBack() { - // Identify successful processing. - @Override - public void onSuccess(MLCnIcrCaptureResult idCardResult) { - Log.i(IcrAnalyseActivity.TAG, "IdCallBack onRecSuccess"); - if (idCardResult == null) { - Log.i(IcrAnalyseActivity.TAG, "IdCallBack onRecSuccess idCardResult is null"); - return; - } - Bitmap bitmap = idCardResult.cardBitmap; - if (IcrAnalyseActivity.this.isFront) { - IcrAnalyseActivity.this.cardFront = bitmap; - IcrAnalyseActivity.this.previewImageFront.setImageBitmap(bitmap); - IcrAnalyseActivity.this.cardResultFront = - IcrAnalyseActivity.this.formatIdCardResult(idCardResult, true); - } else { - IcrAnalyseActivity.this.cardBack = bitmap; - IcrAnalyseActivity.this.previewImageBack.setImageBitmap(bitmap); - IcrAnalyseActivity.this.cardResultBack = - IcrAnalyseActivity.this.formatIdCardResult(idCardResult, false); - } - if (!IcrAnalyseActivity.this.cardResultFront.equals("") - && !IcrAnalyseActivity.this.cardResultBack.equals("")) { - IcrAnalyseActivity.this.mTextView.setText(IcrAnalyseActivity.this.cardResultFront); - IcrAnalyseActivity.this.mTextView.append(IcrAnalyseActivity.this.cardResultBack); - } - } - - // User cancellation processing. - @Override - public void onCanceled() { - Log.i(IcrAnalyseActivity.TAG, "IdCallBackonRecCanceled"); - } - - // Identify failure processing. - @Override - public void onFailure(int recCode, Bitmap bitmap) { - Log.i(IcrAnalyseActivity.TAG, "IdCallBackonRecFailed"); - } - - // Camera unavailable processing, the reason that the camera is unavailable is generally that the user has not been granted camera permissions. 
- @Override - public void onDenied() { - Log.i(IcrAnalyseActivity.TAG, "IdCallBackonCameraDenied"); - } - }; - - /** - * Set the recognition parameters, call the recognizer capture interface for recognition, and the recognition result will be returned through the callback function. - * @param callback The callback of ID cards analyse. - * @param isFront Whether it is the front of the ID card. - * @param isRemote Whether to use the on-cloud model, this parameter is always false when using the pre-processing plug-in scenario. - */ - private void startCaptureActivity(MLCnIcrCapture.CallBack callback, boolean isFront, boolean isRemote) { - MLCnIcrCaptureConfig config = new MLCnIcrCaptureConfig.Factory() - .setFront(isFront) - .setRemote(isRemote) - .create(); - MLCnIcrCapture icrCapture = MLCnIcrCaptureFactory.getInstance().getIcrCapture(config); - icrCapture.capture(callback, this); - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/MainActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/MainActivity.java deleted file mode 100644 index 4c688e5..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/MainActivity.java +++ /dev/null @@ -1,105 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package com.huawei.mlkit.example; - -import android.content.Intent; -import android.os.Bundle; -import android.view.View; -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.mlkit.example.bankCard.BcrAnalyseActivity; -import com.huawei.mlkit.example.classification.ImageClassificationAnalyseActivity; -import com.huawei.mlkit.example.document.ImageDocumentAnalyseActivity; -import com.huawei.mlkit.example.face.LiveFaceAnalyseActivity; -import com.huawei.mlkit.example.face.StillFaceAnalyseActivity; -import com.huawei.mlkit.example.generalCard.GcrAnalyseActivity; -import com.huawei.mlkit.example.IDCard.IcrAnalyseActivity; -import com.huawei.mlkit.example.imgseg.ImageSegmentationAnalyseActivity; -import com.huawei.mlkit.example.landmark.ImageLandmarkAnalyseActivity; -import com.huawei.mlkit.example.object.LiveObjectAnalyseActivity; -import com.huawei.mlkit.example.productvisionsearch.ProductVisionSearchAnalyseActivity; -import com.huawei.mlkit.example.text.ImageTextAnalyseActivity; -import com.huawei.mlkit.example.translate.TranslatorActivity; - -public class MainActivity extends AppCompatActivity implements View.OnClickListener { - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_main); - this.findViewById(R.id.btn_face_live).setOnClickListener(this); - this.findViewById(R.id.btn_face_image).setOnClickListener(this); - this.findViewById(R.id.btn_text).setOnClickListener(this); - this.findViewById(R.id.btn_object).setOnClickListener(this); - this.findViewById(R.id.btn_document).setOnClickListener(this); - this.findViewById(R.id.btn_classification).setOnClickListener(this); - this.findViewById(R.id.btn_landmark).setOnClickListener(this); - this.findViewById(R.id.btn_translate).setOnClickListener(this); - this.findViewById(R.id.btn_productvisionsearch).setOnClickListener(this); - this.findViewById(R.id.btn_imageseg).setOnClickListener(this); - this.findViewById(R.id.btn_icr).setOnClickListener(this); - this.findViewById(R.id.btn_bcr).setOnClickListener(this); - this.findViewById(R.id.btn_gcr).setOnClickListener(this); - } - - @Override - public void onClick(View v) { - switch (v.getId()) { - case R.id.btn_face_live: - this.startActivity(new Intent(MainActivity.this, LiveFaceAnalyseActivity.class)); - break; - case R.id.btn_face_image: - this.startActivity(new Intent(MainActivity.this, StillFaceAnalyseActivity.class)); - break; - case R.id.btn_classification: - this.startActivity(new Intent(MainActivity.this, ImageClassificationAnalyseActivity.class)); - break; - case R.id.btn_object: - this.startActivity(new Intent(MainActivity.this, LiveObjectAnalyseActivity.class)); - break; - case R.id.btn_document: - this.startActivity(new Intent(MainActivity.this, ImageDocumentAnalyseActivity.class)); - break; - case R.id.btn_landmark: - this.startActivity(new Intent(MainActivity.this, ImageLandmarkAnalyseActivity.class)); - break; - case R.id.btn_text: - this.startActivity(new Intent(MainActivity.this, ImageTextAnalyseActivity.class)); - break; - case R.id.btn_translate: - this.startActivity(new Intent(MainActivity.this, TranslatorActivity.class)); - break; - case R.id.btn_productvisionsearch: - this.startActivity(new Intent(MainActivity.this, ProductVisionSearchAnalyseActivity.class)); - break; - case R.id.btn_imageseg: - this.startActivity(new Intent(MainActivity.this, ImageSegmentationAnalyseActivity.class)); - break; - case R.id.btn_icr: - this.startActivity(new 
Intent(MainActivity.this, IcrAnalyseActivity.class)); - break; - case R.id.btn_bcr: - this.startActivity(new Intent(MainActivity.this, BcrAnalyseActivity.class)); - break; - case R.id.btn_gcr: - this.startActivity(new Intent(MainActivity.this, GcrAnalyseActivity.class)); - break; - default: - break; - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/bankCard/BcrAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/bankCard/BcrAnalyseActivity.java deleted file mode 100644 index 0d59f49..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/bankCard/BcrAnalyseActivity.java +++ /dev/null @@ -1,171 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.bankCard; - -import android.Manifest; -import android.content.pm.PackageManager; -import android.graphics.Bitmap; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.ImageView; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; -import androidx.core.app.ActivityCompat; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlplugin.card.bcr.MLBcrCapture; -import com.huawei.hms.mlplugin.card.bcr.MLBcrCaptureConfig; -import com.huawei.hms.mlplugin.card.bcr.MLBcrCaptureFactory; -import com.huawei.hms.mlplugin.card.bcr.MLBcrCaptureResult; -import com.huawei.hms.mlsdk.card.MLCardAnalyzerFactory; -import com.huawei.hms.mlsdk.card.bcr.MLBankCard; -import com.huawei.hms.mlsdk.card.bcr.MLBcrAnalyzer; -import com.huawei.hms.mlsdk.card.bcr.MLBcrAnalyzerSetting; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.mlkit.example.R; - -/** - * It provides the identification function of the bank card, - * and recognizes formatted text information from the images with bank card information. - * Bank Card identification provides on-device API. 
- */ -public class BcrAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "BcrAnalyse"; - private int CAMERA_PERMISSION_CODE = 100; - private int READ_EXTERNAL_STORAGE_CODE = 100; - - private TextView mTextView; - - private ImageView previewImage; - - private String cardResultFront = ""; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_bcr_analyse); - this.mTextView = this.findViewById(R.id.text_result); - this.previewImage = this.findViewById(R.id.Bank_Card_image); - this.previewImage.setScaleType(ImageView.ScaleType.FIT_XY); - this.findViewById(R.id.detect).setOnClickListener(this); - this.previewImage.setOnClickListener(this); - if (!(ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED)) { - this.requestCameraPermission(); - } - if (!(ActivityCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) == PackageManager.PERMISSION_GRANTED)) { - this.requestCameraPermission(); - } - } - - private void requestCameraPermission() { - final String[] permissions = new String[]{Manifest.permission.CAMERA, Manifest.permission.READ_EXTERNAL_STORAGE}; - - if (!ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CAMERA)) { - ActivityCompat.requestPermissions(this, permissions, this.CAMERA_PERMISSION_CODE); - } - if (!ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.READ_EXTERNAL_STORAGE)) { - ActivityCompat.requestPermissions(this, permissions, this.READ_EXTERNAL_STORAGE_CODE); - } - } - - @Override - public void onClick(View v) { - this.mTextView.setText(""); - this.startCaptureActivity(this.banCallback); - } - - private String formatIdCardResult(MLBcrCaptureResult bankCardResult) { - StringBuilder resultBuilder = new StringBuilder(); - - resultBuilder.append("Number:"); - resultBuilder.append(bankCardResult.getNumber()); - resultBuilder.append("\r\n"); - - Log.i(BcrAnalyseActivity.TAG, "front result: " + resultBuilder.toString()); - return resultBuilder.toString(); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - } - - /** - * Use the bank card pre-processing plug-in to identify video stream bank cards. - * Create a recognition result callback function to process the identification result of the card. - */ - private MLBcrCapture.Callback banCallback = new MLBcrCapture.Callback() { - // Identify successful processing. - @Override - public void onSuccess(MLBcrCaptureResult bankCardResult) { - Log.i(BcrAnalyseActivity.TAG, "CallBack onRecSuccess"); - if (bankCardResult == null) { - Log.i(BcrAnalyseActivity.TAG, "CallBack onRecSuccess idCardResult is null"); - return; - } - Bitmap bitmap = bankCardResult.getOriginalBitmap(); - BcrAnalyseActivity.this.previewImage.setImageBitmap(bitmap); - BcrAnalyseActivity.this.cardResultFront = BcrAnalyseActivity.this.formatIdCardResult(bankCardResult); - BcrAnalyseActivity.this.mTextView.setText(BcrAnalyseActivity.this.cardResultFront); - } - - // User cancellation processing. - @Override - public void onCanceled() { - Log.i(BcrAnalyseActivity.TAG, "CallBackonRecCanceled"); - } - - // Identify failure processing. 
- @Override - public void onFailure(int recCode, Bitmap bitmap) { - BcrAnalyseActivity.this.displayFailure(); - Log.i(BcrAnalyseActivity.TAG, "CallBackonRecFailed"); - } - - @Override - public void onDenied() { - BcrAnalyseActivity.this.displayFailure(); - Log.i(BcrAnalyseActivity.TAG, "CallBackonCameraDenied"); - } - }; - - /** - * Set the recognition parameters, call the recognizer capture interface for recognition, - * and the recognition result will be returned through the callback function. - * - * @param Callback The callback of band cards analyse. - */ - private void startCaptureActivity(MLBcrCapture.Callback Callback) { - MLBcrCaptureConfig config = new MLBcrCaptureConfig.Factory() - // Set the screen orientation of the plugin page. - // MLBcrCaptureConfig.ORIENTATION_AUTO: Adaptive mode, the display direction is determined by the physical sensor. - // MLBcrCaptureConfig.ORIENTATION_LANDSCAPE: Horizontal screen. - // MLBcrCaptureConfig.ORIENTATION_PORTRAIT: Vertical screen. - .setOrientation(MLBcrCaptureConfig.ORIENTATION_AUTO) - .create(); - MLBcrCapture bcrCapture = MLBcrCaptureFactory.getInstance().getBcrCapture(config); - bcrCapture.captureFrame(this, Callback); - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/camera/GraphicOverlay.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/camera/GraphicOverlay.java deleted file mode 100644 index 2e84105..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/camera/GraphicOverlay.java +++ /dev/null @@ -1,196 +0,0 @@ -/* - * Copyright (C) The Android Open Source Project - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - * 2019.12.15-Changed modify to use LensEngine - * Huawei Technologies Co., Ltd. - */ - -package com.huawei.mlkit.example.camera; - -import android.content.Context; -import android.graphics.Canvas; -import android.util.AttributeSet; -import android.view.View; - -import com.huawei.hms.mlsdk.common.LensEngine; - -import java.util.HashSet; -import java.util.Set; - -/** - * A view which renders a series of custom graphics to be overlayed on top of an associated preview - * (i.e., the camera preview). The creator can add graphics objects, update the objects, and remove - * them, triggering the appropriate drawing and invalidation within the view. - *

- * Supports scaling and mirroring of the graphics relative to the camera's preview properties. The - * idea is that detection items are expressed in terms of a preview size, but need to be scaled up - * to the full view size, and also mirrored in the case of the front-facing camera. - *
- * Associated {@link Graphic} items should use the following methods to convert to view coordinates - * for the graphics that are drawn:
- * 1. {@link Graphic#scaleX(float)} and {@link Graphic#scaleY(float)} adjust the size of the - * supplied value from the preview scale to the view scale.
- * 2. {@link Graphic#translateX(float)} and {@link Graphic#translateY(float)} adjust the coordinate - * from the preview's coordinate system to the view coordinate system.
- */ -public class GraphicOverlay extends View { - private final Object mLock = new Object(); - - private int mPreviewWidth; - - private float mWidthScaleFactor = 1.0f; - - private int mPreviewHeight; - - private float mHeightScaleFactor = 1.0f; - - private int mFacing = LensEngine.BACK_LENS; - - private Set mGraphics = new HashSet<>(); - - /** - * Base class for a custom graphics object to be rendered within the graphic overlay. Subclass - * this and implement the {@link Graphic#draw(Canvas)} method to define the - * graphics element. Add instances to the overlay using {@link GraphicOverlay#add(Graphic)}. - */ - public static abstract class Graphic { - private GraphicOverlay mOverlay; - - public Graphic(GraphicOverlay overlay) { - mOverlay = overlay; - } - - /** - * Draw the graphic on the supplied canvas. Drawing should use the following methods to - * convert to view coordinates for the graphics that are drawn: - *
- * 1. {@link Graphic#scaleX(float)} and {@link Graphic#scaleY(float)} adjust the size of - * the supplied value from the preview scale to the view scale.
- * 2. {@link Graphic#translateX(float)} and {@link Graphic#translateY(float)} adjust the - * coordinate from the preview's coordinate system to the view coordinate system.
- * - * @param canvas drawing canvas - */ - public abstract void draw(Canvas canvas); - - /** - * Adjusts a horizontal value of the supplied value from the preview scale to the view - * scale. - */ - public float scaleX(float horizontal) { - return horizontal * mOverlay.mWidthScaleFactor; - } - - /** - * Adjusts a vertical value of the supplied value from the preview scale to the view scale. - */ - public float scaleY(float vertical) { - return vertical * mOverlay.mHeightScaleFactor; - } - - /** - * Adjusts the x coordinate from the preview's coordinate system to the view coordinate - * system. - */ - public float translateX(float x) { - if (mOverlay.mFacing == LensEngine.FRONT_LENS) { - return mOverlay.getWidth() - scaleX(x); - } else { - return scaleX(x); - } - } - - /** - * Adjusts the y coordinate from the preview's coordinate system to the view coordinate - * system. - */ - public float translateY(float y) { - return scaleY(y); - } - - public void postInvalidate() { - mOverlay.postInvalidate(); - } - } - - public GraphicOverlay(Context context, AttributeSet attrs) { - super(context, attrs); - } - - /** - * Removes all graphics from the overlay. - */ - public void clear() { - synchronized (mLock) { - mGraphics.clear(); - } - postInvalidate(); - } - - /** - * Adds a graphic to the overlay. - */ - public void add(Graphic graphic) { - synchronized (mLock) { - mGraphics.add(graphic); - } - postInvalidate(); - } - - /** - * Removes a graphic from the overlay. - */ - public void remove(Graphic graphic) { - synchronized (mLock) { - mGraphics.remove(graphic); - } - postInvalidate(); - } - - /** - * Sets the camera attributes for size and facing direction, which informs how to transform - * image coordinates later. - */ - public void setCameraInfo(int previewWidth, int previewHeight, int facing) { - synchronized (mLock) { - mPreviewWidth = previewWidth; - mPreviewHeight = previewHeight; - mFacing = facing; - } - postInvalidate(); - } - - /** - * Draws the overlay with its associated graphic objects. - */ - @Override - protected void onDraw(Canvas canvas) { - super.onDraw(canvas); - - synchronized (mLock) { - if ((mPreviewWidth != 0) && (mPreviewHeight != 0)) { - mWidthScaleFactor = (float) canvas.getWidth() / (float) mPreviewWidth; - mHeightScaleFactor = (float) canvas.getHeight() / (float) mPreviewHeight; - } - - for (Graphic graphic : mGraphics) { - graphic.draw(canvas); - } - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/camera/LensEnginePreview.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/camera/LensEnginePreview.java deleted file mode 100644 index 5ea0da8..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/camera/LensEnginePreview.java +++ /dev/null @@ -1,202 +0,0 @@ -/* - * Copyright (C) The Android Open Source Project - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - * - * 2019.12.15-Changed modify to use LensEngine - * Huawei Technologies Co., Ltd. 
- */ - -package com.huawei.mlkit.example.camera; - -import android.content.Context; -import android.content.res.Configuration; -import android.util.AttributeSet; -import android.util.Log; -import android.view.SurfaceHolder; -import android.view.SurfaceView; -import android.view.ViewGroup; - -import com.huawei.hms.common.size.Size; -import com.huawei.hms.mlsdk.common.LensEngine; - -import java.io.IOException; - -public class LensEnginePreview extends ViewGroup { - private static final String TAG = "LensEnginePreview"; - - private Context mContext; - - private SurfaceView mSurfaceView; - - private boolean mStartRequested; - - private boolean mSurfaceAvailable; - - private LensEngine mLensEngine; - - private GraphicOverlay mOverlay; - - public LensEnginePreview(Context context, AttributeSet attrs) { - super(context, attrs); - this.mContext = context; - this.mStartRequested = false; - this.mSurfaceAvailable = false; - - this.mSurfaceView = new SurfaceView(context); - this.mSurfaceView.getHolder().addCallback(new SurfaceCallback()); - this.addView(this.mSurfaceView); - } - - public void start(LensEngine lensEngine) throws IOException { - if (lensEngine == null) { - this.stop(); - } - - this.mLensEngine = lensEngine; - - if (this.mLensEngine != null) { - this.mStartRequested = true; - this.startIfReady(); - } - } - - public void start(LensEngine lensEngine, GraphicOverlay overlay) throws IOException { - this.mOverlay = overlay; - this.start(lensEngine); - } - - public void stop() { - if (this.mLensEngine != null) { - this.mLensEngine.close(); - } - } - - public void release() { - if (this.mLensEngine != null) { - this.mLensEngine.release(); - this.mLensEngine = null; - } - } - - private void startIfReady() throws IOException { - if (this.mStartRequested && this.mSurfaceAvailable) { - this.mLensEngine.run(this.mSurfaceView.getHolder()); - if (this.mOverlay != null) { - Size size = this.mLensEngine.getDisplayDimension(); - int min = Math.min(size.getWidth(), size.getHeight()); - int max = Math.max(size.getWidth(), size.getHeight()); - if (this.isPortraitMode()) { - // Swap width and height sizes when in portrait, since it will be rotated by - // 90 degrees - this.mOverlay.setCameraInfo(min, max, this.mLensEngine.getLensType()); - } else { - this.mOverlay.setCameraInfo(max, min, this.mLensEngine.getLensType()); - } - this.mOverlay.clear(); - } - this.mStartRequested = false; - } - } - - private class SurfaceCallback implements SurfaceHolder.Callback { - @Override - public void surfaceCreated(SurfaceHolder surface) { - LensEnginePreview.this.mSurfaceAvailable = true; - try { - LensEnginePreview.this.startIfReady(); - } catch (IOException e) { - Log.e(LensEnginePreview.TAG, "Could not start camera source.", e); - } - } - - @Override - public void surfaceDestroyed(SurfaceHolder surface) { - LensEnginePreview.this.mSurfaceAvailable = false; - } - - @Override - public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { - } - } - - @Override - protected void onLayout(boolean changed, int left, int top, int right, int bottom) { - int previewWidth = 320; - int previewHeight = 240; - if (this.mLensEngine != null) { - Size size = this.mLensEngine.getDisplayDimension(); - if (size != null) { - previewWidth = size.getWidth(); - previewHeight = size.getHeight(); - } - } - - // Swap width and height sizes when in portrait, since it will be rotated 90 degrees - if (this.isPortraitMode()) { - int tmp = previewWidth; - previewWidth = previewHeight; - previewHeight = tmp; - } - - final int 
viewWidth = right - left; - final int viewHeight = bottom - top; - - int childWidth; - int childHeight; - int childXOffset = 0; - int childYOffset = 0; - float widthRatio = (float) viewWidth / (float) previewWidth; - float heightRatio = (float) viewHeight / (float) previewHeight; - - // To fill the view with the camera preview, while also preserving the correct aspect ratio, - // it is usually necessary to slightly oversize the child and to crop off portions along one - // of the dimensions. We scale up based on the dimension requiring the most correction, and - // compute a crop offset for the other dimension. - if (widthRatio > heightRatio) { - childWidth = viewWidth; - childHeight = (int) ((float) previewHeight * widthRatio); - childYOffset = (childHeight - viewHeight) / 2; - } else { - childWidth = (int) ((float) previewWidth * heightRatio); - childHeight = viewHeight; - childXOffset = (childWidth - viewWidth) / 2; - } - - for (int i = 0; i < this.getChildCount(); ++i) { - // One dimension will be cropped. We shift child over or up by this offset and adjust - // the size to maintain the proper aspect ratio. - this.getChildAt(i).layout(-1 * childXOffset, -1 * childYOffset, childWidth - childXOffset, - childHeight - childYOffset); - } - - try { - this.startIfReady(); - } catch (IOException e) { - Log.e(LensEnginePreview.TAG, "Could not start camera source.", e); - } - } - - private boolean isPortraitMode() { - int orientation = this.mContext.getResources().getConfiguration().orientation; - if (orientation == Configuration.ORIENTATION_LANDSCAPE) { - return false; - } - if (orientation == Configuration.ORIENTATION_PORTRAIT) { - return true; - } - - Log.d(LensEnginePreview.TAG, "isPortraitMode returning false by default"); - return false; - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/classification/ImageClassificationAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/classification/ImageClassificationAnalyseActivity.java deleted file mode 100644 index 7d4bd06..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/classification/ImageClassificationAnalyseActivity.java +++ /dev/null @@ -1,147 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
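// --- Illustrative sketch (not part of the original diff): the fit-and-crop arithmetic that
// LensEnginePreview.onLayout() above performs, reproduced with concrete, assumed numbers so
// the oversize/crop behaviour is easy to verify. Class name and all values are hypothetical.
class PreviewFitCropSketch {
    public static void main(String[] args) {
        // Camera preview reported as 640x480; width/height are swapped in portrait mode.
        int previewWidth = 480;
        int previewHeight = 640;
        // Assumed view (screen) size in pixels.
        int viewWidth = 1080;
        int viewHeight = 1920;

        float widthRatio = (float) viewWidth / (float) previewWidth;    // 2.25
        float heightRatio = (float) viewHeight / (float) previewHeight; // 3.0

        int childWidth;
        int childHeight;
        int childXOffset = 0;
        int childYOffset = 0;
        // Scale by the larger ratio so the preview fills the view, then crop the other axis.
        if (widthRatio > heightRatio) {
            childWidth = viewWidth;
            childHeight = (int) (previewHeight * widthRatio);
            childYOffset = (childHeight - viewHeight) / 2;
        } else {
            childWidth = (int) (previewWidth * heightRatio);  // 1440
            childHeight = viewHeight;                         // 1920
            childXOffset = (childWidth - viewWidth) / 2;      // 180 px cropped from each side
        }
        System.out.println("child=" + childWidth + "x" + childHeight
                + " cropOffset=(" + childXOffset + "," + childYOffset + ")");
    }
}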
- */ - -package com.huawei.mlkit.example.classification; - -import java.io.IOException; -import java.util.List; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.classification.MLImageClassification; -import com.huawei.hms.mlsdk.classification.MLImageClassificationAnalyzer; -import com.huawei.hms.mlsdk.classification.MLLocalClassificationAnalyzerSetting; -import com.huawei.hms.mlsdk.classification.MLRemoteClassificationAnalyzerSetting; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.mlkit.example.R; - - -public class ImageClassificationAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "ImageClassification"; - - private TextView mTextView; - - private MLImageClassificationAnalyzer analyzer; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_classification_analyse); - this.mTextView = this.findViewById(R.id.classification_result); - this.findViewById(R.id.classification_detect).setOnClickListener(this); - } - - private void localAnalyzer() { - // Use customized parameter settings for device-based recognition. - MLLocalClassificationAnalyzerSetting setting = - new MLLocalClassificationAnalyzerSetting.Factory().setMinAcceptablePossibility(0.8f).create(); - this.analyzer = MLAnalyzerFactory.getInstance().getLocalImageClassificationAnalyzer(setting); - // Create an MLFrame by using the bitmap. Recommended image size: large than 112*112. - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.classification_image); - MLFrame frame = MLFrame.fromBitmap(bitmap); - Task> task = this.analyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener>() { - @Override - public void onSuccess(List classifications) { - // Recognition success. - ImageClassificationAnalyseActivity.this.displaySuccess(classifications); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - ImageClassificationAnalyseActivity.this.displayFailure(); - } - }); - } - - /** - * Image classification analyzer on the cloud. If you want to use cloud image classification analyzer, - * you need to apply for an agconnect-services.json file in the developer - * alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - * replacing the sample-agconnect-services.json in the project. - */ - private void remoteAnalyzer() { - // Use customized parameter settings for device-based recognition. - MLRemoteClassificationAnalyzerSetting setting = - new MLRemoteClassificationAnalyzerSetting.Factory().setMinAcceptablePossibility(0.8f).create(); - this.analyzer = MLAnalyzerFactory.getInstance().getRemoteImageClassificationAnalyzer(setting); - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.classification_image); - // Create an MLFrame by using the bitmap. 
- MLFrame frame = MLFrame.fromBitmap(bitmap); - Task> task = this.analyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener>() { - @Override - public void onSuccess(List classifications) { - // Recognition success. - ImageClassificationAnalyseActivity.this.displaySuccess(classifications); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - ImageClassificationAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - private void displaySuccess(List classifications) { - String result = ""; - int count = 0; - for (MLImageClassification classification : classifications) { - count++; - if (count % 3 == 0) { - result += classification.getName() + "\n"; - } else { - result += classification.getName() + "\t\t\t\t\t\t"; - } - } - this.mTextView.setText(result); - } - - @Override - public void onClick(View v) { - this.localAnalyzer(); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.analyzer == null) { - return; - } - try { - this.analyzer.stop(); - } catch (IOException e) { - Log.e(ImageClassificationAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/document/ImageDocumentAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/document/ImageDocumentAnalyseActivity.java deleted file mode 100644 index e716604..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/document/ImageDocumentAnalyseActivity.java +++ /dev/null @@ -1,133 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.document; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.hms.mlsdk.document.MLDocument; -import com.huawei.hms.mlsdk.document.MLDocumentAnalyzer; -import com.huawei.hms.mlsdk.document.MLDocumentSetting; -import com.huawei.hms.mlsdk.text.MLRemoteTextSetting; -import com.huawei.mlkit.example.R; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.List; - -/* If you want to use document analyzer, - you need to apply for an agconnect-services.json file in the developer alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - replacing the sample-agconnect-services.json in the project. 
- */ -public class ImageDocumentAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "ImageDocumentAnalyse"; - - private TextView mTextView; - - private MLDocumentAnalyzer analyzer; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_document_analyse); - this.mTextView = this.findViewById(R.id.document_result); - this.findViewById(R.id.document_detect).setOnClickListener(this); - } - - private void analyzer() { - // Create a document analyzer. You can create an analyzer using the provided custom document recognition - // parameter MLDocumentSetting - MLDocumentSetting setting = new MLDocumentSetting.Factory() - .setBorderType(MLRemoteTextSetting.ARC) - .setLanguageList(new ArrayList(){{this.add("zh"); this.add("en");}}) - .create(); - // Create a document analyzer that uses the customized configuration. - this.analyzer = MLAnalyzerFactory.getInstance().getRemoteDocumentAnalyzer(setting); - - // Create a document analyzer that uses the default configuration. - // analyzer = MLAnalyzerFactory.getInstance().getRemoteDocumentAnalyzer(); - // Pass the MLFrame object to the asyncAnalyseFrame method for document recognition. - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.document_image); - MLFrame frame = MLFrame.fromBitmap(bitmap); - Task task = this.analyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(MLDocument document) { - // Recognition success. - ImageDocumentAnalyseActivity.this.displaySuccess(document); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - ImageDocumentAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - private void displaySuccess(MLDocument document) { - String result = ""; - List blocks = document.getBlocks(); - for (MLDocument.Block block : blocks) { - List sections = block.getSections(); - for (MLDocument.Section section : sections) { - List lines = section.getLineList(); - for (MLDocument.Line line : lines) { - List words = line.getWordList(); - for (MLDocument.Word word : words) { - result += word.getStringValue() + " "; - } - } - } - result += "\n"; - } - this.mTextView.setText(result); - } - - @Override - public void onClick(View v) { - this.analyzer(); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.analyzer == null) { - return; - } - try { - this.analyzer.close(); - } catch (IOException e) { - Log.e(ImageDocumentAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/FaceAnalyzerTransactor.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/FaceAnalyzerTransactor.java deleted file mode 100644 index fa01e62..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/FaceAnalyzerTransactor.java +++ /dev/null @@ -1,47 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.face; - -import android.util.SparseArray; - -import com.huawei.hms.mlsdk.common.MLAnalyzer; -import com.huawei.hms.mlsdk.face.MLFace; -import com.huawei.mlkit.example.camera.GraphicOverlay; - -public class FaceAnalyzerTransactor implements MLAnalyzer.MLTransactor { - private GraphicOverlay mGraphicOverlay; - - FaceAnalyzerTransactor(GraphicOverlay ocrGraphicOverlay) { - this.mGraphicOverlay = ocrGraphicOverlay; - } - - @Override - public void transactResult(MLAnalyzer.Result result) { - this.mGraphicOverlay.clear(); - SparseArray faceSparseArray = result.getAnalyseList(); - for (int i = 0; i < faceSparseArray.size(); i++) { - MLFaceGraphic graphic = new MLFaceGraphic(this.mGraphicOverlay, faceSparseArray.valueAt(i)); - this.mGraphicOverlay.add(graphic); - } - } - - @Override - public void destroy() { - this.mGraphicOverlay.clear(); - } - -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/LiveFaceAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/LiveFaceAnalyseActivity.java deleted file mode 100644 index db69f49..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/LiveFaceAnalyseActivity.java +++ /dev/null @@ -1,184 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package com.huawei.mlkit.example.face; - -import android.Manifest; -import android.content.Context; -import android.content.pm.PackageManager; -import android.os.Bundle; -import android.util.Log; -import android.view.View; - -import androidx.annotation.NonNull; -import androidx.appcompat.app.AppCompatActivity; -import androidx.core.app.ActivityCompat; - -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.LensEngine; -import com.huawei.hms.mlsdk.face.MLFaceAnalyzer; -import com.huawei.hms.mlsdk.face.MLFaceAnalyzerSetting; -import com.huawei.mlkit.example.R; -import com.huawei.mlkit.example.camera.GraphicOverlay; -import com.huawei.mlkit.example.camera.LensEnginePreview; - -import java.io.IOException; - -public class LiveFaceAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "LiveFaceAnalyse"; - - private static final int CAMERA_PERMISSION_CODE = 2; - - private MLFaceAnalyzer analyzer; - - private LensEngine mLensEngine; - - private LensEnginePreview mPreview; - - private GraphicOverlay mOverlay; - - private int lensType = LensEngine.BACK_LENS; - - private boolean isFront = false; - - @Override - public void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_live_face_analyse); - if (savedInstanceState != null) { - this.lensType = savedInstanceState.getInt("lensType"); - } - this.mPreview = this.findViewById(R.id.preview); - this.mOverlay = this.findViewById(R.id.overlay); - this.createFaceAnalyzer(); - this.findViewById(R.id.facingSwitch).setOnClickListener(this); - // Checking Camera Permissions - if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) { - this.createLensEngine(); - } else { - this.requestCameraPermission(); - } - } - - private void createFaceAnalyzer() { - // Create a face analyzer. You can create an analyzer using the provided customized face detection parameter - // MLFaceAnalyzerSetting - MLFaceAnalyzerSetting setting = - new MLFaceAnalyzerSetting.Factory() - .setFeatureType(MLFaceAnalyzerSetting.TYPE_FEATURES) - .setKeyPointType(MLFaceAnalyzerSetting.TYPE_KEYPOINTS) - .setMinFaceProportion(0.2f) - .allowTracing() - .create(); - this.analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer(setting); - this.analyzer.setTransactor(new FaceAnalyzerTransactor(this.mOverlay)); - } - - private void createLensEngine() { - Context context = this.getApplicationContext(); - // Create LensEngine. Recommended image size: large than 320*320, less than 1920*1920. 
- this.mLensEngine = new LensEngine.Creator(context, this.analyzer) - .setLensType(this.lensType) - .applyDisplayDimension(640, 480) - .applyFps(25.0f) - .enableAutomaticFocus(true) - .create(); - } - - @Override - protected void onResume() { - super.onResume(); - this.startLensEngine(); - } - - private void startLensEngine() { - if (this.mLensEngine != null) { - try { - this.mPreview.start(this.mLensEngine, this.mOverlay); - } catch (IOException e) { - Log.e(LiveFaceAnalyseActivity.TAG, "Failed to start lens engine.", e); - this.mLensEngine.release(); - this.mLensEngine = null; - } - } - } - - @Override - protected void onPause() { - super.onPause(); - this.mPreview.stop(); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.mLensEngine != null) { - this.mLensEngine.release(); - } - if (this.analyzer != null) { - try { - this.analyzer.stop(); - } catch (IOException e) { - Log.e(LiveFaceAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } - } - - private void requestCameraPermission() { - final String[] permissions = new String[] {Manifest.permission.CAMERA}; - - if (!ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CAMERA)) { - ActivityCompat.requestPermissions(this, permissions, LiveFaceAnalyseActivity.CAMERA_PERMISSION_CODE); - return; - } - } - - @Override - public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, - @NonNull int[] grantResults) { - if (requestCode != LiveFaceAnalyseActivity.CAMERA_PERMISSION_CODE) { - super.onRequestPermissionsResult(requestCode, permissions, grantResults); - return; - } - if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) { - this.createLensEngine(); - return; - } - } - - @Override - public void onSaveInstanceState(Bundle savedInstanceState) { - savedInstanceState.putInt("lensType", this.lensType); - super.onSaveInstanceState(savedInstanceState); - } - - @Override - public void onClick(View v) { - this.isFront = !this.isFront; - if (this.isFront) { - this.lensType = LensEngine.FRONT_LENS; - } else { - this.lensType = LensEngine.BACK_LENS; - } - if (this.mLensEngine != null) { - this.mLensEngine.close(); - } - this.createLensEngine(); - this.startLensEngine(); - } - - -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/MLFaceGraphic.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/MLFaceGraphic.java deleted file mode 100644 index aed9689..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/MLFaceGraphic.java +++ /dev/null @@ -1,265 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package com.huawei.mlkit.example.face; - -import android.graphics.Canvas; -import android.graphics.Color; -import android.graphics.Paint; -import android.graphics.Typeface; - -import com.huawei.hms.mlsdk.common.MLPosition; -import com.huawei.hms.mlsdk.face.MLFace; -import com.huawei.hms.mlsdk.face.MLFaceKeyPoint; -import com.huawei.hms.mlsdk.face.MLFaceShape; -import com.huawei.mlkit.example.camera.GraphicOverlay; - -import java.text.DecimalFormat; -import java.util.ArrayList; -import java.util.Collections; -import java.util.Comparator; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.Set; - -public class MLFaceGraphic extends GraphicOverlay.Graphic { - private static final float BOX_STROKE_WIDTH = 8.0f; - - private static final float LINE_WIDTH = 5.0f; - - private final GraphicOverlay overlay; - - private final Paint facePositionPaint; - - private final Paint landmarkPaint; - - private final Paint boxPaint; - - private final Paint facePaint; - - private final Paint eyePaint; - - private final Paint eyebrowPaint; - - private final Paint lipPaint; - - private final Paint nosePaint; - - private final Paint noseBasePaint; - - private final Paint textPaint; - - private final Paint probilityPaint; - - private volatile MLFace mFace; - - public MLFaceGraphic(GraphicOverlay overlay, MLFace face) { - super(overlay); - - this.mFace = face; - this.overlay = overlay; - final int selectedColor = Color.WHITE; - - this.facePositionPaint = new Paint(); - this.facePositionPaint.setColor(selectedColor); - - this.textPaint = new Paint(); - this.textPaint.setColor(Color.WHITE); - this.textPaint.setTextSize(24); - this.textPaint.setTypeface(Typeface.DEFAULT); - - this.probilityPaint = new Paint(); - this.probilityPaint.setColor(Color.WHITE); - this.probilityPaint.setTextSize(35); - this.probilityPaint.setTypeface(Typeface.DEFAULT); - - this.landmarkPaint = new Paint(); - this.landmarkPaint.setColor(Color.RED); - this.landmarkPaint.setStyle(Paint.Style.FILL); - this.landmarkPaint.setStrokeWidth(10f); - - this.boxPaint = new Paint(); - this.boxPaint.setColor(Color.WHITE); - this.boxPaint.setStyle(Paint.Style.STROKE); - this.boxPaint.setStrokeWidth(MLFaceGraphic.BOX_STROKE_WIDTH); - - this.facePaint = new Paint(); - this.facePaint.setColor(Color.parseColor("#ffcc66")); - this.facePaint.setStyle(Paint.Style.STROKE); - this.facePaint.setStrokeWidth(MLFaceGraphic.LINE_WIDTH); - - this.eyePaint = new Paint(); - this.eyePaint.setColor(Color.parseColor("#00ccff")); - this.eyePaint.setStyle(Paint.Style.STROKE); - this.eyePaint.setStrokeWidth(MLFaceGraphic.LINE_WIDTH); - - this.eyebrowPaint = new Paint(); - this.eyebrowPaint.setColor(Color.parseColor("#006666")); - this.eyebrowPaint.setStyle(Paint.Style.STROKE); - this.eyebrowPaint.setStrokeWidth(MLFaceGraphic.LINE_WIDTH); - - this.nosePaint = new Paint(); - this.nosePaint.setColor(Color.parseColor("#ffff00")); - this.nosePaint.setStyle(Paint.Style.STROKE); - this.nosePaint.setStrokeWidth(MLFaceGraphic.LINE_WIDTH); - - this.noseBasePaint = new Paint(); - this.noseBasePaint.setColor(Color.parseColor("#ff6699")); - this.noseBasePaint.setStyle(Paint.Style.STROKE); - this.noseBasePaint.setStrokeWidth(MLFaceGraphic.LINE_WIDTH); - - this.lipPaint = new Paint(); - this.lipPaint.setColor(Color.parseColor("#990000")); - this.lipPaint.setStyle(Paint.Style.STROKE); - this.lipPaint.setStrokeWidth(MLFaceGraphic.LINE_WIDTH); - } - - public List sortHashMap(HashMap map) { - - Set> entey = map.entrySet(); - List> list = new 
ArrayList>(entey); - Collections.sort(list, new Comparator>() { - @Override - public int compare(Map.Entry o1, Map.Entry o2) { - if (o2.getValue() - o1.getValue() >= 0) { - return 1; - } else { - return -1; - } - } - }); - List emotions = new ArrayList<>(); - for (int i = 0; i < 2; i++) { - emotions.add(list.get(i).getKey()); - } - return emotions; - } - - @Override - public void draw(Canvas canvas) { - if (this.mFace == null) { - return; - } - float start = 350f; - float x = start; - float width = 500f; - float y = this.overlay.getHeight() - 300.0f; - HashMap emotions = new HashMap<>(); - emotions.put("Smiling", this.mFace.getEmotions().getSmilingProbability()); - emotions.put("Neutral", this.mFace.getEmotions().getNeutralProbability()); - emotions.put("Angry", this.mFace.getEmotions().getAngryProbability()); - emotions.put("Fear", this.mFace.getEmotions().getFearProbability()); - emotions.put("Sad", this.mFace.getEmotions().getSadProbability()); - emotions.put("Disgust", this.mFace.getEmotions().getDisgustProbability()); - emotions.put("Surprise", this.mFace.getEmotions().getSurpriseProbability()); - List result = this.sortHashMap(emotions); - - DecimalFormat decimalFormat = new DecimalFormat("0.000"); - // Draw the facial feature value. - canvas.drawText("Left eye: " + decimalFormat.format(this.mFace.getFeatures().getLeftEyeOpenProbability()), x, y, - this.probilityPaint); - x = x + width; - canvas.drawText("Right eye: " + decimalFormat.format(this.mFace.getFeatures().getRightEyeOpenProbability()), x, y, - this.probilityPaint); - y = y - 40.0f; - x = start; - canvas.drawText("Moustache Probability: " + decimalFormat.format(this.mFace.getFeatures().getMoustacheProbability()), - x, y, this.probilityPaint); - x = x + width; - canvas.drawText("Glass Probability: " + decimalFormat.format(this.mFace.getFeatures().getSunGlassProbability()), x, - y, this.probilityPaint); - y = y - 40.0f; - x = start; - canvas.drawText("Hat Probability: " + decimalFormat.format(this.mFace.getFeatures().getHatProbability()), x, y, - this.probilityPaint); - x = x + width; - canvas.drawText("Age: " + this.mFace.getFeatures().getAge(), x, y, this.probilityPaint); - y = y - 40.0f; - x = start; - String sex = (this.mFace.getFeatures().getSexProbability() > 0.5f) ? "Female" : "Male"; - canvas.drawText("Gender: " + sex, x, y, this.probilityPaint); - x = x + width; - canvas.drawText("RotationAngleY: " + decimalFormat.format(this.mFace.getRotationAngleY()), x, y, this.probilityPaint); - y = y - 40.0f; - x = start; - canvas.drawText("RotationAngleZ: " + decimalFormat.format(this.mFace.getRotationAngleZ()), x, y, this.probilityPaint); - x = x + width; - canvas.drawText("RotationAngleX: " + decimalFormat.format(this.mFace.getRotationAngleX()), x, y, this.probilityPaint); - y = y - 40.0f; - x = start; - canvas.drawText(result.get(0), x, y, this.probilityPaint); - - // Draw a face contour. 
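// --- Illustrative sketch (not part of the original diff): the same "pick the two most likely
// emotions" step as sortHashMap() above, written as a small stream pipeline and using
// Float.compare rather than a subtraction-based comparator. Class and method names are hypothetical.
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

class TopEmotionsSketch {
    static List<String> topTwo(Map<String, Float> emotions) {
        return emotions.entrySet().stream()
                .sorted((a, b) -> Float.compare(b.getValue(), a.getValue())) // descending by probability
                .limit(2)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }
}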
- if (this.mFace.getFaceShapeList() != null) { - for (MLFaceShape faceShape : this.mFace.getFaceShapeList()) { - if (faceShape == null) { - continue; - } - List points = faceShape.getPoints(); - for (int i = 0; i < points.size(); i++) { - MLPosition point = points.get(i); - canvas.drawPoint(this.translateX(point.getX().floatValue()), this.translateY(point.getY().floatValue()), - this.boxPaint); - if (i != (points.size() - 1)) { - MLPosition next = points.get(i + 1); - if (point != null && point.getX() != null && point.getY() != null) { - if (i % 3 == 0) { - canvas.drawText(i + 1 + "", this.translateX(point.getX().floatValue()), - this.translateY(point.getY().floatValue()), this.textPaint); - } - canvas.drawLines(new float[] {this.translateX(point.getX().floatValue()), - this.translateY(point.getY().floatValue()), this.translateX(next.getX().floatValue()), - this.translateY(next.getY().floatValue())}, this.getPaint(faceShape)); - } - } - } - } - } - // Face Key Points - for (MLFaceKeyPoint keyPoint : this.mFace.getFaceKeyPoints()) { - if (keyPoint != null) { - MLPosition point = keyPoint.getPoint(); - canvas.drawCircle(this.translateX(point.getX()), this.translateY(point.getY()), 10f, this.landmarkPaint); - } - } - } - - private Paint getPaint(MLFaceShape faceShape) { - switch (faceShape.getFaceShapeType()) { - case MLFaceShape.TYPE_LEFT_EYE: - case MLFaceShape.TYPE_RIGHT_EYE: - return this.eyePaint; - case MLFaceShape.TYPE_BOTTOM_OF_LEFT_EYEBROW: - - case MLFaceShape.TYPE_BOTTOM_OF_RIGHT_EYEBROW: - case MLFaceShape.TYPE_TOP_OF_LEFT_EYEBROW: - case MLFaceShape.TYPE_TOP_OF_RIGHT_EYEBROW: - return this.eyebrowPaint; - case MLFaceShape.TYPE_BOTTOM_OF_LOWER_LIP: - case MLFaceShape.TYPE_TOP_OF_LOWER_LIP: - case MLFaceShape.TYPE_BOTTOM_OF_UPPER_LIP: - case MLFaceShape.TYPE_TOP_OF_UPPER_LIP: - return this.lipPaint; - case MLFaceShape.TYPE_BOTTOM_OF_NOSE: - return this.noseBasePaint; - case MLFaceShape.TYPE_BRIDGE_OF_NOSE: - return this.nosePaint; - default: - return this.facePaint; - } - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/StillFaceAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/StillFaceAnalyseActivity.java deleted file mode 100644 index 7a555e9..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/face/StillFaceAnalyseActivity.java +++ /dev/null @@ -1,122 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package com.huawei.mlkit.example.face; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.hms.mlsdk.face.MLFace; -import com.huawei.hms.mlsdk.face.MLFaceAnalyzer; -import com.huawei.mlkit.example.R; - -import java.io.IOException; -import java.text.DecimalFormat; -import java.util.List; - -/** - * Static image detection - */ -public class StillFaceAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "StillFaceAnalyse"; - - private TextView mTextView; - - private MLFaceAnalyzer analyzer; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_still_face_analyse); - this.mTextView = this.findViewById(R.id.result); - this.findViewById(R.id.face_detect).setOnClickListener(this); - } - - private void analyzer() { - // Use default parameter settings. - this.analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer(); - // Create an MLFrame by using the bitmap. Recommended image size: large than 320*320, less than 1920*1920. - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.face_image); - MLFrame frame = MLFrame.fromBitmap(bitmap); - // Call the asyncAnalyseFrame method to perform face detection - Task> task = this.analyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener>() { - @Override - public void onSuccess(List faces) { - // Detection success. - if (faces.size() > 0) { - StillFaceAnalyseActivity.this.displaySuccess(faces.get(0)); - } - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Detection failure. - StillFaceAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - private void displaySuccess(MLFace mFace) { - DecimalFormat decimalFormat = new DecimalFormat("0.000"); - String result = - "Left eye open Probability: " + decimalFormat.format(mFace.getFeatures().getLeftEyeOpenProbability()); - result += - "\nRight eye open Probability: " + decimalFormat.format(mFace.getFeatures().getRightEyeOpenProbability()); - result += "\nMoustache Probability: " + decimalFormat.format(mFace.getFeatures().getMoustacheProbability()); - result += "\nGlass Probability: " + decimalFormat.format(mFace.getFeatures().getSunGlassProbability()); - result += "\nHat Probability: " + decimalFormat.format(mFace.getFeatures().getHatProbability()); - result += "\nAge: " + mFace.getFeatures().getAge(); - result += "\nGender: " + ((mFace.getFeatures().getSexProbability() > 0.5f) ? 
"Female" : "Male"); - result += "\nRotationAngleY: " + decimalFormat.format(mFace.getRotationAngleY()); - result += "\nRotationAngleZ: " + decimalFormat.format(mFace.getRotationAngleZ()); - result += "\nRotationAngleX: " + decimalFormat.format(mFace.getRotationAngleX()); - this.mTextView.setText(result); - } - - @Override - public void onClick(View v) { - this.analyzer(); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.analyzer == null) { - return; - } - try { - this.analyzer.stop(); - } catch (IOException e) { - Log.e(StillFaceAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/BlockItem.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/BlockItem.java deleted file mode 100644 index 755761e..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/BlockItem.java +++ /dev/null @@ -1,33 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.generalCard; - -import android.graphics.Rect; - -/** - * Re-encapsulate the return result of OCR in Block. - * - */ -public class BlockItem { - public final String text; - public final Rect rect; - - public BlockItem(String text, Rect rect) { - this.text = text; - this.rect = rect; - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/GcrAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/GcrAnalyseActivity.java deleted file mode 100644 index 58dbaf5..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/GcrAnalyseActivity.java +++ /dev/null @@ -1,273 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -package com.huawei.mlkit.example.generalCard; - -import android.Manifest; -import android.content.pm.PackageManager; -import android.graphics.Bitmap; -import android.graphics.Color; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.ImageView; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; -import androidx.core.app.ActivityCompat; - -import com.huawei.hms.mlplugin.card.gcr.MLGcrCapture; -import com.huawei.hms.mlplugin.card.gcr.MLGcrCaptureConfig; -import com.huawei.hms.mlplugin.card.gcr.MLGcrCaptureFactory; -import com.huawei.hms.mlplugin.card.gcr.MLGcrCaptureResult; -import com.huawei.hms.mlplugin.card.gcr.MLGcrCaptureUIConfig; -import com.huawei.mlkit.example.R; - -/** - * It provides the identification function of general cards, - * and recognizes formatted text information from the images with card information. - * General card recognition is a plugin that encapsulates text recognition. - * Developers can integrate this plug-in to obtain the ability of card identification - * pre-processing (quality inspection, etc.). At the same time, developers need to implement the - * detection according to their own application scenarios Post-processing of results. - * In this example, post-processing of the Exit-Entry Permit for Travelling to and from - * Hong Kong and Macao is implemented for developers' reference - */ -public class GcrAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "GcrAnalyse"; - private int CAMERA_PERMISSION_CODE = 100; - private TextView mTextView; - - private ImageView previewImage; - - private Bitmap cardImage; - - private int processMode; - - private final int HKIDPROCESS = 1; - - private final int HOMECARDPROCESS = 2; - - private final int PASSCARDPROCESS = 3; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_gcr_analyse); - this.mTextView = this.findViewById(R.id.text_result); - this.previewImage = this.findViewById(R.id.Card_image); - this.previewImage.setOnClickListener(this); - this.previewImage.setScaleType(ImageView.ScaleType.FIT_XY); - this.findViewById(R.id.detect_picture_HKID).setOnClickListener(this); - this.findViewById(R.id.detect_picture_homeCard).setOnClickListener(this); - this.findViewById(R.id.detect_picture_passCard).setOnClickListener(this); - this.findViewById(R.id.detect_video_HKID).setOnClickListener(this); - this.findViewById(R.id.detect_video_homeCard).setOnClickListener(this); - this.findViewById(R.id.detect_video_passCard).setOnClickListener(this); - if (!(ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED)) { - this.requestCameraPermission(); - } - } - - private void requestCameraPermission() { - final String[] permissions = new String[]{Manifest.permission.CAMERA}; - - if (!ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CAMERA)) { - ActivityCompat.requestPermissions(this, permissions, this.CAMERA_PERMISSION_CODE); - return; - } - } - - @Override - public void onClick(View v) { - switch (v.getId()) { - case R.id.detect_picture_passCard: - this.mTextView.setText(""); - this.processMode = this.PASSCARDPROCESS; - this.startLocalImageActivity(this.cardImage, null, this.callback); - break; - case R.id.detect_video_passCard: - this.mTextView.setText(""); - this.processMode = this.PASSCARDPROCESS; - 
this.startCaptureActivity(null, this.callback); - break; - case R.id.detect_picture_HKID: - this.mTextView.setText(""); - this.processMode = this.HKIDPROCESS; - this.startLocalImageActivity(this.cardImage, null, this.callback); - break; - case R.id.detect_video_HKID: - this.mTextView.setText(""); - this.processMode = this.HKIDPROCESS; - this.startCaptureActivity(null, this.callback); - break; - case R.id.detect_picture_homeCard: - this.mTextView.setText(""); - this.processMode = this.HOMECARDPROCESS; - this.startLocalImageActivity(this.cardImage, null, this.callback); - break; - case R.id.detect_video_homeCard: - this.mTextView.setText(""); - this.processMode = this.HOMECARDPROCESS; - this.startCaptureActivity(null, this.callback); - break; - case R.id.Card_image: - this.mTextView.setText(""); - this.processMode = this.PASSCARDPROCESS; - this.startTakePhotoActivity(null, this.callback); - break; - default: - break; - } - } - - private void displaySuccess(UniversalCardResult mlIdCard) { - StringBuilder resultBuilder = new StringBuilder(); - resultBuilder.append("IDNum: " + mlIdCard.number + "\r\n"); - resultBuilder.append("ValidDate: " + mlIdCard.valid + "\r\n"); - this.mTextView.setText(resultBuilder.toString()); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - } - - /** - * Use the card recognition plugin to identify cards. - * Create a recognition result callback function to process the identification result of the card. - */ - private MLGcrCapture.Callback callback = new MLGcrCapture.Callback() { - // Identify successful processing. - @Override - public int onResult(MLGcrCaptureResult result, Object object) { - Log.i(GcrAnalyseActivity.TAG, "callback onRecSuccess"); - if (result == null) { - Log.e(GcrAnalyseActivity.TAG, "callback onRecSuccess result is null"); - // If result is empty, return MLGcrCaptureResult.CAPTURE_CONTINUE, and the detector will continue to detect. - return MLGcrCaptureResult.CAPTURE_CONTINUE; - } - UniversalCardResult cardResult = null; - - switch (processMode) { - case PASSCARDPROCESS: - PassCardProcess passCard = new PassCardProcess(result.text); - if (passCard != null) { - cardResult = passCard.getResult(); - } - break; - case HKIDPROCESS: - HKIdCardProcess HKIDCard = new HKIdCardProcess(result.text); - if (HKIDCard != null) { - cardResult = HKIDCard.getResult(); - } - break; - case HOMECARDPROCESS: - HomeCardProcess homeCard = new HomeCardProcess(result.text); - if (homeCard != null) { - cardResult = homeCard.getResult(); - } - break; - default: - break; - } - - if (cardResult == null || cardResult.valid.isEmpty() || cardResult.number.isEmpty()) { - // If detection is not successful, return MLGcrCaptureResult.CAPTURE_CONTINUE, and the detector will continue to detect. - return MLGcrCaptureResult.CAPTURE_CONTINUE; - } - GcrAnalyseActivity.this.cardImage = result.cardBitmap; - GcrAnalyseActivity.this.previewImage.setImageBitmap(GcrAnalyseActivity.this.cardImage); - GcrAnalyseActivity.this.displaySuccess(cardResult); - // If detection is successful, return MLGcrCaptureResult.CAPTURE_STOP, and the detector will stop to detect. 
- return MLGcrCaptureResult.CAPTURE_STOP; - } - - @Override - public void onCanceled() { - Log.i(GcrAnalyseActivity.TAG, "callback onRecCanceled"); - } - - @Override - public void onFailure(int restCode, Bitmap var2) { - GcrAnalyseActivity.this.displayFailure(); - Log.i(GcrAnalyseActivity.TAG, "callback onFailure"); - } - - @Override - public void onDenied() { - GcrAnalyseActivity.this.displayFailure(); - Log.i(GcrAnalyseActivity.TAG, "callback onCameraDenied"); - } - }; - - /** - * Use the plug-in to take a picture of the card and recognize. - * - * @param object - * @param callback - */ - private void startTakePhotoActivity(Object object, MLGcrCapture.Callback callback) { - MLGcrCaptureConfig cardConfig = new MLGcrCaptureConfig.Factory().create(); - MLGcrCaptureUIConfig uiConfig = new MLGcrCaptureUIConfig.Factory() - .setScanBoxCornerColor(Color.BLUE) - .setTipText("Taking EEP to HK/Macau picture") - .setOrientation(MLGcrCaptureUIConfig.ORIENTATION_AUTO).create(); - // Create a general card identification processor using the custom interface. - MLGcrCapture ocrManager = MLGcrCaptureFactory.getInstance().getGcrCapture(cardConfig, uiConfig); - - // Create a general card identification processor using the default interface. - //MLGcrCapture ocrManager = MLGcrCaptureFactory.getInstance().getGcrCapture(cardConfig); - - ocrManager.capturePhoto(this, object, callback); - } - - /** - * Detect input card bitmap. - * - * @param bitmap - * @param object - * @param callback - */ - private void startLocalImageActivity(Bitmap bitmap, Object object, MLGcrCapture.Callback callback) { - if (bitmap == null) { - this.mTextView.setText("No card image to recognition."); - return; - } - MLGcrCaptureConfig config = new MLGcrCaptureConfig.Factory().create(); - MLGcrCapture ocrManager = MLGcrCaptureFactory.getInstance().getGcrCapture(config); - ocrManager.captureImage(bitmap, object, callback); - } - - /** - * Set the recognition parameters, call the recognizer capture interface for recognition, and the recognition result will be returned through the callback function. - * - * @param callBack The callback of cards analyse. - */ - private void startCaptureActivity(Object object, MLGcrCapture.Callback callBack) { - MLGcrCaptureConfig cardConfig = new MLGcrCaptureConfig.Factory().create(); - MLGcrCaptureUIConfig uiConfig = new MLGcrCaptureUIConfig.Factory() - .setScanBoxCornerColor(Color.GREEN) - .setTipText("Recognizing, align edges") - .setOrientation(MLGcrCaptureUIConfig.ORIENTATION_AUTO).create(); - MLGcrCapture ocrManager = MLGcrCaptureFactory.getInstance().getGcrCapture(cardConfig, uiConfig); - ocrManager.capturePreview(this, object, callBack); - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/HKIdCardProcess.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/HKIdCardProcess.java deleted file mode 100644 index fbff70a..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/HKIdCardProcess.java +++ /dev/null @@ -1,108 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.generalCard; - -import android.graphics.Point; -import android.graphics.Rect; -import android.util.Log; - -import com.huawei.hms.mlsdk.text.MLText; - -import java.util.ArrayList; -import java.util.List; - -/** - * Hong Kong Permanent Identity Card post-processing plug-in. - * - */ -public class HKIdCardProcess { - private static final String TAG = "MLGcrPluginDemo"; - - private final MLText text; - - public HKIdCardProcess(MLText text) { - this.text = text; - } - - public UniversalCardResult getResult() { - List blocks = this.text.getBlocks(); - if (blocks.isEmpty()) { - Log.i(HKIdCardProcess.TAG, "HKIdCardProcess::getResult blocks is empty"); - return null; - } - - ArrayList originItems = this.getOriginItems(blocks); - - String valid = ""; - String number = ""; - boolean numberFlag = false; - boolean validFlag = false; - - int location = 1; - for (BlockItem item : originItems) { - String tempStr = item.text; - - if (!validFlag && (originItems.size() - location) < 3) { - String result = this.tryGetValidDate(tempStr); - if (!result.isEmpty()) { - valid = result; - validFlag = true; - } - } - - if (!numberFlag) { - String result = this.tryGetCardNumber(tempStr); - if (!result.isEmpty()) { - number = result; - numberFlag = true; - } - } - location++; - } - - Log.i(HKIdCardProcess.TAG, "valid: " + valid); - Log.i(HKIdCardProcess.TAG, "number: " + number); - - return new UniversalCardResult(valid, number); - } - - private String tryGetValidDate(String originStr) { - int[] formatter = {2, 2, 2}; - return Utils.getCorrectDate(originStr, "\\-", formatter); - } - - private String tryGetCardNumber(String originStr) { - return Utils.getHKIdCardNum(originStr); - } - - private ArrayList getOriginItems(List blocks) { - ArrayList originItems = new ArrayList<>(); - for (MLText.Block block : blocks) { - List lines = block.getContents(); - for (MLText.TextLine line : lines) { - String text = line.getStringValue(); - text = Utils.filterString(text, "[^a-zA-Z0-9\\.\\-,<\\(\\)\\s]"); - Log.d(HKIdCardProcess.TAG, "HKIdCardProcess text: " + text); - Point[] points = line.getVertexes(); - Rect rect = new Rect(points[0].x, points[0].y, points[2].x, points[2].y); - BlockItem item = new BlockItem(text, rect); - originItems.add(item); - } - } - return originItems; - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/HomeCardProcess.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/HomeCardProcess.java deleted file mode 100644 index 8ee8ecc..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/HomeCardProcess.java +++ /dev/null @@ -1,106 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.generalCard; - -import android.graphics.Point; -import android.graphics.Rect; -import android.util.Log; - -import com.huawei.hms.mlsdk.text.MLText; - -import java.util.ArrayList; -import java.util.List; - -/** - * Home Return Certificate (for Compatriots from Hong Kong and Macao) post-processing plug-in. - * - */ -public class HomeCardProcess { - private static final String TAG = "MLGcrPluginDemo"; - - private final MLText text; - - public HomeCardProcess(MLText text) { - this.text = text; - } - - public UniversalCardResult getResult() { - List blocks = this.text.getBlocks(); - if (blocks.isEmpty()) { - Log.i(HomeCardProcess.TAG, "HomeCardProcess::getResult blocks is empty"); - return null; - } - - ArrayList originItems = this.getOriginItems(blocks); - - String valid = ""; - String number = ""; - boolean numberFlag = false; - boolean validFlag = false; - - for (BlockItem item : originItems) { - String tempStr = item.text; - - if (!validFlag) { - String result = this.tryGetValidDate(tempStr); - if (!result.isEmpty()) { - valid = result; - validFlag = true; - } - } - - if (!numberFlag) { - String result = this.tryGetCardNumber(tempStr); - if (!result.isEmpty()) { - number = result; - numberFlag = true; - } - } - } - - Log.i(HomeCardProcess.TAG, "valid: " + valid); - Log.i(HomeCardProcess.TAG, "number: " + number); - - return new UniversalCardResult(valid, number); - } - - private ArrayList getOriginItems(List blocks) { - ArrayList originItems = new ArrayList<>(); - - for (MLText.Block block : blocks) { - List lines = block.getContents(); - for (MLText.TextLine line : lines) { - String text = line.getStringValue(); - text = Utils.filterString(text, "[^a-zA-Z0-9\\.\\-,<\\(\\)\\s]"); - Log.d(HomeCardProcess.TAG, "HomeCardProcess text: " + text); - Point[] points = line.getVertexes(); - Rect rect = new Rect(points[0].x, points[0].y, points[2].x, points[2].y); - BlockItem item = new BlockItem(text, rect); - originItems.add(item); - } - } - return originItems; - } - - private String tryGetValidDate(String originStr) { - return Utils.getCorrectValidDate(originStr); - } - - private String tryGetCardNumber(String originStr) { - return Utils.getHomeCardNumber(originStr); - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/PassCardProcess.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/PassCardProcess.java deleted file mode 100644 index 362c6af..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/PassCardProcess.java +++ /dev/null @@ -1,120 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.generalCard; - -import java.util.ArrayList; -import java.util.List; -import java.util.Locale; - -import android.graphics.Point; -import android.graphics.Rect; -import android.util.Log; - -import com.huawei.hms.mlsdk.text.MLText; - -/** - * Post-processing plug-in for Exit-Entry Permit for Travelling to and from Hong Kong and Macao. - * - */ -public class PassCardProcess { - private static final String TAG = "MLGcrPlugin"; - - private final MLText text; - - public PassCardProcess(MLText text) { - this.text = text; - } - - // Re-encapsulate the return result of OCR in Block. - class BlockItem { - public final String text; - public final Rect rect; - - public BlockItem(String text, Rect rect) { - this.text = text; - this.rect = rect; - } - } - - public UniversalCardResult getResult() { - List blocks = this.text.getBlocks(); - if (blocks.isEmpty()) { - Log.i(PassCardProcess.TAG, "PassCardProcess::getResult blocks is empty"); - return null; - } - ArrayList originItems = this.getOriginItems(blocks); - String valid = ""; - String number = ""; - boolean validFlag = false; - boolean numberFlag = false; - - for (BlockItem item : originItems) { - String tempStr = item.text; - if (!validFlag) { - String result = this.tryGetValidDate(tempStr); - if (!result.isEmpty()) { - valid = result; - validFlag = true; - } - } - if (!numberFlag) { - String result = this.tryGetCardNumber(tempStr); - if (!result.isEmpty()) { - number = result; - numberFlag = true; - } - } - } - return new UniversalCardResult(valid, number); - } - - private String tryGetValidDate(String originStr) { - return Utils.getCorrectValidDate(originStr); - } - - private String tryGetCardNumber(String originStr) { - String result = Utils.getPassCardNumber(originStr); - if (!result.isEmpty()) { - result = result.toUpperCase(Locale.ENGLISH); - result = Utils.filterString(result, "[^0-9A-Z<]"); - } - return result; - } - - /** - * Transform the detected text into BlockItem structure. - * @param blocks - * @return - */ - private ArrayList getOriginItems(List blocks) { - ArrayList originItems = new ArrayList<>(); - - for (MLText.Block block : blocks) { - List lines = block.getContents(); - for (MLText.TextLine line : lines) { - String text = line.getStringValue(); - text = Utils.filterString(text, "[^a-zA-Z0-9\\.\\-,<\\(\\)\\s]"); - Log.d(PassCardProcess.TAG, "PassCardProcess text: " + text); - Point[] points = line.getVertexes(); - Rect rect = new Rect(points[0].x, points[0].y, points[2].x, points[2].y); - BlockItem item = new BlockItem(text, rect); - originItems.add(item); - } - } - return originItems; - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/UniversalCardResult.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/UniversalCardResult.java deleted file mode 100644 index 103c4e2..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/UniversalCardResult.java +++ /dev/null @@ -1,31 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. 
- * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.generalCard; - -/** - * Return result class of post-processing plugin. - * - */ -public class UniversalCardResult { - public final String valid; - public final String number; - - public UniversalCardResult(String valid, String number) { - this.valid = valid; - this.number = number; - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/Utils.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/Utils.java deleted file mode 100644 index c9435b6..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/generalCard/Utils.java +++ /dev/null @@ -1,352 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.generalCard; - -import java.util.HashMap; -import java.util.Locale; -import java.util.Map; -import java.util.regex.Matcher; -import java.util.regex.Pattern; - -/** - * Defining post-processing rules and methods for Exit-Entry Permit for Travelling to and from Hong Kong and Macao, Hong Kong Permanent Identity Card and Home Return Certificate (for Compatriots from Hong Kong and Macao). - */ -public class Utils { - private static Map letterNumberMap = new HashMap<>(); - private static Map numberLetterMap = new HashMap<>(); - - // Misrecognized letters and numbers. - static { - Utils.letterNumberMap.put("i", "1"); - Utils.letterNumberMap.put("I", "1"); - Utils.letterNumberMap.put("o", "0"); - Utils.letterNumberMap.put("O", "0"); - Utils.letterNumberMap.put("z", "2"); - Utils.letterNumberMap.put("Z", "2"); - - Utils.numberLetterMap.put("1", "I"); - Utils.numberLetterMap.put("0", "O"); - Utils.numberLetterMap.put("2", "Z"); - Utils.numberLetterMap.put("8", "B"); - } - - // Filter strings based on regular expressions. - public static String filterString(String origin, String filterStr) { - if (origin == null || origin.isEmpty()) { - return ""; - } - if (filterStr == null || filterStr.isEmpty()) { - return origin; - } - - Pattern pattern = Pattern.compile(filterStr); - Matcher matcher = pattern.matcher(origin); - return matcher.replaceAll("").trim(); - } - - // Get date in specified date format. 
- public static String getCorrectDate(String origin, String splitter, int[] formatter) { - if (origin == null || origin.isEmpty()) { - return ""; - } - if (splitter == null) { - return ""; - } - - int targetLength = 0; - for (int i : formatter) { - targetLength += i; - } - String newStr = Utils.correctLetterToNumber(origin); - newStr = Utils.filterString(newStr, "[^0-9,.-]"); - // If the length is less than the minimum length of the date(8), return the empty string directly. - if (newStr.length() < targetLength) { - return ""; - } - - int length = formatter.length; - String[] strings = newStr.split(splitter); - - if (strings.length < 2 || strings.length > 3) { - if (splitter.equals("\\.")) { - strings = newStr.split(","); - if (strings.length < 2 || strings.length > 3) { - return ""; - } - } else { - return ""; - } - } - - // If both the length and the number of delimiters are satisfied, the result is returned directly. - if (strings.length == length && newStr.length() == (targetLength + 2)) { - return newStr; - } - - char target = splitter.toCharArray()[1]; - return Utils.fixMissingDelimiter(newStr, target, formatter); - } - - // Completion of missing delimiters in the specified format. - public static String fixMissingDelimiter(String origin, char target, int[] format) { - if (origin == null || origin.isEmpty()) { - return ""; - } - if (format == null ) { - return ""; - } - - int newCharsLen = 0; - for (int temp : format) { - newCharsLen += temp; - } - - // After removing the symbol, it is illegal if smaller than the size of the format. - String temp = Utils.filterString(origin, "[^0-9a-zA-Z]"); - if (temp.length() < newCharsLen) { - return ""; - } - - char[] oldChars = origin.toCharArray(); - char[] newChars = new char[newCharsLen + (format.length - 1)]; - - if (oldChars.length < newCharsLen) { - return ""; - } - - int oldIndex = 0; - int newIndex = 0; - - for (int i = 0; i < format.length; i++) { - int tmp = format[i]; - - while (tmp-- > 0) { - newChars[newIndex++] = oldChars[oldIndex++]; - } - if (i != format.length - 1) { - if (Utils.blurMatchDelimiter(oldChars[oldIndex], target)) { - oldIndex++; - if (oldIndex >= oldChars.length) { - return ""; - } - } - newChars[newIndex++] = target; - } - } - return String.valueOf(newChars); - } - - // Get a string of validity in the format xxxx.xx.xx-xxxx.xx.xx. - public static String getCorrectValidDate(String origin) { - if (origin == null || origin.isEmpty()) { - return ""; - } - - String newStr = Utils.correctLetterToNumber(origin); - newStr = newStr.replaceAll("\\s{1,}", " "); - String[] strings = null; - - if (newStr.split("-").length == 2) { - // Standard case with '-' split between validity periods. - newStr = Utils.filterString(newStr, "[^0-9,.-]"); - if (newStr.length() < 18) { - return ""; - } - strings = newStr.split("-"); - } - - if (newStr.split(" ").length == 2) { - // Abnormal conditions, missing delimiters between validity periods. - strings = newStr.split(" "); - } - - if (strings == null || strings.length != 2) { - return ""; - } - - int[] formatter = {4, 2, 2}; - String startDate = Utils.getCorrectDate(strings[0], "\\.", formatter); - String endDate = Utils.getCorrectDate(strings[1], "\\.", formatter); - if (startDate.isEmpty() || endDate.isEmpty()) { - return ""; - } - return startDate + " - " + endDate; - } - - // Get the card number. 
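A quick illustration of how the three date helpers above cooperate: correctLetterToNumber() maps confusable letters such as 'O' and 'I' back to digits, getCorrectDate() checks the candidate against a fixed field layout, and fixMissingDelimiter() re-inserts a separator the OCR dropped. The inputs below are invented OCR-style strings:

```java
// Illustrative inputs only; Utils is the helper class defined in this file.
int[] yyyymmdd = {4, 2, 2};

// The missing '.' between month and day is re-inserted by fixMissingDelimiter():
String date = Utils.getCorrectDate("2015.0910", "\\.", yyyymmdd);
// date == "2015.09.10"

// 'O' misread for '0' is repaired first, then both halves of the validity
// period are normalised independently and re-joined with " - ":
String valid = Utils.getCorrectValidDate("2O15.09.10-2025.09.1O");
// valid == "2015.09.10 - 2025.09.10"
```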
- public static String getPassCardNumber(String origin) { - if (origin == null || origin.isEmpty()) { - return ""; - } - - String newStr = origin.trim(); - newStr = newStr.toUpperCase(Locale.ENGLISH); - newStr = Utils.filterString(newStr, "[^0-9A-Z<]"); - - if (newStr.length() < 27 || newStr.length() > 30) { - return ""; - } - - String[] splits = newStr.split("[<]"); - if (splits.length == 4) { - return newStr; - } - - int[] formatter = {12, 7, 7, 1}; - return Utils.fixMissingDelimiter(newStr, '<', formatter); - } - - // Obtaining a permanent identity card number for Hong Kong residents. - public static String getHKIdCardNum(String origin) { - if (origin == null || origin.isEmpty()) { - return ""; - } - - origin = Utils.filterString(origin, "[^0-9a-zA-Z()]"); - - if (origin.length() < 10) { - return ""; - } - - // The first character must be a letter. - String firstChar = origin.substring(0, 1); - if (!Character.isLowerCase(firstChar.charAt(0)) && !Character.isUpperCase(firstChar.charAt(0))) { - if (firstChar.equalsIgnoreCase("2")) { - firstChar = "Z"; - } else { - return ""; - } - } - - String number = origin.substring(1, 7); - number = Utils.correctLetterToNumber(number); - - // Judgment contains "()" . - String firstField = origin.substring(7, 8); - String middleField = origin.substring(8, 9); - String lastField = origin.substring(9, 10); - - if (!firstField.equals("(") && !lastField.equals(")")) { - return ""; - } - if (!firstField.equals("(")) { - if (firstField.equalsIgnoreCase("C") || firstField.equalsIgnoreCase("G")) { - firstField = "("; - } else { - return ""; - } - } - - String ret = firstChar + number + firstField + middleField + lastField; - return ret.toUpperCase(Locale.ENGLISH); - } - - // Get Home Return Certificate (for Compatriots from Hong Kong and Macao) number. - public static String getHomeCardNumber(String origin) { - if (origin == null || origin.isEmpty()) { - return ""; - } - - origin = Utils.filterString(origin, "[\\s]"); - if (origin.length() != 8 && origin.length() != 9) { - return ""; - } - - // Determine if the first character is a letter. - String firstLetter = origin.substring(0, 1); - if (Character.isLetter(firstLetter.charAt(0))) { - if (firstLetter.equalsIgnoreCase("H") || firstLetter.equalsIgnoreCase("M")) { - firstLetter = firstLetter.toUpperCase(Locale.ENGLISH); - String number = origin.substring(1, origin.length()); - number = Utils.filterString(number, "[^0-9]"); - number = Utils.correctLetterToNumber(number); - if (number.length() != 8) { - return ""; - } - return firstLetter + number; - } else { - return ""; - } - } - - // Home Return Certificate (for Compatriots from Hong Kong and Macao) only contains number. - if (origin.length() != 8) { - return ""; - } - - String number = origin.substring(0, 8); - number = Utils.filterString(number, "[^0-9]"); - number = Utils.correctLetterToNumber(number); - if (number.length() != 8) { - return ""; - } - return number; - } - - // Letters corrected to numbers. - public static String correctLetterToNumber(String str) { - if (str == null || str.isEmpty()) { - return ""; - } - char[] chars = str.toCharArray(); - - int length = chars.length; - for (int index = 0; index < length; index++) { - String tmp = Utils.letterNumberMap.get(Character.toString(chars[index])); - if (tmp != null) { - char[] tempChars = tmp.toCharArray(); - chars[index] = tempChars[0]; - } - } - - return String.valueOf(chars); - } - - // Numbers corrected to letters. 
- public static String correctNumberToLetter(String str) { - if (str == null || str.isEmpty()) { - return ""; - } - char[] chars = str.toCharArray(); - - int length = chars.length; - for (int index = 0; index < length; index++) { - String tmp = Utils.numberLetterMap.get(Character.toString(chars[index])); - if (tmp != null) { - char[] tempChars = tmp.toCharArray(); - chars[index] = tempChars[0]; - } - } - - return String.valueOf(chars); - } - - // Fuzzy match delimiter. - private static boolean blurMatchDelimiter(char origin, char target) { - if (target == '.') { - if (origin == '.' || origin == ',') { - return true; - } - } - if (target == '<') { - if (origin == 'K' || origin == 'X') { - return true; - } - } - return origin == target; - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/imgseg/ImageSegmentationAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/imgseg/ImageSegmentationAnalyseActivity.java deleted file mode 100644 index 38dcf01..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/imgseg/ImageSegmentationAnalyseActivity.java +++ /dev/null @@ -1,176 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
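The card-number helpers above rely on the same idea plus blurMatchDelimiter(), which tolerates ',' misread for '.' and 'K' or 'X' misread for '<'. A short sketch with invented digits (only the {12, 7, 7, 1} field widths and the fuzzy-match rule come from the code above):

```java
// Illustrative only; the digits are made up.
String passNumber = Utils.getPassCardNumber("C12345678HKGK1234567<1234567<0");
// The 'K' after the first 12 characters is treated as a misread '<' by
// blurMatchDelimiter(), so passNumber == "C12345678HKG<1234567<1234567<0".

String homeNumber = Utils.getHomeCardNumber("H12345678");
// A leading 'H' or 'M' followed by eight digits is accepted as-is: "H12345678".
```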
- */ - -package com.huawei.mlkit.example.imgseg; - -import java.io.IOException; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.graphics.Color; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.ImageView; -import android.widget.Toast; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.hms.mlsdk.imgseg.MLImageSegmentation; -import com.huawei.hms.mlsdk.imgseg.MLImageSegmentationAnalyzer; -import com.huawei.hms.mlsdk.imgseg.MLImageSegmentationClassification; -import com.huawei.hms.mlsdk.imgseg.MLImageSegmentationScene; -import com.huawei.hms.mlsdk.imgseg.MLImageSegmentationSetting; -import com.huawei.mlkit.example.R; - -public class ImageSegmentationAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = ImageSegmentationAnalyseActivity.class.getSimpleName(); - - private MLImageSegmentationAnalyzer analyzer; - - private ImageView mImageView; - - private Bitmap bitmap; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_segmentation_analyse); - this.findViewById(R.id.segment_detect).setOnClickListener(this); - this.mImageView = this.findViewById(R.id.image_result); - } - - @Override - public void onClick(View v) { - this.analyzer(); - } - - private void analyzer() { - /** - * Configure image segmentation analyzer with custom parameter MLImageSegmentationSetting. - * - * setExact(): Set the segmentation fine mode, true is the fine segmentation mode, - * and false is the speed priority segmentation mode. - * setAnalyzerType(): Set the segmentation mode. When segmenting a static image, support setting - * MLImageSegmentationSetting.BODY_SEG (only segment human body and background) - * and MLImageSegmentationSetting.IMAGE_SEG (segment 10 categories of scenes, including human bodies) - * setScene(): Set the type of the returned results. This configuration takes effect only in - * MLImageSegmentationSetting.BODY_SEG mode. In MLImageSegmentationSetting.IMAGE_SEG mode, - * only pixel-level tagging information is returned. - * Supports setting MLImageSegmentationScene.ALL (returns all segmentation results, - * including: pixel-level tag information, portrait images with transparent backgrounds - * and portraits are white, gray background with black background), - * MLImageSegmentationScene.MASK_ONLY (returns only pixel-level tag information), - * MLImageSegmentationScene .FOREGROUND_ONLY (returns only portrait images with transparent background), - * MLImageSegmentationScene.GRAYSCALE_ONLY (returns only grayscale images with white portrait and black background). - */ - MLImageSegmentationSetting setting = new MLImageSegmentationSetting.Factory() - .setExact(false) - .setAnalyzerType(MLImageSegmentationSetting.BODY_SEG) - .setScene(MLImageSegmentationScene.ALL) - .create(); - this.analyzer = MLAnalyzerFactory.getInstance().getImageSegmentationAnalyzer(setting); - // Create an MLFrame by using android.graphics.Bitmap. Recommended image size: large than 224*224. 
- this.bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.imgseg_foreground); - MLFrame mlFrame = new MLFrame.Creator().setBitmap(this.bitmap).create(); - Task task = this.analyzer.asyncAnalyseFrame(mlFrame); - task.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(MLImageSegmentation imageSegmentationResult) { - // Processing logic for recognition success. - if (imageSegmentationResult != null) { - ImageSegmentationAnalyseActivity.this.displaySuccess(imageSegmentationResult); - } else { - ImageSegmentationAnalyseActivity.this.displayFailure(); - } - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Processing logic for recognition failure. - ImageSegmentationAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displaySuccess(MLImageSegmentation imageSegmentationResult) { - if (this.bitmap == null) { - this.displayFailure(); - return; - } - // Draw the portrait with a transparent background. - Bitmap bitmapFore = imageSegmentationResult.getForeground(); - if (bitmapFore != null) { - this.mImageView.setImageBitmap(bitmapFore); - } else { - this.displayFailure(); - } - - /** - // Draw a segmentation result map based on the returned pixel-level marker information. - byte[] result = imageSegmentationResult.getMasks(); - if (result != null) { - int[] pixels = this.cutOutHuman(result); - Bitmap processBitmap = Bitmap.createBitmap(pixels, 0, this.bitmap.getWidth(), this.bitmap.getWidth(), - this.bitmap.getHeight(), Bitmap.Config.ARGB_8888); - this.mImageView.setImageBitmap(processBitmap); - } else { - this.displayFailure(); - }*/ - - /** - // Draw the gray image of the returned portrait as white and background as black. - Bitmap bitmapGray = imageSegmentationResult.getGrayscale(); - if (bitmapGray != null) { - this.mImageView.setImageBitmap(bitmapGray); - } else { - this.displayFailure(); - }*/ - } - - private void displayFailure() { - Toast.makeText(this.getApplicationContext(), "Fail", Toast.LENGTH_SHORT).show(); - } - - private int[] cutOutHuman(byte[] masks) { - int[] results = new int[this.bitmap.getWidth() * this.bitmap.getHeight()]; - this.bitmap.getPixels(results, 0, this.bitmap.getWidth(), 0, 0, this.bitmap.getWidth(), - this.bitmap.getHeight()); - for (int i = 0; i < masks.length; i++) { - if (masks[i] != MLImageSegmentationClassification.TYPE_HUMAN) { - results[i] = Color.WHITE; - } - } - return results; - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.analyzer != null) { - try { - this.analyzer.stop(); - } catch (IOException e) { - Log.e(ImageSegmentationAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/landmark/ImageLandmarkAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/landmark/ImageLandmarkAnalyseActivity.java deleted file mode 100644 index 2c198a8..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/landmark/ImageLandmarkAnalyseActivity.java +++ /dev/null @@ -1,133 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
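To complement the setScene() discussion above, here is a minimal sketch of the MASK_ONLY path. It assumes the same SDK classes, plus an existing srcBitmap and imageView, and whitens every non-human pixel from the returned mask, mirroring the commented-out cutOutHuman() branch:

```java
MLImageSegmentationSetting maskSetting = new MLImageSegmentationSetting.Factory()
        .setExact(true)                                        // fine segmentation mode
        .setAnalyzerType(MLImageSegmentationSetting.BODY_SEG)  // human body vs. background
        .setScene(MLImageSegmentationScene.MASK_ONLY)          // pixel-level mask only
        .create();
MLImageSegmentationAnalyzer maskAnalyzer =
        MLAnalyzerFactory.getInstance().getImageSegmentationAnalyzer(maskSetting);

MLFrame frame = new MLFrame.Creator().setBitmap(srcBitmap).create();
maskAnalyzer.asyncAnalyseFrame(frame).addOnSuccessListener(result -> {
    byte[] masks = result.getMasks();
    int[] pixels = new int[srcBitmap.getWidth() * srcBitmap.getHeight()];
    srcBitmap.getPixels(pixels, 0, srcBitmap.getWidth(), 0, 0,
            srcBitmap.getWidth(), srcBitmap.getHeight());
    for (int i = 0; i < masks.length; i++) {
        if (masks[i] != MLImageSegmentationClassification.TYPE_HUMAN) {
            pixels[i] = Color.WHITE;                           // blank out the background
        }
    }
    Bitmap cutOut = Bitmap.createBitmap(pixels, srcBitmap.getWidth(),
            srcBitmap.getHeight(), Bitmap.Config.ARGB_8888);
    imageView.setImageBitmap(cutOut);
}).addOnFailureListener(e -> Log.e(TAG, "Segmentation failed", e));
```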
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.landmark; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.MLCoordinate; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.hms.mlsdk.landmark.MLRemoteLandmark; -import com.huawei.hms.mlsdk.landmark.MLRemoteLandmarkAnalyzer; -import com.huawei.hms.mlsdk.landmark.MLRemoteLandmarkAnalyzerSetting; -import com.huawei.mlkit.example.R; - -import java.io.IOException; -import java.util.List; - -/* If you want to use landmark analyzer, - you need to apply for an agconnect-services.json file in the developer alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - replacing the sample-agconnect-services.json in the project. - */ -public class ImageLandmarkAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "StillFaceAnalyse"; - - private TextView mTextView; - - private MLRemoteLandmarkAnalyzer analyzer; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_landmark_analyse); - this.mTextView = this.findViewById(R.id.landmark_result); - this.findViewById(R.id.landmark_detect).setOnClickListener(this); - } - - private void analyzer() { - // Create a landmark analyzer. - // Use default parameter settings. - // analyzer = MLAnalyzerFactory.getInstance().getRemoteLandmarkAnalyzer(); - // Use customized parameter settings. - /** - * setLargestNumOfReturns: maximum number of recognition results. - * setPatternType: analyzer mode. - * MLRemoteLandmarkAnalyzerSetting.STEADY_PATTERN: The value 1 indicates the stable mode. - * MLRemoteLandmarkAnalyzerSetting.NEWEST_PATTERN: The value 2 indicates the latest mode. - */ - MLRemoteLandmarkAnalyzerSetting settings = new MLRemoteLandmarkAnalyzerSetting.Factory() - .setLargestNumOfReturns(1) - .setPatternType(MLRemoteLandmarkAnalyzerSetting.STEADY_PATTERN) - .create(); - this.analyzer = MLAnalyzerFactory.getInstance() - .getRemoteLandmarkAnalyzer(settings); - // Create an MLFrame by using android.graphics.Bitmap. Recommended image size: large than 640*640. - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.landmark_image); - MLFrame mlFrame = new MLFrame.Creator().setBitmap(bitmap).create(); - Task> task = this.analyzer.asyncAnalyseFrame(mlFrame); - task.addOnSuccessListener(new OnSuccessListener>() { - @Override - public void onSuccess(List landmarkResults) { - // Processing logic for recognition success. 
- ImageLandmarkAnalyseActivity.this.displaySuccess(landmarkResults.get(0)); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Processing logic for recognition failur - ImageLandmarkAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - private void displaySuccess(MLRemoteLandmark landmark) { - String result = ""; - if (landmark.getLandmark() != null) { - result = "Landmark: " + landmark.getLandmark(); - } - result += "\nPositions: "; - if (landmark.getPositionInfos() != null) { - for (MLCoordinate coordinate : landmark.getPositionInfos()) { - result += "\nLatitude:" + coordinate.getLat(); - result += "\nLongitude:" + coordinate.getLng(); - } - } - this.mTextView.setText(result); - } - - @Override - public void onClick(View v) { - this.analyzer(); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.analyzer == null) { - return; - } - try { - this.analyzer.close(); - } catch (IOException e) { - Log.e(ImageLandmarkAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/object/LiveObjectAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/object/LiveObjectAnalyseActivity.java deleted file mode 100644 index 2426f71..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/object/LiveObjectAnalyseActivity.java +++ /dev/null @@ -1,265 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
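Since setLargestNumOfReturns() above can request more than one landmark, and landmarkResults.get(0) throws on an empty list, a defensive sketch that iterates every returned result may be clearer (bitmap and TAG are assumed to exist, as in the activity above):

```java
MLRemoteLandmarkAnalyzerSetting newestSettings = new MLRemoteLandmarkAnalyzerSetting.Factory()
        .setLargestNumOfReturns(3)                                    // up to three results
        .setPatternType(MLRemoteLandmarkAnalyzerSetting.NEWEST_PATTERN)
        .create();
MLRemoteLandmarkAnalyzer landmarkAnalyzer =
        MLAnalyzerFactory.getInstance().getRemoteLandmarkAnalyzer(newestSettings);

MLFrame frame = new MLFrame.Creator().setBitmap(bitmap).create();
landmarkAnalyzer.asyncAnalyseFrame(frame)
        .addOnSuccessListener(landmarks -> {
            StringBuilder sb = new StringBuilder();
            for (MLRemoteLandmark landmark : landmarks) {
                sb.append(landmark.getLandmark());
                for (MLCoordinate pos : landmark.getPositionInfos()) {
                    sb.append(" (").append(pos.getLat()).append(", ")
                      .append(pos.getLng()).append(")");
                }
                sb.append('\n');
            }
            Log.i(TAG, sb.length() == 0 ? "No landmark recognized" : sb.toString());
        })
        .addOnFailureListener(e -> Log.e(TAG, "Landmark detection failed", e));
```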
- */ - -package com.huawei.mlkit.example.object; - -import java.io.IOException; -import java.util.List; - -import android.Manifest; -import android.content.Context; -import android.content.pm.PackageManager; -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.os.Bundle; -import android.os.Handler; -import android.os.Message; -import android.util.Log; -import android.util.SparseArray; -import android.view.View; -import android.widget.Button; - -import androidx.annotation.NonNull; -import androidx.appcompat.app.AppCompatActivity; -import androidx.core.app.ActivityCompat; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.LensEngine; -import com.huawei.hms.mlsdk.common.MLAnalyzer; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.hms.mlsdk.objects.MLObject; -import com.huawei.hms.mlsdk.objects.MLObjectAnalyzer; -import com.huawei.hms.mlsdk.objects.MLObjectAnalyzerSetting; -import com.huawei.mlkit.example.R; -import com.huawei.mlkit.example.camera.GraphicOverlay; -import com.huawei.mlkit.example.camera.LensEnginePreview; - -public class LiveObjectAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "LiveFaceAnalyse"; - - private static final int CAMERA_PERMISSION_CODE = 1; - - private MLObjectAnalyzer analyzer; - - private LensEngine mLensEngine; - - private boolean isStarted = true; - - private LensEnginePreview mPreview; - - private GraphicOverlay mOverlay; - - private int lensType = LensEngine.BACK_LENS; - - public boolean mlsNeedToDetect = true; - - private final static int STOP_PREVIEW = 1; - - private final static int START_PREVIEW = 2; - - @Override - public void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_live_object_analyse); - if (savedInstanceState != null) { - this.lensType = savedInstanceState.getInt("lensType"); - } - this.mPreview = this.findViewById(R.id.object_preview); - this.mOverlay = this.findViewById(R.id.object_overlay); - this.createObjectAnalyzer(); - Button start = this.findViewById(R.id.detect_start); - start.setOnClickListener(this); - // Checking Camera Permissions - if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) { - this.createLensEngine(); - } else { - this.requestCameraPermission(); - } - } - - private void requestCameraPermission() { - final String[] permissions = new String[]{Manifest.permission.CAMERA}; - - if (!ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CAMERA)) { - ActivityCompat.requestPermissions(this, permissions, LiveObjectAnalyseActivity.CAMERA_PERMISSION_CODE); - return; - } - } - - @Override - protected void onResume() { - super.onResume(); - this.startLensEngine(); - } - - @Override - protected void onPause() { - super.onPause(); - this.mPreview.stop(); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.mLensEngine != null) { - this.mLensEngine.release(); - } - if (this.analyzer != null) { - try { - this.analyzer.stop(); - } catch (IOException e) { - Log.e(LiveObjectAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } - } - - @Override - public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, - @NonNull int[] grantResults) { - if 
(requestCode != LiveObjectAnalyseActivity.CAMERA_PERMISSION_CODE) { - super.onRequestPermissionsResult(requestCode, permissions, grantResults); - return; - } - if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) { - this.createLensEngine(); - return; - } - } - - @Override - public void onSaveInstanceState(Bundle savedInstanceState) { - savedInstanceState.putInt("lensType", this.lensType); - super.onSaveInstanceState(savedInstanceState); - } - - // When you need to implement a scene that stops after recognizing specific content - // and continues to recognize after finishing processing, refer to this code - private Handler mHandler = new Handler() { - @Override - public void handleMessage(Message msg) { - super.handleMessage(msg); - switch (msg.what) { - case LiveObjectAnalyseActivity.START_PREVIEW: - LiveObjectAnalyseActivity.this.mlsNeedToDetect = true; - Log.d("object", "start to preview"); - LiveObjectAnalyseActivity.this.startPreview(); - break; - case LiveObjectAnalyseActivity.STOP_PREVIEW: - LiveObjectAnalyseActivity.this.mlsNeedToDetect = false; - Log.d("object", "stop to preview"); - LiveObjectAnalyseActivity.this.stopPreview(); - break; - default: - break; - } - } - }; - - private void stopPreview() { - this.mlsNeedToDetect = false; - if (this.mLensEngine != null) { - this.mLensEngine.release(); - } - if (this.analyzer != null) { - try { - this.analyzer.stop(); - } catch (IOException e) { - Log.d("object", "Stop failed: " + e.getMessage()); - } - } - this.isStarted = false; - } - - private void startPreview() { - if (this.isStarted) { - return; - } - this.createObjectAnalyzer(); - this.mPreview.release(); - this.createLensEngine(); - this.startLensEngine(); - this.isStarted = true; - } - - @Override - public void onClick(View v) { - this.mHandler.sendEmptyMessage(LiveObjectAnalyseActivity.START_PREVIEW); - } - - private void createObjectAnalyzer() { - // Create an object analyzer - // Use MLObjectAnalyzerSetting.TYPE_VIDEO for video stream detection. - // Use MLObjectAnalyzerSetting.TYPE_PICTURE for static image detection. 
- MLObjectAnalyzerSetting setting = - new MLObjectAnalyzerSetting.Factory().setAnalyzerType(MLObjectAnalyzerSetting.TYPE_VIDEO) - .allowMultiResults() - .allowClassification() - .create(); - this.analyzer = MLAnalyzerFactory.getInstance().getLocalObjectAnalyzer(setting); - this.analyzer.setTransactor(new MLAnalyzer.MLTransactor() { - @Override - public void destroy() { - - } - - @Override - public void transactResult(MLAnalyzer.Result result) { - if (!LiveObjectAnalyseActivity.this.mlsNeedToDetect) { - return; - } - LiveObjectAnalyseActivity.this.mOverlay.clear(); - SparseArray objectSparseArray = result.getAnalyseList(); - for (int i = 0; i < objectSparseArray.size(); i++) { - MLObjectGraphic graphic = new MLObjectGraphic(LiveObjectAnalyseActivity.this.mOverlay, objectSparseArray.valueAt(i)); - LiveObjectAnalyseActivity.this.mOverlay.add(graphic); - } - // When you need to implement a scene that stops after recognizing specific content - // and continues to recognize after finishing processing, refer to this code - for (int i = 0; i < objectSparseArray.size(); i++) { - if (objectSparseArray.valueAt(i).getTypeIdentity() == MLObject.TYPE_FOOD) { - LiveObjectAnalyseActivity.this.mlsNeedToDetect = true; - LiveObjectAnalyseActivity.this.mHandler.sendEmptyMessage(LiveObjectAnalyseActivity.STOP_PREVIEW); - } - } - } - }); - } - - private void createLensEngine() { - Context context = this.getApplicationContext(); - // Create LensEngine - this.mLensEngine = new LensEngine.Creator(context, this.analyzer).setLensType(this.lensType) - .applyDisplayDimension(640, 480) - .applyFps(25.0f) - .enableAutomaticFocus(true) - .create(); - } - - private void startLensEngine() { - if (this.mLensEngine != null) { - try { - this.mPreview.start(this.mLensEngine, this.mOverlay); - } catch (IOException e) { - Log.e(LiveObjectAnalyseActivity.TAG, "Failed to start lens engine.", e); - this.mLensEngine.release(); - this.mLensEngine = null; - } - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/object/MLObjectGraphic.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/object/MLObjectGraphic.java deleted file mode 100644 index aea3306..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/object/MLObjectGraphic.java +++ /dev/null @@ -1,96 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package com.huawei.mlkit.example.object; - -import android.graphics.Canvas; -import android.graphics.Color; -import android.graphics.Paint; -import android.graphics.Paint.Style; -import android.graphics.RectF; - -import com.huawei.hms.mlsdk.objects.MLObject; -import com.huawei.mlkit.example.camera.GraphicOverlay; - -/** - * Draw the detected object information and overlay it on the preview frame. 
- */ -public class MLObjectGraphic extends GraphicOverlay.Graphic { - - private static final float TEXT_SIZE = 54.0f; - - private static final float STROKE_WIDTH = 4.0f; - - private final MLObject object; - - private final Paint boxPaint; - - private final Paint textPaint; - - MLObjectGraphic(GraphicOverlay overlay, MLObject object) { - super(overlay); - - this.object = object; - - this.boxPaint = new Paint(); - this.boxPaint.setColor(Color.WHITE); - this.boxPaint.setStyle(Style.STROKE); - this.boxPaint.setStrokeWidth(MLObjectGraphic.STROKE_WIDTH); - - this.textPaint = new Paint(); - this.textPaint.setColor(Color.WHITE); - this.textPaint.setTextSize(MLObjectGraphic.TEXT_SIZE); - } - - @Override - public void draw(Canvas canvas) { - // draw the object border. - RectF rect = new RectF(this.object.getBorder()); - rect.left = this.translateX(rect.left); - rect.top = this.translateY(rect.top); - rect.right = this.translateX(rect.right); - rect.bottom = this.translateY(rect.bottom); - canvas.drawRect(rect, this.boxPaint); - - // draw other object info. - canvas.drawText(MLObjectGraphic.getCategoryName(this.object.getTypeIdentity()), rect.left, rect.bottom, this.textPaint); - canvas.drawText("trackingId: " + this.object.getTracingIdentity(), rect.left, rect.top, this.textPaint); - if (this.object.getTypePossibility() != null) { - canvas.drawText("confidence: " + this.object.getTypePossibility(), rect.right, rect.bottom, this.textPaint); - } - } - - private static String getCategoryName(int category) { - switch (category) { - case MLObject.TYPE_OTHER: - return "Unknown"; - case MLObject.TYPE_FURNITURE: - return "Home good"; - case MLObject.TYPE_GOODS: - return "Fashion good"; - case MLObject.TYPE_PLACE: - return "Place"; - case MLObject.TYPE_PLANT: - return "Plant"; - case MLObject.TYPE_FOOD: - return "Food"; - case MLObject.TYPE_FACE: - return "Face"; - default: - } - return ""; - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/productvisionsearch/ProductVisionSearchAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/productvisionsearch/ProductVisionSearchAnalyseActivity.java deleted file mode 100644 index bc2d3d4..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/productvisionsearch/ProductVisionSearchAnalyseActivity.java +++ /dev/null @@ -1,153 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
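The comments above distinguish TYPE_VIDEO (camera stream) from TYPE_PICTURE (still image). A sketch of the still-image variant, assuming asyncAnalyseFrame() hands back a task carrying a List<MLObject> in this mode and reusing the same accessors MLObjectGraphic draws from:

```java
MLObjectAnalyzerSetting pictureSetting = new MLObjectAnalyzerSetting.Factory()
        .setAnalyzerType(MLObjectAnalyzerSetting.TYPE_PICTURE)  // one-shot detection
        .allowMultiResults()
        .allowClassification()
        .create();
MLObjectAnalyzer pictureAnalyzer =
        MLAnalyzerFactory.getInstance().getLocalObjectAnalyzer(pictureSetting);

MLFrame frame = MLFrame.fromBitmap(bitmap);
pictureAnalyzer.asyncAnalyseFrame(frame)
        .addOnSuccessListener(objects -> {
            for (MLObject object : objects) {
                Log.i(TAG, "type=" + object.getTypeIdentity()
                        + " confidence=" + object.getTypePossibility()
                        + " border=" + object.getBorder());
            }
        })
        .addOnFailureListener(e -> Log.e(TAG, "Object detection failed", e));
```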
- */ - -package com.huawei.mlkit.example.productvisionsearch; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.graphics.Canvas; -import android.graphics.Color; -import android.graphics.Paint; -import android.graphics.Rect; -import android.os.Bundle; -import android.view.View; -import android.widget.ImageView; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.hms.mlsdk.productvisionsearch.MLProductVisionSearch; -import com.huawei.hms.mlsdk.productvisionsearch.MLVisionSearchProduct; -import com.huawei.hms.mlsdk.productvisionsearch.MLVisionSearchProductImage; -import com.huawei.hms.mlsdk.productvisionsearch.cloud.MLRemoteProductVisionSearchAnalyzer; -import com.huawei.hms.mlsdk.productvisionsearch.cloud.MLRemoteProductVisionSearchAnalyzerSetting; -import com.huawei.mlkit.example.R; - -import java.util.ArrayList; -import java.util.List; - -public class ProductVisionSearchAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "ProductVisionSearch"; - - private TextView mTextView; - - private ImageView productResult; - - private Bitmap bitmap; - - private MLRemoteProductVisionSearchAnalyzer analyzer; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_product_vision_search_analyse); - this.mTextView = this.findViewById(R.id.result); - this.productResult = this.findViewById(R.id.image_product); - this.bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.product_image); - this.productResult.setImageBitmap(this.bitmap); - this.findViewById(R.id.product_detect).setOnClickListener(this); - } - - /** - * Product search analyzer on the cloud. If you want to use product search analyzer, - * you need to apply for an agconnect-services.json file in the developer - * alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - * replacing the sample-agconnect-services.json in the project. - */ - private void remoteAnalyzer() { - // Use customized parameter settings for cloud-based recognition. - MLRemoteProductVisionSearchAnalyzerSetting setting = - new MLRemoteProductVisionSearchAnalyzerSetting.Factory().setLargestNumOfReturns(1).create(); - this.analyzer = MLAnalyzerFactory.getInstance().getRemoteProductVisionSearchAnalyzer(setting); - // Create an MLFrame by using the bitmap. - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.product_image); - MLFrame frame = MLFrame.fromBitmap(bitmap); - Task> task = this.analyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener>() { - @Override - public void onSuccess(List productVisionSearchList) { - // Recognition success. - ProductVisionSearchAnalyseActivity.this.displaySuccess(productVisionSearchList); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. 
- ProductVisionSearchAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - private void drawBitmap(ImageView imageView, Rect rect, String product) { - Paint boxPaint = new Paint(); - boxPaint.setColor(Color.WHITE); - boxPaint.setStyle(Paint.Style.STROKE); - boxPaint.setStrokeWidth(4.0f); - Paint textPaint = new Paint(); - textPaint = new Paint(); - textPaint.setColor(Color.WHITE); - textPaint.setTextSize(100.0f); - - imageView.setDrawingCacheEnabled(true); - Bitmap bitmapDraw = Bitmap.createBitmap(this.bitmap.copy(Bitmap.Config.ARGB_8888, true)); - Canvas canvas = new Canvas(bitmapDraw); - canvas.drawRect(rect, boxPaint); - canvas.drawText("product type: " + product, rect.left, rect.top, textPaint); - this.productResult.setImageBitmap(bitmapDraw); - } - - private void displaySuccess(List productVisionSearchList) { - List productImageList = new ArrayList(); - for (MLProductVisionSearch productVisionSearch : productVisionSearchList) { - this.drawBitmap(this.productResult, productVisionSearch.getBorder(), productVisionSearch.getType()); - for (MLVisionSearchProduct product : productVisionSearch.getProductList()) { - productImageList.addAll(product.getImageList()); - } - } - StringBuffer buffer = new StringBuffer(); - for (MLVisionSearchProductImage productImage : productImageList) { - String str = String.format("%s#%s(%.5f)", productImage.getProductId(), productImage.getImageId(), - productImage.getPossibility()); - buffer.append(str); - buffer.append("\n"); - } - - this.mTextView.setText(buffer.toString()); - } - - @Override - public void onClick(View v) { - this.remoteAnalyzer(); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.analyzer == null) { - return; - } - this.analyzer.stop(); - } -} \ No newline at end of file diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/text/ImageTextAnalyseActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/text/ImageTextAnalyseActivity.java deleted file mode 100644 index 4c52e21..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/text/ImageTextAnalyseActivity.java +++ /dev/null @@ -1,179 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
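displaySuccess() above flattens every candidate image into one string. Where only the strongest match per detected region is wanted, the same result classes can be reduced like this (showBestMatches is a hypothetical helper name):

```java
// Keep only the candidate image with the highest possibility for each region.
private void showBestMatches(List<MLProductVisionSearch> searchResults) {
    for (MLProductVisionSearch search : searchResults) {
        MLVisionSearchProductImage best = null;
        for (MLVisionSearchProduct product : search.getProductList()) {
            for (MLVisionSearchProductImage image : product.getImageList()) {
                if (best == null || image.getPossibility() > best.getPossibility()) {
                    best = image;
                }
            }
        }
        if (best != null) {
            Log.i(TAG, String.format(Locale.ENGLISH, "%s: best match %s#%s (%.5f)",
                    search.getType(), best.getProductId(), best.getImageId(),
                    best.getPossibility()));
        }
    }
}
```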
- */ - -package com.huawei.mlkit.example.text; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.MLAnalyzerFactory; -import com.huawei.hms.mlsdk.common.MLFrame; -import com.huawei.hms.mlsdk.text.MLLocalTextSetting; -import com.huawei.hms.mlsdk.text.MLRemoteTextSetting; -import com.huawei.hms.mlsdk.text.MLText; -import com.huawei.hms.mlsdk.text.MLTextAnalyzer; -import com.huawei.mlkit.example.R; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.List; - -public class ImageTextAnalyseActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "ImageTextAnalyse"; - - private TextView mTextView; - - private MLTextAnalyzer analyzer; - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_image_text_analyse); - this.mTextView = this.findViewById(R.id.text_result); - this.findViewById(R.id.text_detect).setOnClickListener(this); - } - - @Override - public void onClick(View v) { - this.localAnalyzer(); - } - - /** - * Text recognition on the device - */ - private void localAnalyzer() { - // Create the text analyzer MLTextAnalyzer to recognize characters in images. You can set MLLocalTextSetting to - // specify languages that can be recognized. - // If you do not set the languages, only Romance languages can be recognized by default. - // Use default parameter settings to configure the on-device text analyzer. Only Romance languages can be - // recognized. - // analyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer(); - // Use the customized parameter MLLocalTextSetting to configure the text analyzer on the device. - MLLocalTextSetting setting = new MLLocalTextSetting.Factory() - .setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE) - .setLanguage("en") - .create(); - this.analyzer = MLAnalyzerFactory.getInstance() - .getLocalTextAnalyzer(setting); - // Create an MLFrame by using android.graphics.Bitmap. - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.text_image); - MLFrame frame = MLFrame.fromBitmap(bitmap); - Task task = this.analyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(MLText text) { - // Recognition success. - ImageTextAnalyseActivity.this.displaySuccess(text); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - ImageTextAnalyseActivity.this.displayFailure(); - } - }); - } - - /** - * Text recognition on the cloud. If you want to use cloud text analyzer, - * you need to apply for an agconnect-services.json file in the developer - * alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - * replacing the sample-agconnect-services.json in the project. - */ - private void remoteAnalyzer() { - // Create an analyzer. 
You can customize the analyzer by creating MLRemoteTextSetting - MLRemoteTextSetting setting = - new MLRemoteTextSetting.Factory() - .setTextDensityScene(MLRemoteTextSetting.OCR_COMPACT_SCENE) - .setLanguageList(new ArrayList(){{ - this.add("zh"); this.add("en");}}) - .setBorderType(MLRemoteTextSetting.ARC) - .create(); - this.analyzer = MLAnalyzerFactory.getInstance().getRemoteTextAnalyzer(setting); - // Use default parameter settings. - // analyzer = MLAnalyzerFactory.getInstance().getRemoteTextAnalyzer(); - // Create an MLFrame by using android.graphics.Bitmap. - Bitmap bitmap = BitmapFactory.decodeResource(this.getResources(), R.drawable.text_image); - MLFrame frame = MLFrame.fromBitmap(bitmap); - Task task = this.analyzer.asyncAnalyseFrame(frame); - task.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(MLText text) { - // Recognition success. - ImageTextAnalyseActivity.this.remoteDisplaySuccess(text); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - ImageTextAnalyseActivity.this.displayFailure(); - } - }); - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - private void remoteDisplaySuccess(MLText mlTexts) { - String result = ""; - List blocks = mlTexts.getBlocks(); - for (MLText.Block block : blocks) { - List lines = block.getContents(); - for (MLText.TextLine line : lines) { - List words = line.getContents(); - for (MLText.Word word : words) { - result += word.getStringValue() + " "; - } - } - result += "\n"; - } - this.mTextView.setText(result); - } - - private void displaySuccess(MLText mlText) { - String result = ""; - List blocks = mlText.getBlocks(); - for (MLText.Block block : blocks) { - for (MLText.TextLine line : block.getContents()) { - result += line.getStringValue() + "\n"; - } - } - this.mTextView.setText(result); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.analyzer == null) { - return; - } - try { - this.analyzer.close(); - } catch (IOException e) { - Log.e(ImageTextAnalyseActivity.TAG, "Stop failed: " + e.getMessage()); - } - } -} diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/translate/TranslatorActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/translate/TranslatorActivity.java deleted file mode 100644 index 337d15f..0000000 --- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/example/translate/TranslatorActivity.java +++ /dev/null @@ -1,224 +0,0 @@ -/* - * Copyright 2020. Huawei Technologies Co., Ltd. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
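Both analyzers above hand back the same MLText hierarchy (blocks → lines → words). A small sketch that pairs each line with a bounding box built from its corner vertexes, the same way the general-card plug-ins earlier in this diff do (collectLines is a hypothetical helper; android.graphics.Point, android.graphics.Rect and android.util.Pair are assumed imports):

```java
private List<Pair<String, Rect>> collectLines(MLText mlText) {
    List<Pair<String, Rect>> lines = new ArrayList<>();
    for (MLText.Block block : mlText.getBlocks()) {
        for (MLText.TextLine line : block.getContents()) {
            Point[] points = line.getVertexes();
            // Top-left and bottom-right vertexes are enough for an axis-aligned box.
            Rect box = new Rect(points[0].x, points[0].y, points[2].x, points[2].y);
            lines.add(Pair.create(line.getStringValue(), box));
        }
    }
    return lines;
}
```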
- */ - -package com.huawei.mlkit.example.translate; - -import android.os.Bundle; -import android.util.Log; -import android.view.View; -import android.widget.EditText; -import android.widget.TextView; - -import androidx.appcompat.app.AppCompatActivity; - -import com.huawei.hmf.tasks.OnFailureListener; -import com.huawei.hmf.tasks.OnSuccessListener; -import com.huawei.hmf.tasks.Task; -import com.huawei.hms.mlsdk.langdetect.MLDetectedLang; -import com.huawei.hms.mlsdk.langdetect.MLLangDetectorFactory; -import com.huawei.hms.mlsdk.langdetect.cloud.MLRemoteLangDetector; -import com.huawei.hms.mlsdk.langdetect.cloud.MLRemoteLangDetectorSetting; -import com.huawei.hms.mlsdk.translate.MLTranslatorFactory; -import com.huawei.hms.mlsdk.translate.cloud.MLRemoteTranslateSetting; -import com.huawei.hms.mlsdk.translate.cloud.MLRemoteTranslator; -import com.huawei.mlkit.example.R; - -import java.io.BufferedReader; -import java.io.IOException; -import java.io.InputStream; -import java.io.InputStreamReader; -import java.util.HashMap; -import java.util.List; -import java.util.Map; - -public class TranslatorActivity extends AppCompatActivity implements View.OnClickListener { - private static final String TAG = "Translator"; - - private TextView mTextView; - - private EditText mEditText; - - private MLRemoteTranslator translator; - - private MLRemoteLangDetector langDetector; - - private static Map nationAndCode = new HashMap(); - - @Override - protected void onCreate(Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - this.setContentView(R.layout.activity_translator); - this.mTextView = this.findViewById(R.id.tv_output); - this.mEditText = this.findViewById(R.id.et_input); - this.findViewById(R.id.btn_translator).setOnClickListener(this); - this.findViewById(R.id.btn_identification).setOnClickListener(this); - TranslatorActivity.nationAndCode = this.readNationAndCode(); - } - - /** - * Translation on the cloud. If you want to use cloud translator, - * you need to apply for an agconnect-services.json file in the developer - * alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - * replacing the sample-agconnect-services.json in the project. - */ - private void remoteTranslator() { - // Create an analyzer. You can customize the analyzer by creating MLRemoteTranslateSetting - MLRemoteTranslateSetting setting = - new MLRemoteTranslateSetting.Factory().setTargetLangCode("zh").create(); - this.translator = MLTranslatorFactory.getInstance().getRemoteTranslator(setting); - // Use default parameter settings. - // analyzer = MLTranslatorFactory.getInstance().getRemoteTranslator(); - // Read text in edit box. - String sourceText = this.mEditText.getText().toString(); - Task task = this.translator.asyncTranslate(sourceText); - task.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(String text) { - // Recognition success. - TranslatorActivity.this.remoteDisplaySuccess(text, true); - } - - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - TranslatorActivity.this.displayFailure(); - } - }); - } - - /** - * Language detection on the cloud. If you want to use cloud language detector, - * you need to apply for an agconnect-services.json file in the developer - * alliance(https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-preparations4), - * replacing the sample-agconnect-services.json in the project. 
- */ - private void remoteLangDetection() { - // Create an analyzer. You can customize the analyzer by creating MLRemoteTextSetting - MLRemoteLangDetectorSetting setting = new MLRemoteLangDetectorSetting.Factory().create(); - this.langDetector = MLLangDetectorFactory.getInstance().getRemoteLangDetector(setting); - // Use default parameter settings. - // analyzer = MLLangDetectorFactory.getInstance().getRemoteLangDetector(); - // Read text in edit box. - String sourceText = this.mEditText.getText().toString(); - Task> task = this.langDetector.probabilityDetect(sourceText); - task.addOnSuccessListener(new OnSuccessListener>() { - @Override - public void onSuccess(List text) { - // Recognition success. - TranslatorActivity.this.remoteDisplaySuccess(text); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - TranslatorActivity.this.displayFailure(); - } - }); - // Returns the language code with the highest confidence, sourceText represents the language to be detected. - /** - Task taskFirstBest = this.langDetector.firstBestDetect(sourceText); - taskFirstBest.addOnSuccessListener(new OnSuccessListener() { - @Override - public void onSuccess(String text) { - // Recognition success. - TranslatorActivity.this.remoteDisplaySuccess(text, false); - } - }).addOnFailureListener(new OnFailureListener() { - @Override - public void onFailure(Exception e) { - // Recognition failure. - TranslatorActivity.this.displayFailure(); - } - }); - */ - } - - private void displayFailure() { - this.mTextView.setText("Failure"); - } - - private void remoteDisplaySuccess(String text, boolean isTranslator) { - if (isTranslator) { - this.mTextView.setText(text); - } else { - this.mTextView.setText("Language=" + TranslatorActivity.nationAndCode.get(text) + "(" + text + ")."); - } - } - - private void remoteDisplaySuccess(List result) { - StringBuilder stringBuilder = new StringBuilder(); - for (MLDetectedLang recognizedLang : result) { - String langCode = recognizedLang.getLangCode(); - float probability = recognizedLang.getProbability(); - stringBuilder.append("Language=" + TranslatorActivity.nationAndCode.get(langCode) + "(" + langCode + "), score=" + probability + ".\n"); - } - this.mTextView.setText(stringBuilder.toString()); - } - - @Override - protected void onDestroy() { - super.onDestroy(); - if (this.langDetector != null) { - this.langDetector.stop(); - } - if (this.translator != null) { - this.translator.stop(); - } - } - - @Override - public void onClick(View v) { - switch (v.getId()) { - case R.id.btn_translator: - this.remoteTranslator(); - break; - case R.id.btn_identification: - this.remoteLangDetection(); - break; - default: - break; - } - } - - /** - * Read the list of languages supported by language detection. - * @return Returns a map that stores the country name and language code of the ISO 639-1. 
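A sketch that chains the two cloud services above: detect the language first, then translate only when the text is not already in the target language. The setTrustedThreshold() call is assumed here to set the detector's minimum confidence; mEditText and mTextView are the views used by TranslatorActivity:

```java
MLRemoteLangDetectorSetting detectSetting = new MLRemoteLangDetectorSetting.Factory()
        .setTrustedThreshold(0.5f)            // assumed: minimum confidence for a result
        .create();
MLRemoteLangDetector detector =
        MLLangDetectorFactory.getInstance().getRemoteLangDetector(detectSetting);

String sourceText = mEditText.getText().toString();
detector.firstBestDetect(sourceText).addOnSuccessListener(langCode -> {
    if ("zh".equals(langCode)) {
        mTextView.setText(sourceText);        // already in the target language
        return;
    }
    MLRemoteTranslateSetting translateSetting =
            new MLRemoteTranslateSetting.Factory().setTargetLangCode("zh").create();
    MLRemoteTranslator chainedTranslator =
            MLTranslatorFactory.getInstance().getRemoteTranslator(translateSetting);
    chainedTranslator.asyncTranslate(sourceText)
            .addOnSuccessListener(translated -> mTextView.setText(translated))
            .addOnFailureListener(e -> mTextView.setText("Failure"));
}).addOnFailureListener(e -> mTextView.setText("Failure"));
```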
diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/FaceAnalyzerTransactor.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/FaceAnalyzerTransactor.java
index e43b7ec..d642736 100644
--- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/FaceAnalyzerTransactor.java
+++ b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/FaceAnalyzerTransactor.java
@@ -35,7 +35,8 @@ public void transactResult(MLAnalyzer.Result<MLFace> result) {
         SparseArray<MLFace> faceSparseArray = result.getAnalyseList();
         for (int i = 0; i < faceSparseArray.size(); i++) {
             // todo step 4: add on-device face graphic
-
+            MLFaceGraphic graphic = new MLFaceGraphic(this.mGraphicOverlay, faceSparseArray.valueAt(i));
+            this.mGraphicOverlay.add(graphic);
             // finish
         }
     }
diff --git a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/LiveImageDetectionActivity.java b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/LiveImageDetectionActivity.java
index 4781c84..a878a66 100644
--- a/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/LiveImageDetectionActivity.java
+++ b/Codelabs/MLKit/app/src/main/java/com/huawei/mlkit/face/demo/LiveImageDetectionActivity.java
@@ -132,7 +132,12 @@ public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
 
     private MLFaceAnalyzer createFaceAnalyzer() {
         // todo step 2: add on-device face analyzer
-
+        MLFaceAnalyzerSetting setting = new MLFaceAnalyzerSetting.Factory()
+            .setFeatureType(MLFaceAnalyzerSetting.TYPE_FEATURES)
+            .setPerformanceType(MLFaceAnalyzerSetting.TYPE_SPEED)
+            .allowTracing()
+            .create();
+        this.analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer(setting);
         // finish
         this.analyzer.setTransactor(new FaceAnalyzerTransactor(this.mOverlay));
         return this.analyzer;
@@ -141,7 +146,12 @@ private MLFaceAnalyzer createFaceAnalyzer() {
     private void createLensEngine() {
         Context context = this.getApplicationContext();
         // todo step 3: add on-device lens engine
-
+        this.mLensEngine = new LensEngine.Creator(context, this.analyzer)
+            .setLensType(this.lensType)
+            .applyDisplayDimension(1600, 1024)
+            .applyFps(25.0f)
+            .enableAutomaticFocus(true)
+            .create();
         // finish
     }
 
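
The two hunks above fill in the codelab's "todo" steps that create the face analyzer and the LensEngine, but they do not show where the engine is started or released. The following is a minimal lifecycle sketch, not part of the diff: it assumes the activity holds a SurfaceView field named surfaceView for the camera preview, reuses the mLensEngine, analyzer, and TAG names from the snippet above, and assumes LensEngine.run(SurfaceHolder), LensEngine.release(), and MLFaceAnalyzer.stop() behave as in other HMS ML Kit face-detection samples:

    // Sketch only: start the camera stream once a preview surface exists,
    // and free the camera and the analyzer when the activity goes away.
    @Override
    protected void onResume() {
        super.onResume();
        if (this.mLensEngine != null) {
            try {
                // run() binds the camera stream to the preview surface and starts detection.
                this.mLensEngine.run(this.surfaceView.getHolder());
            } catch (IOException e) {
                Log.e(TAG, "Failed to start lens engine.", e);
                this.mLensEngine.release();
                this.mLensEngine = null;
            }
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (this.mLensEngine != null) {
            this.mLensEngine.release();
        }
        if (this.analyzer != null) {
            try {
                this.analyzer.stop();
            } catch (IOException e) {
                Log.e(TAG, "Failed to stop face analyzer.", e);
            }
        }
    }
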
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/bg_edit_text.xml b/Codelabs/MLKit/app/src/main/res/drawable/bg_edit_text.xml
deleted file mode 100644
index bd3ff8a..0000000
--- a/Codelabs/MLKit/app/src/main/res/drawable/bg_edit_text.xml
+++ /dev/null
@@ -1,11 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/card_image.jpg b/Codelabs/MLKit/app/src/main/res/drawable/card_image.jpg
deleted file mode 100644
index e1b8434..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/card_image.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/card_image_back.jpg b/Codelabs/MLKit/app/src/main/res/drawable/card_image_back.jpg
deleted file mode 100644
index e1f0267..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/card_image_back.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/card_image_front.jpg b/Codelabs/MLKit/app/src/main/res/drawable/card_image_front.jpg
deleted file mode 100644
index 164508d..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/card_image_front.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/card_passcard.jpg b/Codelabs/MLKit/app/src/main/res/drawable/card_passcard.jpg
deleted file mode 100644
index e66b1b1..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/card_passcard.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/classification_image.jpg b/Codelabs/MLKit/app/src/main/res/drawable/classification_image.jpg
deleted file mode 100644
index 5c4a5b7..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/classification_image.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/document_image.jpg b/Codelabs/MLKit/app/src/main/res/drawable/document_image.jpg
deleted file mode 100644
index 5a93810..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/document_image.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/face_image.jpg b/Codelabs/MLKit/app/src/main/res/drawable/face_image.jpg
deleted file mode 100644
index 324d092..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/face_image.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/front_back_switch.png b/Codelabs/MLKit/app/src/main/res/drawable/front_back_switch.png
deleted file mode 100644
index d81b7ef..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/front_back_switch.png and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/imgseg_foreground.jpg b/Codelabs/MLKit/app/src/main/res/drawable/imgseg_foreground.jpg
deleted file mode 100644
index b768765..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/imgseg_foreground.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/landmark_image.jpg b/Codelabs/MLKit/app/src/main/res/drawable/landmark_image.jpg
deleted file mode 100644
index c555c66..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/landmark_image.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/product_image.jpg b/Codelabs/MLKit/app/src/main/res/drawable/product_image.jpg
deleted file mode 100644
index f295c47..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/product_image.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/drawable/text_image.jpg b/Codelabs/MLKit/app/src/main/res/drawable/text_image.jpg
deleted file mode 100644
index 4d73097..0000000
Binary files a/Codelabs/MLKit/app/src/main/res/drawable/text_image.jpg and /dev/null differ
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_bcr_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_bcr_analyse.xml
deleted file mode 100644
index 6ae7627..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_bcr_analyse.xml
+++ /dev/null
@@ -1,36 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_classification_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_classification_analyse.xml
deleted file mode 100644
index 111e2b2..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_classification_analyse.xml
+++ /dev/null
@@ -1,36 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_document_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_document_analyse.xml
deleted file mode 100644
index 9337c6b..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_document_analyse.xml
+++ /dev/null
@@ -1,35 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_gcr_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_gcr_analyse.xml
deleted file mode 100644
index 1bcda7e..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_gcr_analyse.xml
+++ /dev/null
@@ -1,135 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_icr_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_icr_analyse.xml
deleted file mode 100644
index af088d7..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_icr_analyse.xml
+++ /dev/null
@@ -1,46 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_landmark_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_landmark_analyse.xml
deleted file mode 100644
index 55d2ab8..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_landmark_analyse.xml
+++ /dev/null
@@ -1,36 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_segmentation_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_segmentation_analyse.xml
deleted file mode 100644
index 4d4392d..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_segmentation_analyse.xml
+++ /dev/null
@@ -1,32 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_image_text_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_image_text_analyse.xml
deleted file mode 100644
index 5ff9692..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_image_text_analyse.xml
+++ /dev/null
@@ -1,35 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_live_face_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_live_face_analyse.xml
deleted file mode 100644
index 481597a..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_live_face_analyse.xml
+++ /dev/null
@@ -1,30 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_main.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_main.xml
deleted file mode 100644
index 2c2aa45..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_main.xml
+++ /dev/null
@@ -1,171 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_still_face_analyse.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_still_face_analyse.xml
deleted file mode 100644
index 80275cf..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_still_face_analyse.xml
+++ /dev/null
@@ -1,36 +0,0 @@
diff --git a/Codelabs/MLKit/app/src/main/res/layout/activity_translator.xml b/Codelabs/MLKit/app/src/main/res/layout/activity_translator.xml
deleted file mode 100644
index 866b277..0000000
--- a/Codelabs/MLKit/app/src/main/res/layout/activity_translator.xml
+++ /dev/null
@@ -1,62 +0,0 @@