TensorFlow-Android-Inference
============================
This directory contains CMake support for building the Android Java Inference
interface to the TensorFlow native APIs.
See [tensorflow/contrib/android](..) for more details about the library, and
instructions for building with Bazel.
Usage
-----
Add TensorFlow-Android-Inference as a dependency of your Android application:
* settings.gradle
```
include ':TensorFlow-Android-Inference'
findProject(":TensorFlow-Android-Inference").projectDir =
        new File("${/path/to/tensorflow_repo}/contrib/android/cmake")
```
* the application's build.gradle (adding the dependency):
```
debugCompile project(path: ':TensorFlow-Android-Inference', configuration: 'debug')
releaseCompile project(path: ':TensorFlow-Android-Inference', configuration: 'release')
```
```
Note: this makes the native code in the library traceable from your app.
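
With the dependency in place, the app drives inference through the library's Java interface. The following is a minimal sketch assuming the `TensorFlowInferenceInterface` API from `org.tensorflow.contrib.android`; the model file name, tensor names, and shapes are hypothetical placeholders, and the code requires an Android runtime, so adapt it to your model:

```
// Minimal sketch. Assumptions: org.tensorflow.contrib.android API,
// hypothetical model file ("model.pb") and tensor names/shapes.
import android.content.res.AssetManager;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class Classifier {
    private final TensorFlowInferenceInterface inference;

    public Classifier(AssetManager assets) {
        // Loads the graph from the app's assets; libtensorflow_inference.so
        // inside the AAR is loaded automatically.
        inference = new TensorFlowInferenceInterface(
                assets, "file:///android_asset/model.pb");
    }

    public float[] classify(float[] pixels) {
        float[] outputs = new float[10];
        inference.feed("input", pixels, 1, 224, 224, 3); // hypothetical shape
        inference.run(new String[] {"output"});          // hypothetical name
        inference.fetch("output", outputs);
        return outputs;
    }
}
```

The point of the sketch is that the app never touches the `.so` directly; it only sees these Java entry points.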
Dependencies
------------
TensorFlow-Android-Inference depends on the TensorFlow static libraries already
built in your local TensorFlow repository. On Linux and macOS, build.gradle uses
build_all_android.sh to build them. Building the core libraries takes a long
time, so this step is commented out by default to avoid confusion (otherwise
Android Studio would appear to hang while opening the project).
To enable it, refer to the comment in
* build.gradle
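
For orientation, the kind of hook being enabled is a Gradle task that shells out to build_all_android.sh. This is a hypothetical sketch, not the actual contents of build.gradle: the task name and the `tensorflowRoot` property are assumptions, so follow the in-file comment rather than this snippet.

```
// Hypothetical sketch only -- see the comment in build.gradle for the
// authoritative instructions. Task and property names are assumed.
task buildTensorFlowNative(type: Exec) {
    // Path to your local TensorFlow checkout (assumed property).
    workingDir "${tensorflowRoot}"
    commandLine 'bash', 'tensorflow/contrib/makefile/build_all_android.sh'
}
```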
Output
------
- TensorFlow-Inference-debug.aar
- TensorFlow-Inference-release.aar
The file libtensorflow_inference.so is packaged under jni/${ANDROID_ABI}/ inside
each of the above AARs. This is transparent to the app, which accesses the
native code through the equivalent Java APIs.