# kotlindl
s
I'm trying to find a way to use a `tensorflow` model in Spring Boot. I loaded the model successfully and created the needed tensor, but I can't make a call to get the result from the model because of this error:
```
Caused by: org.tensorflow.exceptions.TFInvalidArgumentException: Expects arg[0] to be float but uint8 is provided
```
I checked the model signature and it was like this:
```
Signature for "serving_default":
    Method: "tensorflow/serving/predict"
    Inputs:
        "input_1": dtype=DT_FLOAT, shape=(-1, 299, 299, 3)
    Outputs:
        "dense_3": dtype=DT_FLOAT, shape=(-1, 41)

Signature for "__saved_model_init_op":
    Outputs:
        "__saved_model_init_op": dtype=DT_INVALID, shape=()
```
My tensor details are `DT_UINT8 tensor with shape [299, 299, 3]`. When I changed my tensor data type to float like this:

```kotlin
val imageShape = TFloat32.tensorOf(runner.fetch(decodeImage).run()[0].shape())
val reshape = tf.reshape(
    decodeImage,
    tf.array(
        -1.0f,
        imageShape[0].getFloat(),
        imageShape[1].getFloat(),
        imageShape[2].getFloat())
)
```
I got this error:

```
org.tensorflow.exceptions.TFInvalidArgumentException: Value for attr 'Tshape' of float is not in the list of allowed values: int32, int64
```
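The second error actually points at the fix: `reshape` only rearranges elements, and its shape operand must be int32/int64, so converting uint8 pixels to float has to go through a `cast` op instead. A minimal self-contained sketch of that distinction with the TF Java API (the `fill` stand-in image and the helper name `castedBatchShape` are illustrative, not from the thread):

```kotlin
import org.tensorflow.Graph
import org.tensorflow.Session
import org.tensorflow.op.Ops
import org.tensorflow.types.TFloat32
import org.tensorflow.types.TUint8

// Build a graph around a stand-in uint8 image and convert it the way the
// error demands: cast changes the dtype, expandDims adds the batch
// dimension, and every shape argument stays integer-typed.
fun castedBatchShape(h: Int, w: Int): LongArray =
    Graph().use { graph ->
        val tf = Ops.create(graph)
        // Stand-in for the decoded JPEG: a uint8 tensor of shape (h, w, 3).
        val fakeImage = tf.dtypes.cast(
            tf.fill(tf.array(h, w, 3), tf.constant(0)), TUint8::class.java)
        val floatImage = tf.dtypes.cast(fakeImage, TFloat32::class.java) // uint8 -> float32
        val batched = tf.expandDims(floatImage, tf.constant(0))          // (h, w, 3) -> (1, h, w, 3)
        Session(graph).use { session ->
            session.runner().fetch(batched).run()[0].shape().asArray()
        }
    }
```

`expandDims` then supplies the leading batch dimension that the signature's `(-1, 299, 299, 3)` input expects.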
If someone is curious how I loaded the model, created the tensor, and called it, here is the code. Loading the model in `TFServices`:

```kotlin
fun model(): SavedModelBundle {
    return SavedModelBundle
        .loader("/home/***/src/main/resources/pd/")
        .withRunOptions(RunOptions.getDefaultInstance())
        .load()
}
```
Building the tensor and calling the model:

```kotlin
val graph = Graph()
val session = Session(graph)
val tf = Ops.create(graph)
val fileName = tf.constant("/home/***/src/main/resources/keyframe_1294.jpg")
val readFile = tf.io.readFile(fileName)
val runner = session.runner()
val decodingOptions = DecodeJpeg.channels(3)
val decodeImage = tf.image.decodeJpeg(readFile.contents(), decodingOptions)
val imageShape = runner.fetch(decodeImage).run()[0].shape()
val reshape = tf.reshape(
    decodeImage,
    tf.array(
        -1,
        imageShape.asArray()[0],
        imageShape.asArray()[1],
        imageShape.asArray()[2])
)
val tensor = runner.fetch(reshape).run()[0]
val inputMap = mutableMapOf("input_tensor" to tensor)
println(tensor.shape())
println(tensor.dataType())
println(tensor.asRawTensor())
val result = tfService.model().function("serving_default").call(inputMap)
```
And I used this dependency:

```kotlin
implementation("org.tensorflow:tensorflow-core-platform:0.5.0")
```
Then I changed the whole code and used the KotlinDL dependencies:

```kotlin
implementation("org.jetbrains.kotlinx:kotlin-deeplearning-api:0.5.2")
implementation("org.jetbrains.kotlinx:kotlin-deeplearning-tensorflow:0.5.2")
```
I loaded the model:

```kotlin
fun myModel(): SavedModel {
    return SavedModel.load("/home/***/src/main/resources/pd/")
}
```
and called for the prediction:

```kotlin
val file = File("/home/***/src/main/resources/keyframe_1294.jpg")
val byteArray = ImageIO.read(file)
val floatArray = ImageConverter.toRawFloatArray(byteArray)
val myResult = tfService.myModel().predictSoftly(floatArray, "dense_3")
println(myResult)
```
but I got this error:

```
Caused by: org.tensorflow.TensorFlowException: Op type not registered 'DisableCopyOnRead' in binary running on My Computer. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) tf.contrib.resampler should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
```
Is there a fix for this, or a ready example I can learn from? All I want to do is use the model I generated with TensorFlow 2 in a Spring Boot application. Thank you.
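As an aside, the `ImageConverter.toRawFloatArray` call above is essentially a flattening of HWC pixel values into floats; a dependency-free sketch (the `/255` scaling and RGB channel order are assumptions here, since what a given model expects depends on its training preprocessing):

```kotlin
import java.awt.image.BufferedImage

// Flatten a BufferedImage into an HWC float array (R, G, B per pixel),
// scaled to [0, 1]. Whether a model wants [0, 1], [-1, 1], or raw 0..255
// floats depends on how it was trained -- this scaling is an assumption.
fun toFloatArray(image: BufferedImage): FloatArray {
    val out = FloatArray(image.height * image.width * 3)
    var i = 0
    for (y in 0 until image.height) {
        for (x in 0 until image.width) {
            val rgb = image.getRGB(x, y)
            out[i++] = ((rgb shr 16) and 0xFF) / 255f  // red
            out[i++] = ((rgb shr 8) and 0xFF) / 255f   // green
            out[i++] = (rgb and 0xFF) / 255f           // blue
        }
    }
    return out
}
```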
z
Unfortunately, KotlinDL wraps only the TensorFlow 1.15 runtime. TF 2.x is not wrapped by KotlinDL, and we most likely will not support it because of some troubles with the TF Java API for 2.x.
On the other hand, what is your final goal? Do you need to train the model, or just use it for predictions?
s
Oh dear...
just predictions
I have a model that I want to load in Spring Boot and expose an API for it to be used,
since it is already trained and ready to be used for inference
is there a way to do that with Kotlin?
z
You have two ways here: convert it to the ONNX format with Python and use the ONNX Java API, or go to https://github.com/tensorflow/java and create an issue there with the information above (the issue with the tensor type). I don't know of any special Kotlin libs that wrap TF 2.x.
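For the first route, a minimal sketch with the ONNX Runtime Java API (`ai.onnxruntime`), assuming the `input_1` name and `(1, 299, 299, 3)` float shape from the signature above; the model path and helper name are illustrative:

```kotlin
import ai.onnxruntime.OnnxTensor
import ai.onnxruntime.OrtEnvironment
import java.nio.FloatBuffer

// Run one image through an ONNX export of the model. `pixels` is the
// preprocessed image flattened to 299 * 299 * 3 floats.
fun predict(modelPath: String, pixels: FloatArray): FloatArray {
    val env = OrtEnvironment.getEnvironment()
    return env.createSession(modelPath).use { session ->
        OnnxTensor.createTensor(
            env, FloatBuffer.wrap(pixels), longArrayOf(1, 299, 299, 3)
        ).use { input ->
            session.run(mapOf("input_1" to input)).use { results ->
                // First output corresponds to "dense_3": shape (1, 41).
                @Suppress("UNCHECKED_CAST")
                (results[0].value as Array<FloatArray>)[0]
            }
        }
    }
}
```

This needs the `com.microsoft.onnxruntime:onnxruntime` dependency and a model converted with a tool such as `tf2onnx`.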
s
will do both actually... thank you so much... hopefully it works...
z
I strongly recommend learning how to convert models to ONNX (from PyTorch or TF), because for now it's the best way to run DL models on the JVM.
s
I see... I will learn it for sure... since customers who ask for a specific model to be trained are quite likely to request a way to use it, either on a server or in a mobile app...
z
Also, you have the option to run the model behind a web server in Python
and access it via HTTP.
s
I hope there will be a way (now or in the future) for a Kotlin lib to wrap TF2... yes, I know I can do that, but when it comes to servers I go with Spring.
Maybe Ktor in the future, but going with Python for servers is unlikely for me!
Is there a tutorial? I started working on it... all I did was load the model:
```kotlin
suspend fun myModel(): Model<*> {
    return KIEngine.loadModel("/home/***/model2b.onnx")
}
```
and I got this error:

```
java.lang.IllegalStateException: Unsupported operator: AveragePool
```
Is that something on my PC, or an issue with converting the model?
Okay... I found it... the guys from GitHub helped and I continued... thank you for your help.

```kotlin
val graph = Graph()
val session = Session(graph)
val tf = Ops.create(graph)
val fileName = tf.constant("/***/20220821_203556.jpg")
val readFile = tf.io.readFile(fileName)
val runner = session.runner()
val decodingOptions = DecodeJpeg.channels(3)
val decodeImage = tf.image.decodeJpeg(readFile.contents(), decodingOptions)
val castedImage = tf.dtypes.cast(decodeImage, TFloat32::class.java)
// Add an extra dimension to make it 4-dimensional
val expandedImage = tf.expandDims(castedImage, tf.constant(0))
// Resize the image
val reshapedImage = tf.image.resizeBilinear(expandedImage, tf.constant(intArrayOf(299, 299)))
val tensor = runner.fetch(reshapedImage).run()[0]
val inputMap = mutableMapOf("input_1" to tensor)
val result = tfService.model().function("serving_default").call(inputMap)
```
z
Yeah, it's not such an easy transformation with the low-level TF API; great that you found a solution.
s
truly... thank you Sir!