# SameDiff model serving

This module provides JSON-based serving of SameDiff models.

## Example

First, we'll create a server instance. Most likely you'll do this in an application running in a container:

```java
val server = SameDiffJsonModelServer.<String, Sentiment>builder()
        .adapter(new StringToSentimentAdapter())
        .model(mySameDiffModel)
        .port(8080)
        .serializer(new SentimentSerializer())
        .deserializer(new StringDeserializer())
        .build();

server.start();
server.join();
```
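The adapter, serializer, and deserializer plugged into the builder above convert between the JSON wire format and your model's input/output types. The exact interfaces they implement are not shown in this README, so the sketch below is only a self-contained illustration of that role: the `Sentiment` class and the method shapes are assumptions for this example, not the actual nd4j-json-server API.

```java
import java.util.Locale;

// Hypothetical model output type, for illustration only.
class Sentiment {
    final String label;   // e.g. "POSITIVE" or "NEGATIVE"
    final double score;   // model confidence in [0, 1]

    Sentiment(String label, double score) {
        this.label = label;
        this.score = score;
    }
}

// Role of a serializer: turn a result object into a JSON string for the response.
class SentimentSerializer {
    String serialize(Sentiment s) {
        return String.format(Locale.ROOT, "{\"label\":\"%s\",\"score\":%.2f}", s.label, s.score);
    }
}

// Role of a deserializer: turn the request payload into the model's input type.
// For plain-string inputs this can simply be the identity.
class StringDeserializer {
    String deserialize(String payload) {
        return payload;
    }
}

public class SerializationSketch {
    public static void main(String[] args) {
        Sentiment s = new Sentiment("POSITIVE", 0.97);
        String json = new SentimentSerializer().serialize(s);
        System.out.println(json); // {"label":"POSITIVE","score":0.97}
    }
}
```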

Now, presumably in some other container, we'll set up a remote inference client:

```java
val client = JsonRemoteInference.<String, Sentiment>builder()
        .endpointAddress("http://youraddress:8080/v1/serving")
        .serializer(new StringSerializer())
        .deserializer(new SentimentDeserializer())
        .build();

Sentiment result = client.predict(myText);
```

On top of that, an async call is available for cases when you need to chain multiple requests to one or more remote model servers:

```java
Future<Sentiment> result = client.predictAsync(myText);
```
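Since the async call returns a `Future`, "chaining" in practice means firing several requests concurrently and combining their results once they complete. A minimal sketch of that pattern using `CompletableFuture` from the standard library; the `predictAsync` method here is a local stand-in for the client call (the real client performs a remote HTTP request, and its return type may be a plain `Future` rather than a `CompletableFuture`):

```java
import java.util.concurrent.CompletableFuture;

public class AsyncChainSketch {
    // Stand-in for client.predictAsync(text); the real client calls the remote server.
    static CompletableFuture<String> predictAsync(String text) {
        return CompletableFuture.supplyAsync(
                () -> text.contains("great") ? "POSITIVE" : "NEGATIVE");
    }

    public static void main(String[] args) {
        // Fire two requests concurrently, then combine both results when they complete.
        CompletableFuture<String> a = predictAsync("this movie was great");
        CompletableFuture<String> b = predictAsync("terrible plot");
        String combined = a.thenCombine(b, (r1, r2) -> r1 + "," + r2).join();
        System.out.println(combined); // POSITIVE,NEGATIVE
    }
}
```

With a plain `Future` you would instead call `result.get()` on each, blocking until the corresponding response arrives.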