update readme and changelog
jbkyang-nvi committed Oct 6, 2023
1 parent b1a831d commit 41b319d
Showing 2 changed files with 5 additions and 4 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -4,6 +4,7 @@
 * Include `mkl_lapack.h` header file in presets for MKL ([issue #1388](https://github.com/bytedeco/javacpp-presets/issues/1388))
 * Map new higher-level C++ API of Triton Inference Server ([pull #1361](https://github.com/bytedeco/javacpp-presets/pull/1361))
 * Upgrade presets for OpenCV 4.8.0, DNNL 3.2.1, OpenBLAS 0.3.24, CPython 3.11.5, NumPy 1.25.2, SciPy 1.11.2, LLVM 17.0.1, TensorFlow Lite 2.14.0, Triton Inference Server 2.34.0, ONNX 1.14.1, ONNX Runtime 1.16.0, TVM 0.13.0, and their dependencies
+* Upgrade presets for Triton Inference Server 2.38.0

 ### June 6, 2023 version 1.5.9
 * Virtualize `nvinfer1::IGpuAllocator` from TensorRT to allow customization ([pull #1367](https://github.com/bytedeco/javacpp-presets/pull/1367))
8 changes: 4 additions & 4 deletions tritonserver/README.md
Expand Up @@ -23,7 +23,7 @@ Introduction
------------
This directory contains the JavaCPP Presets module for:

* Triton Inference Server 2.34.0 https://github.com/triton-inference-server/server
* Triton Inference Server 2.38.0 https://github.com/triton-inference-server/server

Please refer to the parent README.md file for more detailed information about the JavaCPP Presets.

@@ -51,9 +51,9 @@ This sample intends to show how to call the Java-mapped C API of Triton to execute ...

 1. Get the source code of Triton Inference Server to prepare the model repository:
 ```bash
-$ wget https://github.com/triton-inference-server/server/archive/refs/tags/v2.34.0.tar.gz
-$ tar zxvf v2.34.0.tar.gz
-$ cd server-2.34.0/docs/examples/model_repository
+$ wget https://github.com/triton-inference-server/server/archive/refs/tags/v2.38.0.tar.gz
+$ tar zxvf v2.38.0.tar.gz
+$ cd server-2.38.0/docs/examples/model_repository
 $ mkdir models
 $ cd models; cp -a ../simple .
 ```
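Step 1 above only prepares the model repository; the sample then drives Triton through the Java-mapped C API. Below is a minimal sketch of that call pattern, not the repository's bundled sample: it assumes the `TRITONSERVER_*` C functions are exposed as static methods of `org.bytedeco.tritonserver.global.tritonserver` with JavaCPP's usual Pointer-style output arguments, and the class name `SimpleSketch` and the `./models` path are purely illustrative.

```java
// Hypothetical sketch (not the bundled sample): start an in-process Triton
// server over the model repository prepared in step 1, assuming the
// TRITONSERVER_* C API is mapped by the tritonserver presets.
import org.bytedeco.tritonserver.tritonserver.*;
import static org.bytedeco.tritonserver.global.tritonserver.*;

public class SimpleSketch {
    // Abort on any TRITONSERVER_Error, mirroring the usual FAIL_IF_ERR idiom.
    static void check(TRITONSERVER_Error err, String what) {
        if (err != null) {
            System.err.println("error: " + what + ": " + TRITONSERVER_ErrorMessage(err));
            TRITONSERVER_ErrorDelete(err);
            System.exit(1);
        }
    }

    public static void main(String[] args) {
        // Server options: point Triton at the "models" directory from step 1.
        TRITONSERVER_ServerOptions options = new TRITONSERVER_ServerOptions(null);
        check(TRITONSERVER_ServerOptionsNew(options), "creating server options");
        check(TRITONSERVER_ServerOptionsSetModelRepositoryPath(options, "./models"),
              "setting model repository path");

        // Create the in-process server; the options object can then be released.
        TRITONSERVER_Server server = new TRITONSERVER_Server(null);
        check(TRITONSERVER_ServerNew(server, options), "creating server");
        check(TRITONSERVER_ServerOptionsDelete(options), "deleting server options");

        // ... build a TRITONSERVER_InferenceRequest for the "simple" model,
        // attach inputs and requested outputs, and run inference here ...

        // Shut down cleanly.
        check(TRITONSERVER_ServerStop(server), "stopping server");
        check(TRITONSERVER_ServerDelete(server), "deleting server");
    }
}
```

The complete, working walkthrough ships with the presets' tritonserver samples; this sketch only illustrates the shape of the C-API calls that the sample description refers to.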
