zhiva.ai (0.3.1)

This is the documentation for zhiva.ai products. Feel free to read the whole thing, but here is a quick-start guide:


If you want to set everything up for your local research, please follow these steps:

1. Setup Local PACS server

You need something that serves the DICOM data. Usually this is a PACS server with DICOMWeb. Setting up a local server is easy; everything you need to do is described in Setting up Local PACS. If your server handles multiple users, you should consider Setting up PACS with JWT. This version of PACS lets you easily create multiple users with limited access to the server.
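As an illustration, a local DICOMweb-capable PACS can be started with Docker Compose. This is only a sketch assuming the Orthanc server and its official `orthancteam/orthanc` image; the image name, ports, and environment variables are assumptions here, and the authoritative setup is in the Setting up Local PACS guide.

```yaml
# Minimal sketch: single-node Orthanc PACS with the DICOMweb plugin enabled.
services:
  orthanc:
    image: orthancteam/orthanc
    ports:
      - "8042:8042"   # HTTP API / DICOMweb, e.g. http://localhost:8042/dicom-web
      - "4242:4242"   # classic DICOM protocol
    environment:
      ORTHANC__DICOM_WEB__ENABLE: "true"
      ORTHANC__AUTHENTICATION_ENABLED: "false"  # acceptable for local research only
```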

2. Add Local PACS server to application settings

After creating the Local PACS, you have to Add that server to the application so you can use its data.
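Conceptually, registering a server means telling the application where the DICOMweb endpoints live. The field names below are hypothetical (your application settings dialog defines the real ones); the sketch assumes an Orthanc-style server where the QIDO, WADO, and STOW roots share one URL.

```json
{
  "name": "Local PACS",
  "qidoRoot": "http://localhost:8042/dicom-web",
  "wadoRoot": "http://localhost:8042/dicom-web",
  "stowRoot": "http://localhost:8042/dicom-web"
}
```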

3. Setup your AI Model server

Now you need something that runs inference. If you have your own model (or want to use ours), please follow the instructions in Setting up Local Model API.
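To make the idea concrete, a model API is essentially a handler that accepts a request identifying the data and returns predictions. This is a toy sketch: the field names (`studyUID`, `seriesUID`, `predictions`) are assumptions for illustration, and the real request/response contract is defined in the Setting up Local Model API guide.

```python
import json


def handle_inference(request_body: bytes) -> dict:
    """Toy request handler for a model API endpoint.

    Expects a JSON body like {"studyUID": "...", "seriesUID": "..."}
    and returns a (placeholder) prediction payload.  A real server
    would fetch the pixel data and run the model where noted below.
    """
    request = json.loads(request_body)
    # Here a real implementation would load the series and run inference.
    return {
        "studyUID": request["studyUID"],
        "seriesUID": request["seriesUID"],
        "predictions": [],  # e.g. masks, bounding boxes, labels
    }
```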

4. Setup proxy server for your AI Model

Once you have a model, you cannot connect it directly to the application; you have to add data providers (PACS servers) to it. The reason is that sending DICOM data directly from the application is extremely inefficient. How to set up a proxy server is described in Setting up Model Proxy.
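The efficiency argument above is why the proxy pulls data by identifier instead of receiving pixels from the viewer: the application sends only UIDs, and the proxy retrieves the series from the PACS itself over standard WADO-RS. A minimal sketch of the URL construction, assuming a DICOMweb PACS (the actual proxy behavior is described in Setting up Model Proxy):

```python
def wado_retrieve_url(pacs_root: str, study_uid: str, series_uid: str) -> str:
    """Build the standard WADO-RS URL the proxy uses to pull a series
    directly from the PACS.

    `pacs_root` is the DICOMweb root of a registered data provider,
    e.g. "http://localhost:8042/dicom-web" (an assumed local address).
    """
    return f"{pacs_root}/studies/{study_uid}/series/{series_uid}"
```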

5. Add your models to application settings

Finally, you have to add the models that are now accessible through the Model Proxy to the application. Please follow the Managing AI Models inside the DICOM viewer guide.
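As with the PACS entry, registering a model amounts to giving the application a name and the proxy endpoint to call. The fields below are hypothetical placeholders, not the application's actual schema; the real fields appear in the Managing AI Models dialog.

```json
{
  "name": "Example segmentation model",
  "endpoint": "http://localhost:8000/predict",
  "supportedModalities": ["CT"]
}
```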

6. Setup your user accounts (optional but advised)

For Basic Local PACS

Usually you're dealing with more than one account: some should have write permission, while others are only for viewing data. Example setup explains how to set up accounts for different users in your local environment.
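For instance, with an Orthanc-based PACS, basic accounts are declared in the server configuration. This sketch assumes Orthanc's `RegisteredUsers` option with placeholder names and passwords; note that restricting an account to read-only access typically needs additional configuration, as covered in the Example setup guide.

```json
{
  "AuthenticationEnabled": true,
  "RegisteredUsers": {
    "radiologist": "change-this-password",
    "viewer": "change-this-password-too"
  }
}
```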

For Local PACS with JWT

If you're using PACS with Tokens, you're usually dealing with more users. Every user should have their own account, created by an Administrator as described in the User account section.
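The token format and issuing flow are defined in the Setting up PACS with JWT guide; as a general illustration of what a JWT is, an HS256 token carrying per-user claims can be minted and verified with only the Python standard library. The claim names (`sub`, `role`) and the secret are assumptions for this sketch.

```python
import base64
import hashlib
import hmac
import json


def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def make_token(claims: dict, secret: bytes) -> str:
    """Create a signed HS256 JWT; the claims limit what the user may access."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"


def verify_token(token: str, secret: bytes) -> dict:
    """Check the signature and return the claims; raises if tampered with."""
    header, payload, signature = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(signature, expected):
        raise ValueError("invalid signature")
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

A real deployment would also set and check expiry (`exp`) and issuer (`iss`) claims, which this sketch omits.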

7. You're good to go

Your architecture should look like this:

[Diagram: local architecture]