Version: 3.0 🚧

Data Sources

Set up a local DICOM server#

ATTENTION! Already have a remote or local server? Skip to the configuration section below.

While the OHIF Viewer can work with any data source, the easiest to configure are the ones that follow the [DICOMWeb][dicom-web] spec.

  1. Choose and install an Image Archive
  2. Upload data to your archive (e.g. with DCMTK's [storescu][storescu] or your archive's web interface)
  3. Keep the server running

For our purposes, we will be using Orthanc, but you can see a list of other Open Source options below.


Not sure whether you already have Docker installed? Try running `docker --version` in a command prompt or terminal.

If you are using Docker Toolbox, you need to change the PROXY_DOMAIN parameter in platform/viewer/package.json to the IP address that `docker-machine ip` returns. This is the value [WebPack][webpack-proxy] uses to proxy requests.

Open Source DICOM Image Archives#

There are a lot of options available to you to use as a local DICOM server. Here are some of the more popular ones:

| Archive | Installation |
| --- | --- |
| DCM4CHEE Archive 5.x | W/ Docker |
| Orthanc | W/ Docker |
| DICOMcloud (DICOM Web only) | Installation |
| OsiriX (Mac OSX only) | Desktop Client |
| Horos (Mac OSX only) | Desktop Client |

Feel free to make a Pull Request if you want to add to this list.

Below, we will focus on DCM4CHEE and Orthanc usage:

Running Orthanc#

Start Orthanc:

```bash
# Runs orthanc so long as window remains open
yarn run orthanc:up
```

Upload your first Study:

  1. Navigate to Orthanc's web interface at http://localhost:8042/app/explorer.html in a web browser.
  2. In the top right corner, click "Upload"
  3. Click "Select files to upload..." and select one or more DICOM files
  4. Click "Start the upload"
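If you would rather script the upload than use the web interface, Orthanc's REST API accepts raw DICOM bytes via a POST to `/instances`. Below is a minimal sketch, assuming the default Orthanc instance at `localhost:8042` and a local `dicom/` directory of `.dcm` files; the helper name `build_upload_request` and the directory layout are our own, not part of Orthanc or OHIF.

```python
import pathlib
import urllib.request

ORTHANC_URL = "http://localhost:8042"  # assumption: default Orthanc port


def build_upload_request(dicom_bytes, base=ORTHANC_URL):
    """Build a POST /instances request carrying one DICOM instance."""
    return urllib.request.Request(
        base + "/instances",
        data=dicom_bytes,
        headers={"Content-Type": "application/dicom"},
        method="POST",
    )


if __name__ == "__main__":
    # Upload every .dcm file found in a local "dicom" directory (hypothetical path)
    for path in pathlib.Path("dicom").glob("*.dcm"):
        with urllib.request.urlopen(build_upload_request(path.read_bytes())) as resp:
            print(path.name, resp.status)
```

This is handy when you want to seed a fresh Orthanc container with test studies from a script instead of clicking through the explorer.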

Orthanc: Learn More#

You can see the docker-compose.yml file this command runs at [<project-root>/.docker/Nginx-Orthanc/][orthanc-docker-compose], and more on Orthanc for Docker in Orthanc's documentation.

Connecting to Orthanc#

Now that we have a local Orthanc instance up and running, we need to configure our web application to connect to it. Open a new terminal window, navigate to this repository's root directory, and run:

```bash
# If you haven't already, enable yarn workspaces
yarn config set workspaces-experimental true

# Restore dependencies
yarn install

# Run our dev command, but with the local orthanc config
yarn run dev:orthanc
```

Configuration: Learn More#

For more configuration fun, check out the Essentials Configuration guide.

Let's take a look at what's going on under the hood here. yarn run dev:orthanc is running the dev:orthanc script in our project's package.json (inside platform/viewer). That script is:

```bash
cross-env NODE_ENV=development PROXY_TARGET=/dicom-web PROXY_DOMAIN=http://localhost:8042 APP_CONFIG=config/docker_nginx-orthanc.js webpack-dev-server --config .webpack/webpack.pwa.js -w
```
  • cross-env sets four environment variables:
    • NODE_ENV: development
    • PROXY_TARGET: /dicom-web
    • PROXY_DOMAIN: http://localhost:8042
    • APP_CONFIG: config/docker_nginx-orthanc.js
  • webpack-dev-server runs using the .webpack/webpack.pwa.js configuration file. It will watch for changes and update as we develop.

PROXY_TARGET and PROXY_DOMAIN tell our development server to proxy requests to Orthanc. This allows us to bypass CORS issues that normally occur when requesting resources that live at a different domain.
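To make the proxy behavior concrete, here is a small sketch (in Python, with a hypothetical `proxy_url` helper, not actual OHIF or webpack code) of the path mapping the dev server performs: any request whose path falls under PROXY_TARGET is forwarded to PROXY_DOMAIN, and everything else is served by webpack-dev-server itself.

```python
def proxy_url(request_path,
              proxy_target="/dicom-web",
              proxy_domain="http://localhost:8042"):
    """Return the backend URL a proxied request is forwarded to,
    or None when the dev server handles the path itself."""
    if request_path.startswith(proxy_target):
        return proxy_domain + request_path
    return None


# A QIDO-RS study search is forwarded to Orthanc:
print(proxy_url("/dicom-web/studies"))  # http://localhost:8042/dicom-web/studies

# The app shell itself is not:
print(proxy_url("/index.html"))  # None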

The APP_CONFIG value tells our app which file to load onto window.config. By default, our app uses the file at <project-root>/platform/viewer/public/config/default.js. Here is what that configuration looks like:

```js
window.config = {
  routerBasename: '/',
  extensions: [],
  modes: [],
  showStudyList: true,
  dataSources: [
    {
      friendlyName: 'dcmjs DICOMWeb Server',
      namespace: 'org.ohif.default.dataSourcesModule.dicomweb',
      sourceName: 'dicomweb',
      configuration: {
        name: 'DCM4CHEE',
        wadoUriRoot: '',
        qidoRoot: '',
        wadoRoot: '',
        qidoSupportsIncludeField: true,
        supportsReject: true,
        imageRendering: 'wadors',
        thumbnailRendering: 'wadors',
        enableStudyLazyLoad: true,
        supportsFuzzyMatching: true,
        supportsWildcard: true,
      },
    },
  ],
  defaultDataSourceName: 'dicomweb',
};
```

To learn more about how you can configure the OHIF Viewer, check out our Configuration Guide.

Running DCM4CHEE#

dcm4che is a collection of open source applications for the healthcare enterprise, written in the Java programming language, that implement the DICOM standard. dcm4chee (note the extra 'e') is the dcm4che project for an Image Manager/Image Archive, providing storage, retrieval, and other functionality. You can read more about dcm4chee on their website here

DCM4CHEE installation is out of scope for these tutorials; installation instructions can be found here

An overview of steps for running OHIF Viewer using a local DCM4CHEE is shown below:

Static Files#

There is a binary DICOM to static file generator, which produces easily served binary files. The files are all compressed to reduce space significantly, and everything OHIF requires is pre-computed, so serving a file costs only the disk read and HTTP write time, with no extra processing.

The project for the static wado files is located here:

It can be compiled with Java and Gradle, and then run against a set of DICOM files (in this example located in /dicom/study1, outputting to /dicomweb), with a server then run against that data, like this:

```bash
git clone static-wado
./gradlew installDist
StaticWado/build/install/StaticWado/bin/StaticWado -d /dicomweb /dicom/study1
cd /dicomweb
npx http-server -p 5000 --cors -g
```
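If you don't have npx handy, a CORS-enabled static file server can also be sketched with Python's standard library. This is a stand-in for `http-server -p 5000 --cors`, not the tool itself, and it does not replicate the `-g` pre-compressed gzip behavior; the `make_server` helper is our own name.

```python
import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler


class CORSRequestHandler(SimpleHTTPRequestHandler):
    """Serve static files with a permissive CORS header, like http-server --cors."""

    def end_headers(self):
        self.send_header("Access-Control-Allow-Origin", "*")
        super().end_headers()


def make_server(directory, port=5000):
    # Bind the handler to the directory to serve (e.g. the generated /dicomweb tree)
    handler = functools.partial(CORSRequestHandler, directory=directory)
    return HTTPServer(("127.0.0.1", port), handler)


# Demo: bind to a free port, then close; in real use call
# make_server("/dicomweb").serve_forever()
server = make_server(".", port=0)
print("would serve on port", server.server_address[1])
server.server_close()
```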

There is then a dev environment in the platform/viewer directory which can be run against those files, like this:

```bash
cd platform/viewer
yarn dev:static
```

Additional studies can be added to the dicomweb by re-running the StaticWado command. It will create a single studies.gz index file (JSON DICOM file, compressed) containing an index of all studies created. There is then a small extension to OHIF which performs client side indexing.
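The index file itself is just gzip-compressed JSON, so reading and writing one can be sketched in a few lines. The record shape below is illustrative only, not the actual StaticWado schema, and the function names are our own.

```python
import gzip
import json


def write_study_index(studies, path="studies.gz"):
    """Write a list of study records as gzip-compressed JSON."""
    with gzip.open(path, "wt", encoding="utf-8") as f:
        json.dump(studies, f)


def read_study_index(path="studies.gz"):
    """Load the compressed study index back into Python objects."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return json.load(f)


# Illustrative records only; the real index stores DICOM JSON study metadata.
index = [{"StudyInstanceUID": "1.2.3", "PatientName": "DOE^JANE"}]
write_study_index(index, "studies.gz")
print(read_study_index("studies.gz"))
```

A client can fetch and decompress this single file to search across all studies without any server-side query support, which is what makes the purely static deployment workable.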

The StaticWado command also knows how to deploy a client and dicomweb directory to Amazon S3, which can then serve the files directly. There is another build script, build:aws, in the viewer package.json to create such a deployment.