Getting Started#

Installation#

Python must be installed to use PSSR2. If you have not already installed it, download it here.

Before installing PSSR2, you may want to create an environment for it with

$ conda create -n pssr

or the equivalent command for any other environment manager if you choose to do so.


Note

The package name of PSSR2 is pssr! All modules are referenced as such.

You can easily install PSSR2 via the pip package manager:

$ pip install pssr

All package versions are also available on the GitHub repository:

$ pip install git+https://github.com/ucsdmanorlab/PSSR/dist/pssr-x.x.x-py3-none-any.whl

Running the CLI#

The PSSR2 CLI is included with the package installation and can be run with the pssr command. It provides a simple interface for using PSSR2 without having to write any code, and covers most basic use cases.

The CLI can run in either train or predict mode. It takes a number of arguments, described below.
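
The same argument reference can be printed from the command line at any time with the help flag:

$ pssr -h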

CLI Arguments

PSSR2 CLI for basic usage

usage: pssr [-h] [-t] [-dp DATA_PATH] [-dt DATA_TYPE] [-mt MODEL_TYPE]
            [-mp MODEL_PATH] [-e EPOCHS] [-b BATCH_SIZE] [-lr LR]
            [-p PATIENCE] [-mse] [-cp] [-sl]

Named Arguments#

-t, --train

enable train mode

Default: False

-dp, --data-path

specify dataset path

-dt, --data-type

specify dataset type

Default: “ImageDataset”

-mt, --model-type

specify model type

Default: “ResUNet”

-mp, --model-path

specify model path

-e, --epochs

specify number of training epochs

Default: 10

-b, --batch-size

specify batch size

Default: 16

-lr, --lr

specify learning rate

Default: 0.001

-p, --patience

specify learning rate decay patience

Default: 3

-mse, --mse

use MSE loss instead of MS-SSIM loss

Default: False

-cp, --checkpoint

save model checkpoints during training

Default: False

-sl, --save-losses

save training losses

Default: False

Keep in mind that arguments representing an object such as a dataset or model can be given as a class declaration with additional arguments, written as a string in Python syntax. For example, --model-type could be given as -mt "ResUNet(hidden=[128, 256], scale=4)".
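
For instance, a training run using the customized model declaration above could look like the following, where your/path is a placeholder for your dataset path:

$ pssr -t -dp your/path -mt "ResUNet(hidden=[128, 256], scale=4)"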


If you do not have access to a microscopy dataset, a sample EM training dataset containing high-resolution 512x512 images (hr_res=512) with a pixel size of 2 nm can be found here. Real-world EM high-low-resolution image pairs of the same resolution for testing can be found here. Larger datasets and all data used in the PSSR2 paper can also be found on 3Dem.org. If your dataset has data of a different resolution, hr_res and scale can be changed correspondingly.
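
As an illustrative sketch, training on a hypothetical dataset of 1024x1024 high-resolution images with 8x upscaling could adjust the dataset and model declarations as follows (the path and values are placeholders):

$ pssr -t -dp your/path -dt "ImageDataset(hr_res=1024)" -mt "ResUNet(scale=8)"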


Training#

A model can be trained by running

$ pssr -t -dp your/path

where your/path is replaced with the path of your training dataset (a folder containing high-resolution images or image sheets).

The low-resolution images will be generated via Crappifier, which is explained in Principles of PSSR.


The trained model will be saved in your current directory.

By default the dataset used is ImageDataset. If your dataset contains image sheets (e.g. .czi files) rather than many images, you can use SlidingDataset by adding the argument -dt SlidingDataset. The batch size can also be changed with the -b argument.
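
For example, training on an image sheet dataset with a smaller batch size could look like the following, where your/path and the batch size are placeholders:

$ pssr -t -dp your/path -dt SlidingDataset -b 8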


Predicting#

A pretrained PSSR2 model for EM data, a ResUNet with default arguments, can be found here.

To run the demo in predict mode, omit the -t argument. The dataset path should be changed to the path containing the low-resolution images to be upscaled. The -mp argument must be set to the path of your trained model. The predicted upscaled images will be saved to the preds folder.
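
A predict mode invocation could therefore look like the following, where your/lr and your/model.pth are placeholders for your low-resolution data path and trained model path:

$ pssr -dp your/lr -mp your/model.pth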

Note

SlidingDataset does not automatically detect low-resolution inputs. hr_res must be lowered to the size of the low-resolution image and lr_scale must be set to -1.
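
For example, if the low-resolution images are 128x128 (an illustrative value), the dataset declaration could be adjusted as follows, with placeholder paths:

$ pssr -dp your/lr -dt "SlidingDataset(hr_res=128, lr_scale=-1)" -mp your/model.pth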


If a PairedImageDataset instance with high-low-resolution image pairs is given as the dataset, additional performance metrics will be calculated. To define both the high-resolution and low-resolution data paths, provide both paths in order, separated by a comma, for the -dp argument

$ pssr -dp your/hr,your/lr -dt PairedImageDataset

where your/hr and your/lr are replaced by your high-resolution and low-resolution data paths, respectively.


If high-resolution images are given using an ImageDataset, then low-resolution images will be generated via Crappifier and performance metrics will still be calculated.
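
As a sketch (with placeholder paths), evaluating a trained model against a folder of high-resolution images using the default ImageDataset could look like:

$ pssr -dp your/hr -mp your/model.pth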

Next Steps#

If you are not familiar with PSSR2 or super-resolution, start with the Principles of PSSR.

For usage of PSSR2 beyond the extent of the demo, learn how to implement your own workflow.

A full reference with explanations of all PSSR2 tools is available in the API Reference.