Project Neura
MIP Candy
A candy for medical image processing

Next-generation infrastructure for fast prototyping in machine learning.

MIP Candy brings ready-to-use training, inference, and evaluation pipelines together in a polished, aesthetic package, so you can focus on your experiments, not boilerplate.

MIP Candy overview poster

Trusted by

MIP Candy powers research across top institutions.

Key features

Designed for modern medical image research pipelines.

Training
Easy adaptation to your workflow

Override a single method to plug in your own network architecture. Grab a tool from the box and customize your experiments (sliding-window inference, for example, is sketched after the list).

  • Sliding window
  • ROI inspection & cropping
  • Automatic padding & shape alignment
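
To give a flavor of what these tools replace, here is a minimal, generic sketch of 2D sliding-window inference in plain PyTorch. It illustrates the technique only; it is not MIP Candy's API, and the tile size, stride, and model are placeholders.

Example — Sliding-window inference (illustrative sketch)
import torch
from torch import nn


@torch.no_grad()
def sliding_window_inference(image: torch.Tensor, model: nn.Module,
                             tile: int = 256, stride: int = 128) -> torch.Tensor:
    # image: (C, H, W), assumed at least `tile` pixels per spatial dimension;
    # padding & shape alignment (see above) cover smaller inputs in practice.
    _, h, w = image.shape
    # Window origins, clamped so the last window ends exactly at the border.
    tops = sorted({min(t, h - tile) for t in range(0, h, stride)})
    lefts = sorted({min(l, w - tile) for l in range(0, w, stride)})
    logits, counts = None, torch.zeros(1, h, w, device=image.device)
    for top in tops:
        for left in lefts:
            patch = image[:, top:top + tile, left:left + tile].unsqueeze(0)
            out = model(patch).squeeze(0)  # assumes output matches patch size
            if logits is None:
                logits = torch.zeros(out.shape[0], h, w, device=image.device)
            logits[:, top:top + tile, left:left + tile] += out
            counts[:, top:top + tile, left:left + tile] += 1
    return logits / counts  # average predictions where tiles overlap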
Interface
Thoughtful command-line UX

A clean CLI layout makes it easy to configure experiments, track progress, and resume work without digging through scripts.

CLI UI
Visualization
Built-in 2D & 3D views

Inspect slices or volumes directly from the training pipeline to build an intuitive understanding of your data and predictions.

2D and 3D visualization
Reliability
Interruption-tolerant runs

Experiments can be safely paused and resumed with built-in recovery mechanisms, so cluster hiccups don’t cost you progress.

Recovery screenshots
Dashboards
Remote monitoring ready

Connect to Notion, Weights & Biases, and TensorBoard for rich experiment tracking and sharing with your team.

Notion dashboard
Python 3.12+
Modern Python, modern stack

Built for Python 3.12 and above, MIP Candy takes advantage of modern typing and ecosystem improvements out of the box.

Custom trainers

Adapt a new network in one method.

Start from SegmentationTrainer and implement only the network construction. MIP Candy handles data flow, loss computation, augmentation, checkpointing, and evaluation out of the box.

Example — Custom model integration
from typing import override
from torch import nn
from mipcandy import SegmentationTrainer


class MyTrainer(SegmentationTrainer):
    @override
    def build_network(self, example_shape: tuple[int, ...]) -> nn.Module:
        ...
Provide your architecture once. MIP Candy takes care of the entire training pipeline.
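
As an illustration, a hypothetical trainer might return a small convolutional network. Only the override pattern above comes from MIP Candy; the layer choices below, and the assumption that example_shape is (channels, height, width) for a single sample, are illustrative.

Example — A toy build_network (illustrative sketch)
from typing import override

from torch import nn

from mipcandy import SegmentationTrainer


class ToyTrainer(SegmentationTrainer):
    @override
    def build_network(self, example_shape: tuple[int, ...]) -> nn.Module:
        # Assumed: example_shape is (channels, height, width) for one sample.
        in_channels = example_shape[0]
        return nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),  # single-channel logits for binary masks
        )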

Live Notion dashboard

Explore an interactive MIP Candy frontend demo directly in Notion.

Quick start

Train like a pro in a few lines

Download a dataset, create a dataset wrapper, and hand it to a bundled trainer. Below is an example using the PH2 dataset with batch size 1 to accommodate varying image shapes; alternatively, a ROIDataset can align them.

Example — nnU-Net style training
from typing import override

import torch
from mipcandy_bundles.unet import UNetTrainer
from torch.utils.data import DataLoader

from mipcandy import download_dataset, NNUNetDataset


class PH2(NNUNetDataset):
    @override
    def load(self, idx: int) -> tuple[torch.Tensor, torch.Tensor]:
        image, label = super().load(idx)
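        # Reorder the image from channels-last to channels-first,
        # e.g. (1, H, W, C) -> (C, H, W), as expected by PyTorch convolutions.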
        return image.squeeze(0).permute(2, 0, 1), label


download_dataset("nnunet_datasets/PH2", "tutorial/datasets/PH2")
dataset, val_dataset = PH2("tutorial/datasets/PH2", device="cuda").fold()
dataloader = DataLoader(dataset, 1, shuffle=True)
val_dataloader = DataLoader(val_dataset, 1, shuffle=False)
trainer = UNetTrainer("tutorial", dataloader, val_dataloader, device="cuda")
trainer.train(1000, note="a nnU-Net style example")
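
The example above keeps batch size 1 because PH2 images come in different shapes. If you would rather batch them without switching to a ROIDataset, one generic option (plain PyTorch, not part of MIP Candy) is a collate function that pads every sample in a batch to the largest spatial size; the pad_collate name below is hypothetical.

Example — Padding collate function (illustrative sketch)
import torch
import torch.nn.functional as F


def pad_collate(batch: list[tuple[torch.Tensor, torch.Tensor]]) -> tuple[torch.Tensor, torch.Tensor]:
    # Pad each (image, label) pair to the largest height and width in the batch;
    # labels are assumed to share their image's spatial size.
    max_h = max(image.shape[-2] for image, _ in batch)
    max_w = max(image.shape[-1] for image, _ in batch)
    images, labels = [], []
    for image, label in batch:
        pad = (0, max_w - image.shape[-1], 0, max_h - image.shape[-2])  # (left, right, top, bottom)
        images.append(F.pad(image, pad))
        labels.append(F.pad(label, pad))
    return torch.stack(images), torch.stack(labels)


# dataloader = DataLoader(dataset, 4, shuffle=True, collate_fn=pad_collate)

Note that zero-padded label pixels count as background and may bias the loss; aligning shapes with a ROIDataset, as mentioned above, avoids this.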
Installation

Install MIP Candy

MIP Candy requires Python ≥ 3.12. Install the standard bundle from PyPI:

$ pip install "mipcandy[standard]"
Stand on the shoulders of giants

Install MIP Candy Bundles

MIP Candy Bundles provide verified model architectures with corresponding trainers and predictors. You can install them alongside MIP Candy:

$ pip install "mipcandy[all]"