Grab a tool from the box and customize your experiments; a conceptual sketch of the sliding-window idea follows the list.
- Sliding window
- ROI inspection & cropping
- Automatic padding & shape alignment
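The sliding-window tool is easiest to understand as a plain PyTorch loop. The sketch below only illustrates the concept and is not MIP Candy's actual API; the function name, the window and stride defaults, and the single-channel output are assumptions made for the example.

import torch
import torch.nn.functional as F
from torch import nn


def sliding_window_inference(image: torch.Tensor, model: nn.Module,
                             window: tuple[int, int] = (128, 128),
                             stride: tuple[int, int] = (64, 64)) -> torch.Tensor:
    """Predict a (C, H, W) image patch by patch and average overlapping logits."""
    _, h, w = image.shape
    # Automatic padding: grow the image so at least one full window fits
    pad_h, pad_w = max(window[0] - h, 0), max(window[1] - w, 0)
    image = F.pad(image, (0, pad_w, 0, pad_h))
    _, h, w = image.shape
    logits = torch.zeros(1, h, w, device=image.device)  # assumes a single-channel output for brevity
    counts = torch.zeros(1, h, w, device=image.device)
    ys = list(range(0, h - window[0] + 1, stride[0]))
    xs = list(range(0, w - window[1] + 1, stride[1]))
    if ys[-1] != h - window[0]:  # make sure the bottom and right borders are covered
        ys.append(h - window[0])
    if xs[-1] != w - window[1]:
        xs.append(w - window[1])
    with torch.no_grad():
        for y in ys:
            for x in xs:
                patch = image[:, y:y + window[0], x:x + window[1]].unsqueeze(0)
                logits[:, y:y + window[0], x:x + window[1]] += model(patch).squeeze(0)
                counts[:, y:y + window[0], x:x + window[1]] += 1
    return logits / counts  # average predictions where windows overlap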
MIP Candy brings ready-to-use training, inference, and evaluation pipelines together with aesthetics, so you can focus on your experiments, not boilerplate.
MIP Candy powers research across top institutions.
Designed for modern medical image research pipelines.
Override a single method to plug in your own network architecture.
A clean CLI layout makes it easy to configure experiments, track progress, and resume work without digging through scripts.
Inspect slices or volumes directly from the training pipeline for intuitive understanding of your data and predictions.
Experiments can be safely paused and resumed with built-in recovery mechanisms, so cluster hiccups don’t cost you progress.
Connect to Notion, Weights & Biases, and TensorBoard for rich experiment tracking and sharing with your team.
Built for Python 3.12 and above, MIP Candy takes advantage of modern typing and ecosystem improvements out of the box.
Start from SegmentationTrainer and only implement the network construction. MIP Candy handles data flow, loss computation, augmentation, checkpointing, and evaluation out of the box.
from typing import override
from torch import nn
from mipcandy import SegmentationTrainer
class MyTrainer(SegmentationTrainer):
    @override
    def build_network(self, example_shape: tuple[int, ...]) -> nn.Module:
        ...
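For illustration, a filled-in version might look like the following. The convolutional stack and the assumption that example_shape describes one sample as (channels, height, width) are placeholders, not an architecture MIP Candy prescribes.

from typing import override

from torch import nn

from mipcandy import SegmentationTrainer


class MyTrainer(SegmentationTrainer):
    @override
    def build_network(self, example_shape: tuple[int, ...]) -> nn.Module:
        # Illustrative only: any nn.Module will do; example_shape is assumed
        # to describe one sample as (channels, height, width)
        return nn.Sequential(
            nn.Conv2d(example_shape[0], 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, kernel_size=1),  # single-channel segmentation logits
        )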
Explore an interactive MIP Candy frontend demo directly in Notion.
Download a dataset, create a dataset wrapper, and hand it to a bundled trainer. Below is an example using the PH2 dataset; the batch size is 1 because the samples vary in shape, although you can use a ROIDataset to align them.
from typing import override
import torch
from mipcandy_bundles.unet import UNetTrainer
from torch.utils.data import DataLoader
from mipcandy import download_dataset, NNUNetDataset
class PH2(NNUNetDataset):
    @override
    def load(self, idx: int) -> tuple[torch.Tensor, torch.Tensor]:
        image, label = super().load(idx)
        # Drop the leading singleton dimension and reorder the image to channel-first (CHW)
        return image.squeeze(0).permute(2, 0, 1), label
download_dataset("nnunet_datasets/PH2", "tutorial/datasets/PH2")
dataset, val_dataset = PH2("tutorial/datasets/PH2", device="cuda").fold()
dataloader = DataLoader(dataset, 1, shuffle=True)
val_dataloader = DataLoader(val_dataset, 1, shuffle=False)
trainer = UNetTrainer("tutorial", dataloader, val_dataloader, device="cuda")
trainer.train(1000, note="a nnU-Net style example")
MIP Candy requires Python ≥ 3.12. Install the standard bundle from PyPI:
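The command below assumes the PyPI distribution name matches the import name, mipcandy:

pip install mipcandy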
MIP Candy Bundles provide verified model architectures with matching trainers and predictors, and can be installed alongside MIP Candy.
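If the bundles ship as a separate distribution whose name mirrors the import mipcandy_bundles, both can be installed in one command; the distribution name below is an assumption, not a confirmed PyPI listing:

pip install mipcandy mipcandy-bundles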
Jump into the ecosystem around MIP Candy.