Contour
v1.0 · macOS 14+ · Apple Silicon

SAM 3 · SAM 3.1 · MLX · Metal · Accelerate

Segment anything. On your Mac. In bulk.

Contour turns Meta's SAM 3 and SAM 3.1 into a real macOS workspace — drop in a folder, write a prompt, get clean masks, boxes, and dataset-ready exports.

  • Free for personal use
  • Pro · $49 one-time
  • 1.7 GB model
412 ms median inference · ~200 instances/image · 0 bytes uploaded

Demo: IMG_0412.jpg · SAM 3.1 · prompt "person" · 5 instances · 412 ms · before/after comparison slider
Features

An instrument, not a magic wand.

Every feature is there because someone who segments images for a living needed it. No confetti, no auto-enhance, no one-click magic.

01

Prompt the way you think

Text, include-boxes, exclude-boxes — and combinations. SAM understands "person," "label," "dog." Shift-click to draw an and-not region.

prompt: "dog" + include − exclude
02

Instant threshold tuning

Raw output is cached per image. Drag confidence, NMS, alpha, and minimum area — results re-render in hundreds of milliseconds, no re-inference.

CONF 0.78 NMS 0.53 ALPHA 0.72 AREA 0.44
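The cached re-filter pass the sliders imply can be sketched in a few lines — inference runs once, then every slider change re-filters the cached raw detections. Names and the detection dict shape here are illustrative, not the app's actual API:

```python
def iou(a, b):
    """IoU of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def refilter(cached, conf=0.78, nms=0.53, min_area=1000):
    """Re-apply thresholds to cached detections — no re-inference."""
    kept = []
    # highest score first; drop low-confidence and tiny instances up front
    for det in sorted(cached, key=lambda d: -d["score"]):
        if det["score"] < conf or det["area"] < min_area:
            continue
        # greedy NMS: suppress anything overlapping an already-kept box
        if all(iou(det["box"], k["box"]) <= nms for k in kept):
            kept.append(det)
    return kept
```

Because this pass is pure filtering over data already in memory, it explains why re-rendering takes hundreds of milliseconds instead of another full inference.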
03

Batch a whole collection

Point at a folder, set a prompt and model once, run. Status is tracked in SwiftData. Completed images skip on re-runs; crashes recover cleanly.

IMG_0412.jpg · DONE · 312 ms
IMG_0413.jpg · DONE · 298 ms
IMG_0414.jpg · RUNNING
IMG_0415.jpg · PENDING
IMG_0416.jpg · PENDING
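The resumable-batch idea reduces to a status check per image (illustrative statuses and function names, not the app's SwiftData schema) — completed images are skipped on re-runs, so a crash or cancel mid-batch loses nothing already finished:

```python
def run_batch(images, status, segment):
    """Process `images`, skipping any already marked DONE in `status`."""
    for name in images:
        if status.get(name) == "DONE":
            continue  # finished on a previous run — skip
        status[name] = "RUNNING"
        segment(name)          # one inference call per image
        status[name] = "DONE"  # persisted before moving on
    return status
```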
04

Three overlay modes

Pixel mask, axis-aligned bbox, or oriented bbox — switch per export.

MASK BBOX OBB
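The three modes are derived views of the same pixel mask. As a minimal sketch (mask as rows of 0/1; function name is illustrative), the axis-aligned bbox is just a min/max scan over nonzero pixels — the oriented bbox would typically come from a minimum-area-rectangle fit such as OpenCV's `cv2.minAreaRect`, omitted here:

```python
def mask_to_bbox(mask):
    """Axis-aligned [x, y, w, h] around the nonzero pixels of `mask`."""
    xs = [x for row in mask for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(mask) if any(row)]
    if not xs:
        return None  # empty mask: no instance
    x0, y0 = min(xs), min(ys)
    return [x0, y0, max(xs) - x0 + 1, max(ys) - y0 + 1]
```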
05

On device, via MLX

MLX for compute, Metal for GPU, Accelerate for SIMD mask composition. No cloud, no API key, no telemetry.

MLX · Metal on device
06

Real export formats

COCO JSON (with optional RLE), YOLO TXT bbox or polygon, mask PNG, transparent cutout PNG. Optional source-image copy.

COCO.json ready · YOLO.txt ready · mask.png ready · cutout.png ready
Models

Pick the precision that fits the box.

Six SAM variants ship ready to cache. Downloaded once from Hugging Face, color-coded in the toolbar so you always know what's loaded.

SAM 3 full · ~1.7 GB · 8 GB+ mem · 412 ms infer
Reliable baseline. Fast on most Apple Silicon.

SAM 3.1 full · ~3.5 GB · 16 GB+ mem · 680 ms infer
Latest. Adds box-geometry prompts.

SAM 3 · int8 quant · ~900 MB · 8 GB+ mem · 298 ms infer
Quantized · near-lossless for most prompts.

SAM 3 · int4 quant · ~450 MB · 8 GB+ mem · 215 ms infer
Smallest. Trades accuracy for throughput.

SAM 3.1 · mxfp8 · ~900 MB · 8 GB+ mem · 340 ms infer
Microscaling FP — modern low-precision.

SAM 3.1 · mxfp4 · ~500 MB · 8 GB+ mem · 240 ms infer
Smallest 3.1 variant. Box prompts supported.

Workflow

Five steps, from folder to dataset.

No modal popups. No "AI magic." Each step is a tool in the workstation — toolbar-driven, menu-driven, keyboard-driven.

  1. Collect
     Drag a folder of images into a new collection. SwiftData tracks per-image status.

  2. Pick a model
     SAM 3 or 3.1, full-precision or quantized. Color-coded in the toolbar.
     SAM 3.1 · mxfp8 · 900 MB · 8 GB req

  3. Prompt
     Type a label, draw include / exclude boxes, or combine.

  4. Tune
     Sliders re-render from cache — no re-inference while you dial it in.

  5. Export
     Mask PNG, cutout PNG, COCO JSON, or YOLO TXT. Pick a folder, run it.
Exports

Real formats. Nothing to reshape.

What comes out of Contour drops straight into your training pipeline — no round-tripping, no reshaping scripts, no CSV middleware.

COCO · annotations.json

COCO-format instance annotations with optional RLE-encoded masks.

{
  "images": [...],
  "annotations": [
    {
      "id": 1,
      "image_id": 412,
      "category_id": 1,
      "bbox": [142,288,72,312],
      "area": 18472,
      "segmentation": {
        "counts": "PYm03L4M...",
        "size": [1920,1080]
      },
      "score": 0.97
    }
  ]
}
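Consuming this file downstream is plain JSON work — a hedged sketch (key names follow the standard COCO schema shown above; nothing Contour-specific is assumed). COCO boxes are `[x, y, width, height]` in pixels, so most training code converts them to corner form first:

```python
import json
from collections import defaultdict

def load_coco(path):
    """Group annotations by image_id, with bbox as [x1, y1, x2, y2]."""
    with open(path) as f:
        coco = json.load(f)
    by_image = defaultdict(list)
    for ann in coco["annotations"]:
        x, y, w, h = ann["bbox"]  # COCO: top-left corner + width/height
        by_image[ann["image_id"]].append({
            "category_id": ann["category_id"],
            "bbox_xyxy": [x, y, x + w, y + h],
            "score": ann.get("score"),  # optional in strict COCO
        })
    return by_image
```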
YOLO · IMG_0412.txt

YOLO TXT — bbox or polygon format; polygons are simplified with Douglas–Peucker.

0 0.186 0.554 0.038 0.520
0 0.452 0.554 0.038 0.520
0 0.742 0.576 0.037 0.510
0 0.867 0.598 0.036 0.498
0 0.092 0.620 0.024 0.391

# class_id cx cy w h
# normalized 0..1
# 5 detections
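The lines above are normalized center-format; decoding one back to pixel corners follows the standard YOLO convention (function name is illustrative, nothing Contour-specific):

```python
def yolo_to_pixels(line, img_w, img_h):
    """'cls cx cy w h' (normalized 0..1) -> (class_id, x1, y1, x2, y2)."""
    cls, cx, cy, w, h = line.split()
    # scale back to pixels, then shift from center to corners
    cx, cy = float(cx) * img_w, float(cy) * img_h
    w, h = float(w) * img_w, float(h) * img_h
    return (int(cls),
            round(cx - w / 2), round(cy - h / 2),
            round(cx + w / 2), round(cy + h / 2))
```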
MASK · IMG_0412_mask.png

Per-instance binary alpha masks. One PNG per detection.

IMG_0412/
├─ mask_001.png  · 62 KB
├─ mask_002.png  · 58 KB
├─ mask_003.png  · 61 KB
├─ mask_004.png  · 47 KB
└─ mask_005.png  · 34 KB

5 masks · 262 KB total
1920×1080 · 8-bit alpha
CUTOUT · IMG_0412_cut.png

Transparent-background cutouts — ready for compositing and matting.

IMG_0412/
├─ cutout_001.png · 240 KB
├─ cutout_002.png · 221 KB
├─ cutout_003.png · 204 KB
├─ cutout_004.png · 178 KB
└─ cutout_005.png · 132 KB

5 cutouts · RGBA
alpha channel preserved
On device

Your client's unreleased imagery never leaves the machine.

Every model runs locally via MLX — Apple's machine learning framework for Apple Silicon — with Metal for GPU compute and Accelerate for SIMD-accelerated mask composition.

  • No cloud inference. MLX runs on your GPU, not someone else's.
  • No API key. No quota. No sign-in to segment.
  • No telemetry. Contour makes zero outbound requests once models are cached.
  • Model weights downloaded once from Hugging Face. Silent thereafter.
  • Collections, masks, and caches stored in the app's SwiftData container — yours.
  • Sandboxed. App-Store-ready. Notarized and hardened runtime.
Pricing

Free to try. One-time for Pro.

No subscription. Buy Pro once, own it. Free tier covers personal projects and exploration — Pro unlocks batch, exports, and the 3.1 quantizations.

Free

$0 forever

Personal use. One image at a time.

  • SAM 3 · quantized (int8)
  • Text + box prompts
  • Slider tuning with live re-render
  • Mask PNG + cutout PNG export
  • Up to 50 detections per image
  • One collection, 100 images

Pro

$49 one-time · pay once, yours

Everything. For people who segment for a living.

  • All models · SAM 3, 3.1, int8, mxfp8, mxfp4
  • Batch processing across collections
  • COCO JSON (with RLE) + YOLO TXT export
  • OBB · oriented bounding box output
  • Up to 200 detections per image
  • Unlimited collections and images
  • All future 1.x updates
FAQ

Questions we actually get.

Missing one? The app's Help menu opens a short, searchable manual — or email support@mgcrea.io.

Q01 Does Contour need an internet connection?

Only once — to download model weights from Hugging Face on first use. After models are cached locally, Contour runs entirely offline. The app makes zero outbound requests during segmentation.

Q02 Which Macs can run it?

macOS 14 (Sonoma) or later on Apple Silicon. 8 GB of unified memory for quantized SAM 3; 16 GB+ recommended for full-precision SAM 3.1. Intel Macs are not supported — MLX and Metal require Apple Silicon.

Q03 How is this different from a one-click background remover?

A background remover is about the outline — one subject, one clean cutout. Contour is about the partition — many instances, each one identified, each one bounded, exportable as a dataset. Different problem, different tool.

Q04 Can I script it or automate batches?

Pro includes CLI export and AppleScript / Shortcuts bindings for running collections headlessly. SwiftData is queryable via the app's URL scheme.

Q05 Do I need to accept Meta's SAM license?

Yes. Contour gates SAM downloads behind a one-time license-acceptance prompt. After you accept, subsequent model pulls are silent.

Q06 What about my training data — is anything collected?

Nothing. Contour ships with no analytics, no crash reporter, no "help improve the product" telemetry. Images, masks, and caches live in the app's sandboxed SwiftData container on your machine.

Q07 Will there be a Linux or Windows version?

No. MLX is Apple-Silicon-only, and the app is built around macOS idioms — SwiftUI, SwiftData, Quick Look, Finder integration. A web or Linux port would be a different product.