DeepProfiler

Python 3.6+ · TensorFlow 2.5+ · Cell Painting CNN-1 DOI · Example data DOI · CometML

Image-based profiling using deep learning

DeepProfiler is a set of tools for using deep learning to analyze imaging data from high-throughput biological experiments. Please see our DeepProfiler Handbook for details on how to use it, and the DeepProfilerExperiments repository for examples of configuration files and downstream analysis.

Check out our preprint on bioRxiv.

Cell Painting CNN-1

Cell Painting CNN weights are available on Zenodo.

We used DeepProfiler to train a feature extraction model for single cells in Cell Painting experiments. The model is an EfficientNet trained to process the five channels of the Cell Painting assay and produce single-cell morphology embeddings, which can be aggregated to profile treatments in large-scale experiments. Features obtained with the Cell Painting CNN-1 are more robust and bring state-of-the-art performance for downstream analysis tasks.

Quick Guide

System requirements

DeepProfiler works best with Linux operating systems (Ubuntu 18+).

Clone and install DeepProfiler

First, clone or fork this repository together with the example data (the example data is stored with Git LFS):

git clone https://github.com/broadinstitute/DeepProfiler.git

Alternatively, you can download the example data from Zenodo.

If you don't need example data, you can clone without it:

GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/broadinstitute/DeepProfiler.git
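
If you skipped the LFS files and decide later that you want the example data, you can fetch it from inside the cloned repository (a sketch, assuming git-lfs is installed):

```bash
# Fetch the LFS-tracked example data that was skipped during cloning
git lfs pull
```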

Then, from inside the cloned repository, install it with:

pip install -e .

Download example data

This repository contains example data that is already structured as a DeepProfiler project. Unpack example_data.tar.gz with:

tar -xzf example_data.tar.gz

Profiling the example data with GPU acceleration is expected to take about one minute. Single-cell export and training are expected to take 5-10 minutes.

Profiling with the Cell Painting CNN-1

To profile the example data, first create an experiment folder (for example, cell_painting) in example_data/outputs/, and then a checkpoint folder inside it. Copy the model file Cell_Painting_CNN_v1.hdf5 into the checkpoint folder (see the sketch below).
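
For example (a minimal sketch, assuming the Cell Painting CNN-1 weights have been downloaded from Zenodo into your working directory; adjust paths as needed):

```bash
# Create the experiment and checkpoint folders inside the example project
mkdir -p example_data/outputs/cell_painting/checkpoint/
# Copy the downloaded model weights into the checkpoint folder
cp Cell_Painting_CNN_v1.hdf5 example_data/outputs/cell_painting/checkpoint/
```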

Download an example configuration file and put it in example_data/inputs/config/.

Now you can start profiling the example data:

python3 deepprofiler --root=/your_path/example_data/ --config=cell_painting_cnn_profiling_example.json --exp=cell_painting --gpu=0 profile

The extracted features should be available in example_data/outputs/cell_painting/features/.
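
To quickly check the output, you can list a few of the generated files (a sketch; the exact organization of the features folder depends on your metadata):

```bash
# List the first few extracted feature files
find example_data/outputs/cell_painting/features/ -type f | head
```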

Profiling your data with the Cell Painting CNN-1

When running DeepProfiler you usually need to specify a root directory where your data is stored and a command that you want to run. For instance, to initialize your project, you can use:

python deepprofiler --root=/home/ubuntu/project --config=config.json setup

In the created directories, you can organize your input files, including images, metadata, and single-cell locations. You can also refer to the example data for the project organization and file formats.
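
A rough sketch of a typical project layout (assumed here based on the example data; consult the handbook for the authoritative structure):

```
project/
├── inputs/
│   ├── config/      # JSON configuration files
│   ├── images/      # raw images
│   ├── locations/   # single-cell location files
│   └── metadata/    # index file linking images to plates, wells and treatments
└── outputs/         # experiment folders, checkpoints and extracted features
```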

Download an example configuration file and put it in your project's inputs/config/ folder. Adjust the configuration for your project; more details about configuration files are available in the corresponding handbook chapter and the profiling section. You can also find other examples in the DeepProfilerExperiments repository.

After you have organized your project, create an experiment folder (for example, cell_painting) in project/outputs/ and then a checkpoint folder inside it. Copy the model file (Cell_Painting_CNN_v1.hdf5) into the checkpoint folder, just as for the example data above.

If your images are located in project/inputs/images/ (that is, you are not using compressed images), set implement: false in the compression section of the configuration file.

Feature extraction can then be started:

python3 deepprofiler --root=/project/ --config=cell_painting_cnn.json --exp=cell_painting --gpu=0 profile

Training your own models

If you are interested in training a model on your images, please follow the instructions in our documentation handbook.
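
As a rough sketch, training follows the same CLI pattern as profiling (the config file name and experiment name below are placeholders; see the handbook for the required configuration and preparation steps):

```bash
# Sketch only: assumes a training configuration and an experiment folder named "my_experiment"
python3 deepprofiler --root=/project/ --config=training_config.json --exp=my_experiment --gpu=0 train
```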

Happy profiling!