Call for code example contributions

This is a constantly-updated list of code examples that we're currently interested in.

If you're not sure whether your idea would make a good code example, please ask us first!


Structured data examples featuring Keras Preprocessing Layers (KPL)

For example: feature hashing, feature indexing with handling of missing values, mixing numerical, categorical, and text features, feature engineering with KPL, etc. A minimal sketch of the kind of workflow we have in mind is shown below.
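As a rough illustration only (the feature names, toy data, and layer choices here are assumptions, not a prescribed recipe), mixing a numerical and a categorical feature with preprocessing layers might look like this:

```python
# Minimal sketch: combining a numerical and a categorical feature with
# Keras preprocessing layers. Data and feature names are illustrative.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy data: one numerical feature, one categorical feature.
num_data = np.array([[25.0], [32.0], [47.0]], dtype="float32")
cat_data = np.array([["red"], ["green"], ["red"]])

# Numerical feature: normalize using statistics adapted from the data.
normalizer = layers.Normalization()
normalizer.adapt(num_data)

# Categorical feature: index the vocabulary, then one-hot encode.
lookup = layers.StringLookup(output_mode="one_hot")
lookup.adapt(cat_data)

num_input = keras.Input(shape=(1,), name="age")
cat_input = keras.Input(shape=(1,), dtype="string", name="color")

# Concatenate the preprocessed features and add a small classification head.
x = layers.concatenate([normalizer(num_input), lookup(cat_input)])
x = layers.Dense(16, activation="relu")(x)
output = layers.Dense(1, activation="sigmoid")(x)

model = keras.Model([num_input, cat_input], output)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

A full example would typically load the data with `tf.data`, handle missing values, and cover hashing and text features as well.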


Transformer model for MIDI music generation

Reference TF/Keras implementation


StyleGAN2


Text-to-speech

Example TF2/Keras implementation


Learning to rank

Reference Kaggle competition


DETR: End-to-End Object Detection with Transformers


3D image segmentation


Question answering from structured knowledge base and freeform documents


Instance segmentation


EEG & MEG signal classification


Text summarization


Audio track separation


Audio style transfer


Timeseries imputation


Customer lifetime value prediction


Keras reproducibility recipes


Standalone Mixture-of-Experts (MoE) layer

MoE layers provide a flexible way to scale deep models to train on larger datasets. The aim of this example should be to show how to replace regular layers (such as Dense or Conv2D) with compatible MoE layers.
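As a starting point, here is a minimal sketch of a soft (dense-gating) MoE layer that can stand in for a Dense layer. The layer name, expert count, and dense gating are illustrative assumptions; a full example would likely use sparse top-k routing instead.

```python
# Minimal sketch: a soft (dense-gating) Mixture-of-Experts layer that can be
# dropped in where a Dense layer would normally be used. Production MoE layers
# typically use sparse top-k routing; this version mixes all experts for clarity.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


class MoEDense(layers.Layer):
    def __init__(self, units, num_experts=4, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.num_experts = num_experts
        # Each expert is an ordinary Dense layer.
        self.experts = [
            layers.Dense(units, activation="relu") for _ in range(num_experts)
        ]
        # The gating network produces a softmax weight per expert.
        self.gate = layers.Dense(num_experts, activation="softmax")

    def call(self, inputs):
        # (batch, num_experts, units): stack every expert's output.
        expert_outputs = tf.stack([expert(inputs) for expert in self.experts], axis=1)
        # (batch, num_experts, 1): per-example gating weights.
        gate_weights = tf.expand_dims(self.gate(inputs), axis=-1)
        # Weighted sum over the experts.
        return tf.reduce_sum(gate_weights * expert_outputs, axis=1)


# Usage: replace a Dense(64) hidden layer with an MoE equivalent.
model = keras.Sequential(
    [
        keras.Input(shape=(32,)),
        MoEDense(64, num_experts=4),
        layers.Dense(1),
    ]
)
model.summary()
```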

References:


Guide to report the efficiency of a Keras model

It's often important to report the efficiency of a model. But what factors should be included when reporting the efficiency of a deep learning model? The Efficiency Misnomer paper discusses this thoroughly and provides guidelines for practitioners on how to properly report model efficiency.

The objectives of this guide will include the following:

  • Which factors to consider when reporting model efficiency.
  • How to compute metrics such as FLOPs and the number of examples a model can process per second, in both training and inference mode (see the sketch below).
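As a rough illustration of the second point, here is a minimal sketch that estimates FLOPs by profiling a frozen forward pass with the TF1-style profiler and measures inference throughput with a simple timing loop. The model and input shapes are placeholders, and this FLOPs recipe is one common approximation, not the guide's prescribed method.

```python
# Minimal sketch: estimating FLOPs and inference throughput for a Keras model.
# The model, input shape, and batch size below are illustrative placeholders.
import time
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

model = keras.Sequential(
    [
        keras.Input(shape=(224, 224, 3)),
        keras.layers.Conv2D(32, 3, activation="relu"),
        keras.layers.GlobalAveragePooling2D(),
        keras.layers.Dense(10),
    ]
)

# --- FLOPs: profile a frozen graph of a single forward pass. ---
inputs = tf.TensorSpec([1, 224, 224, 3], tf.float32)
concrete_fn = tf.function(model).get_concrete_function(inputs)
frozen_fn = convert_variables_to_constants_v2(concrete_fn)
opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()
flops = tf.compat.v1.profiler.profile(graph=frozen_fn.graph, options=opts)
print(f"Approximate FLOPs per example: {flops.total_float_ops}")

# --- Throughput: examples processed per second in inference mode. ---
batch_size = 32
data = np.random.rand(batch_size * 10, 224, 224, 3).astype("float32")
model.predict(data[:batch_size], verbose=0)  # warm-up
start = time.time()
model.predict(data, batch_size=batch_size, verbose=0)
elapsed = time.time() - start
print(f"Inference throughput: {len(data) / elapsed:.1f} examples/sec")
```

The guide itself should also cover training-mode throughput, parameter counts, and the caveats around comparing these metrics across hardware, as discussed in the paper.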