models/official/nlp/modeling/layers
README.md
__init__.py
attention.py
attention_test.py
bigbird_attention.py
bigbird_attention_test.py
block_diag_feedforward.py
block_diag_feedforward_test.py
block_sparse_attention.py
block_sparse_attention_test.py
cls_head.py
cls_head_test.py
factorized_embedding.py
factorized_embedding_test.py
gated_feedforward.py
gated_feedforward_test.py
gaussian_process.py
gaussian_process_test.py
kernel_attention.py
kernel_attention_test.py
masked_lm.py
masked_lm_test.py
masked_softmax.py
masked_softmax_test.py
mat_mul_with_margin.py
mat_mul_with_margin_test.py
mixing.py
mixing_test.py
mobile_bert_layers.py
mobile_bert_layers_test.py
moe.py
moe_test.py
multi_channel_attention.py
multi_channel_attention_test.py
multi_query_attention.py
multi_query_attention_test.py
on_device_embedding.py
on_device_embedding_test.py
pack_optimization.py
pack_optimization_test.py
per_dim_scale_attention.py
per_dim_scale_attention_test.py
position_embedding.py
position_embedding_test.py
relative_attention.py
relative_attention_test.py
reuse_attention.py
reuse_attention_test.py
reuse_transformer.py
reuse_transformer_test.py
rezero_transformer.py
rezero_transformer_test.py
routing.py
routing_test.py
self_attention_mask.py
spectral_normalization.py
spectral_normalization_test.py
talking_heads_attention.py
talking_heads_attention_test.py
text_layers.py
text_layers_test.py
tn_expand_condense.py
tn_expand_condense_test.py
tn_transformer_expand_condense.py
tn_transformer_test.py
transformer.py
transformer_encoder_block.py
transformer_encoder_block_test.py
transformer_scaffold.py
transformer_scaffold_test.py
transformer_test.py
transformer_xl.py
transformer_xl_test.py
util.py


# Layers

Layers are the fundamental building blocks for NLP models. They can be used to assemble new `tf.keras` layers or models.