trellisnet
Sequence modeler
This repository implements trellis networks, a neural network architecture for sequence modeling tasks such as language modeling and sequence classification.
[ICLR'19] Trellis Networks for Sequence Modeling
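Below is a minimal, simplified sketch of the core trellis-network idea described in the paper: a causal convolution whose weights are tied across all layers, with the raw input re-injected at every layer. This is an illustrative reading of the architecture, not the repository's actual implementation; it assumes PyTorch, uses a plain ReLU in place of the paper's gated activation, and all names (`TrellisLayerSketch`, `trellisnet_sketch`) are hypothetical.

```python
# Simplified trellis-network sketch: one weight-tied causal conv layer,
# unrolled over depth, with the input injected at every level.
# Not the repository's implementation; names here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TrellisLayerSketch(nn.Module):
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        # A single kernel-size-2 causal convolution, reused at every layer
        # (weight tying across depth is the defining trellis-net property).
        self.conv = nn.Conv1d(input_dim + hidden_dim, hidden_dim, kernel_size=2)

    def forward(self, x, z):
        # x: (batch, input_dim, time)  -- raw input, injected at every layer
        # z: (batch, hidden_dim, time) -- hidden states from the layer below
        h = torch.cat([x, z], dim=1)
        h = F.pad(h, (1, 0))  # left-pad by one step so the conv stays causal
        return torch.relu(self.conv(h))  # paper uses a gated (LSTM-style) activation

def trellisnet_sketch(x, layer, depth, hidden_dim):
    # Unroll the same (weight-tied) layer `depth` times.
    z = x.new_zeros(x.size(0), hidden_dim, x.size(2))
    for _ in range(depth):
        z = layer(x, z)
    return z

# Example usage with made-up dimensions:
layer = TrellisLayerSketch(input_dim=16, hidden_dim=32)
x = torch.randn(4, 16, 100)
out = trellisnet_sketch(x, layer, depth=6, hidden_dim=32)  # -> (4, 32, 100)
```

Weight tying plus input injection is what lets the paper relate trellis networks to both temporal convolutional networks and truncated recurrent networks; the sketch above keeps only that structural skeleton.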
473 stars
22 watching
63 forks
Language: Python
last commit: over 5 years ago

Related projects:
Repository | Description | Stars |
---|---|---|
locuslab/optnet | A PyTorch module that adds differentiable optimization as a layer to neural networks | 513 |
locuslab/e2e-model-learning | Develops an approach to learning probabilistic models in stochastic optimization problems | 200 |
locuslab/convmixer | An implementation of the ConvMixer neural network architecture in Python | 1,062 |
locuslab/convmixer-cifar10 | A simple ConvMixer-based classification system for the CIFAR-10 dataset | 41 |
molcik/python-neuron | A Python library for implementing and training various neural network architectures | 40 |
tbepler/protein-sequence-embedding-iclr2019 | A framework for learning protein sequence and structure embeddings using deep learning models | 258 |
larsmans/seqlearn | A toolkit for building sequence classification models in Python | 688 |
chenxi116/pnasnet.pytorch | PyTorch implementation of PNASNet-5 architecture | 317 |
lextal/pspnet-pytorch | A PyTorch implementation of a segmentation network architecture | 585 |
juntang-zhuang/laddernet | A deep learning implementation of a multi-path network architecture for medical image segmentation | 139 |
ymcui/chinese-xlnet | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,653 |
claws-lab/jodie | A PyTorch implementation of a representation learning framework for dynamic temporal networks | 355 |
rcmalli/keras-squeezenet | An implementation of the SqueezeNet neural network model in the Keras framework | 404 |
mhlee0903/multi_channels_pinn | Investigates neural networks for drug discovery using multiple chemical descriptors | 3 |
nttcslab-nlp/doc_lm | Source files and training scripts for language models | 12 |