nvim-juliana
A port of Sublime Text's Mariana theme to Neovim, for devs with short attention spans.
Lua · 90 stars · MIT
4 months ago
colorscheme, editor-plugin, lua
bi-att-flow
The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process…
Python · 1,524 stars · Apache-2.0
12 months ago
bidaf, nlp, question-answering
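The entry above describes BiDAF's attention layer, which attends in both directions between a context and a query. A minimal NumPy sketch of that layer follows; it is illustrative only, not the repo's code, and it uses a plain dot-product similarity where the paper learns a trilinear one:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U):
    """Bi-directional attention between context H (T, d) and query U (J, d).

    Similarity here is a plain dot product; the paper learns a trilinear
    function alpha(h, u) = w^T [h; u; h * u] instead.
    """
    S = H @ U.T                                  # (T, J) similarity matrix
    # Context-to-query: each context word attends over the query words.
    a = softmax(S, axis=1)                       # (T, J)
    U_tilde = a @ U                              # (T, d) attended query vectors
    # Query-to-context: weight context words by their best query match.
    b = softmax(S.max(axis=1))                   # (T,)
    h_tilde = b @ H                              # (d,) attended context vector
    H_tilde = np.tile(h_tilde, (H.shape[0], 1))  # (T, d) broadcast to all steps
    # Merge as in the paper: G = [H; U~; H * U~; H * H~]
    return np.concatenate([H, U_tilde, H * U_tilde, H * H_tilde], axis=1)
```

For a context of T words and query of J words, the output `G` has shape (T, 4d) and feeds the modeling layer.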
funNLP
Chinese/English sensitive-word lists, language detection, Chinese/international phone-number region and carrier lookup, gender inference from names, phone-number extraction, ID-card-number extraction, email extraction, Chinese/Japanese personal-name databases, a Chinese abbreviation database, a character-decomposition dictionary, word sentiment values, stop words…
Python · 61,963 stars
9 months ago
hierarchical-attention-networks
TensorFlow implementation of the paper "Hierarchical Attention Networks for Document Classification".
Python · 84 stars · MIT
5 years ago
attention-mechanism, document-classification, hierarchical-attention-networks
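The core of that paper is two stacked attention-pooling steps: word vectors are pooled into sentence vectors, and sentence vectors into a document vector. A minimal NumPy sketch of the idea, with the paper's GRU encoders omitted and all names illustrative:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """HAN-style attention pooling: score each row of H (n, d) against a
    learned context vector w (d,), then take the weighted average."""
    scores = softmax(np.tanh(H) @ w)   # (n,) attention weights
    return scores @ H                  # (d,) pooled vector

def han_encode(doc, w_word, w_sent):
    """doc: list of sentences, each an (n_words, d) array of word vectors.
    Word-level attention pools each sentence; sentence-level attention
    pools the sentence vectors into one document vector."""
    sents = np.stack([attention_pool(S, w_word) for S in doc])
    return attention_pool(sents, w_sent)
```

The resulting document vector goes into a softmax classifier; the hierarchy lets the model weight informative words within sentences and informative sentences within the document.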
CrowdNav
[ICRA19] Crowd-aware Robot Navigation with Attention-based Deep Reinforcement Learning
Python · 547 stars · MIT
2 years ago
collision-avoidance, crowd-navigation, reinforcement-learning
pg_shard
ATTENTION: pg_shard is superseded by Citus, its more powerful replacement.
C · 1,058 stars · LGPL-3.0
8 years ago
transformerCPI
TransformerCPI: Improving compound–protein interaction prediction by sequence-based deep learning with self-attention mechanism and label reversal experiments
Python · 126 stars · Apache-2.0
2 years ago
compound-protein-interaction, drug-discovery, drug-target-identification
gMLP
Flax implementation of gMLP from "Pay Attention to MLPs" (https://arxiv.org/abs/…)
Python · 2 stars · MIT
3 years ago
deep-learning, flax, jax
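gMLP's distinctive piece is the Spatial Gating Unit: the channels are split in half, one half is mixed across the token dimension by a learned projection, and the result gates the other half elementwise. A minimal NumPy sketch (not the repo's Flax code; the paper's LayerNorm on the gated half is omitted, and all shapes are illustrative):

```python
import numpy as np

def spatial_gating_unit(x, W, b):
    """gMLP Spatial Gating Unit sketch.

    x: (n_tokens, d_ffn). Split channels into u and v, mix v across the
    token dimension with a learned (n_tokens, n_tokens) projection W plus
    bias b, then gate u elementwise with the result.
    """
    u, v = np.split(x, 2, axis=-1)   # each (n_tokens, d_ffn / 2)
    v = W @ v + b                    # token mixing along the sequence axis
    return u * v                     # (n_tokens, d_ffn / 2)
```

The paper initializes W near zero and b near one, so at initialization the gate is close to identity (`u * 1`) and the block behaves like a plain MLP before training shapes the token mixing.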
preact-compat
ATTENTION: The React compatibility layer for Preact has moved to the main preact repository.
JavaScript · 952 stars · MIT
2 years ago
compatibility, preact, react
hart
Hierarchical Attentive Recurrent Tracking
Python · 146 stars · GPL-3.0
6 years ago
attention-mechanism, kitti-dataset, neural-nets
UGATIT
Official TensorFlow implementation of U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation
Python · 6,171 stars · MIT
3 years ago