Title: A Homogeneous Transformer Architecture
Publication Type: CBMM Memos
Year of Publication: 2023
Authors: Gan, Y, Poggio, TA
While the Transformer architecture has made a substantial impact in the field of machine learning, it is unclear what purpose each component serves in the overall architecture: heterogeneous nonlinear circuits, such as multi-layer ReLU networks, are interleaved with layers of softmax units. We introduce here a homogeneous architecture based on Hyper Radial Basis Function (HyperBF) units. Evaluations on CIFAR10, CIFAR100, and Tiny ImageNet demonstrate performance comparable to standard vision transformers.
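As a rough illustration of the kind of unit the abstract refers to, the sketch below implements a generic HyperBF computation of the form f(x) = Σᵢ cᵢ exp(−‖W(x − tᵢ)‖²), where the centers tᵢ, the metric W, and the coefficients cᵢ are all learnable. This is a minimal NumPy sketch of the classical HyperBF formulation, not the memo's actual architecture; all names and shapes here are assumptions.

```python
import numpy as np

def hyperbf_unit(x, centers, coeffs, W):
    """Classical HyperBF unit: f(x) = sum_i c_i * exp(-||W (x - t_i)||^2).

    x       : (d,)   input vector
    centers : (k, d) learnable centers t_i (hypothetical parameters)
    coeffs  : (k,)   learnable output coefficients c_i
    W       : (d, d) learnable metric defining the weighted norm
    """
    diffs = centers - x                    # (k, d) differences x - t_i (sign-symmetric under the norm)
    proj = diffs @ W.T                     # (k, d) apply the learned metric
    sq_dists = np.sum(proj * proj, axis=1) # (k,)   weighted squared distances
    return coeffs @ np.exp(-sq_dists)      # scalar output
```

When one center coincides with the input, its Gaussian activation is exactly 1, so with a coefficient of 1 on that center and 0 elsewhere the unit outputs 1.0; this makes the unit easy to sanity-check.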
- CBMM Funded