Number of Attention Heads vs. Number of Transformer-encoders in Computer Vision

Journal
Proceedings of the 14th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management - KDIR
ISSN
2184-3228
Type
conference paper
Date Issued
2022-10
Author(s)
Hrycej, Tomas
Bermeitinger, Bernhard
Handschuh, Siegfried  
DOI
10.5220/0011578000003335
Research Team
Data Science and Natural Language Processing
Abstract
Determining an appropriate number of attention heads on the one hand, and the number of transformer-encoders on the other, is an important choice for Computer Vision (CV) tasks using the Transformer architecture. Computing experiments confirmed the expectation that the total number of parameters has to satisfy the condition of overdetermination (i.e., the number of constraints significantly exceeding the number of parameters); only then can good generalization performance be expected. This sets the boundaries within which the number of heads and the number of transformers can be chosen. If the role of context in the images to be classified can be assumed to be small, it is favorable to use multiple transformers with a low number of heads (such as one or two). In classifying objects whose class may heavily depend on context within the image (i.e., the meaning of a patch depending on other patches), the number of heads is as important as the number of transformers.
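
The overdetermination condition from the abstract can be made concrete with a back-of-the-envelope calculation. The following sketch is not from the paper; all architecture numbers in it are hypothetical placeholders. It counts the parameters of a standard transformer-encoder stack and compares them with the number of training constraints (samples times output components); note that in standard multi-head attention the head count partitions the model dimension but does not change the projection sizes, so only the encoder count moves the parameter budget:

```python
# A minimal sketch (not the authors' code) of the overdetermination check the
# abstract describes: the number of training constraints should significantly
# exceed the number of trainable parameters.

def encoder_params(d_model: int, n_heads: int, d_ff: int) -> int:
    """Approximate parameter count of one transformer-encoder block.

    Multi-head attention: Q, K, V, and output projections (d_model x d_model
    each, plus biases); n_heads splits d_model across heads but leaves the
    projection sizes, and hence the parameter count, unchanged.
    Feed-forward: two dense layers d_model <-> d_ff with biases.
    Layer norms: 2 per block, 2 vectors of size d_model each.
    """
    attention = 4 * (d_model * d_model + d_model)
    feed_forward = d_model * d_ff + d_ff + d_ff * d_model + d_model
    layer_norms = 2 * 2 * d_model
    return attention + feed_forward + layer_norms


def overdetermination_ratio(n_samples: int, n_outputs: int,
                            n_encoders: int, d_model: int,
                            n_heads: int, d_ff: int) -> float:
    """Ratio of constraints (samples x output components) to parameters.

    A ratio well above 1 indicates the overdetermined regime in which,
    per the abstract, good generalization performance can be expected.
    """
    n_params = n_encoders * encoder_params(d_model, n_heads, d_ff)
    n_constraints = n_samples * n_outputs
    return n_constraints / n_params


if __name__ == "__main__":
    # Hypothetical setting: 50k training images, 100 classes,
    # comparing a shallow stack against a deeper one.
    for n_encoders in (2, 8):
        q = overdetermination_ratio(n_samples=50_000, n_outputs=100,
                                    n_encoders=n_encoders, d_model=256,
                                    n_heads=2, d_ff=1024)
        print(f"{n_encoders} encoders: constraints/parameters = {q:.2f}")
```

Under these made-up numbers, the 2-encoder stack lands in the overdetermined regime (ratio above 3) while the 8-encoder stack drops below 1, illustrating how the condition bounds the admissible depth for a fixed dataset.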
Language
English
HSG Classification
contribution to scientific community
HSG Profile Area
None
Refereed
Yes
Publisher
SciTePress
Start page
315
End page
321
Official URL
https://www.scitepress.org/PublicationsDetail.aspx?ID=8lZGKIPui9E=&t=1
URL
https://www.alexandria.unisg.ch/handle/20.500.14171/108192
Subject(s)
computer science
Division(s)
ICS - Institute of Co...
Contact Email Address
bernhard.bermeitinger@unisg.ch
Eprints ID
267726
File(s)
restricted
Name
115780.pdf
Size
381.52 KB
Format
Adobe PDF
Checksum (MD5)
f2d506c60b9d412da6809635a26066d9
open.access
Name
2022.09.18-NumberofAttentionHeadsvs.NumberofTransformer-EncodersinComputerVisionKDIR2022Valletta.pdf
Size
537.37 KB
Format
Adobe PDF
Checksum (MD5)
66a1b88011ed103c10eb23a68b23696f