TY  - CONF
A1  - Chizhov, Pavel
A1  - Arnett, Catherine
A1  - Korotkova, Elizaveta
A1  - Yamshchikov, Ivan P.
T1  - BPE Gets Picky: Efficient Vocabulary Refinement During Tokenizer Training
T2  - Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
N2  - Language models can greatly benefit from efficient tokenization. However, they still mostly utilize the classical Byte-Pair Encoding (BPE) algorithm, a simple and reliable method. BPE has been shown to cause such issues as under-trained tokens and sub-optimal compression that may affect the downstream performance. We introduce PickyBPE, a modified BPE algorithm that carries out vocabulary refinement during tokenizer training by removing merges that leave intermediate “junk” tokens. Our method improves vocabulary efficiency, eliminates under-trained tokens, and does not compromise text compression. Our experiments show that this method either improves downstream performance or does not harm it.
KW  - LLM
KW  - tokenization
Y1  - 2024
UR  - https://opus4.kobv.de/opus4-fhws/frontdoor/index/index/docId/6283
UR  - https://nbn-resolving.org/urn:nbn:de:bvb:863-opus-62831
SP  - 16587
EP  - 16604
PB  - Association for Computational Linguistics
CY  - Stroudsburg, PA, USA
ER  - 