33rd IEEE Conference on Signal Processing and Communications Applications, SIU 2025, İstanbul, Türkiye, 25–28 June 2025 (Full Text Paper)
Large Language Models (LLMs) struggle to maintain language consistency in low-resource languages such as Turkish when sampling at high temperature values. This study investigates how the recently introduced min-p sampling method and the established top-p (nucleus) method, both of which filter low-probability tokens, affect Turkish text generation in open-source LLMs trained predominantly on English. The effectiveness of min-p in maintaining Turkish consistency across different temperature and top-p settings is evaluated on Supreme Court decision summaries. Detailed experiments demonstrate that min-p sampling significantly improves linguistic consistency at high temperatures and allows greater creativity without compromising consistency.
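To make the contrast between the two truncation rules concrete, the sketch below (illustrative only, not code from the paper; the logit values and the `min_p`/`p` thresholds are assumptions) compares how top-p and min-p filter a toy vocabulary. Because min-p's cutoff is defined relative to the most likely token, it tightens automatically when temperature flattens the distribution, whereas a fixed top-p mass can admit many low-probability tokens.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities at a given sampling temperature."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_keep(probs, p=0.9):
    """Top-p (nucleus): keep the smallest set of highest-probability
    tokens whose cumulative mass reaches p."""
    order = sorted(range(len(probs)), key=probs.__getitem__, reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= p:
            break
    return kept

def min_p_keep(probs, min_p=0.1):
    """min-p: keep every token whose probability is at least min_p times
    that of the most likely token, so the cutoff scales with the model's
    confidence."""
    threshold = min_p * max(probs)
    return [i for i, pr in enumerate(probs) if pr >= threshold]

# Three plausible tokens plus a 20-token low-probability tail
# (a stand-in for undesired, e.g. non-Turkish, continuations).
logits = [5.0, 4.0, 3.0] + [0.0] * 20

for t in (1.0, 2.0):
    probs = softmax(logits, temperature=t)
    print(f"T={t}: top-p keeps {len(top_p_keep(probs))} tokens, "
          f"min-p keeps {len(min_p_keep(probs))} tokens")
```

At temperature 1.0 both rules keep only the three plausible tokens; at temperature 2.0 the flattened distribution pushes top-p to include much of the tail, while min-p still keeps only the three, mirroring the consistency behavior the study reports.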