# External Resources

- Adyen wrote a detailed article about the interplay between TGI's main components: router and server. [LLM inference at scale with TGI (Martin Iglesias Goyanes - Adyen, 2024)](https://www.adyen.com/knowledge-hub/llm-inference-at-scale-with-tgi)