Glossary
Latency
Latency refers to the time delay between initiating a request and receiving a response. In APIs, latency is typically measured in milliseconds and includes network transit time, server processing time, and any middleware overhead. Low latency is critical for real-time applications and user-facing services. Factors affecting latency include geographic distance, server load, payload size, and the complexity of the operation being performed.
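A minimal sketch of measuring latency around an operation, using Python's monotonic clock. The `measure_latency` helper and `simulated_request` function are hypothetical names for illustration; the sleep stands in for the network transit and server processing time described above.

```python
import time

def measure_latency(operation):
    """Run a callable and return (result, elapsed milliseconds)."""
    start = time.perf_counter()  # monotonic clock, suitable for intervals
    result = operation()
    elapsed_ms = (time.perf_counter() - start) * 1000
    return result, elapsed_ms

def simulated_request():
    # Stand-in for network transit + server processing + middleware overhead.
    time.sleep(0.05)  # ~50 ms of simulated delay
    return "ok"

result, ms = measure_latency(simulated_request)
print(f"response={result!r} latency={ms:.1f} ms")
```

Note that `time.perf_counter` is preferred over `time.time` here because wall-clock time can jump (e.g. NTP adjustments), while a monotonic clock only moves forward.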