Servers far away are slower than servers nearby. Why is that? And how big is the effect? This calculator shows how much your speed is reduced because of distance.
Explanation:
Networks do not transfer data in a continuous stream, as many people think, but in small packets. In TCP/IP, the server sends a packet to your computer, which sends an acknowledgement back. In the simplest case only upon receipt of the acknowledgement does the server send the next packet. It's a little game of ping-pong.
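The ping-pong above can be sketched as a tiny simulation (my own illustration, not the calculator's code). Each packet costs the server one full round trip: the packet travels out, the acknowledgement travels back, and the server is idle the whole time.

```python
C_KM_PER_S = 299_792  # speed of light in a vacuum, km/s

def stop_and_wait(packets, distance_km):
    """Simulate the ping-pong: one packet per round trip."""
    rtt = 2 * distance_km / C_KM_PER_S  # packet out + ACK back, in seconds
    clock = 0.0
    delivered = []
    for packet in packets:
        clock += rtt          # server idles until the ACK returns
        delivered.append(packet)
    return delivered, clock

data = list(range(10))
delivered, elapsed = stop_and_wait(data, distance_km=1000)
print(f"10 packets over 1000 km took {elapsed * 1000:.1f} ms")
```

Even with no transmission time at all, ten packets over 1000 km already cost about 67 ms, purely from the back-and-forth.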
The speed of light is 299792 kilometers per second (in a vacuum; signals in fiber travel slower still, so this is a best case). The maximum number of ping-pongs per second is therefore 299792 divided by twice the distance between you and the server. If the server is 1000 kilometers away, that's about 149 ping-pongs per second. Every ping-pong carries one packet, so if the packet size is 1 bit the server can send you only about 149 bits per second. The speed of the network is immaterial: even a gigabit link cannot break the speed of light. The server is not sending data while it waits for the acknowledgement, and waiting means less throughput, so the speed is reduced because of the distance.
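The formula above fits in a few lines. This sketch (my own, under the one-packet-per-round-trip assumption the text uses) computes the distance-limited ceiling for any packet size:

```python
C_KM_PER_S = 299_792  # speed of light in a vacuum, km/s

def max_throughput(distance_km, packet_bits):
    """Best-case bits per second when each packet costs one round trip."""
    round_trips_per_s = C_KM_PER_S / (2 * distance_km)
    return round_trips_per_s * packet_bits

# 1000 km away, 1-bit packets: roughly 149 ping-pongs, so roughly 149 bit/s
print(max_throughput(1000, packet_bits=1))          # ~149.9
# Bigger packets raise the ceiling: 1500-byte packets over the same distance
print(max_throughput(1000, packet_bits=1500 * 8))   # ~1.8 million bit/s
```

This also shows why larger packets (and, in real TCP, larger windows of packets in flight) soften the penalty: more bits ride on each round trip.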