
If the distance from an antenna on Earth to a geosynchronous communications satellite is 20,600 miles, given that there are 1.61 kilometers per mile and radio waves travel at the speed of light (3.0 × 10^8 meters/sec), how many milliseconds does it take for a signal from the antenna to reach the satellite?

Answer:

About 110 milliseconds (roughly 0.11 s).

Step-by-step explanation:

1. Convert the distance to kilometers: 20,600 miles × 1.61 km/mile = 33,166 km.
2. Convert to meters: 33,166 km × 1000 m/km = 3.3166 × 10^7 m.
3. Divide distance by speed: t = (3.3166 × 10^7 m) / (3.0 × 10^8 m/s) ≈ 0.1106 s.
4. Convert to milliseconds: 0.1106 s × 1000 ms/s ≈ 110.6 ms, or about 110 ms.
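
If you want to double-check the arithmetic, here is a minimal Python sketch of the same conversion chain. The variable names (MILES_TO_KM, SPEED_OF_LIGHT_M_S, and so on) are illustrative choices, not part of the original problem.

# Sketch of the one-way signal-delay calculation; names are illustrative.
MILES_TO_KM = 1.61            # kilometers per mile (given in the problem)
SPEED_OF_LIGHT_M_S = 3.0e8    # speed of light in meters per second (given)

distance_miles = 20600                               # antenna-to-satellite distance
distance_m = distance_miles * MILES_TO_KM * 1000     # miles -> km -> meters

delay_s = distance_m / SPEED_OF_LIGHT_M_S            # time = distance / speed
delay_ms = delay_s * 1000                            # seconds -> milliseconds

print(f"One-way signal delay: {delay_ms:.1f} ms")    # prints ~110.6 ms

Running this prints about 110.6 ms, matching the hand calculation above; with the two significant figures of the given speed of light, that rounds to roughly 110 ms.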