
If the distance from an antenna on Earth to a geosynchronous communications satellite is 20,600 miles, given that there are 1.61 kilometers per mile and that radio waves travel at the speed of light (3.0 × 10^8 meters/sec), how many milliseconds does it take for a signal from the antenna to reach the satellite?


Answer:

About 110.6 milliseconds (roughly 111 ms).

Step-by-step explanation:

1. Convert the distance to kilometers: 20,600 miles × 1.61 km/mile = 33,166 km.
2. Convert to meters: 33,166 km × 1,000 m/km = 3.3166 × 10^7 m.
3. Divide distance by speed: t = (3.3166 × 10^7 m) ÷ (3.0 × 10^8 m/s) ≈ 0.1106 s.
4. Convert to milliseconds: 0.1106 s × 1,000 ms/s ≈ 110.6 ms.

So the signal takes approximately 110.6 ms (about 0.11 seconds) to reach the satellite.
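As a quick check of the arithmetic, here is a minimal Python sketch of the same calculation; the variable names are illustrative, and all constants come from the problem statement.

```python
# Travel time of a radio signal to a geosynchronous satellite.
# All constants are taken from the problem statement above.
KM_PER_MILE = 1.61        # kilometers per mile
SPEED_OF_LIGHT = 3.0e8    # meters per second

distance_miles = 20600
distance_m = distance_miles * KM_PER_MILE * 1000  # miles -> km -> meters

time_s = distance_m / SPEED_OF_LIGHT  # t = d / c
time_ms = time_s * 1000               # seconds -> milliseconds

print(f"{time_ms:.1f} ms")  # prints 110.6 ms
```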