
The height of an arrow shot upward at an initial velocity of 40 meters per second can be modeled by h = 40t - 5t^2, where h is the height in meters and t is the time in seconds. Find the time it takes for the arrow to reach the ground. Can someone please explain this to me? Thanks

Answer:

We are given the equation for the height of the arrow. If you graph it, you see that it's a downward-opening parabola: the arrow rises, peaks, and then falls back down. Another way of thinking about the problem is that you're looking for the times when the height is 0. You can see on the graph that there are two times when h = 0. The first is obviously t = 0, when the arrow hasn't left the ground yet. The second is the one we're looking for: when the arrow comes back down to the ground.

To solve this, set h = 0, so 0 = 40t - 5t^2. Factoring gives 5t(8 - t) = 0. By the zero-product property, either 5t = 0, which gives t = 0 (the launch, which we already knew), or 8 - t = 0, which gives t = 8. That second root is when the arrow is back on the ground, so the answer is 8 seconds.
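Written out step by step, the algebra is:

0 = 40t - 5t^2
0 = 5t(8 - t)
5t = 0      ->  t = 0   (launch)
8 - t = 0   ->  t = 8   (landing)

As a quick check, the vertex (peak) of the parabola is at t = -b/(2a) = -40/(2 * -5) = 4 seconds, and since the flight up mirrors the flight down, the arrow lands at twice that: 2 * 4 = 8 seconds.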
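If you'd like to verify the roots numerically, here's a minimal sketch (assuming Python with NumPy installed; numpy.roots takes the polynomial's coefficients in descending order of power):

import numpy as np

# h(t) = -5t^2 + 40t + 0, coefficients listed from t^2 down to the constant term
roots = np.roots([-5, 40, 0])
print(roots)  # the two roots: t = 0 (launch) and t = 8 (landing)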
