
The height of an arrow shot upward at an initial velocity of 40 meters per second can be modeled by h = 40t - 5t^2, where h is the height in meters and t is the time in seconds. Find the time it takes for the arrow to reach the ground. Can someone please explain this to me? Thanks.

Answer:

We are given the equation for the height of the arrow. If you graph it, you'll see it's a downward-opening parabola: the arrow rises to a peak and then falls back down. Another way of thinking about this problem is that you're looking for the time when the height is 0. You can see on the graph that there are two times when h = 0. The first is obviously t = 0, when the arrow hasn't left the ground yet. The second is the one we're looking for, when the arrow returns to the ground.
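If you don't have a graphing tool handy, a quick sketch of the idea is to evaluate the height model at a few times and watch it rise and fall (the function name `h` here is just for illustration):

```python
# Height model from the problem: h(t) = 40t - 5t^2
def h(t):
    return 40 * t - 5 * t ** 2

# Sample the flight second by second to see the parabola rise and fall
for t in range(0, 9):
    print(t, h(t))  # height is 0 at t=0, peaks at t=4 (80 m), 0 again at t=8
```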

To solve this, set h = 0, giving 0 = 40t - 5t^2. Factoring the right side gives 5t(8 - t) = 0. A product is zero only when one of its factors is zero, so either 5t = 0, giving t = 0 (which we already knew), or 8 - t = 0, giving t = 8. That second time is when the arrow is back on the ground, so your answer is 8 seconds.
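As a sanity check, you can also get the same two roots from the quadratic formula with a = -5, b = 40, c = 0 (a quick sketch, not part of the original solution):

```python
import math

# Solve 0 = 40t - 5t^2, written as -5t^2 + 40t + 0 = 0
a, b, c = -5, 40, 0
disc = math.sqrt(b ** 2 - 4 * a * c)  # discriminant: sqrt(1600) = 40
roots = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
print(roots)  # [0.0, 8.0]
```

Both methods agree: the arrow is on the ground at t = 0 and t = 8.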
