To drive 35 miles at 30 mph takes 35/30 of an hour.
To drive 35 miles at 25 mph takes 35/25 of an hour.
The time saved is (35/25) minus (35/30).
In order to add or subtract fractions, they need a common denominator.
The least common denominator of 25 and 30 is 150.
(35/25) = 210/150
(35/30) = 175/150
Now we can subtract: (210/150) - (175/150) = 35/150 of an hour.
(35/150 hour) x (60 min/hr) = (35 x 60) / (150) min = 14 minutes.
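The fraction arithmetic above can be sanity-checked with Python's `fractions` module (my own addition, not part of the original solution):

```python
from fractions import Fraction

# Time for each trip, in hours
slow = Fraction(35, 25)   # 35 miles at 25 mph
fast = Fraction(35, 30)   # 35 miles at 30 mph

diff_hours = slow - fast        # 35/150 of an hour (reduces to 7/30)
diff_minutes = diff_hours * 60  # convert hours to minutes

print(diff_hours)    # 7/30
print(diff_minutes)  # 14
```

Note that `Fraction` automatically reduces 35/150 to 7/30; the value is the same.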
==> It's beginning to look like the answer is NOT 40 minutes.
Let's do it again, with a less-elegant but more-intuitive approach:
Time to drive 35 miles at 30 mph = 35/30 = 1-1/6 hours = 70 minutes.
Time to drive 35 miles at 25 mph = 35/25 = 1.4 hours = 84 minutes.
Time saved by driving at 30 mph instead of 25 mph = (84 - 70) = 14 minutes.