Answer:
The driver of a car travels 150 miles to reach his destination. If he travels at 60.0 mi/h for the first 100.0 miles and at 55.0 mi/h for the remaining 50.0 miles, it takes him 2.576 hours to reach his destination.
The total distance covered by the driver is 150 miles.
He travels the first 100.0 miles at 60.0 mi/h, so the time for that leg is:
[tex]\mathbf{= \dfrac{100.0 \ miles}{60.0 \ mi/h}}[/tex]
= 1.667 hours for 100 miles.
Similarly, he covers the remaining 50.0 miles at 55.0 mi/h, so the time for that leg is:
[tex]\mathbf{= \dfrac{50.0 \ miles}{55.0 \ mi/h}}[/tex]
= 0.9091 hour for 50.0 miles
∴ for the total of 150 miles, i.e. (100.0 + 50.0) miles,
the driver spent (1.667 + 0.909) hours to reach his destination
= 2.576 hours
Therefore, we can conclude that the driver spent 2.576 hours to reach his destination.
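As a quick sanity check, here is a minimal Python sketch of the same arithmetic (the variable names are my own, not from the question):

```python
# Each leg's travel time is distance / speed; the total trip time is the sum.

leg1_distance = 100.0  # miles
leg1_speed = 60.0      # mi/h
leg2_distance = 50.0   # miles
leg2_speed = 55.0      # mi/h

leg1_time = leg1_distance / leg1_speed   # 100.0 / 60.0 ≈ 1.667 hours
leg2_time = leg2_distance / leg2_speed   # 50.0 / 55.0 ≈ 0.909 hours
total_time = leg1_time + leg2_time       # ≈ 2.576 hours

print(f"Leg 1: {leg1_time:.3f} h")   # Leg 1: 1.667 h
print(f"Leg 2: {leg2_time:.3f} h")   # Leg 2: 0.909 h
print(f"Total: {total_time:.3f} h")  # Total: 2.576 h
```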
Learn more about time conversion here:
https://brainly.com/question/4081802?referrer=searchResults