Please answer fast, I'll give Brainliest award! A glider begins its flight 1 1/2 miles above the ground. After 60 minutes, it is 9/10 mile above the ground. Find the change in height of the glider. If it continues to descend at this rate, how long does the entire descent last?

Answer:

The change in height is a decrease of 6/10 mile during the first hour, and the entire descent lasts 150 minutes (2 hours and 30 minutes).

Step-by-step explanation:

The glider started 1 1/2 miles above the ground; in other words, it started [tex]\frac{15}{10}[/tex] miles above the ground. After 60 minutes it is [tex]\frac{9}{10}[/tex] mile above the ground.
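
To see where the [tex]\frac{15}{10}[/tex] comes from, it is just the mixed-number conversion: [tex]1\frac{1}{2}=\frac{3}{2}=\frac{3\times 5}{2\times 5}=\frac{15}{10}[/tex] miles.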

Therefore, the change in height in one hour is: [tex]\frac{15}{10}-\frac{9}{10}= \frac{6}{10}[/tex] miles.

Now that we know the glider descends at a rate of 6/10 mile per hour, we can solve the second part of the problem using the rule of three (a proportion).

If it takes 60 minutes to descend 6/10 mile, how long will it take to descend the full 15/10 miles?

60 minutes --- 6/10 mile = 0.6 mile

x minutes --- 15/10 miles = 1.5 miles

Therefore [tex]x=\frac{60(1.5)}{0.6}=\frac{90}{0.6}=150[/tex] minutes.

Thus, the entire descent will last 150 minutes.
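
As a quick sanity check (a second route to the same number, not part of the proportion above): after the first 60 minutes, [tex]\frac{9}{10}[/tex] mile of height remains. At 0.6 mile per hour, that remaining descent takes [tex]\frac{0.9}{0.6}=1.5[/tex] hours, or 90 minutes, so the whole descent lasts [tex]60+90=150[/tex] minutes, which agrees with the result above.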