A major-league pitcher can throw a baseball in excess of 44.7 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches the catcher who is 16.1 m away from the point of release?

Answer:

The baseball drops by about 0.64 meters.

Explanation:

Consider the horizontal motion of the ball.

We use the equation of motion s = ut + 0.5at²

        Initial velocity, u = 44.7 m/s

        Acceleration, a = 0 m/s² (air resistance is neglected)

        Displacement, s = 16.1 m

     Substituting:

                      s = ut + 0.5at²

                      16.1 = 44.7 × t + 0.5 × 0 × t²

                      t = 16.1 / 44.7 = 0.36 s

      The time taken to travel 16.1 m is 0.36 seconds.
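Since the horizontal acceleration is zero, this step reduces to t = s / u. A minimal Python sketch to check the arithmetic (variable names are illustrative):

```python
# Horizontal motion: with a = 0, s = ut + 0.5at² reduces to t = s / u.
u = 44.7   # horizontal release speed, m/s (from the problem)
s = 16.1   # horizontal distance to the catcher, m

t = s / u  # time of flight
print(f"t = {t:.2f} s")  # prints: t = 0.36 s
```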

Now we find how far the ball travels vertically during these 0.36 seconds.

We use the same equation of motion, s = ut + 0.5at²

        Initial velocity, u = 0 m/s

        Acceleration, a = 9.81 m/s² (acceleration due to gravity)

        Time, t = 0.36 s

     Substituting:

                      s = ut + 0.5at²

                      s = 0 × 0.36 + 0.5 × 9.81 × 0.36²

                      s = 0.64 m

      The baseball drops by about 0.64 meters.
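Putting both steps together, here is a minimal Python sketch of the whole calculation; keeping t unrounded avoids any intermediate rounding error (it still rounds to 0.64 m):

```python
# Vertical motion during the flight: the ball starts with no vertical
# velocity (u = 0), so s = ut + 0.5at² reduces to s = 0.5 * g * t².
g = 9.81          # acceleration due to gravity, m/s²
t = 16.1 / 44.7   # time of flight from the horizontal step, s

drop = 0.5 * g * t ** 2
print(f"drop = {drop:.2f} m")  # prints: drop = 0.64 m
```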