Trivia Cafe

A driver embarks on a 60-mile drive. He drives half the time at 60 miles per hour and half the time at 30 miles per hour. How much time will this trip take?


mathematics

This puzzle is a clever twist on a classic brain teaser about average speeds. To find the total time, we can let the unknown total duration of the trip be 'T'. The car travels for half of this time (T/2) at 60 mph and the other half (T/2) at 30 mph. Since distance equals speed multiplied by time, we can set up the equation: (60 mph * T/2) + (30 mph * T/2) = 60 miles. This simplifies to 30T + 15T = 60, which means 45T = 60. Solving for T gives us 4/3 hours, or 80 minutes.
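The algebra above can be checked with a few lines of Python, using exact fractions so no rounding creeps in:

```python
from fractions import Fraction

v_fast, v_slow = 60, 30   # mph
distance = 60             # miles

# v_fast*(T/2) + v_slow*(T/2) = distance  =>  T = distance / ((v_fast + v_slow)/2)
T = Fraction(distance) / Fraction(v_fast + v_slow, 2)
print(T)        # 4/3 hours
print(T * 60)   # 80 minutes
```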

What makes this problem interesting is that a common shortcut works here, but often fails in similar-sounding riddles. Many people intuitively average the two speeds, (60 + 30) / 2 = 45 mph, and then calculate the time: 60 miles / 45 mph = 4/3 hours, or 80 minutes. This shortcut is correct in this case precisely because the *time* was split equally. When you spend an equal amount of time at different speeds, your average speed is the simple arithmetic mean.
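The identity behind the shortcut is easy to verify numerically: when each speed is held for the same length of time, that length cancels out of "total distance over total time", leaving the arithmetic mean. A quick Python check (the helper name is just for illustration):

```python
from fractions import Fraction

def avg_speed_equal_time(v1, v2, t_half):
    """True average speed when each speed is held for t_half hours."""
    total_distance = v1 * t_half + v2 * t_half
    total_time = 2 * t_half
    return total_distance / total_time

# t_half cancels out, so the result is always (v1 + v2) / 2 = 45 mph:
print(avg_speed_equal_time(60, 30, Fraction(2, 3)))  # 45
print(avg_speed_equal_time(60, 30, Fraction(5)))     # 45
```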

The problem becomes much more difficult if the *distance* is split instead. For example, if the driver traveled 30 miles at 30 mph (taking 60 minutes) and then 30 miles at 60 mph (taking 30 minutes), the total 60-mile journey would take 90 minutes. In that scenario, more time is spent at the slower speed, pulling the true average speed down to 40 mph. The key is always to check whether the time or the distance is being divided.
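The distance-split version can be sketched the same way; it reproduces the 90-minute total and the 40 mph true average:

```python
from fractions import Fraction

distance_leg = 30          # miles per leg
v_slow, v_fast = 30, 60    # mph

# Time for each 30-mile leg: distance / speed.
t_slow = Fraction(distance_leg, v_slow)   # 1 hour   (60 minutes)
t_fast = Fraction(distance_leg, v_fast)   # 1/2 hour (30 minutes)
total_time = t_slow + t_fast

print(total_time * 60)                  # 90 minutes
print((2 * distance_leg) / total_time)  # 40 mph (the harmonic mean of 30 and 60)
```

Note that 40 mph is the harmonic mean of the two speeds, which is what governs equal-distance trips, whereas the arithmetic mean governs equal-time trips.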