# Average Time Calculator

Calculate the average time or pace given two or more time durations using the calculator below.

## How to Calculate Average Time

Knowing how to calculate the average time is essential if you’re trying to determine your pace for a marathon, the amount of time you spend online, or the average time duration for any other activity.

The average time is a single time that represents the normal, usual, or typical time in a set of several durations. The term “average” refers to a measure of the center of a set of numbers or data.

Given a set of times, you can find the average in a few easy steps.

### Step One: Count the Times

The first step to calculating the average time is to find the total number of times in the data. You can do this by simply counting the times you’re working with.

### Step Two: Sum the Times

The next step is to calculate the sum of all of the times in the data. To add multiple times, you simply sum up the hours, minutes, and seconds separately.

Start by adding up the seconds in each time duration.

Next, add up the minutes in each time duration, and then multiply the total minutes by 60 to convert the minutes to seconds.

Then, add up the hours in each time duration and multiply the total hours by 3,600 to convert the hours to seconds.

Finally, add all of these seconds values together to find the total number of seconds.
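The three component sums above can be sketched in a few lines of Python; the tuple layout and the sample durations are illustrative, not part of the calculator:

```python
# Sum a set of durations by converting everything to seconds.
# Each duration is an (hours, minutes, seconds) tuple; the sample
# values below are made up for illustration.
durations = [(0, 7, 15), (0, 7, 45), (0, 8, 10)]

seconds_part = sum(s for _, _, s in durations)        # add the seconds
minutes_part = sum(m for _, m, _ in durations) * 60   # minutes -> seconds
hours_part = sum(h for h, _, _ in durations) * 3600   # hours -> seconds

total_seconds = seconds_part + minutes_part + hours_part
print(total_seconds)  # 70 + 1,320 + 0 = 1390
```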

### Step Three: Find the Average Time

To find the average time, divide the total time in seconds found in step two by the number of times in the data found in step one. The result is the average time in seconds.

You can use our seconds to time converter to express the average time in seconds in any format you need.
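As a quick sketch, `divmod` can break a seconds value back into hours, minutes, and seconds (the helper name here is my own, not the converter's):

```python
def seconds_to_hms(total_seconds):
    """Split a duration in whole seconds into (hours, minutes, seconds)."""
    minutes, seconds = divmod(total_seconds, 60)   # carry seconds into minutes
    hours, minutes = divmod(minutes, 60)           # carry minutes into hours
    return hours, minutes, seconds

print(seconds_to_hms(3725))  # (1, 2, 5): 3,725 seconds is 1 h 2 min 5 s
```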

**For example,** let’s calculate the average mile time for a runner with the following mile times: 07:15, 07:45, and 08:10.

Let’s start by counting the number of times in the data. There are three times, so the count is *three*.

Next, add the seconds of each time together:

15 + 45 + 10 = 70 seconds

Then, add the minutes of each time together, and then multiply the result by 60 to find the sum in seconds.

(7 + 7 + 8) × 60 = 1,320 seconds

Now, add the seconds from the previous steps together.

70 + 1,320 = 1,390 seconds

Finally, divide the total time in seconds by the count of *three* mile times in the data.

1,390 seconds ÷ 3 miles = 463.3 seconds/mile

So, the average time per mile in this example is 463.3 seconds, or 7 minutes and 43.3 seconds per mile.
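The whole worked example can be reproduced in a short Python sketch; the `MM:SS` string format for the input times is an assumption made for illustration:

```python
mile_times = ["07:15", "07:45", "08:10"]  # MM:SS strings

# Step one and two: count the times and sum them in seconds.
total = 0
for t in mile_times:
    minutes, seconds = t.split(":")
    total += int(minutes) * 60 + int(seconds)

# Step three: divide the total seconds by the count of times.
average = total / len(mile_times)
avg_min, avg_sec = divmod(average, 60)
print(f"{average:.1f} seconds/mile = {int(avg_min)} min {avg_sec:.1f} s per mile")
# prints: 463.3 seconds/mile = 7 min 43.3 s per mile
```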