You're driving a car down a two-mile track. For the first mile, you drive 30 miles per hour. How fast do you have to go for the second mile in order to average 60 miles per hour for the whole track?
This is an algebra problem about average speed, and the key fact is that average speed = (total distance)/(total time), not the arithmetic mean of the two speeds. To average 60 mph over the 2-mile track, the whole trip must take 2 miles / 60 mph = 1/30 hour = 2 minutes. But the first mile at 30 mph already took 1 mile / 30 mph = 1/30 hour = 2 minutes, which uses up the entire time budget. That leaves no time at all for the second mile, so no finite speed will do: it is impossible to average 60 mph for the whole track. The tempting shortcut of averaging the speeds, (30 + x)/2 = 60, gives x = 90 mph, but that formula only works when you spend equal times at the two speeds, not equal distances.
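As a quick check, here is a minimal Python sketch of the time-budget argument; the variable names are just illustrative, and the numbers come straight from the puzzle (2-mile track, first mile at 30 mph, target average of 60 mph).

```python
# Sanity check: how much time is left for the second mile?

track_miles = 2.0
first_mile_speed = 30.0   # mph
target_average = 60.0     # mph

# Total time allowed to average 60 mph over the whole track (hours)
time_budget = track_miles / target_average    # 2/60 h = 2 minutes

# Time already spent driving the first mile at 30 mph (hours)
time_used = 1.0 / first_mile_speed            # 1/30 h = 2 minutes

time_left = time_budget - time_used
print(f"Time budget: {time_budget * 60:.1f} min")   # 2.0 min
print(f"Time used:   {time_used * 60:.1f} min")     # 2.0 min
print(f"Time left:   {time_left * 60:.1f} min")     # 0.0 min -> no finite speed works
```

Since the remaining time is zero, covering the second mile in that time would require an infinite speed, which is why the answer is "impossible" rather than 90 mph.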