Florian Mueller and colleagues from Australia presented an interesting paper at the CHI conference titled “Balancing Exertion Experiences.”
They had previously developed and reported on a system that lets people jog “together” though physically separated (even England to Australia!). They can talk with each other, but the sound is spatially located, so it sounds as if your running partner is to your left (or right) and ahead of or behind you. If they’re pulling ahead, it can spur you to speed up; if they’re falling behind, to slow down.
Now they’ve gone for a “better than being there” experience. Even if your running pace is different from your partner’s, the system can still have you run “together.” Instead of matching your speed in order to stay next to your partner, you balance some other metric of exertion. In their version, each person picks their own maximum target heart rate, and you have to match your partner’s percentage of their personal target in order to stay aurally next to them while jogging.
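To make the mechanism concrete, here is a minimal sketch of the idea (my own, not code from the paper; the function names and the scaling factor are made up for illustration). Each runner’s exertion is their current heart rate as a fraction of their own target, and the difference between the two fractions decides whether your partner sounds ahead of you or behind you:

```python
# Rough sketch of the exertion-balancing idea; names and numbers are
# illustrative assumptions, not the authors' implementation.

def exertion(current_hr: float, target_hr: float) -> float:
    """Fraction of the runner's own target heart rate they are currently at."""
    return current_hr / target_hr

def partner_offset(my_hr, my_target, partner_hr, partner_target, scale=20.0):
    """Map the difference in relative exertion to a virtual distance in metres:
    positive means the partner sounds ahead of you, negative means behind."""
    return (exertion(partner_hr, partner_target) - exertion(my_hr, my_target)) * scale

# Example: I'm at 140 bpm of a 170 bpm target (~82%); my partner is at
# 150 bpm of a 190 bpm target (~79%), so they sound a little behind me.
print(partner_offset(140, 170, 150, 190))  # roughly -0.7 m
```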
Pretty cool. The current prototypes use a little too much hardware for comfortable jogging, but I expect something like this will be available for iPhones sometime.