Microsoft Flight Simulator 2020 is a huge technical achievement. It lets you fly across the globe, features over 37,000 airports, and uses Bing Maps as its primary imagery source.
It marks a new level of visual fidelity for simulator software, with satellite-derived landscapes brought to life by incredible weather rendering. The project may never have achieved this scope without HoloLens, though.
Microsoft Flight Simulator 2020’s initial inspiration was HoloTour, a tech demo of sorts for HoloLens, Microsoft’s mixed reality headset. HoloTour recreated iconic places like Rome and Machu Picchu in 3D, with enough detail to trick the mind into believing it was really there.
A fitness platform should be the next avenue for this kind of world-mapping technology. Several companies already offer the first steps towards this.
iFit lets you create routes on real roads, using Google Street View to show a slideshow of your trip as you run or cycle. Zwift has recreated places like London, Richmond, Virginia and France’s Mont Ventoux in stylised video game form. Rouvy places cyclist avatars onto “geo-sync’d” videos of popular routes.
The techniques used in Microsoft Flight Simulator could, and should, be used to level up this kind of immersive exercise. To see how, it helps to look at how the simulator’s world is made.
How Microsoft Flight Simulator Works
Four sets of techniques are used to map Microsoft Flight Simulator’s version of Earth. Satellite imagery from Bing Maps forms the base. Paired with terrain elevation data, that alone produces a good-looking environment from 10,000 feet.
Hundreds of key cities are also mapped with intensive photogrammetry. The buildings of New York City, for example, are extruded by analysing photos. The result is incredible-looking skylines as you fly by, though it tends to look rough down at street level.
Other buildings and cities are populated using procedural generation. The game uses OpenStreetMap data to determine the characteristics of buildings, filling in towns with suitable models. This offers a clean 3D-modelled environment at any elevation, aside from the odd glitch like a 212-floor monolithic structure looming over Melbourne’s suburbs.
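The procedural-generation step can be illustrated with a toy sketch: take a building footprint and a floor count (the kind of data OpenStreetMap’s `building:levels` tag provides) and extrude the flat polygon into a 3D block. This is a simplified illustration, not Asobo’s actual pipeline; the function names and the metres-per-floor figure are my own assumptions.

```python
# Toy sketch of procedural building extrusion from map data.
# Not the game's real pipeline: the 3m-per-floor figure and the
# function name here are illustrative assumptions.

FLOOR_HEIGHT_M = 3.0  # assumed average storey height

def extrude_footprint(footprint, levels):
    """Turn a 2D footprint polygon into the vertices of a 3D block.

    footprint: list of (x, y) corner coordinates in metres
    levels:    floor count, e.g. from OSM's building:levels tag
    """
    height = levels * FLOOR_HEIGHT_M
    base = [(x, y, 0.0) for x, y in footprint]
    roof = [(x, y, height) for x, y in footprint]
    return base + roof

# A two-storey house on a 10m x 8m plot:
block = extrude_footprint([(0, 0), (10, 0), (10, 8), (0, 8)], levels=2)
```

A single typo in the source data, say 212 floors instead of 2, is all it takes to produce a 600-metre monolith of the kind spotted over Melbourne.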
Other parts of the world, most notably airports, were filled in manually by developer Asobo Studio. Combine these techniques, add some of the most realistic clouds ever rendered, and you end up with Microsoft Flight Simulator.
The assets are so vast that much of the image data has to be streamed from servers, even after a 100GB installation.
Making this into a fitness platform
The very same approach to world building would not quite work for a fitness platform designed to let you run and cycle through a virtual version of the real world. Microsoft Flight Simulator is predominantly designed for the view from the sky, not six feet from the ground.
However, a shift of perspective is all that is required.
Zoom right into Google Maps and Apple Maps and you will see the level of 3D modelling already in place in today’s mapping software. London, New York and San Francisco, for example, already look like the 3D maps from a video game.
Zoom in further in Google Maps and the view switches to Street View, the key resource for street-level environmental mapping at the scale discussed here. Apple and Google have very high resolution images covering millions of miles of roads and pedestrianised areas. Microsoft’s Bing alternative, Streetside, is much weaker in this respect.
Apple is already ahead of Google in creating a seamless virtual version of the real world you can “walk” through, too. It introduced Look Around in iOS 13, which combines 3D modelling with flat image data to smooth out street-level transitions.
It takes a curated approach to Look Around, adding it to Apple Maps city by city. Fourteen locations currently have it, including:
- Las Vegas
- San Francisco Bay Area
- Los Angeles
- New York
- Nagoya
- Osaka
It turns the still images captured by Apple Maps’s survey vehicles into convincing motion. Using Look Around as its framework, Apple could make the most visually advanced virtual running and cycling platform in the world.
What next? Apply machine learning to remove the cars and people captured by Apple’s cameras, and to improve 3D models and building textures. Add virtual pedestrians and avatars representing other people on the platform. You’d have a fitness platform as vibrant as Zwift, but far more realistic.
Route-making could either be a free-for-all, or tied to smaller areas in order to make the world feel more densely populated. A million kilometres of routes or a thousand are both valid choices.
This kind of fitness “game” would, of course, also be a fairly intensive application to run, just like Microsoft Flight Simulator. But what better excuse for a solid subscription fee than having it run right off a server-based service similar to Microsoft xCloud, PS Now or Google Stadia?
Who could make it?
Apple suddenly announcing a smart bike with “Apple World Workouts” may seem a pipe dream. So who else might develop this kind of software?
iFit could make this project a reality. It already lets you create routes based around Street View slideshow images. A partnership with Google is already in place.
Its parent company ICON Health and Fitness also owns NordicTrack and Pro-Form, which produce consumer-grade exercise bikes and treadmills with iFit integration. All of them support automatic resistance, speed or incline changes over Bluetooth. Those with lower-end treadmills and exercise bikes could still feel terrain changes, even if the software itself was displayed on an iPad, Android tablet or TV.
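Bluetooth control of this kind typically goes through the standard Fitness Machine Service (FTMS). As a hedged sketch, here is how a fitness app might encode incline and resistance commands for the FTMS Fitness Machine Control Point. The opcodes come from the public Bluetooth FTMS specification as I understand it; the function names are my own, and a real client would also negotiate control first and handle response indications.

```python
import struct

# Sketch of FTMS (Bluetooth Fitness Machine Service) control packets.
# Opcodes are from the public FTMS spec; helper names are mine, and a
# real client must first send REQUEST_CONTROL and handle responses.

REQUEST_CONTROL = 0x00
SET_TARGET_INCLINATION = 0x03   # parameter: sint16, 0.1 % steps
SET_TARGET_RESISTANCE = 0x04    # parameter: uint8, 0.1 unitless steps

def request_control() -> bytes:
    return struct.pack("<B", REQUEST_CONTROL)

def set_target_inclination(percent: float) -> bytes:
    return struct.pack("<Bh", SET_TARGET_INCLINATION, round(percent * 10))

def set_target_resistance(level: float) -> bytes:
    return struct.pack("<BB", SET_TARGET_RESISTANCE, round(level * 10))

# Climbing a 1.5% virtual hill:
pkt = set_target_inclination(1.5)   # b'\x03\x0f\x00'
```

Because the protocol is an open standard, route software on a tablet could drive any compliant bike or treadmill, which is exactly how platforms like Zwift already talk to third-party trainers.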
Whether iFit would be interested in such a development-intensive project is another matter.
“We’re a privately held company. We’ve always made sure that we want to remain profitable. We always take, probably, a more calculated approach in how we grow our business,” iFit President Mark Watterson told me, when asked recently about the impact rival Peloton has made with its major spending strategy.
One other company, VZfit (previously known as VirZoom), is already doing something similar to what is laid out here.
It connects a virtual reality headset to either a smart bike or a cadence sensor, letting your pedalling on a normal exercise bike drive the on-screen action. It offers a handful of basic games, plus workouts based on Google Street View data.
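Cadence sensors of this kind generally implement the standard Bluetooth Cycling Speed and Cadence (CSC) profile, which reports a cumulative crank-revolution count alongside a timestamp in 1/1024-second ticks. A minimal sketch of turning two consecutive readings into the RPM figure that drives an avatar; the field behaviour follows the CSC spec, while the function and variable names are my own.

```python
# Sketch of cadence calculation from Bluetooth CSC sensor readings.
# CSC reports cumulative crank revolutions and a 1/1024-second event
# timestamp; both are 16-bit counters that roll over, so deltas must
# be taken modulo 2**16. Names here are illustrative assumptions.

TICKS_PER_SECOND = 1024

def cadence_rpm(prev_revs, prev_ticks, revs, ticks):
    """Crank RPM from two consecutive (revolutions, event-time) readings."""
    d_revs = (revs - prev_revs) % 0x10000
    d_ticks = (ticks - prev_ticks) % 0x10000
    if d_ticks == 0:
        return 0.0  # no new crank event since the last reading
    return d_revs * 60.0 * TICKS_PER_SECOND / d_ticks

# Three crank turns in two seconds of sensor time -> 90 RPM.
rpm = cadence_rpm(100, 0, 103, 2 * TICKS_PER_SECOND)
```

The platform then maps that RPM onto virtual speed, which is why even a cheap sensor strapped to an old exercise bike is enough hardware to take part.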
Those Street View workouts offer much smoother transitions than iFit’s, but are still based on a rudimentary harvesting of the image data, complete with awkward image disruptions as one source picture careens into the next.
VR headsets also have no place in remotely serious workout hardware right now. You’ll get far too sweaty with thick foam padding clamped to your face by an elasticated headband.
So where does that leave us? Short of a start-up appearing with many millions in VC funding just for this very idea, Apple and Google are the dream developers of this fantasy project. They control the image and mapping data required to make it happen. They have enough money to do the job properly.
Another fitness angle to their portfolios would not do either brand any harm. The platform could be sold as a subscription service with a fee to match Netflix’s, without spending billions on film rights and development studios.
They would not even need to make their own high-end hardware for the purpose. Existing smart bikes, turbo trainers, smart treadmills and footpods could do the job, as they do for Zwift.
Petabytes of data sit in Google’s and Apple’s mapping platforms, and their potential use for enriching newly-popular home workouts has barely begun to be explored.