Google Maps is no longer just a navigation app: the features launched over the last decade now let us explore the world from the comfort of our phones.
Features like Street View and Immersive View reshaped the application into a new-generation Google Maps, and Immersive View for routes is a game changer, too.
Announced last year, Immersive View for routes combines a massive amount of data from multiple sources to create a multi-dimensional simulation of the real world. Its purpose is simple but brilliant: to allow us to preview our routes with a realistic simulation of the real world, so when you get behind the wheel, you'll already be familiar with the route and every turn it includes.
Like the main Immersive View feature, the component aimed at previewing routes combines data from Street View, weather information, traffic data, and aerial imagery.
If you wonder how Google obtains aerial imagery, the answer comes down to this: cameras on planes. The company has built a new version of its famous camera rig, the one most of us have already spotted atop a car at a traffic light, with a smaller yet sturdy form factor ready to be installed on aircraft.
The camera rig, which originally tipped the scales at approximately 500 pounds (about 225 kilos), is now light enough to be carried around in a backpack. The aircraft hardware also uses a new design with four angled lenses that capture a parallax effect, letting Google feed the data into a 3D modeling engine. Using the processed data, Immersive View for routes can generate 3D simulations of the real world, including cars, buildings, trees, and everything else you find on the street.
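To see why offset lenses matter, here is a minimal sketch of the textbook pinhole-stereo principle behind parallax-based depth: a point that shifts between two views by a disparity proportional to its distance can be triangulated. The function name and all numeric values are illustrative assumptions, not Google's actual rig parameters or pipeline.

```python
def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Estimate distance to a point seen by two offset lenses.

    Standard pinhole-stereo relation: depth = f * B / d, where
    f is the focal length in pixels, B the lens baseline in meters,
    and d the pixel disparity between the two views.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


# A point shifted 50 px between two lenses 0.5 m apart,
# imaged with a 2000 px focal length, sits about 20 m away.
print(depth_from_disparity(2000.0, 0.5, 50.0))  # 20.0
```

Repeating this triangulation across millions of matched points is what lets a modeling engine recover the 3D shape of streets and buildings from overlapping photos.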
Additionally, Google uses the cameras installed on planes to gather data that improves other products, such as Google Earth, and other features, including navigation in Google Maps. Seen from above, essential items such as street signs become easier to spot, and Google analyzes the captured imagery to extract that information.
Ultimately, the captured data helps Google make Google Maps more accurate.
Immersive View for routes requires tremendous work and more data than we can imagine, so Google can't yet release the feature worldwide. It's available in a limited set of cities, such as Las Vegas, New York, and Seattle, but Google has already promised to bring it to more regions this year.
Google engineer Daniel Philip said in an interview that creating the 3D models for each location is a highly complex process that takes time, especially because it must accurately simulate the real world. Immersive View for routes is definitely worth the wait, though, so fingers crossed that Google announces another expansion at its I/O event this spring.