From the landmarks included in downtown driving navigation to improved skin tone calibration in photography, Google is taking steps to make its products more inclusive. It is also a way for the web and mobile giant to make people forget past internal failings when it comes to ethics and diversity.
Since Google is Google, all of this requires further refinement (and, the company promises, greater confidentiality) of its artificial intelligence algorithms. For the California giant, AI is the only way to extract meaningful information from the tons of data collected from the 3 billion Android devices in circulation across the planet.
More personalized AI and more inclusive products were the strong themes of the keynote at the 2021 edition of the annual Google I/O conference, hosted this afternoon by Google CEO Sundar Pichai, in person and remotely.
For example, Google Maps navigation can flag roads and places prone to traffic jams, slowdowns and more serious collisions or accidents. Add data on elevation changes and the behavior of other motorists, and Google can suggest routes that reduce fuel consumption.
On the inclusion front, Google Maps will display more accurate data on pedestrian crossings and accessibility for people with disabilities in the coming weeks, making it easier for people with reduced mobility to plan their trips.
For workers, Google's office tools will soon suggest more inclusive wording in documents, including the notes generated automatically during team meetings. Because the "right man for the job" is not always ... a man, if you know what Google means.
"There's still a lot to be done in terms of inclusion, but we want to make technology work for pretty much everyone," Sameer Samat, Google's head of new products, said on stage.
At Google, artificial intelligence is used to bridge the various ways humans interact with machines: by voice, in writing, or through photos and videos. After all, understanding what people mean when they query its search engine has been the key to Google's success since the turn of the millennium.
Google has therefore worked to better understand the "natural language" of its users and the content shared on its various platforms, such as YouTube and Google Photos. A user will soon be able to ask an Android device to "find the part of the video where the lion roars in front of the sun," and that moment of the YouTube video will automatically appear on screen.
The same kind of cross-search between voice, text, photos and videos will soon be possible everywhere; Google calls it multimodal search. Once Mountain View knows what people around the world are looking for on a daily basis, it can better anticipate what they want to find before they even ask.
This anticipation takes the form of suggestions displayed by default in the Google Maps app: places to visit soon and other points of interest, which are clearly at the heart of Google's features these days. Another of-the-moment technology is available directly in Google Maps, on the street and even inside shopping malls and airports, to help you find your way using your phone's screen.
To avoid crowds, users can see real-time busyness levels in the shops and public places they plan to visit. Because if the past year taught us anything, it is that beyond the real need for more inclusive technology, humans, whether able-bodied or not, are more alike than different, even when they believe otherwise.
Source: Economy – Le Devoir