Finding the dress immortalized in a magazine, but in a different colour. Strolling through Venice while letting yourself be guided by augmented reality. Discovering the name of the plant in your pot so you can give it the most appropriate care. Google has just announced some of its new features based on artificial intelligence. Features and applications such as Google Lens, Immersive View and Live View make the way we interact with the information around us ever more natural and intuitive, at any time. How many times have we typed keywords into Google's search field to find a certain piece of information or object, without finding exactly what we wanted? Reluctantly, we discovered that words are not always the easiest way to express what we want.
Google's multisearch changes the perspective. With the Google Lens camera, it is enough to capture an image of whatever most closely resembles what we are looking for: say, an armchair in the same style as a friend's. But we don't like yellow, and beige suits our home better. So we can add the colour we want to the results proposed by Google Lens and see only that variant. The search is over in a matter of seconds; the procedure takes longer to describe than to perform. That is why, thanks to multisearch, we can now interact with Google in a much more natural and intuitive way.
Google has been organizing the world's information for over 20 years. The results available today are part of a process focused on understanding the information semantically contained in different media: Google has been investing for years in understanding the semantic value of an image or a video and making it accessible to users in their searches. In short, if you can see it, you can also search for it. Here are three practical examples of the new image-based multisearch.
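Google has not published the internals of multisearch, but the idea described above (combine what the camera sees with a text refinement, then rank results against both) can be illustrated with a toy sketch. Everything here is an assumption for illustration: the hand-made three-dimensional "feature space", the catalogue items, and the averaging of the two query vectors are invented stand-ins for real image and text embeddings.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy feature space: [armchair-ness, yellow-ness, beige-ness]
# (hypothetical catalogue, not real product data)
catalogue = {
    "yellow armchair": [1.0, 1.0, 0.0],
    "beige armchair":  [1.0, 0.0, 1.0],
    "beige curtain":   [0.0, 0.0, 1.0],
}

image_query = [1.0, 1.0, 0.0]   # photo of the friend's yellow armchair
text_query  = [0.0, -1.0, 1.0]  # refinement: "beige" (and not yellow)

# Combine the two modalities by averaging their vectors
combined = [(i + t) / 2 for i, t in zip(image_query, text_query)]

# Rank the catalogue by similarity to the combined query
best = max(catalogue, key=lambda name: cosine(combined, catalogue[name]))
print(best)  # the beige armchair wins: right style, right colour
```

The text vector steers the ranking away from the photographed colour and toward the requested one, which is the intuition behind "photo plus words" queries; real systems use learned multimodal embeddings rather than hand-built vectors.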
Examples of Google multisearch with Google Lens
1. Search for a dress with the print of another garment
In this case, it is enough to frame the garment with the print you are interested in using Google Lens, press enter and wait for the result. Then add the term "dress" in the search field to receive several very similar proposals for the garment you want.
2. Identify the name of a plant to care for it
Same procedure: frame the plant with Google Lens and add a question in the multisearch field: how to take care of it, how much to water it, where to place it, and so on. Come to think of it, knowing the name right away without having to download a thousand apps is a privilege.
3. Find information about blueberries
Frame the chosen ingredient, in this case blueberries, zooming in on the detail with the smartphone, then add the word "recipe" to the search, or go even further into the specifics: nutritional qualities, how to store them, when they ripen, homemade ice cream. From now on, it is no longer necessary to know the name of what we are looking for if we have a concrete image of it.
Live View
When Google Maps becomes more immersive
Artificial intelligence and augmented reality will soon bring important new updates to Italy: Immersive View, Live View and aerial images. These are valuable tools both for drivers of electric vehicles with Google built in and for those who move on foot, by bicycle or by public transport, and they change the user experience. Soon it will be possible to plan a visit to the Uffizi Gallery in Florence or the Peggy Guggenheim Collection in Venice in a more thorough way: users will be able to virtually fly over buildings and see, for example, where the museum entrances are located. A time slider will show what the area looks like at different times of day and what the weather will be like. You will also be able to spot the busiest places. And when hunger strikes, just drop down to street level to explore the nearby restaurants, and even take a look inside to quickly get a feel for the atmosphere of a place before booking. The first cities where this will be possible are, indeed, Florence and Venice.
Live View is a different matter: it is already active in London, Los Angeles, New York, Paris, San Francisco and Tokyo, and also in Italy for pedestrian routes. It uses arrows powered by augmented reality to point you in the right direction: going the wrong way is now really difficult! In the United States, Zurich and Tokyo, Live View provides detailed directions even in indoor areas such as airports or shopping centres, to quickly and safely find toilets, waiting rooms, taxi ranks, car rentals and much more. In the coming months, Live View Indoors will arrive in more than 1,000 new airports, train stations and shopping malls in Barcelona, Berlin, Frankfurt, London, Madrid, Melbourne, Paris, Prague, Sao Paulo, Singapore, Sydney, Taipei and Tokyo. Meanwhile, drivers of electric vehicles with Google Maps built in will enjoy a more relaxed drive, because they will be shown the closest and fastest charging points. Note that iPhone users need to download the Google and Google Maps apps to use all these services.
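The "closest and fastest charging point" suggestion can be sketched as a simple ranking problem. This is a minimal conceptual example, not Google's actual routing logic: the station list, coordinates and power ratings are hypothetical, and the ranking here is just great-circle distance with charging speed as a tie-breaker (a real navigator would use road distance, availability and connector type).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometres
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical charging stations: (name, latitude, longitude, power in kW)
stations = [
    ("Station A", 43.7696, 11.2558, 50),   # near the centre of Florence
    ("Station B", 43.8000, 11.2000, 150),
    ("Station C", 45.4408, 12.3155, 22),   # Venice
]

car = (43.7700, 11.2600)  # the driver's current position (near Florence)

# Rank by distance first; among equally distant stations prefer higher kW
ranked = sorted(
    stations,
    key=lambda s: (haversine_km(car[0], car[1], s[1], s[2]), -s[3]),
)
print(ranked[0][0])  # the nearest station
```

Swapping the sort key (power first, distance second) would instead favour the fastest charger; the point is simply that "closer and faster" is a two-criterion ranking.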
Source: Vanity Fair

I’m Susan Karen, a professional writer and editor at World Stock Market. I specialize in Entertainment news, writing stories that keep readers informed on all the latest developments in the industry. With over five years of experience in creating engaging content and copywriting for various media outlets, I have grown to become an invaluable asset to any team.