Snapchat has launched the latest version of its Lens Studio to enable developers to create more realistic Lenses. The company has been continually building resources, programs, and hardware, such as the Snap Lens Network, GHOST, and Spectacles, that it offers its partners, developers, and creators to enrich the user experience. Capabilities like Connected Lens technology, VoiceML, and hand tracking are already allowing developers to create new ways of interacting with AR.

Alongside Lens Studio, Snapchat is introducing Lens Cloud so that developers can use the two together to build a new generation of AR experiences. The major services that Lens Cloud brings include:

- Multi-User Services, which lets groups of friends interact at the same time within the same Lens.
- Location-Based Services, which anchors Lenses to places using Snap's city templates or any custom location around the world. Central London is the first City Landmarker available now, with more launching over the next year.

Snapchat is also working on Ray Tracing, a feature that will "let reflections shine from AR objects in a lifelike way." For this feature, it has joined hands with Tiffany & Co. and with Disney and Pixar, bringing a signature Tiffany piece and Buzz Lightyear's spacesuit into AR. It is also enhancing Lens Analytics with Event Insights, aiming to help creators debug issues more efficiently, and is working with AstrologyAPI and Sportradar to broaden its API Library.