On its face, Amazon’s first Re:MARS conference is all about far-out, world-changing ideas, but the company is still very much a retailer at heart. Fitting, then, that one of the first big announcements from this morning’s event is all about using artificial intelligence to help people better shop for clothes.
Amazon’s been talking about similar initiatives for a while, but StyleSnap actually looks to be coming soon via the Alexa iOS app (though the timeframe is still TBD).
Amazon’s Consumer Worldwide CEO Jeff Wilke introduced the feature today, telling the crowd, “When a customer uploads an image, we use deep learning for object detection to identify the various apparel items in the image and categorize them into classes like dresses or shirts. We then find the most similar items that are available on Amazon.”
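The second half of that pipeline, finding the most similar catalog items, is typically done by comparing embedding vectors. As a rough illustration of the idea (not Amazon’s actual system; the catalog, embeddings, and function names here are hypothetical), a nearest-neighbor lookup over apparel embeddings might look like this:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(query, catalog):
    # Return the catalog item whose embedding is closest to the query image's.
    return max(catalog, key=lambda item: cosine_similarity(query, item["embedding"]))

# Hypothetical catalog: each item carries an embedding produced by a vision model.
catalog = [
    {"name": "floral dress",  "embedding": [0.9, 0.1, 0.0]},
    {"name": "denim jacket",  "embedding": [0.1, 0.9, 0.2]},
    {"name": "plain t-shirt", "embedding": [0.2, 0.2, 0.9]},
]

# Embedding of the apparel item detected in the user's uploaded photo.
query_embedding = [0.85, 0.15, 0.05]
print(most_similar(query_embedding, catalog)["name"])  # floral dress
```

At Amazon’s scale, the brute-force `max` scan would be replaced by an approximate nearest-neighbor index, but the comparison itself works the same way.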
The feature will be accessible by tapping the camera icon in the corner of the Alexa app. Users take a photo or upload a screenshot of a look they like, and Amazon will offer up suggestions that factor in things like price, reviews and brands.
The company has a blog post detailing some of the steps taken to provide the service. It’s a series of complex operations for what is a seemingly simple task for the human brain.
“To have neural networks identify a greater number of classes, we can stack a greater number of layers on top of each other,” the company writes. “The first few layers typically learn concepts such as edges and colors, while the middle layers identify patterns such as “floral” or “denim”. After having passed through all of the layers, the algorithm can accurately identify concepts like fit and outfit style in an image.”
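Those early edge-detecting layers can be illustrated with a plain 2D convolution. This is a toy sketch, not StyleSnap’s code: a hand-picked Sobel-style kernel is applied to a tiny image with a vertical edge, standing in for the filters that a trained network’s first layers converge toward on their own.

```python
def convolve2d(image, kernel):
    # Valid-mode 2D convolution (technically cross-correlation, as in most
    # deep learning frameworks), over nested lists.
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(image[i + u][j + v] * kernel[u][v]
                    for u in range(kh) for v in range(kw))
            row.append(s)
        out.append(row)
    return out

# A 4x4 image with a sharp vertical edge: dark on the left, bright on the right.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]

# A fixed vertical-edge kernel; a trained network learns filters like this
# in its first few layers rather than having them specified by hand.
sobel_x = [
    [-1, 0, 1],
    [-2, 0, 2],
    [-1, 0, 1],
]

print(convolve2d(image, sobel_x))  # [[4, 4], [4, 4]]
```

Every output position straddles the edge, so every activation is strongly positive; on a flat patch the same kernel would return zeros. Stacking many such layers, with learned rather than fixed kernels, is what lets the network build up from edges to patterns to whole-outfit concepts.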
Amazon has already begun to implement AI for other shopping applications, including Go and Whole Foods.