Digamma.ai CEO Q&A Series: Interview with Jarrod Wolf, co-founder of AddStructure
1. How are you looking to fundamentally change the customer shopping experience through AddStructure?
With our technology we’re hoping to make the shopping experience much more seamless. Imagine speaking to your phone and watching the product mix on screen update in real time as you talk. You can say something like, “I’m looking for a TV, maybe between 40 and 50 inches, around $600, that has 3 HDMI ports.” And as you’re speaking, the products being shown to you are actually updating with that conversational memory.
Next, you can imagine that technology like this could also enable a much better grocery cart building experience. Building a grocery list is tedious because you’re assembling a cart of 50 or 60 products. With our technology you can have your phone in your hand and say, “I’m looking for milk. Actually, I want that to be organic milk. I want some of the yogurt. I want some sourdough bread. I want all the ingredients to make chicken enchiladas.” As you’re speaking, the entire list is just building itself. So it’s a much faster, more natural experience than the one you have today.
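To make the “conversational memory” idea concrete, here is a minimal, hypothetical sketch: each utterance fragment is parsed into constraints, the constraints are merged into a running filter state, and the catalog is re-filtered after every fragment. The regexes, catalog, and field names are all illustrative assumptions, not AddStructure’s actual system.

```python
import re

# Toy catalog; fields are illustrative, not a real product schema.
CATALOG = [
    {"name": "TV A", "size_in": 43, "price": 550, "hdmi": 3},
    {"name": "TV B", "size_in": 55, "price": 620, "hdmi": 4},
    {"name": "TV C", "size_in": 48, "price": 900, "hdmi": 2},
]

def update_filters(filters, utterance):
    """Fold constraints heard in one utterance into the running
    filter state -- the 'conversational memory'."""
    text = utterance.lower()
    if m := re.search(r"between (\d+) and (\d+) inches", text):
        filters["size"] = (int(m[1]), int(m[2]))
    if m := re.search(r"around \$(\d+)", text):
        filters["price"] = int(m[1])
    if m := re.search(r"(\d+) hdmi", text):
        filters["hdmi"] = int(m[1])
    return filters

def search(filters):
    """Return catalog items consistent with every constraint so far."""
    hits = []
    for p in CATALOG:
        if "size" in filters and not (filters["size"][0] <= p["size_in"] <= filters["size"][1]):
            continue
        if "price" in filters and p["price"] > filters["price"] * 1.2:  # loose "around"
            continue
        if "hdmi" in filters and p["hdmi"] < filters["hdmi"]:
            continue
        hits.append(p["name"])
    return hits

filters = {}
for fragment in ["I'm looking for a TV between 40 and 50 inches",
                 "around $600",
                 "that has 3 HDMI ports"]:
    update_filters(filters, fragment)
    print(search(filters))  # result set narrows as each fragment arrives
```

The point of the sketch is the state handling, not the parsing: each fragment only adds or overrides constraints, so the result set tightens progressively instead of each utterance starting a fresh search.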
2. AddStructure offers a white-labeled natural language understanding (NLU) platform that retailers and brands can use to enable conversational commerce channels. Why is this important in the retail environment now?
Historically, if I wanted to buy groceries, I’d have to go into a physical retail location. But now, if I have an Alexa or even my cellphone, I can fulfill that order anywhere. In today’s environment, with more natural ways to order, it is really important to be able to meet consumers where they are in order to stay competitive.
A lot of people have thrown around the word “omnichannel” for a while now. It’s attained buzzword status. What they’re really saying is that consumers now have the ability to shop anywhere. They can shop on Facebook. They can shop in their vehicle. They can shop on the metro. They can shop in their living room. They can use any number of devices to shop. So the real goal of the retailer is to be in all of the places a person is looking to shop. Consumers increasingly prefer to shop in whatever manner has the least friction, and for essentials like toilet paper or shampoo, that method is increasingly voice.
Second, a large portion of our technology is built around question answering and fact extraction. Let’s say I’m a consumer looking to buy a washing machine. There are hundreds of washing machines on the market, and aside from price and size, there are probably very few other things I actually care about. If you go to an Amazon product page, what you’ll find is tons and tons of information that no consumer is concerned about. Letting consumers tell a retailer what information they care about is vastly more convenient than requiring them to surf through product pages. Retailers need to make the information consumers want available so that they are meeting customers where they are and how they want to shop.
3. How did you and your team come up with the idea behind AddStructure?
AddStructure came out of my co-founder William Underwood’s graduate research project. He and his advisor at the University of Illinois Chicago, Professor Bing Liu, worked on aspect-based sentiment analysis, creating a feature extraction technique called double propagation. IAC, which at the time owned UrbanSpoon, found the project online and said, “Hey, we’d love to turn your graduate research project into an actual product for UrbanSpoon.” We instantly realized the potential value our technology could have in the field of product search and recommendation. The rest, as they say, is history.
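For readers curious about the technique mentioned above, here is a toy sketch of the double-propagation idea: starting from a small seed lexicon of opinion words, known opinion words discover new aspect terms, known aspects discover new opinion words, and the two sets grow until a fixed point. Real implementations propagate over dependency-parse relations; this sketch is a simplified stand-in that uses bare (adjective, noun) pairs, and the example data is invented.

```python
# Invented example pairs extracted from hypothetical reviews.
PAIRS = [
    ("great", "battery"),
    ("long", "battery"),
    ("long", "screen"),
    ("bright", "screen"),
]

def double_propagation(pairs, seed_opinions):
    """Alternately expand opinion words and aspect terms until
    neither set changes (the 'double propagation' fixed point)."""
    opinions, aspects = set(seed_opinions), set()
    changed = True
    while changed:
        changed = False
        for adj, noun in pairs:
            # A known opinion word marks its target noun as an aspect.
            if adj in opinions and noun not in aspects:
                aspects.add(noun)
                changed = True
            # A known aspect marks its modifier as a new opinion word.
            if noun in aspects and adj not in opinions:
                opinions.add(adj)
                changed = True
    return opinions, aspects

opinions, aspects = double_propagation(PAIRS, {"great"})
print(sorted(opinions), sorted(aspects))
```

From the single seed “great”, the loop learns “battery” as an aspect, then “long” as an opinion word, then “screen”, then “bright”; that bootstrapping from tiny seeds is what made the technique attractive for mining product features from reviews.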
4. How do you see the field of NLU changing the retail industry over the next decade? What industries or type of companies do you believe would benefit the most from advances in NLU?
I think NLU will create huge advantages in customer service. The majority of questions that come in through email and chat channels today could be answered with NLU if the proper workflow management and partner integrations are in place. Without those integrations, there will always be questions that have to go to a person.
I’ll give you an example. Let’s say you buy a package from Walmart and it hasn’t been delivered. Your tracking number from Walmart also shows that it hasn’t been delivered and is 2 days overdue. Now someone at Walmart has to pick up a phone, call FedEx, and ask, “Hey, where is the package? What’s going on?” If Walmart and FedEx had better NLU and the proper integrations, this type of query could be handled by a computer. In this scenario, Walmart would send a query to FedEx; FedEx would query its systems, run an automated diagnostic, and respond. Together they could answer far more queries than they can today. This type of sophistication between companies is not common right now.
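The retailer-to-carrier workflow described above might be sketched as follows. Every name, tracking number, and the diagnostic logic here is purely hypothetical: the retailer sends a structured query instead of making a phone call, the carrier’s system answers automatically, and a human is looped in only when the machine cannot resolve the request.

```python
from dataclasses import dataclass

@dataclass
class TrackingRecord:
    status: str        # e.g. "in_transit" or "delivered"
    days_overdue: int

# Stand-in for the carrier's internal tracking database.
CARRIER_DB = {
    "1Z999": TrackingRecord(status="in_transit", days_overdue=2),
}

def carrier_diagnostic(tracking_number):
    """Carrier-side automated answer to 'where is this package?'."""
    rec = CARRIER_DB.get(tracking_number)
    if rec is None:
        # The machine can't resolve this; hand off to a human agent.
        return {"resolved": False, "answer": "unknown tracking number, escalate to agent"}
    if rec.status == "delivered":
        return {"resolved": True, "answer": "delivered"}
    if rec.days_overdue > 0:
        return {"resolved": True,
                "answer": f"in transit, {rec.days_overdue} days overdue"}
    return {"resolved": True, "answer": "in transit, on schedule"}

def retailer_query(tracking_number):
    """Retailer side: a structured query replaces the phone call."""
    return carrier_diagnostic(tracking_number)

print(retailer_query("1Z999"))
```

The interesting part is not the lookup itself but the contract: once both sides agree on a structured query and response format, the NLU front end only has to map a customer’s free-text complaint onto that query, and escalation to a person becomes the exception rather than the default.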
I also think you’ll see a lot of stores adding product mapping technology and kiosks to answer questions that store associates traditionally handled, such as “Where are the paper towels?” or “Where is the bathroom?” This will let brick-and-mortar retailers be more capital efficient.
A lot of NLU sophistication will come in the form of product recommendation. I imagine at some point there’s going to be two-way advertising on Google Home and Alexa. A brand will send you a notification saying, “Hey Jarrod, today’s top deals are X, Y and Z.” I might then respond, “Yeah, I’d love to hear more about X,” and a conversation will start.
With over a decade of AI experience, Digamma.ai’s team are your trusted machine learning consultants, partners, and engineers.