Augmented reality company Augmodo wins FMI TechPitch Competition

Supermarket News sat down with CEO Ross Finman to talk about the win, as well as the future of augmented reality in grocery

Augmodo, a tech company with a goal of augmenting the retail workforce, has won the FMItech Pitch Competition at the third annual FMI Midwinter Executive Conference, recently held in Marco Island, Fla. 

The three finalists, including meal planning tool Grocery Shopii and carbon-neutral grocery marketplace Greenchoice, presented their food retail tech start-up solutions at FMI Midwinter, with event attendees voting on the winner. 

The pitch competition was open to any technology solution provider with less than $5 million in funding. Augmodo's win was announced live on FMI's main stage, and the company also pitched to the full Midwinter audience from the keynote stage.

Augmodo’s concept involves suiting up retail and grocery workers with AI and wearable cameras to improve ecommerce inventory and picking.

So far, the company has commitments from a 2,000-store North American chain and a 500-plus-store international chain, and is also testing with New York gourmet grocer Happier Grocery.

Supermarket News Executive Editor Chloe Riley sat down with Finman to talk about the win, and what he sees as the future of augmented reality within retail.

The following interview has been edited and condensed.  

SUPERMARKET NEWS: So first, congrats on the win.

ROSS FINMAN: Thank you. We really appreciate the opportunity. FMI was probably one of the best conferences we went to, just because all the decision-makers were there, and I guess dangling southern Florida in the middle of January is a good carrot to get those decision-makers.

SN: It’s very hard to resist, especially for us Midwesterners.

So, tell me about Augmodo's beginnings. How has the original concept shifted from where you first started?

RF: The origins of my company were actually during the baby formula shortage about 18 months ago. At the time, I was trying to order formula online and no one actually had it, and I ended up driving 100 miles. So as I was driving around in my minivan to pick up formula, checking out a whole bunch of different stores, I was super frustrated, and that was when a lot of this stuff all clicked. And I was thinking, ‘Hey, we thought about this AR mapping of stores at Niantic’ (Niantic, where Finman previously worked, is the software development company best known for developing Pokémon Go), and I’m just like, No. 1 customer pain point: feeding your kids. So, that was what got me started in the space.

From there, we actually started with smart glasses. The initial pitch of the company was smart glasses that replaced the Zebra scanner; you could do a one-for-one cost replacement. So, we actually built out a full product on there, it was donated to some retailers, and people liked the idea of an augmented workforce… but then it just became, I’d get my foot in the door, I’d get invited out, people would love the demo, but then I’d run into, ‘What’s the integration?’ And then everyone would be like, ‘Yeah, we just updated our handheld scanners a year ago. We don’t want to touch that for another five years.’

Finally a couple people actually asked, ‘Oh, can you just wear the glasses and we keep our handheld scanners just to record the data?’ So it ended up that 80% of the value from a smart glasses solution was in the data which you can collect with the wearable cameras from associates going around the store. And then we were like, ‘You’d pay $1,000 for that?’ and they’re like, ‘Well, it’s cheaper than a robot.’ 

And that change actually happened at Groceryshop last year, where initially we went in with the smart glasses demo. We got enough feedback there that we tested it out, and at the end of the conference, 20 out of 20 people said that they’d go with the dash cam clip-on. And then, once we started pitching that, sales moved very quickly.

SN: What are you finding is your sweet spot so far in terms of retailer footprint? How many stores? Is there a point at which this doesn’t scale yet?

RF: The main technical limitation for us is just the number of people walking around a store. If the store has one to five orders per day, you don’t quite have enough data. You can kind of work it, but it’s not good enough. Generally, a dozen orders per day is probably the minimum, two dozen is very comfortable, and beyond that, we’re filtering down in terms of, what do we need from there? So, at a minimum, one to two dozen orders per day per store. Usually, that means the larger chains, not the 10-store ones, just from what I’ve seen in the market.

SN: And what do you say to retailers about the advantage of using Augmodo versus scanning robots or stationary, on-shelf cameras?

RF: Compared to robot scanners, we’re 100 times cheaper on CapEx and we get on the order of 10 times more frequent data, all while being a completely passive solution. It’s not like the associates need to be trained to do anything. With robots, they don’t require the training either, obviously, but here it’s, OK, they’re already going around the store, so put a camera on them as they go around. So CapEx is a lot cheaper, and there’s less maintenance. Say you have 10 of these dashcams in the store and two of them break down: OK, you have 80% of the same data, we’re probably already filtering that all out, and you FedEx out a couple more. So, getting set up from there is pretty straightforward.

SN: I’m curious about your forward-looking thoughts in terms of where augmented tech fits into retail in the future. When I first heard your pitch, I got excited about thinking of this from a consumer perspective.

RF: Well, that’s where a lot of my heart lies — to change the consumer shopping [experience]. Take navigating my own shopping list. My wife will give me: ‘Hey, pick up black lentils while you’re out there,’ and at least in my local store, black lentils aren’t next to the green lentils or in the bean section at all. They’re in the South Asian section. I’m just like, ‘OK,’ so I’m circling up and down the bean aisle texting around, and finally asking three different associates, and they’re like, ‘Oh yeah, that’s in Indian food.’

And you’re just like, ‘Really?’ So even just navigating the store, because for me, I try and get out of a Costco in 20 minutes or under, because I’m very efficiency oriented. So therefore I view that when Meta, Google, Apple smart glasses come out, [a question is]: Would you keep those glasses on when you walk into a store? 

If brands can start to influence people at the point that they’re making a decision in-store, it’s a lot easier for [consumers] to opt in. Shopping is going to be one of the key use cases of smart glasses longer term, and even one of the launch use cases. But the thing is, that’s a nice vision, and it’s hard to execute on from a consumer standpoint, which is why I view B2B pickers optimizing their experience inside of a store, starting off with the cameras, as the first domino to get that all going. So, Amazon started with books and we’re starting with the ecommerce pickers.

SN: It’s an exciting vision of the world of in-store shopping.

RF: Well, all the tech is coming together in there… retail is going to be one of the most disrupted markets from the coming wave of new technology. You see all this virtual reality, augmented reality, and smart glasses work coming through. I view shopping as a major use case, along with a lot of these markets the tech world traditionally calls ‘unsexy,’ which, coming from Idaho, I take a little bit of offense to, but I think you get the gist. I view that shopping is going to be majorly changed as a result of it.
