The Candid Voice in Retail Technology: Objective Insights, Pragmatic Advice

The Internet Of Things, Retail, And Analytics: SAS Analyst Days 2016

I got my start in the software world through supply chain software, specifically supply chain execution management, and I quickly found myself at ground zero of Walmart’s RFID mandate. As the entire technology industry groped around, trying to figure out how to really implement RFID in the supply chain, it soon became clear that there was a huge gap between the constant pinging signals coming from RFID tags and the execution systems that ultimately needed to capture the meaning of those signals.

When people talk about IoT today, RFID is implied, but in some ways that technology is only a very early version of what IoT really means. The data packet sent by an RFID tag is tiny, as small as the industry could make it. As IoT expands the types of sensors and their uses, the amount of data that a sensor can capture will increase (think temperature or other environmental sensors, or marketing messages that need to be passed along to a shopper’s mobile phone), and that will have a big impact on the size of the data packet sent over the network. Add an exponentially growing number of sensors and internet-connected devices, and to say that the impact on the network will be huge or gigantic or mind-boggling is a vast understatement.

So you have this gap between the signals coming from sensors and the execution systems that need to translate those signals into actual transactions, and you have more and different kinds of sensors that can send more complex signals, which require an even greater level of interpretation to make sense of them.

That sounds like a problem for analytics. If a kernel of analytics sat on the sensor or device, or at least very near it at the local level, like in a reader or router, then the device itself could decide whether a signal is actually worth putting on the network for some other layer of the architecture to deal with. Instead of mind-boggling numbers of messages flying around, crushing the corporate network, only the few key, important messages would ever leave the local level and make it to some central repository, whether in the cloud or within the enterprise’s corporate data center.
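To make that idea concrete, here is a minimal sketch, in Python, of the kind of gatekeeper logic that could live in a reader or router. The names (SensorReading, EdgeFilter) and the simple change threshold are my own illustrative assumptions, not any vendor’s actual API; a real edge-analytics kernel would be far more sophisticated.

```python
# A minimal sketch of edge-level filtering, assuming a hypothetical reader
# that sees a stream of raw sensor readings and decides which ones deserve
# to leave the local network. All names here are illustrative.

from dataclasses import dataclass, field


@dataclass
class SensorReading:
    sensor_id: str
    value: float  # e.g., a temperature in degrees Fahrenheit


@dataclass
class EdgeFilter:
    """Remembers the last value per sensor and forwards only meaningful changes."""
    min_change: float                      # smallest change worth reporting
    last_seen: dict = field(default_factory=dict)

    def should_forward(self, reading: SensorReading) -> bool:
        previous = self.last_seen.get(reading.sensor_id)
        self.last_seen[reading.sensor_id] = reading.value
        if previous is None:
            return True                    # first sighting is always news
        # Suppress the endless "32 degrees... 32 degrees..." stream; only a
        # change beyond the threshold is worth the network's attention.
        return abs(reading.value - previous) >= self.min_change


# Example: only the first reading and the jump to 35 would be forwarded.
edge = EdgeFilter(min_change=1.0)
for r in [SensorReading("cooler-01", 32.0), SensorReading("cooler-01", 32.0),
          SensorReading("cooler-01", 35.0)]:
    if edge.should_forward(r):
        print(f"forwarding {r.sensor_id}: {r.value} F")
```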

And once those messages make it to the mother ship, wherever it may reside, there is still a lot of work to be done to interpret each message and act on the information it brings. I’ve found, in all my years of dealing with RFID, that people tend to assume the messages coming from sensors are much smarter than they actually are.

The reality is more like this. At the sensor level: “I’m here!” or “32 degrees Fahrenheit”, basically over and over again until something changes. If nothing is changing, or no boundary condition is hit, there is absolutely no reason to send these messages out onto the network. No one cares.

It is only when a change is detected that the message becomes important, whether that message is “I’m not here any more!” or “35 degrees Fahrenheit”. If that triggers a boundary condition, then the message sent on the network might be “there is one less here now” or “this unit is getting hot”. But that’s not enough to make a decision or take an action.
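As a sketch of what that boundary-condition step could look like, the hypothetical function below turns a raw temperature reading into a semantic message only when a configured threshold is crossed. The threshold value and names are assumptions for illustration, not anything SAS specifically described.

```python
# A sketch of boundary-condition handling at the local level. The raw
# reading stays local; only the derived, semantic message is a candidate
# to travel any further. Threshold and names are illustrative assumptions.

from typing import Optional

HOT_THRESHOLD_F = 34.0  # e.g., a cold-chain unit should stay below this


def to_semantic_message(unit_id: str,
                        previous_f: Optional[float],
                        current_f: float) -> Optional[str]:
    """Return a message worth sending upstream, or None if nobody cares."""
    if previous_f is not None and current_f == previous_f:
        return None  # "32 degrees... 32 degrees..." over and over: no one cares
    if current_f >= HOT_THRESHOLD_F and (previous_f is None
                                         or previous_f < HOT_THRESHOLD_F):
        # The boundary was just crossed: this is the interesting event.
        return f"unit {unit_id} is getting hot ({current_f} F)"
    return None  # the value changed, but no boundary condition was hit


print(to_semantic_message("cooler-01", 32.0, 35.0))  # unit cooler-01 is getting hot (35.0 F)
print(to_semantic_message("cooler-01", 35.0, 35.0))  # None
```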

“There is one less here now” might need to be reconciled against another location. If the RFID tag that was in the location two seconds ago is no longer there, then where did it go? If the tag shows up in a place where it shouldn’t be, then a system might need to alert a human to go get the straying inventory and put it back where it actually belongs. In that case, should the message go all the way up the network to the central inventory management system, be interpreted, and then generate another message back down to the local network to notify the human? That seems like a waste of bandwidth and messaging if there is enough intelligence in the analytics at the local level to know whether the central inventory system needs to be consulted at all.

And you may be thinking, “So what? We have enough bandwidth for that.” Yes, for one instance of that message. But how many instances of that message might happen on an hourly or daily basis? And how long would it take for the sum of all those messages to swamp your bandwidth? This is the great unknown, and the great risk, of IoT.

And then there is the case where the tag shows up in another nearby location where more of that item is kept. In that case, a human probably doesn’t need to be notified, but messages should still fly across the corporate or cloud network to the central inventory management system, because while overall inventory may not need to be adjusted, the inventory in those two locations will need to be adjusted: one incremented, and one decremented.
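A rough sketch of that routing decision, under the assumption that the local analytics know which locations are valid homes for an item, might look like the following. The helper names (alert_associate) and the ADJUST message format are placeholders I made up for illustration.

```python
# A sketch of local routing for a "tag moved" event. If the tag landed
# somewhere it does not belong, resolve it locally; if it landed in another
# valid location, send only the location-level adjustments upstream.
# All names and message formats here are illustrative placeholders.

from typing import List, Set


def alert_associate(message: str) -> None:
    # Placeholder: in practice this might page a handheld or a task system.
    print(f"LOCAL ALERT: {message}")


def handle_tag_move(item_id: str, old_location: str, new_location: str,
                    valid_locations: Set[str]) -> List[str]:
    """Return the messages (if any) that actually need to leave the local network."""
    if new_location not in valid_locations:
        # Straying inventory: no round trip to the central system needed,
        # just send someone to put it back where it belongs.
        alert_associate(f"Return {item_id} from {new_location} to {old_location}")
        return []
    # Overall on-hand is unchanged, but the two locations are not:
    # decrement the old slot and increment the new one centrally.
    return [f"ADJUST {item_id} {old_location} -1",
            f"ADJUST {item_id} {new_location} +1"]


# A move between two valid locations produces two upstream messages;
# a move to an invalid location produces a local alert and none.
print(handle_tag_move("SKU123", "shelf-A4", "backroom-B1", {"shelf-A4", "backroom-B1"}))
print(handle_tag_move("SKU123", "shelf-A4", "fitting-room-2", {"shelf-A4", "backroom-B1"}))
```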

A problem made for analytics indeed. And, as SAS pointed out when I attended their Analyst Days event last week, it is exactly the kind of problem that a company like SAS is positioned to solve.

You may be wondering at this point how that could be. Too often SAS labors under a reputation for ivory-tower analytics, which can lead to the incorrect impression that its solutions are monolithic and must be implemented centrally.

Not so. And the solution architecture that the company is developing in support of IoT speaks to an impressive depth of thinking about the problem. For example, it is relatively easy to develop local-level analytics that can operate in an IoT environment: the kernel at the local level that decides whether to put a message onto the network in the first place. But if you end up with ten million sensors or devices, or more, out there in the environment, how do you keep all of those devices on the latest version of the software? How do you coordinate the analytics that will need to occur on or across devices?

The easy answer is just to manage it like all software — you send periodic updates. Except there’s that network thing again. The more data you put on the network, the more it crushes that network. And there isn’t much value in deploying local analytics to devices to prevent them from putting an overwhelming number of messages onto the network if you then have to bring that same network to its knees periodically while updating the software remotely.

SAS is thinking about these problems, and they have partnerships in place with companies like Intel to minimize their load on companies’ networks. And that’s really just the beginning of where SAS is going in support of IoT.

Everyone is enthusiastic about IoT — certainly the retailers we surveyed in our latest research on the topic are almost euphorically optimistic about the impact IoT will have on their business. But it’s these nitty-gritty details, like which messages to create, which ones to send, and what to do in response to those messages, that will make or break an IoT implementation.

I look forward to our next IoT benchmark (we’re running the survey over the summer) to see if the euphoria remains, or if reality is starting to set in. But no matter what, it’s good to know that there are companies out there actively trying to solve these problems, even before retailers get far enough along to encounter them at any scale.

Because with IoT, scale takes on a whole new meaning. And I don’t think retail is ready for it.

 

Newsletter Articles March 8, 2016