Supply Chain & Big Data ÷ Analytics = Innovation

Google the term “advanced analytics” and you get back nearly 23 million results in less than a second.

Clearly, the use of advanced analytics is one of the hottest topics in the business press these days and is certainly top of mind among supply chain managers.

Yet, not everyone is in agreement as to just what the term means or how to deploy advanced analytics to maximum advantage.

At HP, the Strategic Planning and Modeling team has been utilizing advanced operational analytics for some 30 years to solve business problems requiring innovative approaches.

Over that time, the team has developed significant supply chain innovations such as postponement, along with award-winning approaches to product design and product portfolio management.

Based on conversations we have with colleagues, business partners and customers at HP, three questions come up regularly – all of which this article will seek to address.

  1. What is the difference between advanced and commodity analytics?
  2. How do I drive innovation with advanced analytics?
  3. How do I set up an advanced analytics team and get started using it in my supply chain?

Advanced analytics vs. commodity analytics

So, what exactly is the difference between advanced analytics and commodity analytics? According to Bill Franks, author of “Taming The Big Data Tidal Wave,” commodity analytics aims “to improve over where you’d end up without any model at all”; as he puts it, “a commodity modeling process stops when something good enough is found.”

Another definition of commodity analytics is “that which can be done with commonly available tools without any specialized knowledge of data analytics.”

The vast majority of what is being done in Excel spreadsheets throughout the analytics realm is commodity analytics.

Read more at Supply Chain & Big Data ÷ Analytics = Innovation

What do you think about this topic? Write down your opinions in the comment box below, and subscribe to get updates in your inbox.


Overcoming 5 Major Supply Chain Challenges with Big Data Analytics

Big data analytics can help increase visibility and provide deeper insights into the supply chain. By leveraging big data, supply chain organizations can improve the way they respond to volatile demand or supply chain risk, and reduce the concerns related to those issues.

Sixty-four percent of supply chain executives consider big data analytics a disruptive and important technology, setting the foundation for long-term change management in their organizations (Source: SCM World). Ninety-seven percent of supply chain executives report having an understanding of how big data analytics can benefit their supply chain. But, only 17 percent report having already implemented analytics in one or more supply chain functions (Source: Accenture).

Even if your organization is among the 83 percent who have yet to leverage big data analytics for supply chain management, you’re probably at least aware that mastering big data analytics will be a key enabler for supply chain and procurement executives in the years to come.

Big data enables you to quickly model massive volumes of structured and unstructured data from multiple sources. For supply chain management, this can help increase visibility and provide deeper insights into the entire supply chain. By leveraging big data, your supply chain organization can improve its response to volatile demand or supply chain risk, for example, and reduce the concerns related to the issue at hand. It will also be crucial for you to evolve your role from transactional facilitator to trusted business advisor.

Read more at Overcoming 5 Major Supply Chain Challenges with Big Data Analytics

What do you think about this article? Share your opinions with us in the comment box.


How Big Data And Analytics Are Transforming Supply Chain Management

Supply chain management is a field where Big Data and analytics have obvious applications. Until recently, however, businesses have been less quick to implement big data analytics in supply chain management than in other areas of operation such as marketing or manufacturing.

Of course, supply chains have long been driven by statistics and quantifiable performance indicators. But the sort of analytics that are really revolutionizing industry today – real-time analytics of huge, rapidly growing, and very messy unstructured datasets – were largely absent.

This was clearly a situation that couldn’t last. Many factors – from weather to the condition of vehicles and machinery – can clearly affect supply chain management, so executives in the field have recently thought long and hard about how this data could be harnessed to drive efficiencies.

In 2013 the Journal of Business Logistics published a white paper calling for “crucial” research into the possible applications of Big Data within supply chain management. Since then, significant steps have been taken, and it now appears many of the concepts are being embraced wholeheartedly.

Applications for the analysis of unstructured data have already been found in inventory management, forecasting, and transportation logistics. In warehouses, digital cameras are routinely used to monitor stock levels, and the messy, unstructured image data triggers alerts when restocking is needed.
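The alerting step described above reduces to a simple threshold check once stock counts have been extracted from the camera images. A minimal sketch, assuming the image-processing step is already done and using invented SKU names and reorder points:

```python
# Hypothetical reorder points per SKU; the image-processing step that
# produces the observed counts is out of scope for this sketch.
REORDER_POINT = {"sku-123": 20, "sku-456": 50}

def restock_alerts(observed_counts):
    """Return SKUs whose camera-derived count fell below the reorder point."""
    return [sku for sku, count in observed_counts.items()
            if count < REORDER_POINT.get(sku, 0)]

# One camera sweep: sku-123 is low, sku-456 is fine.
print(restock_alerts({"sku-123": 12, "sku-456": 75}))
```

In practice, the hard part is the computer-vision step that turns images into counts; the business rule layered on top stays this simple.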

Read more at How Big Data And Analytics Are Transforming Supply Chain Management


Automating Big-Data Analysis and Replacing Human Intuition with Algorithms

Big-data analysis consists of searching for buried patterns that have some kind of predictive power.

But choosing which “features” of the data to analyze usually requires some human intuition.

In a database containing, say, the beginning and end dates of various sales promotions and weekly profits, the crucial data may not be the dates themselves but the spans between them, or not the total profits but the averages across those spans.
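That kind of derived-feature construction – spans between dates, averages across spans – can be sketched in a few lines. The records below are invented for illustration (field names and numbers are assumptions, not from the article):

```python
from datetime import date

# Hypothetical promotion records: start date, end date, and the weekly
# profits observed during each promotion (illustrative data only).
promotions = [
    {"start": date(2015, 3, 2), "end": date(2015, 3, 23),
     "weekly_profits": [1200, 1350, 1100]},
    {"start": date(2015, 6, 1), "end": date(2015, 6, 15),
     "weekly_profits": [900, 950]},
]

# Derive the features the passage suggests: the span between the dates
# and the average profit across that span, rather than the raw values.
features = [
    {
        "span_days": (p["end"] - p["start"]).days,
        "avg_weekly_profit": sum(p["weekly_profits"]) / len(p["weekly_profits"]),
    }
    for p in promotions
]

print(features)
```

Choosing to compute `span_days` and `avg_weekly_profit` at all is exactly the human intuition the Data Science Machine tries to automate.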

MIT researchers aim to take the human element out of big-data analysis, with a new system that not only searches for patterns but designs the feature set, too.

To test the first prototype of their system, they enrolled it in three data science competitions, in which it competed against human teams to find predictive patterns in unfamiliar data sets.

Of the 906 teams participating in the three competitions, the researchers’ “Data Science Machine” finished ahead of 615.

In two of the three competitions, the predictions made by the Data Science Machine were 94 percent and 96 percent as accurate as the winning submissions.

In the third, the figure was a more modest 87 percent. But where the teams of humans typically labored over their prediction algorithms for months, the Data Science Machine took somewhere between two and 12 hours to produce each of its entries.

“We view the Data Science Machine as a natural complement to human intelligence,” says James Max Kanter, whose MIT master’s thesis in computer science is the basis of the Data Science Machine.

Read more at Automating Big-Data Analysis and Replacing Human Intuition with Algorithms

What do you think about this article? Share your opinions with us in the comment box, and subscribe to get updates.


The Key to Analytics: Ask the Right Questions

People think analytics is about getting the right answers. In truth, it’s about asking the right questions.

Analysts can find the answer to just about any question. So, the difference between a good analyst and a mediocre one is the questions they choose to ask. The best questions test long-held assumptions about what makes the business tick. The answers to these questions drive concrete changes to processes, resulting in lower costs, higher revenue, or better customer service.

Often, the obvious metrics don’t correlate with sought-after results, so it’s a waste of time focusing on them, says Ken Rudin, general manager of analytics at Zynga and a keynote speaker at TDWI’s upcoming BI Executive Summit in San Diego on August 16-18.

Challenge Assumptions

For instance, many companies evaluate the effectiveness of their Web sites by calculating the number of page hits. Although a standard Web metric, total page hits often doesn’t correlate with higher profits, revenues, registrations, or other business objectives. So, it’s important to dig deeper, to challenge assumptions rather than take them at face value. For example, a better Web metric might be the number of hits that come from referral sites (versus search engines) or time spent on the Web site or time spent on specific pages.
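As an illustration of digging past raw page hits, the alternative metrics above can be computed from ordinary web logs. A sketch with made-up log records (the field names and domains are assumptions for illustration):

```python
# Hypothetical web-log records; field names and values are illustrative.
hits = [
    {"page": "/pricing", "referrer": "partner-blog.example", "seconds_on_page": 95},
    {"page": "/pricing", "referrer": "google.com", "seconds_on_page": 8},
    {"page": "/signup",  "referrer": "partner-blog.example", "seconds_on_page": 140},
    {"page": "/home",    "referrer": "bing.com", "seconds_on_page": 5},
]

SEARCH_ENGINES = {"google.com", "bing.com"}

# Hits from referral sites versus search engines, plus time on page --
# the deeper metrics suggested above, instead of total page hits alone.
referral_hits = sum(1 for h in hits if h["referrer"] not in SEARCH_ENGINES)
search_hits = len(hits) - referral_hits
avg_time = sum(h["seconds_on_page"] for h in hits) / len(hits)

print(referral_hits, search_hits, avg_time)
```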

TDWI Example. Here’s another example closer to home. TDWI always mails conference brochures 12 weeks before an event. Why? No one really knows; that’s how it’s always been done. Ideally, we should conduct periodic experiments. Before one event, we should send a small set of brochures 11 weeks beforehand and another small set 13 weeks prior. And while we’re at it, we should test the impact of direct mail versus electronic delivery on response rates.
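The mailing-timing experiment described above could be evaluated with a simple two-proportion comparison. A sketch using invented response counts (the numbers are not TDWI data):

```python
import math

# Hypothetical results: brochures sent 11 weeks vs. 13 weeks before the
# event, with the number of responses from each group (invented numbers).
sent_11wk, responded_11wk = 500, 41
sent_13wk, responded_13wk = 500, 28

rate_11wk = responded_11wk / sent_11wk
rate_13wk = responded_13wk / sent_13wk

# Two-proportion z-test by hand: pooled response rate and standard error.
pooled = (responded_11wk + responded_13wk) / (sent_11wk + sent_13wk)
se = math.sqrt(pooled * (1 - pooled) * (1 / sent_11wk + 1 / sent_13wk))
z = (rate_11wk - rate_13wk) / se

# |z| above roughly 1.96 would suggest the timing difference is unlikely
# to be chance at the conventional 5 percent level.
print(rate_11wk, rate_13wk, round(z, 2))
```

Running the same comparison for direct mail versus electronic delivery is the other experiment the passage proposes.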

Read more at The Key to Analytics: Ask the Right Questions
