How Big Data And Analytics Are Transforming Supply Chain Management

Supply chain management is a field where Big Data and analytics have obvious applications. Until recently, however, businesses have been slower to adopt big data analytics in supply chain management than in other areas of operation, such as marketing or manufacturing.

Supply chains have, of course, long been driven by statistics and quantifiable performance indicators. But the sort of analytics that is really revolutionizing industry today – real-time analysis of huge, rapidly growing, and very messy unstructured datasets – was largely absent.

This was clearly a situation that couldn’t last. Many factors affect supply chain management – from the weather to the condition of vehicles and machinery – so executives in the field have recently thought long and hard about how this data could be harnessed to drive efficiencies.

In 2013 the Journal of Business Logistics published a white paper calling for “crucial” research into the possible applications of Big Data within supply chain management. Since then, significant steps have been taken, and it now appears many of the concepts are being embraced wholeheartedly.

Applications for the analysis of unstructured data have already been found in inventory management, forecasting, and transportation logistics. In warehouses, digital cameras are routinely used to monitor stock levels, and the messy, unstructured image data they produce triggers alerts when restocking is needed.
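The camera-based restocking workflow described above can be sketched in a few lines. This is a minimal illustration, not the article's actual system: the item names, the noisy level readings, and the reorder thresholds are all hypothetical stand-ins for whatever a real computer-vision pipeline would emit.

```python
# Hypothetical sketch: turning noisy, camera-derived stock-level
# estimates into restocking alerts. Names and thresholds are invented.
REORDER_POINTS = {"pallet_A": 20, "pallet_B": 50}

def restock_alerts(estimates, reorder_points=REORDER_POINTS):
    """estimates: mapping of item -> list of recent noisy level readings."""
    alerts = []
    for item, readings in estimates.items():
        if not readings:
            continue
        # Smooth the noisy readings with a simple average before comparing.
        level = sum(readings) / len(readings)
        if level < reorder_points.get(item, 0):
            alerts.append(item)
    return sorted(alerts)

print(restock_alerts({"pallet_A": [22, 19, 18], "pallet_B": [80, 76, 81]}))
# → ['pallet_A']
```

The point is not the arithmetic but the shift the article describes: the alert is driven by a continuous, messy sensor feed rather than by a manual stock count.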

Read more at How Big Data And Analytics Are Transforming Supply Chain Management


Big (and Smart) Data for Digital Globalization

Data is all around us, whether we use it or are part of it. More than just another trend, data is the way to move with agility and to make every step and achievement tangible for those who do not see or believe it. One of the most transformational and accelerating factors of digitization is precisely how data is considered, leveraged, valued, and distilled. While data mining is not new, it has become more than just a back-office activity. It is all about turning facts into more than facts, figures into more than figures, and content into more than content.

For digital globalization practitioners and leaders, data shines like a glittering prize. That is why they face similar challenges to all business leaders when it comes to making the most of data. With the world to conquer and a number of diverse audiences to engage, they have to transform big data into smart data to focus on what enables making–and avoids breaking–the digital experiences local customers require. Specifically, they must pin down the right data at the right time in the content supply chain to convert it into reliable indicators and valuable assets in the long run. In addition, due diligence is required to cover the cost and effort of funneling, acquiring, and maintaining data. While the amount, the nature, and the scope of data depend on digital globalization targets and priorities, several categories may help establish a good baseline to identify smart data and agree on a starting point for global expansion.

  1. Customer understanding data – ranging from general data (e.g. census) to segmentation data, these enable you to bear in mind what customers do at all times as prospects, decision-makers, buyers, or users.
  2. Usage data – as typical performance data, this remains crucial in any proper mix of smart data for digital globalization.
  3. Content effectiveness data – capturing and measuring the real impact of content on experiences is tricky and must reflect the nature and ecosystem of the content.

Read more at Big (and Smart) Data for Digital Globalization

Do you have any opinions about this topic? Post your comments below and subscribe to us for updates.


Tapping into the ‘big data’ that can help Greek supply chains to be more agile

Supply chain platform provider GT Nexus has begun tapping into the big data that has accumulated in its system to help shippers, carriers and forwarders provide “assurance of supply”.

In an interview prompted by the possibility of a Greek exit from the Eurozone, GT Nexus’s EMEA director of marketing, Boris Felgendreher, said the Greek crisis bore all the hallmarks of major disruption – the sort that shows the limitations of supply chain planning.

“This sort of situation puts a premium on being agile, in respect of companies being able to move from one sourcing location to another, and that is always difficult. This particular disruption has an added element in that it is financial,” he said, alluding to the fears of a ‘Grexit’ and the problems Greek companies have with making and receiving payments.

And although Greece itself has accepted the terms of its bailout, a number of Eurozone countries have still to ratify the deal, meaning the threat of a Greek exit persists.

However, Mr Felgendreher explained that a recent development by GT Nexus could offer firms a way to circumvent these issues through a “fusion of the physical and financial supply chains”, following an agreement between the platform developer and trade finance solutions provider SeaburyTFX.

SeaburyTFX has developed a funding programme that leverages big data on the GT Nexus platform to give suppliers access to low-cost capital. The programme opens the flow of capital into the supply chain, reducing costs and risk by basing funding decisions on the trading partners’ performance history rather than on the buyer’s or supplier’s credit.
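To make the idea of performance-based funding concrete, here is a hypothetical sketch – not GT Nexus or SeaburyTFX code – of scoring a supplier on its transaction history with a buyer instead of on a credit rating. The field names, the equal weighting, and the scoring formula are all assumptions for illustration.

```python
# Hypothetical sketch: score a supplier on shipment performance history.
# A lender could price capital off this score instead of a credit rating.
def performance_score(shipments):
    """shipments: list of dicts with 'on_time' (bool) and 'accepted' (bool)."""
    if not shipments:
        return 0.0
    on_time = sum(s["on_time"] for s in shipments) / len(shipments)
    accepted = sum(s["accepted"] for s in shipments) / len(shipments)
    # Weight delivery reliability and acceptance rate equally (an assumption).
    return round(0.5 * on_time + 0.5 * accepted, 3)

history = [{"on_time": True, "accepted": True},
           {"on_time": False, "accepted": True},
           {"on_time": True, "accepted": True}]
print(performance_score(history))  # → 0.833
```

The design point the article makes survives even in this toy form: the inputs come from the physical supply chain (deliveries, acceptances) rather than from the financial one (credit files).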

Read more at Tapping into the ‘big data’ that can help Greek supply chains to be more agile

What do you think about this article? Share your opinions with us in the comment box.


New Approaches to Analytics to Revolutionize Logistics


Three stages are commonly used to categorize an organization’s maturity in its use of business intelligence and analytics technologies:

  1. Descriptive: What happened in the past?
  2. Predictive: What will (probably) happen in the future?
  3. Prescriptive: What should we do to change the future?
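The three stages above can be sketched on toy data. This is purely illustrative – the weekly shipping costs, the trend-line model, and the "renegotiate" rule are all invented – but it shows how the same dataset supports a descriptive summary, a predictive extrapolation, and a prescriptive recommendation.

```python
# Toy weekly shipping-cost data (invented for illustration).
costs = [100, 104, 109, 115, 118]  # cost per shipment, weeks 1-5

# 1. Descriptive: what happened in the past?
avg = sum(costs) / len(costs)

# 2. Predictive: fit a least-squares trend line and extrapolate to week 6.
n = len(costs)
xs = range(1, n + 1)
x_mean = sum(xs) / n
slope = sum((x - x_mean) * (y - avg) for x, y in zip(xs, costs)) / \
        sum((x - x_mean) ** 2 for x in xs)
forecast = avg + slope * (n + 1 - x_mean)

# 3. Prescriptive: what should we do to change the future?
action = "renegotiate carrier rates" if slope > 0 else "hold course"
print(round(avg, 1), round(forecast, 1), action)
```

In practice each stage uses far richer models, but the progression – summarize, extrapolate, recommend – is the same.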

Descriptive analytics typically means good old-fashioned business intelligence (BI) – reports and dashboards. But there is a newish technology in the descriptive category – one that I might argue is worthy of a category in its own right. That technology is visual data discovery. The visual data discovery approach has a rapidly growing fan base for many reasons, but one stands out: it increases the probability that business managers will find the information they need in time to influence their decisions.

Visual data discovery tools typically provide:

  1. Unrestricted navigation through, and exploration of, data.
  2. Rich data visualization so that information can be comprehended rapidly.
  3. The ability to introduce new data sources into an analysis to expand it further.

By helping to answer a different class of question – the unanticipated one – visual data discovery tools increase the probability that managers will find the information they need in time to influence their decisions.  And that, after all, is the only valid reason for investing in business intelligence solutions.

If you have any opinions, you are welcome to leave a comment or send us a message.


Bywaters waste management uses BI to improve customers’ recycling


Bywaters, a recycling and waste management company, has improved productivity by 4% using Pentaho data integration and business intelligence software.

Sasha Korniak, head of analytics and data science at Bywaters, masterminded the project at the family-owned company, which operates nationally and counts Nando’s, Guy’s and St Thomas’ Hospital, and BNP Paribas among its 2,000-plus customers.

“I wanted Bywaters to embrace a data-driven culture that would give authority and confidence to make autonomous decisions substantiated by credible data and enable consumers to increase recycling and sustainability,” says Korniak.

“We are no longer just a waste management company, we are a waste consultancy, improving our customers’ recycling through providing the data”, says Korniak. “If you are not data driven, but just go out and collect bins, the sustainability of your business will be damaged”.

If you have any opinions, leave a comment below or send us a message.


Why Google Flu is a failure: the hubris of big data


People with the flu (the influenza virus, that is) will probably go online to find out how to treat it, or to search for other information about the flu. So Google decided to track such behavior, hoping it might be able to predict flu outbreaks even faster than traditional health authorities such as the Centers for Disease Control (CDC).

Instead, as the authors of a new article in Science explain, we got “big data hubris.” David Lazer and colleagues explain that:
“Big data hubris” is the often implicit assumption that big data are a substitute for, rather than a supplement to, traditional data collection and analysis.

The problem is that most people don’t know what “the flu” is, and relying on Google searches by people who may be utterly ignorant about the flu does not produce useful information. Or to put it another way, a huge collection of misinformation cannot produce a small gem of true information. Like it or not, a big pile of dreck can only produce more dreck. GIGO, as they say.

Google’s scientists first announced Google Flu in a Nature article in 2009. With what now seems to be a textbook definition of hubris, they wrote:
“…we can accurately estimate the current level of weekly influenza activity in each region of the United States, with a reporting lag of about one day.”
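The style of model behind Google Flu Trends can be sketched as a simple regression of an official flu-activity measure on search volume, which is then used to "nowcast" from new searches alone. All the numbers below are invented, and the real model regressed log-odds of influenza-like-illness rates on query fractions – this is only a caricature of the approach, and it inherits exactly the weakness the article describes: if search behavior drifts away from actual illness, the fitted relationship silently breaks.

```python
# Caricature of the Google Flu Trends approach: fit official flu activity
# to search volume, then estimate activity from searches alone.
# Data pairs are invented: (weekly search count, CDC ILI %).
pairs = [(10, 1.2), (20, 2.1), (30, 3.3), (40, 4.0)]

n = len(pairs)
x_mean = sum(x for x, _ in pairs) / n
y_mean = sum(y for _, y in pairs) / n

# Ordinary least-squares slope and intercept.
b = sum((x - x_mean) * (y - y_mean) for x, y in pairs) / \
    sum((x - x_mean) ** 2 for x, _ in pairs)
a = y_mean - b * x_mean

# "Nowcast" flu activity from this week's search count, with no new
# ground truth from the CDC -- the step where misinformation creeps in.
nowcast = a + b * 50
```

The model is only as good as the assumption that searches track infection; when many searchers don't actually have the flu, the slope `b` is fit to noise.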

If you have any opinion, feel free to send us a message or leave your comment below.


Data Science Is Dead

This article offers a sharp criticism of the currently popular notions of Data Science and the Data Scientist. Data Science is something business people invented as a creative label for a new profession that emerged from Big Data. “Science” is about creating knowledge through study and research. Consequently, “Data Science” should be about creating knowledge through the study of data – not just the data analysis, A/B testing, or troubleshooting that the majority of business people are actually doing. Essentially, the article claims that data analysis and data troubleshooting are NOT Data Science.

Today there is huge hype around companies hiring Data Scientists, with unrealistic expectations in the job ads. They are looking for a miracle medicine – a quick fix for data they are unable to handle today. As a result, a shortcut is taken and a new profession is created. Major software companies even invent new products to automate the jobs of Data Scientists! It is just like the tale of the Emperor’s New Clothes. When the emperor walked down the street naked in his imaginary robe, everyone was so afraid of being called stupid that no one called out the imaginary robe as a lie – until a child, with a pure and untainted mind, laughed at the emperor’s foolish belief. The story played out in supply chain management (SCM) and is repeating itself in Big Data and Data Science today. People are so afraid of being called stupid that they simply follow the trend and try to build empires out of it. Companies are spending billions of dollars just to produce reports as eye candy, without really knowing how to use them to improve their bottom lines.

… you’ll realize that the “Big Data” vendors have filled your executives’ heads with sky-high expectations (and filled their inboxes with invoices worth significant amounts of money) …

The author claims there is no data science unless you are working on “structured” data, which is where most of statistics draws its inference for prediction and control. The author emphasizes the importance of “cleaning off the rotten banana peels” before you look at the data, so you won’t draw biased conclusions – which runs completely against the prevailing idea of Big Data today.

I understand the importance of fresh ideas for keeping people engaged in our advancement. Emotion aside, this article does offer very bitter but sound advice to Data Scientists.

Don’t be the data scientist tasked with the crime-scene cleanup of most companies’ “Big Data”—be the developer, programmer, or entrepreneur who can think, code, and create the future.

Reference: Data Science Is Dead


Five Data Mining Techniques That Help Create Business Value


The term data mining first appeared in the 1990s; before that, statisticians used the terms “data fishing” or “data dredging” to refer to analysing data without an a priori hypothesis. The most important objective of any data mining process is to find useful, easily understood information in large data sets. A few important classes of tasks are involved in data mining:

  1. Anomaly or Outlier detection
  2. Association rule learning
  3. Clustering analysis
  4. Classification analysis
  5. Regression analysis
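The first task in the list, anomaly or outlier detection, is easy to illustrate with the standard library. This is a minimal sketch, assuming a simple z-score rule on invented order values; the 2.0 cutoff is a common convention, not a recommendation from the article.

```python
# Minimal stdlib sketch of anomaly/outlier detection: flag values more
# than `cutoff` sample standard deviations from the mean.
from statistics import mean, stdev

def outliers(values, cutoff=2.0):
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > cutoff]

orders = [102, 98, 101, 99, 100, 103, 250]  # one suspicious order value
print(outliers(orders))  # → [250]
```

The other task classes (association rules, clustering, classification, regression) follow the same pattern of extracting a compact, understandable structure from a large set of raw records.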

Data mining can help organisations and scientists to find and select the most important and relevant information. This information can be used to create models that predict how people or systems will behave, so you can anticipate it. The more data you have, the better the models you can create using these data mining techniques will become, resulting in more business value for your organisation.

If you have any opinions about how data mining helps to create business value, post them in the comment box, or contact us for a discussion.


2013 in review: Big data, bigger expectations?

In the parlance of the industry, big data’s feat was a result of the successful convergence of the “three Vs”:

Volume: A large amount of data

Variety: A wide range of data types and sources

Velocity: The speed of data moving from its sources, into the hands of those who need it

Although other Vs have since been contemplated, such as Veracity and Value, the original three attributes promised big data could go far beyond the boundaries of traditional databases, which require data to be stored in rigid rows and columns.

However, over the past year, reality began to sink in: People came to realize what big data could and could not do. Unfortunately, performing large-scale analytics in real time proved to be more daunting than originally thought. Although Hadoop continues to be the world’s most popular big data processing platform, it was designed for batch processing and is far too slow for real-time use.

Reference: 2013 in review: Big data, bigger expectations?


Risk Management: A Look Back at 2013 and Ahead to 2014


According to Yo Delmar, vice president of MetricStream, 2013 has been witness to extraordinary change. We are living and doing business in an increasingly global, mobile, social and Big Data world, fraught with new risks and complex regulations. As such, individuals and organizations are struggling to keep pace.

In response to greater uncertainty, complexity and volatility throughout 2013, we’ve seen increased convergence and alignment amongst internal teams, including IT, security and the business. As a result, organizations are better poised to provide the context for communicating risks. We’ve also seen the business ecosystem evolve to include geographically diverse vendors and third parties, and as a result, organizations must continue to view these entities as part of the organization itself, and manage them in a tighter and more integrated way.

Growing convergence among IT, security and the business: The landscape of risk and compliance continues to evolve, as organizations are asked to manage their IT risk and compliance activities far beyond the basic audit and compliance requirements of the past. As new technologies bring their own set of unique risks, there is a growing disconnect among internal audit, security, compliance and the business on what it means to build, manage and lead a truly safe, secure and successful business.

As a result, we are seeing more focused efforts when it comes to getting these groups on the same page by building a common risk language, as well as a discussion framework to enable cross-functional collaboration. Doing so can set the context for communicating risks in a way that drives more effective governance and decision-making across the board of directors, executive management team and each respective business function.

What are your 2014 resolutions? Leave us a comment or send us a message.
