Outsourcing refers to contracting with another company for a business purpose. It includes both international and domestic contracts. Sometimes outsourcing also refers to the exchange or transfer of employees and assets between firms. It helps firms reduce costs and improve quality.
Today, retail is driven by data and technology, and big data is becoming really important to retailers. Retailers must adopt big data and digital skills to succeed in the sector. According to a survey from “101Data”, 96% of retailers reported that big data was important to them, and 48% reported that big data was the best fit for their marketing department.
The rule of thumb of online retail marketing is: know every product across your service area, know every person you interact with, and have the ability to connect the two in a transaction.
We have all probably seen big data and digital at work in retail. If not, try this: go to a shopping site, add a watch to your cart, remove it after some time, and leave the site. Afterwards, almost every site you visit may show you a watch ad from that shopping site. Online retailers use data about your interests to customize the ads you see. Amazon has reported that about 30 percent of its sales come from its recommendation engine. That is big data and digital at work in retail.
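The recommendation logic mentioned above can be sketched in miniature. Below is a toy item-to-item recommender built over a hypothetical co-purchase log; the product names and orders are invented for illustration, not a real retailer's data.

```python
from collections import Counter
from itertools import permutations

# Hypothetical co-purchase log: each order is the set of items bought together.
orders = [
    {"watch", "strap"},
    {"watch", "strap", "charger"},
    {"watch", "charger"},
    {"phone", "charger"},
]

# Count how often each ordered pair of items appears in the same order.
co_counts = Counter()
for order in orders:
    for a, b in permutations(order, 2):
        co_counts[(a, b)] += 1

def recommend(item, top_n=2):
    """Recommend the items most often bought together with `item`."""
    scored = [(other, n) for (i, other), n in co_counts.items() if i == item]
    scored.sort(key=lambda pair: -pair[1])
    return [other for other, _ in scored[:top_n]]

print(recommend("watch"))  # strap and charger co-occur most with the watch
```

Real recommendation engines weight these counts by popularity and recency, but the core "people who bought X also bought Y" signal is the same.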
One famous example of a company using big data to drive success is Rolls-Royce. Rolls-Royce is known for manufacturing large, high-power engines, generally used in airplanes and ships. It generates a large amount of data during the manufacture of a jet engine and uses that data in three main areas: design, manufacturing and after-sales support. After putting this data to work, the company saw major improvements in every one of those areas: designs became more appropriate, products became more error-free, and sales increased.
These are a few applications of big data and digital in retail:
Expected buying behaviour – Finding the expected buyers of different products through data analysis. For example, a retailer selling a game would love to advertise to keen gamers, and with the help of big data they can easily find those people.
Opinion about brands – Finding which products are popular among buyers and organising the catalogue according to buyers’ choices. It also helps capture real-time opinions and responses.
Personalized shopping – Offering discounts based on past shopping details such as preferences and previous transactions. This helps an organization retain its customers for a long period of time.
E-commerce optimization – In retail, customers are in the driving seat. Companies depend entirely on what customers want; it doesn’t matter what you sell, if it isn’t what customers need you will never succeed. So it is essential to understand user behaviour on the website to optimize sales, and big data analysis can provide that understanding.
Optimizing the store – Have you ever noticed that essentials are at the far end of the store and luxury goods at the entrance? That is store optimization. Using big data analysis, retailers track shopper movements to understand their behaviour.
Price – Big data can also provide the basis for a price-optimization model.
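As a minimal illustration of the price-optimization idea, the sketch below picks the revenue-maximizing price from historical price/demand observations. The numbers are invented; a real model would fit a demand curve rather than pick from raw observations.

```python
# Hypothetical observations: (price, units sold at that price).
observations = [(10.0, 100), (12.0, 90), (15.0, 60), (18.0, 40)]

def best_price(obs):
    """Return the observed price that maximized revenue (price * units)."""
    return max(obs, key=lambda pu: pu[0] * pu[1])[0]

print(best_price(observations))  # 12.0: revenue 12 * 90 = 1080 beats the others
```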
Big data is one of the biggest drivers of success in retail. It creates many opportunities for marketing and sales, so retailers are moving into big data quickly. Companies already using big data and digital in retail have reported a 5-6 percent increase in their productivity and profitability.
Defining, articulating and representing business problems is a crucial initial step in any big data initiative. To deliver quick results from big data, it is good to have a powerful and well-organized analytic capability. And if you don’t have one? Nothing to worry about: reach out to the right big data partner who can deliver quick results, starting with a proof of concept (POC). Once you recognize that the POC is successful and can generate business value, go ahead to the next level.
It is always good to have an internal analytics team that can work on data to solve problems and help find innovative ways to serve customers better. The analytics team must have a good amount of data and effective communication to deliver results. They also need appropriate tools, chosen according to the size and nature of the data. The team has to look at the whole big data life cycle, from data preparation to final report or model delivery, including model monitoring if modelling is part of the final outcome. Every stage of the big data journey affects the final result and business impact, and hence it is important to involve a data expert, whom we may call a data scientist.
Companies are investing resources in technologies, operations, training and skill development. But when analysing big data, the most important factors are understanding the data, connecting it with the business context, and leveraging it for decision support. If the analyst lacks this understanding, the corporate information will not lead them to an appropriate solution. To deliver quick results, the analyst should spend enough time understanding the data, the business problem, and moreover the business itself. The analyst should understand customer issues and, based on those issues, decide what can be done and which tools to leverage.
Let us review the factors that drive success when companies try to deliver quick results with big data. We will need to consider some of these points:
Business understanding – “If I were given one hour to save the planet, I would spend 59 minutes defining the problem and one minute resolving it,” a quote often attributed to Albert Einstein. Before attempting to solve any problem, we should step back and invest time and effort in improving our understanding of the question we are trying to solve. To deliver quick results, we need a good understanding of the business.
Strong strategy – drive to solution: To run any operation successfully, a strategy is essential. Before starting to analyse the data, there should be a strong strategy with a vision of what needs to be done. The whole big data approach should have a strong, clear vision and drive towards the final output.
Data understanding – Data is the most important factor in the analysis; the whole analysis revolves around it. The analyst should know how to use and structure the data according to their needs. Data should be organised and well structured so that it is easy to work with, and clustered according to its logical type so that the analyst can target the focus area for data operations. That area is identified according to its impact on the business.
Smart analytics team – Not the size of the team, but the quality of the team. When recruiting analysts or statisticians, recruiters need to look for strong problem-solving capability and reasoning power. Candidates should know how and why to approach a problem. Candidates with an engineering background may make good analysts.
Expertise in big data tools and technologies – To deliver quick results, the big data team should have expertise in big data tools and technology. It helps the team get to the core of the data.
In a recent survey by The Economist Intelligence Unit, conducted after completion of a big data project, one-half of analysts said they did not have enough structured data to support decision-making, compared with only 28% who said the same about unstructured data. In fact, 40% of respondents complained that they had too much unstructured data.
Ability to leverage the right tool – Once you have skilled analysts, the right data and a strong strategy, you need to look for the right technology on which to analyse the data. Technology acts as a bridge between the skilled analyst and the right data, and different technologies can be leveraged for this purpose depending on the requirement.
Governance – It is necessary to connect all the resources and technologies as a single unit to deliver quick results. Governing the whole process in a well-organised way plays an important role. The governance body needs to evaluate the team and assign work according to each member’s potential and skills.
To deliver quick results, start with a POC and ensure that the result is out and useful. To start the POC, it is necessary to have deep business understanding, the right mix of skilled people, and the ability to choose the right tools and techniques, along with a powerful strategy that makes the results faster and more accurate.
Master Data Management (MDM) is a method of defining and managing all the critical data of an organization in one place, a master file, to provide a single point of reference. To define and manage this critical data, MDM includes processes, governance, policies, standards and tools. The benefits of MDM grow as the number of departments, resources and related data increases. Master data is a subset of big data, and during analysis MDM provides a starting point; applying it gives many benefits when leveraging big data.
“The world of big data is a world of unknowns, and you need to somehow anchor it to the stuff you do know and trust — that’s the relationship between big data and master data,” said Ted Friedman, an analyst at consulting and market research company Gartner Inc.
So MDM adds high value when leveraging big data, and companies are starting to see it. According to a survey by The Information Difference Ltd., an MDM consulting and research company in London, only 17% of the 209 corporate respondents in North America, Europe and Asia said they expected big data applications to generate new master data in their organizations. But 59% said they thought MDM hubs and big data systems could be linked together for business uses, including the ability to use master data to automatically detect customer names in sets of big data.
There are several reasons why organizations enhance their Master Data Management (MDM) with Big Data:
More effective analytics – Using MDM with big data brings some great benefits. It provides the basic framework for performing analytics and brings together new and old information for leveraging. Data thus comes in large quantities, which makes the analysis more effective.
Better-arranged data – MDM gives a better arrangement of complex data than ordinary data management. Analysts still need to arrange data in the right format, and MDM helps them greatly with this.
Source of big data – Master data is a subset of big data, containing a lot of information from different sources, so it can serve as a source of big data.
Decision making – Companies use MDM to keep all records and data that come from different internal and external sources. Applying big data analytics to master data gives a better understanding of the company’s requirements and helps in decision making.
Data quality – One benefit of using master data management with big data is data quality. Whatever data we use for big data analysis, if it comes from MDM then its quality will be far better than that of data coming from external sources without MDM applied.
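One way to picture this data-quality benefit: a master record gives you a canonical form to match incoming records against, so duplicates don't pollute the analysis. The sketch below normalizes hypothetical customer records before matching; the names, fields and normalization rule are invented for illustration.

```python
def normalize(record):
    """Canonicalize a customer record the way an MDM hub might."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

# A trusted master record, plus two incoming records from an external source.
master = [{"name": "Asha Rao", "email": "asha@example.com"}]
incoming = [
    {"name": "  ASHA RAO ", "email": "Asha@Example.com"},    # same person, messy
    {"name": "Vikram Iyer", "email": "vikram@example.com"},  # genuinely new
]

# Only records that don't match a master record are treated as new.
master_keys = {normalize(r) for r in master}
new_records = [r for r in incoming if normalize(r) not in master_keys]
print(len(new_records))  # the messy duplicate is filtered out
```

Production MDM matching uses far richer rules (fuzzy matching, survivorship), but the principle of matching against a trusted canonical form is the same.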
Generally MDM holds an organization’s most trusted internal data, whereas big data contains internal as well as external data: social media, mobile, cloud and many other sources. MDM strategies for big data indicate a transformation of the applicability of MDM, with increased customer or product centricity and personalization. Organizations that combine big data with master data management have started gaining advantages.
Today, a massive amount of data is uploaded to the web, creating huge new and exhilarating business opportunities for small and medium-sized companies. However, collecting all of the required data is only one part of the story; mining these data and converting them into something actionable is where the real business value lies. The overall goal of the web data mining process is to extract information from various web sources and transform it into an understandable structure for further processing. The task of data mining is to mine or analyse a large quantity of data using automatic, semi-automatic or manual means.
In general, there are two main approaches to data mining. The first is the traditional way, manual data mining; the other we could call automated or semi-automated data mining. Manual mining is a long, time-consuming process in which mining is done one record or piece of information at a time; it needs a lot of time and effort. Automated web data mining, by contrast, is a process in which most of the repeated tasks are converted into a simple logic-based script that can scrape all the web data desired. As the name suggests, things can be automated, say close to 100%: the whole mining process runs automatically using an algorithm. This is the approach companies prefer for web data mining.
To automate the web data mining process, we can follow different methods based on the web structure, data format and size; it may require a custom-made script (in R, Python, etc.) and an API. We can define the process in any scripting language and run it to source the entire data set from websites; it will scrape whatever data we instruct it to in the code.
We could leverage Python or R; both are well known for scripting and data mining, especially in big data projects. The beauty of Python is that it is a user-friendly language: a person from a non-technical background can learn and understand it quickly without much difficulty. Another benefit of Python is the number of lines in a script: a web scraping script in Python will usually be at most a few hundred lines, whereas in Java or another language it can run to thousands of lines. Due to these advantages, Python is preferred in data mining for exploring big data potential. Yet another benefit is Python’s modules: many Python modules for data mining are available and can easily be used during scripting.
For a huge volume of web data mining, it is always good to go for automation using a custom-made script so that time and effort can be saved substantially. Data mining acts as the phase in which we obtain data for processing and later analysis, so it also falls under the big data collection phase, and big data tools can easily be applied to it.
Broadly speaking, there are three steps in web data mining: identify the source or sources of data (the web pages); mine the desired data and save it into the data processing environment using the right tool; and finally, process the data for decision support. Data identification is the first step: here we identify the data in the web pages that we want to mine. In the second step, we check the data pattern, i.e. whether the data appears in the same way on all pages and whether the ‘class’ name of the data in the source code is the same. Only if the class name of the data is the same on each web page can we go for automation; otherwise, running an automation job will be a really tough call. The third and last step is writing the code and running the job. Here we use a tool such as Python: we write code using the class of that particular value available in the page source code. The input can be anything, such as a URL or an ID, as long as it leads to the data page.
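The three steps above can be sketched with Python's standard library alone. In this sketch the "page" is an inline HTML string and the class name `price` is hypothetical, but the same pattern applies to pages fetched from a real URL.

```python
from html.parser import HTMLParser

# Steps 1-2: we have identified that every product page marks the field we
# want with class="price" (a hypothetical class name chosen for this sketch).
PAGE = '<html><body><span class="price">499</span><span class="name">Watch</span></body></html>'

class PriceScraper(HTMLParser):
    """Collect the text of every tag whose class attribute is 'price'."""
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        self.in_price = dict(attrs).get("class") == "price"

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data)
            self.in_price = False

# Step 3: run the job. Against a live site, the input would be a URL fetched
# with urllib.request instead of an inline string.
scraper = PriceScraper()
scraper.feed(PAGE)
print(scraper.prices)  # ['499']
```

Third-party modules such as requests and BeautifulSoup make the same job shorter, which is exactly the "few hundred lines versus thousands" advantage described above.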
Nowadays, everyone prefers to collect web data using automated mining; it saves both cost and time. For automated web data mining we can use any programming language and various APIs. Python and R seem to be the most preferred languages for web data mining and are also considered preferred tools for big data projects.
Big data is a fast-emerging concept that has totally transformed the way business runs in almost every industry and sector. It now covers a big area of business. The main part of big data is storing a vast amount of data and analysing it to extract valuable information.
Data monetization means extracting beneficial information from that huge amount of data to generate profit for your organisation. According to an Economist Intelligence Unit survey, firms that used big data analysis results for decision making saw a 5-6% improvement in their performance, and another 41% of companies expect improvement in their business within the next three years due to big data. These figures are enough to prove that big data can be monetized.
Now, let us discuss some recent trends in big data.
Cloud computing – These days, cloud computing is one of the hottest trends in the big data field. Storing big data requires large capacity, at least some terabytes of space. Cloud computing allows you to store the data on a service provider’s servers, and the capacity is flexible: if the amount of data increases, more space can be allotted accordingly. Amazon Web Services is an example of cloud computing.
Hadoop – Does big data mean Hadoop? Hadoop is a framework on which big data processing takes place. It consists of two core parts: HDFS and MapReduce. HDFS is the storage part, used to store data, and MapReduce is the processing part.
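To make the MapReduce idea concrete, here is the classic word-count pattern simulated in plain Python: a map step emits (word, 1) pairs, a shuffle groups them by key, and a reduce step sums the counts. A real job would run distributed on Hadoop; this only shows the logic.

```python
from itertools import groupby
from operator import itemgetter

lines = ["big data means hadoop", "hadoop stores big data"]

# Map: emit a (word, 1) pair for every word in every input line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle: group pairs by key, as the framework would between the two phases.
mapped.sort(key=itemgetter(0))

# Reduce: sum the counts for each word.
counts = {word: sum(n for _, n in group)
          for word, group in groupby(mapped, key=itemgetter(0))}

print(counts["big"], counts["hadoop"])  # 2 2
```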
Security – Bank fraud reports show that the lost opportunity runs to multiple billions of dollars every year, and we could minimize it by leveraging big data. Big data analytics fills the security void: it helps find security gaps by analysing patterns, and is frequently used in fraud detection.
More predictive analysis – Big data helps companies in decision making. As we saw earlier, with the help of big data analysis companies achieved a 5-6% improvement in their performance. With big data, companies are able to serve their customers better and increase their revenue.
Apache Spark – A newer big data technology that can run certain workloads up to 100 times faster than Hadoop MapReduce, saving both time and money. Companies are moving to Spark to solve big data problems.
IoT – The Internet of Things (sensors, smart machines, connected devices, etc.) can generate a huge amount of real-time data that helps companies find essential information about their customers. This information helps them build customized strategies.
According to a prediction by Gartner, “By 2020, information will be used to digitalize or eliminate 80% of business processes and products from a decade earlier, and by 2017, more than 30% of enterprise access to broadly based big data will be via data broker services, serving context to business decisions.”
Another Gartner prediction states that by 2017, more than 20% of customer-facing analytic organisations will provide product-tracking information leveraging the IoT.
These are the recent trends through which big data can be monetized. Monetizing big data is one of the best ways to increase revenue or EBITA.
Data monetization means generating more and more profit from big data. Today, every business owner is trying to focus on big data to extract value from it. This value helps them make business strategies and increase ROI. By analysing the data, organizations come to know more about their customers and vendors, their behaviour, patterns, etc. This information helps them shape their strategy according to the needs and demands of the customer and effectively serve all stakeholders better.
In July 2015, an Economist Intelligence Unit survey polled 600 CEOs around the world, across different sectors, about the use of big data in their organizations. It showed that 60% of CEOs across the globe now use data analysis to govern their decision-making process. 75% of them were ready to accept that their organization should be data-driven. 90% said their decisions over the last three years were better when made after analysing the different relevant information. 42% of CEOs were facing difficulties while dealing with unstructured data. This survey result makes clear just how important big data is.
Every company is trying to monetize data in different ways. Here are seven ways to look at data monetization.
Look at your business – various data sources – Take a deep breath and find all the sources of data within your organization. There is always some data that goes unused for various reasons, such as being highly unstructured, but it can still yield value if you are capable of analysing it.
Search for potential applications of the data – Let us leave no data source unturned. Technology has evolved a lot, and there are various ways and means to convert any structured or unstructured data into something meaningful. So look at the data sources and try to improve the way the business runs, customers are treated, vendors are considered, and so on.
Assemble the right team – For analysing data you need a well-organised, skilled team. Monetizing the data mostly depends on the analysis results and the ability to generate insight out of data. So get the right people on board.
Look at your company size – For monetizing big data, the size of your company also matters. If your team is small, look for a vendor who can help monetize the data.
Analytic skills – Your team needs good analytic skills, because big data is not just about the size of the data; it is all about converting the data, identifying hidden patterns and leveraging them for decision support. Experience with different big data tools and technologies for data analysis is also an important part of monetizing big data.
Look for the patterns – Identify and prioritize areas to monetize. Here patterns refer to the types of data and the hidden patterns within them. While analysing data, similar patterns should be analysed in separate passes; this can save time and effort, and you can expect more benefit from the data.
Leverage internally or sell the data to external parties – Consider which role your company plays in big data analysis: a consumer of data, an aggregator, or a creator of new data products. Whichever suits your organization better, go for that; it will help a lot in monetizing big data.
Data by its nature may not be helpful unless processed. Data in a proper context makes sense and provides information: “Rs 10” alone does not mean much, but “Rs 10 for a dosa in Kerala in 2015” makes sense and gives information. The same story applies to organizational data: unstructured data by itself will never help you make a profit, but putting it in a proper context can help monetize it.
Anything and everything we do in this connected digital world, be it online shopping, liking on Facebook, responding to social events, adding new friends on social media, blogging, tweeting, sharing product reviews, etc., leaves a trace of data about us as consumers. Extracting this huge volume of data, identifying business value in it and using it for decision making is the whole intent of big data analytics.
Everyone agrees that big data means big insight and big decision support; no one denies that today. The challenge is processing big data, and there is no single way or solution for doing so. Usually, big data is processed in a big data lab: a dedicated development environment and IT infrastructure for experimentation, equipped with the big data tools, algorithms, technologies and approaches needed to process big data and run analytics.
If you are going to set up a big data lab, it is good to understand some of the potential challenges ahead. Here are some quick questions to ask yourself:
- Why do we want to set up a big data lab?
- How much data do we have?
- What data formats are we planning to process?
- Which type of storage system do we need?
- Do we have the right set of people for setting up a big data lab and analytics?
- Can we bear the cost of all this?
If you have answers to all the above questions, well and good. But you will still face some more challenges while setting up a big data lab:
Cost is often more than expected – Cost will be the biggest issue when setting up a big data lab; you will need a large amount for the whole setup. If you want to build storage for the data, you will need expensive hardware capable of managing a large amount of data. If you plan to go with Amazon Web Services, you will still have to pay, per minute, per application, etc. Different tools and technologies will be needed for the different operations during data processing, and you must hire experts such as a data architect, designer, data analyst, business analyst and more to handle the various operations. All of this requires a large investment.
Storage capacity – Working with big data is quite often a big challenge. You need new, flexible tools for processing big data because of its variable size and complexity. We can break big data into the three Vs: volume, velocity and variety. While setting up the big data lab, we need to consider all three. Volume means the amount of data; the lab must be set up so that this variable amount of data can be stored easily. Velocity refers to the rate at which data arrives. Variety refers to the different types of data. The big data setup should be able to handle data with all these characteristics.
Data ingestion and processing – Data processing needs appropriate technology for data integration and development. We cannot handle all data processing with the same tools and the same people; each time we may need different tools and techniques to analyse the data, and hence different skill sets as well. We need to be aware of available tools and upcoming technologies, so covering the tools-and-technology side is a big challenge when setting up a big data lab.
Analytics tools and expertise – The next step in big data processing is analysis. Analysts always need to do deep-dive analysis to improve knowledge of the customers’ requirements and shape business strategy. We need data analysts for this purpose; their work is to analyse the data and find valuable patterns in it. If you have the right analytic tools (SAS, R, Matlab, Excel, Tableau, etc.) in your lab, it is easy for the data analyst to analyse the data. If you are a start-up or a medium-sized team, managing these tools, their licence costs and the right people for the work in your big data lab is a challenge.
Visualization tools and experts – The analysis report, the final analytics outcome, needs to be presented in a dashboard or report, or visualized as charts, graphs, etc. so that the client can easily interpret it. A visualization expert works on this step, converting mathematical values into simple visual form. You should have good visualization experts and tools to perform this task.
Ready to face all these challenges? Yes? Then go for it; otherwise, think again.
Though there can be many challenges in setting up a big data lab, it is always important to remember that the right data can help you make intelligent, smart decisions, and big data can be the source of this great insight.
We have always wanted to make our world a better place to live. Big data is among the things changing the way the world works today, how business runs and how social services are delivered. Effectively, it helps improve social well-being.
Reports state that 2.5 quintillion bytes of data are created every day, containing information about millions of people. Governments gather an increasing amount of data day by day, and the question often is: are we converting it into better governance and capacity building? Do we leverage these data for improved social delivery?
Today, there is increasing pressure on governments to convert big data into actionable information for social delivery. We are in a world where every social organization can improve its ability to leverage big data and data science for increased social delivery.
Every aspect of our life is affected by big data in one way or another, and some of these effects make a remarkable difference. Let us discuss a few of them and how big data can help improve social delivery.
Improve healthcare – The industry has improved a lot, whether in predicting epidemics, finding solutions for deadly diseases, improving quality of life or avoiding preventable deaths, and big data and analytics play a crucial role behind these improvements. Today, people are aware of the health potential of wearables. Big data captures what they track: sleep, eating, mood, physical and emotional health. Once generated, that data can be fed back to the users who generated it, creating personalized insights and health tips. Big data analytics is also being used to decode entire DNA strings to find and understand disease. In the early days, clinical trials were limited to small sample sizes, but analysis can now run on a wide range of data.
Transportation – Government authorities have started to realize the fruits of big data and analytics. They redesign city infrastructure and public transport systems to encourage people to walk and cycle more rather than drive cars in the city. Often, people in cities are suggested alternative routes and appropriate driving times. How do authorities provide this information? GPS systems track public transport, and traffic signals predict traffic volumes.
Improve social security – Effective social security is not just a product that can be built, used and improved once. Big data is used to improve security enforcement: various national security agencies leverage big data and analytics to find terrorists’ and criminals’ patterns of crime. Big data is also being used to detect and prevent cyber-attacks, for fraud detection, and so on.
Improvement of cities – Everyone talks about smart cities. With the help of data about a city, governments plan different development schemes. How much money should be spent, and in which area? These questions are being answered by big data and analytics. The good part is that a number of cities around the world have started exploring big data analytics on the road to becoming smart cities.
Trading – Big data has vital uses in trading, especially in capital markets. It is widely used for analysing fraud patterns and minimizing fraudulent trading. By analysing different patterns and combining various data points, experts predict the rise and fall of the financial market. Several real-time and near-real-time reports on trading and operations are also being generated using big data and analytics.
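To give a tiny flavour of the fraud-pattern analysis mentioned above, the sketch below flags transactions far from an account's typical amount using a z-score. The amounts and the threshold are invented; real fraud systems combine many such signals.

```python
from statistics import mean, stdev

# Hypothetical transaction amounts for one account; the last one is unusual.
amounts = [120.0, 90.0, 110.0, 105.0, 95.0, 2000.0]

def flag_outliers(values, threshold=2.0):
    """Return values whose z-score against the sample exceeds the threshold."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

print(flag_outliers(amounts))  # only the 2000.0 transaction stands out
```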
Retail – Big data applications in retail often enable optimized pricing, and for that, retailers are constantly finding innovative ways to draw insights from online and offline data sources. They are embracing big data and analytics to understand their customers and match them to products or services, eventually converting them into dollars in one form or another. Retailers analyse various data points (sales, pricing, social, economic and many others) to find hidden patterns and serve their customers better.
Personal life – Big data and analytics affect every social touch point of a customer: the way we think, behave and act is being monitored and analysed for improved customer service. Many dating and matrimony sites use big data to analyse listing characteristics, behaviour and reactions to find a perfect match, and they also use these data to improve their matching algorithms.
Exploring big data and analytics for social delivery is rapidly evolving and offers tremendous opportunities when applied. It can help governments drill down deeper into the opportunities of big data and uncover powerful, effective methods for optimizing governance.