tag:econsultancy.com,2008:/topics/data-analytics Latest Data & Analytics content from Econsultancy 2017-08-17T13:00:00+01:00 tag:econsultancy.com,2008:BlogPost/69345 2017-08-17T13:00:00+01:00 2017-08-17T13:00:00+01:00 An introduction to customer research (a 3,000-word guide) Nicholas Villani <p>Of course, when it comes to conducting customer interviews, sending out surveys and collating insights, method overrules madness. Employing some fundamental principles and ground rules will ensure that you are making the most of this opportunity, and not leading yourself or your customer astray.</p> <p>In this article we will explore: </p> <ul> <li><strong>1. Guerrilla research</strong></li> <li><strong>2. Qualitative research</strong></li> <ul> <li>a. Customer research</li> <li>b. Focus groups</li> <li>c. Ethnographic research</li> </ul> <li><strong>3. Quantitative research</strong></li> <ul> <li>a. Customer surveys</li> <li>b. Website/social data</li> <li>c. CRM data</li> </ul> </ul> <h3>1. What's this guerrilla research thing all about?</h3> <p>Let’s get this one out of the way first. The principles of guerrilla research, originally adopted by <a href="https://www.econsultancy.com/blog/69268-a-day-in-the-life-of-consumer-product-manager-at-trustpilot">product managers</a> and <a href="https://www.econsultancy.com/blog/68842-a-day-in-the-life-of-a-user-interface-designer">UX designers</a>, have started to gain traction across other disciplines. This method encourages one to just get out, hit the pavement, and start asking questions. I love the agility of this principle, and the concept that it doesn’t require a massive budget or months of planning to really start finding out what your customer thinks, but it’s important to highlight a few inherent risks.</p> <p>Firstly, in the process of conducting guerrilla research, it’s still important to have some structure if you want to collect data that is useful. Standardising questioning is key.
If you change your set of questions between each interview, then you have no benchmark to measure against.</p> <p>Secondly, guerrilla research is only effective if you target people who are in your core demographic, or if your value proposition is designed to be general enough to appeal to everyone. For example, asking Sue in accounts, who is in her mid-50s, if a piece of video content you’ve created will appeal to Gen Z males is only going to provide opinion, not actual insight.</p> <p>Lastly, ad hoc guerrilla research can occasionally lead you astray. One person’s opinion in a coffee shop is not necessarily going to be indicative of your wider target market. Randomly approaching people without fully briefing them on the process can strongly influence the way they respond to questioning. For example, if you thrust a product or a campaign creative in front of someone at random, there is a strong chance that rather than offering constructive feedback, they will ‘save face’, which can also lead to false insights.</p> <p>With all of this said, you can start guerrilla research immediately if you choose to, and that can often be better than nothing. At the end of the day, if you are talking to real people, then you are on the right track.</p> <p><img src="https://assets.econsultancy.com/images/0008/8316/high_street.jpg" alt="street" width="615" height="439"></p> <p><em>Hit the pavement and start asking questions = guerrilla research</em></p> <h3>Qualitative or quantitative?</h3> <p>Broadly speaking, when it comes to customer research it helps to collect both qualitative and quantitative data to paint a full picture. There are many ways to collect insights, and my general guidance for those starting out is to first decide what you are trying to determine, then select the research methods that will best help you on that path.</p> <p>Don’t try to do everything.
One method from each bucket is generally enough to give you the guidance you need, especially if you are just starting out.</p> <h3>2. Qualitative</h3> <p>There are several qualitative research techniques, including customer interviews, focus groups and ethnographic research. All three can be highly effective in their own right, and the correct method depends on your time, budget and overall goals.</p> <h4>a. Customer research</h4> <p>Customer interviews are one of the quickest, yet most effective ways of generating insights. This is a chance to sit down with someone who is either not yet a customer, is currently a customer, or is no longer a customer of your business, and whilst they can seem relatively straightforward, there are some key factors to consider.</p> <p>Firstly, ensure you have a script. Informal questioning can be messy. Be clear about exactly what you want to know, and restrict your questioning to focus on that. It’s okay to dig deeper into something using the ‘but why?’ technique, but generally, try to keep your questioning on track. In particular, if you intend to interview several customers, this consistency ensures that you can compare responses.</p> <p>I typically get asked how many interviews one should conduct, and despite this being a ‘how long is a piece of string?’ question, I’d suggest you stop asking when you know what the answer is going to be. Once you’ve detected a common trend, then you generally have enough information to make an informed decision.</p> <p>Secondly, and most importantly, the biggest risk with qualitative interviewing, and a downfall I’ve witnessed countless times, is the interviewer, sometimes inadvertently, leading the customer to validate decisions they’ve already made. It sounds obvious, but it’s surprisingly difficult not to show bias during qualitative interviews, particularly if you are close to the subject being discussed.
For this reason, it can help to employ someone impartial to conduct the interviews, or to vet your script with someone else prior to the interview and then make sure you don’t stray from it.</p> <p>Thirdly, when it comes to customer interviews, try to ensure you ask open questions. Anything that elicits a yes/no response is a waste of time for both you and the interviewee. Try to phrase your questions using pretexts of ‘what if’, ‘how’ and ‘why’. Below are some examples of good and bad questioning: </p> <ul> <li> <strong>Bad</strong>: Do you like this campaign creative? <strong>Good</strong>: How does this campaign creative make you feel?</li> <li> <strong>Bad</strong>: How many times do you anticipate you would visit our website each week? <strong>Good</strong>: Describe your online behaviour and what would make you visit our website.</li> <li> <strong>Bad</strong>: Do you like using the product? <strong>Good</strong>: What types of product do you like to use?</li> </ul> <p><img src="https://assets.econsultancy.com/images/0008/8320/dachshund-672780_1280.jpg" alt="dog on a lead" width="615" height="407"></p> <p><em>Leading questions are a no-no</em></p> <p>The final principles of qualitative interviewing come down to common sense.</p> <ul> <li> <strong>Respect the interviewee’s time</strong>. If you run out of time, end the interview.</li> <li> <strong>Try to conduct interviews face to face</strong> where possible, or at a minimum via video conference, so you can also observe body language.</li> <li> <strong>Use some form of predefined template</strong> to record and track your responses. A tool like Google Forms can be super useful, even if just for yourself to fill out the answers as you go, as it will collate multiple interviews into one spreadsheet automatically.</li> <li> <strong>Select interviewees who are representative</strong> of your core segment.</li> </ul> <h4>b.
Focus groups</h4> <p>Conducting focus groups follows a similar principle to what’s outlined above, but again with several key considerations. As conducting a focus group is generally a significant investment in time and resource, it’s important to ensure effectiveness.</p> <p>The optimum focus group size in my experience is six to eight people. This ensures that the group is manageable, yet also insightful. When selecting participants, try to choose those who are demographically different, but share an opinion about your product or service.</p> <p>Effective facilitation is critical when it comes to conducting a focus group, and a good facilitator is worth their weight in gold. It’s important to make sure that everyone has the chance to have their voice heard, but that you also keep to the agenda whilst drawing out real insights.</p> <p>If conducting a focus group, I’d strongly recommend using gamification where possible. Asking participants to sort a series of cards into a perceived order, or sketch on a pad, can often be far more effective than long-winded conversations. Similar principles to one-to-one customer interviews also apply here:</p> <ul> <li>Keep to time</li> <li>Don’t use leading questions</li> <li>Try to elicit open responses</li> <li>Avoid introducing bias</li> </ul> <p>Lastly, it can be very useful to have a second facilitator in the room for a focus group, primarily to act as a scribe. Over the course of two hours you are likely to unearth a lot of information, and noting this down and categorising it appropriately as the session is underway is much more efficient.</p> <p><img src="https://assets.econsultancy.com/images/0008/8318/focus_groups.jpg" alt="focus group" width="500"></p> <p><em>Try to gamify focus groups</em></p> <h4>c. Ethnographic research</h4> <p>Ethnographic research is, in many ways, the quickest and simplest technique for qualitative research, as it involves just observing potential or existing customers from a distance.
The most critical factor when conducting ethnographic research is that you should not interfere with the participant at all, as to do so introduces bias. It’s also often better if you can conduct this research without the person knowing that their behaviour is being observed, but this is not always possible, or legal for that matter.</p> <p>Whether you are watching someone interact with your website, an app or even a physical store, you want to ask the following questions:</p> <ul> <li> <strong>Which path did they follow</strong> to get where they wanted to go, and what touchpoints did they interact with?</li> <li> <strong>Were there any specific pain points</strong> or barriers that slowed them down?</li> <li> <strong>Are there specific behavioural patterns</strong> that you can observe when watching repeated visits or interactions?</li> <li> <strong>Does the customer show any specific emotional response</strong> throughout the journey, or do they remain impassive? </li> </ul> <p>After the observation session has taken place, a debrief interview can provide further insights, and the rules outlined earlier should once again be followed. Use questions such as ‘why did you do that?’ or ‘how did it make you feel?’ to really dig out valuable insights. Once again, when deciding how many observation sessions to conduct, you should run as many as it takes until you can detect a similar pattern or trend. </p> <h3>3. Quantitative</h3> <p>Quantitative research is much better suited to testing specific hypotheses or validating assumptions. There are several sources one can use to collect quantitative data, the most popular being customer surveys. Additionally, first-party data sources such as owned websites, social pages and CRM systems can also be fantastic places to gather quantitative insights.
Let’s discuss each.</p> <h4>a. Customer surveys</h4> <p>Often overused, rarely well thought out, the customer survey is a tricky beast which requires careful planning and distribution to have any chance of being useful. There are a plethora of free and paid survey tools out there, so the first consideration is which platform to use. For basic surveys, I would suggest that a tool like Google Forms will suffice. If you wish to perform more advanced data mining and analysis, or have complicated logic you need to integrate, then tools such as SurveyMonkey or SurveyGizmo are worth investigating.</p> <p><img src="https://assets.econsultancy.com/images/0008/8319/g_forms.jpg" alt="google forms" width="615" height="314"></p> <p><em>Google Forms</em></p> <p>There are many ways to distribute a customer survey, whether you are emailing a database, promoting through paid media, or even collecting responses in person (such as at a trade show). Whichever you choose, the survey should adhere to a few basic principles:</p> <ul> <li> <strong>Less is more</strong> – no-one, I repeat, no-one, likes to wade their way through a 100-question survey. Only ask what you absolutely need to know.</li> <li> <strong>Use quantitative question types</strong>, like multiple choice, as much as possible. ‘Free text’ questions are difficult to collate and review.</li> <li>Making specific questions compulsory is fine, but again, avoid doing this for ‘free text’ questions, unless this information is critical.</li> <li> <strong>Clearly set expectations</strong> at the start of the survey. If you say it will take 5 minutes, then it should not take more than 5 minutes. Also ensure you include a progress tracker.</li> <li> <strong>Be explicit</strong> about how you will use participants' information. If you don’t need it, don’t collect it.</li> <li>If sending the same survey out to several different groups of people, ensure you include an indicator so you can segment the data.
Alternatively, use time stamps, geolocation or unique URLs that lead to the same survey</li> <li> <strong>Incentives can skew results</strong>. If the incentive is too large, then people will just blindly click their way through the questions rather than actively participating</li> <li>If time and the platform permit, use logic to help make the whole survey as frictionless and quick as possible. There is nothing more infuriating than answering ‘no’ to one question only to be bombarded with another series of questions that are completely irrelevant</li> <li> <strong>Optimise the survey for mobile</strong> (i.e. can you see all of the possible answers without needing to scroll?)</li> </ul> <p>Aim for at least 100 responses to your survey, across your chosen segment. This will typically be enough for the results to be broadly representative. As always, some careful planning and consideration before the survey is distributed can save you hours of work when crunching the data. </p> <h4>b. Website/social data</h4> <p><a href="https://econsultancy.com/training/courses/google-analytics/">Google Analytics</a> is an incredibly powerful, free tool that is often overlooked by teams outside of marketing, particularly when it comes to better understanding your customers. There is no better source of quantitative data than customer traffic to and around your website. If you are also in the process of developing a mobile app, it’s well worth looking at Firebase, which will consolidate reporting into one dashboard.</p> <p>By setting up Google Analytics properly, you can analyse how users navigate through the website, which is particularly valuable when optimising the customer journey.
Furthermore, you can examine where traffic to the website is coming from, which is important for understanding how customers find you.</p> <p>In addition to your owned web properties and apps, it can often be extremely helpful to explore more generally what people are saying about your brand, product or category online. There is a plethora of free social listening tools which can help with this. Whilst this is often utilised to define content strategy, it is also a super useful process for product or campaign development.</p> <p>At its simplest, Google Trends allows you to view real-time, indexed search data on your chosen topics, which can be broken down over specific timescales and filtered geographically. Other useful tools I've come across include Social Mention and Answer The Public, which each offer a slightly different perspective, with some clever visualisation features.</p> <p><img src="https://assets.econsultancy.com/images/resized/0003/1802/exercise_bacon-blog-full.png" alt="google trends" width="615" height="287"></p> <p><em>Google Trends</em></p> <h4>c. CRM data</h4> <p>Your <a href="https://www.econsultancy.com/blog/68769-what-s-the-difference-between-crm-marketing-automation-and-dmps">CRM</a> or any sales data you collect is a fantastic resource for insights. It goes without saying that this will only be as robust as you make it, so investing time to ensure you have set up the CRM properly is vital. Ask yourself what data is worth collecting and what you will use it for. Looking for trends based on customer locations or temporal patterns could lead to some interesting insights, as could looking at commonality in the path to purchase or the behaviour of your most valuable customers. </p> <p>Using your CRM for customer research takes time and planning, but is well worth the effort.
Often CRM systems are only accessed by the sales department, but they are an incredibly powerful tool for the rest of the organisation, particularly when it comes to getting a better understanding of your customers, their pain points and where you can potentially create additional value. </p> <p>Even if you are a small business, there are many benefits to having a CRM system in place, and a host of amazing cloud-based solutions out there with reasonable pricing schemes. It's hard to go past Salesforce for overall functionality and flexibility, but if you are a small business using the Google suite of tools, then Prosperworks is well worth checking out.</p> <p>Keep in mind that your own customer data will help you segment and understand who is buying and using your products or services. It can also help you look for commonality between lapsed customers, which can be used to inform marketing or product decisions.</p> <h3>In conclusion</h3> <p>There are no hard and fast rules when it comes to conducting customer research, but what I have outlined here are some of the basic principles that I have found useful. It's not exhaustive, so if you have any tips of your own, feel free to add them in the comments.</p> <p>The best advice I can give you is to use what you've got, but make sure you get out and talk to real customers. Assuming you know what they think is the biggest mistake a business can make.</p> <p>And lastly, remember that if you are setting out to conduct customer research, it’s important that you find the truth, not just validate what you already think.</p> tag:econsultancy.com,2008:TrainingDate/3243 2017-08-16T04:20:51+01:00 2017-08-16T04:20:51+01:00 Mastering Customer Experience (CX) Management in the Digital Age <p>This 1-day intensive course is designed to give you a holistic understanding of the mind of today’s customer and delivers highly effective strategies to attract and retain them.
This new and highly relevant course gives you the edge you need to be successful in today’s complex business environment.</p> <p>Quite simply, without customers you don’t have a business. Winning customers today has become a lot more complicated as people have changed the way they buy goods and services.</p> <p>Research indicates that typically 80% of your business comes from 20% of your loyal customers.</p> <p>But in today’s customer-controlled world, earning loyalty is a real challenge. This is because we are dealing with a very smart and discerning customer who is looking for immense value, has very high expectations and hyper-researches everything.</p> <p>Yet innovative and smart businesses have created customer experience formulas that work extremely well for them. Zappos, Disney, Airbnb, Virgin, Starbucks, Nordstrom and HubSpot, among others, continue to deliver amazing experiences and drive sales.</p> <p>This unique and insightful course teaches you how successful companies design and deliver amazing customer experiences.
It gives you an insider view and highly effective tips and tricks to deliver amazing experiences at every brand touchpoint to win and retain your customers.</p> tag:econsultancy.com,2008:TrainingDate/3242 2017-08-16T04:05:45+01:00 2017-08-16T04:05:45+01:00 Proving Digital ROI <p>A one-day workshop which will demystify the concept of ROI (return on investment) by teaching participants the key metrics, calculations, and techniques for reporting marketing performance to management.</p> tag:econsultancy.com,2008:WebinarEvent/898 2017-08-14T13:34:05+01:00 2017-08-14T13:34:05+01:00 Measurement & Analytics: Trends, Data and Best Practice <p>Econsultancy's Trends Webinar for September 2017 looks at emerging trends, case studies and the state of Measurement &amp; Analytics.</p> <p>This insight will come from Econsultancy's own research along with collated third-party data and statistics, hosted by our in-house research analyst, Sean Donnelly.</p> tag:econsultancy.com,2008:BlogPost/69319 2017-08-09T01:00:00+01:00 2017-08-09T01:00:00+01:00 Analytics approaches every marketer should know #4: Prescriptive analytics Jeff Rajeck <p>To recap, we have covered three types of analytics already, each of which has its own characteristics, methodologies, and best practices: </p> <ul> <li> <a href="https://econsultancy.com/blog/69298-analytics-approaches-every-marketer-should-know-1-descriptive-analytics/"><strong>Descriptive analytics</strong></a> - summarizes what happened in the past</li> <li> <a href="https://econsultancy.com/blog/69300-analytics-approaches-every-marketer-should-know-2-diagnostic-analytics/"><strong>Diagnostic analytics</strong></a> - attempts to determine why important events happened</li> <li> <a href="https://econsultancy.com/blog/69308-analytics-approaches-every-marketer-should-know-3-predictive-analytics/"><strong>Predictive analytics</strong></a> - predicts new data points from existing data which would be difficult or impossible to get otherwise.
</li> </ul> <p>In some ways, prescriptive analytics achieves what marketers might expect from predictive analytics. That is, prescriptive analytics recommends what actions we should take in the future.</p> <p>Before we begin, though, we'd like to let you know that Econsultancy are running an Advanced Mastering Analytics session in Singapore on Tuesday, August 15th. You can <a href="https://econsultancy.com/training/courses/advanced-mastering-analytics-training-singapore/dates/3232/">find out more details and book your spot at this link</a>.</p> <h3>What is analytics?</h3> <p>In our previous posts on analytics, we defined analytics as a practice, a process, and a discipline whose purpose is to turn data into actionable insight.</p> <p>With prescriptive analytics, the focus is as much on the <em>action</em> as on the insight.</p> <h3>Prescriptive analytics overview</h3> <p>Paradoxically, understanding the complex area of prescriptive analytics starts with a simple question. In marketing, why do we do what we do?</p> <p>Dispensing with trivial answers to that question, most marketers do what they do because, relying on experience and reflection, they believe it is the right thing to do. For example, a food delivery service may notice that their customers tend to buy more often on weekends. A seasoned marketer would then recommend that the company offer discounts mid-week to boost business.</p> <p><img src="https://assets.econsultancy.com/images/0008/8138/1.png" alt="" width="800" height="480"></p> <p>With the right data, marketers can automate the process of noticing a consumer's behaviour and sending an offer. Additionally, they can look for other behavioural patterns which can also be influenced with appropriate actions.
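The weekend-buyer example above can be expressed as a simple decision rule. Below is a minimal, hypothetical sketch in Python; the function name, the day labels and the "mostly weekend" threshold are all illustrative assumptions rather than a description of any real system.

```python
# Hypothetical sketch: map observed ordering behaviour to a recommended
# marketing action, as in the food delivery example above.
from collections import Counter

WEEKEND = {"Sat", "Sun"}

def recommend_action(order_days):
    """Given the weekdays on which a customer ordered, recommend an action."""
    counts = Counter(order_days)
    weekend_orders = sum(counts[d] for d in WEEKEND)
    weekday_orders = sum(c for d, c in counts.items() if d not in WEEKEND)
    # Rule (an assumption for illustration): customers who order mostly at
    # weekends get a mid-week discount to stimulate off-peak business.
    if weekend_orders > weekday_orders:
        return "send_midweek_discount"
    return "no_offer"

print(recommend_action(["Sat", "Sun", "Sat", "Wed"]))  # send_midweek_discount
print(recommend_action(["Mon", "Tue", "Wed"]))         # no_offer
```

In a real deployment, a rule like this would be evaluated automatically against CRM or order data, and the resulting action fed into an email or offer system.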
</p> <p>Finding the right data and devising the rules to recommend appropriate actions is the essence of prescriptive analytics.</p> <h3>An example of prescriptive analytics</h3> <p>Perhaps the most well-known example of prescriptive analytics is a recommendation engine.</p> <p>Recommendation engines use personal data sourced through descriptive analytics (e.g. pages viewed and items purchased) and predictive analytics (in-market segments or inferred demographics) to suggest products that a particular individual might be interested in.</p> <p>For recommendation engines, prescriptive analytics means deciding which data to use and how to process that data using decision logic to produce a tangible action. For a recommendation engine, that action is recommending the products.</p> <p><img src="https://assets.econsultancy.com/images/0008/8139/3.png" alt="" width="800" height="356"></p> <h3>The distinguishing features of prescriptive analytics</h3> <p>Analytics experts disagree about what distinguishes predictive from prescriptive. Both methods use past data and algorithms in order to make an educated guess about something which is not known.</p> <p>For marketers, however, the differences are arguably more apparent.</p> <p>For example, one notable difference is the desired result of the practice. Predictive analytics may only discover new data, whereas prescriptive analytics must produce a recommended action.</p> <p>Prescriptive analytics may also deliver more than one recommended action. Should it do so, the algorithm should also be able to rank them, so the most appropriate is considered or viewed first.</p> <p>Also, unlike predictive analytics data output, prescriptive recommendations must be tested against a desired business outcome. The results of a prescriptive algorithm cannot be 'eyeballed' like predictive data to determine whether they are correct.
Recommendations, for example, can only be validated by analysing whether the consumer clicked on the recommended product.</p> <p>Because of this condition, prescriptive analytics also requires a feedback mechanism which validates recommendations and improves them over time. How to do this is beyond the scope of this post, but interested readers can find more info in one of the many books on 'recommender systems' or read: <a href="https://econsultancy.com/blog/69112-what-s-the-difference-between-ai-powered-personalisation-and-more-basic-segmentation/">What's the difference between AI-powered personalisation and more basic segmentation?</a></p> <p><img src="https://assets.econsultancy.com/images/0008/8140/4.png" alt="" width="800" height="360"></p> <p>So, to make a clear distinction between predictive and prescriptive analytics: if a marketer is characterizing data, then they are <a href="https://econsultancy.com/blog/69228-predictive-analytics-four-prerequisites-of-an-effective-strategy/">doing predictive analytics</a> and should focus on producing correct data.</p> <p>If they are trying to deliver business value through recommendations, then they are doing prescriptive analytics and need to be more concerned with actual results such as conversions or revenue.</p> <h3>How to do prescriptive analytics</h3> <h4>Engineer defining events</h4> <p>To get started with prescriptive analytics, review your website, app, or product and look for opportunities to leverage the platform to learn more about your customers.</p> <p>For example, an ecommerce site which has multiple product categories could create mini-sites which not only feature related products, but some relevant content as well. You could then assume that someone visiting a toy-themed section of the site is in-market for toys.
Should they also read a post about infant care, then it is quite likely that they are a new parent.</p> <p>Those who trigger the defining events then need to be categorized into segments so that they can receive the right recommendations.</p> <h4>Devise recommendations with a measurable action</h4> <p>Then, you need to identify an action which, if taken, will validate the predicted category. This could be making an offer, requesting personal information or suggesting content (which requires a click).</p> <p>For the new parent mentioned above, the measurable action may be asking for personal data in return for a free sample of a product. For someone in-market for a package holiday, offering various travel-related content, and measuring clicks, may be sufficient.</p> <h4>Implement feedback mechanisms</h4> <p>The results of the suggested measurable actions should then be reviewed to see whether your defining event is useful for categorising consumers.</p> <p>That is, if relevant offers are not taken up, then your defining event is probably not creating an in-market segment for the product.</p> <p><img src="https://assets.econsultancy.com/images/0008/8141/2.png" alt="" width="800" height="249"></p> <h4>Measure business impact</h4> <p>In addition to correctly identifying prospects, prescriptive analytics should also drive profitable action.</p> <p>For a publisher, those who are identified as in a particular segment should click on relevant ads more.
Shoppers who are supposedly in-market should, ultimately, not just click on offers but should buy more.</p> <p>While delivering business value may be a longer-term goal, thinking about how to measure return on investment (ROI) at the start of the project is always recommended.</p> <h4>Review data at each stage to improve results</h4> <p>With prescriptive analytics, there are three points where you will have data to help you improve:</p> <ul> <li> <strong>Categorisation</strong> - are people engaging with your defining event?</li> <li> <strong>Validation</strong> - did those who engage also validate their category with the measurable action?</li> <li> <strong>Monetisation</strong> - are you able to encourage your prospects to do more business with you? </li> </ul> <p>While there are no fixed rules for how effective each stage must be, using a control group which has not been exposed to the prescriptive analytics will give you a relative indication of whether your programme has been successful.</p> <h3>Prescriptive analytics best practices</h3> <h4>Start with a commercial tool</h4> <p>As prescriptive analytics is quite complex, it is best to start with a vendor so that you become familiar with the options available. With guidance, you will also have a better chance of producing successful recommendations and personalisations.</p> <p><a href="http://www.adobe.com/uk/marketing-cloud/target/automated-personalization.html">Adobe</a>, <a href="https://www.ibm.com/us-en/marketplace/real-time-personalization">IBM</a>, and now <a href="https://www.google.com/analytics/optimize/">Google</a> have industry-leading products for brands with a sizable budget.
For smaller marketing departments, there are countless independent personalisation and recommendation engine vendors who will typically provide services along with their products.</p> <h4>Provide the next-best-action, not just the next-best-offer</h4> <p>Marketers traditionally focus on the moment when a customer is about to buy and, consequently, only deliver offers to encourage a purchase.</p> <p>With digital marketing, however, marketers can do much more than that. As discussed in a <a href="https://econsultancy.com/blog/69229-how-digital-transformation-can-revolutionise-marketing/">previous post</a>, data from digital marketing can also identify behaviour which indicates that a customer has hit an 'inflection point' in the buyer's journey. That is, they are poised to move from, say, just being aware of a need to being interested in a particular product.</p> <p>The wider opportunity, then, for prescriptive analytics is to find these inflection points and provide information which moves a consumer along the journey in the brand's favour, also known as next-best-action marketing.</p> <p>The next-best-action may be additional product information, an opportunity to speak with a subject matter expert, or even a strategically-placed 'buy now' button to circumvent a normally long buying cycle.</p> <h4>Try many strategies</h4> <p>One issue which marketers have with prescriptive analytics is that they are motivated to use it for a particular product line or customer journey. Which recommendations will work for customers, however, is difficult to determine from the outset.</p> <p>For that reason, many different approaches should be tried before settling on a prescriptive analytics strategy. For example, a brand may find that website personalisation and recommendations don't work, but that dynamic audience segmenting delivers superior return on ad spend (ROAS).</p> <h3>So...</h3> <p>Through this four-part series, we have covered a broad range of analytics.
We started with the simplest and most commonly-used analytics, descriptive, and ended here with perhaps one of the most advanced topics in marketing, prescriptive analytics.</p> <p>Through numerous examples, how-to guides, and best practices we hope that readers now have a better idea of what is meant by the term 'analytics' and the goals of each methodology.</p> <p>The most important takeaway, though, is that marketers should now be able to distinguish each type of analytics and ensure that they follow the appropriate guidelines.</p> <p><em>Descriptive</em> analytics needs to provide a clear and simple representation of complex data. <em>Diagnostic</em> analytics should produce a root cause or short list of contributing factors to an issue. <em>Predictive</em> analytics aims to deliver an algorithm which produces high-quality new data points. And <em>prescriptive</em> analytics must produce both a recommendation which drives profitable action and a validation method.</p> <p>Regardless of which analytics you use, though, there are countless additional books, reference guides, and blog posts to help you on your way.
A few which were used in writing these posts are listed below.</p> <h3>References</h3> <ul> <li><a href="https://www.amazon.com/Business-Analytics-Practitioners-International-Operations/dp/1461460794">Business Analytics: A Practitioner’s Guide</a></li> <li><a href="https://www.amazon.com/gp/search?index=books&amp;linkCode=qs&amp;keywords=9781466591660">A User's Guide to Business Analytics</a></li> <li> <a href="http://www.vlamis.com/blog/2015/6/4/the-four-realms-of-analytics.html">The Four Realms of Analytics</a>, Tim Vlamis</li> <li> <a href="https://community.lithium.com/t5/Science-of-Social-Blog/Big-Data-Reduction-1-Descriptive-Analytics/ba-p/77766">Big Data Reduction</a> (Parts 1, 2, and 3), Michael Wu, Chief Scientist at Lithium Technologies</li> </ul> tag:econsultancy.com,2008:BlogPost/69305 2017-08-07T10:09:27+01:00 2017-08-07T10:09:27+01:00 A day in the life of... Pre-Sales Technical Manager at RedEye Ben Davis <p>Before we find out, remember, if you're looking for a new role you can check out the <a href="https://jobs.econsultancy.com/?cmpid=EconBlog">Econsultancy jobs board</a> or test your skills with our <a href="https://www.econsultancy.com/training/marketing-readiness">Modern Marketer Quiz</a>.</p> <h3>Please describe your job: What do you do?</h3> <p>My main responsibilities revolve around ensuring our clients' data is transferred, processed and stored in the best way possible to ensure they can extract the maximum effectiveness from our <a href="https://econsultancy.com/blog/68952-a-recipe-for-the-martech-layer-cake/">customer data platform</a>. This covers liaising with a client’s marketing and technical teams before they come onboard and managing the data aspects of the platform build during the client onboarding process. I speak both ‘client’ and ‘technical’ and act as a bridge translating top line goals and objectives into actionable technical requirements.</p> <h3>Whereabouts do you sit within the organisation? 
Who do you report to?</h3> <p>The Presales department sits within the Technical Services team; however, I have almost as much client contact as our Account Management team. One day I could be suited and booted in a client office, the next I could be buried in SQL, wiping mayonnaise off my Doctor Who T-Shirt!</p> <p>I report to the Head of Presales, who has been with the company for over 10 years, and together we also work on improving and transforming our internal data processes. With the fast-paced nature of our industry, it’s a continual process.</p> <h3>What kind of skills do you need to be effective in your role?</h3> <p>You obviously need to know your stuff technically, but you also need to know how to talk to people and sometimes translate technical concepts into language more easily understood by non-technical people. From a technical point of view, you need to know your database technologies, methods of transferring, storing and manipulating large data sets, as well as the regulatory and legislative compliance of data protection and security.</p> <p><img src="https://assets.econsultancy.com/images/0008/8123/stevemcgrath.jpg" alt="steve mcgrath" width="400" height="400"></p> <h3>Tell us about a typical working day… </h3> <p>If I am at a client meeting, I am mostly discovering client requirements and examining the legacy marketing technology we need to integrate with. The recurring topic of conversation at most meetings these days is the looming <a href="https://econsultancy.com/blog/67540-what-is-the-eu-general-data-protection-regulation-gdpr-why-should-you-care/">GDPR regulation</a> and how our customer data platform can help with compliance. 
I also sit on the Direct Marketing Association (DMA) North Council and we are currently having lots of discussion about the GDPR and how it will affect not only our clients’ marketing departments, but also their Information Governance in general.</p> <p>If I am in the office, I am normally creating data flows, technical specifications and functional requirements to ensure our tech teams can build and configure our platform to match the client requirements.</p> <h3>What do you love about your job? What sucks?</h3> <p>I love the company and the people. I have worked in many large and small organisations and I have never come across a company that not only says it cares about its people, but demonstrates it daily. You can see that in the quality of people we hire, and the work they produce. As a technology company, we would be nothing without the people and you can tell the senior leadership team recognises that. I also love the ping pong and pool tables and probably spend a bit too much time in the social area!</p> <p>What sucks? As unbelievable as it sounds, I honestly can’t think of anything. As in any job you can have days where you get frustrated, sometimes with clients, sometimes with colleagues, but they are just small challenges in an otherwise rewarding role.</p> <h3>What kind of goals do you have? What are the most useful metrics and KPIs for measuring success? </h3> <p>We have a good set of metrics for how our onboarding process is working, including a handover/review process to our account managers, who I consider my customers as well as the clients. 
Things like 'time taken to onboard', meeting client go-live dates, and the lack of remedial tech work needed after go-live all form part of a continuous improvement process that has changed a lot in the time I have been here.</p> <p>We are also encouraged to set personal goals as part of our career development and, as a part-time wedding photographer, I also have goals relating to this on my career development plan!</p> <h3>What are your favourite tools to help you get the job done?</h3> <p>I use SQL Developer and Notepad++ a lot. We use Sharepoint and OneNote for project collaboration and Slack for internal comms. I can have anywhere from 4-20 Excel spreadsheets open at any one time and I cannot imagine how I ever used to work on just one monitor.</p> <h3>How did you get into analytics/automation, and where might you go from here?</h3> <p>I started my career in sales and slowly moved back-office into IT roles until I ended up in marketing about 15 years ago. I have previously worked agency side with creative and digital agencies. RedEye has such a variety of departments and I am lucky that my skill set is quite versatile, so what I want to do is follow in the footsteps of my boss and experience different areas of the business.</p> <p>I am also passionate about data protection, data security and the GDPR and there are lots of areas to explore around this. I have a feeling it will be a busy year!</p> <h3>Which brands and their marketing automation have you been impressed by recently?</h3> <p>I bought some shirts recently from Charles Tyrwhitt and was very impressed with their welcome programme. I receive highly personalised email and postal comms that are presented in such a way as to make you feel very much part of the Tyrwhitt ‘tribe’.</p> <p>I am also constantly amused and impressed with Virgin Trains – particularly <a href="https://twitter.com/VirginTrains">their Twitter feed</a> where the operators’ personalities always shine through. 
They are meme masters.</p> <h3>Do you have any advice for people who want to work in this area?</h3> <p>Always be passionate. If you don’t have the passion, find a job that you do have passion for. Always be learning – never feel you know everything. Always be talking – talk to everyone you meet, from all walks of life.</p> <p>And BE NICE. It always, always comes back to you.</p> tag:econsultancy.com,2008:BlogPost/69310 2017-08-04T11:57:14+01:00 2017-08-04T11:57:14+01:00 What is digital transformation? [video] Ashley Friedlein <p>So here we are chatting about what digital transformation is, including challenges around people, culture, talent, leadership and process. We discuss ways you can measure levels of digital transformation and get into business model disruption, and trends including <a href="https://www.econsultancy.com/reports/trend-briefings-artificial-intelligence-ai">artificial intelligence</a>, <a href="https://econsultancy.com/blog/68770-an-introduction-to-ai-and-customer-service/">conversational interfaces</a>, chatbots, driverless cars and more. </p> <p>So grab your own brew of choice, watch our chat, and feel free to ask questions or make your own comments below and we'll continue the discussion there.</p> <p><iframe src="https://www.youtube.com/embed/_LbtWZ0-LVE?wmode=transparent" width="425" height="350"></iframe></p> <p><strong><em>If your business needs help with Digital Transformation, <a href="https://econsultancy.com/training/digital-transformation/">get in touch with Econsultancy</a>. 
</em></strong></p> tag:econsultancy.com,2008:BlogPost/69308 2017-08-04T01:00:00+01:00 2017-08-04T01:00:00+01:00 Analytics approaches every marketer should know #3: Predictive analytics Jeff Rajeck <p>To this end, we will now cover the practice of predictive analytics and show how it is not necessarily about predicting the future, but rather a way to figure out what is happening right now and how marketers can use that information to their advantage.</p> <p>Before we begin, though, we'd like to let you know that Econsultancy is running an Advanced Mastering Analytics session in Singapore on Tuesday, August 15th. <a href="https://econsultancy.com/training/courses/advanced-mastering-analytics-training-singapore/dates/3232/">Click here to find out more details and book your spot.</a></p> <h3>What is analytics?</h3> <p>We have <a href="https://econsultancy.com/blog/69298-analytics-approaches-every-marketer-should-know-1-descriptive-analytics/">previously defined analytics</a> as a practice, a process, and a discipline whose purpose is to turn data into actionable insight.</p> <p>With predictive analytics, however, the focus is more on the insight than the action. </p> <h3>Predictive analytics overview</h3> <p>With descriptive and diagnostic analytics, we are able to describe data and offer explanations for why certain events happened. Notably, both techniques use data from things which happened in the past. The data itself, therefore, is never in question, even if the diagnoses are controversial.</p> <p>With predictive analytics, we are still relying on data from past events, but <strong>instead of using the data to describe or explain the past, predictive analytics uses data to get more data. </strong></p> <p>So why are we using existing data to get more data? 
Two reasons:</p> <ol> <li>The new data is either too difficult to get or not yet available.</li> <li>The new data will help us to make better decisions.</li> </ol> <p>Note that, contrary to popular perception, the data we get from predictive analytics will not necessarily be used to predict the future. Instead, <strong>predictive analytics is mostly used to predict what a data point would be if we knew what it was.</strong></p> <p>This confusing yet crucial point is probably best explained with an example.</p> <h3>An example of predictive analytics</h3> <p>One good example of predictive analytics which is relevant for marketers is sentiment analysis (inspired by a <a href="https://community.lithium.com/t5/Science-of-Social-Blog/Big-Data-Reduction-2-Understanding-Predictive-Analytics/ba-p/79616">post</a> by Dr. Michael Wu, Lithium's chief scientist).</p> <p>Say you need to find out whether comments on social media, overall, are positive or negative about a new product line. You could, in theory, gather all of the comments, read them individually, and keep count of how many were positive and how many were negative.</p> <p>Or, instead, you could run the comments through a sentiment analysis algorithm which 'scores' each comment according to how positive or negative it is. Then, using the average score, you would have your answer. Greater than zero is net positive; less than zero, negative.</p> <p>But how does a sentiment analysis engine work? How does it know what is positive or negative? The algorithm can perform this task because <strong>it 'learns' the difference between positive and negative comments through predictive analytics. </strong></p> <p>Using sample text, marked as 'positive' or 'negative', the sentiment analysis algorithm learns which word combinations are likely to be positive and which negative. 
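A toy version of this learning step might look like the following. The training set and the word-weight scoring rule are invented for illustration and are far simpler than any production sentiment engine, but they show the same idea of learning rules from marked data:

```python
# Toy sentiment scorer: word weights learned from comments hand-marked
# as 'positive' or 'negative'. Data and scoring rule are invented.
from collections import Counter

def train(marked_comments):
    """Learn a weight per word: +1 per positive use, -1 per negative use."""
    weights = Counter()
    for text, label in marked_comments:
        for word in text.lower().split():
            weights[word] += 1 if label == "positive" else -1
    return weights

def score(weights, text):
    """Sum the learned weights; above zero reads positive, below negative."""
    return sum(weights[word] for word in text.lower().split())

training = [
    ("love this product", "positive"),
    ("great value really love it", "positive"),
    ("terrible quality hate it", "negative"),
    ("hate the new design", "negative"),
]
weights = train(training)

# 'Predict' the overall sentiment of new, unmarked comments
comments = ["really love this great product", "terrible quality"]
avg = sum(score(weights, c) for c in comments) / len(comments)
print("net positive" if avg > 0 else "net negative")  # net positive
```

Note that unseen words simply score zero here (a `Counter` returns 0 for missing keys), which is one reason real engines need far larger training sets.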
After sufficient training, the algorithm then has rules to help it decide the tone of the passage.</p> <p>So when a new passage, which is not marked as 'positive' or 'negative', is presented to the algorithm, it uses the rules it learned previously to indicate whether it is positive or negative. The algorithm, therefore, takes existing data (the comments) to create new, more useful data (the overall sentiment).</p> <p>So, with a sentiment analysis algorithm, marketers can perform predictive analytics. They can 'predict' what the overall sentiment would be if they read and scored all of the messages individually.</p> <p><img src="https://assets.econsultancy.com/images/0008/8022/p4.png" alt="" width="594" height="315"></p> <h3>The distinguishing features of predictive analytics</h3> <h4>Produces utility data</h4> <p>One of the most apparent differences between predictive analytics and descriptive analytics is that its output is data to be used, not just read. From the example above, the sentiment analysis score for each individual comment is not particularly useful; it has to be averaged and interpreted.</p> <h4>Requires an algorithm</h4> <p>Additionally, unlike diagnostic analytics, you will probably write your own algorithm to do the prediction.  
</p> <p>To understand why this is the case, have a look at some of the data sets predictive analytics is used to obtain: </p> <ul> <li>Social media influencer scores.</li> <li>Whether a customer is 'in-market' or has a particular interest.</li> <li>Where a customer is in the purchase funnel.</li> <li>The experience consumers 'must have' before they buy.</li> <li>The likelihood of a customer cancelling your service.</li> <li>A 'lead score', often used by business-to-business (B2B) marketers.</li> </ul> <p>Each of these requires a significant amount of data to be effective, and if the new data is to be consistent and reliable, an algorithm is required to process the data uniformly.</p> <h4>Needs training data</h4> <p>Also, in order for the new data sets to be accurate, <strong>predictive analytics requires actual data for training. </strong>Training data must also be 'marked' with the outcome so that the algorithm can be calibrated. In the example above, all of the comments used to train the sentiment algorithm had to be 'marked' as positive or negative. </p> <p>Note that creating an algorithm doesn't require fancy machine learning or artificial intelligence (AI). 
Many companies derive their B2B lead scoring algorithm through a collaboration between marketing and sales.</p> <p><img src="https://assets.econsultancy.com/images/0008/8020/p3.png" alt="" width="828" height="388"></p> <h4>Is not exact</h4> <p>Finally, unlike descriptive analytics,<strong> predictive analytics only offers results which are possibly true.</strong> As with diagnostic analytics, the analyst has to take a stand with the predictions and will typically need some evidence to support the algorithm's results.</p> <h3>How to do predictive analytics</h3> <p>Now that you perhaps have a better idea of what predictive analytics is, how do you actually do it?</p> <h4>1) Think of data that you want, but don't have</h4> <p>The first step is to reflect on your current marketing programme and<strong> think of something that you'd like to know, but currently do not.</strong></p> <p>For example, if you are trying to boost your ecommerce sales, what do people who buy things do before buying? Do they visit the site multiple times, watch product videos, or linger on the site?</p> <p>If you knew the answer to that question, you could focus your marketing efforts on getting people to have that 'must-have' experience as often and as quickly as possible.</p> <p><img src="https://assets.econsultancy.com/images/0008/8023/p5.png" alt="" width="887" height="376"></p> <h4>2) Build a training set</h4> <p>Every algorithm needs to be trained with real data. To get training data, you first need to distinguish source data which has the correct attributes from data which does not. Then you need to mark each case with the result.</p> <p>In the case of the sentiment analysis algorithm, someone must determine which words and phrases were negative and which were positive and then mark them as such for the algorithm to learn the difference.</p> <h4>3) Write the algorithm</h4> <p>While this sounds complicated and difficult, it need not be. 
<strong>An algorithm is simply a list of instructions to follow in order to transform one data set into another.</strong></p> <p>So to start off, you can simply look at your data and identify common features between the data sets that achieve your goal. Then your algorithm could be that people who do 'X', represented by the data set, also tend to do 'Y', the desirable goal.</p> <p>Additionally, as mentioned previously, the process does not have to be automated. You could simply look for behaviours (e.g. pages viewed) in Google Analytics and see whether that behaviour frequently led to your goal (e.g. a purchase).</p> <h4>4) Test performance</h4> <p>To test performance you need to find additional 'marked' data and see how well the algorithm's output corresponds with the marks.</p> <p><strong>Testing data should not be the same as the training data.</strong> This is because you may devise an algorithm which is optimised only for the training data, but performs poorly on any other data.</p> <h4>5) Review, improve, repeat</h4> <p>Once exposed to real data, the performance of the algorithm will probably be underwhelming. But with some testing and additional data analysis (beyond the scope of this post, but <a href="https://www.analyticsvidhya.com/blog/2015/12/improve-machine-learning-results/">here is a good introduction</a>), it is likely that you can improve it over time.</p> <p>Nothing tests an algorithm better than putting it to real use, though. 
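Steps 3 and 4 can be sketched in just a few lines: a one-rule 'people who do X tend to do Y' algorithm, scored against held-out marked data. The session records and the rule itself are invented for illustration:

```python
# Sketch of steps 3 and 4: a one-rule 'algorithm' (people who watch a
# product video tend to buy) evaluated on held-out marked data.
# All session records are invented.

def watched_video_rule(session):
    """Predict a purchase if the visitor watched a product video."""
    return session["watched_video"]

def accuracy(rule, marked_sessions):
    """Fraction of sessions where the prediction matches the mark."""
    hits = sum(rule(s) == s["purchased"] for s in marked_sessions)
    return hits / len(marked_sessions)

training = [
    {"watched_video": True,  "purchased": True},
    {"watched_video": True,  "purchased": True},
    {"watched_video": False, "purchased": False},
    {"watched_video": False, "purchased": True},
]
testing = [  # deliberately not the same records as the training data
    {"watched_video": True,  "purchased": True},
    {"watched_video": False, "purchased": False},
    {"watched_video": True,  "purchased": False},
    {"watched_video": False, "purchased": False},
]

print(f"training accuracy: {accuracy(watched_video_rule, training):.0%}")
print(f"testing accuracy: {accuracy(watched_video_rule, testing):.0%}")
```

Comparing the two accuracy figures is what reveals overfitting: a rule that scores well on training data but poorly on the held-out set has learned the data, not the behaviour.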
Results like customer churn can be tested and improved with result data alone, but less concrete results, like interest segments or lead scores, may require collaboration with other departments.</p> <p>Regardless, <strong>implementing a predictive algorithm is an iterative process</strong> and the more it is reviewed, the more likely it is to become useful.</p> <h3>Predictive analytics best practices</h3> <h4>Start simple</h4> <p>As with all analytics, it's better to start with predictive analytics which work in a small way than to try something ambitious which fails.</p> <p>So for the first few attempts, <strong>use an outcome that is absolutely true (e.g. did buy/didn't buy) and look for one or two explanatory variables.</strong></p> <p>The 'must-have experience' mentioned in step 1 is a good example of a simple predictive algorithm. You are simply looking for a single common experience customers have before buying something.</p> <h4>Aim for high-quality data before deploying</h4> <p>While testing is difficult and can be discouraging, your predictive output should be of a reasonable quality before launching the algorithm. While there is no definite rule for how accurate your model should be, your algorithm should offer enough predictive power that it makes a visible impact on business performance.</p> <h4>Not everything will work</h4> <p>Even the best ideas for predictive analytics often do not work. Behaviour which logically seems to lead to your goal may only do so a small percentage of the time.  
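A 'start simple' check in this spirit can be as small as splitting purchase rates on one candidate behaviour; the session records below are invented, and here the split deliberately shows a behaviour that turns out not to be predictive:

```python
# Sketch of a 'start simple' check: one true/false outcome (bought or not)
# split on one explanatory variable. Session records are invented.

def purchase_rate(sessions, saw_video):
    """Purchase rate among sessions with the given experience flag."""
    subset = [s for s in sessions if s["saw_video"] == saw_video]
    return sum(s["bought"] for s in subset) / len(subset)

sessions = [
    {"saw_video": True,  "bought": True},
    {"saw_video": True,  "bought": False},
    {"saw_video": False, "bought": True},
    {"saw_video": False, "bought": False},
]

# Identical rates either way: this behaviour looks unrelated to the goal,
# which is itself useful to know.
print(purchase_rate(sessions, True), purchase_rate(sessions, False))  # 0.5 0.5
```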
</p> <p>On the bright side, proving that a data set is unrelated to your goal is still useful information – and the steps you take to find out that an algorithm doesn't work are a good start to finding one which is indeed predictive.</p> <h3>So...</h3> <p>Although it is intoxicating to think we can predict the future with data, the reality is that we can, at best, only really be sure about what is happening right now.</p> <p>Fortunately, <strong>marketers can still derive useful information by discovering connections between an existing data set and a desirable goal.</strong> Marketers can then encourage the original behaviour in an attempt to engineer the goal.</p> <p>In this way, even though predictive analytics is not a crystal ball, it remains a worthwhile practice which can deliver real business value and, with some effort, return on investment.</p> tag:econsultancy.com,2008:BlogPost/69300 2017-08-02T01:00:00+01:00 2017-08-02T01:00:00+01:00 Analytics approaches every marketer should know #2: Diagnostic analytics Jeff Rajeck <p>Before we start, though, we'd like to let you know that Econsultancy is running an Advanced Mastering Analytics session in Singapore on Tuesday, August 15th. <a href="https://econsultancy.com/training/courses/advanced-mastering-analytics-training-singapore/dates/3232/">Click here to see more details and book your spot</a>.</p> <h3>What is analytics?</h3> <p>In the <a href="https://econsultancy.com/blog/69298-analytics-approaches-every-marketer-should-know-1-descriptive-analytics/">previous post</a>, we defined analytics in detail, but essentially analytics is a practice, a process, and a discipline, the purpose of which is to turn data into actionable insight.</p> <h3>Diagnostic analytics overview</h3> <p>Previously, we discussed how descriptive analytics will tell you <em>what</em> just happened. To understand <em>why</em>, however, you need to do some more work. 
You need to perform diagnostic analytics.</p> <p>In many cases, when there is a single 'root cause' of the situation, diagnostic analytics can be quick and simple - you just need to find that root cause.</p> <p>But, if no root cause is apparent, then you need to use diagnostic techniques to discover a causal relationship between two or more data sets.</p> <p>The analyst also needs to make it clear what data is relevant to the analysis so that the relationship between the two data sets is clear.</p> <h3>An example of diagnostic analytics</h3> <p>In a descriptive report, you note that website revenue is down 8% from the same quarter last year. In an attempt to get ahead of your boss's questions, you conduct diagnostic analytics to find out why.</p> <p>First, you look for a root cause.  Perhaps there was a change in ad spend, a rise in cart abandonments, or even a change in Google's algorithm which has affected your web traffic.</p> <p>Finding nothing, you then look at the data sets which contribute to revenue: impressions, clicks, conversions, and new customer sign-ups.</p> <p>You discover from the data that changes in revenue closely track changes in new customer sign-ups, and so you isolate these two data series in a graph showing the relationship. This then leaves you, or one of your colleagues, to conduct diagnostic analysis on user registrations to find out why they are down.</p> <p><img src="https://assets.econsultancy.com/images/0008/7922/3.png" alt="" width="1077" height="562"></p> <h3>The distinguishing features of diagnostic analytics</h3> <p>Like descriptive analytics, diagnostics requires past 'owned' data but, unlike descriptive analytics, diagnostic analytics will often include outside information if it helps determine what happened.</p> <p>From the example above, it's clear that domain knowledge is also more important with diagnostic analytics. 
External information from a wide range of sources should be considered in root cause analysis.</p> <p>And, when comparing data sets looking for a relationship, statistical analysis may be required for a diagnosis, specifically regression analysis (see point 2 below).</p> <p>Finally, with diagnostic analytics you are trying to tell a story which isn't apparent in the data, and so the analyst needs to go 'out on a limb' and offer an opinion.</p> <h3>How to do diagnostic analytics</h3> <h3>1) Identify something worth investigating</h3> <p>The first step in doing diagnostic analytics is to find something that is worth investigating. Typically this is something bad, like a fall in revenue or clicks, but it could also be an unexpected performance boost.</p> <p>Regardless, the change you're looking to diagnose should be rare, as analysing volatile data is a pointless exercise.</p> <h3>2) Do the analysis</h3> <p>As shown in the example above, diagnostic analytics may be as straightforward as finding a single root cause - i.e. revenue dropped last month because new customer sign-ups were down.</p> <p>More complex analyses, however, may require multiple data sets and the search for a correlation using regression analysis. 
How to carry out regression analysis is beyond the scope of this post but there are <a href="http://datapigtechnologies.com/blog/index.php/a-visual-explanation-of-linear-regression-in-excel/">many excellent tutorials</a> available to help you with it.</p> <p>What you are trying to accomplish in this step is to find a statistically valid relationship between two data sets, where the rise (or fall) in one causes a rise (or fall) in another.</p> <p>More advanced techniques in this area include data mining and principal component analysis, but straightforward regression analysis is a great place to get started.</p> <h3>3) Selectively filter your diagnoses</h3> <p>While it may be interesting that a variety of factors contributed to a change in performance, it's not helpful to list every possible cause in a report.</p> <p>Instead, an analyst should aim to discover the single, or at most two, most influential factor(s) in the issue being diagnosed.</p> <h3>4) State your conclusion clearly</h3> <p>Finally, a diagnostic report must come to a conclusion and make a very clear case for it.</p> <p>It does not have to include all of the background work, but you should:</p> <ul> <li>identify the issue you're diagnosing,</li> <li>state why you think it happened, and</li> <li>provide your supporting evidence.</li> </ul> <p> <img src="https://assets.econsultancy.com/images/0008/7920/1.png" alt="" width="599" height="392"></p> <h3>Diagnostic analytics best practices</h3> <p>Here are a few more things to keep in mind when doing diagnostic analytics.</p> <h3>Correlation does not prove causation</h3> <p>Correlation will tell you when two variables (say clicks and conversions) move in sync with one another.  
</p> <p>While it's tempting to draw conclusions from that fact, the correlation must also make sense before it can be considered as causal evidence.</p> <p>For some dramatic illustrations of why this is the case, please refer to <a href="http://www.tylervigen.com/spurious-correlations">this excellent collection of spurious (meaningless) correlations</a>.</p> <p><img src="https://assets.econsultancy.com/images/0008/7921/2.png" alt="" width="800" height="388"></p> <h3>Be wary of using multiple explanatory variables</h3> <p>When doing regression analysis, it is possible to improve your 'correlation score' (R-squared) by adding additional variables.</p> <p>Doing so should, however, be avoided as you are both confounding your analysis (remember: keep it to two factors at most) and <a href="https://en.wikipedia.org/wiki/Overfitting">'overfitting' your model</a>. That is, you are no longer using reason to find an answer, but instead just throwing data at the problem and seeing what works.</p> <h3>But don't be drawn to easy answers, either</h3> <p>When you are thinking of root causes or of possible correlations, think broadly of everything that could have affected the outcome.</p> <p>Typically marketers are seeking to explain campaign performance or web traffic, and the contributing factors are endless.  </p> <p>The number of paid impressions, changes in advertising creative, and audience targeting are all obvious places to check, but also consider things like the time-of-year, competitive offers, and platform algorithm changes.</p> <h3>So...</h3> <p>Currently, analytics seems to be largely focused on describing data through reports. The potential for the practice, however, is far greater than displaying data and letting the audience make conclusions.</p> <p>Analysts can do better, though. They can provide further insights into the data by using diagnostic analytics to try and explain why certain things happen.</p> <p>Ideally, marketing reports should contain both. 
Descriptive charts and graphs keep people informed about the systems and results which concern them, and separate diagnostic reports aim to explain a significant phenomenon such as a decline in new business or a change in web browsing behaviour.</p> <p>Not only will this help the reader to understand why some decisions have been made, but it also provides evidence that the report writer understands the data and the point of collecting it. That is, we collect data so that we can make better-informed decisions through analytics.</p> tag:econsultancy.com,2008:BlogPost/69298 2017-07-31T01:00:00+01:00 2017-07-31T01:00:00+01:00 Analytics approaches every marketer should know #1: Descriptive analytics Jeff Rajeck <p>When it is broken down into its various practices, however, analytics is much more approachable and certainly something just about anyone could handle.</p> <p>In this series, we are going to cover four of the most-used analytics approaches and provide details on what distinguishes them, in what circumstances they should be used, and how marketers can use each of them more effectively. </p> <p>Once marketers can distinguish different types of analytics and know how best to use them, they will hopefully gain confidence that they truly understand 'analytics'. To start off, we'll look at an overview of descriptive analytics.</p> <p>Before we begin, though, we'd like to let you know that Econsultancy is running an Advanced Mastering Analytics session in Singapore on Tuesday, August 15th. 
<a href="https://econsultancy.com/training/courses/advanced-mastering-analytics-training-singapore/dates/3232/">Click here to see more details and book your spot</a>.</p> <h3>So what is 'analytics'?</h3> <p>Before we describe a type of analytics, it's best to define exactly what we mean by the term.</p> <p>First off, analytics is the practice of converting existing data and information into new data and information which can support decision making. Analytics turns data into actionable insight.</p> <p>That is, when you have 'done analytics' you should have easier-to-read data than you had previously, and it should help people make better decisions.</p> <p>Also, analytics is a process which involves a number of steps, including: </p> <ul> <li>acquiring data,</li> <li>applying domain knowledge,</li> <li>performing mathematical functions on the data,</li> <li>using statistics where appropriate, and</li> <li>reporting results in an easy-to-understand format </li> </ul> <p>Finally, analytics is a discipline which crosses IT, business intelligence and marketing, as well as executive decision makers.  So learning the best practices for how to process and present data is a useful skill for just about anyone.</p> <h3>Descriptive analytics overview</h3> <p>In this post, we're starting with one type of analytics which is probably the simplest of all the types, descriptive analytics.</p> <p>Descriptive analytics exists to highlight the features and characteristics of a data set by using a summary. It is typically used to convert a large amount of data into a small amount of information which is easier to understand.</p> <h4>An example</h4> <p>For example, a business which sells cars may have a long list of all of the cars it has sold in a year. 
That list is too hard for people to use for decision making, so an analyst would summarise the data using descriptive analytics.</p> <p>The resulting report may include the number of cars sold each month, an average of how many cars were sold per day, or simply the total number of cars sold in the year.</p> <p>All of these figures describe the data in simpler terms than the list as a whole. Because it is easier to digest a summary than a list, the descriptive report will be more suitable for those trying to understand what happened and decide what to do in the future.</p> <p><img src="https://assets.econsultancy.com/images/0008/7908/6.png" alt="" width="800" height="486"></p> <p><em>Dashboards are a feature of analytics software</em></p> <h3>The distinguishing features of descriptive analytics</h3> <p>So with the above definition and example in mind, what makes analytics 'descriptive' as opposed to something else?</p> <p>For a start, descriptive analytics only uses facts and real data. It should not include assumptions or derived data which cloud the description. For example, the report described above should not include estimates, and any missing data should be clearly noted.</p> <p>Descriptive analytics is also only about the past. Future estimates and predictions belong to another type of analytics with different best practices.</p> <p>And finally, calculations made for a descriptive analytics report should be marked clearly. Analysts should indicate whether a data point is a sum, an average, or an aggregation. 
Probability and statistics, like predictions, belong to another sort of analytics and should be omitted.</p> <h3>When to use descriptive analytics</h3> <p>Descriptive analytics can be presented either as a real-time dashboard or as a report, depending on the urgency of the data.</p> <p>KPI reports are a particularly popular example of descriptive analytics as they include real numbers from the past which require little or no further calculation to make sense.</p> <h3>How to do descriptive analytics</h3> <h4>1) Start by collecting relevant data</h4> <p>For marketers, there is typically a short list of metrics which are relevant to other departments: </p> <ul> <li>Ad clicks</li> <li>Web page views</li> <li>Conversions</li> <li>Revenue </li> </ul> <p>So to get started, these figures need to be collected into a single database or spreadsheet so that they are ready to analyse.</p> <h4>2) Do the analysis</h4> <p>When analysing data, decide first what people really need to know. Do they want to see trends over time, or just to know whether targets have been hit?</p> <p>If in doubt, leave it out and see whether anyone asks for it.</p> <p>Also, be conscious that what you are doing is descriptive analysis, and stick to the key principles listed above.</p> <h4>3) Present data clearly</h4> <p>Unlike other types of analytics, descriptive analytics leaves the interpretation of the data to the reader.</p> <p>The analyst can influence perceptions of the data through scaling, but this is discouraged.</p> <p><img src="https://assets.econsultancy.com/images/0008/7903/1.png" alt="" width="800" height="322"></p> <h4>4) Aim for consistency</h4> <p>Having the same report every week helps decision makers compare results over long time frames, so <strong>descriptive analytics reports should be regular and consistent.</strong></p> <p>Dashboards should also be consistent, and changes should be versioned so that it is simple to revert to a previous version.</p> <h3>Best practices for descriptive 
analytics</h3> <p>While there are many <a href="https://www.amazon.com/Visual-Display-Quantitative-Information/dp/1930824130">sizable tomes</a> and lengthy blog posts about how best to present descriptive data, there are a few general principles which are fairly straightforward.</p> <h4>1) If you can use a single number instead of a chart, do so</h4> <p>Too often, descriptive reports are filled with distracting charts which would have been better delivered as a single number.</p> <p>For example, web traffic is typically reported as a line chart, with each day's figure creating a data point. What might be easier to understand, though, is the week's daily average and how it has moved since last week.</p> <p><img src="https://assets.econsultancy.com/images/0008/7907/5.png" alt="" width="800" height="384"></p> <h4>2) Only include what is necessary</h4> <p>Additionally, descriptive analytics should concentrate on the figures and charts necessary for the decisions being made. Extra data and superfluous charts go against the purpose of analytics!</p> <h4>3) Know your charts and when to use them</h4> <p>If you are going to use charts, be sure to use them correctly. A few best practices include: </p> <ul> <li>Only use line graphs when the items on the x-axis are continuous, e.g. a period of time.</li> <li>Use horizontal bars when the categories have lengthy descriptions.</li> <li>Use pie charts only when there are fewer than four major 'slices', and be sure to order the slices by size. 
</li> </ul> <p>Refer to <a href="https://blog.hubspot.com/marketing/data-visualization-choosing-chart">one</a> of the <a href="https://eazybi.com/blog/data_visualization_and_chart_types/">many posts</a> on the <a href="http://lifehacker.com/5909501/how-to-choose-the-best-chart-for-your-data">subject </a>for more info on this topic.</p> <p><img src="https://assets.econsultancy.com/images/0008/7909/x.png" alt="" width="696" height="518"></p> <h4>4) Remove chart junk</h4> <p>Finally, remove many of the chart elements which Excel includes as standard. Y-axis numbers, data points, and grid lines can usually be omitted.</p> <p>The result is a clear presentation of data which still gets your point across without confusing your reader.</p> <p><img src="https://assets.econsultancy.com/images/0008/7904/2.png" alt="" width="800" height="326"></p> <h3>So...</h3> <p>For those new to analytics, descriptive analytics is the best way to start as very little data manipulation is required to deliver high-quality, useful reports.</p> <p>There are some best practices to follow, but overall descriptive reports and dashboards should deliver what the consumer requires, and nothing more.</p> <p>Keeping things simple makes life easier for the decision maker, and for you, the analyst, as well!</p>
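<p>As a footnote for the hands-on reader: the summaries from the car-sales example earlier (total sold, cars sold per month, daily average) take only a few lines of Python. This is a minimal sketch, and the sales dates below are invented purely for illustration; in practice you would load your own sales log from a database or spreadsheet.</p>

```python
from collections import Counter
from datetime import date

# Hypothetical sales log: one entry per car sold (the "long list" from the example).
sales = [
    date(2017, 1, 5), date(2017, 1, 19), date(2017, 2, 2),
    date(2017, 2, 14), date(2017, 2, 27), date(2017, 3, 8),
]

# Total cars sold -- a single number instead of a chart.
total_sold = len(sales)

# Cars sold each month, e.g. per_month["2017-02"] is the February count.
per_month = Counter(d.strftime("%Y-%m") for d in sales)

# Average cars sold per day over the period the data covers.
days_covered = (max(sales) - min(sales)).days + 1
daily_average = total_sold / days_covered

print(total_sold)                # 6
print(per_month["2017-02"])      # 3
print(round(daily_average, 2))
```

<p>Note that, in keeping with the principles above, everything here is a plain sum, count or average of real data points, clearly labelled, with no estimates or predictions mixed in.</p>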