Data-Driven Transformation: Scaling Insights for Business Impact

No one doubts the value of data-driven insights. Companies with a data-centric culture are 58% more likely to achieve revenue goals than non-data-driven companies. They are 23 times more likely to top their competitors in customer acquisition and 19 times more likely to stay profitable. Moreover, the tech and tools that allow businesses to leverage their data are already widespread.

 

Who exactly is data-driven?

Yet the number of companies that have scaled their data strategies and put data at the centre of their decision-making is still relatively small, possibly as low as 23%. At that figure, you have to wonder whether predictions such as Gartner’s, that 65% of B2B sales organisations will have transitioned from intuition-based to data-driven selling by 2026, will come true.

So, what’s going on? Why are businesses failing to scale their data strategies and become data-driven, as you might expect?

 

Want to become more data-driven? Download our ebook today to find out how

 

The question of scale

In our experience, data strategies are often confined to a particular area of a business. Insights are typically derived from local data rather than enterprise-wide systems and third-party sources, resulting in missed opportunities.

Companies struggle to expand their data insights beyond a few systems. Without a clear strategy, they are unsure how to progress and effectively integrate data from different parts of the company and beyond to gain a better understanding.

Moreover, because the data-driven aspect of their operation stands alone, decision-making at an organisational level becomes inconsistent, and disagreements arise as personnel use a mix of intuition and data to shape their viewpoints.

 

What are the challenges in scaling data insights?

Here are seven common challenges when it comes to scaling your data insights across the business:

  • Data silos – Data is fragmented across departments, residing in siloes, making it difficult to unify and extract comprehensive insights business-wide.
  • Data quality – Duplicated, flawed and inaccurate data is often found across an organisation, and integrating data from multiple sources in multiple formats can be complex.
  • Infrastructure scalability – The underlying IT infrastructure may not scale to support growing data storage and processing needs.
  • Data integration – Integrating multiple data sources to establish a single source of truth (SSOT) for analysis is complicated.
  • Data governance and compliance – Data regulations are evolving, with strict rules for using, storing and protecting data. Data security, privacy and regulatory compliance become more complex as data volumes increase.
  • Talent shortages – Finding the talent to drive a data strategy can be difficult amid current labour shortages; it requires expertise in managing and optimising large-scale data systems.
  • Cultural resistance – People are often resistant to change, and altering decision-making routines is a prime example. There may also be a lack of data literacy and scepticism about the value of data insights.

 

How to navigate the challenges

The best way to navigate the challenges of scaling your data insights is by developing a clear strategy. It’ll help you address the issues more cost-effectively and promptly. All of the following should feature in your plan:

  • Upgrade your infrastructure – invest in solutions that process and store large amounts of data.
  • Data management – consolidate data from multiple sources for analysis in a centralised data warehouse or lake to establish consistency and a SSOT.
  • Enable automation – significantly reduce the labour involved in manual data tasks and the risk of errors, helping ensure data accuracy and reliability. Microsoft Power Automate, which automates workflows and business processes, is a good example of the technology in action.
  • Use row-level security – ensure that access to data in your data warehouse or lake is governed by row-level security so people see only the relevant data.
  • Invest in training – give employees the support they need to embrace data-driven insights. Improving data literacy will foster data democratisation across the organisation.
  • Foster a data-driven culture – develop an environment where the enterprise values and understands data. A previous blog will give you guidance on creating such a situation.
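
To make the row-level security idea concrete: platforms such as Power BI and SQL Server implement it declaratively, but the underlying principle is just a per-user filter. A minimal Python sketch (the dataset, users and policy are all hypothetical):

```python
# Row-level security sketch: each user sees only the rows their
# assigned region permits. Data, users and policy are hypothetical.

SALES = [
    {"region": "UK", "revenue": 1200},
    {"region": "DE", "revenue": 900},
    {"region": "UK", "revenue": 450},
]

USER_REGIONS = {"alice": "UK", "bob": "DE"}  # hypothetical access policy

def rows_for(user):
    """Return only the rows the user's row-level policy allows."""
    region = USER_REGIONS[user]
    return [row for row in SALES if row["region"] == region]
```

Warehouse platforms apply this kind of filter at query time, so every report and dashboard built on top inherits it automatically.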

Continuous improvement should also feature in your plan. You want to review your progress and consider where enhancements can be made. You must refine your plan regularly to ensure your data strategy stays aligned with your business objectives.

 

Your step-by-step process to begin your scaling journey

Here’s a more detailed approach to start scaling your data insights:

1. Define a strategy with clear goals, objectives and a timeline – Make sure your data strategy aligns with your business goals. You want to prioritise your data projects and investments, ensuring resources go where they will deliver the most impact. Also, you can break down your strategy into manageable phases with a timeline.

2. Assess current capabilities, tools and tech stack in place – Conduct a comprehensive assessment of your data infrastructure by doing the following:

    • Evaluate existing data assets across the organisation
    • Determine the quality, consistency, and accessibility of current data
    • Assess your infrastructure for data storage, processing and integration
    • Consider the infrastructure’s ability to scale over time as your needs grow

3. Centralise data management – Collect, store and manage your organisational data in a single, unified location. Providers such as Microsoft offer centralised management solutions to support your data strategy end-to-end. For instance, Microsoft Fabric is an integrated platform offering a comprehensive data management and analytics solution.

4. Bridge the gap by investing in tech and tools – As the last point implies, scaling up your data insights will require investment. That’s not to say existing investments will have been wasted, because chances are they can be integrated with a larger, centralised platform. Microsoft, for example, offers good interoperability and is committed to developing platforms that work well with legacy technology.

5. Start building your data pipelines for a SSOT – With the tech and tools in place, you can establish the data pipelines to create your centralised SSOT. The pipelines will extract data from multiple sources, clean it, and transform it to ensure consistency.
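
The extract-clean-transform pattern behind such a pipeline can be sketched in a few lines of Python (the sources, fields and schema here are purely illustrative):

```python
# Minimal ETL sketch for building a single source of truth.
# Source systems, field names and schema are purely illustrative.

raw_crm = [{"id": "1", "spend": " 100.50 "}, {"id": "2", "spend": "n/a"}]
raw_web = [{"id": "2", "visits": 7}, {"id": "3", "visits": 2}]

def extract():
    # In practice: pull from databases, APIs or flat files.
    return raw_crm, raw_web

def transform(crm, web):
    # Clean: drop unparseable spend values and normalise types.
    spend = {}
    for row in crm:
        try:
            spend[row["id"]] = float(row["spend"].strip())
        except ValueError:
            continue
    visits = {row["id"]: row["visits"] for row in web}
    # Conform both sources to one schema keyed on customer id.
    ids = sorted(set(spend) | set(visits))
    return [
        {"id": i, "spend": spend.get(i, 0.0), "visits": visits.get(i, 0)}
        for i in ids
    ]

def load(rows, warehouse):
    # In practice: write to a warehouse or lake table.
    warehouse.extend(rows)

warehouse = []
load(transform(*extract()), warehouse)
```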

6. Identify the models and reporting for your data – You want to choose the appropriate modelling techniques for the insights you’re trying to achieve. These can include:

    • Descriptive analytics for summarising historical data
    • Diagnostic analytics to identify patterns and relationships
    • Predictive analytics to forecast future trends
    • Prescriptive analytics for recommending actions

It’s also important to consider how best to report insights to your audiences. How data literate are the different audiences? Reporting tools are available, giving you plenty of options.
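
To give a flavour of the simplest of these, descriptive analytics boils down to summarising what has already happened. A toy example in Python, with illustrative sales figures:

```python
# Descriptive analytics: summarise historical data. Figures are illustrative.
monthly_sales = [120, 135, 128, 150, 160, 155]

total = sum(monthly_sales)
average = total / len(monthly_sales)
growth = (monthly_sales[-1] - monthly_sales[0]) / monthly_sales[0]

print(f"total={total}, average={average:.1f}, growth={growth:.1%}")
# prints total=848, average=141.3, growth=29.2%
```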

7. Plan to ensure widespread internal adoption of your insights – As mentioned, you must develop data literacy and create a data-driven culture. Training programs are essential, and senior executives need to offer visible support. Also, celebrate any progress individuals or teams make on their digital journeys.

8. Stay focused on data privacy, governance and security – Governance and security should be at the heart of any strategy. You want to foster a culture that highly values data governance across the business. You can also leverage tools that help automate and support governance and security tasks.

9. Review adoption and gather feedback – Run surveys and assessments to monitor improvements in data literacy and gather feedback from employees and stakeholders on the effectiveness and usability of the data tools. What’s more, look for data-driven decisions in key business areas.

 

How we can support you

Scaling your data insights can feel daunting, but with our expertise in data strategy development, we can provide you with a solution to get back on track and achieve your scaling goals.

At Ipsos Jarmany, we follow an outcome-driven approach that, with our experience and skillsets, helps customers fast-track their data strategies. We’re also experts in the latest tech and tools, giving leading businesses the insights to move the needle on their KPIs.

Get in touch, and let’s start a conversation about scaling your insights today.

Join forces with Ipsos Jarmany to turn your 2024 goals into reality

* https://europeanbusinessmagazine.com/business/businesses-increase-data-analytics-investment-by-54-in-2024-new-study-reveal/

** https://explodingtopics.com/blog/data-analytics-stats

Boosting Customer Satisfaction: How Data And Analytics Drive Personalised Customer Experiences

Are people becoming harder to satisfy? Yes. The likes of Amazon and Apple have set new standards in customer experiences that we expect other businesses to match. But there are other factors, too. The cost-of-living crisis has made all of us a lot more critical. According to the Institute of Customer Service, customer satisfaction in the UK has fallen to its lowest level since 2010.

In this blog, we’ll explain why personalising customer experiences can boost customer satisfaction, and why we believe leveraging data, and applying analytical techniques, is the mechanism for achieving this. We’ll show how you can kick-start your personalisation strategy today using the data already at your disposal and build from there.

 

Why customer satisfaction and personalised experiences are critical

Customer satisfaction is critical for long-term business success. It often correlates with higher revenues and market share. There are many ways to increase satisfaction levels, but one of the most effective is improving the customer experience.

A better experience drives customer loyalty and retention, reducing churn and increasing customer lifetime value. It directly impacts perception and encourages repeat business. And it converts customers into brand advocates, driving sales through positive word of mouth.

Particularly in saturated markets, where competition is high and growth increasingly difficult, customer experience can make all the difference. It sets companies apart, meeting a need that many businesses don’t seem to be fulfilling. A survey in the US found that 81% of customers prefer companies that offer a personalised experience. Plus, 70% said it was important for personnel to know their past purchases and interactions.

 

Want to become more data-driven? Download our ebook today to find out how

 

How data fuels customer personalisation

Personalisation starts with data. It helps businesses understand customer preferences and behaviours. Data can show everything from gender to preferred touchpoints, purchase histories, and brand perceptions. With these insights, companies can tailor their interactions and offerings to customers’ requirements.

Multiple types of data are available, and all of them are useful in personalising customer experiences; however, it doesn’t matter if you don’t have access to every one. Focus on using the information you do have, then devise a plan to collect the data you’re missing and integrate it with your existing data.

Here’s a list of the kinds of data you want to collect:

  • Primary data – You collect this from your website and apps. Primary data covers click patterns, browsing history, search records, reviews, and user preferences. It will help you understand individual user behaviours and preferences.
  • Survey data – To provide a more nuanced understanding of your customer base, you should conduct customer feedback surveys, preferences surveys, net promoter score surveys, and customer satisfaction surveys.
  • Third-party data – This covers areas like web scraping, which we discussed in a previous blog. It also includes macro data to better understand the world and insights from data brokers or aggregators, which sell data to help you better understand your target audience. We can also add data clean rooms, which are secure environments where brands, publishers, and advertisers can share and analyse their first-party data. Plus, there are walled gardens, such as Google’s Ads Data Hub, which provide detailed insights into user behaviour, preferences, and demographics.
  • Historical sales data – Using sales data to support personalisation is self-explanatory. It’s an excellent source for understanding a customer’s preferences and buying patterns, and it can help identify upsell and cross-sell opportunities.
  • CRM data – These systems are designed to support personalisation and are a great tool for tailoring customer interactions. CRMs record and manage all customer-interaction data, creating a unified customer profile. You can then interrogate that profile to help personalise customer communications.
  • Social media engagement data – Using social media data, you can learn about customers’ preferences and interests. Many companies use it to segment their audiences based on commonalities such as demographics and behaviours.

 

Turning data into insights to guide your customer experience strategy

Get to work with the primary data at your disposal. Start measuring and tracking how customers interact with your website and apps. Conduct customer journey analysis to expose the cause-and-effect relationship between touchpoints and marketing channels. It’ll give you an immediate understanding of how different channels work together to shape customer behaviour.

Here are some of the many techniques for extracting personalisation insights from your data:

  1. Heatmapping – Heatmaps visually represent website behaviour through colour-coded overlays. You’ll see, for example, the areas of your website that draw your customers’ attention and where they may be searching for information, giving you insights to improve signposting.
  2. Behavioural analysis – This enables you to examine how, when, and why customers engage with your company through purchasing habits, brand interactions and product usage. It’s a great aid for customer profiling and segmentation.
  3. Conversion rate optimisation (CRO) & funnel analysis – CRO is the systematic process of increasing the number of website visitors that perform a desired action. Testing and optimising webpages allows you to find the right level of personalisation to boost conversion rates. Likewise, funnel analysis, which identifies critical events along the customer journey, will help identify those points where personalisation will have the greatest impact.
  4. A/B and multivariate testing – Using A/B and multivariate testing, you can spot the best-performing personalisation messaging, visuals, and signposting strategies. You’ll find CRO and A/B testing an effective combination for creating experiences that resonate with consumers and increasing website performance.
  5. Predictive analytics – For this technique, you can use propensity modelling to predict the likelihood of future action based on historical user activity. As such, you can pre-empt customer needs, delivering the right message to the right person at the right time to provoke a specific event.
  6. Churn analysis – Here, you’re studying historical churn data to make churn prediction possible. By analysing churn data, you can identify the moments when a personalised customer experience could make the difference between a customer remaining loyal or going to a competitor.
  7. Web scraping – Extracting data from websites can reveal customer behaviour, preferences, and interests, which you can use to tailor your messaging, product offerings, and engagement strategies. It’s a great way to pull information from review sites and conduct customer feedback analysis and sentiment analysis.
  8. AI – With its ability to continuously learn and adapt, AI is taking customer experiences to the next level, supporting real-time personalisation. Through advanced technology, it can adjust recommendations and tailor experiences the moment a customer interacts with a brand across any one of multiple touchpoints.
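
As an illustration of propensity modelling from the predictive analytics point above, the sketch below scores purchase likelihood with a logistic function. The features and weights are invented for the example; in practice they would be fitted to historical user activity:

```python
import math

# Toy propensity model: score the likelihood of a purchase from past
# behaviour. Features and weights are invented, not fitted to real data.
WEIGHTS = {
    "visits_last_30d": 0.08,
    "opened_last_email": 1.2,
    "days_since_purchase": -0.02,
}
BIAS = -1.0

def propensity(features):
    """Logistic score in [0, 1]: higher means more likely to buy."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))

engaged = {"visits_last_30d": 20, "opened_last_email": 1, "days_since_purchase": 5}
lapsed = {"visits_last_30d": 1, "opened_last_email": 0, "days_since_purchase": 120}
```

Scores like these let you rank customers and trigger the right message at the right moment, for instance a tailored offer when a high-propensity customer browses a category page.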

 

What happens when you get personalisation right?

The evidence that personalisation works is pretty conclusive. Take a look at some of the leading personalisation practitioners and the results they’re achieving.

Where to go from here

Admittedly, Netflix, Spotify and Ocado are at the cutting edge of personalisation. However, they demonstrate how powerful a personalised customer experience can be at moving the needle on a wide range of KPIs.

The good news is that you can make significant strides in your personalisation strategy simply by leveraging your primary data. Then, it’s really just a process of developing that strategy and advancing your use of data and analytics to see greater returns.

The truth is that personalisation is critical right now, and companies need to get it right the first time or risk falling behind their competitors. At Ipsos Jarmany, we have the experience and expertise to help you on your personalisation journey, just as we do for many customers. So please don’t hesitate to get in touch.

We can start a conversation on personalising customer experiences whenever you’re ready.

 

Join forces with Ipsos Jarmany to turn your 2024 goals into reality

Building A Data-Driven Culture In 10 Simple Steps

A data-driven business analyses and interprets data when making strategic decisions. It’s the opposite of following your gut.

However, you could ask — why be data-driven if your gut instinct is pretty good? Many prominent business figures have publicly stated they made millions by following their “spider senses”. And, while this may be true, we’d argue that behind every gut player, there is a data-driven person or team course-correcting everything that instinct gets wrong.

 

Want to become more data-driven? Download our ebook today to find out how

 

Our data-driven world

Let’s start by discussing data, and the critical role it plays in business operations; it’s the foundation of every organisation, empowering data-driven decisions and providing justification for those decisions. Additionally, data helps organisations to establish benchmarks that can be universally agreed upon and measured against.

The amount of data globally is growing. By 2025, global data creation will grow to more than 180 zettabytes, and individual companies are seeing data volumes increase by 63% per month.

So, if data is suitable for decision-making, and there’s more of it, then we should all be great at business, right?

Unfortunately, this isn’t always the case. Despite having plenty of data, many businesses aren’t doing as well as expected at leveraging that data to drive commercial insights. In fact, the average company analyses just 37-40% of its data, so a tonne of information that could support decision-making is slipping under the radar. It’s therefore crucial for businesses to make better use of their data by fostering a data-centric culture, where data and analytics are central to the organisation and employees are empowered to leverage data effectively to enhance business outcomes.

 

How do you build a data-driven culture?

Becoming data-driven isn’t like flicking a switch. It takes more than simply purchasing new technologies or setting up a business intelligence dashboard. The truth is that becoming data-driven relies as much on cultural change as any single piece of technology.

For businesses to be data-centric, they need employee buy-in, and data needs to be embedded in all parts of the organisation. It requires investment in tools, technology and training, with clearly defined processes and governance. Ultimately, it requires a data literate workforce, where employees from across the organisation are empowered to utilise data and truly understand the value of data.

As we tell all our clients, a data-driven culture starts at the top—in the boardroom. The senior team needs to start using data in its decision-making process, becoming a role model for the rest of the company. In due course, senior managers and employees will catch on and realise they have to back up ideas with data in order to have serious conversations.

The change process won’t happen overnight, but it will happen if the C-suite is engaged.

 

What a data-driven culture looks like

So, you want to become a data-driven company. What should you be aiming for? To give you an idea, we’ll highlight some gold-standard data-driven companies. 

 
Amazon

The first – and this won’t come as a great surprise – is Amazon.

Amazon has become data-driven by democratising data across the business to support better decisions faster. In a letter to shareholders, company founder Jeff Bezos laid it out thus:

“The senior team at Amazon is determined to keep our decision-making velocity high. Speed matters in business – plus a high-velocity decision-making environment is more fun, too. Most decisions should probably be made with around 70% of the information you wish you had. If you wait for 90%, you’re probably being slow in most cases.”

In other words, it’s OK not to have access to all the data. You just need to make the best use of the data you do have, and be willing to course-correct as more data comes in.

Data drives Amazon from beginning to end. It optimises supply chain and inventory management, pricing strategies, marketing, and advertising. The company uses it for continuous improvement, analysing customer feedback, reviews, and browsing behaviour.

 

Nike

Another great data-driven company is Nike.

Nike has long used analytics to understand customers and drive decisions in digital marketing. Product recommendations are the result of data on user activity, and product design strategies are based on data around user behaviours and preferences; leading to product development that will meet audience expectations.

 

Coca-Cola

Coca-Cola intentionally focuses on strategic, data-driven experimentation and agility. If it didn’t, how would it have the boldness to launch some of its products—which, let’s face it, are quite a departure from Coca-Cola Classic? Here, we’re thinking about Flashlyte, an advanced hydration drink for the Mexico market, and Smartwater Alkaline for North America.

When you look at the diversity of Coca-Cola beverages nowadays and how well the company is doing, it’s clear that it is analysing a lot of data.

 

The benefits of adopting a data-driven culture

Thus far, we’ve delved into what it really means to be a data-driven culture, some gold-standard industry examples, and why data is such an important asset to businesses.

Let’s now explore some of the key benefits of adopting a data-driven culture:

  • Make better-informed decisions – Being data-driven ensures decisions are based on facts rather than assumptions, eliminating human biases.
  • Improve productivity – Data-driven companies give employees across the organisation quick access to accurate data to formulate strategies quickly.
  • Optimise campaign performance – Having the ability to analyse data on customer behaviour and preferences helps companies improve the impact of campaigns.
  • Drive internal efficiencies – Obtaining objective proof of where a problem lies can often be a battle. Data makes that more accessible, and data-driven companies can quickly identify their operational inefficiencies to act.
  • Enhance internal accountability – With data-driven decision-making, companies can track how decisions are made. This improves transparency and accountability, leading to fewer internal conflicts and greater trust.
  • Strive for consistency – Humans find consistency hard to achieve because we all have good days and bad days. Data can provide the consistency, accuracy and objectivity we often need.

 

10 steps to building a data-driven culture

Here are our 10 steps to creating a data-driven culture and ensuring your business is one of the winners.

 

1. Investment

Deploy the infrastructure to gather and process data from across the business. Implement the systems to analyse the data and provide tailored insights to different teams across the organisation.

 

2. Buy in from the top

Ensure the leadership team makes decisions based on data, setting the expectation across the company that choices are backed by facts and can be explained later by looking at the figures.

 

3. Start from the hiring process

Develop a team of data experts, including data analysts and, depending on the amount of data you have, data scientists. Data scientists can build and deploy data models while analysts maintain your data infrastructure and provide data reports to the company.

 

4. Integrate data across departments

Make sure data isn’t stuck inside specific departments, creating siloes. If data is integrated, a business can have a more comprehensive, holistic and data-driven view of operations for making better decisions.

 

5. Aim for consistency

Select canonical metrics and programming languages to ensure consistency, particularly as a data-driven culture evolves. Without these benchmarks, much time can be wasted debating over different versions of a metric.

 

6. Data governance

Developing policies that determine how data is gathered, stored, processed, and disposed of is critical. These policies also control who can access which kinds of data. This will ensure better quality outcomes and regulatory compliance.

 

7. Encourage experimentation

By allowing personnel to fail when experimenting with data, companies often reveal hidden insights that unlock greater value from their operations.

 

8. Don’t isolate data teams

Data-driven cultures thrive when data scientists and analysts are embedded in a business’s operations. The better the data folks know what’s going on across a company, the more likely they’ll start delivering insights that have a real impact.

 

9. Explain choices

Set expectations around transparency and accountability so that personnel know they may need to validate their decision-making process. This allows for feedback that, in turn, will support continuous improvement.

 

10. AI integration

Even in the early stages of developing a data-driven culture, it’s useful to consider how artificial intelligence (AI) can take your decision-making to a new level. AI is becoming a critical enabler for companies, supporting more informed decisions, predictive analytics, the automation of routine tasks, and much more.

 

Each step in developing a data-driven culture is important. They can’t be rushed or skipped. At the start of a company’s data journey, the process of becoming data-driven can seem daunting. We’ve seen instances where, due to a lack of experience, time-to-value has taken much longer than originally expected.

 

Helping reduce time-to-value

It’s important to remember that a strong data-driven culture is proactive, not reactive, and it doesn’t happen overnight. At Ipsos Jarmany, we can guide your data investment and support you in becoming data-driven faster, so you can achieve time-to-value sooner than expected. Our teams of consultants, who have extensive experience building data-driven cultures, work closely with customers, helping them navigate towards data-driven decision-making and enabling them to gain a competitive advantage, innovate faster and boost productivity.

 

Start a conversation on becoming data-driven today by contacting us.

 

Join forces with Ipsos Jarmany to turn your 2024 goals into reality

A Beginner’s Guide To Web Scraping

Web scraping is a technique for automating data extraction from web pages. It typically involves scripts, often written in Python and run on virtual machines, crawling web-page HTML to extract data.

The data from web scraping can serve many purposes. Essentially, digital professionals will want to use the information to answer a series of questions, such as how to increase sales, reduce costs, improve customer satisfaction, and reap other business benefits. In this article, we refer to web scraping specifically in the online retailer space, generating insights around performance on third-party affiliate sites; however, the opportunities for leveraging web scraping are endless.
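
As a taste of the mechanics, the sketch below parses a product-listing-style HTML snippet with Python’s standard-library HTMLParser. A real scraper would fetch live pages (and should respect robots.txt and each site’s terms of service); the markup and class names here are invented:

```python
from html.parser import HTMLParser

# Parse a product-listing-style snippet into structured records.
# The markup and class names are invented for the example.
PAGE = """
<ul class="plp">
  <li><span class="name">4K TV 55in</span><span class="price">£499</span></li>
  <li><span class="name">Soundbar</span><span class="price">£129</span></li>
</ul>
"""

class PLPParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.products = []
        self._field = None  # which field the next text node belongs to

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            if cls == "name":
                self.products.append({})  # a name starts a new product
            self._field = cls

    def handle_data(self, data):
        if self._field is not None:
            self.products[-1][self._field] = data
            self._field = None

parser = PLPParser()
parser.feed(PAGE)
# parser.products == [{'name': '4K TV 55in', 'price': '£499'},
#                     {'name': 'Soundbar', 'price': '£129'}]
```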

 

Discover how to ace your omnichannel analytics with our latest ebook


Why web scraping is important 

It’s easy to see why web scraping is essential to brands in our data-driven, online world. It’s a reliable way to gain insights that help optimise decision-making across product development, product placement, pricing, promotions, and more.

Any doubts about web scraping’s value can be dispelled with a quick look at the market for web scraping software. Driven by the continued growth of e-commerce, the market for web scraping software is expected to grow from $1.1 billion in 2024 to $2.49 billion by 2032. 

 

The different types of web scraping  

In the online retailer space, there are numerous types of data that can be harvested. These can include everything from brand visibility and banner usage to search terms, pricing, and customer reviews. It’s a long list. The critical point is that brands use a combination of web scraping techniques and decide which ones to focus on based on the questions they want answered.

To make things easier, we’ve divided the techniques into several broad categories, with a few sub-categories for extracting data from specific web page features. 

 

Product listing page scraping 

Product listing pages (PLPs) list products under various categories on a website. They are a vital part of any e-commerce site, providing search engine visibility and a better online shopping experience for customers.  

PLPs, also known as category pages, contain valuable information on product visibility. When scraped, they can reveal insights into your products’ popularity compared to competitors’ and the characteristics of the most popular products. 

 

Filter scraping  

This is more of an extension of standard PLP scrapes. Rather than scrape category pages, marketers scrape filters on a PLP web page. For instance, in the case of TVs, brands can scrape for filters such as screen size or price and see what share of visibility their products gain under these terms.  

 

Banner scraping 

Again, this builds on PLP scraping, delivering an additional key performance indicator (KPI). It allows marketers to track daily banner changes to see brand share on key pages. Brands also use it to check their banners are appearing on web pages per their campaign plan.  

 

Product description page (PDP) scraping 

This kind of scraping takes place on the page where a product is listed. It’s more taxing than PLP scraping and takes longer because of all the insights available. While PLP scraping might be daily, PDP scraping could be weekly.

Brands can gather information like the number of product reviews, reviewer ratings, and product images or videos available. Other scrapable data includes product price, discounting, and stock information. They can also see the current product description. 

 

Search scraping 

Here, marketers are scraping data on different search terms. This shows you what products are visible under which search terms. In practice, a marketer could scrape 5-10 generic search terms for a product to obtain an average visibility score.
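
The averaging step can be sketched as follows, with invented ranks and a simple rank-to-score conversion (real visibility scoring will depend on how your scrapes define placement):

```python
# Average a product's visibility across scraped search terms.
# Ranks and the rank-to-score rule are illustrative.
scraped_ranks = {  # search term -> product's rank in the results
    "55 inch tv": 3,
    "4k tv": 1,
    "smart tv": 8,
    "oled tv": 5,
    "tv deals": 12,
}
RESULTS_PER_PAGE = 24

def visibility(rank):
    # Rank 1 scores 1.0; scores fall linearly down the page.
    return max(0.0, 1 - (rank - 1) / RESULTS_PER_PAGE)

avg = sum(visibility(r) for r in scraped_ranks.values()) / len(scraped_ranks)
# avg == 0.8 for these ranks
```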

 

Typical use cases for web scraping 

There are many use cases for web scraping, and these are our top 5: 

  1. Consumer sentiment analysis – Essentially, web scraping allows you to filter out the noise and gain direct feedback and sentiment from your target audience. Using PDP scraping, you can analyse user-generated text from reviews to evaluate product performance.
  2. Lead generation – One way to overcome the challenges of generating leads is to use web scraping, a low-cost form of collecting relevant information on potential customers. It can provide information such as email addresses, job titles or company names.
  3. Content strategy monitoring – Data from web scraping can provide insights into competing brands’ content strategies and search engine optimisation tactics. It can help brands refine their own approach in line with the latest market trends.
  4. Price comparison – Optimised pricing is key to success in any competitive market. With web scraping, brands can access up-to-date information on competitors’ pricing to improve the effectiveness of their pricing strategies.
  5. Supply chain and inventory monitoring – Web scraping can reveal e-commerce retailers’ stock levels for a brand’s product. Likewise, it can monitor, among other things, the price of raw materials used in the manufacturing process. As such, brands can use the insights to identify a shortage of products in stores and potential supply chain issues. 

What are the business benefits of web scraping? 

It’s easy to see the business benefits of web scraping from the use cases. Again, a rapid online search would provide you with a long list, but to save time, we’re focusing on the main ones: 

  • Greater revenues – With granular detail on a product’s market performance and competitors’ performance, brands can identify gaps they can look to fill. Furthermore, with deep product insights updated daily, they can optimise product development, their go-to-market strategy, and the product’s lifetime performance to maximise sales.
  • Cost savings – Web scraping is a software-driven, automated process that can be operated 24/7. Therefore, it offers a highly cost-effective way to obtain data that can help determine the success of a product or a business.
  • New markets – Brands can practise web scraping across many websites and web pages in multiple regions. Theoretically, this can provide data on the viability of launching products in new markets or developing products to fill a noticeable gap. The same techniques can also help maximise sales among first-time audiences in existing markets.  

How to ace web scraping in 10 simple steps 

  1. Choose the business questions that you want to answer
  2. Complete a data audit to understand what information you already have
  3. Identify the gaps in your data and the possible online sources
  4. Ask yourself how granular you want the data to be
  5. Define what your KPIs are going to be
  6. Begin creating the Python scripts to scrape the data you need
  7. Create a process for extracting, managing and storing the data
  8. Ensure your processes are compliant with current data regulations
  9. Decide what data will go into your Power BI dashboard reports
  10. Determine who gets access to what dashboard reports
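To illustrate steps 6, 7 and 9, here's a minimal Python sketch that takes scraped records and serialises them into CSV, a format Power BI can ingest directly. The field names and values are hypothetical, not a fixed schema.

```python
import csv
import io

# Hypothetical records produced by a daily PLP scrape (step 6); the
# fields are illustrative examples of the KPIs you might capture.
records = [
    {"date": "2024-03-01", "retailer": "RetailerA", "brand": "AcmeTV", "share": 0.50},
    {"date": "2024-03-01", "retailer": "RetailerB", "brand": "AcmeTV", "share": 0.42},
]

def to_csv(rows):
    """Serialise scraped rows into CSV text ready for import (steps 7 and 9)."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(to_csv(records))
```

In practice you'd write the output to a governed storage location rather than print it, so the same extract feeds every dashboard consistently (step 8 onwards).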

How to deal with the technical side of web scraping

Web scraping is conceptually simple, but it comes with specific technical requirements. There’s no getting around the fact that you’ll need some Python expertise, and that your processes for extracting, managing and storing the data should be well-polished.  

Often, companies prefer to focus their resources on other things rather than creating Python scripts, managing scraped data, and hosting data. While they may love the idea of dashboards populated with insights that are easy to grasp and share with colleagues, they question whether spending time and money teaching staff how to convert raw data into a Power BI-friendly format is the best option.  

All of this is entirely reasonable.  

That’s why, at Ipsos Jarmany, we’re helping an increasing number of clients get the maximum value out of web scraping. With our expertise and experience, we’re helping them formulate their strategies and execute their campaigns to deliver insights that are helping improve the bottom line and identify new business opportunities. It’s delivering outstanding results for them and can do the same for you.  

Start a conversation on web scraping at its best by contacting us today.

 

Join forces with Ipsos Jarmany to turn your 2024 goals into reality

A Day In The Life Of A Junior Analyst

As part of my university degree, I decided to do an industrial placement year to help gain skills and experience within the data analytics industry. Throughout my internship thus far, I’ve been exposed to many different areas within data and analytics, and this is something which I love about my role; every day is different. 

So, let me tell you more about my internship so far…

 

About my role and the team

I work in a team that has five members; this includes two Junior Analysts, a Commercial Analyst, a Senior Analyst, and a Consultant. The Consultant manages the team and our workload. We collaborate with one another on a daily basis, often starting the day with a morning catch-up meeting to talk through our current priorities. 

Alongside this, I have a separate bi-weekly meeting with my manager which gives me the opportunity to voice any concerns about work or social life. Our projects have varying time constraints, so active communication is very important when working on the group projects, and asking for help is essential with any individual work. Everyone on my team is a delight to be around, so asking for help is easy. 

As a member of the Microsoft account team, one of our daily tasks involves providing assistance to Microsoft Account Managers who approach us with various queries. These inquiries often revolve around resolving data submission issues, troubleshooting data systems, or addressing any other Microsoft data-related challenges.  

In addition to supporting Microsoft Account Managers with their queries, our team also oversees multiple processes. Within my role, I have specific individual tasks that encompass aspects such as: 

  • Ongoing management of George the Chatbot – this is a bespoke AI solution that was created by my team to give Microsoft Account Managers quick answers to common questions. It has helped to create efficiencies, such as streamlining the email process so queries are directed to the Ipsos Jarmany team inbox. As part of my role, I ensure that George contains up-to-date information, as well as offering suggestions for improvement. If these optimisation suggestions are approved, I use Microsoft’s automation software, Power Automate, to add the new topics to George. 
  • Identifying and resolving data discrepancies – Another task that is specific to my role is comparing rebate units from two data systems that store third-party sales data. This is a monthly task, and involves me identifying discrepancies, investigating why there are discrepancies, and then aiding the rebate team on how to resolve these discrepancies. This is important as the data relates to the number of units sold via third parties, and therefore indicates how much incentive each partner gets paid for selling Microsoft products. 
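As a simplified illustration of that monthly check, here's how you might compare rebate units across two systems in Python. The partner names and figures are invented, and a real comparison would pull the data from the systems themselves.

```python
def find_discrepancies(system_a, system_b):
    """Compare rebate units per partner across two data systems and
    return the partners whose figures don't match."""
    mismatches = {}
    for partner in set(system_a) | set(system_b):
        units_a = system_a.get(partner, 0)
        units_b = system_b.get(partner, 0)
        if units_a != units_b:
            mismatches[partner] = {"system_a": units_a, "system_b": units_b,
                                   "difference": units_a - units_b}
    return mismatches

# Hypothetical monthly rebate units reported by each system.
system_a = {"Partner1": 120, "Partner2": 85, "Partner3": 40}
system_b = {"Partner1": 120, "Partner2": 90}

print(find_discrepancies(system_a, system_b))
```

The output is a shortlist of partners to investigate, which is the useful bit: it tells the rebate team exactly where the two systems disagree before any incentives are paid out.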

 

Tools and software I use in my role 

Continuous learning and development play a significant role in my internship experience. At Ipsos Jarmany, we have access to a suite of online training resources which helps us to develop a variety of skills, both soft and platform specific. As part of my role, I use a variety of tools and software, however in particular this includes: 

  • Microsoft Excel 
  • Microsoft Power BI 
  • Microsoft Power Query 
  • Microsoft Power Automate
  • And a range of coding languages, including Python and VBA. 

 

Opportunities I have experienced on my placement 

There have also been opportunities for me to get involved in internal development projects. This could involve anything from improving an outdated process to creating new tools that are designed to help Account Managers. For example, 

  • I recently decided to take initiative and improve our monthly checking process by automating the Excel files using Python. Firstly, using the knowledge I learnt from my online courses, I recorded macros and edited the VBA code. Then, I converted this code into Python, using xlwings (as Python can be coded more efficiently than VBA). Consequently, the process now runs much faster and we have significantly more time to investigate the potential discrepancies. 
  • My team is currently working on a group project, which involves automating weekly sales and marketing emails that go out to the Account Managers. I was tasked with creating new Excel files in our SharePoint and then automating the process of filling them with data from another table, and lastly attaching these files to the emails using Power Automate. I also used HTML to add logos to the end of the emails. This project really allowed me to develop my Power Automate skills and gave me the opportunity to learn a different coding language. 

 

What a typical day looks like

One of the great things about my role is that every day is different and I have the opportunity to get involved in a variety of different projects. However, to give you a taste of what a typical day could look like, take a look at the below agenda. 

Time 

Tasks 

8:50-55 

Arrive at the office and choose any desk to connect to a monitor (as we hot desk), preferably somewhere close to my team members. 

9:00 

The working day starts! I first tend to check my emails, or start working on urgent projects straight away. 

9:15 – 9:45 

This time is usually reserved for a team meeting, so we can catch-up and discuss daily and long-term priorities. 

10:00 

Focus time: Usually I check if any new queries (A.K.A tickets) have come through in our inbox. If nobody has been assigned the ticket, I begin solving the issue. If there are no ticket updates, I tend to use this time to research new topics that we may want to add to George the Chatbot. 

11:30 

Grab a snack from the communal snack table. 

13:00 

Lunch time! I often treat myself to something from Kingston market to reward myself if I have been having a productive morning. Or if work has been slow, I treat myself to something from the market to raise my spirits… 

14:00 

Back to work: I use this time to check the team inbox for any responses to my tickets and, if so, communicate with the team if there are issues. The monthly checks process is coming up, so I’ll also double-check all documents are prepped and ready to go. 

15:00 

30-minute call with Sales Operations Manager to discuss new George topics. During this call I make sure to take notes so I can then edit the topics in Power Automate accordingly. Part of this then includes keeping the relevant stakeholder team in Microsoft updated with progress. 

16:00 

If I’ve finished all essential work for the day, I then use this time to continue watching online courses to further expand my knowledge. 

17:00 

Social time! We take it in turns to organise the team socials, like a general knowledge quiz. On the occasions that I’m not organising it, I relax and participate! Note to self: Win the social. 

17:30 

Done for the day! Follow everyone to the pub for pints. 

 

My highlights

Ipsos Jarmany tend to mostly recruit graduates, so we’re a very social crowd and this is an aspect of my internship that I really enjoy. I just work in one team that’s associated with the Microsoft account (there are probably around 5 different teams at Ipsos Jarmany), and every week all the Microsoft teams come together for a joint social.  

On a Friday, the entire office takes part in ‘forced fun’ social time, which is often a quiz, or another fun activity. Everyone is super friendly and approachable, and we have a really nice and social culture where we often go to the pub after work (or attend the local open mic night). We also have a company social committee who organise monthly events, like bowling or rounders competitions. This is a great opportunity to mingle with other employees outside of your own team (and there’s sometimes free drinks and food!).  

Alongside this, we have Junior Analyst socials which are always spectacular. I try to organise these in my spare time so everyone has the chance to wind down. 


Some of my key achievements and highlights (so far) include: 

  • Automating a lengthy process down to a few seconds 
  • Taking ownership and responsibility of my tasks  
  • Getting the opportunity to use my knowledge to help other people 
  • Our Ipsos Jarmany socials 
  • Making friends for life 

I’ve really enjoyed my internship so far and I can’t wait to see what’s next. If you’re interested in becoming a Junior Analyst at Ipsos Jarmany, as part of your university placement year, then take a look at our open opportunities today. 

Propensity Modelling – Why You Need It In Your Digital Lives

 

We’re focusing on propensity modelling because it’s a tool that digital professionals can really benefit from, and we’ve recently discovered that whilst many are familiar with the term, there seem to be some unknowns around the details. 

Therefore, we thought it was time to lift the lid on propensity modelling so you can make a more knowledgeable decision on bringing this sciencey stuff into your working lives. (Safe to say, we think you should).

 

 

What Is Propensity Modelling?

A quick online search reveals propensity modelling as “statistical approaches to predicting the probability of particular users, say customers or leads, performing certain actions”. That all sounds rather dry. Think of a technique that could predict the probability of yoga practitioners drinking herbal teas or customers using a chatbot on your website instead of calling customer services.

Propensity modelling is a big deal. Predicting the likelihood of someone doing something is a powerful tool for marketers and sales professionals in particular, but also for other business functions too. They can use it to improve campaigns, make more effective decisions, and improve customer satisfaction. 

McKinsey has reported that 71% of consumers expect companies to deliver personalised interactions nowadays, reinforcing our view that propensity modelling is a must-have digital tool for successful companies.



Why Is Propensity Modelling Important?

You can argue that propensity modelling is even more important nowadays. Many consumers still struggle with the rising cost of essential items and must cut back on non-essential purchases. In 2023, the global state of consumers was described as unsettled. Our question would be, have things changed in 2024? We don’t think so; hence, propensity modelling seems a great way to combat consumer instability.

One industry that has spotted this is publishing. Publishers went big on propensity modelling after they fell victim to the great unsubscribe when people culled their number of subscriptions after a peak during the pandemic. Publishers like Mediahuis in Belgium have used it to help retain their user base. When customers showed a high propensity to churn, they received marketing phone calls from Mediahuis, which increased retention by 14.17 per cent in just three months.

 

 

The Typical Use Cases For Propensity Modelling

Using propensity models, companies can address areas such as:

  • Enhance user experience – Through propensity modelling, organisations get a better understanding of customer tastes and behaviour. Using these insights, they can personalise experiences to boost satisfaction and loyalty.
  • Improve conversion rates – Models can predict the likelihood of a customer making a purchase based on data like past purchasing behaviour and browsing history. Businesses can use this insight to target the people they see as most likely to convert.
  • Reduce churn – Using this model, companies can predict the probability of a customer terminating a relationship. Remember the publishers? Using this data, they knew where to direct the marketing calls.
  • Maximise responses – Called propensity-to-respond models, these statistical tools help predict the likelihood of a customer responding to a marketing campaign or promotion. For instance, they can determine the probability of a customer clicking on an email.
  • Measure lifetime value – This modelling helps companies predict the amount they will likely receive from a person over their lifetime as a customer. It identifies the most valuable ones, showing where to focus resources.
 
 

What are the benefits of propensity modelling to businesses?

In terms of how propensity modelling impacts the bottom line, here is a list of some of the key business benefits:

  • Increased ROI – Companies often perform extensive testing to maximise returns when developing new products and services. Propensity modelling analyses data that can refine testing procedures, accelerating progress towards the product and service with the best ROI.
  • Reduced costs – Propensity modelling benefits decision-makers, giving them insights to make better choices. For example, data from a propensity model can help focus marketing campaigns, making them more targeted and cost-effective. 
  • Saved time and budget – This third benefit is closely related to the preceding two. It gets to how propensity modelling provides data that can help companies save time and money during development. Compared to real-time testing, which is time-consuming and expensive, computer-driven propensity modelling is less costly and faster. 
  • Improved customer satisfaction – Customer satisfaction is a primary focus for businesses in these unsettled times. Propensity modelling can help pinpoint customers who aren’t particularly satisfied and could be thinking of leaving. This allows businesses to turn the situation around and improve satisfaction scores.

 

 

Businesses using propensity modelling

Industries from manufacturing to banking, telecommunications, and utilities have adopted propensity modelling. It has helped them develop websites and services.

For instance, Mitsubishi Motors has successfully used propensity modelling to predict the likelihood or propensity of consumers completing the build-and-price tool for its website. Vodafone has used propensity modelling and other analytical techniques to identify enterprises and customers that would benefit most from 5G. In the case of EDF Energy, propensity modelling was used to reduce churn levels successfully.

 

 

How to start propensity modelling

The first thing to say is making friends with at least one data scientist is a smart move. We’ll lay out some steps to get the ball rolling, but with the warning that you will need someone with a mathematics background to hold your hand. 

So propensity modelling in 5 simple steps would be as follows:

  1. Gather your dataset to use for your model
  2. Create a propensity model for your chosen use case
  3. Explore and validate your dataset
  4. Configure and train your model
  5. Start predicting using your model
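To make those steps tangible, here's a deliberately toy propensity model in pure Python: it "trains" by computing historical conversion rates per customer segment and "predicts" by looking them up. A real model would use machine learning (for example, logistic regression in scikit-learn) on far richer features; the segments and data here are invented.

```python
from collections import defaultdict

class ToyPropensityModel:
    """A deliberately simple propensity model: the predicted probability of
    conversion for a segment is that segment's historical conversion rate
    (step 4: train), looked up for new customers (step 5: predict)."""

    def __init__(self):
        self.rates = {}
        self.overall = 0.0

    def train(self, customers):
        counts = defaultdict(lambda: [0, 0])  # segment -> [conversions, total]
        for segment, converted in customers:
            counts[segment][0] += int(converted)
            counts[segment][1] += 1
        self.rates = {s: c / n for s, (c, n) in counts.items()}
        total = sum(n for _, n in counts.values())
        self.overall = sum(c for c, _ in counts.values()) / total

    def predict(self, segment):
        # Fall back to the overall rate for segments we haven't seen.
        return self.rates.get(segment, self.overall)

# Step 1: a (hypothetical) dataset of (segment, converted) pairs.
history = [("yoga", True), ("yoga", True), ("yoga", False),
           ("running", False), ("running", True), ("cycling", False)]

model = ToyPropensityModel()
model.train(history)            # steps 3-4: explore/validate, then fit
print(model.predict("yoga"))    # 2 of 3 yoga customers converted
```

This is where your data scientist friend earns their keep: the real work is in choosing features, validating the dataset (step 3) and picking an algorithm that generalises, rather than in the lookup itself.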

Up to this point, data scientists (like your new friend) have always been the go-to for propensity modelling. But you won’t be surprised to learn that they increasingly share the spotlight with artificial intelligence (AI). Indeed, the demise of data scientists is being openly discussed.

The reality is that AI has transformed the accuracy of predictions based on existing data sets, accelerating the analysis process and uncovering trends and anomalies faster than traditional models. What’s more, off-the-shelf solutions are available to power your propensity modelling ambitions, simplifying everything tremendously. That said, while these solutions may meet the needs of smaller companies, a commercial offering may fall short of the requirements of larger businesses. It’s the typical trade-off you’d expect.

 

 

What technical capabilities do you need for propensity modelling?

In our experience, many companies lack the resources to create their own propensity models but have reservations about the ROI of commercial solutions. They feel stuck.

At Ipsos Jarmany, we have propensity modelling capabilities which can help many businesses overcome the hurdles that seem to be holding them back. Our teams of data science experts can create tailored propensity models using machine learning algorithms that will track variables and predict the likelihood of a particular event occurring.

Our flexible approach ensures customers gain the right service to meet their business needs. Plus, our focus on business value helps the companies we work with achieve the ROI they need.

Start the conversation on developing your propensity modelling practice by getting in touch with us today.

 

 

Top Tips For Landing An Internship in Data and Analytics

“I wasn’t sure what I wanted to do; I just wanted to explore something that I found interest in doing…”

That was my thought process when I was applying for internships and placements. As a 2nd year chemistry student at the University of Warwick, and an analytics enthusiast, I was looking to find a career that matched my interests and skills. I continued fumbling around, exploring different roles until I found the data industry – or maybe the data industry found me.

I was drawn to how data and analytics were used to tell stories and help businesses make better decisions. Without any prior experience in the field, I decided a year-long placement in the industry was the perfect opportunity to fully immerse myself in the world of data and analytics. So, that’s what led me to Ipsos Jarmany, where I’m now partway through my internship year, supporting one of their key clients with offer management in the consumer tech space. So far, it’s been an enjoyable and eye-opening experience, picking up skills in a field I hadn’t even considered 18 months ago.

 

Getting started on my search for an Internship

My journey was not linear, and the “application season” took a toll on me at times, but it also taught me a lot along the way. I had to be resilient and patient, learning hard and fast to bounce back from rejection. At times, it felt like walking through the foggiest of forests, not knowing if and where you would come out. But I guess no career journey is entirely linear; until December of last year, I wasn’t even applying to data-related roles.

What helped me was knowing where to start. The first place I turned to was Bright Network; a platform which connects students with employers and opportunities. I had previously enjoyed their Network Festival events, as well as courses, advice, and job recommendations, which helped me to discover new roles and sectors. Bright Network was a great way to learn about my interests and build upon my skills.

But where Bright Network helped me become interested in the data industry, it was Gradcracker that directed me to my current role at Ipsos Jarmany. In my eyes, Gradcracker is the holy grail of placement platforms for STEM students; it had a plethora of opportunities, most of which were tailored to me. Although, I’ll admit, it took months of sifting through tons of ads and applying to many positions hoping for a response, whilst simultaneously upskilling myself and improving my application in the process.

The last tool in my arsenal was networking. This was not immediately obvious to me; it involved speaking to friends and family about the process and learning what they had picked up about networking and the job application process in general. For me, attending networking and corporate events helped me to learn more about data, tech and working life. I began trying to simply enjoy the interview process, and I found that it really helped having friends review my application and answers.

Another tool that stood out was LinkedIn; the ultimate networking site. I followed companies and groups, set up alerts and filters, and connected with people in my field. LinkedIn helped me become more aware of roles and increase my visibility, and I highly recommend it to anyone looking for a placement or an internship.

 

Acing the interview process

Getting to the point where you’ve established what kind of role, and field, you’re looking for in your internship search is a pivotal moment. Finding a company you want to work for is great too, but then you have to ask yourself, ‘why should they hire you?’

This is something you need to demonstrate in the interview process, and, given how popular internships are, this is something you need to ace if you want to stand out amongst the crowd.

I would be lying if I said I was confident when it comes to interviews. I still remember mine like the back of my hand. I was wearing a black suit and debating whether to wear a tie or not to this online interview.

What I will say is that you should expect to be asked a mix of situational, behavioural and competency questions. For me, it helped to write down my ideal answers to common questions and read them aloud, trying to strike a balance between actually answering the question, not sounding too rehearsed, and not talking too much. Although this did help in the early stages, I later learnt that the interview process is just as much about them as it is about me; a Q&A conversation rather than the stress I had built it up to be. For that I would have to credit the Motivez ‘application season’ and Global Progress Enterprise mentorship program.

 

To ace an interview, I’d say there are two fundamental elements you need to think through prior to the interview; ‘You’ and ‘The company’.

  • ‘You’ – You will tackle the majority of the interview, and it’s important to use the famous STAR technique; Situation, Task, Action, and Result. It is also important to make sure that you are excited and enthusiastic about the job, and can demonstrate your skills and talents. But also, don’t forget to bring your hobbies and interests, where appropriate, as this shows a lot more than you think and helps the interviewer to get to know you. These are things I constantly doodle now before an interview just to remind myself of key points I want to mention.
  • ‘The Company’ – This involves researching the company, so you understand as much as possible about the business, the work they do, the role, the company values and their ethos. At the end of the day the interview is not only to see whether you are a good fit for the company, but also if they are a good fit for you; after all an interview works both ways. Don’t forget to bring a question or two with you too!

This point may sound obvious, but you should know your CV inside out. You would be astonished how many times you will be asked about your CV, and if you believe that the recruiter will not check and ask about your CV, you’re wrong. Most importantly, always be prepared to not just talk about what is on your CV, but to expand on each point, especially your interests, hobbies and accomplishments.

 

Skills you need

So, whether you’re looking to be a data analyst at Ipsos Jarmany, or another similar company, here are the 4 skills you need to demonstrate:

  • Microsoft Excel – Take it from someone who does not have a Maths or Computer Science degree, proficiency with Excel is important! I had to demonstrate being able to manipulate and summarise data in different ways, using various functions and formulas in Excel, such as VLOOKUP, SUMIF, COUNTIF, and Pivot Tables. If, like me, this is something you haven’t done much of before, then definitely get practising.
  • SQL – During my internship so far, I have run a couple of SQL queries, so having a little bit of prior knowledge is certainly an advantage. If SQL is new to you, it’s still worth doing your research so you can demonstrate an awareness of it.
  • Microsoft Power BI – Power BI is a key tool that we use at Ipsos Jarmany. So far, I’ve used it many times as the main basis for our dashboards, and often it has been my fundamental Excel skills that have helped with this.
  • Communication – A skill that is just as important (and often harder to learn) is the ability to communicate effectively with teammates, colleagues, clients, and stakeholders. It’s important to listen to their needs, expectations, and feedback, and to provide clear and concise explanations of your analysis and findings. I have learnt how important good notetaking is, along with planning the day effectively. I have had to learn the balance between being too technical and not technical enough, adapting my approach to different situations, because no two people are the same and not everything will be understood the way you expect it to be. It is great to use data visualisation techniques, such as charts, graphs, and dashboards, to present the results of analysis in a compelling and understandable way, but it is equally important to be able to communicate this so you can clearly and concisely answer questions, address issues, and provide recommendations based on the data.

 

Don’t forget, all these skills can be learned over the course of your internship, and there are many resources to help you pick up and sharpen these skills. I was fortunate enough for Ipsos Jarmany to provide many of these resources and support along the way. Although, having a passion to learn and ability to apply skills learned in the work environment is just as important as the skill itself.

Your Essential Guide to Microsoft Azure for Data Platforms

The cloud market is dominated by the big three: AWS, GCP and Microsoft Azure. We compared them in a previous blog, but now it’s time to say why we believe Azure might be a better business option.

In this blog, we’ll provide a recap of Azure and an update on its performance in the cloud market. We’ll explain why Azure is a great choice for Modern Data Platforms, whose development has become a priority for many organisations. Once that’s done, we’ll touch on key Azure services, its AI developments, including Copilot, and how Azure generally prices services.

 

Let’s begin.

 

What is Microsoft Azure?

Microsoft Azure is Microsoft’s cloud computing platform, providing infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS) offerings. Azure comprises over 200 products and cloud services, including computing, storage, and networking. It also allows customers to build, run and manage applications across multiple clouds, on-premises, and at the edge, using their preferred tools and frameworks.

 

Who is using Azure?

Azure has customers in the healthcare, financial services, government, retail, and manufacturing industries. Its services carry over 100 compliance certifications, giving customers peace of mind in a world with growing data regulations. Plus, Azure’s security continues to evolve, with Microsoft investing $1 billion a year to maintain its cyber defences.

Of the Fortune 500 companies, 95% use Azure for cloud services. With revenues increasing by 30% in the fourth quarter of 2023, Azure has been growing faster than AWS. Moreover, it could be the cloud industry leader by 2026 based on current trends.

 

Why Companies Opt For Azure

Businesses choose Azure because it allows them to adopt cloud computing confidently and at their own pace to align with their budgets. Azure provides:

  • Trust – In addition to comprehensive compliance coverage, customers gain multi-layered defences for their cloud data. Azure clarifies that customers own their data and can be confident about where it’s stored.
  • Hybrid compatibility – Companies can easily connect their on-premises infrastructures to Azure. Moreover, they can integrate Azure workloads with other cloud and edge solutions and manage these multicloud environments with their Azure tools.
  • Your cloud terms – Azure lets customers build cloud infrastructures in the languages and frameworks of their choice, including open-source options. They also gain virtually unlimited scale for their applications to support business growth over time, and Azure offers no-code solutions, making the platform accessible to non-technical users.
  • Future-ready – Microsoft is developing Azure fast, aligning with rapid developments in AI. The company has invested billions of dollars in OpenAI, which created ChatGPT, to make use of OpenAI’s expertise to strengthen Azure’s own AI services.

 

Is Azure a good choice for data platforms?

Data platforms are increasingly at the heart of business strategies for driving expansion. A growing number of organisations recognise that being data-driven is vital to success. As Forbes describes, data-driven companies are 23 times more likely to top their competitors in customer acquisition and are about 19 times more likely to stay profitable.

However, developing modern data platforms requires experience and expertise. Often, companies lack the skillsets to create these sophisticated platforms, or IT doesn’t have the resources and time to complete such mission-critical projects.

 

The Microsoft Intelligent Data Platform

Microsoft is simplifying things by offering an Azure-based modern data platform solution to meet the needs of all companies. Called the Microsoft Intelligent Data Platform, this one-stop solution provides the databases, analytics, AI, and data governance companies require to build a cutting-edge modern data platform infrastructure.

Businesses like NETZSCH Group in Germany, which manufactures grinders and thermal analysis instruments, have adopted the data platform solution. By adopting the platform, which is underpinned by Azure cloud services including Azure AI and Machine Learning, Azure Synapse Analytics, and Power BI, the company gained an out-of-the-box single source of truth, opening new revenue streams.

Other examples include Portuguese energy provider EDP, which is using the Microsoft Intelligent Data Platform to identify the best places to install aerial power lines and electric chargers on public roads. Its Azure-based data platform is fed by a data lake with more than one petabyte of data connected to 60 apps.

 

An Azure-powered Intelligent Data Platform

The Azure services that support the Microsoft Intelligent Data Platform fall under four main categories: databases, analytics, AI and ML, and data governance:

Databases

  • Azure SQL – A fully managed SQL database to power even the most resource-intensive apps and to support mission-critical workloads in the cloud. It scales rapidly and offers intelligent query processing to improve performance.
  • Azure Synapse Link for SQL – This enables near real-time analytics over operational data in an Azure SQL database. It’s an automated system for replicating data from transactional databases into a dedicated SQL pool for analytics.
  • Azure Arc-enabled data services – These extend Azure capabilities across multicloud, on-premises and edge environments. Users can unify, govern and secure their databases to optimise resiliency, performance and costs.

 

Analytics

  • Azure Synapse Analytics – An enterprise analytics service, it accelerates time to insight across data warehouses and big data systems. The Synapse Studio gives data engineers, administrators and scientists a unified experience, making it quicker and easier to complete tasks.
  • Microsoft Fabric – This all-in-one AI-powered analytics solution simplifies analytics for enterprises. A software-as-a-service offering, it seamlessly integrates data and analytics services to reduce costs for business insights.
  • Azure Databricks – Like Azure Synapse Analytics, Azure Databricks unlocks business insights and is set up for open-source Apache Spark environments. It excels in diverse data processing needs when openness and flexibility are crucial.

 

Data Governance

  • Microsoft Purview – This enables you to secure and govern your complete data estate. It comprises a family of solutions that cover tasks such as auditing, compliance management, data lifecycle management and data loss prevention across your Azure environments.

 

AI and ML

  • Azure AI – It encompasses a range of services for developers and data scientists to build, deploy and manage AI applications and solutions. The services help accelerate AI innovation, simplify model operations and ensure responsible AI.
  • Azure Machine Learning – With Azure Machine Learning, data scientists and developers can build machine learning models quickly and develop confidently using streamlined Machine Learning Operations (MLOps).

 

Azure: its AI future

Microsoft is betting big on AI driving growth for Azure. Currently, Microsoft has 53,000 Azure AI customers – one-third of whom took up the service just in the last 12 months. The company has been adding graphics processing units to its data centres as more customers look to run their AI workloads in Azure.

In 2023, Microsoft unveiled AI innovations such as Microsoft Azure AI Studio, offering a one-stop shop to seamlessly explore, build, test and deploy AI solutions using the latest AI tools and machine learning models. It also included generative AI companion Microsoft Copilot, powered by Microsoft Azure OpenAI Service. Microsoft has rolled out Copilot in Windows and Copilot for Microsoft 365; Copilot for Azure is currently in preview. 

With Copilot for Azure, customers can use the AI companion to design, operate, optimise and troubleshoot apps and infrastructure from cloud to edge, helping streamline cloud operations and management. Its purpose is to unify knowledge and data across hundreds of services to increase productivity, reduce costs, and provide deep insights.

 

How much does Azure cost?

Microsoft offers Azure in different service modes to meet every organisation’s needs and budget. It provides substantial savings compared to other clouds, transparent, competitive pricing, and tools to keep costs firmly under control. Its offering can be summarised as follows:

  • Free Tier – This gives new customers access to popular services free for the first 12 months. Customers who start with a free trial must move to pay-as-you-go within 30 days to keep receiving their 12 months of free services.
  • Pay-as-You-Go – No upfront commitment is required, and customers pay only for what they use beyond their free amounts.
  • Reservations – Make a one- or three-year commitment to select Azure services, and Microsoft will pass on savings of up to 72%. Customers can pay for their Azure reservations either upfront or every month.
  • Spot Virtual Machines – With Spot Virtual Machines, customers buy unused compute capacity at significant cost savings. These machines are ideal for workloads that can handle interruptions and don’t need to be completed within a specific period.
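To see how these options compare, here’s a minimal back-of-the-envelope sketch in Python. The hourly rate below is purely illustrative, and the 72% figure is simply the maximum reservation discount quoted above: check the Azure pricing calculator for real numbers.

```python
# Hypothetical hourly rate for illustration only; real prices vary by
# service, region and VM size -- see the Azure pricing calculator.
PAYG_HOURLY = 0.50           # pay-as-you-go rate ($/hour), made up
RESERVATION_DISCOUNT = 0.72  # "up to 72%" off with a reservation

hours_per_year = 24 * 365

payg_annual = PAYG_HOURLY * hours_per_year
reserved_annual = payg_annual * (1 - RESERVATION_DISCOUNT)

print(f"Pay-as-you-go:      ${payg_annual:,.2f}/year")
print(f"With reservation:   ${reserved_annual:,.2f}/year")
print(f"Annual saving:      ${payg_annual - reserved_annual:,.2f}")
```

At these illustrative rates, a reservation turns a $4,380 annual bill into roughly $1,226, which is why reservations suit steady, predictable workloads while Spot and pay-as-you-go suit bursty ones.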

 

How to gain Azure Skills

As highlighted in a previous blog comparing Azure, AWS, and GCP, existing Microsoft experience can ease Azure adoption. However, to take full advantage of the platform, training to develop solid Azure skillsets wouldn’t go amiss.

Microsoft offers a robust training programme to help customers get started on Azure and develop their cloud environments. Microsoft Learn is free and includes role- and product-focused documentation, hands-on training, and certifications. You can find out more and start learning by visiting Microsoft Learn.

 

Find expert support for Azure

As a Microsoft Solutions Partner, we have experienced, Microsoft-certified Azure consultants here at Ipsos Jarmany. Our excellent team of data engineers, analysts and scientists has extensive experience with Azure, especially in deploying modern data platforms using Azure services.

Our flexible approach to Azure consultancy ensures customers gain the right service to meet their business needs. As a Microsoft partner, we’ve worked with several large customers, helping them accelerate their Azure journeys and maximise their ROI.

Start the conversation by getting in touch with us today.

Data-driven decision-making, made easy with Jarmany

 

Beyond Cookies: How To Navigate The Upcoming Apocalypse

Four years later, Safari and Firefox have blocked third-party cookies, and Chrome has…well, finally given a date for when it plans to close the door on cookies for good: Q3 2024.

So, what makes the end of third-party cookies so important? Theoretically, it gives marketers a major headache because they’ll lose valuable data about their target audiences. And that spells lost revenue. However, Chrome says it’s blocking these pieces of code because it can offer a viable alternative to satisfy marketers’ cookie dependency.

In this blog, you’ll gain an insight into the world of third-party cookies and why they’re being phased out. Crucially, you’ll discover the potential impact of this move on your marketing and the alternatives available to keep acing your strategy.

 

What Are Third-Party Cookies?

Third-party cookies are set by a website other than the one you’re browsing. For instance, you visit a website and watch an embedded YouTube video. Along with the video, the request will result in a third-party cookie being installed in your browser, which tracks you as you visit other websites.

By accumulating data on browser sessions, these cookies provide valuable insights into users. The data can be used to increase conversion rates, shaping the type of online ads you might see when visiting a website for the first time.

It’s worth noting that cookies aren’t just third-party. First-party cookies also gather important user insights but work differently. Stored by the website you’re browsing, they collect user data to help improve website performance. They also perform valuable functions, improving ad targeting and creating fast login times among other things.

 

Why Google Is Axing Third-Party Cookies

Third-party cookies have been around since the 1990s. How much data they can collect on users has long been a concern, with some personal information leading to invasive online experiences. This type of personal information can include:

  • Gender
  • Sexuality
  • Religion
  • Political affiliation

Pushback was inevitable, and legislation like the General Data Protection Regulation (GDPR) in Europe in 2016 changed the rules: companies had to be transparent about their cookies and the information they held.

Google’s slow start in tackling the third-party cookie problem was attributed to its dominant position in online advertising. Indeed, 80 percent of the company’s revenues come from ads, so it wanted to tread carefully. Then, at the end of 2023, the company announced the first phase of testing its new Tracking Protection feature, scheduled to start in January 2024. This involved turning off cookies for 1 percent of Chrome users, approximately 30 million people. That 1 percent will then grow to 100 percent between July and September 2024.

 

Google’s Tracking Protection Tool explained:

Built into Chrome, Tracking Protection is a feature for blocking third-party tracking of users’ online activities. The feature stems from Google’s Privacy Sandbox initiative to create technologies that protect people’s privacy online while helping companies and developers build thriving digital businesses. Its key aims are to phase out support for third-party cookies, reduce cross-site and cross-app tracking, and keep online content and services free for all.

 

In a blog about the announcement highlighting Tracking Protection, Google was upbeat about the cookie era. Anthony Chavez, vice president of Privacy Sandbox, a Google-led initiative to set website standards for access to user information, wrote, “Third-party cookies have been a fundamental part of the web for nearly three decades. While they can be used to track your website activities, sites have also used them to support a range of online experiences — like helping you log in or showing you relevant ads.

For campaigners, the decision showed how legislation like the GDPR, the EU’s Digital Services Act and the Digital Markets Act creates safer digital spaces, making tech giants rethink their online ad packages.

 

The Impact Of The End Of Third-Party Cookies

What will a cookieless future look like? Well, it won’t be entirely cookieless, as first-party cookies will still be around (more on that later). But the future will be different in significant ways. User privacy will be better protected. Creepy situations where websites seem to know personal details that you don’t remember sharing will be fewer. On the flip side, the web may seem less convenient. For example, consumers may experience a decline in personalisation, with non-personalised ads becoming widespread, and useful services like pre-filled address information on order forms may no longer be available.

Businesses will be the biggest losers. Some 75 percent of marketers worldwide rely on third-party cookies for valuable data to target audiences. Moreover, more than 50 percent of marketers seem pretty despondent about future revenues when third-party cookies aren’t around.

The well-known cookie-based marketing techniques in jeopardy are:

  • Retargeting: Tracking users and serving them ads for products or pages they have viewed.
  • Ad targeting: Serving users specific ads based on their unique browsing histories and individual profiles.

Without third-party cookies, businesses won’t be able to determine which products or services interest people visiting their websites. Blank spaces will appear where valuable data existed before, like product preferences and previous searches. This shortfall will impede improvements in customer experiences, including targeted advertising. McKinsey & Company has said 71% of consumers expect companies to deliver personalised experiences, which doesn’t bode well for websites with untargeted ads.

 

How To Thrive In A Cookie-Free Era

At least the slow death of third-party cookies has provided time to come up with alternatives. However, your first step, before looking at the options, should be an audit so you know the impact of life without third-party cookies on your marketing strategy.

With or without that information, there is something you should be doing right now — and that’s optimising your use of first-party cookies. We called this out in an earlier blog, highlighting that first-party data is arguably more important. Moreover, first-party cookies tick all the boxes right now because they are less intrusive and do not cross-track users. Crucially, they will still provide information to personalise experiences and target ads based on a user’s interests.

When first-party cookie data is ingested by Customer Data Platforms (CDPs) and aggregated with other customer-related information, marketers gain deep insights into their audience to help optimise campaigns. Here are some other mechanisms to consider besides strengthening first-party data in the post-third-party cookie era:

  • Zero-party data: Customers provide this data intentionally with brands through surveys, polls and membership applications. It tends to provide accurate insights coming directly from users.
  • Data clean rooms: These are secure environments where multiple parties can collaborate on sensitive data. Participants extract insights from each other’s data under strict controls.
  • Data partnerships: This is a collaborative arrangement between non-competing companies to share data for enhancing targeting and customer segmentation. An example would be an automotive brand building connected vehicles partnering with a telecommunications company.
  • AI and ML: Here, the predictive capabilities and behaviour modelling with AI and machine learning (ML) make up for the loss of insights from third-party cookies and cross-site tracking.
  • Ad networks and platforms: Working with trusted ad networks and platforms that help place ads on sites to gain traffic can provide access to targeted audiences.
  • Direct marketing: This traditional form of communication with customers via email marketing, social media engagement, and loyalty programs can efficiently reach audiences.

 

How We See Life Without Third-Party Cookies

Marketing shouldn’t lose strength once these cookies are off the menu. Undoubtedly, Google is working its socks off to create viable alternatives – since so much of its revenues depend on advertising. Moreover, AI and ML provide more insights into customers, which can make up for shortfalls in lost knowledge of customer behaviour.

One challenge, of course, is transitioning away from third-party cookies. Creating, maintaining and developing the data platforms to leverage advances in AI and ML, for example, takes experience and expertise that internal teams may lack. Sure, upskilling or recruiting are options, but they can be costly and time-consuming.

At Ipsos Jarmany, we’re helping our customers develop their data capabilities so that the coming third-party apocalypse is a noteworthy event but nothing more. They continue to improve the ROI of their marketing through advanced technologies that extract significant value from their sales, marketing and operational data.

Start the conversation by getting in touch with us today.

Data-driven decision-making, made easy with Jarmany

 

A Guide To Next-Level Product Performance Analysis For 2024

Alongside this, Product Performance Analysis also gives you the intel to understand and optimise the customer journey, from when someone identifies a need to when they convert by purchasing a product or service.

In this blog we delve into what Product Performance Analysis is, the benefits, and the top analytics techniques you should be leveraging to gain a true and holistic understanding.

Let’s begin.

 

What is Product Performance Analysis?

For anyone who hasn’t read our guide to Product Performance Analysis, here is a quick rundown of what it’s about. Essentially, you’re using data to analyse your products’ performance at any time. This allows product managers to gain insights, such as:

  • How well is the product performing in terms of sales and revenue? 
  • What is the market share of the product? 
  • Which customer segments does your product appeal to most? 
  • Which customers are abandoning their customer journeys and at which stages? 
  • What do customers think about the product? 
  • Which products tend to be purchased at the same time, and are therefore good cross-sell or up-sell opportunities?
  • Which product details or features are being used and which aren’t?
  • What information are customers seeking out about the product?
  • How does the product compare to competitors’ products, in terms of sales performance?
  • What market trends are impacting, or could impact, the sales performance of my product?
  • How effective are the marketing efforts for the product, and with which customer segments? 

 

The Benefits of Product Performance Analysis

Product Performance Analysis is key for ongoing measurement of your products’ performance; however, it can also be a key asset for assessing product launches and campaign performance, such as Black Friday or Christmas campaigns. For example, at Ipsos Jarmany we do a lot of work with some of our clients in the consumer tech industry to help them understand how new product launches have performed, and what optimisations can be made to boost sales performance.

Product Performance Analysis also enables data-driven decision-making, which can improve campaign effectiveness, increasing response rates by up to 600%. In addition, it can help you to expand the number of engaged customers, known to generate 23% more revenue than average customers, and be a tool for tracking indirect sales to provide a holistic view of product performance.

 

The Product Performance Analysis Techniques You Need to Know 

Product Performance Analysis isn’t a single technique; you need to take a multi-faceted approach which, using a blend of techniques, gives you a holistic overview of your products’ performance. There are multiple techniques you can use to gain the product performance insights you need to achieve an edge in today’s business world, and guide your decision-making for both offline and online sales strategies.

In this blog we differentiate between direct and indirect performance insights, providing you with 9 key techniques for best-in-class direct performance analysis, and 3 key techniques for acquiring valuable indirect insights.

 

Direct Performance Insights

 

#1 Funnel Analysis

Firstly, we have Funnel Analysis. This involves mapping and analysing the steps to achieve a desired outcome on a website and assessing how many users get through each step. The idea is that you’ll be able to identify drop-off points along the way to the desired outcome and, therefore, understand where improvements need to be made to increase conversions.

When it comes to funnel analysis, it’s important to define your goal and map out the customer journey to achieving that goal – whether it’s to gain newsletter sign-ups, purchase a product, create a user account, register for an event or download your app. The AIDA model, which includes Awareness, Interest, Desire and Action stages, is often used to map sales journeys. You can then track customers’ progress, seeing the funnel stages where they drop off, alerting you to friction points that need attention.

Funnel analysis should be a key component in your product performance analysis as it allows you to identify bottlenecks, optimise the user journey, understand user behaviour and improve your conversion rates. Facilitating a seamless conversion process is crucial to fostering repeat business among customers. In fact, 88% of online shoppers express that they are unlikely to revisit a website following a negative user experience.
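The mechanics of funnel analysis can be sketched in a few lines of Python. The AIDA stage names and user counts below are purely illustrative:

```python
# A minimal funnel drop-off sketch. Stages and counts are made up for
# illustration; in practice these would come from your web analytics.
funnel = [
    ("Awareness", 10000),  # e.g. landing-page visits
    ("Interest", 4000),    # product-page views
    ("Desire", 1200),      # add-to-basket events
    ("Action", 300),       # completed purchases
]

def stage_conversion(funnel):
    """Return the conversion rate from each stage to the next."""
    rates = {}
    for (name, users), (next_name, next_users) in zip(funnel, funnel[1:]):
        rates[f"{name} -> {next_name}"] = next_users / users
    return rates

rates = stage_conversion(funnel)
for step, rate in rates.items():
    print(f"{step}: {rate:.0%} convert, {1 - rate:.0%} drop off")
```

The step with the lowest conversion rate is your biggest friction point and the natural place to start optimising.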

 

#2 Trend Analysis

Trend analysis is another important technique to help you to better understand your customers and gain insights into their motivations, expectations, and the external influences, like economic, social and technological trends, that impact their behaviour. Nike, for example, uses Trend Analysis to ensure its products keep meeting the needs of customers. Based on insights that showed growing consumer concerns about sustainability, the brand launched Nike Space Hippie, a line of sneakers engineered from recycled materials.

To establish trends you need to use current and historical data from various sources, including surveys, reviews, feedback, market research and web analytics. This will allow you to discover the customer segments where your products resonate most and at what times of the year. The findings then allow you to optimise product sales analytics, engagement strategies and marketing campaigns. 

Additionally, Trend Analysis can help you to: 

  • Identify new product opportunities 
  • Understand customer behaviour 
  • Pre-empt customer needs 
  • Personalise marketing campaigns and comms based on customer insights 
  • Track your business progress and results.
 

#3 Customer Journey Analysis

A critical component in understanding how your products are performing is looking into how your customers are finding you, and which parts of the customer journey have the biggest impact on conversions. This is where Customer Journey Analysis and Customer Journey Mapping come in. These two techniques are often confused but are distinct processes, with mapping a subset of analysis. Think about them this way: Journey Mapping tells you what happened, and Journey Analysis tells you why it happened.

By analysing your customers’ journey map, you can identify the various touchpoints and interactions a customer has with your business throughout their entire lifecycle. It also helps you to understand areas of friction along the journey, spot unnecessary touchpoints, establish which touchpoints contribute most to conversions, and call out points where customer expectations were met or exceeded.

Customer journey analysis is hugely interconnected with Product Performance Analysis as it allows you to gain comprehensive insights into how customers interact with your products and identify pivotal moments in that journey that result in a conversion.

 

#4 Cohort Analysis

Next up we have Cohort Analysis. This allows you to identify the behavioural characteristics of groups within your customer base to detect patterns and insights that you can use to optimise customer retention.

Three types of cohorts are commonly identified:

  • Acquisition cohorts – based on when someone became a customer. 
  • Behavioural cohorts – reflecting past behaviours or profile properties. 
  • Predictive cohorts – based on what a customer is expected to do in the future. 

To complete a Cohort Analysis, you need to first set a clear goal of what you want to accomplish, such as reducing churn rates. Next, select a question like, are our products still meeting the needs of customers? Decide what you need to measure to answer the question and choose the attributes for the cohorts in your analysis. Once you have your data, you need to test your findings, which you can do through a simple A/B test.
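As a rough illustration of the acquisition-cohort idea, here is how a simple retention table might be computed in Python. The customer IDs and activity months are entirely made up:

```python
from collections import defaultdict

# Illustrative activity events: (customer_id, signup_month, active_month).
events = [
    ("a", 0, 0), ("a", 0, 1), ("a", 0, 2),
    ("b", 0, 0), ("b", 0, 1),
    ("c", 1, 1), ("c", 1, 2),
    ("d", 1, 1),
]

def cohort_retention(events):
    """Share of each signup cohort still active N months after signup."""
    cohort_users = defaultdict(set)  # signup month -> users in that cohort
    active = defaultdict(set)        # (signup month, months since) -> users
    for user, signup, month in events:
        cohort_users[signup].add(user)
        active[(signup, month - signup)].add(user)
    return {
        key: len(users) / len(cohort_users[key[0]])
        for key, users in sorted(active.items())
    }

print(cohort_retention(events))
```

Reading the output row by row shows, for example, what share of the month-0 cohort was still active two months in, which is exactly the pattern you would A/B test against after a retention initiative.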

 

#5 Retention Analysis

Retention Analysis quantifies usage data to determine customer churn or retention potential. It highlights why customers decide to stay, giving you data to drive product development, services and customer support strategies, and increase customer lifetime value (LTV).

Customer retention is critical for businesses, with retaining customers costing five times less than acquiring new ones. Therefore, you want to use Retention Analysis to focus on your power users and look at their behaviours to analyse their engagement and discover areas of the customer journey for improvement. Look at what features power users engage with and ask yourself why. While the average retention rate across industries may be around 75%, it’s not uncommon for software-as-a-service companies with Retention Analysis strategies to reach 90%-plus.
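A common way to quantify this is the standard retention-rate formula: customers kept over a period (the end-of-period count minus any newly acquired customers) as a share of the starting base. A quick sketch with illustrative numbers:

```python
def retention_rate(start, end, new):
    """Customer retention rate over a period: customers kept
    (end-of-period count minus newly acquired) / starting count."""
    return (end - new) / start

# Illustrative quarter: 200 customers at the start, 210 at the end,
# 40 of whom were acquired during the quarter.
rate = retention_rate(start=200, end=210, new=40)
print(f"Retention rate: {rate:.0%}")
```

Note how growth can mask churn: the customer base grew from 200 to 210, yet only 85% of the original customers were retained.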

 

#6 Predictive Analytics

Most Product Performance Analysis techniques are centred around analysing past or present performance to glean actionable insights. However, all businesses want (and need) insights into the future to make the most profitable decisions. Predictive Analytics, which uses historical data, statistics and machine learning to anticipate the future, can be fed into your decision-making process to optimise your strategies. There are no limits: it can support all your business functions, including marketing, sales, operations and finance.

It can help marketers understand customer behaviour better and predict the impact of marketing messages more accurately. You can spot industry trends earlier than competitors, find hidden relationships between customer data points, spot the most promising marketing prospects, and highlight at-risk customers that need more attention. Amazon is a convert to Predictive Analytics, using the technology to establish which products users are likely to purchase in the future, based on previous purchases and the behaviour of similar consumers. With this information, it can then provide personalised recommendations to improve cross-sell and up-sell conversions.
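As a toy illustration of the idea (nothing like a production recommendation model), here is a simple least-squares trend forecast in Python over made-up monthly sales figures:

```python
# A minimal least-squares trend forecast. The sales figures are
# illustrative; real predictive models use far richer features.
def fit_trend(values):
    """Fit y = a + b*x by ordinary least squares over x = 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

monthly_sales = [100, 110, 120, 130, 140, 150]  # illustrative units sold
a, b = fit_trend(monthly_sales)
next_month = a + b * len(monthly_sales)
print(f"Forecast for next month: {next_month:.0f} units")
```

Real predictive analytics layers seasonality, external signals and machine-learned features on top of this, but the principle is the same: fit the past, extrapolate the future, and feed the forecast into your decisions.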

 

#7 Customer Feedback Analysis

Whilst many of these techniques are focused on quantitative analytics, you also need to evaluate qualitative insights based on customer feedback. These insights can highlight your customers’ needs and areas where you can improve. Customer Feedback Analysis enables you to process that feedback in bulk, extracting insights a human might overlook. It is a crucial driver of revenue growth since 86% of buyers have said they are willing to pay higher prices for a great customer experience.

There are lots of data sources to give you feedback. They can be surveys, reviews or customer support conversations. Some of the most commonly used sources are: 

  • Customer satisfaction (CSAT) surveys 
  • Net promoter surveys (NPS) 
  • Customer effort scores (CES).  

Note that NPS is a relationship metric, while CSAT and CES are primarily transactional metrics. Companies often use NPS to understand customer loyalty, while they may turn to CSAT feedback if they want to change their product portfolio.
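The NPS calculation itself is simple: the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6), with passives (7–8) ignored. A quick sketch with illustrative survey responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count toward the total but toward neither group."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# Illustrative responses on the 0-10 "how likely to recommend?" scale.
responses = [10, 9, 9, 8, 7, 7, 6, 5, 3, 10]
print(f"NPS: {nps(responses):+.0f}")
```

The score therefore ranges from -100 (all detractors) to +100 (all promoters), which is why it’s best tracked as a trend over time rather than read as an absolute number.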

 

#8 CRM Analysis & Optimisation

Customer relationship management (CRM) systems have been around for decades.  

They store vast amounts of data, including customer and prospect data, customer interactions and service issues, and play a pivotal role in understanding and improving product performance. For example, CRM analysis can help you to identify customer segments, create personalised and targeted marketing campaigns, present cross sell and up-sell opportunities, and collect customer feedback.

Analysing your CRM data in detail can also provide you with insights such as: 

  • Lead drop-out rates across the sales cycle 
  • Number, length and success of sales calls 
  • Marketing email open rates and times 
  • Social media post interactions 
  • Customer service pain points

Solutions like Microsoft Dynamics 365, used by 97% of Fortune 500 companies, come with analytics capabilities, including visualisations, dashboards and goal management. Dynamics 365 includes a familiar Microsoft interface, minimising training, and delivers cutting-edge analytics, including sales forecasting.

 

#9 Heatmapping

Using heat maps lets you see what users do when they visit your web page. You can identify the places where they click, how far they scroll and what areas they seem to avoid. You can also detect the messages that resonate more with customers and locate messaging in the high-exposure positions to drive users down your funnel.

Software solutions for heat mapping are widely available, for example Microsoft offers Microsoft Clarity, a free tool to capture how people use your website. There are multiple paid-for options, with tools for different use cases, from understanding user behaviour to optimising landing pages. Heat maps are a powerful solution; they have been known to increase website click-through rates by as much as 276%.
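Under the hood, a heat map is simply user interactions aggregated into page regions. A toy sketch of that aggregation step, with made-up click coordinates and page dimensions:

```python
# A toy click-aggregation sketch: bin recorded click coordinates into a
# coarse grid to see which page regions attract the most attention.
# Coordinates and page size are illustrative.
def click_heatmap(clicks, page_w, page_h, cols=3, rows=3):
    """Count clicks falling into each cell of a rows x cols grid."""
    grid = [[0] * cols for _ in range(rows)]
    for x, y in clicks:
        col = min(int(x / page_w * cols), cols - 1)
        row = min(int(y / page_h * rows), rows - 1)
        grid[row][col] += 1
    return grid

clicks = [(100, 50), (120, 60), (900, 500), (880, 520), (860, 540)]
for row in click_heatmap(clicks, page_w=960, page_h=600):
    print(row)
```

Tools like Microsoft Clarity do this at scale and render the counts as colour overlays, but the underlying signal is the same grid of interaction counts.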

 

 

Indirect Performance Insights

 

#1 Indirect Channel Data

Up to this point, we’ve explored various techniques for obtaining performance insights through direct data sources. However, it is crucial to complement this approach by incorporating data from external sources to achieve a comprehensive understanding of your products’ performance. A pivotal starting point for this integration is through indirect channel data, which refers to information directly provided by third-party retailers. This is typically sales data, with insights around number of units sold, sales by regional location, and stock levels.

By leveraging these insights, you can effectively compare the performance of your direct-to-consumer sales with your indirect-to-consumer sales. This comparative analysis enables you to identify key opportunities, both offline and online, and strategically focus your marketing efforts where they are likely to yield the greatest impact.

 

#2 Third-Party Data

Whilst indirect channel data will provide valuable insights, third-party retailers often provide limited information. To help bridge this gap and unlock broader insights, you should lean on third-party data providers to gain insights around competitors’ performance, industry insights and market conditions.

Well-known providers, like GfK, provide sales and market intelligence to help you connect the dots and improve decision-making. This could include insights around media consumption by your target audience, or industry sales by channel and price.

 

#3 Web Scraping

In addition to gathering indirect data from retailers and other third parties, you can also collect information for Product Performance Analysis through web scraping technologies. This will provide you with insights on your own performance, as well as your competitors’.

Web Scraping automates the extraction of valuable information from across the internet, in turn providing you with industry insights, competitive analysis, and information on what customers say about your product. You can obtain information on your share of voice and benchmark your operations against competitors.

Web Scraping requires the development of scripts, using languages like Python, to launch virtual machines that gather data, which is then structured, cleansed and processed for analysis. It is essential if you’re selling indirectly through third-party affiliate sites, as it allows you to gain deeper insights into your products’ performance on those sites and insights into your competitors’ performance.
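As a toy illustration (not a production scraper), the sketch below uses Python’s built-in `html.parser` to pull prices out of a page. The `class="price"` markup and the sample HTML are hypothetical; a real pipeline would fetch live pages with an HTTP client, respect the site’s terms and robots.txt, and then cleanse the extracted data.

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collect the text content of elements tagged class="price"."""

    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self._in_price = True

    def handle_endtag(self, tag):
        self._in_price = False

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

# Sample markup standing in for a fetched retailer listing
html = '<div><span class="price">£199</span><span class="price">£249</span></div>'
scraper = PriceScraper()
scraper.feed(html)
# scraper.prices -> ['£199', '£249']
```

The structured output can then be loaded into your warehouse alongside your direct sales data for benchmarking.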

 

How To Maximise Product Performance Analysis

Many of these techniques for Product Performance Analysis may be familiar. You may have started using a few and are now considering how best to implement some others to get to the next level. It can feel like a complex process, requiring skills, expertise and time that may present barriers.

At Ipsos Jarmany, we have been helping businesses improve their analytics capabilities to thrive in a world where organisations are increasingly data-driven. Our analytics specialists are helping them turn raw data into actionable insights so they can make the most effective decisions in terms of their products and product development. 

By working with us, we can help you gain maximum ROI from Product Performance Analysis with none of the hassle. If that sounds interesting, please get in touch with us today. 

Data-driven decision-making, made easy with Jarmany

 

Ipsos Jarmany’s Year In Review

In the data and analytics world, Gen AI and machine learning took centre stage in 2023, not forgetting the heightened exposure on data literacy, data democratisation, and a huge emphasis on greater data privacy and security, to name a few. But, what took centre stage for Ipsos Jarmany in 2023?

Let’s take a moment to reflect on our key milestones…

 

#1 Celebrated Our 15 Year Anniversary

Ipsos Jarmany was launched in 2008, with the aim of providing a top-tier data analytics consultancy service that would challenge the way businesses handle and harness their data. We set out to deliver a service that would empower our clients to uncover valuable and actionable insights that drive timely and informed decision-making, and that’s exactly what we have done.

Over the last 15 years, we have hit a lot of milestones, from building long-term partnerships with a range of blue-chip clients, to contributing to the wider data analytics industry by training just shy of 250 grads. We are proud to have grown significantly in the last 15 years and we’ve got big ambitions for 2024 too.

 

#2 Invested In Our AI Capabilities

Let’s face it, Generative AI has truly shaken up the world this year. From the launch of ChatGPT, to Microsoft’s Copilot and Google’s Gemini and Bard, it’s been hard to escape the AI hype. And, whilst these technological breakthroughs have brought about a wealth of benefits and greater accessibility and democratisation for AI-enhanced knowledge, learning, productivity and creativity, it’s not come without its challenges. Businesses left, right and centre are now striving to understand how they can use AI to deliver efficiencies, drive growth and forecast potential challenges – which is no easy feat (especially if you lack the right tools, technologies, and most importantly, data).

We’re by no means new to the AI world, in fact, we have been assisting our clients with artificial intelligence and machine learning technologies for a number of years now. But, as you may know, technology is ever changing, and it seems like advances in AI are often a daily occurrence nowadays. So, as part of our commitment to upskilling our AI capabilities and staying ahead of the AI curve, we became a Microsoft Solutions Partner in Data & AI. This designation showcases our ability to assist clients in overseeing and controlling their data across various systems and enabling the creation of analytics and AI solutions.

We are dedicated to ensuring that we consistently deliver top-tier service and customised solutions that align to our clients’ data and AI requirements, and obtaining our Microsoft status was the final cherry on top to solidify our clients’ confidence in our AI capabilities.

 

#3 Set-up Dedicated Data Engineering, Visualisation and Data Science Practices

2023 saw us not only advance our AI capabilities but also take deliberate steps to establish dedicated practices and expert teams in data engineering, data visualisation, and data science.
This strategic move aligns with our commitment to being a well-rounded data and analytics service provider, aiming to offer comprehensive solutions to our clients. 

Beyond enhancing our technical expertise, this initiative promotes a collaborative culture within Ipsos Jarmany, encouraging knowledge sharing across cross-functional teams and aims to cultivate best practices, foster innovation, and maintain a consistent, high-quality standard across all our client engagements.

This holistic strategy reflects our enduring dedication to delivering tailored solutions and staying at the forefront of the ever-evolving landscape of data and analytics.

 

#4 Made Upskilling Our Workforce a Priority

Investing in our AI capabilities and establishing dedicated practices aligned with our technical specialisms has been a core focus in 2023. Hand-in-hand with these achievements has been the upskilling of our workforce and fostering a culture of continuous learning and development.

Throughout 2023 we continuously reviewed our graduate training programme to ensure our analysts were on track to learn the tools, techniques and soft skills that would enable them to excel in their data careers; and we intend to do the same in 2024.

All told, it’s been a big year for the Ipsos Jarmany workforce, with a total of 13 employee promotions and 17 members of the team attaining various Microsoft Accredited Qualifications.

 

#5 Continued to Scale-up, Globally

Since Ipsos Jarmany was born in 2008, our mission has been to establish a data analytics agency that would change the way businesses handle and harness their data. Our goal was to empower them to uncover valuable and actionable insights that drive informed decision-making by providing a truly world-class service. And this year, we’ve taken world-class to a whole new level, scaling-up our global presence by supporting clients from the UK and Europe, all the way to South Korea and in between.

 

#6 Made the World a Better Place

Ok, we appreciate this sounds extreme but bear with us on this one. Alongside our data capabilities we’ve been homing in on what we can do to positively contribute towards our local communities and the environment.

From a sustainability perspective, we’ve been working on how we can minimise our environmental impact, from reducing office wastage to encouraging our workforce to make greener commuting decisions. We are proud to have reduced our office electricity consumption by 37% compared to 2022, and we are proactively working with some of our clients to continue making positive sustainability improvements across the board. There’s much more to come, but for now we are making progress.

From a charity perspective, we have continued to maintain our long-term relationships with local charities as well as supporting larger humanitarian crises. When it comes to fundraising activities, you name it, we’ve done it; from dragon boat racing on the Thames, to nail-biting office Mario Kart tournaments and competitive bake sales. We won’t be afraid to admit we even got creative and ran a charity leg-waxing event (yes, you heard us right).

 

Looking ahead to 2024

As we say farewell to the transformative year of 2023, Ipsos Jarmany stands tall, having successfully navigated the ever-changing landscapes of data and analytics with resilience and innovation, even amidst economic uncertainty. Our 15th-year anniversary celebration marked a significant milestone in our journey, showcasing a decade and a half of growth, valuable partnerships, and notable contributions to the data analytics industry. This milestone also reaffirms our dedication to leading the way in the field of artificial intelligence.

Armed with the knowledge and experiences gained in 2023, we eagerly look forward to the year ahead. With high aspirations and exciting plans, we are poised to make 2024 another exceptional chapter as we strive to continue making a positive impact in the data and analytics industry.

Here’s to the journey that lies ahead!

Join forces with Ipsos Jarmany to turn your 2024 goals into reality

Our Top 10 Predictions For Data and Analytics in 2024

It’s hard to think of a time when IT has received so much media attention. At Ipsos Jarmany, we’ve lost count of the number of Gen AI scare stories published since way back in January. Only recently the ructions at OpenAI with the dismissal and reinstatement of Sam Altman dominated headlines for a couple of weeks.

Anyway, with all that behind us, we should turn to what lies ahead and the areas of focus for our data and analytics strategies. Spend on both continues to increase exponentially. Currently valued at $225.3 billion globally, investment is expected to reach $665.7 billion by 2033 as companies increasingly rely on data and analytics to drive growth.

We’re not going to argue with the sentiment here because our clients are certainly benefitting from greater use of technology to deliver their goals.

So here are our predictions for the year ahead. Enjoy.

#1 Data Governance & Ethics Will Be A Top Priority For Organisations

Let’s kick things off with governance and ethics. The truth is that legislation is tightening on data control, and the penalties for getting it wrong are high. Failure to comply with GDPR regulations can result in penalties of up to £17.5 million or 4% annual global turnover, whichever amount is greater.

This may be old news to many businesses, but we see many companies greenwashing governance, making false claims about how they govern their data. They are their own worst enemies, because good governance isn’t just critical from a legal perspective; it’s also important operationally, helping businesses achieve their goals. This is why many businesses are putting it ahead of AI as a priority in 2024, and we can understand why.

 

#2 Data Culture Is Going To Be Taken Much More Seriously

Taking up spot number 2 is data culture. Every employee should get the chance to improve their data literacy — and that’s not just for their own good, but for the good of the business too. A thriving data culture is going to be even more important to a company’s success in 2024, especially as AI becomes more widely used.

Everyone should be talking about how data can and should be used at all levels of your organisation. Unfortunately, this has been seriously overlooked, hence as few as 8% of companies successfully scale their analytics capabilities. In short, they haven’t thought about how to build their culture before they invest in the tech. You’ve been warned.

 

#3 Data Ops & Automation Will Play a Key Role In Saving Organisations Time, Resource and Money

We appreciate that mentioning data ops and automation isn’t likely to get anyone in the party mood. Nevertheless, we’ve attended loads of meetings and seminars this year where ops and automation have been high on the agenda. So why is everyone talking about this data management stuff?

To say 2024 is going to be a pivotal year in business with the rapid evolution of AI could be the biggest understatement of the year. Therefore, our advice is that every business gets its data pipelines in order fast. It could make the difference between success and failure over the coming years, and if that isn’t enough for you, it’s also going to help save time and money on your data analytics.

 

#4 Data Security Will Gain More Investment

Let’s face it, data security probably isn’t the most exciting topic that we anticipate will be trending in 2024, but that doesn’t make it any less important. With consumers becoming increasingly concerned about their data privacy, you can be sure that security as an issue isn’t going away; in fact, it’s going to matter even more in 2024.

We’re seeing heightened privacy laws for one thing. Speak to anyone in AdTech and they’ll tell you about the demise of cookies as they become blocked by default in many web browsers. Plus, in the UK, GDPR guideline changes are on the table as the Government seeks to move away from the one-size-fits-all EU version.

There’s no escaping the impact AI will have on data security either. As you’ll probably have read, hackers are using AI to design malware that can hide from security systems. But on the other hand, AI is also helping companies analyse huge volumes of data to spot those hidden attacks. What’s clear, however, is that investing in data security can save millions.

 

#5 Micro Partitions Will Be The Key To Efficient Data Operations

Appearing for the first time in our end of year Top 10 — micro partitions. If you’re not familiar with micro partitions, then let us shed some light; this is a feature in the Snowflake data platform that we think businesses will use to make their data operations more efficient.

It’s just one of the reasons why the Snowflake platform, which is used by more than 8,500 businesses, continues to grow in popularity. In simple terms, the feature divides the tables where your data is stored into micro partitions. The benefits include faster query performance, data compression, concurrency and horizontal scaling.
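To illustrate the pruning idea behind that faster query performance (a simplified toy model, not Snowflake’s actual implementation), imagine each micro partition storing min/max metadata for a column; a query can then skip any partition whose range can’t contain the value it’s looking for.

```python
# Each "partition" holds rows plus min/max metadata on a column,
# loosely mirroring how micro partitions enable pruning.
partitions = [
    {"min": 1, "max": 100, "rows": [5, 50, 99]},
    {"min": 101, "max": 200, "rows": [150, 180]},
    {"min": 201, "max": 300, "rows": [250]},
]

def query_equals(value):
    """Scan only partitions whose [min, max] range could contain value."""
    scanned = [p for p in partitions if p["min"] <= value <= p["max"]]
    hits = [row for p in scanned for row in p["rows"] if row == value]
    return hits, len(scanned)

hits, scanned = query_equals(150)
# Only 1 of the 3 partitions needs scanning to find the matching row
```

At warehouse scale, skipping whole partitions this way is what turns a full-table scan into a much smaller read.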

 

#6 Terraforming Is Set To Become The Default For Managing Cloud IT

Terraform, from which we derive Terraforming, is one of the most popular infrastructure as code (IaC) tools available, supporting a wide range of cloud providers, and we anticipate it’ll really take off in 2024. Fast becoming the default method for managing cloud IT, IaC tools represent a huge market, expected to be worth $2.8 billion by 2028.

By Terraforming, you can manage your cloud resources as a single unit and automate deployments. Plus, you can lift-and-shift those resources easily between different platforms. Terraform is a declarative IaC tool, which means you define what you want, and it figures out the rest.
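Terraform configurations themselves are written in HCL, but the declarative idea can be sketched in a few lines of Python: you declare the resources you want, and a plan step works out what to create and destroy. The resource names below are hypothetical.

```python
def plan(current, desired):
    """Compute the actions needed to move current state to desired state,
    in the spirit of a declarative IaC tool: you state what you want,
    and the tool figures out the create/destroy steps."""
    return {
        "create": sorted(set(desired) - set(current)),
        "destroy": sorted(set(current) - set(desired)),
    }

current = {"vm-a", "vm-b"}   # what exists in the cloud today
desired = {"vm-b", "vm-c"}   # what the configuration declares
actions = plan(current, desired)
# actions == {"create": ["vm-c"], "destroy": ["vm-a"]}
```

This diff-then-apply loop is why declarative tools are safe to re-run: if the current state already matches, the plan is empty.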

Expect to see changes around Terraform licensing in 2024. HashiCorp, the brains behind Terraform, announced the adoption of a business source licence for Terraform, which is a middle ground between open source and end-user licensing.

 

#7 No Code Self-Service Platforms Are On Course To Be The Next Big Thing To Aid Data Democratisation

Expect no code self-service platforms to play a key role in developing data cultures over the coming year. These platforms empower people who don’t have IT expertise to build their own digital applications without breaking into a sweat, making data capabilities much more accessible to a much wider audience. It’s possible because no-code platforms use building blocks to design the application logic. By some predictions, more than 65% of application development activity in 2024 will be low code or no code.

The future seems clear then; however, be advised the low code/no code world we’re heading into comes with a few dangers. As is often highlighted, these platforms can create vulnerabilities that will need addressing, including authorisation misuse, data leakages and asset management failures, for starters.

 

#8 Sustainability In Data Will Come Under The Spotlight

In 2024, data will have a bigger say in sustainability. Firstly, data modelling and forecasting, driven by AI, will make it easier for businesses to connect processes to sustainability goals like net zero. Being able to evidence the outcomes will no doubt help win business among customers, who increasingly want to see sustainability claims backed up with figures.

Data will also need to account more for its own carbon footprint as we head into the coming year. AI especially has come under the spotlight as a significant contributor to greenhouse gas emissions, thanks to the amount of power that it soaks up. Respected publications have reported that training an AI model can emit as much carbon as five cars across their lifetimes. Hence businesses are going to have to be even more aware of their IT carbon footprint, thinking up ways to reduce processing requirements.

 

#9 Data Observability Is Going To Be More Heavily Policed

Linked to sustainability, data operations can expect to be more heavily policed in 2024. Get ready for organisations to start pulling back the curtain on their data processes to understand, for example, exactly how much processing is going on. Of course, data observability can do much more than that. IBM points out that it leads to higher data quality, faster troubleshooting, improved collaboration, increased efficiency, improved compliance and greater revenue. It’s an impressive list.

Data monitoring has been with us for some time, but we think it’s going to be used much more heavily, and companies will think more closely about their data operations.

 

#10 Gen AI: Reality Will Kick In When You Dig Into Your Data

We were thinking of leaving this blank. After all, what more can be said about Gen AI that hasn’t already? Actually, there’s still a lot to say around Gen AI and thankfully more serious discussions are beginning to happen now the hype is (arguably) fading.

We’re seeing more businesses start to think about how they can accelerate their AI strategies, all the while developing use cases for Gen AI projects. It’s great to see because Gen AI, and AI more broadly, can be truly transformative. However, we’re anticipating that the AI hype will diminish when organisations lift the lid on their data and discover that their data quality, quantity and format isn’t in a strong enough position to support AI models.

At Ipsos Jarmany, we’re excited for the opportunity to assist numerous companies in implementing their AI strategies and establishing a robust data foundation as we move into 2024.

 

Get In Touch

There is a lot of opportunity in our list to make 2024 even better than 2023. But isn’t that the great thing about technology: it keeps getting better and better? Moreover, the improvements aren’t merely linear; they’re exponential.

On the flip side, the sheer amount of technology out there can make it feel harder to navigate to the right solution for your organisation. Perhaps you’re not ready for Gen AI yet and you need to get your data house in order first. But how do you do that?

At Ipsos Jarmany, we continue to help businesses realise all the opportunities available from technology. Our expertise in data, starting from developing strategies to delivering transformational solutions, has helped our clients make 2023 a special year, and we’re looking forward to making 2024 even more successful.

If you’d like to discuss how Ipsos Jarmany can support you on your data and digital journey in 2024 then please contact us today.

Data-driven decision-making made easy with Ipsos Jarmany

12 Essential Steps To Be AI-Ready

The Truth About AI

True, AI can significantly boost performance and revenues by transforming organisations in lots of different ways, from how they engage with customers to how they recruit people and manage their finances. In fact, it’s predicted to boost GDP in the UK by 14% by 2030.

But honestly speaking, AI is anything but plug-and-play. And maybe that is why we see an astonishing 85% of AI projects fail to deliver the expected business value.

Something doesn’t add up. You seemingly have the data and access to the AI models, so what’s wrong? Well, let’s go back to the data for a second—because that’s where AI projects normally go off the rails.

 

Don’t forget the data

Broadly speaking, either companies don’t have enough of it, aren’t using it in the right way, have major quality issues, or just don’t have the correct systems to store and warehouse the stuff. We see the same problems time and again.

When it comes to artificial intelligence, getting the foundations right is absolutely critical. AI isn’t a quick process and there isn’t a ‘one size fits all’ solution; this is a long-term strategic investment in your business which will improve over time. However, all this relies on the quality of your model inputs, namely data. If you’re not getting this right, you’re already setting yourself up for failure (and a lot of wasted time, effort and money).

In this blog we’re going to tell you how to prepare your data for AI success. With our 12 steps, you will be able to navigate AI with confidence and start reaping the full power of the technology for better business results.

Here we go:

 

 

#1 Data volumes

Generally, AI algorithms require significant volumes of data – we really can’t emphasise this enough. However, just how much will depend on the AI use case you’re focused on. One figure often referred to is the need for 10x as many rows (data points) as there are features (columns) in your data set. The baseline for the majority of successful AI projects is normally more than 1 million records for training purposes.
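As a quick illustrative check of that rule of thumb (it’s a heuristic, not a hard requirement), the multiplier and figures here are just examples:

```python
def meets_rule_of_thumb(n_rows, n_features, multiplier=10):
    """Rough check of the '10x rows per feature' heuristic."""
    return n_rows >= multiplier * n_features

# A 50-feature data set would want at least 500 rows under this heuristic;
# a million-record training set clears it comfortably
meets_rule_of_thumb(1_000_000, 50)  # True
meets_rule_of_thumb(400, 50)        # False
```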

 

#2 Data history

Let’s say you want to use AI for demand forecasting or for marketing mix models. In this case, at Ipsos Jarmany, we recommend having at least 3 years’ worth of data; otherwise, your model will just repeat the previous year’s outputs. It stands to reason that for AI to detect and predict events better than we can, it needs to work with loads of historical data to uncover the patterns and anomalies that we need it to.

 

#3 Data relevance

Depending on your use case, you’ll also need specific data sets for your algorithm. For example, marketing mix models aim to measure the impact of various marketing inputs on sales and market share, hence you’ll need data sets such as previous years’ sales, marketing performance and budget allocations.

 

#4 Data Quality

We’ve put this at #4 but maybe we should have put it at #1. It’s massive. If the quality of data you’re inputting into your AI model is poor, you can bet that the AI model’s output will be poor.

In short, many companies face data quality issues, so there’s every chance your unsuccessful AI project will do nothing more than put a broader issue under the spotlight. Not a bad thing.

So, how do you go about achieving data quality? Essentially, you’re going to have to go through your data and ensure it doesn’t suffer from any of the following:

• Inconsistency
• Duplication
• Inaccuracy
• Outdatedness
• Irrelevancy
• Incompleteness
• Lack of governance
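As a simple illustration, a first-pass quality check might scan records for two of the issues above, duplication and incompleteness, before they get anywhere near a model. The field names here are hypothetical:

```python
def quality_report(records, required=("id", "price")):
    """Flag duplicate and incomplete rows in a list of dict records."""
    seen, duplicates, incomplete = set(), [], []
    for i, rec in enumerate(records):
        key = tuple(sorted(rec.items()))
        if key in seen:
            duplicates.append(i)        # exact repeat of an earlier row
        seen.add(key)
        if any(rec.get(field) in (None, "") for field in required):
            incomplete.append(i)        # a required field is missing
    return {"duplicates": duplicates, "incomplete": incomplete}

records = [
    {"id": 1, "price": 9.99},
    {"id": 1, "price": 9.99},   # duplicate of row 0
    {"id": 2, "price": None},   # incomplete
]
report = quality_report(records)
# report == {"duplicates": [1], "incomplete": [2]}
```

Real pipelines would add checks for the other issues too, such as consistency rules and freshness thresholds, but the principle is the same: catch bad rows before they reach training.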

 

#5 Data Understanding

Whilst we place a massive emphasis on data quality (and rightly so), having a large volume of high-quality data doesn’t stand for much if you don’t have a solid understanding of your data. By this we mean understanding what the data relates to, what the data is telling you, and being able to identify patterns and trends, as well as spikes, dips and outliers in your performance.

Additionally, when it comes to data, it’s key that you have an understanding of what’s happening within the wider business so you can apply business context to the data. For example, if you’re seeing a dip in sales performance can this be attributed to seasonality, or perhaps a stock or distribution issue?

 

#6 Data labelling

This is pretty much as it sounds. You’re annotating your data, defining it as an image, text, video or audio, to help your learning model find “meaning” in the information. It’s important to remember that labelling—like the next step we’ll go on to talk about—should come after you’ve ensured the data quality. 

Labelling is essentially a manual step done by software engineers and the last thing you want is for an engineer to waste their time labelling duplicated, inaccurate or irrelevant data.

 

#7 Data augmentation

Data augmentation is all about creating new or modified data from your existing sets to artificially increase the quantity of data and its value. 

By making small changes such as randomly changing words in text data, you’re not only increasing the data set but improving its quality, helping avoid “overfitting”, where your model aligns too closely to your original training data and struggles with new information.
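A minimal sketch of that idea for text data, randomly swapping words for synonyms to generate new training examples, might look like this (the synonym table is made up for illustration):

```python
import random

def augment(sentence, synonyms, p=0.5, rng=None):
    """Randomly replace words with synonyms to create a new training
    example from an existing one (simple text data augmentation)."""
    rng = rng or random.Random()
    out = []
    for word in sentence.split():
        # swap the word with probability p when a synonym is available
        if word in synonyms and rng.random() < p:
            out.append(synonyms[word])
        else:
            out.append(word)
    return " ".join(out)

synonyms = {"quick": "fast", "happy": "glad"}
# With p=1.0 every known word is swapped, giving a deterministic variant
augment("the quick happy fox", synonyms, p=1.0)
# -> 'the fast glad fox'
```

Other common variants of the same trick include randomly deleting or reordering words, or, for images, rotating and cropping.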

 

#8 Data systems

For AI to work, you’re going to need the right data systems in place. The key essentials include loads of computing capacity, offering a mix of CPU and GPU processing, modern storage and warehousing, high bandwidth, low latency networking and security for your sensitive data.

That’s potentially a lot of investment, and therefore many companies are looking at cloud services to give them the systems they need at the right price. Leading cloud providers, including Microsoft, can provide you with AI data systems you require to get your AI project off the ground.

 

#9 Data privacy

Data privacy is more tightly controlled than ever and rightly so. Yet, as we know, AI needs tons of data to work, which amplifies your risk of privacy breaches occurring. Trust us, you need to take data privacy very seriously and invest in the tools and techniques to make sure your data comes with encryption, anonymisation and owner consent.

 

#10 Data governance

We touched on this earlier when we talked about data quality. The point we made then was that the correct data governance will boost your data quality, saving you time and money. What’s more, correct governance will ensure sensitive and confidential data is classified accordingly and deleted in line with the appropriate data retention schedule.
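As an illustrative sketch (the classifications and retention periods are made up), a governance process might routinely flag records held beyond their class’s retention schedule as candidates for deletion:

```python
from datetime import date, timedelta

def past_retention(records, schedule, today=date(2024, 1, 1)):
    """Return ids of records held longer than their classification's
    retention period, i.e. deletion candidates under the policy."""
    expired = []
    for rec in records:
        limit_days = schedule[rec["classification"]]
        if today - rec["created"] > timedelta(days=limit_days):
            expired.append(rec["id"])
    return expired

schedule = {"confidential": 365, "public": 1825}  # retention in days
records = [
    {"id": "a", "classification": "confidential", "created": date(2022, 6, 1)},
    {"id": "b", "classification": "public", "created": date(2022, 6, 1)},
]
past_retention(records, schedule)  # -> ['a']
```

Automating this kind of check is what turns a written retention schedule into governance that actually happens.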

 

#11 Data People

Another key step that you need to consider on your journey to becoming AI-ready is data people; and there are three sides to this point.

Firstly, gaining internal buy-in from the key stakeholders within your business is critical to any AI project. These stakeholders need to share the same vision as you when it comes to what you’re trying to achieve with AI and how it can benefit the business. They need to understand the strengths and limitations of the AI model so that expectations are aligned. And, the only way you can ensure internal adoption is by getting stakeholder buy-in from the get-go.

Secondly, in any digital transformation project roughly 10% of success is down to having the right tech in place, and 90% is down to having the right people and skillsets. This may seem surprising, given the importance of having the right tech stack to handle your data and AI models, but you really can’t afford to underestimate how important it is to have the right people in place too. It’s certainly no secret that there’s a skills gap in the industry right now, so in order to future-proof your AI strategy, you need to consider what skill sets you currently have within the business, identify areas where training and development are required, and establish at what point you may need to lean on external agencies for support.

Lastly, there’s data culture. It’s true, a lot of people are concerned that their jobs will become obsolete as a result of businesses adopting AI. In fact, 44% of employees are worried about the impact of AI on their jobs. Given this, fostering a strong data culture within your organisation should be a priority if you want to ensure internal adoption of your AI model and offset the workforce anxiety associated with AI. If your workforce is invested in your AI strategy, then this will set you off with a solid foundation for achieving AI success.

 

#12 Data automation

Now that we’re coming to the end, and you’re clear on what you need to do, we’re going to put the idea of automation on the table. It makes a lot of business sense to remove the human intervention here. To use an example, AI Builder, part of Microsoft Power Platform, offers a turnkey solution for using Microsoft AI through a point-and-click experience. It’s being used by many large enterprises for hands-off, error-free AI models.

 

Data happiness

No doubt, that feels like a long list, and you’re right, it is. But as you’d expect there are tools out there to help businesses get their data in the right order for AI.

What’s more, at Ipsos Jarmany, we have the data engineering and AI expertise to help you apply those tools, flesh out your data strategy, and get your data AI-ready to start maximising business growth and efficiencies. If that weren’t enough, we have the AI skills to build the ML models that will extract all the value you need from your data.

Today, we’re helping many companies successfully integrate AI into their businesses, making certain their data is up to the job.

If you’d like to know more about data or AI please get in touch.

 

Data-driven decision-making, made easy with Ipsos Jarmany

Demystifying Data Governance

In order to address these challenges and avoid the severe consequences of non-compliance, businesses must implement a robust data governance framework. And, if you’re striving to become a truly data-driven organisation, then having a comprehensive data governance strategy in place is non-negotiable.

 

What is Data Governance, and why is it important?

Let’s start at the beginning; what actually is data governance?
According to The Data Governance Institute:

“Data Governance is a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances, using what methods.”

In simpler terms, data governance establishes the foundation for collecting, managing, and releasing data for improved quality, accessibility and use. This includes defining the policies, standards, architecture, decision-making structure and issue-resolution processes around your data.

Aside from the aforementioned repercussions of not having a defined data governance framework in place, it’s essential to formulate a data governance strategy so you can achieve the following:

  • Data quality and accuracy
  • Compliance and Risk Management
  • Efficiency and Cost Reduction
  • Data Privacy and Security
  • Data Monetisation

 

 

Creating & Implementing your Data Governance Strategy

To help guide you on your own data governance strategy, we’ve broken this down into 5 steps.

Step 1: Define Your Goals and Objectives
When implementing a data governance strategy, it is important to first outline the goals and expected outcomes of the strategy. Ask yourself what you are trying to achieve, which internal stakeholders need to feed into the strategy, and what success looks like for your organisation.

You should also consider what people, processes and technologies will sit at the core of your strategy, and how you can ensure your strategy is adaptable and can pivot based on changing business factors.

Step 2: Secure stakeholder buy-in
Data governance initiatives require collaboration and engagement from various business units. Therefore, a key part of your planning process should be ensuring alignment with internal stakeholders.

Make sure you’re involving key stakeholders from across the organisation, including business users, IT teams, legal, compliance, and executive leadership, and then mutually agreeing on what a ‘good’ data governance strategy looks like.

Gaining their buy-in is important so that relevant stakeholders are on board with why a robust data governance strategy is needed and what the benefits are, ensuring continuous collaboration.

Step 3: Establish Roles and Responsibilities
Your next step is to establish who will feed into your data governance initiative. Take time to outline what the required roles are, and what responsibilities and scope for these roles will be. This could include roles such as data engineers, data analysts, and data architects. You can then evaluate if the personnel and skillset already exists within your organisation, or if you need to up-skill your workforce through training or collaborating with external partners.

It’s important to remember that in a data-driven organisation everyone is responsible for data governance. It’s not just down to the ‘data experts’ to oversee data governance – essentially any function that touches data needs to be aware of data governance practices. This could include marketing, sales or finance functions.

Step 4: Evaluate Your Technologies
Once you’ve outlined the roles and responsibilities required to implement and manage your data governance function, you then need to review whether you have the technological capabilities to fulfil these requirements efficiently.

These tools should support you with data collection, data storage, data analysis, data architecture and data management, amongst other capabilities. Evaluate what tools you already have at your disposal so you can identify any gaps in your technology stack.

Step 5: Outline Your Processes
The last stage in defining your data governance strategy is to develop comprehensive policies and guidelines that cover data classification, data access controls, data retention, data quality standards, and privacy requirements. You should ensure these policies are aligned with relevant regulations and industry best practices, and are easily accessible to stakeholders around the business.

Documenting these processes will ensure that, regardless of who is actioning certain aspects of your governance strategy, the outcome will always be the same, minimising human error and user discrepancy.

Data governance is an ongoing process and so your strategy should evolve over time to stay in line with your business’s goals and objectives; it needs to be able to evolve as your data does too. As such, you should consider your process for evaluation and continuous improvement so you can be sure that your plan is future-proofed.

Once you’ve worked your way through these 5 steps, you’re ready to get going with implementing your data governance strategy.

 

Getting started

Data governance is a critical aspect of modern organisations. By implementing a robust data governance framework, businesses can establish trust in their data, ensure compliance with regulations, and drive efficiency.

Furthermore, effective data governance allows organisations to unlock the full potential of their data assets, leading to improved decision-making, enhanced customer experiences, and sustainable business growth.

Prioritising data governance is not just a compliance requirement, but a strategic imperative for organisations seeking to thrive in the data-driven era.

If you’d like to find out more about creating and implementing a data governance strategy, or if you’re looking for external support to help kickstart data governance in your organisation, then reach out to the team at Ipsos Jarmany today.

Discover how to ace your omnichannel analytics with our latest ebook

Mastering Marketing Mix Modelling: Your Roadmap To Success

Marketing Mix Modelling (MMM) is the practice of analysing an organisation’s multi-channel marketing efforts to establish which elements are driving the most success. In turn, this enables you to better allocate resources based on the channels that are driving the most ROI, so you can continue to optimise performance and invest the right level of spend.

Marketing mix models use aggregated data to determine trends in seasonality and then predict channel attribution. These statistical models have been used historically; however, they were phased out as individual-level tracking rose to prominence.

We’ve now seen a return of MMMs due to changes such as GDPR, the phase-out of third-party cookies and Google’s Privacy Sandbox, which have reduced the ability to use individual tracking, forcing organisations to look for alternative ways to track and predict channel performance and attribution.

Marketing Mix Models are designed to answer questions like:

  • Am I spending money in the right places?
  • Am I overspending in some channels?
  • How much money should I be spending?
  • How should I split my marketing investment across the marketing mix?
  • How much money will I make in the next quarter?
  • What is the point of diminishing return?

 

Getting the most out of your marketing mix model

In order to achieve these insights, it’s important to feed the model with high quality data so you can obtain the optimal output. You need to consider factors such as:

  • How much marketing spend do I have access to?
  • Are there other factors that will affect revenue? Such as stock shortages, changes in pricing or macroeconomic factors?
  • What type of data do you have at your disposal? For example sales data, marketing spend data, stock data.
  • How much data do you have access to, and how granular is this data? For example, do you have one year’s worth of data, or eight years’ worth? The more data, the better.
  • What are your goals? E.g. do you want the model to optimise ROI, or generate the most awareness, or drive the most traffic to your website? MMM can only prioritise one goal at a time.
  • What marketing channels are within your remit?

Once you’ve input the data and parameters into the MMM, the model will then output:

  • A selection of different combinations of marketing spend, based on your goals and budget
  • The diminishing return curves for each channel based on current data
  • The decay rates for each channel
  • Current vs optimised return / revenue estimation
  • Current channel spend vs suggested optimised spend
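To make the decay rates and diminishing-return curves in the outputs above concrete, here is a minimal Python sketch of the two transformations most marketing mix models rely on. All figures, the decay rate and the half-saturation constant are invented for illustration; a production MMM would estimate them from your data.

```python
# Toy illustration (made-up figures) of two transformations at the heart
# of most marketing mix models: adstock, which carries a decaying fraction
# of each week's advertising effect into later weeks, and a saturation
# curve, which captures diminishing returns on spend.

def adstock(spend, decay=0.5):
    """Each week's effect = this week's spend + decay * last week's effect."""
    carried, out = 0.0, []
    for s in spend:
        carried = s + decay * carried
        out.append(carried)
    return out

def saturate(x, half_saturation=100.0):
    """Hill-type curve: response flattens as (adstocked) spend grows."""
    return x / (x + half_saturation)

weekly_spend = [120, 80, 0, 150, 60, 200, 90, 40]   # one channel, £k
effect = [saturate(a) for a in adstock(weekly_spend)]
print([round(e, 3) for e in effect])

# Diminishing returns: the same £10k uplift buys less effect at higher spend
print(round(saturate(110) - saturate(100), 4))   # uplift from £100k to £110k
print(round(saturate(210) - saturate(200), 4))   # smaller uplift at £200k
```

Fitting then amounts to regressing revenue on these transformed spend series for each channel, which is where the optimised spend combinations and per-channel curves come from.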

 

Benefits of Marketing Mix Modelling

As we’ve touched upon earlier in this blog, marketing mix models can bring a wealth of benefits to your business, mainly by steering your decision-making towards investing in the perfect blend of marketing channels to drive the optimal output. However, further to this you can also benefit from:

  1. A clear foundation for ongoing data-driven insights

Marketing mix modelling provides a quantitative foundation for decision-making, rather than relying on gut instinct or intuition. It also enables you to regularly analyse your marketing investment, performance and ROI over time, so you can uncover trends and patterns across your marketing mix.

  2. Greater level of insights

Marketing Mix Models also enable you to dig deeper into your performance, so you can understand how your multi-channel marketing campaigns work together, which channels drive the highest attribution, how seasonality impacts your campaigns, customer channel preference and changing user behaviour.

This level of insight means you can tailor your marketing campaigns to different audience segments. For example, if one demographic typically has a higher conversion rate attributable to one marketing channel, and a different demographic typically responds more positively to another, you can use MMM to create the right blend of activity based on the value of each audience segment.

  3. Capability for predictive analytics

By examining the results of previous marketing campaigns and their influence on business outcomes, businesses can predict future success more accurately. This predictive capability supports well-informed decisions and the crafting of effective marketing strategies.

 

Challenges of Marketing Mix Modelling

Whilst there are many benefits to leveraging marketing mix modelling, it does not come without its challenges, and it’s important to carefully consider these before you begin using your MMM. These challenges include:

  1. Getting your data right in the first place

The first hurdle in setting up your marketing mix model is ensuring that the data you’re inputting is high quality, clean and in the right format. You also need at least 3 years of data for the model to produce recommendations – anything less would give an unreliable output – so ensuring that you have a data collection and data cleaning process in place is critical. Ask yourself if you have the right data systems in place, from data warehouses and lakes to data visualisation.

  2. Complexity of the data

With so many different factors to consider, it can be difficult to ensure that the analysis is accurate and comprehensive, and different industries may require different approaches for analysis. Therefore, before you start using your marketing mix model, you need to ensure that you’re equipped to handle this complex data with varying parameters and limitations that may impact your model’s output.

  3. Ongoing management of the MMM

Marketing mix models are a complex form of statistical analysis, and given that they are steering you on financial investments for your marketing activity, you need to be 100% confident that the data you’re inputting, the model performance, and the output delivered by the model are all performing seamlessly. 

It’s also natural for an organisation to alter their level of investment, marketing channel preference and goals on a frequent basis, so you need to ensure the model is set up to satisfy your ever-changing goals. This requires a specialist skillset from analysts who have experience working with marketing mix models, and can be a challenge if you don’t have this skill set available internally.

 

How Ipsos Jarmany can support you

At Ipsos Jarmany, we build marketing mix models that combine the power of machine learning and statistical analysis to uncover the best way to invest your marketing resources. These models can be tailored to your business’s goals, marketing budget and parameters.

Once your data has been inputted into the model, it will run approximately 2000 times, each time changing the spend and the channels to maximise ROI. Our model will then output the top 100 optimised spends, based on your current / defined spending patterns to show the variety of different approaches that can be taken to solve the same problem. We’ll then work with you, and your business knowledge, to select the option that is best suited to your organisation.

Further to this, our model feeds the outputs into an interactive Power BI report so you can visualise the optimal approach, whilst also giving you the ability to alter the spend for each channel to review how this impacts return, decay curves and other factors.

If you’d like to find out more about how you can use marketing mix modelling to uncover the best way to allocate your marketing spend, or if you’d like to see a demo of our model, then reach out to the team today.

Data-driven decision-making, made easy with Jarmany

Choosing The Right Data Analytics Agency: 5 Key Factors To Consider

To address this challenge, many businesses are opting to partner with external organisations that can fill this skills gap. By doing so, they gain access to expertise that helps them unravel the true narrative hidden within their data, and generate valuable insights that drive tangible outcomes – all without the burden of sourcing and training new talent.

However, selecting the right data analytics service provider requires careful consideration and thoughtful deliberation. With a multitude of agencies in the market, it is crucial to strategically evaluate various factors to ensure a mutually beneficial partnership that aligns with your business needs and supports your goals, both now and in the future.

In this blog, we will explore 5 of the key factors that you need to consider when choosing a data analytics service provider.

Let’s get to it.

#1 Assessing Internal Capabilities

On your journey to identifying the right data analytics partner for your business, the absolute first thing you need to do is assess your internal capabilities. This starting point allows you to identify what, if any, capabilities you can manage internally, and therefore which areas you need to outsource. Ask yourself what talent, tools and technology exist within your current team and infrastructure, and whether current bandwidth permits your team to fulfil any of your business’s data needs.

This will help direct you towards either finding an end-to-end agency whose capabilities span a wide area, or a specialist agency who can simply bolster your internal skillset.

 

#2 Services Offered

Once you have established the level of support you need from an external agency, you can then match these requirements to the service offering of each agency.

Collate a list of the agencies in the line-up and work through your checklist of the technical and analytical capabilities you’re specifically looking for. This will help you to identify the agencies whose offering aligns with your needs versus those whose specialisms and services aren’t well matched.

Whilst it can be easy to simply consider your current requirements, it’s imperative that you also consider what type of support you may need in the future, so you can opt for an agency that can provide long-term support.

 

#3 Expertise and Experience

Now that you have established the breadth of services provided from each agency, and disregarded those that are not closely aligned to your requirements, you need to consider the extent of experience and expertise they can provide for each area within their service offering. For example, an agency may claim that they have experience in AI and building predictive forecasting models, but to what extent?

Are they well equipped with the right expertise internally to give you full confidence in their ability to perform? And, what case studies, testimonials and demos can each agency provide to back this up?

Industry experience is also a critical factor to consider here. Does their experience specifically relate to your industry, demonstrating that they can not only deliver on the project, but can also provide an in-depth understanding of the meaning behind your data and the context behind the insights?

Similarly, you should consider their level of experience with companies that are similar in size and scale to your own. Do they usually partner with smaller-scale businesses, or are they well-versed in working with larger organisations, and can they therefore appreciate the complexity of internal processes and the varying requirements of stakeholders across the business?

After all, the partnership will look vastly different with a smaller company vs a larger global company with numerous project streams and vested stakeholders.

 

#4 Analytical Capabilities

Effective data analytics relies on advanced analytical capabilities, and it’s important that the analytical capabilities of the data service provider match your analytical requirements. It’s therefore imperative that you assess the provider’s proficiency in certain areas of analytics, such as:

  • Statistical analysis
  • Data modelling
  • Visualisation tools
  • Cloud infrastructure
  • Data mining
  • Machine learning and AI
  • Advanced analytics

Whilst you need to evaluate their core skills, the agency’s analytical capabilities should span far beyond basic reporting.

Can they provide sophisticated insights, spot patterns and trends in your data and provide business and industry context to further aid strategic decision-making?

Further to this, you need to establish each service provider’s level of technical capability. This goes beyond standard analytics: it’s their ability to build and maintain the infrastructure that sits behind your data.

 

#5 Tools and Technology

Another key consideration is the tools and technology that the agency uses for data analytics. They should be proficient in working with the latest data analytics software, programming languages, machine learning frameworks, and visualisation tools. Are they ahead of the curve when it comes to new technologies within the data and analytics industry?

It’s also essential to ensure their platform expertise is compatible and aligns with the tech stack you’re already using internally. For example, if your organisation currently uses Microsoft products and is now in need of a business intelligence solution, then Microsoft Power BI is probably the most suitable tool for you. Selecting an agency that specialises only in Tableau may not be the optimal match in this case. You also need to take into account any pre-established preferences you have regarding software stacks, to identify whether they align with the agency’s capabilities.

 

Summing Up

So, there we have it, 5 key areas you need to consider when assessing which data analytics service provider is right for you and your organisation. However, this is just scratching the surface – choosing a data analytics service provider is no small feat, and so there are many more factors you need to consider when searching for the right agency to meet your challenges and build a long-term partnership with.

We have created an in-depth guide outlining the main 12 considerations – think of it as the core criteria you should be using to guide you on your search.

Download the eBook here to access this intel, or alternatively feel free to get in touch with us if you’d like to discuss how we can support you with your data needs.

Data-driven decision-making, made easy with Ipsos Jarmany

AI and Ecommerce – A Powerful Partnership For Growth

Ecommerce is older than the internet. Yes, we scratched our heads over that one too, but it’s a fact that eCommerce started in the 1970s with teleshopping, while the internet didn’t officially celebrate its first birthday until 1 January 1983. Still, the history of eCommerce and the internet is closely connected, with the web providing the technologies for eCommerce to thrive.

In this blog, we’ll bring the story of eCommerce up to date, highlighting the challenges that eCommerce professionals face today in a crowded marketplace, and how AI can help you overcome these challenges to increase sales. We’ll also share our top 5 AI benefits and flag a couple of techniques that you can discuss with your AI team to immediately boost your eCommerce performance.

 

Recent Ecommerce History

eCommerce may have been around for 40-something years, but it’s only in recent times that people have really embraced it. Sure, the internet was a boost, but it was the pandemic that caused the current explosion, driving 40% of UK shoppers to spend more online by March 2020, with this figure rising to 75% by February 2021.

What’s more, there’s certainly been no going back to the way things were. More than a quarter of UK consumers stated they expected to shift more of their shopping online post pandemic and four out of every five UK consumers today are now digital buyers.

 

The Challenges Of Ecommerce Today

No-one would question that the eCommerce pie is bigger than ever; however, the number of businesses trying to get a slice of that pie has grown just as fast.

A quick look at the Office for National Statistics’ figures shows 79,000 more eCommerce websites registered in the UK in 2021 versus 2020. One estimate puts the UK at 1.24 million eCommerce websites in 2023, second only to the United States.

 

How AI Can Help Ecommerce Overcome Its Challenges

With so much negativity around AI right now, it’s refreshing to see how gung-ho the whole eCommerce world is about this technology. But who wouldn’t be happy if AI could generate 20% additional eCommerce revenue and reduce costs by 8% in today’s tough business climate?

 

The Top 5 Applications For AI in Ecommerce

So where does AI fit into eCommerce? Well, AI helps companies optimise the customer experience and increase operational efficiencies end-to-end.

Here are 5 ways that AI can transform your eCommerce operation:

#1 Personalised Product Recommendations

It’s what digital buyers expect to see nowadays, and it can increase the ROI on your marketing spend by 5-8 times, according to McKinsey. However, it would be too expensive to do manually for a large customer base.

Using AI, you can automate the personalisation process with algorithms that accurately predict buying behaviour based on historical customer data, increasing cart size and driving revenue.
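As a deliberately small illustration of the idea (not a production recommender), the sketch below scores unseen products for a customer by similarity to other customers’ baskets; the customers and products are invented.

```python
import math

# Hypothetical purchase history: 1 = customer bought the product.
purchases = {
    "alice": {"laptop": 1, "mouse": 1, "monitor": 1},
    "bob":   {"laptop": 1, "mouse": 1},
    "carol": {"kettle": 1, "toaster": 1},
    "dave":  {"laptop": 1, "monitor": 1, "keyboard": 1},
}

def cosine(u, v):
    """Cosine similarity between two sparse purchase vectors."""
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

def recommend(customer, k=2):
    """Score products the customer hasn't bought, weighted by similarity."""
    me = purchases[customer]
    scores = {}
    for other, basket in purchases.items():
        if other == customer:
            continue
        sim = cosine(me, basket)
        if sim <= 0:
            continue                      # ignore dissimilar customers
        for product in basket:
            if product not in me:         # only products not yet bought
                scores[product] = scores.get(product, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("bob"))
```

Real systems apply the same similarity-weighted scoring to millions of rows of behavioural data, which is exactly the part that is too expensive to do manually.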

#2 Smarter Searches

In the same way AI can personalise recommendations, it can do the same for your searches: your eCommerce website can tailor search results based on criteria like a user’s previous searches and purchases. Hence, if a customer types in ‘men’s clothes’, the results will include brands the customer has previously bought.

In addition, using AI-based natural language processing algorithms, your site’s search engine can pick out the phrases and words that are often used. This way, it doesn’t matter if the searcher doesn’t type the exact product name and uses jargon instead, like ‘blow dryer’ instead of ‘hair dryer’.

#3 Smart Logistics and Warehousing

Stock-outs are your worst nightmare, but overstocks are little better because of the associated costs. The beauty of AI is that it can help you calculate the right amount of product to hold in stock at any given time.

Furthermore, when AI is used in logistics, it can help your company analyse and optimise existing routing. Going a step further, the predictive capabilities of AI can also help with basic warehouse maintenance, tracking the performance of the machines supporting your warehouse so you can plan the most advantageous maintenance schedules.
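As a simple illustration of the stock-level point above, the classic reorder-point calculation below is the kind of statistical baseline an AI demand forecast refines; the demand and lead-time figures are made up.

```python
import math

# Reorder point = expected demand over the supplier lead time, plus a
# safety-stock buffer for demand variability. Not AI as such, but the
# baseline a forecasting model improves on with richer signals.

def reorder_point(daily_demand_mean, daily_demand_std, lead_time_days, z=1.65):
    """z = 1.65 covers roughly 95% of demand scenarios (normal assumption)."""
    expected = daily_demand_mean * lead_time_days
    safety = z * daily_demand_std * math.sqrt(lead_time_days)
    return expected + safety

# e.g. ~40 units/day, standard deviation 12, 7-day lead time (invented)
print(round(reorder_point(40, 12, 7)))
```

Raising `z` trades higher holding costs for fewer stock-outs, which is precisely the balance the paragraph above describes.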

#4 Demand Forecasting and Dynamic Pricing

The two go together, with demand forecasting and dynamic pricing helping to improve your pricing strategies. In this case, AI analyses market conditions, spots pricing gaps and recommends strategies to realise the opportunities.

There are different AI algorithms to support different pricing strategies. For example, eCommerce websites can access algorithms to maximise revenues, minimise customer churn rates, increase loyalty and beat competitors on price.
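To illustrate the mechanics, here is a toy version of revenue-maximising pricing. It assumes an invented linear demand curve estimated from recent price and sales observations; real dynamic-pricing engines model far richer signals (competitors, seasonality, churn risk).

```python
# Toy dynamic-pricing sketch: estimate a linear demand curve from recent
# (price, units sold) observations, then pick the revenue-maximising
# price. All figures are invented for illustration.

def fit_linear_demand(observations):
    """Ordinary least squares for units = a - b * price; returns (a, b)."""
    n = len(observations)
    mean_p = sum(p for p, _ in observations) / n
    mean_q = sum(q for _, q in observations) / n
    cov = sum((p - mean_p) * (q - mean_q) for p, q in observations)
    var = sum((p - mean_p) ** 2 for p, _ in observations)
    slope = cov / var          # expected to be negative for normal goods
    intercept = mean_q - slope * mean_p
    return intercept, -slope   # a, b in units = a - b * price

def revenue_maximising_price(a, b):
    # revenue R(p) = p * (a - b * p) peaks at p = a / (2 * b)
    return a / (2 * b)

history = [(10.0, 80), (12.0, 72), (15.0, 60), (20.0, 40)]  # (price £, units)
a, b = fit_linear_demand(history)
print(round(revenue_maximising_price(a, b), 2))
```

The design choice here: with units = a - b * price, revenue p * (a - b * p) is a downward parabola, so the optimum is the closed-form a / (2b); a live engine would simply refresh the fit as new sales data arrives.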

#5 AI Assistants and Chatbots

Aren’t they the same thing? The boundary separating the two may be a bit blurry, but really, they deliver assistance in different ways: virtual assistants handle multiple kinds of tasks, while chatbots tend to engage more directly with customers.

Chatbots enable conversational commerce and can engage passive visitors through natural language understanding, launching conversations to learn people’s requirements and guide them to relevant products. Virtual assistants can handle data-sensitive tasks and provide customer support via phone, email or chat.

 

Your Top AI Ecommerce Techniques

Ratcheting up the techie side of this blog a bit, we wanted to share some examples of AI techniques that are relevant and that you can use. Your AI team will probably be familiar with them too.

Logistic Regression

It’s a kind of statistical analysis for predicting the likelihood of a binary outcome. For eCommerce, it can answer questions like: given parameters x, y and z, how likely is a promotion to persuade a customer to buy?
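A minimal sketch of that promotion question, using entirely made-up features and labels (past purchases and minutes on site), could look like this:

```python
import math

# Toy logistic regression (invented data): probability that a promotion
# converts a customer, from past purchases and minutes on site, fitted
# with plain stochastic gradient descent on the log-loss.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# features: (past_purchases, minutes_on_site); label: 1 = bought
data = [((0, 2), 0), ((1, 5), 0), ((2, 3), 0),
        ((3, 8), 1), ((4, 10), 1), ((5, 12), 1)]

w1 = w2 = b = 0.0
lr = 0.05
for _ in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        err = p - y                 # gradient of log-loss wrt the logit
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b -= lr * err

def buy_probability(past_purchases, minutes_on_site):
    return sigmoid(w1 * past_purchases + w2 * minutes_on_site + b)

print(round(buy_probability(5, 12), 2))   # engaged repeat customer
print(round(buy_probability(0, 1), 2))    # brand-new visitor
```

The output is a probability rather than a yes/no, which is what lets you target the promotion only at customers above a chosen threshold.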

Clustering

Here the algorithm organises objects into groups based on multiple variables. It can group customers based on purchasing patterns; bunch physical stores together based on performance; and bundle products together based on the same criterion. The process takes you to a deeper level of segmentation, identifying new collections of like-minded people to reach out to.
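A toy k-means run on invented customer data shows the grouping idea; real segmentation would use many more variables than the two here.

```python
import math
import random

# Toy k-means: group customers by (annual spend £, orders per year).
customers = [(100, 2), (120, 3), (90, 1),        # occasional, low spend
             (900, 24), (1100, 30), (950, 26)]   # frequent, high spend

def kmeans(points, k=2, iters=20, seed=42):
    random.seed(seed)
    centroids = random.sample(points, k)          # start from k random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                          # assign to nearest centroid
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        centroids = [                             # recompute centroid means
            tuple(sum(v) / len(v) for v in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(customers)
for c, members in zip(centroids, clusters):
    print(f"centroid {tuple(round(v) for v in c)} has {len(members)} customers")
```

Each resulting centroid is a segment profile (typical spend and frequency) you can then target differently.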

Sentiment Analysis

A classification algorithm, sentiment analysis reveals subjective opinions or feelings collected from many sources. You can use it for multiple objectives, including market research, precision targeting, product feedback and deeper product analytics. It can also boost customer loyalty, through improved customer service, helping agents resolve customer queries quicker.
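As a deliberately simplified sketch of the classification idea (real sentiment systems use trained models, not hand-written word lists), a lexicon-based scorer might look like this; both word lists and reviews are invented.

```python
# Minimal lexicon-based sentiment scorer: count positive and negative
# words and classify on the balance. Production systems use trained
# classifiers; this only illustrates the classification step.

POSITIVE = {"great", "love", "fast", "excellent", "helpful"}
NEGATIVE = {"slow", "broken", "poor", "refund", "disappointed"}

def sentiment(review):
    words = review.lower().replace(",", " ").replace(".", " ").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

reviews = ["Great product, fast delivery",
           "Arrived broken, very disappointed",
           "It is a kettle"]
print([sentiment(r) for r in reviews])
```

Routing the "negative" bucket straight to support agents is one way this feeds the quicker query resolution mentioned above.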

 

The View From Ipsos Jarmany

At Ipsos Jarmany, we work closely with eCommerce professionals looking to improve the performance of their websites. We recently added a section dedicated to AI on our eCommerce solutions page to provide some insights that you may find helpful.

You may also find value in our eBook, The 5 Best Strategies to Boost eCommerce Sales, and our Ecommerce Intelligence Demo, which demonstrates what you can do with the right tools in place to keep track of your eCommerce performance.

What’s clear today is that eCommerce offers great opportunities but presents significant challenges; and that AI is helping businesses overcome the hurdles to make the most of this rapidly growing sales channel.

If this blog has triggered some questions, thoughts or ideas, speak to us today and let us see how we can get your eCommerce business on the path to a best-practice AI adoption.

To learn more about how AI can improve the performance of your eCommerce get in touch with Ipsos Jarmany today and have an honest conversation with our AI experts.

Data-driven decision-making, made easy with Ipsos Jarmany

 

The Frontier Model Forum; What Is It and How Will It Help Regulate AI?

Ingrained into our everyday lives through technologies such as facial recognition, digital assistants and smart cars, the era of AI is well and truly upon us, and there are no signs of its substantial growth stagnating. In fact, the AI market is projected to reach $407 billion by 2027, with an expected annual growth rate of 37.3% from 2023 to 2030 [1].

Alongside this, businesses are also recognising the potential of AI and are increasingly leveraging it to streamline their operations, enhance data-driven decision-making through data analysis, automate repetitive tasks and improve customer service. To provide some context, according to Gov.uk, almost half a million UK businesses had adopted at least one AI technology in their operations at the start of 2022 [2].

And yet, whilst the AI industry has continued to advance and adoption has increased, there has been little progress in mitigating AI-associated risks, despite growing concerns within organisations about the cyber-security and regulatory compliance of artificial intelligence [3].

Now, don’t get us wrong, we’re not convinced we’re going to have an I, Robot situation on our hands any time soon. However, it cannot be denied that there are potential risks associated with the use of AI technology, and an urgent need for regulation to address these concerns.

This is where the Frontier Model Forum comes into play…

Want to become more data-driven? Download our ebook today to find out how.

Introducing The Frontier Model Forum 

The Frontier Model Forum (FMF) is a newly announced partnership aimed at promoting the responsible and safe development of AI models. 

Formed by Microsoft, Google, OpenAI and Anthropic, this new industry body has set out to cover four core objectives: 

  1. Advancing AI safety research
  2. Identifying best practices
  3. Collaborating with policymakers, academics, civil society and companies
  4. Supporting efforts to develop applications that can help meet society’s greatest challenges 

Whilst these four tech giants have founded the FMF, their aim is to establish an Advisory Board and invite member organisations to contribute towards its strategy and priorities. Organisations that wish to join the forum will need to meet the following membership criteria:

  • Develop and deploy frontier models (large-scale ML models that are capable of performing an extensive range of tasks that go beyond what is currently possible with even the most advanced existing models)
  • Demonstrate a strong commitment to frontier model safety
  • Be prepared to contribute towards advancing the FMF’s efforts

The aim of the Frontier Model Forum is then to leverage the collective technical and operational knowledge of its member companies to benefit the overall AI ecosystem. This includes driving progress in technical evaluations and benchmarks, as well as creating a public repository of solutions to promote industry best practices and standards. Through these collaborative efforts, the Forum seeks to contribute to the advancement and development of the AI industry as a whole. 

“Companies creating AI technology have a responsibility to ensure that it is safe, secure, and remains under human control. This initiative is a vital step to bring the tech sector together in advancing AI responsibly and tackling the challenges so that it benefits all of humanity.” Brad Smith, Vice Chair & President, Microsoft.

 

Our thoughts

From our perspective, AI presents a range of risks: job displacement, security and privacy concerns, and bias and discrimination, to name a few. However, we believe the primary concern related to AI is the absence of regulation and the lack of clear guidelines. This is why we consider the launch of the Frontier Model Forum to be a highly encouraging and much-needed development that will help to mitigate risks, establish industry-recognised standards and reduce potential negative social impact.

By bringing together experts and industry leaders, it will foster a collective effort to:

  • Reduce potential negative impact
  • Safeguard society’s interest
  • Ensure the responsible and ethical use of AI 

The Frontier Model Forum has the potential to shape the future of AI in a way that minimises risks, enhances transparency, and creates a more secure and accountable environment for AI development and deployment, so we can continue to reap the benefits of AI while effectively managing the associated risks.

Discover how to ace your omnichannel analytics with our latest ebook

  1. https://www.forbes.com/advisor/business/ai-statistics/
  2. https://www.gov.uk/government/publications/ai-activity-in-uk-businesses/ai-activity-in-uk-businesses-executive-summary
  3. https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-state-of-ai-in-2022-and-a-half-decade-in-review