12 Essential Steps To Be AI-Ready

The Truth About AI

True, AI can significantly boost performance and revenues by transforming organisations in lots of different ways, from how they engage with customers to how they recruit people and manage their finances. In fact, it’s predicted to boost UK GDP by 14% by 2030.

But honestly speaking, AI is anything but plug-and-play. And maybe that’s why an astonishing 85% of AI projects fail to deliver the expected business value.

Something doesn’t add up. You seemingly have the data and access to the AI models, so what’s wrong? Well, let’s go back to the data for a second—because that’s where AI projects normally go off the rails.

 

Don’t forget the data

Broadly speaking, either companies don’t have enough of it, aren’t using it in the right way, have major quality issues, or just don’t have the correct systems to store and warehouse the stuff. We see the same problems time and again.

When it comes to artificial intelligence, getting the foundations right is absolutely critical. AI isn’t a quick process and there isn’t a ‘one size fits all’ solution; this is a long-term strategic investment in your business which will improve over time. However, all this relies on the quality of your model inputs, namely data. If you’re not getting this right, you’re already setting yourself up for failure (and a lot of wasted time, effort and money).

In this blog we’re going to tell you how to prepare your data for AI success. With our 12 steps, you’ll be able to navigate AI with confidence and start harnessing the full power of the technology for better business results.

Here we go:

 

#1 Data volumes

Generally, AI algorithms require significant volumes of data – we really can’t emphasise this enough. However, just how much will depend on the AI use case you’re focused on. One figure often referred to is the need for 10x as many rows (data points) as there are features (columns) in your data set. The baseline for the majority of successful AI projects is normally more than 1 million records for training purposes.
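As a rough illustration, here’s how you might sanity-check a data set against those two rules of thumb (a minimal sketch in Python using pandas; the 10x multiplier and 1-million-record baseline are the guidelines above, not hard limits):

```python
import pandas as pd

def readiness_check(df: pd.DataFrame, multiplier: int = 10, baseline: int = 1_000_000) -> dict:
    """Rough volume check: rows should be at least `multiplier` times the
    number of features, and ideally above the ~1M-record training baseline."""
    rows, features = df.shape
    return {
        "rows": rows,
        "features": features,
        "meets_10x_rule": rows >= multiplier * features,
        "meets_baseline": rows >= baseline,
    }

# Toy example: a 50-column data set needs at least 500 rows under the 10x rule
toy = pd.DataFrame({f"f{i}": range(600) for i in range(50)})
print(readiness_check(toy))
```

Here the toy frame passes the 10x rule (600 rows vs 500 required) but falls well short of the 1-million baseline, which is exactly the kind of gap worth flagging before a project kicks off.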

 

#2 Data history

Let’s say you want to use AI for demand forecasting or for marketing mix models. In this case, at Jarmany, we recommend having at least 3 years’ worth of data; otherwise, your model will just repeat the previous year’s outputs. It stands to reason that for AI to detect and predict events better than we can, it needs to work with loads of historical data to uncover the patterns and anomalies that we need it to.

 

#3 Data relevance

Depending on your use case, you’ll also need specific data sets for your algorithm. For example, marketing mix models aim to measure the impact of various marketing inputs on sales and market share, hence you’ll need data sets such as previous years’ sales, marketing performance and budget allocations.

 

#4 Data Quality

We’ve put this at #4 but maybe we should have put it at #1. It’s massive. If the quality of the data you’re inputting into your AI model is poor, you can bet the model’s output will be poor too.

In short, many companies face data quality issues, so there’s every chance your unsuccessful AI project will do nothing more than put a broader issue under the spotlight. Not a bad thing.

So, how do you go about achieving data quality? Essentially, you’re going to have to go through your data and ensure it doesn’t suffer from any of the following:

• Inconsistency
• Duplication
• Inaccuracy
• Outdatedness
• Irrelevancy
• Incompleteness
• Lack of governance
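Several of these checks can be automated before any labelling or modelling starts. A minimal sketch using pandas, with a toy orders table standing in for your real data (column names are illustrative):

```python
import pandas as pd

def quality_report(df: pd.DataFrame, date_col=None) -> dict:
    """Flag some of the quality issues listed above: duplication,
    incompleteness (missing values) and outdatedness (stale records)."""
    report = {
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_values": int(df.isna().sum().sum()),
        "missing_pct_by_column": df.isna().mean().round(3).to_dict(),
    }
    if date_col is not None:
        report["oldest_record"] = str(df[date_col].min())
    return report

orders = pd.DataFrame({
    "order_id": [101, 102, 102, 104],
    "amount": [10.0, 25.0, 25.0, None],
    "order_date": pd.to_datetime(["2023-01-05", "2023-06-01", "2023-06-01", "2024-03-10"]),
})
print(quality_report(orders, date_col="order_date"))
```

On this toy table the report surfaces one fully duplicated order and one missing amount – exactly the kind of issue you want caught before an engineer spends time labelling it.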

 

#5 Data Understanding

Whilst we place a massive emphasis on data quality (and rightly so), having a large volume of high-quality data doesn’t stand for much if you don’t have a solid understanding of your data. By this we mean understanding what the data relates to, what the data is telling you, and being able to identify patterns and trends, as well as spikes, dips and outliers in your performance.

Additionally, when it comes to data, it’s key that you have an understanding of what’s happening within the wider business so you can apply business context to the data. For example, if you’re seeing a dip in sales performance can this be attributed to seasonality, or perhaps a stock or distribution issue?

 

#6 Data labelling

This is pretty much as it sounds. You’re annotating your data, defining it as an image, text, video or audio, to help your learning model find “meaning” in the information. It’s important to remember that labelling—like the next step we’ll go on to talk about—should come after you’ve ensured the data quality. 

Labelling is essentially a manual step done by software engineers and the last thing you want is for an engineer to waste their time labelling duplicated, inaccurate or irrelevant data.

 


 

#7 Data augmentation

Data augmentation is all about creating new or modified data from your existing sets to artificially increase the quantity of data and its value. 

By making small changes such as randomly changing words in text data, you’re not only increasing the data set but improving its quality, helping avoid “overfitting”, where your model aligns too closely to your original training data and struggles with new information.
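For instance, one lightweight augmentation for text data is shuffling word positions to create variants (a toy sketch only; real pipelines also use synonym replacement, random deletion and back-translation):

```python
import random

def augment_text(sentence: str, n_variants: int = 3, seed: int = 42):
    """Create simple variants of a sentence by swapping the positions of
    two randomly chosen words per variant."""
    rng = random.Random(seed)
    words = sentence.split()
    variants = []
    for _ in range(n_variants):
        i, j = rng.sample(range(len(words)), 2)
        swapped = words.copy()
        swapped[i], swapped[j] = swapped[j], swapped[i]
        variants.append(" ".join(swapped))
    return variants

print(augment_text("delivery was fast and the product works great"))
```

Each variant keeps the same vocabulary but a slightly different form, giving the model more examples to generalise from without collecting new data.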

 

#8 Data systems

For AI to work, you’re going to need the right data systems in place. The key essentials include plenty of computing capacity offering a mix of CPU and GPU processing, modern storage and warehousing, high-bandwidth, low-latency networking, and security for your sensitive data.

That’s potentially a lot of investment, which is why many companies are looking to cloud services to give them the systems they need at the right price. Leading cloud providers, including Microsoft, can provide you with the AI data systems you require to get your AI project off the ground.

 

#9 Data privacy

Data privacy is more tightly controlled than ever and rightly so. Yet, as we know, AI needs tons of data to work, which amplifies your risk of privacy breaches occurring. Trust us, you need to take data privacy very seriously and invest in the tools and techniques to make sure your data comes with encryption, anonymisation and owner consent.
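One common technique here is pseudonymisation: replacing direct identifiers with salted hashes so records can still be joined per customer without exposing who they are. A minimal sketch (field names and the salt are illustrative; note that hashing alone is pseudonymisation, not full anonymisation, so treat the salt as a secret):

```python
import hashlib

def pseudonymise(record: dict, pii_fields=("name", "email"),
                 salt="replace-with-secret-salt") -> dict:
    """Replace direct identifiers with salted SHA-256 hashes, leaving
    non-identifying fields untouched. The same input always yields the
    same hash, so records can still be joined per customer."""
    out = dict(record)
    for field in pii_fields:
        if field in out:
            out[field] = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()[:16]
    return out

customer = {"name": "Jane Doe", "email": "jane@example.com", "lifetime_value": 1240.50}
print(pseudonymise(customer))
```

The analytical value (here, lifetime value) survives, while the identifying fields become opaque tokens.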

 

#10 Data governance

We touched on this earlier when we talked about data quality. The point we made then was that the correct data governance will boost your data quality, saving you time and money. What’s more, correct governance will ensure sensitive and confidential data is classified accordingly and deleted in line with the appropriate data retention schedule.

 

#11 Data People

Another key step that you need to consider on your journey to becoming AI-ready is data people; and there are three sides to this point.

Firstly, gaining internal buy-in from the key stakeholders within your business is critical to any AI project. These stakeholders need to share the same vision as you when it comes to what you’re trying to achieve with AI and how it can benefit the business. They need to understand the strengths and limitations of the AI model so that expectations are aligned. And, the only way you can ensure internal adoption is by getting stakeholder buy-in from the get-go.

Secondly, in any digital transformation project, roughly 10% of success comes from having the right tech in place and 90% from having the right people and skillsets. This may seem surprising given the importance of having the right tech stack to handle your data and AI models, but you really can’t afford to underestimate how important it is to have the right people in place too. It’s hardly news that there’s a skills gap in the industry right now, so to future-proof your AI strategy you need to consider what skill sets you currently have within the business, identify areas where training and development is required, and establish at what point you may need to lean on external agencies for support.

Lastly, there’s data culture. It’s true, a lot of people are concerned that their jobs will become obsolete as a result of businesses adopting AI. In fact, 44% of employees are worried about the impact of AI on their jobs. Given this, fostering a strong data culture within your organisation should be a priority if you want to ensure internal adoption of your AI model and offset the workforce anxiety associated with AI. If your workforce is invested in your AI strategy, you’ll have a solid foundation for achieving AI success.

 

#12 Data automation

Now that we’re coming to the end, and you’re clear on what you need to do, we’re going to put the idea of automation on the table. It makes a lot of business sense to remove human intervention here. To use an example, AI Builder, part of Microsoft Power Platform, offers a turnkey solution for using Microsoft AI through a point-and-click experience. It’s being used by many large enterprises for hands-off, error-free AI models.

 

Data happiness

No doubt, that feels like a long list, and you’re right, it is. But as you’d expect there are tools out there to help businesses get their data in the right order for AI.

What’s more, at Jarmany, we have the data engineering and AI expertise to help you apply those tools, flesh out your data strategy, and get your data AI-ready to start maximising business growth and efficiencies. If that weren’t enough, we have the AI skills to build the ML models that will extract all the value you need from your data.

Today, we’re helping many companies successfully integrate AI into their businesses, making certain their data is up to the job.

If you’d like to know more about data or AI please get in touch.

 


Demystifying Data Governance

In order to address these challenges and avoid the severe consequences of non-compliance, businesses must implement a robust data governance framework. And if you’re striving to become a truly data-driven organisation, then having a comprehensive data governance strategy in place is non-negotiable.

What is Data Governance, and why is it important?

Let’s start at the beginning: what actually is data governance?
According to The Data Governance Institute:

“Data Governance is a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances, using what methods.”

In simpler terms, data governance establishes the foundation for collecting, managing, and releasing data for improved quality, accessibility and use. This includes defining the policies, standards, architecture, decision-making structure and issues resolutions process around your data.

Aside from the aforementioned repercussions of not having a defined data governance framework in place, it’s essential to formulate a data governance strategy so you can achieve the following:

  • Data quality and accuracy
  • Compliance and Risk Management
  • Efficiency and Cost Reduction
  • Data Privacy and Security
  • Data Monetisation

 

 

Creating & Implementing your Data Governance Strategy

To help guide you on your own data governance strategy, we’ve broken this down into 5 steps.

Step 1: Define Your Goals and Objectives
When implementing a data governance strategy, it is important to first outline the goals and expected outcomes of the strategy. Ask yourself: what are you trying to achieve, which internal stakeholders need to feed into this strategy, and what does success look like for your organisation?

You should also consider what people, processes and technologies will sit at the core of your strategy, and how you can ensure your strategy is adaptable and can pivot based on changing business factors.

Step 2: Secure stakeholder buy-in
Data governance initiatives require collaboration and engagement from various business units. Therefore, a key part of your planning process should be ensuring alignment with internal stakeholders.

Make sure you’re involving key stakeholders from across the organisation, including business users, IT teams, legal, compliance, and executive leadership, and then mutually agreeing on what a ‘good’ data governance strategy looks like.

Gaining their buy-in ensures the relevant stakeholders are on board with why a robust data governance strategy is needed and what the benefits are, so you can count on continuous collaboration.

Step 3: Establish Roles and Responsibilities
Your next step is to establish who will feed into your data governance initiative. Take time to outline what the required roles are, and what responsibilities and scope for these roles will be. This could include roles such as data engineers, data analysts, and data architects. You can then evaluate if the personnel and skillset already exists within your organisation, or if you need to up-skill your workforce through training or collaborating with external partners.

It’s important to remember that in a data-driven organisation everyone is responsible for data governance. It’s not just down to the ‘data experts’ to oversee it – essentially any function that touches data needs to be aware of data governance practices, whether that’s marketing, sales or finance.

Step 4: Evaluate Your Technologies
Once you’ve outlined the roles and responsibilities required to implement and manage your data governance function, you then need to review whether you have the technological capabilities to fulfil these requirements efficiently.

These tools should support you with data collection, data storage, data analysis, data architecture and data management, amongst other capabilities. Evaluate what tools you already have at your disposal so you can identify any gaps in your technology stack.

Step 5: Outline Your Processes
The last stage in defining your data governance strategy is to develop comprehensive policies and guidelines that cover data classification, data access controls, data retention, data quality standards, and privacy requirements. You should ensure these policies are aligned with relevant regulations and industry best practices, and are easily accessible to stakeholders around the business.

Documenting these processes will ensure that, regardless of who is actioning certain aspects of your governance strategy, the outcome will always be the same. There should be no human error or user discrepancy.

Data governance is an ongoing process and so your strategy should evolve over time to stay in line with your business’s goals and objectives; it needs to be able to evolve as your data does too. As such, you should consider your process for evaluation and continuous improvement so you can be sure that your plan is future-proofed.

Once you’ve worked your way through these 5 steps you’re ready to get going with implementing your data governance strategy.


Getting started

Data governance is a critical aspect of modern organisations. By implementing a robust data governance framework, businesses can establish trust in their data, ensure compliance with regulations, and drive efficiency.

Furthermore, effective data governance allows organisations to unlock the full potential of their data assets, leading to improved decision-making, enhanced customer experiences, and sustainable business growth.

Prioritising data governance is not just a compliance requirement, but a strategic imperative for organisations seeking to thrive in the data-driven era.

If you’d like to find out more about creating and implementing a data governance strategy, or if you’re looking for external support to help kickstart data governance in your organisation, then reach out to the team at Jarmany today.


Mastering Marketing Mix Modelling: Your Roadmap To Success

Marketing Mix Modelling (MMM) is the practice of analysing an organisation’s multi-channel marketing efforts to establish which elements are driving the most success. In turn, this enables you to better allocate resources to the channels driving the most ROI, so you can continue to optimise performance and invest the right level of spend.

Marketing mix models use aggregated data to determine trends in seasonality and then predict channel attribution. These types of statistical models have been used historically; however, they were phased out with the rise of individual-level tracking.

We’ve now seen a return of MMMs due to changes such as GDPR, the deprecation of third-party cookies and Google’s Privacy Sandbox, which have reduced the ability to track individuals, forcing organisations to look for alternative ways to track and predict channel performance and attribution.

Marketing Mix Models are designed to answer questions like:

  • Am I spending money in the right places?
  • Am I overspending in some channels?
  • How much money should I be spending?
  • How should I split my marketing investment across the marketing mix?
  • How much money will I make in the next quarter?
  • What is the point of diminishing return?

 

Getting the most out of your marketing mix model

In order to achieve these insights, it’s important to feed the model with high quality data so you can obtain the optimal output. You need to consider factors such as:

  • How much marketing spend do I have access to?
  • Are there other factors that will affect revenue? Such as stock shortages, changes in pricing or macroeconomic factors?
  • What type of data do you have at your disposal? For example sales data, marketing spend data, stock data.
  • How much data do you have access to, and how granular is this data? For example, do you have one year’s worth of data, or eight years’ worth? The more data the better.
  • What are your goals? E.g. do you want the model to optimise ROI, or generate the most awareness, or drive the most traffic to your website? MMM can only prioritise one goal at a time.
  • What marketing channels are within your remit?

Once you’ve input the data and parameters into the MMM, the model will then output:

  • A selection of different combinations of marketing spend, based on your goals and budget
  • The diminishing return curves for each channel based on current data
  • The decay rates for each channel
  • Current vs optimised return / revenue estimation
  • Current channel spend vs suggested optimised spend
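The decay rates and diminishing-return curves above are commonly modelled with an “adstock” transformation plus a saturating response curve. A toy sketch of both ideas (the decay and half-saturation values are illustrative, not fitted from data):

```python
def adstock(spend, decay=0.5):
    """Carry a fraction of each period's marketing effect into the next
    period -- this fraction is the channel's decay rate."""
    carried, out = 0.0, []
    for s in spend:
        carried = s + decay * carried
        out.append(carried)
    return out

def saturate(x, half_saturation=100.0):
    """Simple diminishing-returns curve: response grows quickly at low
    spend and flattens out as spend rises."""
    return x / (x + half_saturation)

# Four weeks of spend on one channel: a burst, two quiet weeks, then a top-up
weekly_spend = [100, 0, 0, 50]
effect = [saturate(x) for x in adstock(weekly_spend, decay=0.5)]
print([round(e, 3) for e in effect])
```

Even with zero spend in weeks two and three, the carried-over adstock keeps generating some response – which is precisely the behaviour the decay-rate output describes.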

 

Benefits of Marketing Mix Modelling

As we’ve touched upon earlier in this blog, marketing mix models can bring a wealth of benefits to your business, mainly by steering your decision-making towards investing in the perfect blend of marketing channels to drive the optimal output. However, further to this you can also benefit from:

  1. A clear foundation for ongoing data-driven insights

Marketing mix modelling provides a quantitative foundation for decision-making, rather than relying on gut instinct or intuition. It also enables you to regularly analyse your marketing investment, performance and ROI over time, so you can uncover trends and patterns across your marketing mix.

  2. Greater level of insights

Marketing Mix Models also enables you to dig deeper into your performance, so you can understand how your multi-channel marketing campaigns work together, which channels drive the highest attribution, how seasonality impacts your campaigns, customer channel preference and changing user behaviour. 

This level of insights means you can tailor your marketing campaigns based on different audience segments – for example if one type of demographic typically has a higher conversion that can be attributed to one marketing channel, and a different demographic typically responds more positively to another channel, you can use MMM to create the perfect blend of activity based on the value of each audience segment.

  3. Capability for predictive analytics

By examining the results of previous marketing campaigns and their influence on business outcomes, businesses can predict future success more accurately. This predictive capability aids in making well-informed decisions and crafting effective marketing strategies.

 

Challenges of Marketing Mix Modelling

Whilst there are many benefits to leveraging marketing mix modelling, it does not come without its challenges, and it’s important to carefully consider these before you begin using your MMM. These challenges include:

  1. Getting your data right in the first place

The first hurdle in setting up your marketing mix model is ensuring that the data you’re inputting is high quality, clean and in the right format. You also need at least 3 years of data in order for the model to churn out recommendations – anything less than this would be an unreliable output, so ensuring that you have a data collection and data cleaning process in place is critical. Ask yourself if you have the right data systems in place, from data warehouses and lakes to data visualisation.

  2. Complexity of the data

With so many different factors to consider, it can be difficult to ensure that the analysis is accurate and comprehensive, and different industries may require different approaches. Therefore, before you start using your marketing mix model, you need to ensure that you’re equipped to handle this complex data with the varying parameters and limitations that may impact your model’s output.

  3. Ongoing management of the MMM

Marketing mix models are a complex form of statistical analysis, and given that they are steering you on financial investments for your marketing activity, you need to be 100% confident that the data you’re inputting, the model performance, and the output delivered by the model are all performing seamlessly. 

It’s also natural for an organisation to alter their level of investment, marketing channel preference and goals on a frequent basis, so you need to ensure the model is set up to satisfy your ever-changing goals. This requires a specialist skillset from analysts who have experience working with marketing mix models, and can be a challenge if you don’t have this skill set available internally.

 

How Jarmany can support you

At Jarmany, we build marketing mix models that combine the power of machine learning and statistical analysis to uncover the best way to invest your marketing resources. These models can be tailored to your business’s goals, marketing budget and parameters.

Once your data has been input into the model, it will run approximately 2,000 times, each time changing the spend and the channels to maximise ROI. Our model will then output the top 100 optimised spends, based on your current / defined spending patterns, to show the variety of different approaches that can be taken to solve the same problem. We’ll then work with you, and your business knowledge, to select the option best suited to your organisation.
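The underlying idea of trying many randomised spend allocations and keeping the best performers can be sketched in a few lines (a toy illustration only: the response function below is invented for the example, whereas a real MMM learns it from your data):

```python
import random

def simulated_roi(allocation):
    """Stand-in response function with made-up per-channel returns and
    diminishing returns via a sub-linear exponent. A real MMM would
    learn this relationship from historical data."""
    returns = {"search": 2.1, "social": 1.6, "tv": 1.3}
    return sum(spend ** 0.8 * returns[ch] for ch, spend in allocation.items())

def optimise_spend(budget=100_000, channels=("search", "social", "tv"),
                   n_runs=2000, top_n=5, seed=1):
    """Try n_runs random splits of the budget and keep the top performers."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        weights = [rng.random() for _ in channels]
        total = sum(weights)
        allocation = {ch: budget * w / total for ch, w in zip(channels, weights)}
        results.append((simulated_roi(allocation), allocation))
    results.sort(key=lambda r: r[0], reverse=True)
    return results[:top_n]

best_roi, best_mix = optimise_spend()[0]
print(round(best_roi), {ch: round(s) for ch, s in best_mix.items()})
```

Keeping the top few allocations rather than a single winner mirrors the idea above: several different spend mixes can solve the same problem, and business knowledge picks between them.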

Further to this, our model feeds the outputs into an interactive Power BI report so you can visualise the optimal approach, whilst also giving you the ability to alter the spend for each channel to review how this impacts return, decay curves and other factors.

If you’d like to find out more about how you can use marketing mix modelling to uncover the best way to allocate your marketing spend, or if you’d like to see a demo of our model, then reach out to the team today.


Choosing The Right Data Analytics Agency: 5 Key Factors To Consider

To address this challenge, many businesses are opting to partner with external organisations that can fill this skills gap. By doing so, they gain access to expertise that helps them unravel the true narrative hidden within their data, and generate valuable insights that drive tangible outcomes – all without the burden of sourcing and training new talent.

However, selecting the right data analytics service provider requires careful consideration and thoughtful deliberation. With a multitude of agencies in the market, it is crucial to strategically evaluate various factors to ensure a mutually beneficial partnership that aligns with your business needs and supports your goals, both now and in the future.

In this blog, we will explore 5 of the key factors that you need to consider when choosing a data analytics service provider.

Let’s get to it.

#1 Assessing Internal Capabilities

On your journey to identifying the right data analytics partner for your business, the absolute first thing you need to do is assess your internal capabilities. This starting point allows you to identify what, if any, capabilities you can manage internally, and therefore specifically which areas you need to outsource. Ask yourself what talent, tools and technology exists within your current team and infrastructure, and whether current bandwidth permits your team to fulfil any of your business’s data needs.

This will help direct you towards either finding an end-to-end agency whose capabilities span a wide area, or a specialist agency who can simply bolster your internal skillset.

#2 Services Offered

Once you have established the level of support you need from an external agency, you can then match these requirements to the service offering of each agency.

Collate a list of candidate agencies and work through a checklist of the technical and analytical capabilities you’re specifically looking for. This will help you to identify the agencies whose offering aligns with your needs versus those whose specialisms and services aren’t well matched.

Whilst it can be easy to simply consider your current requirements, it’s imperative that you also consider what type of support you may need in the future, so you can opt for an agency that can provide long-term support.

#3 Expertise and Experience

Now that you have established the breadth of services provided from each agency, and disregarded those that are not closely aligned to your requirements, you need to consider the extent of experience and expertise they can provide for each area within their service offering. For example, an agency may claim that they have experience in AI and building predictive forecasting models, but to what extent?

Are they well equipped with the right expertise internally to give you full confidence in their ability to perform? And, what case studies, testimonials and demos can each agency provide to back this up?

Industry experience is also a critical factor to consider here. Does their experience specifically relate to your industry, demonstrating that they can not only deliver on the project, but can also provide an in-depth understanding of the meaning behind your data as well as the context behind the insights?

Similarly, you should consider their level of experience with companies similar in size and scale to your own. Do they usually partner with smaller businesses, or are they well-versed in working with larger organisations, and can they therefore appreciate the complexity of internal processes and the varying requirements of stakeholders across the business?

After all, the partnership will look vastly different with a smaller company vs a larger global company with numerous project streams and vested stakeholders.

#4 Analytical Capabilities

Effective data analytics relies on advanced analytical capabilities, and it’s important that the analytical capabilities of the data service provider match your analytical requirements. It’s therefore imperative that you assess the provider’s proficiency in certain areas of analytics, such as:

  • Statistical analysis
  • Data modelling
  • Visualisation tools
  • Cloud infrastructure
  • Data mining
  • Machine learning and AI
  • Advanced analytics

Whilst you need to evaluate their core skills, the agency’s analytical capabilities should span far beyond basic reporting.

Can they provide sophisticated insights, spot patterns and trends in your data and provide business and industry context to further aid strategic decision-making?

Further to this, you need to establish each service provider’s level of technical capability. This goes beyond standard analytics: it’s their ability to build and maintain the infrastructure that sits behind your data.

#5 Tools and Technology

Another key consideration is the tools and technology that the agency uses for data analytics. They should be proficient in working with the latest data analytics software, programming languages, machine learning frameworks, and visualisation tools. Are they ahead of the curve when it comes to new technologies within the data and analytics industry?

It’s also essential to ensure their platform expertise is compatible and aligns with the tech stack you’re already using internally. For example, if your organisation currently uses Microsoft products and is now in need of a business intelligence solution, then Microsoft Power BI is probably the most suitable tool for you to use. Selecting an agency that only specialises in Tableau may not be the optimal match in this case. You also need to take into account any pre-established preferences you have regarding software stacks, to identify whether they align with the agency’s capabilities.

Summing Up

So, there we have it, 5 key areas you need to consider when assessing which data analytics service provider is right for you and your organisation. However, this is just scratching the surface – choosing a data analytics service provider is no small feat, and so there are many more factors you need to consider when searching for the right agency to meet your challenges and build a long-term partnership with.

We have created an in-depth guide outlining the main 12 considerations – think of it as the core criteria you should be using to guide you on your search.

Download the eBook here to access this intel, or alternatively feel free to get in touch with us if you’d like to discuss how we can support you with your data needs.


AI and Ecommerce – A Powerful Partnership For Growth

Ecommerce is older than the internet. Yes, we scratched our heads over that one too, but it’s a fact that eCommerce started in the 1970s with teleshopping, while the internet’s official birthday isn’t until January 1, 1983. Still, the history of eCommerce and the internet is closely connected, with the web providing the technologies for eCommerce to thrive.

In this blog, we’ll bring the story of eCommerce up to date, highlighting the challenges that eCommerce professionals face today in a crowded marketplace, and how AI can help you overcome these challenges to increase sales.

We’ll also share our Top 5 AI benefits and flag up a couple of techniques that you can discuss with your AI team to immediately boost your eCommerce performance.


Recent Ecommerce History

eCommerce may have been around for 40-something years, but it’s only in recent times that people have really embraced it. Sure, the internet was a boost, but it was the pandemic that caused the current explosion, driving 40% of UK shoppers to spend more online by March 2020, with this figure rising to 75% by February 2021.

What’s more, there’s certainly been no going back to the way things were. More than a quarter of UK consumers stated they expected to shift more of their shopping online post pandemic and four out of every five UK consumers today are now digital buyers.

 

The Challenges Of Ecommerce Today

No-one would question that the eCommerce pie is bigger than ever; however, the number of businesses trying to get a slice of that pie has grown just as much.

A quick look at the Office for National Statistics’ figures shows 79,000 more eCommerce websites registered in the UK in 2021 versus 2020. One estimate puts the UK at 1.24 million eCommerce websites in 2023, second only to the United States.

 

How AI Can Help Ecommerce Overcome Its Challenges

With so much negativity around AI right now, it’s refreshing to see how gung-ho the whole eCommerce world is about this technology. But who wouldn’t be happy if AI could generate 20% additional eCommerce revenue and reduce costs by 8% in today’s tough business climate?

 

The Top 5 Applications For AI in Ecommerce

So where does AI fit into eCommerce? Well, AI helps companies optimise the customer experience and increase operational efficiencies end-to-end.

Here are 5 ways that AI can transform your eCommerce operation:

#1 Personalized product recommendations

It’s what digital buyers expect to see nowadays, and it can increase the ROI on your marketing spend by 5-8 times, according to McKinsey. However, it’s something that would be too expensive to do manually for a large customer base.

Using AI, you can automate the personalization process using algorithms that accurately predict buying behaviour based on historical customer data to increase cart size and drive revenue.
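To make that concrete, here’s a minimal, illustrative sketch of item-based collaborative filtering in Python – a toy purchase matrix and a cosine-similarity recommender, not a production system:

```python
import numpy as np

# Toy history: rows = customers, columns = products; 1 = purchased
purchases = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
])

# Cosine similarity between every pair of product columns
norms = np.linalg.norm(purchases, axis=0)
sim = (purchases.T @ purchases) / np.outer(norms, norms)

def recommend(customer, top_n=2):
    """Rank unpurchased products by similarity to the customer's history."""
    owned = purchases[customer]
    scores = sim @ owned
    scores[owned == 1] = -1  # never re-recommend what they already bought
    return np.argsort(scores)[::-1][:top_n]

print(recommend(0))  # product indices ranked for customer 0
```

Real recommenders work on millions of rows and typically use matrix factorisation or deep learning, but the principle – score unseen products against past behaviour – is the same.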

#2 Smarter Searches

In the same way AI can personalize recommendations, it can do the same for your searches. It means your eCommerce website can tailor search results based on criteria like a user’s previous searches and purchases. Hence, if a customer searches for ‘men’s clothes’, the results will include brands the customer has previously bought.

In addition, using AI-based natural language processing algorithms, your site’s search engine can learn which phrases and words are commonly used. This way, it doesn’t matter if the searcher doesn’t type the exact product name and uses a different term instead, like ‘blow dryer’ instead of ‘hair dryer’.
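As a toy illustration only (production search engines use trained language models, not hand-written rules), the synonym-plus-fuzzy-matching idea can be sketched with Python’s standard library:

```python
import difflib

# Toy catalogue and a hand-made synonym map – real systems learn these
catalogue = ["hair dryer", "hair straightener", "electric shaver"]
synonyms = {"blow dryer": "hair dryer", "blowdryer": "hair dryer"}

def search(query):
    """Resolve known synonyms, then fall back to fuzzy string matching."""
    query = synonyms.get(query.lower(), query.lower())
    return difflib.get_close_matches(query, catalogue, n=1, cutoff=0.5)

print(search("blow dryer"))  # → ['hair dryer']
```

Even misspellings like “hair drier” resolve to the right product here, which is the behaviour NLP-based search delivers at scale.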

#3 Smart Logistics and Warehousing

Stock-outs are your worst nightmare, but overstocks are little better because of the associated costs. The beauty of AI is that it can help you calculate the right amount of product that should be in stock at any given time.

Furthermore, when AI is used in logistics, it can help your company analyse existing routing for optimisation. Going a step further, the predictive capabilities of AI can also help with your basic warehouse maintenance, tracking the performance of the machines supporting your warehouse, so you can plan the most advantageous maintenance schedules. 
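For a flavour of the maths involved, the classic reorder-point calculation combines expected lead-time demand with a safety-stock buffer. AI systems estimate these inputs from data rather than assuming them; the figures below are made up for illustration:

```python
import math

def reorder_point(daily_demand, demand_std, lead_time_days, z=1.65):
    """Stock level that should trigger a reorder.

    z = 1.65 targets roughly a 95% service level, assuming
    normally distributed daily demand.
    """
    safety_stock = z * demand_std * math.sqrt(lead_time_days)
    return daily_demand * lead_time_days + safety_stock

# 40 units/day on average, std dev 10, 7-day supplier lead time
print(round(reorder_point(40, 10, 7)))  # → 324
```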

#4 Demand Forecasting and Dynamic Pricing

The two go hand in hand, with demand forecasting feeding into dynamic pricing to improve your pricing strategies. In this case, AI analyses market conditions, spots pricing gaps and recommends strategies to realise the opportunities.

There are different AI algorithms to support different pricing strategies. For example, eCommerce websites can access algorithms to maximise revenues, minimise customer churn rates, increase loyalty and beat competitors on price.
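As a toy example of the revenue-maximising case, suppose demand has already been modelled as a simple linear function of price (the parameters here are invented for illustration); finding the best price is then a straightforward search:

```python
# Linear demand model: units sold = a - b * price (fitted elsewhere)
def best_price(a=500.0, b=10.0):
    candidates = [p / 2 for p in range(20, 81)]  # candidate prices £10–£40
    revenue = {p: p * max(a - b * p, 0) for p in candidates}
    return max(revenue, key=revenue.get)

print(best_price())  # → 25.0, matching the analytic optimum a / (2b)
```

Algorithms aimed at churn reduction or loyalty would optimise a different objective over the same kind of model.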

#5 AI Assistants and Chatbots

Aren’t they the same thing? The boundary separating the two may be a bit blurry, but really, they deliver assistance in different ways: Virtual Assistants handle multiple kinds of tasks, and Chatbots tend to engage more with customers.

Chatbots enable conversational commerce and can engage passive visitors through natural language understanding that launches conversations to learn people’s requirements and to guide them to relevant products. Virtual assistants can do things like handle data-sensitive tasks and provide customer support via phone, email or chat.

 

Your Top AI Ecommerce Techniques

Ratcheting up the techie side of this blog a bit, we wanted to share some examples of AI techniques that are relevant and that you can use. Your AI team will probably be familiar with them too.

Logistic Regression

Logistic regression is a kind of statistical analysis for predicting the likelihood of a binary outcome. For eCommerce, it can predict the probability of a customer making a purchase – for example, given parameters x, y and z, would a promotion persuade them to buy?
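Here’s a minimal sketch of the idea, fitting a logistic regression by gradient descent on made-up purchase data (in practice your team would reach for a library such as scikit-learn):

```python
import numpy as np

# Toy data: features = [site visits, items in cart], label = bought (1/0)
X = np.array([[1, 0], [2, 1], [3, 1], [4, 2], [5, 3], [6, 3]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1], dtype=float)

# Fit weights by plain gradient descent on the logistic loss
w, b = np.zeros(2), 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(X @ w + b)))  # predicted purchase probabilities
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * (p - y).mean()

# Probability that a customer with 5 visits and 2 items in cart will buy
prob = 1 / (1 + np.exp(-(np.array([5, 2]) @ w + b)))
print(f"purchase probability: {prob:.2f}")
```

The output is a probability rather than a hard yes/no, which is exactly what you want when deciding who should receive a promotion.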

Clustering

Here the algorithm organises objects into groups based on multiple variables. It can group customers based on purchasing patterns; bunch physical stores together based on performance; and bundle products together based on the same criterion. The process takes you to a deeper level of segmentation, identifying new collections of like-minded people to reach out to.
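A minimal sketch of the idea, running Lloyd’s k-means algorithm on made-up customer data (in practice a library does this for you, but the loop shows what ‘grouping on multiple variables’ means):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy customer features: [annual spend (£k), orders per year]
customers = np.vstack([
    rng.normal([2, 3], 0.5, (20, 2)),    # low-spend, occasional buyers
    rng.normal([10, 25], 1.0, (20, 2)),  # high-spend, frequent buyers
])

# Seed with one customer from each end of the data, then iterate:
# assign each customer to the nearest centroid, recompute centroids.
centroids = customers[[0, 20]].copy()
for _ in range(10):
    dists = np.linalg.norm(customers[:, None] - centroids, axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([customers[labels == k].mean(axis=0) for k in range(2)])

print(np.round(centroids, 1))  # one centre per customer segment
```

The recovered centroids sit close to the two underlying customer types, and each customer’s label tells you which segment to target.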

Sentiment Analysis

A classification algorithm, sentiment analysis reveals subjective opinions or feelings collected from many sources. You can use it for multiple objectives, including market research, precision targeting, product feedback and deeper product analytics. It can also boost customer loyalty, through improved customer service, helping agents resolve customer queries quicker.
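As a toy sketch of the classification idea, here’s a tiny naive Bayes sentiment classifier trained on four invented reviews – real models learn from thousands of labelled examples:

```python
import math
from collections import Counter

# Invented training reviews – purely illustrative
train = [
    ("great product fast delivery", "pos"),
    ("love it excellent quality", "pos"),
    ("terrible quality broke quickly", "neg"),
    ("awful slow delivery never again", "neg"),
]

# Word frequencies per class
counts = {"pos": Counter(), "neg": Counter()}
for text, label in train:
    counts[label].update(text.split())

def sentiment(text):
    """Pick the class with the highest log-likelihood (add-one smoothing)."""
    vocab = {w for c in counts.values() for w in c}
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = sum(
            math.log((c[w] + 1) / (total + len(vocab))) for w in text.split()
        )
    return max(scores, key=scores.get)

print(sentiment("excellent fast delivery"))  # → pos
```

Run over thousands of reviews or support tickets, the same scoring idea surfaces the unhappy customers your agents should reach first.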

 

The View From Jarmany

At Jarmany, we work closely with eCommerce professionals looking to improve the performance of their websites. We recently added a section dedicated to AI on our eCommerce solutions page to provide some insights that you may find helpful.

You may also find value in our eBook, The 5 Best Strategies to Boost eCommerce Sales, and our Ecommerce Intelligence Demo, which demonstrates what you can do with the right tools in place to keep track of your eCommerce performance.

What’s clear today is that eCommerce offers great opportunities but presents significant challenges; and that AI is helping businesses overcome the hurdles to make the most of this rapidly growing sales channel.

If this blog has triggered some questions, thoughts or ideas, speak to us today and let us see how we can get your eCommerce business on the path to a best-practice AI adoption.

To learn more about how AI can improve the performance of your eCommerce get in touch with Jarmany today and have an honest conversation with our AI experts.


 

The Frontier Model Forum; What Is It and How Will It Help Regulate AI?

Ingrained into our everyday lives through technologies such as facial recognition, digital assistants, and smart cars, the era of AI is well and truly upon us, and there are no signs of its substantial growth stagnating. In fact, the AI market size is projected to reach $407 billion by 2027, with an annual growth rate of 37.3% from 2023 to 2030 [1].

Alongside this, businesses are also recognising the potential of AI and are increasingly leveraging it to streamline their operations, enhance data-driven decision making through data analysis, automate repetitive tasks and improve customer services. To provide some context to this, according to Gov.uk, in the UK alone almost half a million businesses had adopted at least one AI technology in their operations at the start of 2022 [2]. 

And yet, whilst the AI industry has continued to advance and adoption has increased, there has been little development in the mitigation of AI-associated risks, despite the growing concerns about the cyber-security and regulatory compliance of artificial intelligence within organisations [3].

Now, don’t get us wrong, we’re not convinced we’re going to have an I, Robot situation on our hands any time soon; however, it cannot be denied that there are potential risks associated with the use of AI technology, and an urgent need for regulation to address these concerns.

This is where the Frontier Model Forum comes into play…


Introducing The Frontier Model Forum 

The Frontier Model Forum (FMF) is a newly announced partnership aimed at promoting the responsible and safe development of AI models. 

Formed by Microsoft, Google, OpenAI and Anthropic, this new industry body has set out to cover four core objectives: 

  1. Advancing AI safety research
  2. Identifying best practices
  3. Collaborating with policymakers, academics, civil society and companies
  4. Supporting efforts to develop applications that can help meet society’s greatest challenges 

Whilst these four tech giants have founded the FMF, their aim is to establish an Advisory Board by inviting member organisations to contribute towards its strategy and priorities. Organisations that wish to join the forum will need to meet the following membership criteria:

  • Develop and deploy frontier models (large-scale ML models that are capable of performing an extensive range of tasks that go beyond what is currently possible with even the most advanced existing models)
  • Demonstrate strong commitment to frontier model safety 
  • Be prepared to contribute towards advancing the FMF’s efforts

The aim of the Frontier Model Forum is then to leverage the collective technical and operational knowledge of its member companies to benefit the overall AI ecosystem. This includes driving progress in technical evaluations and benchmarks, as well as creating a public repository of solutions to promote industry best practices and standards. Through these collaborative efforts, the Forum seeks to contribute to the advancement and development of the AI industry as a whole. 

“Companies creating AI technology have a responsibility to ensure that it is safe, secure, and remains under human control. This initiative is a vital step to bring the tech sector together in advancing AI responsibly and tackling the challenges so that it benefits all of humanity.” Brad Smith, Vice Chair & President, Microsoft.

 

Our thoughts

From our perspective, AI presents a range of risks – job displacement, security & privacy concerns, bias and discrimination, to name a few. However, we believe the primary concerns related to AI revolve around the absence of regulation and the lack of clear guidelines. This is why we consider the launch of the Frontier Model Forum to be a highly encouraging and indispensable development which will help to mitigate risks, establish industry-recognised standards and reduce potential negative social impact.

By bringing together experts and industry leaders, it will foster a collective effort to:

  • Reduce potential negative impact
  • Safeguard society’s interest
  • Ensure the responsible and ethical use of AI 

The Frontier Model Forum has the potential to shape the future of AI in a way that minimises risks, enhances transparency, and creates a more secure and accountable environment for AI development and deployment. That way, we can continue to reap the benefits made possible by AI and unlock further progress in the field, all whilst effectively managing the associated risks.


  1. https://www.forbes.com/advisor/business/ai-statistics/
  2. https://www.gov.uk/government/publications/ai-activity-in-uk-businesses/ai-activity-in-uk-businesses-executive-summary
  3. https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-state-of-ai-in-2022-and-a-half-decade-in-review 

Are You AI Ready? Build Your Framework for Success

At Jarmany, we’re conscious of all the media hype around artificial intelligence (AI), and how the discourse has been mixed, to say the least. The idea that general-purpose AI will be the biggest event in human history may feel like hyperbole, and it’s probably too early to call. But what’s certain is that it’s going to change all our lives and is already transforming business.

In this blog, we want to touch on the opportunity that AI presents organisations, but more importantly we’ll get into what you need to do to make sure your business can leverage AI to the max.

What is the business opportunity of AI?

Just so we understand what we’re talking about here—AI will have made the world $15.7 trillion richer by 2030[1]. It will also have given a 26%-plus boost in GDP for local economies by the same date.[2]

Those figures may actually be conservative bearing in mind how quickly AI and its adoption is advancing, but regardless of how many trillions-of-dollars AI generates, there’s plenty for business to get excited about. Indeed, McKinsey found that way back in 2021, 27% of the companies it spoke to in an AI-related survey said 5% or more of their profits were already down to AI.

 

The difference between generative and non-generative AI

So what do we mean by AI? There are actually two kinds: Generative AI, which produces new content, like chatbot responses, that imitates human creativity; and non-Generative, or predictive, AI, which forecasts outcomes based on patterns in historical data.

It’s generative AI and ChatGPT from OpenAI in particular that’s been grabbing all the headlines recently, which is unsurprising since Microsoft pumped a massive $10 billion into the continued development of this natural language processing tool back in January.

In practice, Generative will work alongside non-Generative, and in unison at times to enhance outcomes. Right now, these two types of AI are revolutionising businesses, from sales & marketing departments, to logistics and inventory, accounting & finance and human resources.

Whether it’s boosting efficiency by removing repetitive tasks like writing emails or summarizing large documents; or improving supply chains by showing how much of anything should be stored where and when, AI is there to give your business an edge.

 

How difficult is it to use AI in a business?

You won’t be surprised to learn that successful adoption of AI depends on how much effort you put in beforehand. There are plenty of challenges in making AI work for a company, but for every issue there is a solution, and we’re going to walk you through the key ones now.

We recommend establishing an AI Framework for Success. Treat it as a checklist to go through, learn and share with colleagues, so everyone interested in making AI a success is aligned. Remember, AI adoption is a team game and you don’t want anyone across the company going off-piste.

 

The Jarmany AI Framework for Success

We’re going to split the framework broadly in two. There are the structural parts that you have to get right, covering data, architectures, legal requirements and skillsets for example. Plus, there are the softer parts, which cover things like sensitivities and ethics.

 

AI Framework for Success—1st Phase:

Time to make sure you have the correct foundation for AI:

What’s your AI mission statement?

Sounds obvious, but you’d be shocked by the number of companies we’ve come across that launch into AI without a clear vision of the revolution’s ultimate goals. Get together, agree and write down what you want AI to achieve for the business. Decide what you want the main benefits to be—enhance user experience, improve topline revenue or reduce internal costs?

 
Check your data quality

You need to audit your current data sources to ensure you have enough data and that it’s in the right place, clean enough and essentially fit-for-purpose. It’s worth spending a moment on this because you also need to consider how accessible your data is. Your systems data needs to be able to flow freely in order for AI to work. The last thing you need are data silos.

 

Do you have enough performance?

Along with your data, you need to audit your infrastructure to find out whether you have the basic computing capabilities to process large amounts of data for AI. Sure, the availability of AI services on public clouds like Azure offering massive amounts of compute and storage can help you here but see what you have in-house before you take that step.

 

Who is on the AI team?

We all know how labour shortages are hurting IT at the moment, so you need to count the number of hands you have available for your AI taskforce. If you’re short, then we recommend training for those who want to join up and, more for the longer term, think about bringing in AI specialists.

 

AI Framework for Success—2nd Phase:

You’ve put a check against everything structural, so now it’s time to move into the second, softer phase, which is just as important.

Data governance, ethics and bias?

Governance is going to need some thought because to train AI algorithms, for example, you need large quantities of data, making storage and security of major importance.

Racial and gender biases are also a known problem with AI unless work is done to iron out discriminatory assumptions in algorithms, often associated with low-quality data. Set down standards that will help control the problem, and check out the UK Government’s white paper on its approach to AI regulation and the EU’s AI Act for guidance.

 

Deal with employee concerns

Your personnel will have legitimate worries over how AI is going to impact them. The question of whether they will lose their jobs is the elephant in the room that you’ll need to address first and foremost. You need to correct many of the negative assumptions about AI and communicate the benefits, reinforcing that it will remove mundane, repetitive and manual tasks, freeing them up to work on more interesting stuff.

 

Walk before you run

Everyone comes to AI nowadays with preconceived ideas – and it’s most likely that internal stakeholders will have massive expectations for AI in general. After all, they read the news, right? While it’s great to have high-level interest in a project, you have to manage people’s expectations at the start.

Therefore, consider a proof of concept to test that your AI model is working before going big. Use just a small sample of data to demonstrate the model’s effectiveness to the people that really matter before launching anything wide scale across the business.

 

Summing Up

With so much excitement around AI – and its transformative power for business – we could forgive anyone for not wanting to hold things up with questions like ‘Are we AI ready?’, because quite frankly that’s incredibly boring, and who wants to be a killjoy?

But asking that question and following a framework like the one we’ve shared is incredibly rewarding in the long term and is the best way to get the most out of your AI investment.

Still, even with your AI Framework for Success, the time and expertise needed to get everything lined up can be a challenge; and so, at Jarmany, we’ve created a team of AI specialists that can deliver AI in the most time-effective and cost-efficient way possible.

If this blog has triggered some questions, thoughts or ideas, speak to us today and let us see how we can get your business on the path to a best-practice adoption of AI.


[1] https://www.pwc.com/gx/en/issues/data-and-analytics/publications/artificial-intelligence-study.html

[2] ibid.

Microsoft Fabric 101: The Comprehensive Analytics Solution for Businesses

Microsoft, a leader in the technology industry, recently announced the launch of Microsoft Fabric, a comprehensive analytics solution that promises to revolutionise the way businesses store, manage and analyse their data; in turn, streamlining their data processes so businesses can extract timely and valuable insights, more efficiently. 

In this blog, we will take a closer look at Microsoft Fabric and explore its features and benefits as well as discussing our thoughts. So, whether you’re a data scientist, analyst, or business leader, we’re here to demonstrate how it can help you unlock the full potential of your data.  

So, let’s get to it. 

What is Microsoft Fabric?

Microsoft Fabric is a comprehensive, all-in-one data analytics solution that encompasses a whole suite of data services, including data engineering & transformation, data science, real-time analytics, and business intelligence. It brings together the suite of existing products within the Microsoft stack, such as Data Factory, Power BI, and Synapse, to deliver a seamlessly unified experience that serves your end-to-end analytical needs. 

By integrating a variety of different data services, Fabric offers a simplified user experience which can be customised based on each business’ needs and therefore eliminates the need for multiple vendors. It also enables businesses to centralise their admin and governance whilst providing users with a familiar and easy-to-learn experience.


What Are The Key Features?

#1 Data Lake

One of the key features of Microsoft Fabric is its data lake, also known as OneLake.  

OneLake provides a centralised repository for all enterprise data and is the foundation of all services available on Fabric. By providing a unified storage solution, data scientists and analysts can more easily access and analyse data from various sources, including structured, semi-structured, and unstructured data.

Microsoft Fabric’s data lake is designed to handle massive amounts of data, making it an ideal solution for businesses with large volumes of data, whilst also simplifying the management of big data. 

#2 Data Engineering

Another important feature of Microsoft Fabric is its data engineering capabilities. With Microsoft Fabric, businesses can design, build and maintain infrastructures, allowing them to more easily transform and process their data, in turn making it easier to analyse and derive insights.  

Additionally, Microsoft Fabric provides a range of other data engineering capabilities, including: 

  • Creating and managing data lakehouses 
  • Designing data pipelines that feed into your lakehouse
  • Using notebooks to write code for data ingestion, preparation and transformation 

All in all, these engineering capabilities allow businesses to better prepare their data for analysis. 

#3 Business Intelligence

Microsoft is already widely known for its popular business intelligence and data visualisation tool, Power BI, so it will come as no surprise that real-time analytics and BI have been incorporated into the features of Fabric.

This capability enables users to: 

  • Monitor and analyse data in real-time 
  • Build interactive dashboards 
  • Manage ad hoc reporting 
  • Implement predictive analytics 
  • And much more. 

This feature helps businesses to gain real-time valuable insights into their operations so they can make more informed decisions and can respond quickly to changes in the market.

#4 Co-Pilot and Data Activator

Another exciting feature of Microsoft Fabric is the integration of the newly announced Copilot and Data Activator. 

Copilot is Microsoft’s new artificial intelligence tool that can aid productivity by automating repetitive tasks, writing code, creating visualisations, summarising insights, and much more.  

Data Activator is a no-code tool for analysing data and then automating alerts & actions off the back of those insights. This could include notifying sales managers when inventory dips below a certain threshold, alerting finance teams when a customer is in arrears with their payments, or automatically creating support tickets if an error is triggered.
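Data Activator itself is no-code, but the pattern it automates – watch a metric, fire an action when a condition is met – is easy to picture. The sketch below is purely illustrative; the rule and action names are ours, not Microsoft’s:

```python
# Hypothetical rule engine illustrating the Data Activator pattern
def check_rules(event, rules):
    """Return the actions triggered by a single data event."""
    return [rule["action"] for rule in rules if rule["condition"](event)]

rules = [
    {"condition": lambda e: e.get("stock_level", 999) < 20,
     "action": "notify_sales_manager"},
    {"condition": lambda e: e.get("days_overdue", 0) > 30,
     "action": "alert_finance_team"},
]

print(check_rules({"sku": "A1", "stock_level": 12}, rules))
# → ['notify_sales_manager']
```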


Our Thoughts on Microsoft Fabric

Now that we’ve explored some of the key features of Microsoft Fabric, we’re going to give you the run-down of what we think of this new unified platform.

The Benefits

  1. One interface to access all components of Fabric
  2. Existing knowledge of Microsoft products can be utilised 
  3. Strong, centralised governance of data access 
  4. Git integration for robust source control 
  5. Simplified billing

Whilst we’re big fans of the Microsoft technology stack, we won’t deny that there are a few rough edges that need ironing out before Microsoft Fabric has our full backing.

Firstly, the application has a few bugs which impact the user experience – no doubt due to the sheer number of integrated services and level of capacity, but something we imagine will be resolved as uptake increases and it’s phased out of preview.

Whilst the promise of exciting AI features is enticing, a lot of these features are not yet available which is a little disappointing given the current AI-hype and market-eagerness to leverage these types of tools. 

Lastly, stand-alone Microsoft Fabric is currently only available on a pay-as-you-go basis, making it a more expensive and therefore less feasible option for businesses that are more price sensitive. Later this year, ‘reserved capacity’ SKUs are due, which will bring down the cost of dedicated compute resources.

Get In Contact

Overall, Microsoft Fabric is a great unified analytics solution if you’re looking for a system that offers a suite of services for data processing, analysis, and visualisation, all in one place. And, with features like Copilot, Data Activator and the integration of Power BI, there’s no doubt it will make it much easier, and more streamlined, for businesses to extract valuable insights from their data. Microsoft Fabric is certainly something we’ll be keeping our eye on as it’s phased out of preview and more readily available. 

If you’d like to find out more about Microsoft Fabric, or how you can leverage other Microsoft products to advance your data capabilities, then get in touch with the team today. 


 

AWS vs GCP vs Azure: A Data Platform Comparison Guide

The name data platform couldn’t be more mundane, but it would be a mistake to judge this technology by what it’s called. Ingesting, processing, analysing and presenting huge quantities of information—data platforms are turning around the fortunes of many organisations today and helping them thrive in some pretty tough markets.

In this blog, we’re going to get into which cloud is best for your data platform. We’re not going to debate whether cloud is your best option, because quite frankly we’re discounting an on-premises infrastructure from the start.

What we’re going to do is help you figure out which of the Big 3—Amazon Web Services (AWS), Google Cloud Platform (GCP) and Azure—is right for your data platform. And even give you an alternative to boot if none of the three cuts it.

Let’s get started.

What are the advantages and disadvantages of the Big 3 Clouds?

So what are the pros and cons of AWS, GCP and Azure? Before we answer that let’s make a couple of things clear. If you approach that question by going through the Big 3 service-by-service, you’re wasting your time.

It’s a mistake because by focusing on each cloud’s services capabilities, you’re missing the bigger picture and may end up having to back-track and rethink your original choice further down the line. You’ll see why later.

The Big 3 Defined

Amazon Web Services (AWS)

Part of Amazon, AWS has more than one million active users and offers more than 200 fully featured cloud services. It accounts for 41.5% of the cloud market and has 5x more cloud infrastructure deployed than its 14 leading competitors combined. In people’s minds, it stands out for AI and ML services. Azure might wonder where that idea comes from, but really there isn’t a cloud that does it better in these areas than AWS.

Google Cloud Platform (GCP)

GCP is the smallest of the Big 3 with 9% cloud market share. Despite being the smallest, its revenue growth is healthy, and has consistently been up to 45% per annum. In addition, its global network is one of the biggest. You get seamless integration with all Google products and it packs a fully managed data warehouse, called BigQuery, which is highly rated and could be a central part of your data platform.

Microsoft Azure

If we renamed Azure, the Microsoft Cloud, you’d get an instant feel for what we’re talking about here: It’s Microsoft’s own public cloud offering; and it’s growing fast. It’s crucially important to Microsoft, delivering revenue of $28.5 billion—up by 22%—in the company’s third quarter results, released in April 2023. It offers everything a data platform could need and is well-known for being simple to work with.

How do I distinguish between the Big 3?

Had we created this blog 8 years ago, you would have seen the word maturity dotted around in a number of places. Back then, people spoke about some of these clouds being more mature than others; and hence offering a broader range of services to meet a company’s specific needs.

Maturity is no longer relevant, and trying to separate the Big 3 on their service offerings – unless your business is very niche – isn’t worth it.

When it comes to compute power, data storage options, networking, security and compliance, all of the Big 3 have what you want. They all offer tonnes of services—many of which you’ll probably never need.

Location, however, could be an issue. Depending on your industry, you’ll need to comply with a host of regulatory standards around cloud usage, one of which is where your data is situated.

That may sound odd because we’re talking about global cloud providers and thus your data will be everywhere, right? Correct, but while access is ubiquitous, your data will be stored on physical devices somewhere out there—and it’s where those devices sit that counts.

Hence, you need to check where the AWS, GCP or Azure data centre is located that will be storing your data and then you’ll know if that cloud is the one for you. The good news is that all the Big 3 are really up on the regulatory needs of multiple industries, including public sector, and they have teams that can provide you with all the information you need to know if you’ll be on the right side of your industry’s watchdogs.

The Big 3’s key points of difference

There is a way to think about AWS, GCP and Azure so you can start to draw lines between them. Sure, these are going to be very broad statements, but they are no less true for being light on detail:

  • AWS – the best place to build and run open-source software.
  • GCP – a great choice if you’re already using solutions within the Google Stack.
  • Azure – integrates seamlessly with your existing Microsoft technology.

Perhaps that’s all you really need to know. Maybe you can stop reading here. What’s certain is that these points are going to have a bearing when we get more into the details.

The Pros and Cons of AWS, GCP and Azure

With our broad brushstrokes in place, we can now start focusing the discussion a bit more on the advantages and disadvantages. We’ll show you how to properly evaluate each cloud, based on the premise that they all have the infrastructure, compute, storage and networking you need.

  • Legacy Investment – this is such a crucial point—and so often overlooked—because if you’re heavily invested in Microsoft or Google, it makes no sense whatsoever not to leverage all that legacy.
  • Skillsets – this really builds on the previous bullet, because if, for example, you have the Microsoft skills already in-house, then adopting and working with a cloud like Azure is going to be much easier and less costly in terms of training. Of course, the same argument applies to AWS and open-source. Therefore, you need to audit what skills you have internally as part of the decision-making process.
  • Community – a reflection of their size, both AWS and Azure have much larger online communities than GCP. These communities provide advice and resources to resolve challenges and boost developers’ skillsets. The Azure Community, for example, has approximately 182,000 members, and Microsoft employees regularly participate in its online forums.
  • Politics – no we’re not joking; politics does play a role in any cloud decision. It doesn’t always happen, but we often see senior managers having an emotional connection with certain platforms, often Azure, since their experience of Microsoft goes back years. So which way does the wind blow in your company? AWS, GCP, Azure? What’s your sense?

Are AWS, GCP and Azure my only options?

We focused our blog on the Big 3 because they are the ones the vast majority of businesses choose from. Nevertheless, they aren’t your only options.

Ask your IT team about a Modern Data Stack as an alternative to the Big 3 and see what members say. A Modern Data Stack is an assembly of software tools and technologies running across different cloud platforms to collect, process, store, and analyse data.

To be honest, the idea has been around for more than a decade and it’s often used for niche cloud projects; however, a modern data stack comes with a sense of freedom. What we mean by that is you’re getting the independence to run a particular workload on a particular cloud. Your IT team chooses whichever one is best suited to the job you want to do.

Parting thoughts

On balance, and based on our experience, we think you have to go a long way to beat Azure. It fits so well with legacy Microsoft infrastructures. There’s nothing that AWS and GCP pack that Azure doesn’t, unless it’s for something niche that probably wouldn’t be relevant to your business anyway.

Indeed, Azure carries Microsoft’s DNA, which makes it easy to learn and intuitive. There’s generally less coding required. What’s more, the whole community thing continues to grow so the support is out there if you need it, both in terms of gazillions of documents and online forums.

Boiled down to just three things, Azure is great on price, ease of use and ease of integration. Not bad really.

We hope this blog proves useful in helping you choose the right cloud for your platform. That said, our team of consultants at Jarmany is available to continue the conversation and give you a deeper insight into the Big 3 and how to find the cloud that is best for your business.

Talk to us today and have an honest conversation about how to select the right cloud for your data platform.


 

Microsoft’s Annual Build Conference: The Key Announcements

It may come as no surprise that there was a particular focus on generative AI, ChatGPT, and leveraging OpenAI’s capabilities, with Microsoft aiming to enhance its offerings and maintain its market-leading position. However, these developments raise concerns about a potential single-source dependency, prompting speculation about the acquisition of OpenAI by Microsoft.  

In this blog post, we will delve into Microsoft’s 5 key announcements.

#1 Copilot: Microsoft’s Generative AI Assistant

Microsoft unveiled Copilot, an innovative feature that incorporates generative AI technology into its core operating systems and Office 365 products. Copilot acts as an assistant within Office apps and also resides as a taskbar button, assisting users with various tasks on their PCs. While the demos were impressive, it will be interesting to see how this performs in the real world and whether it will be widely accepted and utilised, or become the next generation of ‘Clippy’ for the AI era.

#2 Bing & ChatGPT: Augmenting Knowledge With Bing Search

ChatGPT’s main limitation lies in its knowledge being restricted to information before September 2021. To address this issue, Microsoft plans to integrate Bing Search with ChatGPT, allowing the search results from Bing to supplement ChatGPT’s responses and keep it up to date. Additionally, Microsoft aims to ensure interoperability between ChatGPT plugins and Bing, enabling integration of the results. Although similar to Google’s approach with Bard, the vast user base of ChatGPT suggests the potential for a significant increase in Bing Search usage. 
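The pattern described above – supplementing a language model’s answer with fresh search results – can be sketched in a few lines. This is a hedged illustration of the general idea, not Microsoft’s implementation; `search_bing` and `ask_model` are hypothetical stand-ins for the real APIs:

```python
def build_augmented_prompt(question, snippets):
    """Prepend retrieved search snippets so the model can answer
    with information newer than its training cut-off."""
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Answer the question using the search results below.\n"
        f"Search results:\n{context}\n\n"
        f"Question: {question}"
    )

# Hypothetical usage (these two functions are illustrative, not real APIs):
# snippets = search_bing("latest Power BI release")
# answer = ask_model(build_augmented_prompt("What's new in Power BI?", snippets))
```

The key design point is that the model never needs retraining: fresh knowledge arrives at query time, inside the prompt.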

#3 Azure AI Studio: Building Custom Models and Ensuring Safety

Microsoft introduced Azure AI Studio, a platform that empowers developers to build their own models and create functionalities on top of them. This initiative also emphasises the importance of AI safety, allowing developers to test applications and mitigate any potential issues that may arise. 

#4 Microsoft Fabric: A Complete End-To-End Analysis Platform

Microsoft Fabric, a direct competitor to Snowflake, offers a comprehensive solution for data engineering, storage, warehousing and analytics. Fabric introduces OneLake (a centralised, simplified storage service), Data Activator (a system for building complex, data-driven alerts) and the integration of Copilot into Power BI to help build eye-catching reports from natural language prompts.

#5 Single-Source Dependency and the Potential Acquisition of OpenAI

Microsoft’s commitment to infusing generative AI across its product range is a strategic move aimed at reclaiming market share from Google in productivity and search. However, this strategy also poses a significant risk—a single-source dependency on OpenAI. If OpenAI were to cease supplying Microsoft with technology, it could impact the company’s core business and profitability, leading to a potential decline in its share price. Consequently, acquiring OpenAI becomes a critical consideration for Microsoft to mitigate this risk. 

 

Overall, it’s clear that Microsoft continues to make strides in AI, with the integration of OpenAI’s technology into its products standing testament to this and demonstrating the value Microsoft is placing on this type of technology. With millions of users every day across its suite of products and services, this focus on AI holds promise for enhanced functionality and an improved user experience, and we can’t wait to see it evolve further.

Discover more about Microsoft Build here.


 

How To Build A Data Strategy: The Framework To Success

Many companies are sitting on mountains of data and information, but few are extracting the gold that lies within it, which we think is crazy. In this blog, we’re going to show you how you can maximise its benefit to allow your business to thrive.

You’ll learn that every successful data-led organisation is built on an effective data strategy. We’ll explain:

  • What a data strategy really is
  • The benefits of having a data strategy
  • Why you really should have a data strategy
  • Jarmany’s 5 steps to building an effective data strategy
Let’s get to it.
 

What actually is a data strategy?

A data strategy is basically a plan that, if implemented properly, allows you, as a business, to leverage the power of all the data and information at your disposal quickly and effectively. That power then enables the business to make the most informed decisions possible and act quickly to maximise commercial performance.

Sounds simple, but the difference between a great data strategy and a poor one can have a massive impact on your business. Research shows that businesses with a strong data strategy can perform over 2.5x better than those with a poor one.1

What are the benefits of having a data strategy, and why should you have one?

You might say to yourself: “I already have loads of data so surely I just need to take a quick look at it and it will give me the answers I need to run my business” …if only life was that simple!

When we start working with our clients, we often see that they are facing a variety of challenges, including:

  • Incomplete and untrustworthy data which results in more arguments than insights
  • Inadequate data cleansing compounding already questionable data
  • Inefficient data management processes slowing down their speed of decision making
  • Insufficient use of available 3rd party data that will give colour and relevance to your internal 1st party data
  • An over-reliance on human beings, rather than technology and AI, to do relatively simple and mundane tasks. (A machine will never get bored of these tasks, will often do them better, and will be quicker, with far fewer mistakes or human error).

Once you have your data sorted so it’s clean, accurate, timely and in a format where you can readily understand and interpret it, you need to ask yourself what’s next and how can you use this information?

You’ll be surprised by how many instances there are where good data and insights can help turbo charge your business. Below is a small subset of the main areas where a data-driven business can drive a massive commercial advantage:

  • Increased Sales – a cohesive data strategy can help you identify opportunities to optimise marketing efforts. Businesses that strategically use data to inform business decisions can outperform their peers in sales growth by 85%.2
  • Increased Profits – this can be achieved by streamlining operational logistics and through cost analysis. According to the Business Application Research Center (BARC), data-driven sales reduced the overall cost of operations by 10%.3
  • Greater client satisfaction – Businesses that personalise the customer experience using data can increase the customer lifetime value by 2.5x on average.4
  • Decreased Risk – this can be achieved through better management of regulatory requirements and data breaches. According to IBM the average cost of a data breach in 2022 was $4.35 million and 83% of organisations reported more than one breach.5

Jarmany’s 5 steps to building an effective data strategy

A data strategy is essentially a plan that allows you to quickly and effectively leverage the power of all the data and information available to you as a business. In turn, this allows you to make the best business decisions to drive growth and operational efficiencies.

We’ve consolidated the core steps you need to take to help you define your data strategy:

1. Define the questions that need to be answered to allow the company to meet its strategic objectives and respond to tactical challenges. This could be based on goals relating to revenue growth, increased profit, market share growth or cost reduction.

2. Define the gaps between what you have today and where you want to get to. In particular, you need to consider the following 4 areas:

    • Data – Do you even have all the raw data you need? Are you set-up to collect the data from your business operations required to make the right decisions? Are you maximising the benefit of 3rd party data sets that are available to you? Do you have the right quality, breadth and depth in your data?
    • Technology – What data technology do you already have in your tech stack? Does it have the functionality to complete the tasks required by your business? Are you restrained in your options by significant previous investments in certain tech stacks (Azure, GCP, AWS)? Finally, are you making the most of the recent advances in technology, in particular AI? (Whilst this last question is key to consider, you must always remember to have the enablers of AI in place, such as good data and a clear strategic need, to really leverage its true power).
    • Internal Capability – Do you have the right people with the right skills to enable you to leverage your investment in data and technology so you can transform that data into valuable information?
    • Culture – All of the above points are redundant if you don’t have an organisational culture that is programmed to accept that data needs to be an intrinsic part of the decision support structure. Ask yourself if you have buy-in from the right stakeholders and how you can embed a greater level of acceptance and interest towards data and data-driven insights from your organisation.

3. Define the plan – Once you have defined the objectives that need to be met and the current gaps you face it is important to create the plan to address them. Below are the key factors every good plan needs to contain:

    • Incremental wins – Better data and insights can start driving benefits for your business almost instantly. Therefore, no data strategy should wait until the transformation is 100% complete before launching. That could mean months of missed opportunity and eventually result in a flop. At Jarmany we think a staged delivery focus is best. We usually advise 3-month milestones that deliver specific commercial advantages which build on each other over time. This means you start getting a return on your investment sooner, and it allows you to flex the strategy if the needs of the business change. This approach significantly reduces the chances of ending up with a BI white elephant that isn’t fit for purpose.
    • Leverage previous investments as much as possible – Don’t reinvent the wheel or spend time and money in areas where you don’t need to, unless it results in greater commercial benefit. (New and shiny isn’t always best).
    • Spend money wisely – Technology, especially AI, is rapidly advancing so investing in the right tech could provide significant commercial advantages to your organisation. However, as always make sure the fundamentals are in place first. (Sometimes new and shiny is the right way forward).
    • Don’t neglect your people – Bring them on the journey and remind them of the benefits to them. Data is a support function, not a threat, and training can create your citizen data analysts.

4. Review progress – It’s important to constantly monitor the progress of the implementation of a data strategy. We always advise sticking to the 3-month cadence mentioned above: working in shorter sprints ensures everything stays on track and enables you to tweak the strategy when necessary.

5. Repeat the above – The needs of any business change over time, especially if it is going through a period of transformational change. Therefore, whilst we talk about working in 3-month sprints, we believe any data strategy should go through a deep review every 2-3 years. This gives you time to implement the strategy, but isn’t so long that the plan becomes irrelevant and misaligned with the changing needs and focus of the business.

What’s next?

So, there you go—a successful data strategy framework in five steps, as promised.

We don’t mind confessing to you that negotiating each step can be tricky if you don’t have enough experience and expertise at your disposal. Therefore, the wisest move can often be to work with experts who create data strategies for a living.

At Jarmany, we have the talent to support you in building and implementing a successful data strategy. We’ll help deliver your strategy, as well as collect and structure your data so it can be analysed and modelled to answer your business questions and deliver your business objectives as quickly and cost-effectively as possible.

Talk to us today and have an honest conversation about how to get your data strategy moving.


What Is Power BI? And How Does It Work?

Identifying trends and patterns from raw data is hard and has nothing to do with a person’s intelligence. But spotting those signs in shapes and colours is much easier and can be achieved surprisingly quickly. 

Therefore, the rise of data visualisation tools as part of the broader business intelligence (BI) world is no surprise. These tools not only speed up decision-making processes but also improve the decisions themselves, helping viewers interpret data more accurately.

All this brings us to Microsoft Power BI – the most complete data visualisation technology in the market, according to the Gartner Magic Quadrant for BI and Analytics Platforms 2023 – and something that millions of people are using every day to extract insights from within their data. Let us walk you through it.

What is Power BI?

Microsoft Power BI aggregates your data and then represents it visually for you to analyse and share. Forrester calls it Microsoft’s augmented business intelligence platform, infused with the power of AI (which we’ll get to later on).

In essence, Power BI is a collection of software services, apps and connectors. What that means is you can connect data from multiple sources across your business, including Excel spreadsheets, visualise it in a dashboard or a report, share with colleagues and uncover what’s important to you in no time.

Some common types of data visualisation:

  • Bar and column charts
  • Doughnut charts
  • Decomposition tree
  • Funnel charts
  • Gauge charts
  • KPIs

What makes Power BI different from other BI solutions with data visualisation tools? Ask a senior consultant who works with Power BI and has experience of other solutions and you’ll hear words such as more intuitive, adaptable, unified and interactive.

The truth is that because it’s Microsoft, Power BI has a look-and-feel that many of you will recognise and like. If you use Excel then making the step up to Power BI will feel like a natural development.

How Much Does Power BI Cost?

The solution comprises 3 basic elements:

  • Power BI Desktop – a Windows desktop application.
  • Power BI Service – a software-as-a-service offering.
  • Power BI Mobile – apps for Windows, iOS and Android devices.

In terms of licensing:

  • Power BI Desktop is free.
  • Power BI Pro is £8.20 per user/month.
  • Power BI Premium Per User (PPU) is £16.40 per user/month.
  • Power BI Premium is £4,105.60 per capacity/month.

You can find out more about the differences between each package here.

Why Is Power BI Popular?

It’s unlikely you’ll find any area of your operations that Power BI won’t support; hence you’ll see Power BI providing insights to teams across:

  • Finance
  • HR
  • Production
  • Planning
  • Warehouse
  • Supply chain
  • Logistics
  • Sales
  • Marketing

It’s also true that new Power BI use cases will occur as the solution gets more tightly woven into your operations. Soon enough you’ll be building reports and dashboards delivering niche views on everything from expenses to specific project plans and progress on individual targets.

Power BI reports tend to feature historic data sets, delivering a snapshot of your organisation over a set period rather than just in real-time. Nevertheless, your Power BI reports can aggregate and visualise data on key parts of your operation in just the same way as your Power BI dashboards, from Finance to HR and Customer Profitability to Ecommerce sales.

Power BI dashboards organise and visualise your data in real-time. You can create alerts when figures change and hit a chosen threshold. Here are a couple of dashboard examples:

  • Ecommerce –
    You can see how your online sales channels are performing day-to-day to gain a deeper understanding of how your products are performing. Insights could include: sales by category, most returned product and reasons for returns and sales over specific periods.
  • Marketing –
    You can visualise the effectiveness of your campaigns and the performance of segments and channels. For example, marketing spend by products, channel performance and campaign success rates.
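Power BI handles alerting natively through data alerts on dashboard tiles, but the underlying idea is just a threshold check over current metric values. A minimal sketch of that logic (the metric names and thresholds below are illustrative, not Power BI’s API):

```python
def check_alerts(metrics, thresholds):
    """Return the metrics whose current value has reached or
    crossed its configured threshold."""
    alerts = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value >= limit:
            alerts.append((name, value, limit))
    return alerts

# Illustrative figures: the daily return rate has crossed its 5% threshold,
# while daily sales remain below their alert level.
fired = check_alerts(
    {"return_rate": 0.07, "daily_sales": 1800},
    {"return_rate": 0.05, "daily_sales": 2500},
)
# fired == [("return_rate", 0.07, 0.05)]
```

In Power BI itself you would configure this per tile rather than in code, but the mental model is the same: a condition on a value, evaluated whenever the data refreshes.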

Once you’re creating your reports and dashboards, you can start using some of the value-adding features in Power BI to distribute your insights and isolate the data that’s most important to your company.

Power BI Apps
Power BI Apps allows you to bundle your reports, dashboards, spreadsheets and datasets and distribute them to individuals or large groups across your organisation in one go.

Power BI Metrics
With Metrics, you can publish the performance metrics that are most important to your business in a single pane within Power BI. The main idea here is that Metrics promotes accountability, alignment and visibility for your teams.

How To Become A Power BI Expert

Power BI is promoted as a self-service tool – one that people with little or no technical background can use to become data heroes in just a short while.

Because it has so much in common with Microsoft Excel, many people will get a head start on learning the basics, and the drag-and-drop functionality simplifies the process of connecting multiple data sources.

As you’d expect, Microsoft also offers plenty of Power BI training, with online workshops, documentation, and sample dashboards and reports.

At some point, you should think about learning DAX (Data Analysis Expressions), developed by Microsoft for platforms such as Power BI. It’s been referred to as Excel formulas on steroids and is crucial if you want to get the full value of Power BI, helping you create new information from data that is already in your model.

“If you’re familiar with Office 365, you’re going to be able to pick up Power BI quite quickly.”

Building Power BI Dashboards And Reports

You can create visualisations (referred to as visuals) in reports using visual types directly from the visualisation pane in Power BI. Furthermore, there is a growing number of pre-packaged custom visuals available through third parties that might be enough for what you need.

You simply download the custom visuals into your Power BI system and off you go.

Common sense will tell you to be wary of downloading anything that isn’t from a trusted source, so you’re better off using custom visuals that have been certified by Microsoft. There are many available on the Microsoft AppSource community site.

To cut down on the effort to extract useful data insights, Power BI has added its own AI Insights feature, which covers Text Analytics, Vision and Azure Machine Learning. It gives you access to a collection of pre-trained learning models that enhance your data preparation efforts. Using this capability, which requires Power BI Premium, you can enrich your data and gain a clearer view of data patterns.

Avoid Common Mistakes In Power BI

As you’d expect, there are best practices that you should follow to extract the full potential of Power BI for your organisation. Here are some top ones:

  • Spend a bit of time thinking carefully about what your dashboard or report is for.
  • When starting out, avoid introducing too much data because it can slow down the performance of your dashboard.
  • Remember you want your data visual to be used by colleagues, so think of them and don’t over complicate the report, making the information difficult to digest.

Top 5 Power BI Tips

Now you know some of the common mistakes, we’ll leave you with some top tips as shared by our own Power BI experts:

  1. Have a clear purpose in mind – there are so many data visualisation possibilities, so be certain about what you’re trying to say and who you’re trying to say it to.
  2. Keep your visualisations simple – it’s worth reviewing your data visual multiple times as it evolves, asking yourself: Can I make it clearer or can anything be removed?
  3. Do some proper benchmarking comparisons – your data also needs context so include benchmarking to show performance against a set of standards.
  4. Annotate your reports using Tooltips and buttons – both provide additional information on visuals, such as contextual data or, in the case of buttons, making them more interactive.
  5. Do a training course – Power BI may be aimed at non-technical people, but there is so much to it and it’s such a powerful tool that to get the most out of this technology it’s definitely worth getting some formal guidance.

Speed Up Your Power BI Development

With all this information, we hope it’s clearer what Power BI is and how it can help your organisation speed up and improve the effectiveness of your decision-making. We also hope you’ve got a sense of why Power BI is a leader in the data visualisation market and how with continued development, such as the integration of AI, that position isn’t likely to change any time soon.

What’s also true, however, is that without the internal experience and expertise of Power BI to hand, you’re going to need to invest time and money in developing those skillsets; and that partnering with an organisation that can plug those skills straight into your operation may be more time and cost effective.

At Jarmany, we’ve built a first-class team of Power BI consultants that can help your business harness the power of data effectively. Whether you are looking for a fully outsourced team or support for your in-house team, we can provide you with seamless expertise at a competitive cost.

If you’d like to know more about how Jarmany could help you maximise the value of Power BI to drive smarter decision-making across your company, contact us today.



What is Econometric Modelling? And What Are The Benefits?

Econometric modelling uses statistical analysis to discover how changes in activities are likely to affect sales and turnover, so you can predict future impact and make better-informed decisions. Most typically, it’s used in marketing to provide valuable insights into how well a campaign or marketing activity may perform and the factors that will drive the most ROI. 

For example, you may be thinking about launching a new promotional campaign, a sales discount, or loyalty scheme. Econometric modelling will help you to: 

  • Understand how different variables, like price and distribution channels, will impact your performance 
  • Determine the optimal allocation of resources across your different marketing activities 
  • Forecast your future demand
  • Identify different customer segments and their responsiveness to marketing activities
  • Evaluate market conditions and competitive factors that may impact consumer behaviour
  • And much more. 

For businesses, complex econometric models can help to answer questions about what really drives a company’s main KPIs, such as volume, value, market share and gross margin. After all, few companies really understand the external forces that affect their industries or their brands. 

As well as helping you to answer these vital questions, econometric modelling can also help you to: 

  • Save money 
  • Drive better, faster results 
  • Make data-informed decisions 
  • Make your business more profitable 

Marketing Mix Models: A Subset Of Econometric Modelling

Marketing mix modelling is one way to use econometric methods — this type of model uses aggregated data to analyse all marketing inputs over time to arrive at an optimal allocation of resources. For example, what’s the correct amount to spend on television advertising compared to radio or the internet? Should a company invest money in more salespeople or in more advertising? What is the impact of promotional spending? At what point do returns diminish? With the right approach you can find the right answers.

Marketing mix models have been used historically but were phased out with the rise of individual tracking. However, regulatory and platform changes, like Google’s Privacy Sandbox and the phasing out of third-party cookies, have reduced businesses’ ability to use individual tracking, which in turn has led to the return of the marketing mix model.

Implementing Econometric Models

The first step to making econometric models, like marketing mix models, work is, of course, to have good data. At Jarmany, we recommend having at least 3 years’ worth of data to input into the model. Limiting this to just 1 year, for example, would mean the model would be unable to identify trends or patterns, and the output would simply match last year’s trends since there is only one reference point. Basically, the more data, the better.

These are the steps you should follow: 

  1. Define all the parts of the marketing mix that might have an impact on sales.
  2. Review the state of your existing marketing data on these activities and close the gaps where they exist.
  3. Set-up ongoing processes to collect, clean and store the data; and develop the history that will help provide the patterns the model will identify.
  4. Begin modelling.

With everything in place, econometric models can enable businesses to forecast demand by examining all the economic factors involved. For example, econometric analysis revealed that the growth in the number of women working in the US played a major role in the growth of the restaurant industry from 1950 to 2000. But other variables were at work too: rising incomes made eating out more affordable and greater levels of car ownership, especially among teenagers and college students, all translated into higher restaurant sales. 

Understanding the economic variables that underlie demand makes it possible to forecast the future of an entire industry. What happens to your company if the price of oil plummets, or if more women re-enter the workforce after having families? 

It’s obviously not a simple and straightforward analysis, but having the right data and knowing how the global winds of change are shifting can stop a business from suffering huge setbacks. Just ask BlackBerry or Kodak about the impact of the smartphone revolution.

Find Out More

Econometric modelling can deliver a massive benefit to businesses that want to forward plan and avoid major disruption. But, it’s critical that you have the right foundations in place before you begin econometric modelling. If your inputs are sub-optimal, your outputs will be sub-optimal too.  

Get in touch with our experts and we’ll explain how we can bring this benefit to you. 

Contact Us

Looker vs Tableau vs Power BI: Which is Best for You?

As data continues to become crucial to all sorts of businesses, the need to understand, analyse, visualise, and use data grows more imperative.

However, without a data visualisation tool or analytics solution to view this data, businesses can quickly become overwhelmed. Data analytics solutions, business intelligence (BI) programs, and data visualisation tools are now essentials — rather than optional extras.

That’s why 54% of enterprises consider BI and other data-based solutions to be critical to their work now and in the future. By understanding the insights within their data, businesses can make better informed, data-driven decisions. But with a range of tools out there, which one is best?

In this blog, we’ll look at Looker, Power BI and Tableau — the three leading BI and data visualisation tools — to help decide which is best for you.


At a glance: Looker vs Tableau vs Power BI

Looker

Looker is a browser-based data analytics and visualisation tool. Founded in 2012, Looker was acquired by Google in 2019 and is now part of Google’s cloud platform. It also uses its own modelling language, LookML, a modular language that allows data and calculations to be reused. Alongside this, Looker’s Data Dictionary is a searchable directory for all metrics and descriptions in a Looker data warehouse.

Advantages

Looker’s unique approach to data offers some interesting advantages:

  • Cloud-based & browser-based: Looker offers the useful combination of being part of Google’s Cloud Platform and being completely accessible via a browser. Google Cloud offers an advanced level of security and a flexible way to manage data. With direct access through a browser, these benefits are offered without the need for software installation and manual updates.
  • Easy Git Integration: Looker can integrate with the popular version control system Git, enabling multiple people to work on multiple visualisations simultaneously. With Looker, users can see changes made to data-modelling layers, and jump back to them anytime. They can also create different version strands for developers to work on. This setup is easy and provides a benefit not offered by other data visualisation tools.
  • Connects with multiple data sources: Looker integrates with more than 50 different data sources due to LookML, Looker’s data modelling language. LookML’s flexible modelling language means it can analyse and visualise data from multiple sources, including Google Cloud, Microsoft Azure, Amazon Web Services and on-premises databases.
  • Self-serve capabilities: LookML also offers the ability to define dimensions, metrics, aggregates and relationships. These can then be used seamlessly in data visualisations, providing self-service analytics whilst also enabling the data to be reused. Looker also offers an Explore feature that enables users to self-serve their data through drag-and-drop functions, individual dashboards, and the ability to add additional fields to aid further data exploration.

Disadvantages

  • Limited range of visualisations: Despite Looker’s popularity, the variety of visualisations offered with the basic program is somewhat limited. This comparison is even starker when comparing these capabilities to Looker’s competitors, Tableau and Power BI. It should be noted that Looker does offer the ability to build custom visualisations, which can go some way to mitigating this issue.
  • More expensive than direct competitors: In theory, Looker’s pricing model is ideal, with cost being tailored to the company in question. However, Looker is the most expensive of the three, coming in above its competitors Tableau and Power BI.
  • Steep learning curve: Looker’s unique modelling language requires users to have at least a basic understanding of coding – in particular programming languages like SQL. The theory behind LookML is sound; a programming language that is easier to pick up. However, it is more difficult if a business lacks the right in-house expertise or training.
 

Looker’s ability to integrate with other systems, thanks to its unique LookML coding language, means that enterprise businesses can make use of data stored in existing third-party software. Features like Looker Blocks — pre-built data models designed to fit common analytics patterns — streamline this integration, offering pre-built code that can more easily be embedded.

Looker is also a powerful beginner platform. Its systems are easy to learn, and its code is easily understood. While its visualisations might not be as sophisticated as its competitors’, Looker still offers real-time analysis and the ability to customise.

Tableau

Tableau formerly held the title of undisputed king of premium BI tools and has only recently gained rivals in Looker and Power BI. With quick implementation and self-service (ungoverned) analytics, data can become accessible and easily shared throughout an organisation.

Tableau has recently been acquired by Salesforce, leading to simple integration for Salesforce users, as well as with other programs such as MuleSoft and Slack.

Advantages

  • Interactive data visualisations: Tableau provides interactive data visualisation benefits, helping to turn unstructured statistical information into logical and intuitive visualisations. Filtering and selection provide options for further analysis and ease of understanding.
  • Adaptable to large amounts of data: Unlike other platforms that have a limit on data model size, Tableau can handle very large amounts of data without any impact on performance.
  • Intuitive user interface: Developers and non-developers alike can easily use Tableau thanks to its intuitive user interface (UI). Non-developers can use all of Tableau’s basic facilities; however, specialists might be needed to extend the platform’s functionality. Tableau’s simplicity is also coupled with its ability to reliably operate on big data thanks to its columnar data model.
  • Compatibility: Tableau is compatible with multiple data sources, enabling businesses to connect with, access and blend data from multiple sources into one visualisation for easy data analysis. Tableau is also compatible with multiple scripting languages, such as Python or R, to maximise potential output.
  • Mobile support: Tableau has a mobile app for both iOS and Android systems. This app has the same functionality as the desktop and online software, allowing users to analyse data remotely. Moreover, the Tableau dashboard can be customised to each application, meaning functionality can be maximised to the individual’s separate mobile and desktop needs.

Disadvantages

  • Inflexible pricing: Tableau’s pricing doesn’t change on a case-by-case basis, despite the fact that most companies have individual needs. Tableau’s sales model requires purchasing an extended licence from the start. Many companies might rather start with a specific set of features and later adjust the pricing if further features become necessary.
  • Poor after-sales support: Because Tableau has been around so long, there are many online forums where users discuss its features. However, many threads focus on a lack of support and maintenance. To resolve issues, Tableau’s support team sometimes advises purchasing a new feature, which can become costly.
  • Favoured towards Salesforce: Depending on an enterprise business’s requirements, this might not have a big impact. However, the nature of Salesforce’s acquisition means that Tableau’s development will now be skewed more towards Salesforce integration; Tableau is no longer an independent BI tool.

Tableau is designed with businesses in mind, rather than an IT department or developer. Tableau’s user interface is considered to be the easiest to use of its direct competitors. Its ease of usage means that you do not have to be an expert in programming languages or coding, empowering teams across an organisation to become more data-driven and data-literate in their decision making.

Power BI

Microsoft Power BI integrates well with Microsoft products and systems; however, a recent uptick in adoption likely comes from the free version of Power BI that is available to anybody. This free version is sufficient for individual analysts, but the premium version unlocks important functionality such as sharing reports, dashboards and analytical apps.

Advantages

Microsoft’s tool offers the following advantages:

  • Large range of visualisations: Power BI has a great number of standard visuals to populate your reports, each with a wide variety of format options. Power BI is backed up by integrations with Microsoft Office and can harness the power of Excel to create easy data visualisations. Moreover, if the desired option is unavailable, users can also build their own custom visuals.
  • User-friendly interface: Power BI is extremely intuitive to navigate and user-friendly. Users with little dashboard experience can navigate the platform as easily as those with expertise. This is partly due to their natural language query tool, which allows people to ask simple questions to easily navigate to the data they wish to visualise.
  • Lower cost: Power BI is relatively low in cost compared to other leading platforms. A trial version of Power BI is available to everyone, while Power BI Pro is included in some Office 365 business and enterprise plans. This has caused a shift in the market, causing other BI vendors to become much more competitive in their licensing options.
  • Easy to learn: Power BI might be the easiest to use of the three platforms. Though you will need expert support to truly get the most out of your data, those who are familiar with Excel will be able to start using Power BI’s data visualisation tools quickly.

Disadvantages

  • Limited customisations: Though Power BI offers a range of visualisations to choose from, it can be difficult to customise them. Basic formatting options are available, but these can prove limiting for businesses looking to create bespoke visualisations without extensive Power BI experience.
  • Potential learning difficulties: As covered, while Power BI is simple to get to grips with in the beginning, it will require added training further down the line. This especially applies when performing analysis over your datasets, as it will likely require tools that are external to Power BI, like DAX Studio.
  • Data security: Power BI offers advanced encryption capabilities using Azure. However, as it’s a cloud-based tool, some stakeholders may feel uneasy about the security and privacy of their data. Businesses will have to ensure that they have the full breadth of knowledge of Power BI’s encryption services to fulfil their business case.

It’s clear that Power BI offers good integration capabilities, especially with other Microsoft products, allowing data analysis and visualisations to be shared across teams. It matches the reliability of comparable products and even offers integrations with other data analytics tools.

Using a data consultancy to make the most out of your tools

Power BI, Tableau and Looker offer high-quality BI and data visualisation solutions for businesses in 2023. What is ‘best’ for your business is relative — but what’s not relative is that in order to maximise your ROI from these platforms and harness the power of your data, you need to get the best out of these tools.

Without in-house expertise or the right training, the steep learning curve and technical know-how required to maximise these tools’ potential can hurt your ROI and squander the potential within your data. This is where Jarmany can help.

With our consultancy services, we’ll help you find the right platform for your business. Once you’re matched to the correct tool, we’ll help you maximise the insights you get from your data and make intelligent business decisions. Jarmany’s team of data scientists are seasoned experts who understand that no two businesses have the same needs.

Whether you need help selecting a platform, getting the most out of data visualisations, creating a data strategy or something else, Jarmany’s data consultancy experts can help.

Start a conversation today.




How Data Visualisation Can Improve Your Decision Making

In 1597 Sir Francis Bacon famously said, “knowledge itself is power.”1 Four centuries later, his words are proving to be more accurate than ever, as knowledge in the form of big data delivers an increasing amount of power to businesses.

Tech giants like Google and Facebook have made it abundantly clear that, to them, big data is a goldmine of insights. Therefore, forward-thinking organisations need to invest in and develop a comprehensive data strategy to improve how they obtain, store, manage, share, and use their data.

However, many businesses struggle to make data work for them. A McKinsey survey found that 47% of business leaders feel that data & analytics have fundamentally transformed their industries, yet they still had difficulties putting data to work for their organisations.2

While new technologies allow organisations to collect lots of data, raw data in and of itself has little value. Instead, the value arises when that data is presented in a way that provides actionable insights, informing business leaders on the best course of action.

That’s why in this blog post we’re going to be looking at how data visualisation improves decision-making. Let’s dive straight in.



What is data visualisation?

Data visualisation is the final part of a process that includes the collection, cleansing and analysis of information from numerous data sources. This final stage is all about creating a pictorial representation of that data which can then function as a single source of truth for businesses. 

The goal behind creating these visualisations is to tell a compelling story using raw data whilst keeping crucial KPIs in mind during review processes. With the help of data visualisation, key insights and information, such as trends and patterns, can be digested and understood by stakeholders much more quickly.

Types of data visualisation

When it comes to visualising their data to help communicate the story behind it to their stakeholders, there are a number of things businesses need to consider. Chief among these is the category of visualisation they want to focus their efforts on, either:

  • Data exploration: Data exploration helps to uncover insights and identify patterns that need further attention.
  • Data explanation: By presenting an easy-to-understand graph or illustration, data explanation helps an audience better understand the results of that data.

Understanding which of these two ends a given visualisation is intended to serve is essential to the success of an overarching data strategy.

While there are just two broad categories of data visualisation, there are a number of specific types of visualisations that organisations can deploy to better understand their data. These include:

  • 2D area visualisations: 2D area data visualisations are typically geospatial, as they relate to the relative position of things on the earth’s surface.
  • Temporal visualisations: Temporal visualisations have a start and finish time and elements that may overlap.
  • Multidimensional charts: Multidimensional charts are those with two or more dimensions that help explore correlations and discover causality, which is why they are amongst the most commonly used visualisations.
  • Hierarchical charts: Hierarchical data sets are the arrangement of groups in which larger groups encompass smaller sets, allowing users to drill down or drill up to conduct in-depth analysis.
  • Network visualisations: Network data visualisations show how data points are related within a wider network.

How does data visualisation improve decision-making?

Data visualisation helps decision-makers see the big picture. From understanding trends and patterns to highlighting issues and areas of concern, data visualisation is crucial to obtaining enhanced oversight over business operations.

Research has shown that organisations that leverage their customer behaviour data to generate insights and make data-driven decisions can outperform their peers by as much as 85% in sales growth.3

Consequently, any organisation with an eye on the future needs to make sense of its data through data visualisation techniques and tools to enlighten its decision-making processes. Without effective visualisation, organisations are relying more on guesswork and interpretation when it comes to making crucial decisions.

Benefits of data visualisation

Whilst the primary benefit of data visualisation centres around making better business decisions, it’s worth digging into some of the more specific benefits it can help organisations obtain. These include:

  1. Improving speed: Timing is an often overlooked aspect of decision-making; many bad decisions are just good choices made at the wrong time. Data visualisation can help businesses draw insights from vast amounts of data in real time, increasing response times to challenges.
  2. More accurate numbers: Although data provides decision-makers with potentially all the information they need, it’s usually not presented in an easily digestible format. Data visualisation simplifies the information, boosting our comprehension of the data and reducing the need to fill the gaps with our biases, making our decisions more accurate. However, in order to ensure accuracy, it’s pivotal that the data used within visualisations is of the highest quality.
  3. Simplified communication: Once executives and other decision-makers use data to decide on a specific direction, that decision must be communicated to the team responsible for implementation. While the decision may seem obvious, other stakeholders may not fully understand the reasoning behind it, thereby reducing efficiency. With data visualisation, decision-makers can use graphs and charts to communicate the reasons behind a decision clearly.
  4. Identify benchmarks and trends: An effective visualisation makes it easier than ever before for users to recognise relationships and patterns within their data. By exploring these patterns, users are able to focus on specific areas that need attention to help drive their business forward.
  5. Empowering collaboration: Data visualisation helps organisations by presenting data in a universally understood form, empowering people to contribute to decision-making with their perspectives. Approaching any challenge from multiple perspectives enables decision-makers to make better choices.
  6. Understand the story behind your data: Ultimately, all of these benefits of data visualisation lead to one key outcome — a more comprehensive understanding of the story behind a business’s data. Armed with this knowledge, businesses can make better informed decisions that help to drive outcomes and business success in the long term.

Data visualisation tools

Cutting-edge data visualisation tools are essential for converting raw data into actionable insights. As a result, identifying and deploying the right tools is vital for businesses looking to uncover valuable insights that can help drive growth.

Fortunately, there are now a range of data visualisation tools available to businesses looking to harness the power of their data. The most popular among these include:

  • Domo: Domo is a cloud software company specialising in business intelligence tools and data visualisation.
  • Dundas BI: Dundas Data Visualization, Inc. is a software company specialising in data visualisation and dashboard solutions.
  • Infogram: Infogram is a web-based data visualisation and infographics platform.
  • Looker: Part of the Google Cloud Platform following a 2019 acquisition, Looker markets a data exploration and discovery business intelligence platform.
  • Microsoft Power BI: Power BI is an interactive data visualisation software developed by Microsoft with a primary focus on business intelligence.
  • Qlik: Qlik is a business analytics platform that provides software products such as business intelligence and data integration.
  • Sisense: Sisense is a business intelligence software company best known for embedded analytics.
  • Tableau: Tableau Software is an interactive data visualisation software company focused on business intelligence specialising in visualisation techniques.

Even if businesses have access to one or more of these tools, that isn’t enough to ensure effective visualisations. Remember, collecting, sorting, cleansing and analysing data before it gets fed into a cutting-edge tool is essential to ensuring accurate and relevant insights.
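To make that point concrete, here is a minimal sketch of what pre-visualisation cleansing can look like in Python with pandas. The column names and values are purely illustrative:

```python
import pandas as pd

# Hypothetical raw export with typical quality problems:
# a duplicated record and a missing value.
raw = pd.DataFrame({
    "region": ["North", "North", "South", None],
    "sales": [1200, 1200, 950, 780],
    "date": ["2023-01-05", "2023-01-05", "2023-01-04", "2023-01-06"],
})

cleaned = (
    raw.drop_duplicates()  # remove the repeated record
       .assign(
           region=lambda d: d["region"].fillna("Unknown"),  # handle gaps
           date=lambda d: pd.to_datetime(d["date"]),        # standardise types
       )
       .sort_values("date")
       .reset_index(drop=True)
)

print(cleaned)
```

The same steps scale up: de-duplicate, handle missing values and standardise types before any data reaches a visualisation tool.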

And that’s not all. On top of that, businesses also need knowledge, skills and expertise to ensure that tools such as those outlined above are used correctly and therefore produce results that drive positive outcomes.

Enhance your decision-making with data visualisation

Data visualisation has a track record of driving progress. For example, the 1854 Cholera Outbreak Map of London marked the locations of outbreaks, revealing that affected households used the same drinking water wells. Examination of these wells demonstrated a connection between cholera and contaminated water.4 These findings helped the city tackle cholera and contributed to the germ theory later established by Louis Pasteur.

Over a hundred years later, businesses are looking to leverage data to ensure both growth and prosperity. A comprehensive data strategy that facilitates visualisations that enhance decision-making processes has therefore become essential to long-term success.

However, that requires access to significant knowledge, expertise and cutting-edge tools, all of which can be difficult to obtain and retain in-house. That’s where data analytics providers like Jarmany come in. We’re here to ensure that your business can establish a successful data strategy that delivers insights through stimulating visualisations.

So, if you’re ready to start using your data to predict needs, deliver efficiencies, connect people and achieve growth targets, get in touch with us today.


Contact Us

1  Knowledge Is Power: How Data Is Feeding Disruption

2  Catch them if you can: How leaders in data and analytics have pulled ahead

3  Delivering personalized experiences in times of change

4  John Snow’s data journalism: the cholera map that changed the world

6 Ways Third-Party Data Can Benefit Your Business

Whilst first-party data can provide rich and meaningful insights on your customers and can feed into machine learning, it often lacks breadth, especially if your business isn’t able to collect, store and manage valuable high quality first-party data efficiently.

This is where third-party data comes in.

Third-party data refers to data that is collected by organisations outside of your company and can be used to gain valuable insights into your target audience, industry, or market.

In this blog post, we’ll explore the reasons why third-party data is so important and how it can benefit businesses of all sizes.

#1 Close the gaps in your data

A lot of organisations are collecting their own first-party data to help derive actionable insights and gain a greater understanding of their customers to then guide decision making.

This could be:

  • Website data
  • Social data
  • Marketing data
  • Operations data
  • Sales data

Whilst this first-party data can be very high value, unless you have a large quantity of it, it often lacks the statistical validity needed to base high-level decisions on. This impacts the quality and reliability of your analysis.

In this scenario, third-party data can be used to close the gaps and enhance the value of your insights and findings. Put simply, third-party data cannot replace an organisation’s first-party data; however, it can build on the insights you already have. First-party data lays the foundations; third-party data builds on them and allows you to broaden your data ecosystem.
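As a rough illustration of how third-party data can close those gaps, the pandas sketch below joins a small first-party sales table to a hypothetical third-party demographics table. Every column name and figure here is invented for illustration:

```python
import pandas as pd

# First-party data: what you already know from your own sales records.
first_party = pd.DataFrame({
    "postcode": ["N1", "SW4", "M2"],
    "units_sold": [120, 85, 60],
})

# Hypothetical third-party data: regional context you couldn't collect yourself.
third_party = pd.DataFrame({
    "postcode": ["N1", "SW4", "M2"],
    "median_income": [41000, 48000, 33000],
    "population": [91000, 72000, 54000],
})

# A left join keeps every first-party record and enriches it with context.
enriched = first_party.merge(third_party, on="postcode", how="left")
print(enriched)
```

The first-party table stays the source of truth; the third-party columns simply add the context that analysis alone could not provide.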

#2 Greater context into customer behaviour

Even if your business is a well-oiled machine when it comes to collecting first-party data, that data is of limited use if you don’t understand the macroeconomic factors driving consumer behaviour.

This could include:

  • Geographical trends
  • Demographic changes
  • Environmental changes
  • Political news
  • Market share/size information

By utilising third-party data, you can obtain insights that will help you to understand current behaviour and predict future behaviour, so you can calculate any impact on business operations, and gain greater insights into supply and demand shifts.

#3 Understanding your target audience

Third-party data can help you better understand your target audience and their behaviours, interests, and preferences. This information can be used to create more targeted marketing campaigns and to develop more effective customer engagement strategies. For example, if you’re selling athletic clothing, you might use third-party data to learn more about your customers’ exercise habits, which can help you create content and promotions that resonate with them.

#4 Strengthen Indirect Sales Insights

Third-party data is also pivotal if your business operates through indirect sales channels, as it enables you to gain insights into your sales activity through each third-party retailer. Without it, you only have a partial understanding of your sales performance.

For example, if you were a company selling computers directly to consumers, but also indirectly through a retailer, you would have access to certain information, such as the number of units you supply to the retailer, the product price point, and where the units are sold. However, you’d be missing a range of insights, such as how the retailer’s discount and marketing schemes impact sales, whether users purchase online or in person, or whether certain regions sell better than others.

This is where you can really benefit from utilising third-party data to gain more granularity into your indirect sales performance.

#5 Improving marketing and advertising efforts

Third-party data can also be used to improve your marketing and advertising efforts by providing a more complete picture of your target audience. As a result, you’ll be able to offer a deeper level of personalisation to help your ads resonate more with your target audience.

For example, you can use third-party data to create more effective targeting strategies for your digital ads, such as targeting based on demographics, interests, or purchase history.

This information can also be used to improve your email marketing campaigns by personalising your messages and making them more relevant to your subscribers.

#6 Making informed business decisions

Ultimately, third-party data can provide you with valuable insights into your industry and market that can be used to make informed business decisions. It allows you to assess the competitive landscape, identify market trends, determine the best target audience for your product and predict future customer behaviour. Combined with your first-party data, this information can provide you with a complete picture that will then guide your business in terms of pricing, distribution, product positioning and much more.

In conclusion, third-party data is a valuable tool that can help businesses to close the gaps in their data, gain greater context into customer behaviour, build a better understanding of their target audience, strengthen indirect sales insights, improve their marketing and advertising efforts, and ultimately make informed business decisions. Whether you’re a small business just starting out or a large corporation looking to stay ahead of the competition, incorporating third-party data into your data strategy is essential for success.

How Jarmany can help you

Managing your third-party data can be a minefield, especially in a privacy-conscious world with increasing regulations around data protection and misuse. It can also be a struggle to integrate third-party data with your existing data, and to use it to build and feed machine learning models for enhanced insights. Additionally, this type of data management requires a specialised skillset, which is often time-consuming and expensive to build internally. As a result, leaning on a specialist agency with expertise in storing, managing and transforming data to gain actionable insights is often the favoured approach.

Get in contact with us today if you’d like to explore how we can help you manage your data, use techniques such as web scraping to obtain more insights, and then build machine learning models to help you drive business growth.

 

Contact Us

Google Analytics 4 Guide: What You Need to Know

You might already be familiar with GA4 — many businesses have been using it alongside UA for the last two years. Alternatively, you might know next to nothing about it. Whatever the case, getting to grips with GA4 is important to your business.

Google Analytics is one of the most popular analytics tools, with over half (55%) of online businesses using it to gain visibility into key website metrics.1 Understanding how the latest version works should be a priority.

But don’t worry, we’ve got you covered. In this GA4 guide, we’ll explain everything you need to know to get you ready for the shift to GA4 — and leverage it to gain a deeper understanding of your customers. But first, let’s answer an important question.


What is GA4?

GA4 is an analytics service that allows you to measure traffic, engagement, and performance across your websites and apps (known as properties), giving you the insights you need to improve all three.

Launched in 2020, GA4 is the fourth and latest version of Google Analytics. It was designed to phase out and ultimately replace the previous version, Universal Analytics, which was built when the digital world was very different from today.

GA4 provides data insights throughout the customer lifecycle, making it a useful tool for businesses or marketers seeking to understand how customers behave before, during, and after conversion. As you’d expect from a modern data analytics platform, GA4 also offers machine learning insights and data science analysis.

GA4 vs Universal Analytics

Up until October 2020, Universal Analytics was the default version used when a new Google Analytics property was created. After that date, the default version became GA4.

Google now plans to phase UA out completely. From July 2023, UA will stop processing new hits, although users will still be able to access data for their Universal Analytics properties for another six months.

Universal Analytics 360 (also referred to as Google Analytics 360), on the other hand, is used by bigger, enterprise-sized businesses. UA 360 is a scaled-up, paid version of UA with extended capabilities: higher data limits, service-level agreements and, on the support side, a dedicated account manager and implementation support.

Like UA, however, UA 360 is being sunsetted, albeit from the later date of July 2024.

What’s the difference between GA4 and UA?

There are several key differences between GA4 and UA. In this section, we’ll highlight the most important ones to understand. 

Events vs sessions

GA4 uses a fundamentally different model for measuring data compared to its predecessor. UA’s measurement model was based on sessions, including any number of user interactions (known as hits) within a specific time period. These could include page views, clicks, and transactions, for example.

GA4’s data collection model, on the other hand, is based on events, with any user interaction qualifying as a separate event.
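To make the event model concrete, the sketch below builds the kind of JSON payload accepted by GA4’s Measurement Protocol, where every interaction, whether a page view or a purchase, is simply a named event with parameters. The client ID, URL and values are placeholders, not real data:

```python
import json

# In GA4, every interaction is an event: a name plus a bag of parameters.
# This mirrors the shape of a Measurement Protocol payload; the client_id,
# page_location and values below are illustrative placeholders.
payload = {
    "client_id": "123.456",  # anonymous browser/device identifier
    "events": [
        {"name": "page_view", "params": {"page_location": "https://example.com/pricing"}},
        {"name": "purchase", "params": {"currency": "GBP", "value": 49.99}},
    ],
}

print(json.dumps(payload, indent=2))
```

Note how a page view and a transaction share the same structure; that uniformity is the core difference from UA’s session-and-hit model.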

The change to events, however, has also led to some ‘missing’ metrics and reports, in particular bounce rate. UA’s bounce rate metric is replaced with ‘engaged sessions’, which counts sessions that last 10 seconds or longer, have at least one conversion event, or have at least two screen or page views.

Other valuable metrics available in UA, such as views per session and average session duration, while harder to access in GA4, have recently been made available through customisable reports.

Multiple devices

UA was designed for a world where desktop reigned supreme. Since Google Analytics first launched in 2005, however, the world has changed drastically. Today, people access digital services across a range of devices, with mobile becoming increasingly popular in recent years. GA4 is designed to track the users of today, seamlessly collecting data across multiple devices.

Machine learning

GA4 has machine learning (ML) capabilities, enabling it to use current and historical data to predict how your users might behave in the future. The resulting insights allow you to see the probability of customers purchasing something or churning, for example. UA, on the other hand, has no ML capabilities.

Data protection and security

GA4 anonymises IP addresses automatically, guarding against the identification and misuse of personal information and protecting personal privacy. Unlike Universal Analytics, this brings GA4 closer to GDPR compliance out of the box.

Future-proofing

Compared with UA, GA4 focuses on tracking user IDs rather than cookies. Reducing the reliance on cookies helps future-proof GA4 and moves away from UA’s focus on tracking page visits and sessions through cookies. This helps improve both the quality of insights and access to them across multiple platforms.

Google Ads

GA4 enjoys a deeper integration with Google Ads, allowing you to measure app and web interactions together. This ultimately provides a deeper level of insight than UA.

More reporting

With GA4, you get reporting options across the customer lifecycle, with reports focusing on acquisition, engagement, monetisation, and retention (more on this later). With UA, on the other hand, you only get reporting for acquisition. 

GA4 is taking over

GA4 is designed to meet the needs of businesses in 2023, enabling them to understand how their customers behave across platforms and journeys. UA was designed for an era of desktop dominance and cookie-related data — ideas that are slowly becoming obsolete. 

With UA being phased out completely by summer, now’s the time to switch to Google Analytics 4 — if you haven’t already. This means you should:

  • ensure you have a centralised archive of historical data you can draw from
  • set up GA4 event tracking  
  • and transition all your existing UA properties to GA4.

Setting up GA4

As set out in the Google Support guide, GA4 is relatively simple to set up — if you know how.2 In this section, we’ll walk you through the process step by step.

  1. Log in to your Google Analytics account.
  2. Check which version you are currently using. If you can see three columns (Account, Property, and View), you are using UA. If you can see just two columns (Account and Property), you are already using GA4.
  3. Assuming you are still using UA, select ‘GA4 Setup Assistant’ under the Property column.
  4. Click ‘Get Started’ to set up a Google Analytics 4 property. Alternatively, if you already have a GA4 property that isn’t connected to your Google Analytics account, select ‘Connect Properties’ and follow the instructions.
  5. If you are already using gtag.js tags, select ‘Enable data collection using your existing tags.’ If you are using Google Tag Manager or the old analytics.js tags, you’ll need to add gtag.js tags yourself.
  6. Click ‘Create property’.
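
If you do need to add gtag.js manually (step 5 above), the snippet is a short piece of JavaScript placed in your page’s head. The sketch below shows the inline queueing logic, assuming a placeholder measurement ID of ‘G-XXXXXXX’ (use your own property’s ID); in a real page it is preceded by an async script tag that loads the library from googletagmanager.com.

```javascript
// Sketch of the inline gtag.js bootstrap. 'G-XXXXXXX' is a placeholder
// measurement ID. The first line stands in for the browser's `window`
// so the sketch runs anywhere; in a real page it isn't needed.
const window = globalThis;

window.dataLayer = window.dataLayer || [];
function gtag() { window.dataLayer.push(arguments); }

gtag('js', new Date());       // records when the library was loaded
gtag('config', 'G-XXXXXXX');  // initialises your GA4 property
```

All later gtag() calls (events, user IDs, and so on) push onto this same dataLayer queue for GA4 to process.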

Once you’re up and running, you can set up a range of capabilities designed to help you track and obtain data, including:

  • Configure Custom Events
  • Configure User IDs
  • Configure Enhanced Measurements
  • Activate Google Signals
  • Link to Google Ads
  • Define Audiences
  • Import or set up Conversions

Using GA4

Tracking across multiple platforms

One key benefit of GA4 is the ability to track data across multiple platforms — something that was virtually impossible in UA. In practice, this means that GA4 tracks website and app data for one property. So if a user visits your site using a laptop and a mobile, the data for the various sessions is consolidated under one user rather than two. This helps you keep track of the same user across multiple devices and sessions.

Cross-platform tracking provides a much more complete view of user behaviour, allowing you to understand how customers engage with your website or app, as well as the different devices they are using to access them. You get to see the entire customer journey — from acquisition through engagement and retention — across various platforms. 

To set up cross-platform tracking, you need to use the appropriate gtag.js script to create unique user IDs. These IDs can then be configured to track users across platforms. 
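
As a hedged sketch of what that looks like in code, the user ID is passed via the user_id parameter on the config call. The measurement ID ‘G-XXXXXXX’ and the ID value ‘u-12345’ below are placeholders; the real value must come from your own authentication system.

```javascript
// Sketch: tying sessions to a signed-in user so GA4 can stitch
// activity across devices. 'G-XXXXXXX' and 'u-12345' are placeholders;
// the user ID comes from your own login system and must not contain
// personally identifiable information.
const window = globalThis;  // browser stand-in for this sketch
window.dataLayer = window.dataLayer || [];
function gtag() { window.dataLayer.push(arguments); }

gtag('config', 'G-XXXXXXX', {
  user_id: 'u-12345'
});
```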

Using Events

As we touched on earlier in this Google Analytics 4 guide, GA4’s data collection model is based on events. Sessions dominated UA — and they’re still used to a degree — but events are how you track almost everything in GA4. 

Put simply, all user actions on your site or app now qualify as events. So to understand and track events is to understand and track user behaviour and engagement. You can choose which events you want visibility over, and how you track them is up to you. 

Broadly speaking, events fall into four different categories in GA4:

  • Automatically captured events: These events, such as when a user clicks on an ad or when a free trial is converted to a paid subscription, are automatically tracked by default, without you having to do anything.
  • Enhanced measurement events: These are events that you can enable in GA4, allowing you to measure interactions with your content. Enhanced measurement events can be toggled on and off by going to the Admin column, selecting Data Streams, then Web, and then Enhanced Measurement.
  • Recommended events: These events require additional context to function effectively, meaning you’ll have to set them up yourself. They include ‘login’ events (when a user logs in), ‘search’ (when a user searches your content) and ‘share’ (when a user shares your content).
  • Custom events: These are events that are specific to your business, website, or app and not already known or measured by GA4. With custom events, you define the name and the set of parameters for each event.
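
Recommended and custom events can both be sent from your site’s code with the same gtag() call, following the pattern gtag('event', name, parameters). As a sketch: ‘search’ below is one of GA4’s recommended events, while ‘newsletter_signup’ and its parameter are hypothetical examples of a custom event you would define yourself.

```javascript
// Sketch: sending events via gtag.js. 'search' is a GA4 recommended
// event (with its documented search_term parameter); 'newsletter_signup'
// is a hypothetical custom event whose name and parameters you define.
const window = globalThis;  // browser stand-in for this sketch
window.dataLayer = window.dataLayer || [];
function gtag() { window.dataLayer.push(arguments); }

// A recommended event: a user searched your content
gtag('event', 'search', { search_term: 'ga4 guide' });

// A custom event, specific to your business
gtag('event', 'newsletter_signup', { method: 'footer_form' });
```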

How to create a custom event in GA4

To create a custom event in GA4, simply follow these steps: 

  1. Select the Admin icon in the bottom left of your screen
  2. Go to the Property column and select Events
  3. From here, select Create Event 
  4. Choose the data stream for which you want to deploy the event (assuming you have more than one)
  5. Click Select
  6. Follow the rest of the set-up prompts to complete the process

Getting the most out of your GA4 reports

As you’d expect from a data analytics tool, GA4 provides a range of reports and data visualisations designed to help you understand your data — and act upon it. In this section, we’ll explain everything you need to know to get the most out of your GA4 reports. 

 
Reports snapshot

As the name suggests, the reports snapshot provides an overview of the most popular metrics in one single, easy-to-read dashboard. This is where you go if you need an at-a-glance view of how your property is performing. Data sets in the reports snapshot include things like: 

  • User behaviour
  • New users by channel
  • Number of sessions by channel
  • Users by country
  • User activity over time
  • Views by page and screen
  • Top events
  • Top conversions
  • Top-selling products

The data sets in the snapshot are pulled from other reports, and the reports snapshot is customisable, allowing you to focus on the insights that matter to you most. To customise your report snapshot, you’ll need to follow these steps:

  1. Select Library from the bottom of the left navigation bar (note: you’ll need admin rights to do this — this option isn’t available in a demo account)
  2. Select Reports 
  3. Select Create a new report
  4. Select Create an Overview Report
  5. Follow the set-up steps to complete

 
Real-time Overview Reports

With GA4, you also get access to real-time reports, allowing you to see how customers are using your website in real time and track their journey through the sales funnel. Real-time reports offer a range of metrics, including:

  • Geo-maps, showing where current users are based
  • Number of users in the last 30 minutes
  • Users by source, showing how your users arrived at your site
  • Users by audience
  • Views by page title and screen name
  • Event count by event name
  • Conversions by event name

 
Lifecycle reports

GA4 breaks down the customer lifecycle into four stages — acquisition, engagement, monetisation, and retention — with corresponding reports for each. Let’s take a look at what they offer. 

  • Acquisition: See how new users found your website or app, allowing you to understand which channels and campaigns are proving the most successful.
  • Engagement: Explore how users interact with and navigate through your website or app, with metrics covering a range of events.
  • Monetisation: Get a full breakdown of how your website or app is generating money, covering e-commerce, subscriptions, and ad revenue.
  • Retention: Understand the frequency and duration of users’ interactions with your website or app after their first visit — and how valuable they are to you over their lifecycle.

Together, these reports give you a complete picture of how users behave across all stages of the customer journey, as well as the value they bring through engagement. In turn, this helps you refine your campaigns, content, and UX to improve customer acquisition and retention — and ultimately drive more revenue.

 
Other reports

In addition to those highlighted above, GA4 comes with a range of other reports designed to give you a complete picture of your users and how they interact with your website or app. 

For example, the Tech report in Google Analytics 4 analyses the technology that people use when visiting your website or app, including the platform, operating system, screen resolution, and app version. 

Meanwhile, the Demographics report breaks down your users by their age, location, gender, and affinity category, which includes acquisition, behaviour and conversion metrics — giving you greater insight into your customer base. 


Making the most of GA4

If you rely on the Google Analytics platform, it’s time you started thinking about switching from UA to GA4. At Jarmany, we recommend a test-and-trial period of at least six months; this helps you identify any nuances in your reporting and reconcile them before you rely on GA4 in earnest. With July’s sunset date coming fast, switching is no longer a choice but a necessity, so it’s critical that you get to grips with GA4 as quickly as possible. 

That said, getting started with Google Analytics 4 can involve a steep learning curve, while migrating from UA to GA4 can be tricky for those without the technical know-how. Plus, for businesses with multiple brands, websites and properties, successfully merging them together in GA4 for a complete view can be tricky. That’s why it pays to work with an expert technology partner with expertise in migration, implementation and support — like Jarmany. 

As an analytics and data consultancy, we can help you seamlessly migrate to GA4, providing you with the support and expertise you need to get up and running fast and maximise its potential. The change is coming, so make sure you’re prepared for it with our expert help.

Sounds interesting? Get in touch today and talk to one of our experts. 

1  How Many Websites Use Google Analytics 2022: Google Analytics Statistics.

2 [GA4] Add a Google Analytics 4 property (to a site that already has Analytics) 

Looker Data Visualisation: A Complete Guide

This is where business intelligence and visualisation tools come in. They allow businesses to turn complex data sets into clear visualisations, and then act on them. The result is smarter decision-making, more streamlined processes, and a competitive advantage over businesses that fail to capitalise on this opportunity.

In this article, we’ll take an in-depth look at one of the most popular data visualisation tools on the market: Looker. Read on to learn about: 

  • Looker’s data visualisation capabilities
  • Its key features and how they are used
  • The pros and cons of choosing Looker over one of its competitors
  • How your business can get the most out of this powerful tool

But before we start, let’s answer an important question…


What is Looker? 

Looker is a data analytics and visualisation tool. It enables businesses to analyse and explore their data through unique visualisations, helping them to turn raw data into actionable insights that drive smarter business decisions. It does so through powerful features such as integrated insights and data-driven workflows.

Launched back in 2012, Looker was acquired by Google in 2019 for $2.6 billion and is now part of the Google Cloud Platform. It’s a browser-based solution, so there’s no need to worry about installation or maintenance.

While Looker is well-known in the data visualisation world, direct competitors including Microsoft Power BI, Tableau and Qlik might be more familiar to businesses, though Google’s acquisition is aiming to change that.

As you’d expect, Looker shares some core features with other popular data visualisation and business intelligence tools, such as the ability to: 

  • Build custom real-time dashboards
  • Connect to any SQL database
  • Create custom applications
  • Leverage embedded analytics
  • Access a range of customer support options 

What modelling language does Looker use?

One of Looker’s key differentiators is LookML, its native modelling language. LookML is built on top of SQL, aiming to smooth over SQL’s shortcomings and help users write simplified, streamlined queries.

LookML is a modular, reusable language. And collaboration tools such as version control mean that Looker users don’t have to start a script from scratch or spend ages trying to find what changed and when. 

Looker Blocks — pre-built data models designed to fit common analytics patterns — also prevent users from having to start from square one each time they want to create a data model. Users can select pre-existing models and modify them to their needs. This includes:

  • Analytics blocks
  • Source blocks
  • Data blocks
  • Data tool blocks
  • Embedded blocks
  • Viz blocks

Looker’s data visualisation

As the name suggests, Looker is all about data visualisation. In this section, we’ll run through some of its core data visualisation capabilities — and how you can use them to drive business success. 

Looks and dashboards

  • Looks are visualisations created and saved by users. They are built in Looker’s Explore section and can then be shared and used across multiple dashboards.

  • Dashboards allow users to place and view multiple Looks, graphs or tables in a single place — for example, a range of related KPIs viewed side by side. Dashboards are interactive and customisable. For instance, you can put several Looks into one dashboard and add a filter that acts as a master control, affecting each Look within that dashboard in the same way.

Both Looks and dashboards can be shared with anyone, helping everyone get on the same page and view and understand the data easily.

Filtering looks and dashboards

Both Dashboards and Looks have filter functionality, and toggling their filters gives users greater flexibility and specificity based on the filters’ hierarchies. For example, selecting a Dashboard filter for a particular year applies that filter to all the Looks in that dashboard by default.

However, you can also choose which Looks within a dashboard are affected by that filter. This enables users to set a dashboard filter for a particular year, then apply a separate filter specific to certain Looks and disable the default dashboard filter for them. In other words, you can apply one filter to all the Looks in a dashboard, or apply different filters to individual Looks within an overall Dashboard filter.

Types of visualisations

Looker features a rich variety of visualisations that allow you to present, read, and understand data in different ways, including: 

  • Cartesian charts, i.e. any chart plotted on x and y axes, including column, bar, line, and scatterplot charts 
  • Pie and donut charts 
  • Progression charts, including funnel charts and timelines 
  • Text and tables, including single value charts, single record charts, and word clouds
  • Maps, including Google Maps
  • Custom visualisations

There are also 40 visualisations available via Looker Studio, previously known as Google Data Studio, as well as custom visualisations created by Looker’s partners. As mentioned above, Looker’s blocks — and Viz blocks in particular — can be used to quickly and easily create data visualisations. 

These visualisations are hosted by Looker, and you can add them to your Looker instance, allowing for seamless visualisations with powerful functionality, including the ability to drill down, download, embed, and schedule data. 

Suggested reading: For a broader look at how you can leverage your company’s data to drive business success, take a look at our guide: 9 Practical Steps to Building Your Data Strategy.

Pros and cons of Looker visualisations

Now you have a solid understanding of what Looker is and how it works, but how do you know if it’s the right choice for your business? In this section, we’ll look at some of the pros and cons of Looker visualisations. 

Looker Pros:

#1 Cloud-based + browser-based

Looker has all the advantages you’d expect from a cloud-based data analytics platform, including advanced security, high performance, and seamless accessibility. And because you access it directly through your browser, you don’t need to worry about software installation or manual updates and maintenance.

#2 Easy Git integration

Looker allows users to integrate the popular version control system Git, enabling multiple people to work on visualisations simultaneously, record changes, and manage file versions. Looker users can see changes made to data-modelling layers, jump back to them at any time, and create different version strands in repositories that developers can then work on.

While the integration isn’t enabled by default, it’s easy to configure and provides a benefit that many other data visualisation tools don’t.

#3 Connects with multiple data sources

Looker can connect with and visualise data from multiple disparate sources, including Google Cloud, Microsoft Azure, Amazon Web Services (AWS), on-premises databases, and a range of database software.

And as a Google browser-based product, Looker easily integrates with Google’s entire suite of browser-based applications. This makes sharing Looker dashboards quick and easy, with no downloading and little set up required.

#4 Self-service analytics

Thanks to Looker’s LookML data-modelling language, users can define dimensions, metrics, aggregates, and relationships. These are then used to populate Looker’s data visualisations, providing users with seamless self-service analytics, while enabling them to reuse data and calculations.

Looker Cons:

#1 Limited range of visualisations

While Looker is a perfectly effective and highly popular data visualisation platform, the variety of out-of-the-box visualisations is somewhat limited — especially compared to competing data analysis and visualisation tools like Tableau. That said, the ability to build custom visualisations goes some way towards mitigating this issue.

#2 More expensive than direct competitors

When compared with its closest competitors — for instance, Microsoft Power BI and Tableau — Looker is the most expensive of the lot. Businesses looking to cut costs may be tempted to look at one of the cheaper, but no less popular, options on the market.

#3 A steep learning curve

Looker isn’t the type of product you can just pick up and play with from the start. Before you begin visualising data, you need to define a semantic model in LookML, which then translates into SQL. This is to ensure that the underlying data is all drawn from the same place and matches up. 

LookML is designed to make things easier — and it does once you understand how it works — but without the right in-house expertise or outside training, it can be a while before you get the most out of Looker and improve your ROI.

Pros                                   Cons
Cloud-based + browser-based            Limited visualisations
Easy Git integration                   High cost
Connects with multiple data sources    Steep learning curve
Self-service analytics                 Requires expertise to maximise results

Suggested reading: While Looker is a solid choice for many businesses, there are other business intelligence and data visualisation tools on the market. For a closer look at one of Looker’s direct competitors — Microsoft Power BI — check out the below article: 11 Benefits of Using Power BI for Data Analytics

How to create visualisations in Looker

As a visualisation tool, Looker strives to make creating visualisations as easy as possible. Creating Looker visualisations involves the following simple steps: 

  1. Create and run a query in Looker
  2. Click on the Visualisation tab
  3. Select the visualisation type you want to use 
  4. Select Edit to configure and customise your visualisation

Now, let’s look at some key parts of this process in a bit more detail.

How to choose a visualisation type

Once you’ve created and run a query, click on the visualisation tab. You’ll then be able to choose a visualisation type by selecting one of the chart buttons at the top of the screen. To view more visualisation options, simply click on the three dots to the right of the chart buttons. 

Each option displays your data in a different way, and some options are better suited to certain types of data than others. If you’re measuring the change in a value over time, for example, you’ll be well served by a Cartesian chart, with the time-related data making up the x (or horizontal) axis. Meanwhile, if you want to visualise how values are proportioned in relation to each other, a donut chart is your best bet. 

How to customise visualisations

Once you’ve selected one of the visualisation types, you can play around with the configuration options to make the data more readable and customise the look and feel of the visualisation. 

Each visualisation type has its own unique configuration options. In a column chart, for example, you can choose whether you want the data to be grouped or stacked, what kind of spacing you want between columns, the colour of each column, etc. Have a play around and see what works for you. 

Creating multiple visualisation types

Looker also allows you to create multiple visualisations within a Look. For example, you might combine a column chart and a line chart in one Look to compare data or provide additional insight and context.

To do this, follow these steps:

  1. Click on the Edit button to display the customisation options for a particular visualisation
  2. Select the Series tab
  3. Go to the Customizations section and click the arrow next to the particular series
  4. Go to the Type box and select the visualisation type you want for that series

Getting the most out of Looker

Looker is a powerful BI and data visualisation tool that helps you start visualising your data and making intelligent business decisions. But you can only do that once you know how to use it and get the best out of it. The companies best able to visualise their data are best positioned to use that data to drive decision-making.

Without in-house expertise or the right training, the steep learning curve and technical know-how required to maximise its potential can hurt your ROI, and squander the potential within your data. This is where Jarmany can help. 

With our Looker consultancy services, we’ll help you to get the best out of the platform, ensuring that your business capitalises on its powerful data visualisation capabilities. Our team of experts has the experience you need to build visualisation solutions tailored to the unique needs and goals of your business, enabling you to:

  • Master Looker’s native language, LookML
  • Create bespoke visualisations that simplify complex data sets
  • Drive data-driven decision-making across your organisation.

To find out more about how Jarmany could help you use Looker to drive business success, get in touch with one of our experts today.

Contact Us