Palo Alto, CA

Digits to Dollars: Maximizing GTM Impact with Analytics with Oz Guner


What we covered

In a world where humans can only process information one piece at a time, data analytics emerges as the game-changing tool that empowers teams to internalize vast amounts of data in a single sitting. However, effective data analysis goes beyond simply plugging in methodologies; it starts with a well-crafted strategy and identifying the right metrics. Oz Guner, Business Intelligence Analyst at The Predictive Index, joined us for a recent session diving into the significance of data analytics for GTM teams. Here is what we took from the session!


Data Analytics is Crucial for GTM Teams

Data analytics plays a significant role, or at least it should, for go-to-market (GTM) teams. While humans can only process one piece of information at a time, data analytics provides a wealth of information through visualizations, allowing teams to internalize large amounts of data in a single sitting. By leveraging data analytics, GTM teams can make more informed decisions, optimize strategies, and drive business growth. But the methodologies aren’t just plug and play; effective analysis starts at the top with a healthy strategy, which then leads to the right metrics.

These metrics, or key performance indicators (KPIs), can be grouped into two main categories: business-level metrics and unit economics metrics. Business-level metrics include bookings, annual recurring revenue (ARR), monthly recurring revenue (MRR), and annual contract value (ACV). Unit economics metrics, on the other hand, focus on the ratio of lifetime value (LTV) to customer acquisition cost (CAC), which indicates the profitability of acquiring and retaining customers. A healthy LTV to CAC ratio is around 3x, meaning the company generates roughly three dollars of customer lifetime value for every dollar spent on acquisition. Tracking these metrics requires data from various sources, such as marketing automation platforms for top-of-the-funnel data, systems of record like Salesforce for opportunity data, and ERP software like NetSuite for financial metrics.
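
To make the unit economics concrete, here is a minimal sketch of the LTV to CAC calculation. All of the figures below are invented for illustration and were not part of the session.

```python
# Illustrative LTV:CAC calculation with made-up numbers (not from the session).

avg_monthly_revenue_per_customer = 500.0   # MRR per customer, in dollars
gross_margin = 0.80                        # fraction of revenue kept after cost of service
avg_customer_lifetime_months = 36          # how long a typical customer stays

sales_and_marketing_spend = 1_200_000.0    # total acquisition spend for the period
new_customers_acquired = 250               # customers won in the same period

# Lifetime value: margin-adjusted revenue over the customer's lifetime
ltv = avg_monthly_revenue_per_customer * gross_margin * avg_customer_lifetime_months

# Customer acquisition cost: spend divided by customers acquired
cac = sales_and_marketing_spend / new_customers_acquired

ratio = ltv / cac
print(f"LTV = ${ltv:,.0f}, CAC = ${cac:,.0f}, LTV:CAC = {ratio:.1f}x")
# A ratio around 3x is the commonly cited healthy benchmark.
```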

Oz provided a very helpful KPI cheat sheet you can download here:



Quantitative vs Qualitative Analysis: There is Value in Both

Quantitative analysis involves various techniques such as regression, time series analysis, market segmentation, and clustering, which help organizations gain insights into relationships between variables, make strategic decisions, and predict future outcomes. These methods are valuable for tasks like sales forecasting, pricing optimization, predicting customer behavior, and determining lifetime customer value. However, relying solely on quantitative analysis is not sufficient, as it disregards factors like market conditions and human elements that affect business performance.
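
As a rough sketch of what one of these quantitative methods looks like in practice, a simple linear regression over historical bookings can produce a naive sales forecast. The data and approach below are illustrative assumptions, not anything Oz shared.

```python
# Minimal sales-forecasting sketch: fit a linear trend to monthly bookings
# and project the next quarter. Numbers are illustrative, not from the session.
import numpy as np

months = np.arange(1, 13)                                   # 12 months of history
bookings = np.array([110, 118, 125, 122, 135, 142,
                     150, 149, 160, 168, 175, 181], float)  # bookings in $K

# Least-squares fit of bookings = slope * month + intercept
slope, intercept = np.polyfit(months, bookings, deg=1)

for future_month in range(13, 16):                          # forecast the next 3 months
    forecast = slope * future_month + intercept
    print(f"Month {future_month}: forecast ~ ${forecast:,.0f}K")
```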

Teams have to learn how to use qualitative analysis in conjunction with quantitative methods. Qualitative analysis provides valuable insights by incorporating techniques such as natural language processing (NLP) to understand customer pain points and gather feedback from both existing customers and prospects. Guner mentioned that NLP tools like Gong enable businesses to analyze prospect conversations, identify sentiment, and extract meaningful keywords. This qualitative data supplements quantitative analysis by capturing customer sentiments and preferences that may not be evident in numerical data alone. By leveraging both quantitative and qualitative analysis, organizations can gain a holistic understanding of their market, refine their strategies, and make data-informed decisions.



Analysis Leads to Insights, Which Lead to Action

Going back to the natural language processing example, Oz highlighted the technique of tokenizing every word in Net Promoter Score (NPS) survey responses to identify patterns and associations between words. By aggregating and analyzing the responses, Oz’s team was able to identify specific keywords like “slow software” that indicated a need for resource allocation. This approach helped them address customer concerns and improve their product’s performance. The lesson learned here is that by employing qualitative analysis techniques, businesses can uncover valuable information hidden within customer feedback and take targeted actions to address pain points and improve overall satisfaction.
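
A stripped-down version of that tokenization idea might look like the following. The comments are invented, and the real pipeline almost certainly involves more preprocessing (stop-word removal, stemming, and so on).

```python
# Simplified sketch of tokenizing NPS comments and surfacing frequent words
# and word pairs. Comments are invented examples, not real survey data.
from collections import Counter
import re

nps_comments = [
    "The software feels slow when loading large reports",
    "Slow software, but great support team",
    "Love the insights, wish exports were faster",
]

def tokenize(text):
    """Lowercase a comment and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

word_counts = Counter()
bigram_counts = Counter()
for comment in nps_comments:
    tokens = tokenize(comment)
    word_counts.update(tokens)
    bigram_counts.update(zip(tokens, tokens[1:]))  # adjacent word pairs

print(word_counts.most_common(5))    # words like 'slow' and 'software' rise to the top
print(bigram_counts.most_common(3))  # associations such as ('slow', 'software')
```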

What sort of tools and resources are necessary for effective data analysis? Oz emphasized the importance of using a reliable business intelligence tool with features for data visualization, data transformation, and the ability to create business logic easily. He mentioned tools like Domo, Tableau, and Looker as examples. Oz also highlighted the significance of leveraging open-source resources for sentiment analysis, such as towardsdatascience.com and blogs by data scientists like Julia Silge. These platforms provide valuable guidance on sentiment analysis methodologies, allowing analysts to score and categorize comments based on their sentiment. The key takeaway here is that by utilizing robust business intelligence tools and exploring open-source resources, analysts can enhance their data analysis capabilities and generate more accurate insights, which should in turn lead to new strategies and tactics.
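
For the sentiment-scoring piece, one freely available starting point is NLTK’s VADER analyzer. This is an assumed example of an open-source approach, not the specific methodology referenced in the session.

```python
# Sentiment-scoring sketch using NLTK's VADER analyzer, one freely available
# open-source option; not necessarily the tooling discussed in the session.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

comments = [
    "The onboarding was fantastic and the team is super responsive.",
    "Reports are painfully slow and keep timing out.",
]

for comment in comments:
    compound = analyzer.polarity_scores(comment)["compound"]  # -1 (negative) to +1 (positive)
    label = "positive" if compound > 0.05 else "negative" if compound < -0.05 else "neutral"
    print(f"{label:>8}  {compound:+.2f}  {comment}")
```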

If you enjoyed this session, make sure to come to one of our upcoming events! You can see the lineup and subscribe at immersa.ai/events

Speakers

Mustafa Ozen “Oz” Guner
Business Intelligence Analyst III at The Predictive Index
 
Aseem Chandra
Co-founder & CEO @ Immersa