The Key to Building Smarter Software: Deliver Insights





Three years ago I spent a lot of time looking at SaaS business intelligence companies. I loved what I saw in the demos: easy data connections, slick-looking graphs, powerful drill-down tools and custom dashboards made the tools look like no-brainers. And then I began my diligence calls. All of these bells and whistles, I learned, were useful for data analysts but mostly worthless for regular users. Customers didn’t want to become data analysts; they wanted the software to do the work of the data analyst.

It then dawned on me that there’s a massive mismatch between the areas where vendors focus—namely graphics, dashboards, query and reporting tools—and the reality of customers’ needs. No one has time to dig through dashboards, graphs and reports. And customers don’t want to spend any time in your application unless they absolutely have to.

It turns out, this mismatch doesn’t just apply to business intelligence tools but to any software that manages data. Take Salesforce’s Sales Cloud for instance. It collects and manages a ton of data, but does very little to proactively analyze that data and provide insights. Wouldn’t it be better if Salesforce emailed you when it detected interesting insights instead?

This is exactly what I heard from customers. They want an email or SMS alerting them to an abnormal condition, stating the insight along with as much information as possible about the issue’s root cause. Here’s an example in a sales context:

We are projecting that you will miss your plan for Bookings this quarter.

Your Plan is $2.5m; Based on the data we have, we project Bookings will be $400k below plan.

This is because you have too few Opportunities in the pipeline given your historical conversion rate from Opportunities to Closed Deals (55%). The total of all Opportunities in the funnel that are projected to close this quarter is $3.8m.

Western Region appears to be the problem. More specifically, reps John XX and Kate XX are below their targets. Click here to see further information. (This is where you get to bring them back to your application with graphs and query tools to dig deeper.)
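The arithmetic behind an alert like this is simple enough to sketch. Here is a minimal Python example using the hypothetical figures from the alert above (the function name is mine, not from any particular product):

```python
# Hypothetical figures taken from the example alert above.

def project_bookings(pipeline_total, conversion_rate):
    """Project quarter-end bookings from the open pipeline and
    the historical opportunity-to-closed-deal conversion rate."""
    return pipeline_total * conversion_rate

plan = 2_500_000       # quarterly bookings plan
pipeline = 3_800_000   # opportunities projected to close this quarter
conversion = 0.55      # historical conversion rate

projected = project_bookings(pipeline, conversion)  # about $2.09m
shortfall = plan - projected                        # about $410k, i.e. roughly $400k below plan

if shortfall > 0:
    print(f"Projected to miss plan by ${shortfall:,.0f}")
```

Run against live CRM data, the same calculation would drive the first two lines of the example alert.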

To some, this may sound like science fiction, but in reality it’s not that difficult to pull off. Without realizing it, we interact with smart software all the time. Amazon automatically recommends products we might like. Nest optimizes thermostat settings. VideoIQ even figures out when someone is about to commit a crime. The key is that all of these products anticipate what a user wants, and then do it automatically.

As all software developers will tell you, making things easy for the end user usually requires hard work by the developer. So what is required to build smart data analytics software that can automatically and proactively deliver insights to users?

1. Start with Application Focus

This step only applies to platform vendors: stating the obvious, to really provide great insights you will need to focus on a specific application and move beyond being a broad horizontal platform. That way, the information in the system becomes understandable: e.g. this data represents bookings, and is not just a bunch of numbers.

BI vendors get this, and have created a set of applications that are built on top of their platforms. Some of these are really good. But they are still short of what customers are hoping for.

2. Figure out the Important Moments

Most of the time, there is no need for a human to look at data, as everything is behaving as expected. This is one of the problems with dashboards: if you keep going back to them and there is nothing unusual to observe, you will soon stop using them. Customers want to be alerted when something problematic or out of the ordinary has happened.

BI tools usually provide alerting functions. But those that I have seen are too simplistic, and require the user to define the rules for what counts as an exception. Once again, the software is not being smart; it expects the user to do the work instead of figuring out how to do that step for them.

Here are some initial thoughts for how to detect the unusual events which require human attention:

Baseline the data

Most data follows a pattern, and that pattern can be discerned over time and used as a baseline. Automated pattern-recognition techniques can then be used for anomaly detection when data deviates from established norms.
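As a rough illustration (not a production anomaly detector), a baseline can be as simple as a historical mean and standard deviation, with an alert when a new value lands too many standard deviations away; the threshold below is an assumption:

```python
from statistics import mean, stdev

def is_anomalous(history, latest, num_stdevs=3.0):
    """Flag `latest` if it deviates from the historical baseline
    by more than `num_stdevs` standard deviations."""
    if len(history) < 2:
        return False                  # not enough data to establish a baseline
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return latest != baseline     # flat history: any change is unusual
    return abs(latest - baseline) > num_stdevs * spread

# Illustrative weekly bookings figures
weekly_bookings = [210, 190, 205, 198, 202, 195]
print(is_anomalous(weekly_bookings, 200))   # within the normal pattern
print(is_anomalous(weekly_bookings, 120))   # worth an alert
```

A real system would layer in seasonality and trend, but the shape of the idea is the same: learn the pattern, then alert on deviation.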

Look at Budgets or Forecasts

Many times there will be a budget or forecast for the data that can be used to determine the “normal” or “expected” behavior of the data. Then, when things vary from that, create an alert.
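A variance check like this is easy to sketch; the 10% tolerance below is an illustrative assumption that a real application would tune per metric:

```python
def variance_alert(actual, forecast, tolerance=0.10):
    """Return an alert message if `actual` deviates from `forecast`
    by more than `tolerance` (a fraction), else None."""
    if forecast == 0:
        return None                   # no meaningful baseline to compare against
    variance = (actual - forecast) / forecast
    if abs(variance) > tolerance:
        direction = "above" if variance > 0 else "below"
        return f"Actual is {abs(variance):.0%} {direction} forecast"
    return None

print(variance_alert(80, 100))    # 20% below: alert
print(variance_alert(102, 100))   # within tolerance: no alert
```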

Use Application knowledge to determine what is abnormal

In many situations, simple knowledge of the application will help the software developer recognize what abnormal data looks like. For example, in storage management you know that a disk nearing 90% of capacity is an alert condition.

Send regular updates when nothing is abnormal

So far we have only talked about alerting when there is abnormal activity. But in most applications it is also useful to be told at regular intervals that nothing is going wrong. By itself, this absence of any problems is actually an insight.

This is best accomplished with a regular email letting users know what data is being monitored and that there are no exceptions. Perhaps this takes the form of a list of high-level data items with a green status light and possibly a graph or numeric value. As an example, it would be nice to get a monthly email from your accounting software telling you that bookings, revenue, margins, expenses, and cash were all as expected, as opposed to only getting an email when they were significantly above or below plan.
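Such a monthly digest could be sketched as follows; the metric names and plain-text format are illustrative only:

```python
def build_digest(metrics):
    """Render a plain-text monthly status summary from a list of
    (metric_name, is_on_plan) pairs."""
    lines = ["Monthly status summary:"]
    for name, ok in metrics:
        light = "GREEN" if ok else "RED"
        lines.append(f"  [{light}] {name}")
    if all(ok for _, ok in metrics):
        lines.append("No exceptions detected. Everything is on plan.")
    return "\n".join(lines)

print(build_digest([("Bookings", True), ("Revenue", True),
                    ("Margins", True), ("Expenses", True), ("Cash", True)]))
```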

3. Determine the Root Cause

If your software determines that bookings are about to miss plan, that is somewhat useful. But it immediately raises a question: Why?

Normally to find the answer to that question, you would need to “drill down” into the bookings chart to figure out the root cause. Was it because one of your regions is underperforming, because a certain product didn’t meet the expected sales target, or because the overall sales productivity is lower than expected?

What we see from this is that most data is hierarchical in nature. For each application, there will be a small number of really important high level metrics. But behind each of these there will typically be a hierarchy of supporting metrics that help understand the root cause of a problem if the high level metric is abnormal.

Let’s take Profit as an example. If we missed our profit target, we would start looking at the following components to see where the problem had come from:


Then, if the problem turned out to be in Bookings, we might dive deeper and look at the following set of components:

Knowing this hierarchy makes it possible for smart analytics software to do the drill down for the customer instead of making them do the work.
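One way to sketch that automated drill-down: represent each metric as a node carrying its variance against plan and its child metrics, then recurse into whichever children are off plan. The hierarchy and figures below are illustrative, not taken from the diagrams above:

```python
def find_root_causes(metric, path=()):
    """Walk the metric hierarchy and return the deepest off-plan metrics,
    each as a path from the top-level metric down."""
    name, variance, children = metric
    here = path + (name,)
    off_plan_children = [c for c in children if c[1] < 0]
    if variance >= 0 or not off_plan_children:
        # A leaf (or an on-plan node): report it only if it missed plan.
        return [here] if variance < 0 else []
    causes = []
    for child in off_plan_children:
        causes.extend(find_root_causes(child, here))
    return causes

# Each node is (name, variance_vs_plan, children)
profit = ("Profit", -400_000, [
    ("Bookings", -400_000, [
        ("Western Region", -350_000, []),
        ("Eastern Region", -50_000, []),
    ]),
    ("Expenses", 0, []),
])

for cause in find_root_causes(profit):
    print(" > ".join(cause))
```

The output paths are exactly what the alert email can cite as the likely root cause, saving the customer the manual drill-down.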

4. Work with a Domain Expert to figure out the key Insights

In addition to using the hierarchy to figure out what is important, it also makes sense to ask an executive in the application domain to walk you through the key insights they are after, and how they would go about diagnosing common problems.

If we were talking to a VP of Sales, they might tell you that they want to know the following:

  • Am I going to miss the quarterly bookings number?
    • And they would tell you how they would diagnose a problem by looking at contributing elements as I have attempted to show in the diagram above
  • Is the pipeline being built so I am well placed to hit my number for the following quarter?
    • Requires me to look at deals that are at an earlier stage in the funnel
  • Which of my sales people aren’t performing?
    • Which part of the selling job is a problem for them (e.g. demos)? (Allows the VP to take remedial action such as additional training.)
      • Who are the best salespeople at that sales function (i.e. giving demos)? Use the best people at giving demos to train the people who are having trouble with their demo-to-closed-deal conversion rate.

Once you know what they care about, work on setting up the metrics in such a way as to allow you to provide them with those Insights.

5. Can you also recommend a remedial action?

This may be a stretch, but in many cases it might also be possible to suggest remedial actions. For example, in a sales application your data may show that some market development reps are not getting a good connect rate with customers after sending an initial email (i.e. they are probably not doing a good job of researching the customer and writing a compelling email). You might look for the reps that have a really high connect rate, and suggest that the sales manager consider using those reps to coach the problem reps.


If you are building software that generates, collects, or manages data in any way, ask yourself: Can customers easily gather insights from my data? There is a remarkable opportunity for us to build smarter software that gives customers what they want; it just takes a little more work.

About the Author

David Skok


  • Matthew Bellows

    Great post David. The mantra we use for this kind of design work is imagining a VP of Sales or Sales Manager saying “Tell me what works, and show me what to do about it!”

  • Robert La Ferla

    Insightful as usual. “Send regular updates when nothing is abnormal” is a good and not always obvious practice.

  • khelal

    Great post (as always) and funnily enough it’s something we started working on a few months ago in our construction bidding platform… and so far private demos with key industry players have been amazing.

  • Mark Organ

    I’ve been saying for a while now that SaaS 3.0 is going to be all about best practices, benchmarking and recommendations – not business process automation. In fact I think that smart SaaS entrepreneurs will consider giving away what today is thought of as “core functionality”, the business process, to focus more on the truly value added components. This requires a different product architecture and distribution model, and I think has the potential to be just as disruptive as SaaS 2.0 (multi-tenant SaaS).

  • Mark Organ

    As a great example of what I think is SaaS 3.0, my favorite is BloomReach. I pity whoever has to compete against those guys.

  • A Gregori

    Thanks David – post right on target as usual! I am currently working with Tableau to generate smart client services and sales insights – anyone with experience or recommendations to share?

  • David

    The insight is profound. In my (painful) experience, there are two non-obvious barriers to achieving this desirable end state.

    First, whomever designs the business-intelligence system has to have a deep understanding of the organization’s model. I say deep because reality, up close, is not as simple as it appears from a distance. Let’s assume an enterprise tech company is currently constrained in promotion (the cadence of the firm as a whole is determined by the inbound flow of sales opportunities).

    If the constraint shifts suddenly to engineering (as it is wont to do), the BI system must understand the significance of this shift, else the operational signals it sends will be precisely wrong.

    The second barrier is connected to the first. A deep understanding of the organization’s model requires that there actually is one! If, as is often the case in larger organizations, each department is optimized in isolation from the business system as a whole, any attempt to model the system as part of the design of BI will result in conflict and compromise.

    In such a case, it will be easier to adopt a tick-a-box approach to the provision of BI than it will to try and deliver the utility that you promote.

    I suspect the trick for organizations taking BI to the next level, per your suggestion, will be to pick organization types where the model is explicitly understood and the business is already managed as a cohesive system.

  • David – Spot on and related to something I raised with you a year or so back.

    Take an example of analysing energy consumption in buildings:
    Five years ago the data was a monthly gas and electricity meter reading if you were lucky.

    Fast forward to “smart metering” – or more accurately, automatic meter reading.

    A manager with 100 sites (or several thousand) gets 48 readings per meter per day, and may have tens or hundreds of sub-meters per building.

    The energy manager used to have a high level approximate abstraction of the myriad detail that had swamped the so-called building energy management systems.

    If you have seen hundreds of building controls systems with a pre-programmed “acknowledge all alarms”, you have seen a precursor to the dashboard disaster that is facing managers.

    They need triage, interpretation and workflow support. And little more (a bit of fiscal or compliance reporting).

    This is not what they are offered – they have thousands of graphs (paid for by subscription) with which they never engage. The folly of SaaS based on a premise of subscriptions to fund commodity services is obvious – it is a race to the bottom on price, followed by churn.

    What is emerging is domain knowledge applied to data. Monitor everything and act only on what you need to. This requires opinion regarding priority and cause, or “diagnosis”, implemented as filters.

    The job of most SaaS should be productivity support, where judgements that can be expressed as algorithms can save time and focus resources to pain points.

    This is not what big data and data mining or most BI tools offer, which are just huge slice-and-dice machines with no skills at feature extraction.

    Since our last chat we are now seeing revenues and as promised will get in touch when our metrics hit those magic numbers!

    Have a great time over the festive season – and thanks for the read

  • I would point out that benchmarking assumes homogeneity.
    Distinguish two forms of pattern recognition.
    a) This child should be able to run faster because children of her age can.
    b) This child is deaf – and so responds to the starting gun slowly after movement by others.

    The first is a benchmark, the second a diagnosis.
    If your SaaS supports homogeneity a) will do, but if there is expected heterogeneity (in people, buildings, widgets, organics, designs) – then only b) will do more than create bad false negatives and positives – because sometimes a skilled practitioner must select the correct classifier.

    If investing, invest in the latter – detection of homogeneous benchmarks can be completely automated from input data, and is itself a commodity solution. (Not defensible without network effects.)

  • David,

    Thanks for this fantastic post, once again.

    We’ve found better analysis and cleaner graphs help get the “hard to reach” data out of the CRM. However, sales managers still have to do a lot of the time-consuming thinking and spreadsheet jockeying when their time is most valuably spent coaching, training, and hiring.

    You’re hitting some key points we’re working on at Rivalry.

    Fantastic post.

    CEO | Rivalry

  • Josh Payne

    David, I feel like you just described the key elements of our product at InsightSquared, and positioned us very nicely against the typical platform-centric BI vendor. Busy sales and marketing leaders don’t want to spend their time building reports. They want the reports built for them quickly so they can spend time figuring out how to take action on problems based on their own company’s strategy and scenario.

    I will say that parsing out a root cause can become much more complicated in practice. My favorite example of this is metrics that measure the quality of leads being generated. Oftentimes, “time to convert” the lead to a sales opportunity will be a metric to track. If that goes down, is that a good thing? Maybe it is, because your marketing team is bringing in good leads and they’re converting quickly. Is it a bad thing? Maybe it is, because your sales team is starved for leads so they’ve lowered their bar and are converting more.

  • Whole-heartedly agree with your thesis (which dovetails uncannily with our founding vision at BrightFunnel). Insights have to be delivered to the decision-maker, at the point of action or collaboration. Mark nails it in his description of the future – it will require benchmarking, best practices and recommendations to be baked into applications. This means that this isn’t just an analyst or C-level audience (which traditional analytical apps focus on). Let’s take the marketing function as an example. The relevant audience spans from the CEO/board (“It’s Q4, but we know you’re not investing enough in marketing to hit your Q2 revenue number, here’s the lever to pull”) to the demand gen manager (“webinar attendance 30 days after Opportunity creation for segment X reduces sales cycle by Y. You should target this audience with this specific campaign”). That said, I don’t think it’s fair to expect Salesforce, Marketo and Eloqua to solve this problem on their own. A new type of tool is required to solve this problem.

  • Mark Organ

    Very clever comment, thanks James. I agree that diagnosis and recommendation is the gold standard of value, a true expert system. I still see value in benchmarking however. It’s important for the user to know how he/she is doing relative to others. And I do think that network effects can be built in for savvy SaaS entrepreneurs who explicitly have this as their goal. We’re giving it a shot with our company, Influitive.

  • Yes, you’re spot on in your assessment. It will be interesting to watch both SaaS 3.0 apps, as you describe them, in the form of next-gen CRM, say, as well as next gen analytics.

  • Oliver Churchill

    David — Bravo. It takes a different set of skills, working in concert, to provide the types of proactive insights you describe. BI tools are too simple. Stats tools are too complex. Human intervention for reacting to the specifics described below is not scalable or timely enough. Throwing answers to math problems over the wall to smart, savvy sales pros is like a tree falling in the forest. Making complex things like Big Data (Ok, sometimes small in the sales context), digestible and action-oriented for sales folks is indeed a worthy challenge that requires … new technologies!

  • Hi Mark – Influitive – so I guess you know @wmougayar (cool guy).

    Note: if “clever” sounded demeaning, it wasn’t meant to be. I agree network effects are possible (absolutely going for it too ;), and we also see value in benchmarking (especially within supposedly homogeneous sets for identifying outliers)

    – however sometimes the “gold standard” as you describe (and we deliver it some of the time ;) is often more relevant than the benchmark.
    Simple real-world example: two schools have energy bills that differ by 30%. One is in a town and the other on top of a hill (but with essentially the same local weather). Though intrinsically identical, their extrinsic features are very different and not available for benchmarking (because externalities are an infinite domain), so big differences in energy spend cannot be explained by insulation, plant, or controls, but instead are driven by windspeed, solar gain, humidity, air freshness and other aspects of microclimate.

  • David, I really love the framework in the article.
    SaaS 3.0 is all about Smart Software (combining your keyword and Mark’s).

    Other characteristics of smart software are:

    Active – it does not serve as a ‘warehouse’ or information storage; it continuously monitors changes and actively creates notifications.

    Radiant – insights are pushed to where users are vs. requesting users to ‘come and have a look’. Think push vs. pull.

    Succinct – actionable insights (the holy grail…) are delivered in a clear and short format, with just the right information and not more than that.


  • You got it!

  • Paul

    Have a look at the startup 03july. Big data, making things easy for the end user. A no-brainer app: the first app, “hungry now”, on iOS and WP is a hit!

  • evanish


    How do you think about Data quality? One of the biggest challenges of any business doing analysis is how clean the data is.

    If the sales team doesn’t enter data religiously then tools like SalesForce won’t help, and if you’re trying to tie in database events or tracked events like an analytics tool does, it’s only as good as what an engineer configures. Having reports that sometimes look off is one thing, but if you start receiving automated projections and analysis from that, you can end up drawing false conclusions, which will quickly cost you the trust of your customer.


  • Tom Krackeler

    Terrific post David. One of the key aspects you mention is the need to provide context or explanation to the end user as part of an alert (e.g. “we project a quarterly bookings miss due to too few deals in the pipeline given historical conversion rates”).

    I’ve found that people are resistant to system-generated insights that feel like they are being produced by a black box (e.g. “the model says we are going to miss bookings”) – even if the system is smartly understanding baseline patterns and detecting anomalies. They want to be able to relate these findings back to their own intuitive understanding of the causes & effects of various indicators within their business. So one of the key challenges is how to develop and communicate alerts so that they are framed within the existing business understanding of the recipient.

  • Jack Derby

    Terrific post! It’s a very exciting time to be in sales and certainly over the next two to three years, we’ll be more and more intrigued by new applications for data collection, interpretation and day-to-day use in the real world. If you take as an absolute (which is what data has shown for 20+ years) that the #1 reason that the top B2B salespeople push themselves to become the most successful is peer recognition, then having the best analytical data provides a unique opportunity to share best practices…among peers.
    We have a unique perspective of observing the sharing of data and analysis about a group of professionals and that’s at the Boston Police Department, where we have been leading their business strategy process for a number of years. When Bill Bratton (signing up again as PC in NYC) first came to the BPD 15 years ago, he drove down crime by bringing his officers together every day in rapid fire meetings where the use of data changed the process of community policing forever. Ed Davis has taken this to a much higher level by sharing the data openly among peers and pushing them to assess root causes and develop standard improvement processes and tactics to share among themselves. Everyone wants to be the top cop in the top district. Not coincidentally, violent crime has been reduced every year for the last five, and will be a record low again this year in Boston.
    Take that thought process and this level of data analysis and sharing and bring that down to a peer-to-peer review session held during the first few days of every month in order to discuss and decide on the best practices to employ for the next 90 days. Don’t dwell on the results from the prior month too long, or else everyone will turn off. Let the top producers (Not you as the manager. Your job is to facilitate only) educate one another in their best practices. Use the data to define what worked, what didn’t and what continues to require further experimentation over the next 90 days. Recognize the best producers, and, when properly managed and facilitated, the impact of this simple, but formal, peer-to-peer learning process will become an extraordinarily impactful tactic toward consistently exceeding your quota.

  • Bill Butler

    Great topic – having deployed Sales Cloud in a number of companies, the most common response from Sales Reps is “it does not help me sell”. Sales Reps don’t update SF consistently and the data is often incomplete or inaccurate. Customer Insight is the holy grail, and Marketing and Sales Content can play a big role in understanding the Digital Body Language of the Customer. What content has been viewed and by whom will impact sales strategy and also help a company understand which opportunities are ready to be closed.

  • Andrew Sytsko

    David, thanks for the great post (as usual).
    This time it struck me that a framework illustrated with and conceived for the sales and marketing domain can actually be applied word for word to the IT management space. Here we see the same abundance of dashboards and configure-it-yourself alerting tools, and the same lack of truly intelligent and helpful software.

  • Philip Holt

    Personalization of any product, be it consumer-facing or alerts and recommendations in an analytics product, is entirely dependent on data quality. I used to believe personalization was an algorithm problem, but have come to view this as a data quality problem. Surprisingly few people seem to be focused on data quality at the source of data generation.

    I completely agree with David’s core argument: most consumers of data want updates, alerts, and notifications to inform them of meaningful changes or insights. We believe data contextualization is a necessary foundation for adequately addressing personalization like this downstream.

  • Mark Organ

    To clarify, I think that what is missing from many of the current crop of SaaS companies is insight gleaned from the universe of users and accounts in the system, as opposed to insight from the account itself. This is not easy to do – architecturally you need a system that is designed to provide this across-userbase and across-account analysis, benchmarking, best practices and recommendations. The business model may also be quite different, and probably should be. This is SaaS 3.0 to me, oriented around big data at the core, there are only a few examples out there that i have seen, but they are powerful and disruptive.

  • Andrew, I’m glad you saw that, as I very much wanted to convey the notion that this was a broad opportunity, not just restricted to sales. Best of luck if you’re going in that direction.

  • Jack – thanks for the great example.

  • Tom, you raise a great point. I wonder if this could be overcome with time if the user realized that the insights were consistently good and trustworthy? But if the insights are poor, that’s not going to work. But you’re right about the challenge.

  • Jason, great point to raise, particularly with regard to implementations. This is one of the great problems, which was also raised by another commenter, Bill Butler. Other applications get better data as they don’t rely on humans for collection. I see several alternatives emerging that provide more value to the sales rep and automate the collection of the data. That has a chance of solving the data quality problem in the sales domain.

  • The other approach is not to have reps typing in the CRM in the first place! Our clients provide field reps with office-based executive assistants. The latter live in the CRM and the former never touch it! The upside is that field reps do 10-20 meetings a week, instead of 2 (even in enterprise-sales environments).

  • I agree on the black box generated warning, but wouldn’t it be cool if we just had a “show me why” button that shows the logic in a dashboard?

  • Super blog post. Spot on, as always.

    Users are not as proactive as we as vendors might think they are, or want them to be. But there is one little challenge here, as you briefly touch upon. What may seem interesting to one customer is irrelevant for another. That makes these “trigger mails” somewhat difficult to come up with or predict – and that’s where most of the work lies: discovering WHAT will be relevant information and WHEN it will be relevant. That probably explains why so many pick the shortcut: having a dashboard.

    The strategy you describe is similar to a pull-and-push marketing strategy: make sure to have the right valuable information available when your target group needs it, instead of just putting up random ads.

    This is why I recommend a dashboard that responds to each user and displays relevant data when the user logs in – instead of just a basic dashboard of the type you describe. At Billy’s Billing, accounting software for small businesses, we use notification windows as an essential part of our dashboard. They can sum up and call attention to points where the user needs to take action. This has proven pretty workable.

  • Oliver Churchill

    Yes. It helps if there is a way to explain the “why” and combine the insights from modeling with human inputs. It helps the human being to feel good and take ownership of the recommendation, incorporates externalities and insights that modeling would not know about, and improves the outcome, statistically and related to behavior/action — and results, which is the goal. Getting the “answer” from a calculation is just the beginning.

  • Thanks Toke. You make a good point about the relevance to specific users.

  • You’re welcome. Keep posting your valuable blog posts – they are very useful for me and my team.

  • The “Show me why” button is what I intended to mean by the comment at the very end of the Insight, “click here for more details”. But I like the wording “Show me why” far better.

  • Yes – very clearly makes sense in that context. Glad to hear you are making good progress!

  • Lauren Kelley, OPEXEngine

    David, terrific post as usual. You provide a really clear description of applying Mark’s definition of SaaS 3.0 being a combination of software, best practices and benchmarking to management dashboards and BI. This is exactly where we are taking OPEXEngine’s benchmarking. We’ve been working for a number of years sorting companies by comparable business model characteristics, i.e., don’t compare yourself to your competitors selling the same product or in the same market, because inevitably in every market there will be a couple of large competitors, some medium-sized companies and a few start-ups, so aggregating their financials doesn’t tell you anything. Aggregate data for similar company stages of development with similar business models, and even then look at a couple of different peer groups.

    On top of that, you need to understand the goals of the business so that your automated insights and alerts can differentiate among varying high level goals of the business (is your goal early customer acquisition and product/market proofing, rapid scaling and market domination, or profitability, for example), and then being able to show/analyze the data comprising the root cause of the summarized alert requires real domain expertise and understanding of best practices.

    I also think that the issue of what data you use and whether it is “good” data or unclean is critical, especially if you want to be able to differentiate among small sub-groups so that you can have individualized KPIs. We start with very clean data; part of the reason that we charge for our benchmarking is to ensure that we work with every company to get consistent data (and the companies benefit from the fact that we work with hundreds of companies to define the metrics tracked).

    The cool thing about SaaS 3.0 is that it will take company management to whole new levels of business efficiency which we are already seeing as a trend in our benchmarking. Of course, hopefully it will be coupled with commensurate improvements in customer productivity gains or you may see something similar to what we saw in the late ’90s in enterprise software where really productive sales organizations sold more software than customers could use. Somebody needs to crack the nut on a really accurate tool for defining and quantifying value prop…

  • Craig Rael Harris

    I would add one warning to this incisive article:

    Ensure that the user doesn’t lose a sense of ownership of the process.

    We have been developing software that applies the principles in this article (we have a remarkably similar approach to your ‘hierarchical data’ – what we call ‘Value Trees’). Smart software that runs through a series of algorithms to provide insight needs to keep the user involved in that process while shielding them from the work – even a simple UI solution, like a button press that triggers the algorithms and produces the insights, will keep the user invested in the process.

    If the user doesn’t, at least partially, understand the underlying algorithms and feel part of their execution, then the sense of ownership is lost and there is a reduced incentive to act on, or rely on the information.

  • Good point. Thanks for adding.
