Better, Faster, Cheaper: Evolution is Constant, But Major Changes Only Happen From Time to Time

Arman Eshraghi
4 min read · Feb 18, 2021
Photo by Matthew T Rader on Unsplash

For as long as any of us can remember, products in any given market have continually gotten better, faster, and cheaper. This is the natural evolution of a marketplace, and it is what keeps the market alive. Sometimes buyers expect or demand improvements; sometimes improvements arrive on their own, the result of revolutionary changes, especially in technology, that make things better, faster, or cheaper. Continuous improvement is good, but many companies get stuck in that groove and risk missing out on innovation.

Markets, in general, are efficient at meeting buyers’ needs, and each product has characteristics buyers are looking for. It starts when an unmet need is identified. Consider data and analytical software. Data, by its very nature, is complex, so much so that without the proper tools for end users, access would be limited to data technology experts alone, as it was early on.

The market saw the need and soon, there were a number of data and analytical tools that were developed for end users. Reducing the barriers to using data allowed more and more users to access, explore, and create their own analysis. This, in turn, fueled the market to provide even more tools. This segment of the market has seen explosive growth in the past 20–25 years.

In an effort to provide better analytical experiences, these software companies focused on supporting more data sources, more data types, and more types of output, such as complex visualizations. They have also continued to make their software easier to use by providing more self-service, more automation, and the ability to embed their software within other user tools.

Many improvements have been made to make data and analytical tools faster. Performance has been critical to keeping software working efficiently as data volumes and velocity have grown. Buyers expect a product that can continue to perform well over the long term. One thing is certain: data will only continue to increase over time, and it’s important that the software can keep pace.

As for cheaper, we’ve all heard of Moore’s Law: the observation that the speed and capability of our computers roughly doubles every couple of years while the price we pay stays flat or falls. The same dynamic holds for software. As data grows and more users work with it, the cost of performance, that is, the cost required to run a given data query, also trends down over time. This is where revolutionary jumps in cloud infrastructure, with services like serverless computing, have really accelerated the pace of cost declines.
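The compounding effect of that cost curve can be sketched with a little arithmetic. The halving period and starting price below are hypothetical illustrations, not figures from any particular cloud provider:

```python
def cost_per_query(initial_cost, years, halving_period=2):
    """Project the cost of running a query, assuming (hypothetically)
    that it halves every `halving_period` years."""
    return initial_cost * 0.5 ** (years / halving_period)

# Under this assumption, a query that costs $1.00 today would cost
# about 3 cents a decade from now.
print(cost_per_query(1.00, 10))  # → 0.03125
```

The point isn’t the exact numbers; it’s that halving compounds, so even a modest per-period decline turns into a dramatic drop over a decade.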

There have been quite a number of improvements in data and analytical software over the past 25 years. Things look very different from the late ’90s, yet there are still plenty of opportunities for new technologies to enter the market and bring new and better products. Those that have been around for years can continue to improve incrementally, but they will soon be out of date given the technology shifts currently underway.

Many data and analytical tools will soon be approaching legacy status. They are stuck in the improvement groove, continually releasing feature updates. However, the ground is shifting underfoot, and we’re seeing a revolutionary change that will render today’s products obsolete. This is one of the many reasons legacy business intelligence vendors struggle to meet the needs of SaaS providers with embedded analytics requirements: they simply weren’t built for embedding.

To keep up requires a brand new architecture, something that can really only be done by starting fresh. My previous analytics company was started around 2000 based on XML and Web technologies that became available back then. A few years ago, I started Qrvey to take advantage of new technology, such as serverless infrastructure, to meet the embedded analytics demands of SaaS companies. This also explains why Tableau’s days were numbered.

For SaaS providers, technology shifts can have a dramatic impact on their products. SaaS products serve users on two levels: the companies that purchase a license and those companies’ end users. SaaS providers that take advantage of technology advancements can have an outsized impact compared with single companies applying new technology to their individual needs, because a SaaS company serves a larger user base across hundreds or thousands of customer companies.

Over time, data and analytical software products will continue to improve. In a decade or so, new technology shifts will breed a new set of companies that use them to build new products, once again making things better, faster, and cheaper. As you evaluate software in any vertical, a key factor when looking to license a new product is its underlying architecture and technology foundation: make sure it takes advantage of current technological innovations.

For more insights and opinions, follow me on Twitter.
