How Snowflake Integration Helped Graas Achieve 10X Customer Growth

Singapore-based Graas, which stands for “Growth as a Service,” is an e-commerce solutions provider that uses artificial intelligence and data analytics to help brands profitably grow their online businesses.

“We collect data from multiple sources for brands and eCommerce retailers, aggregate it in one place and run analytics on it. The goal is to support decision-making for these brands,” said Mangesh Panditrao, AI Director at Graas, in an exclusive interview with OBJECTIVE.

With the emergence of generative AI as a disruptive technology, Graas has also integrated it into various operations. Panditrao explained that they use various regression models to forecast sales, compare sales and identify anomalies.

Graas’ sales forecasting model stands out from the competition by using time-series data and taking into account external factors that affect sales. “This is very different from typical time-series forecasting because it requires knowing which external factors to use. In our case, we don’t just look at sales data, we also consider advertising spend, inventory prices, and discounts to build a model that can accurately predict sales,” Panditrao explained.
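Graas has not published its model, but the approach Panditrao describes, forecasting sales from historical time-series data plus external drivers such as ad spend, inventory prices, and discounts, can be sketched roughly as a regression over lagged and exogenous features. The column names and the scikit-learn regressor below are illustrative assumptions, not Graas' actual implementation.

```python
# Rough sketch of time-series sales forecasting with external regressors.
# Column names (ad_spend, inventory_price, discount_pct) and the model choice
# are assumptions for illustration; this is not Graas' actual pipeline.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

df = pd.read_csv("daily_sales.csv", parse_dates=["date"]).sort_values("date")

# Lagged sales capture the time-series component.
for lag in (1, 7, 14):
    df[f"sales_lag_{lag}"] = df["sales"].shift(lag)
df = df.dropna()

# External factors mentioned in the article: ad spend, inventory prices, discounts.
features = [f"sales_lag_{lag}" for lag in (1, 7, 14)] + [
    "ad_spend", "inventory_price", "discount_pct"]

train, test = df.iloc[:-30], df.iloc[-30:]  # hold out the last 30 days
model = GradientBoostingRegressor(random_state=0)
model.fit(train[features], train["sales"])
forecast = model.predict(test[features])
```

The point of the sketch is simply that the exogenous columns sit alongside the lagged sales history, which is what distinguishes this setup from a plain univariate time-series forecast.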

In addition, Graas now allows its customers to interact with data using natural language, which helps them gain deeper insights into its dynamics. Panditrao said that initial versions of this feature are currently in beta testing.

Snowflake to the Rescue

Panditrao said data collection is a critical aspect of their business, and the company relies on Snowflake to support that effort. “I would say 80% of our focus is on data collection and organization, and that’s where Snowflake has really helped us. We’re very dependent on its flexibility to keep all that data in sync and ready to analyze.”

“We moved to Snowflake about three to three-and-a-half years ago, being one of the first to do so. We recognized the need as our data started growing,” Panditrao said.

Snowflake has enabled Graas to quickly ingest data and generate analytics dashboards, giving customers real-time visibility into integrated business metrics across platforms in one place. This also delivers actionable insights much faster, allowing customers to make decisions in near real time. Graas leverages Snowflake-provided features like Data Share to share data directly with its customers in their own Snowflake instances.
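The article does not detail Graas' configuration, but Snowflake's data sharing is typically set up with a handful of SQL statements. A minimal sketch, run here through the snowflake-connector-python library, might look like the following; the database, schema, table, and account names are placeholders, not Graas' setup.

```python
# Minimal sketch of sharing a table with a customer's Snowflake account via a share.
# All object and account names below are placeholder assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_org-my_account", user="admin_user", password="***"
)
cur = conn.cursor()
for stmt in [
    "CREATE SHARE IF NOT EXISTS analytics_share",
    "GRANT USAGE ON DATABASE analytics_db TO SHARE analytics_share",
    "GRANT USAGE ON SCHEMA analytics_db.reporting TO SHARE analytics_share",
    "GRANT SELECT ON TABLE analytics_db.reporting.daily_metrics TO SHARE analytics_share",
    # The customer account then creates a database from this share on its side.
    "ALTER SHARE analytics_share ADD ACCOUNTS = customer_org.customer_account",
]:
    cur.execute(stmt)
conn.close()
```

Because a share exposes the provider's tables directly, the customer queries live data in its own account without any copies being moved or kept in sync.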

He added that Graas has expanded its customer base by more than 10x after migrating to Snowflake, demonstrating the platform's ability to efficiently support significant growth. By implementing an optimized data loading strategy on Snowflake, Graas was able to reduce data processing costs even as it scaled its customer base.

“The implementation of Snowpipe by Snowflake has enabled Graas to deliver near real-time analytics, with analytics dashboards rendered in less than a minute from the time a new connection is made, increasing customer insights and accelerating decision-making.”

Panditrao said Graas saw its data volume increase by more than 8x during sale days, and Snowflake handled the spikes with ease. “With Snowpipe’s streaming capabilities, we’re now getting real-time data, which reduces our previous 15-minute latency,” he said.

Why Snowflake? Snowpipe by Snowflake enables automated, near-real-time data ingestion without manual scheduling or compute resource management. It loads data in small batches so it’s available for querying in minutes, not hours. Snowpipe can ingest data from cloud storage services like Amazon S3, Google Cloud Storage, and Microsoft Azure Blob Storage.
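As a rough illustration of what that looks like in practice (not Graas' actual configuration; the bucket, table, stage, and integration names below are placeholder assumptions), a pipe that auto-ingests new files from cloud storage can be created like this:

```python
# Sketch of a Snowpipe that auto-ingests new JSON files landing in an S3 stage.
# Bucket, table, stage, and integration names are placeholders for illustration.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_org-my_account", user="admin_user", password="***"
)
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS raw_orders (payload VARIANT)")
cur.execute("""
    CREATE STAGE IF NOT EXISTS raw_orders_stage
        URL = 's3://example-bucket/orders/'
        STORAGE_INTEGRATION = s3_integration
""")
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw_orders_pipe
        AUTO_INGEST = TRUE
        AS COPY INTO raw_orders
           FROM @raw_orders_stage
           FILE_FORMAT = (TYPE = 'JSON')
""")
conn.close()
```

With AUTO_INGEST enabled, cloud storage event notifications trigger the pipe, so newly landed files become queryable within minutes without scheduled batch jobs or dedicated compute management.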

Vijayant Rai, Managing Director, Snowflake India, in an exclusive interview with OBJECTIVE said it’s pretty easy for customers to use generative AI with their data without worrying about data security in Snowflake. “When you’re running large language models or you need to build applications, the generative AI or LLM actually gets access to the data. The data doesn’t leave the platform,” he said.

Rai shared that many large companies that currently use Snowflake store their data on the platform, which allows them to perform real-time analytics. He explained that these companies often have a significant amount of legacy data, some of which can be 40 to 50 years old.

“They are also working with new data coming from a variety of channels, whether structured or unstructured, including online sources,” he said. “Snowflake provides a secure place where they can effectively manage and analyze all of that data.”

Better than Databricks?

Unlike Snowflake, Databricks is a unified analytics platform that excels at processing large data sets using Apache Spark. While Databricks offers streaming data capabilities, its primary focus is on data engineering, machine learning, and collaborative analytics. This means that while it can handle real-time data, setup and maintenance can be more complex compared to Snowpipe’s straightforward approach.

“The main difference is that Snowflake was designed at its core as a data warehouse solution, while Databricks was built as an ML pipeline solution. Increasingly, the data world is combining these offerings, so that Amazon, Snowflake, and Databricks are competing for an all-in-one solution,” one user wrote on Reddit.

However, Databricks’ current goal is to build a kind of “USB for AI,” where users don’t have to worry about where their data is stored. The company’s recent acquisition of Tabular is proof of that.

“We don’t understand all the intricacies of Iceberg, but the original creators of Apache Iceberg do. So now at Databricks we have people from both of those projects, Delta and Iceberg. We really want to double down on making sure that UniForm has full 100% compatibility and interoperability for both of them,” Databricks CEO Ali Ghodsi said at Data + AI Summit 2024.

Databricks also announced the general availability of Delta Lake UniForm, which supports Delta Lake and Iceberg formats. Meanwhile, Snowflake recently announced Polaris Catalog, a vendor-neutral, open catalog implementation for Apache Iceberg, at its Data Cloud Summit earlier this year.

Notable Databricks customers include Adobe, AT&T, Block (Square, CashApp, Tidal), Burberry, Rivian, and US Postal Service.

Ghodsi noted that every company’s data assets are being placed in multiple data stores, with data being siloed everywhere. This creates a lot of complexity and huge costs for companies, and ultimately locks them into proprietary system silos.

He explained that the idea is for users to own their data and store it in data lakes, with each provider connecting its data platform to that data, so users can decide which platform works best for them. This removes lock-in, reduces costs, and opens up more use cases by letting users run different engines for different purposes if they want.