Snowflake Reflects on 10 Years Passed, Ponders 10 Years Ahead
When Snowflake Computing was founded 10 years ago, the big data market looked much different than it does today. Momentum was building behind something called Hadoop, while cloud computing was viewed with suspicion. Despite these headwinds, the Snowflake founders stuck to their initial vision, and eventually played a major role in flipping the big data script, finding massive success along the way. But where will this success lead over the next decade?
The first two years of Snowflake Computing’s existence were spent in stealth mode. Co-founders Benoit Dageville, Thierry Cruanes, and Marcin Zukowski were all data warehousing veterans who previously worked at Oracle, IBM, and Actian, and so they had an insider’s view of the limits of data warehouses. By running a data warehouse in the cloud and separating compute from storage, they believed, they could overcome those limits.
Snowflake’s first entry into the public eye was modest enough. On October 21, 2014, the company simultaneously came out of stealth and announced a Series B round of financing led by Redpoint worth $26 million, which is not a large amount by today’s venture capital standards. Former Microsoft executive Bob Muglia, who was tapped to be its first CEO, unveiled its first commercial offering, dubbed the Elastic Data Warehouse, which ran exclusively on Amazon Web Services and became generally available in 2015.
The next several years were spent in heads-down mode, iterating on that first release of the Snowflake data warehouse running on AWS and banging the drum for its style of big data processing in the cloud. The company took home first place in the Strata + Hadoop World Startup Showcase in 2015, and raised a $45 million Series C round later that year (it would subsequently be expanded to $76 million). It partnered with BI vendors like Tableau, Looker, and MicroStrategy; railed against failed big data projects; and promoted its cost savings against other nascent cloud vendors.
A feisty Muglia took shots at competitors, including Hadoop. The startup’s CEO tore the open source software project to shreds in a 2017 Datanami interview, well before Hadoop’s implosion seemed imminent. “I can’t find a happy Hadoop customer. It’s sort of as simple as that,” he said. “It’s very clear to me, technologically, that it’s not the technology base the world will be built on going forward.”
Momentum started building for Snowflake in 2017 with a pair of new features. The first was Snowpipe, a continuous data ingestion capability; the second was the beginning of data sharing.
That year, Snowflake laid “the basic underlying building block that allows two separate Snowflake accounts to collaborate over shared data assets in a meaningful, secure, and well-governed way,” says Torsten Grabs, Snowflake’s director of product management for data lake, data pipelines, and data science.
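To make that building block concrete, here is a minimal, hypothetical sketch of the provider side of a share, issued through the snowflake-connector-python package. The account, database, schema, table, and share names are all placeholders, not anything from Snowflake’s own examples.

```python
# A hypothetical sketch of the provider side of Snowflake data sharing,
# using the snowflake-connector-python package. All names here
# (sales_share, sales_db, consumer_account, etc.) are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="provider_account",  # placeholder account identifier
    user="data_admin",
    password="...",
    role="ACCOUNTADMIN",         # a role allowed to create shares
)

cur = conn.cursor()
try:
    # Create a share and grant it read access to one database, schema, and table.
    cur.execute("CREATE SHARE sales_share")
    cur.execute("GRANT USAGE ON DATABASE sales_db TO SHARE sales_share")
    cur.execute("GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share")
    cur.execute("GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share")
    # Expose the share to a second Snowflake account.
    cur.execute("ALTER SHARE sales_share ADD ACCOUNTS = consumer_account")
finally:
    cur.close()
    conn.close()
```

On the other side, the consuming account would create its own read-only database from the share, querying the provider’s data in place rather than receiving a copy, which is what makes the collaboration Grabs describes both governed and low-friction.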
The company’s customer growth signaled confidence to potential investors, and in April 2017, the company completed a $100 million Series D round, bringing the fledgling firm’s total funding to $205 million.
Buoyed by solid growth, Snowflake expanded to Microsoft Azure, opening it up to the many organizations that run on that cloud infrastructure. By late 2018, the company surpassed the 1,000-customer mark, barely three years into its commercial existence. It also started getting more serious about how it partners through the launch of the Partner Connect program, “which allows users to spin up a Snowflake account through a partner’s experience and vice versa,” Grabs says. Snowflake closed two massive funding rounds in 2018, including a Series E worth $263 million in January and a $450 million round in October. At that point, it was valued at $3.5 billion.
2019 would mark a period of transition for the company, which was still called Snowflake Computing and still based in San Mateo, California. On the product front, it expanded into Google Cloud. It also launched the Snowflake Data Marketplace, as well as the cross-cloud building blocks that would eventually be known as Snowgrid. On the business front, Muglia was replaced as CEO by Frank Slootman, a former ServiceNow executive, in May 2019.
The COVID-19 pandemic was a gut-punch to many companies’ plans in early 2020, but Snowflake seemed to roll with the punches. The company launched several new products at its annual user conference, including the availability of Snowsight, a new GUI designed to let users get closer to the data. It also introduced Snowpark, which would give users the ability to work with Snowflake data in a language other than SQL. Lastly, in a nod to the importance of partners, it launched a new formal partnership program, dubbed Snowflake Partner Connect.
And who could forget Snowflake’s big debut on the New York Stock Exchange under the ticker symbol SNOW? That September IPO raised $3.4 billion (giving the company a $33 billion valuation) and was dubbed by the mainstream press as “the largest ever IPO for a software company” (even though it’s a cloud services provider). The company also doubled its customer base, from a count of 1,550 in July 2019 to 3,100 in July 2020.
2021 was another busy year for the newly public company, with the launch of Snowpark for Python, which is still in public preview. Snowflake also expanded support for unstructured data, which is more important for the types of AI use cases where Python would be used (whereas traditional SQL queries run on structured, tabular data). Snowflake also started allowing participants of the Snowflake Marketplace to monetize their data. It also launched data clean rooms, as well as the first two vertical clouds, for media and financial services. We can’t forget the brief time in early 2021 when Snowflake identified as “headquarterless,” before Slootman settled down in Bozeman, Montana.
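To illustrate what Snowpark for Python opens up, the hypothetical sketch below expresses a query as DataFrame operations that Snowpark translates into SQL and executes inside Snowflake’s engine. The connection parameters and the web_events table are placeholders, not a reference workload.

```python
# A hypothetical sketch of Snowpark for Python: DataFrame-style code that
# Snowpark translates into SQL and runs inside Snowflake's engine.
# The connection parameters and the web_events table are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "my_account",
    "user": "analyst",
    "password": "...",
    "warehouse": "ANALYTICS_WH",
    "database": "EVENTS_DB",
    "schema": "PUBLIC",
}

session = Session.builder.configs(connection_parameters).create()

# Filter and aggregate server-side; only the small result set comes back.
purchases_per_day = (
    session.table("web_events")
           .filter(col("event_type") == "purchase")
           .group_by("event_date")
           .count()
)
purchases_per_day.show()

session.close()
```

The point is less the query itself than where it runs: the Python code stays thin on the client, while the filtering and aggregation happen next to the data in Snowflake.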
Snowflake’s evolution continued in 2022, with several notable unveilings at its June user shindig, including Unistore, its first storage repository for transactions; enhancements to Snowpipe for streaming data pipelines; and a private preview of its new data application framework based on its $800-million acquisition of Streamlit in March. The company also announced support for Apache Iceberg, the open table format that’s gaining momentum; launched new vertical clouds for healthcare and life sciences, as well as for retail; and also debuted a new security offering.
Snowflake today has 6,000 customers and, with a market capitalization in excess of $55 billion, is considered one of the cloud giants, a title it would have to share with another post-Hadoop breakout big data star, Databricks. The company brought in more than $1.2 billion in revenue in fiscal 2022, yet it’s struggling to please Wall Street, which has pushed its stock price down to about $180 per share, less than half its November 2021 all-time high. And while there are grumblings from customers about unexpected costs, the sheer number of customers Snowflake has amassed shows it’s doing something right.
No longer content to provide customers with instant access to limitless SQL compute on massive data sets in a data lake environment, Snowflake is playing the big data long game and positioning itself for the next big thing. For Grabs, who joined the company in 2017, it’s less of a shift away from traditional data warehousing than a continuation of the company’s original path.
“To me, it does not feel as such of a dramatic shift of where we were initially, because already in the beginning, Benoit and Thierry were thinking of Snowflake as a data lake offering,” he says. “They were intentional about thinking about Hadoop as another big data processing platform that Snowflake from the very get-go should compete well with.”
Hadoop was the big competitor in those early days, and Snowflake spent just as much time doing Hadoop replacements as greenfield enterprise data warehouse installations, Grabs says. The fact that Snowflake has soared while Hadoop has fallen is definitely pertinent to this conversation. “We are the better Hadoop,” Grabs quips.
But where will the company go next? It has put stakes in the ground in several areas immediately adjacent to the world of advanced analytics, including AI, streaming data, data applications, converged OLAP/OLTP, data clean rooms, and vertical data clouds. Which of these will define Snowflake 10 years from now?
That answer is not clear, but one thing is: The company won’t be sitting still. “We have to innovate on a daily basis,” Grabs says. “We cannot sit on the laurels of what was done in the past.”
Grabs likes to remind customers that, every year, the window of time they have to process new data and make a decision gets smaller. As that window shrinks, the volume of data goes up and latency demands get tighter and tighter. These are the business challenges driving heavy investment in streaming data analytics and real-time databases, and Snowflake is tracking them too, looking for ways to keep customers on top of them.
“We’re getting very creative about different storage layouts and how we physically represent storage. We’re by no means a columnar store only,” Grabs says. “It’s also motivating our investment into materialized tables, dynamic tables that essentially update as new data arrives and also quite frankly for hybrid tables, with our Unistore workloads, which gives a different latency profile, response-time profile, than what a regular Snowflake table does.”
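As a rough illustration of the pattern Grabs describes, the hypothetical sketch below declares a dynamic table that Snowflake keeps refreshed as new rows arrive in a source table, issued here through the standard Python connector. The table names, warehouse, and one-minute target lag are assumptions, and the feature was still in preview at the time of this writing.

```python
# A rough, hypothetical sketch of a dynamic table: a declarative pipeline
# that Snowflake keeps refreshed as new rows land in a source table.
# Table names, the warehouse, and the one-minute TARGET_LAG are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="engineer", password="...",
    warehouse="PIPELINE_WH", database="SALES_DB", schema="PUBLIC",
)

conn.cursor().execute("""
    CREATE OR REPLACE DYNAMIC TABLE order_totals
        TARGET_LAG = '1 minute'
        WAREHOUSE = PIPELINE_WH
    AS
        SELECT customer_id, SUM(amount) AS total_spend
        FROM raw_orders
        GROUP BY customer_id
""")
conn.close()
```

The appeal of this style is that the pipeline is expressed as a query plus a freshness target, leaving the refresh scheduling and incremental maintenance to the platform rather than to hand-built ingestion jobs.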
Related Items:
Snowflake Pops in ‘Largest Ever’ Software IPO
Hadoop Has Failed Us, Tech Experts Say
Database Hotshots Build Warehouse From Scratch For The Cloud