Peter Hanssens

Snowflake Data Cloud Summit 2024: Builders keynote



Geeking out and going hands-on



Director of Product at Snowflake, Jeff Hollan, going through the Builders' Journey and inviting everyone to geek out at this keynote.

Director of Product Jeff Hollan kicked off the Builders keynote with a call to geek out and dive deep into the platform’s capabilities. The recurring theme throughout the summit has been the seamless integration of all sorts of data and AI capabilities into a single, unified platform. Today’s keynote continued on this thread, this time emphasising how simple and streamlined the developer experience is on Snowflake.


Yesterday’s announcement that Git integration has reached public preview was brought up again, highlighting how much easier it now is for developers to run declarative, automated deployments, collaborate on code, and streamline their workflows.
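As a rough sketch of what this enables (the repository, integration and script names below are hypothetical, and the exact syntax may differ slightly from the public preview docs), you can register a Git repository as a Snowflake object and execute versioned SQL straight from a branch:

```sql
-- Hypothetical names: assumes an existing API integration and repo URL.
CREATE OR REPLACE GIT REPOSITORY tasty_bytes_repo
  API_INTEGRATION = git_api_integration
  ORIGIN = 'https://github.com/example-org/tasty-bytes-pipelines.git';

-- Pull the latest commits from the remote.
ALTER GIT REPOSITORY tasty_bytes_repo FETCH;

-- Run a deployment script directly from the main branch (declarative deploy).
EXECUTE IMMEDIATE FROM @tasty_bytes_repo/branches/main/deploy.sql;
```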


There was also a subtle but clear push by Snowflake to address any perceived pricing issues. With the architecture built on serverless, "pay as you go" models, most workloads end up being quite cost-effective on the platform.


The return of Tasty Bytes (the ‘SeQueL’)


Snowflake Lead Developer Advocates Dash Desai and Felipe Hoffa dancing up a storm before the keynote began.

If you remember, Tasty Bytes, a fictitious food truck company with 450 food trucks, was launched at Snowflake Summit 2023 to help Snowflake demonstrate their key platform capabilities and how they solve customer problems (in a fun, edu-tainment sort of way!).


Well, at Summit 2024, Tasty Bytes returned for Season 2 (what Director of Product Amanda Kelly calls “the SeQueL”) to demonstrate three use cases: 1) Customer sentiment analysis; 2) Creating a domain-specific chatbot; and 3) Customer network analytics using graphs – all on Snowflake.


This segment was presented by Amanda Kelly, Lead Developer Advocates Dash Desai and Felipe Hoffa, Principal Software Engineer Polita Paulus, and Senior Data Science Specialist Marie Coolsaet. In Season 1, they demonstrated how Snowflake handles both batch and streaming workloads, integrating SQL and Python to solve underlying data platform issues. The contention now is that those data challenges have largely been solved, and it’s time to start “injecting AI magic into everything”.


Use case 1: Customer sentiment analysis


Before the demo on running sentiment analysis to find disgruntled customers across Tasty Bytes’s 450 food trucks got underway, there was a spotlight on how Vimeo used LLMs to analyse unstructured data (video, audio, transcripts) to improve their personalisation and predictive analytics based on existing customer behaviour.


Babak Bashiri, Director of Data Engineering at Vimeo, talked about 1) how security and privacy are a big challenge and priority for them; 2) the importance of mitigating the impact of LLM hallucinations; and 3) measuring ROI, since running LLMs is expensive. They also don’t have a large team of dedicated ML Engineers and Ops staff, so they needed a leaner, more efficient way to integrate these capabilities into their platform. They were able to do all of this on Snowflake, bringing the models to the data within a secure environment; at no point does the data leave Snowflake via external APIs. Next up, Vimeo plans to build an AI-driven sales assistant based on reviews, support logs, and more to help their sales team improve conversion and close more deals.


The Tasty Bytes demo then got underway. Snowflake Cortex offers LLMs as SQL functions, with features like Cortex Search and Streamlit integration. Felipe Hoffa demonstrated how thousands of unstructured text records, such as tweets and food truck reviews in various languages, could be translated into English, with their sentiment analysed and then visualised in a chart. The negative reviews were then grouped, and largely attributed to a single vendor. Felipe then used the LLM to generate an email targeted at this fictitious rogue food truck vendor, providing a breakdown of the types of feedback by category (e.g. food quality, hygiene, etc., ordered by frequency) without revealing any sensitive customer data at any point.
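To give a flavour of what this looks like in SQL (the table and column names below are hypothetical, as the demo’s actual schema wasn’t shown, and the model choice is just an example), Cortex exposes translation, sentiment scoring and text generation as ordinary functions you can call over a table of reviews:

```sql
-- Hypothetical table and column names; the demo's actual schema wasn't shown.
-- Step 1: translate each review to English and score its sentiment (-1 to 1).
CREATE OR REPLACE TABLE translated_reviews AS
SELECT
    truck_id,
    SNOWFLAKE.CORTEX.TRANSLATE(review_text, '', 'en') AS review_en,   -- '' = auto-detect source language
    SNOWFLAKE.CORTEX.SENTIMENT(review_text)           AS sentiment_score
FROM truck_reviews;

-- Step 2: draft a feedback email for the worst-offending vendor (truck 42 is a
-- stand-in), summarising its negative reviews by category.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large',
    'Summarise the following reviews by category (food quality, hygiene, etc.), '
    || 'ordered by frequency, and draft a polite email to the vendor. Reviews: '
    || LISTAGG(review_en, '\n') WITHIN GROUP (ORDER BY sentiment_score)
) AS vendor_email
FROM translated_reviews
WHERE truck_id = 42
  AND sentiment_score < 0;
```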


Use case 2: Organisation-specific document chatbot 


Next up, the team created a document chatbot to trawl through support chat logs, team PDFs and wikis, using a RAG architecture that combines semantic and keyword search via Cortex Search, with Streamlit providing the chatbot interface.


Using the food truck example again, the team were able to use the chatbot to answer questions such as the cost of privately renting a food truck, and the availability of vegetarian options across the food truck fleet. You can definitely see the huge benefits of a chatbot like this for onboarding new support staff and sharing information across teams. Other cool touches include the ability to switch between models (e.g. from Arctic to Mistral) depending on what’s needed at the time.
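A minimal sketch of the moving parts, assuming a pre-chunked table of support documents and hypothetical object names (the demo’s own code wasn’t shown), might look like this: a Cortex Search service handles the hybrid semantic-plus-keyword retrieval, and Cortex COMPLETE generates the grounded answer, with the model name a single string you can swap:

```sql
-- Hypothetical setup: index pre-chunked support content for hybrid search.
CREATE OR REPLACE CORTEX SEARCH SERVICE support_docs_search
  ON chunk_text                  -- the text column to search over
  ATTRIBUTES doc_source          -- extra column available for filtering
  WAREHOUSE = compute_wh
  TARGET_LAG = '1 hour'
  AS (
    SELECT chunk_text, doc_source
    FROM support_doc_chunks      -- chat logs, PDFs and wiki pages, pre-chunked
  );

-- Preview a retrieval-augmented answer entirely in SQL (the Streamlit app would
-- normally call the search service via its Python API instead).
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'snowflake-arctic',          -- switching to e.g. 'mistral-large' is a one-word change
    'Answer using only this context:\n'
    || SNOWFLAKE.CORTEX.SEARCH_PREVIEW(
         'mydb.public.support_docs_search',
         '{"query": "Do any trucks offer vegetarian options?", "limit": 5}'
       )
    || '\nQuestion: Do any trucks offer vegetarian options?'
) AS answer;
```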




Read our blog post on how Cloud Shuttle implemented a RAG-powered LLM to help conference attendees get a better experience from the DataEngBytes conference.


Use case 3: Customer network graph analysis



Snowflake Product Director Amanda Kelly and Head of Network Science and Behavioural Modeling at Cash App, Christian Figueroa, have a conversation on how Cash App used Relational AI on Snowflake Marketplace to conduct graph and network analysis on their customer database.

To demonstrate the third use case, Amanda Kelly invited Christian Figueroa, Head of Network Science and Behavioural Modelling at Cash App, to share their experience of using Relational AI on their Snowflake account to analyse customer behaviour. Old-school SQL table joins don’t scale once you want to get to second and third-order connections, and Python, Spark and other general purpose tools aren’t specific enough for this use case. 


We all know that it’s not just our friends who influence our purchase behaviours, but also our friends’ own networks. Cash App wanted to identify their most important customer nodes and how they were connected to each other. Using the Relational AI app on the Snowflake Marketplace to run knowledge graphs and graph analytics, they were able to do this 10x faster and more cheaply than with general-purpose tools.


Back to our Tasty Bytes example, a scandal has erupted! Customer complaints have spiked on social media because one of the food trucks has been serving stone cold burritos. Uh oh. The marketing team want to run a campaign to target not just affected customers, but their friends, and offer them a free (and piping hot) burrito as compensation. 


The team call up Lead Developer Advocate Dash to help with this task. The logic he applies is fairly straightforward. He finds the affected food truck and the incident time, within a 20-minute window, to home in on the affected customers. But he also wants to identify their coworkers and friends, and just because someone buys food immediately before or after you doesn’t mean you’re associated with them in any way. So the logic is refined: customer orders must occur within 20 minutes of each other and be repeated at least 5 times before two customers are deemed connected. Like magic, a graph representation appears before our eyes, enabling us to quickly identify ~3,000 nodes, 5,000 edges and 500 communities of affected customers. This graph then gets turned back into a table of affected customers, which the marketing team can use to run a more complete campaign offering compensation to the cold burrito victims.
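The demo itself used Relational AI’s graph tooling, but the pairing rule Dash described can be sketched in plain SQL (with hypothetical table and column names) to show how the edge list behind that graph might be derived:

```sql
-- Hypothetical schema: orders(customer_id, truck_id, order_ts).
-- Two customers are linked if they ordered from the same truck within
-- 20 minutes of each other on at least 5 separate occasions.
WITH co_orders AS (
    SELECT
        a.customer_id AS customer_a,
        b.customer_id AS customer_b,
        COUNT(*)      AS times_together
    FROM orders a
    JOIN orders b
      ON  a.truck_id = b.truck_id
      AND a.customer_id < b.customer_id            -- avoid self-pairs and duplicates
      AND ABS(DATEDIFF('minute', a.order_ts, b.order_ts)) <= 20
    GROUP BY a.customer_id, b.customer_id
)
SELECT customer_a, customer_b, times_together      -- edge list for the graph
FROM co_orders
WHERE times_together >= 5;
```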


It was really cool to see how the Relational AI app, via the Snowflake Marketplace, brings the power of graph analytics and knowledge graphs directly to your workloads on Snowflake, which (being a relational analytics database) has traditionally never had them. It means your data can stay within the platform while you still get more deterministic results by leveraging graphs, which is especially important for network analytics. In short, Snowflake has made knowledge graphs accessible to SQL developers.


Conclusion


Overall, it was a really fun Builders keynote on day 3 of the summit, with top-notch storytelling and use-case demonstrations in the Tasty Bytes series. It went a long way towards showcasing the full breadth and capability of Snowflake’s AI Data Cloud. With the ease and simplicity of Snowflake’s platform, and all the security and governance benefits of keeping your data within Snowflake and bringing your workloads to it, there will soon not be much reason to leave the platform.
