Technology Magazine sits down with Sunny Bedi, Chief Information and Data Officer at data cloud company Snowflake, on the company’s innovations in generative AI and how its solutions can drive optimisation for businesses.
Prior to joining Snowflake, Bedi held Corporate IT and Operations leadership roles at Nvidia from 2008-2020 as the company scaled from less than 2,000 employees to 15,000. Previously, he has held leadership roles at VMware, JDSU, Deloitte Consulting and Andersen Consulting. He holds a BS and MBA from the University of San Francisco, and Executive Education in Leadership and Technology from Stanford University.
Please introduce yourself and tell us about your role at Snowflake
In my role as Chief Information Officer, I focus on automating various business functions including finance, sales, human resources and marketing, equipping them with the appropriate applications and tools while delivering a really awesome user experience.
During the COVID-19 pandemic, we got into uncharted territory when it comes to how employees work. As we navigate the shift to a hybrid workforce, both in-office and remote, ensuring a positive employee experience is top of mind for us, especially as we continue to grow and onboard new employees.
Additionally, in my security role, I collaborate with product and engineering teams to ensure robust security measures and compliance with best practices and necessary certifications for our market entry.
And then finally, as Chief Data Officer, which occupies most of my time, I focus on being 'customer zero' for new products and features, and giving feedback to engineering and product teams.
What are some of the most exciting announcements coming from Snowflake around generative AI?
It all started last year when we made Snowpark generally available. Snowpark opened up the whole revolution of taking compute to where the data is.
In the past, applications and workloads were less data-intensive, and data scientists and developers often extracted data from databases for modelling or enrichment. That worked when data sets were small. Now they have become extensive and come from different sources, so they need to be correlated.
Snowflake's architecture now allows developers and data scientists to work directly with data within Snowflake, eliminating the need to extract it. This integration of computing, machine learning, and AI within the data environment positions data as a central focus. Snowflake ensures the governance and security of this platform, marking a significant change in data handling.
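The "bring compute to the data" principle described above can be illustrated with a minimal, self-contained sketch. This uses Python's built-in sqlite3 as a stand-in database — it is not Snowflake's Snowpark API — and contrasts extracting every row for client-side processing with pushing the aggregation into the engine so only the small result crosses the wire:

```python
import sqlite3

# Toy events table standing in for a large governed dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("AMER", 50.0), ("APAC", 200.0)],
)

# Extract-then-compute: pull every row out of the database, then
# aggregate in application code -- the pattern that breaks down at scale.
rows = conn.execute("SELECT region, amount FROM events").fetchall()
totals_client_side = {}
for region, amount in rows:
    totals_client_side[region] = totals_client_side.get(region, 0.0) + amount

# Compute-to-data: push the aggregation into the engine, so only the
# per-region summary leaves the database.
totals_in_engine = dict(
    conn.execute("SELECT region, SUM(amount) FROM events GROUP BY region")
)

assert totals_client_side == totals_in_engine
```

With four rows the difference is cosmetic; with billions, the second form is the only one that avoids moving the data at all.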
Specifically for AI and large language models (LLMs), we introduced Snowpark Container Services. This innovation enhances security and productivity by consolidating compute, machine learning and AI infrastructure into a container. Users benefit from this integration without the need to build their own infrastructure. We're seeing a tremendous amount of momentum on that and we're seeing a lot of customers starting to adopt it.
We have tremendous partnerships, including a recently announced one with Nvidia that integrates GPUs with Snowpark Container Services. This allows users to leverage GPU capabilities through our containers without needing to source or manage the hardware themselves.
Tell us about the direction of travel for Snowflake
If you think about areas like AI and machine learning, a key challenge has been utilising unstructured data effectively, as seen in industries like insurance where there are vast amounts of data in formats like PDF documents. Our aim is to extract trends and correlations from these documents.
We developed Document AI to address this. It enables large language models to interpret unstructured data, allowing data scientists and machine learning experts to derive insights from these documents. For example, consider our process of onboarding 8,000+ customers, each with lengthy contracts. Most contract terms are standard, but some variations are unique to each customer. Analysing these differences, especially in terms of service level agreements (SLAs) and their impact, is challenging due to the sheer volume of data.
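The core analysis step described above — surfacing where each customer's contract deviates from the standard terms — can be sketched in a few lines. Document AI itself uses an LLM over PDF documents; this stand-in uses Python's difflib on plain text, and the clauses and customer names are illustrative assumptions, not real data:

```python
import difflib

# Hypothetical standard SLA clause and two customer variants.
STANDARD_SLA = "Provider will respond to severity-1 incidents within 4 hours."

contracts = {
    "customer_a": "Provider will respond to severity-1 incidents within 4 hours.",
    "customer_b": "Provider will respond to severity-1 incidents within 1 hour.",
}

def sla_deviations(clause: str, standard: str = STANDARD_SLA) -> list[str]:
    """Return the words where a contract clause differs from the standard."""
    words = clause.split()
    matcher = difflib.SequenceMatcher(None, standard.split(), words)
    changed = []
    for op, _i1, _i2, j1, j2 in matcher.get_opcodes():
        if op != "equal":
            changed.extend(words[j1:j2])
    return changed

for name, clause in contracts.items():
    print(name, sla_deviations(clause))
```

A standard contract yields no deviations; customer_b's tighter one-hour commitment is flagged immediately — the kind of needle-in-a-haystack variation that is impractical to find manually across thousands of contracts.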
Document AI revolutionises this process by decoding information from PDF documents. Combined with conversational AI, it allows for quick, natural language queries about specific data points, like the impact of SLAs on customers. This technology has the potential to transform industries such as insurance, finance and manufacturing.
In terms of AI and machine learning, we're focused on automation and enhancing productivity. Imagine an AI assistant for every new employee, customised to aid their onboarding and learning process. This assistant would adapt over time, improving the onboarding experience for future employees with similar profiles.
Another focus is automating the QA and testing phase in software development. There are three phases of software development: design, development and then QA and testing. Typically, developers really enjoy the first two phases, but nobody enjoys QA and testing. Automating this with AI could accelerate product development and enhance overall productivity.
Overall, our direction includes harnessing unstructured data through technologies like LLMs and conversational AI, and driving internal AI adoption to boost productivity and software development efficiency. This includes innovative approaches to employee onboarding and software QA, transforming traditional methods into more efficient, AI-driven processes.
SnowPatrol: using machine learning to drive optimisation
During COVID-19, we onboarded over 5,000 employees, and in that time we were focused on providing them with an excellent user experience, including thoughtfully chosen apps and provisions. However, as the market declined last year, we started to evaluate the effectiveness of these provisions, questioning whether employees were fully utilising the resources provided.
We discovered that about one-third of the applications supplied were not used within a 100-day period, leading to considerable wastage and unutilised assets.
To address this, we developed a machine learning application, SnowPatrol, which combines data from sources such as Okta, Docusign CLM, ServiceNow and Workday to assess usage patterns both retrospectively and predictively. Based on its algorithm, it either revokes unused access overnight or automatically provides necessary access to new employees.
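The two jobs described — reclaiming seats idle past the 100-day window and forecasting how many licences are actually needed — can be sketched as below. The records, names and planned-hire figure are illustrative assumptions standing in for the Okta/Workday usage feeds, and this simple rule is a stand-in for SnowPatrol's actual model:

```python
from datetime import date, timedelta

TODAY = date(2024, 1, 1)
INACTIVITY_WINDOW = timedelta(days=100)  # the 100-day cutoff from the article

# Toy last-login records per licence holder (illustrative data).
last_login = {
    "alice": date(2023, 12, 20),
    "bob": date(2023, 8, 1),     # idle well past 100 days
    "carol": date(2023, 9, 15),  # idle just past 100 days
    "dave": date(2023, 11, 30),
}

def licences_to_revoke(usage: dict, today: date = TODAY) -> list[str]:
    """Flag seats with no activity inside the inactivity window."""
    return sorted(u for u, seen in usage.items() if today - seen > INACTIVITY_WINDOW)

def licences_needed(usage: dict, planned_hires: int, today: date = TODAY) -> int:
    """Forecast seats as currently active users plus expected new joiners."""
    active = len(usage) - len(licences_to_revoke(usage, today))
    return active + planned_hires

print(licences_to_revoke(last_login))            # seats to reclaim overnight
print(licences_needed(last_login, planned_hires=3))
```

Here two of four seats are reclaimable, and the forecast is driven by observed activity plus hiring plans rather than by total headcount — the same logic that produced the 2,800-versus-8,000 gap in the vendor negotiation described below.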
To show you an example, we had a very large software vendor that came in and was trying to construct a deal for renewal with our procurement team. Based on our employee headcount and where our growth trajectory is, they predicted that we would need around 8,000 licences for the next two years alone.
But our machine learning model predicted we need 2,800 licences. That was a big variation and there was friction in that discussion and, sure enough, it was escalated to me right away. They came in the next day and I just opened up my Snowflake app and showed them that our machine learning model was predicting we needed 2,800, the exact number that our procurement team had used. And they looked at it and the discussion was over in five minutes.
That's an example of how we are leveraging data to drive a lot of optimisation internally. We call this app SnowPatrol and we have tons of customers who are fascinated by the use case, and are trying to build that capability.