
Snowflake Native App or SaaS Token. The Anodot Breach Made the Choice Easy.


On Good Friday, April 4, 2026, more than a dozen companies had their data stolen in an attack deliberately timed to a bank holiday during the Easter/Passover weekend to slow detection and response. Snowflake did not fail them. Public reports indicate the intrusion path involved a third-party integration used by multiple customers. That is a data security story. It is also a data monitoring story. And for most enterprises running Snowflake today, it is a warning they cannot afford to ignore.

What Actually Happened in the Anodot Breach?

Attackers stole access tokens from a third-party tool called Anodot. They used those tokens to reach the Snowflake accounts of more than a dozen companies. Snowflake itself was not hacked. The tool connected to it was.


The headlines on early Monday, April 6, 2026, called it a Snowflake breach. That label was wrong. Here is what the facts show. Anodot is a third-party AI tool. It connects to Snowflake customer accounts to run data checks. Attackers broke into Anodot’s own systems. They stole the access tokens Anodot used to reach its customers’ Snowflake data. Then they used those tokens to move into more than a dozen enterprise Snowflake accounts.¹

Snowflake’s own systems were not touched. No flaw in Snowflake was used. The breach point was the link between Anodot and its customers, not the platform itself. The ShinyHunters group later claimed credit. They said they had stolen data from dozens of companies. They also said they had access to Anodot’s systems for some time before anyone noticed.¹

This is not the first time this group has run this play. In 2024, they ran a nearly identical attack. That campaign hit AT&T, Ticketmaster, Santander, and Neiman Marcus, among others.¹

The method was the same both times. Find a tool that connects to Snowflake. Break into that tool. Use its access to reach the data behind it.

Key Insight:

Snowflake’s platform held both times. The breach point both times was the external tool sitting between the attacker and the data. This is a pattern. Not a one-time event.

Is Your Data Monitor Tool a Security Risk?

Anodot was a data anomaly detection tool. Its job was to catch problems in data. It did not catch the problem in its own systems. That gap exists in most third-party data monitoring tools today.


The Anodot incident is worth a closer look, because the details matter. According to public reporting from BleepingComputer, TechCrunch, and TechRadar, Anodot is an AI-powered anomaly detection platform. Its job is to spot unusual patterns in data for its customers.

Here is what the reporting tells us. Attackers stole authentication tokens from inside Anodot’s systems. The group known as ShinyHunters then used those tokens to reach Snowflake environments owned by Anodot’s customers. On April 9, 2026, Snowflake confirmed that Anodot was the third-party platform involved. A small number of Snowflake customer accounts were affected. Snowflake’s own systems were not compromised.¹

For enterprise buyers, the lesson is about architecture, not blame. Any tool that uses third-party tokens to reach your Snowflake data creates an outside dependency. If that outside system is breached, the tokens become the path in. A true Snowflake Native App takes that path away. The work stays inside your Snowflake environment, where it belongs.

That is not a knock on Anodot as a product. It is an example of a core limit in how most external data tools work.

When a data tool lives outside Snowflake, it can only see what it is allowed to access via its API link. It cannot see what is happening to the link itself. It cannot see if that link is being used by someone who should not have it.

The monitoring stops at the edge of the tool. But the risk does not.

IBM’s 2025 Cost of a Data Breach Report found that third-party and supply chain breaches cost an average of $4.91 million per incident, making them the second most expensive breach vector in the study.²

Verizon’s 2025 Data Breach Investigations Report put the scale of the trend in plain terms. Third-party breaches now account for 30% of all breaches. That is double the rate from two years ago.³

In the United States, the average cost of a data breach hit a record $10.22 million in 2025. Supply chain breaches also take an average of 26 extra days to detect compared to other breach types. Every extra day is another day of open exposure.²

What Does “Native App” Really Mean?

Not every tool that calls itself a Snowflake Native App works the same way. Some are true native apps that run inside your Snowflake account. Others are connector agents that link your Snowflake account to their external application and process the data outside of your environment. The difference matters a great deal for security.


If you are shopping for a data monitoring tool for Snowflake right now, you will hear the phrase ‘Snowflake Native App’ from more than one vendor. It is important to understand that this phrase does not mean the same thing from vendor to vendor.

Snowflake’s own Native App Framework has a clear definition.

A true Snowflake Native App runs within the customer’s Snowflake account. The code runs on the customer’s own compute, with no external processing of your data, greatly reducing your risk.

What Do the Well-Known Data Observability Platforms Actually Do?

Most of the established data observability and monitoring platforms on the market today follow a similar architectural pattern. They deploy an agent or connector that authenticates to the customer’s Snowflake account, then pulls metadata, query logs, and, in many cases, sample data out of Snowflake and sends it to the vendor’s own cloud environment, typically AWS, GCP, or Azure, where the analysis, machine learning, and alerting happens. The results are then surfaced back to the customer through the vendor’s web application.

This pattern is common because it is easier for vendors to build and maintain. All customers share the same vendor-hosted analytics stack. But it creates a set of risks that customers often do not see until something goes wrong:

  • Your data now lives in two places. Whatever the vendor pulls, including metadata, query text, sampled rows, schema definitions, and lineage information, is copied outside your Snowflake account and stored in an environment you do not control. You are now trusting the vendor’s security posture as much as your own.

  • The attack surface is now the vendor’s entire customer base. A single compromise of the vendor’s environment can expose data from every customer connected to it. The economics strongly favor the attacker: one breach, many victims.

  • You inherit the vendor’s compliance scope. If the vendor’s cloud is the one processing your data, their SOC 2, ISO, HIPAA, and regional data residency controls are now part of your compliance story. A gap on their side becomes a gap on yours.

  • Network egress opens a door. Any architecture that requires data to leave your Snowflake account requires outbound network paths, firewall exceptions, and monitoring blind spots. Every one of those is a line item your security team has to review, approve, and maintain.
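The structural weakness behind these risks can be sketched in a few lines of illustrative Python. Everything here is hypothetical — no vendor’s real code, names, or token formats — the point is only that the connector pattern concentrates every customer’s credentials in one vendor-side store:

```python
# Illustrative sketch of the SaaS-connector pattern (all names and
# tokens are made up). The vendor's cloud holds one Snowflake access
# token per customer, so a single compromise of the vendor exposes
# every connected account.

from dataclasses import dataclass, field


@dataclass
class VendorCloud:
    """Stand-in for a vendor-hosted analytics stack."""
    # Token store: customer name -> Snowflake access token.
    # This is the asset an attacker goes after.
    tokens: dict = field(default_factory=dict)

    def onboard(self, customer: str, snowflake_token: str) -> None:
        # The customer's credential now lives OUTSIDE their account.
        self.tokens[customer] = snowflake_token

    def blast_radius(self) -> list:
        # If this environment is breached, every stored token is
        # exposed: one breach, many victims.
        return sorted(self.tokens)


vendor = VendorCloud()
vendor.onboard("acme_corp", "tok_acme_123")
vendor.onboard("globex", "tok_globex_456")

print(vendor.blast_radius())  # ['acme_corp', 'globex']
```

A true native app has no equivalent of `VendorCloud`: there is no vendor-side token store to breach in the first place.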

None of this makes these platforms bad products. Many of them deliver real value and have mature feature sets. But the architectural choice to run outside the customer’s environment is a choice with security, compliance, and cost consequences that every buyer should understand before signing.

Key Insight:

A true Snowflake Native App has no outside cloud involved. No access tokens sent to third-party servers. No data moving outside your account. DataRadar™ meets that standard.

The Hidden Cost of Bolting Tools Onto Snowflake

Every tool you connect to Snowflake from the outside adds cost, lag, and risk. Most of that cost never shows up on a vendor comparison sheet. It shows up later, in engineering hours, compute bills, and incident response.


There is a financial side to this that rarely gets discussed in vendor demos. Call it the integration tax.

When a data tool lives outside Snowflake, data has to move. It gets pulled from Snowflake, copied into the external tool’s systems, and processed there. Every one of those steps has a cost.

  • Engineering time spent building and fixing the pipelines that move data to the external tool.

  • Compute costs for running the same data through two systems instead of one.

  • Middleware fees for the connectors that exist only to link Snowflake to the outside tool.

  • Incident response costs when a connection fails, gets exploited, or goes out of sync.

IBM’s 2025 CDO research found that more than 43% of chief data officers name data quality as their top data concern. Over a quarter of companies lose more than $5 million a year due to poor data quality alone.⁷

That number climbs fast when you add compute waste, integration lag, and the risk of a breach through a connected tool.

Companies that move to a native Snowflake platform tend to find that the true cost of ownership drops once they stop paying integration tax. The native platform may cost more on paper at the start. When you add up what you stop paying, the math almost always flips.
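As a back-of-envelope illustration of why the math flips, here is a hedged sketch in Python. Every figure below is a made-up assumption for demonstration, not a quote from any vendor or from this article’s sources:

```python
# Hypothetical annual cost comparison. All numbers are illustrative
# assumptions, chosen only to show the shape of the calculation.

# External tool: the license is the visible line item...
external_tool_license = 120_000
# ...but the integration tax is paid on top of it.
pipeline_engineering = 60_000   # building and fixing data-movement pipelines
duplicate_compute = 45_000      # running the same data through two systems
middleware_connectors = 15_000  # connectors that only link Snowflake to the tool

external_total = (external_tool_license + pipeline_engineering
                  + duplicate_compute + middleware_connectors)

# Native app: higher sticker price, but no pipelines, no duplicate
# compute, no middleware — the work stays on your Snowflake compute.
native_total = 180_000

print(external_total, native_total)  # 240000 180000
```

Under these assumed figures, the tool that looked 50% more expensive on paper comes out cheaper once the integration tax is counted.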

What Running Natively Inside Snowflake Actually Means

Native Apps that are truly native operate end-to-end inside the customer’s Snowflake cloud. All execution happens on the customer’s compute, and no data is copied or transmitted elsewhere. The risks of external data handling are removed by design.

DataRadar™ is built on Snowflake’s Native App Framework. Here is what that means in practical terms.⁴

Your Data Never Leaves

DataRadar™ runs inside your Snowflake account. Every query runs on your compute. Results come from your live data. No data is stored outside the Snowflake cloud. Everything is a pass-through to your own Snowflake environment. Nothing is copied, cached, or held on a third-party server.

Results Are Based on Live Data

External tools depend on synced copies of your data. Those copies age. An alert that fires based on data from six hours ago is not the same as an alert that fires based on what is in your warehouse right now.

Native compute means DataRadar™ queries run against live Snowflake data. When a pipeline health alert fires, it reflects what is actually happening. In fast-moving industries, that difference is the gap between catching a problem early and finding out about it from a business user.

Your Audit Trail Is Complete

Every query, every alert, every action that DataRadar™ takes is logged inside Snowflake’s own audit system. There is no outside system holding a copy of your governance trail. Your compliance team can answer any access question from a single source of truth.
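As one concrete illustration of a single source of truth, Snowflake’s shared SNOWFLAKE.ACCOUNT_USAGE schema exposes audit views such as ACCESS_HISTORY (available on Enterprise Edition and above, per Snowflake’s documentation). A compliance team could answer “who touched this table?” with a query along these lines. The table name is a placeholder, and the view and column names should be verified against current Snowflake docs before relying on them:

```python
# Sketch: build an access-audit query against Snowflake's own
# ACCOUNT_USAGE.ACCESS_HISTORY view. The generated SQL is meant to be
# run inside your Snowflake account; nothing here leaves it.

def access_audit_sql(fully_qualified_table: str) -> str:
    """Return SQL listing who accessed the given table, newest first."""
    # ACCESS_HISTORY stores accessed objects as a JSON array, so we
    # flatten it and filter on the object name.
    return f"""
    SELECT query_start_time,
           user_name,
           obj.value:objectName::string AS object_name
    FROM snowflake.account_usage.access_history,
         LATERAL FLATTEN(input => direct_objects_accessed) obj
    WHERE obj.value:objectName::string = '{fully_qualified_table.upper()}'
    ORDER BY query_start_time DESC;
    """

# 'analytics.public.orders' is a placeholder table name.
sql = access_audit_sql("analytics.public.orders")
print("ACCESS_HISTORY" in sql.upper())  # True
```

The design point is that the audit trail and the data live in the same system, so one query answers the question with no second log store to reconcile.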

Resilience Comes Built In

Snowflake replicates data across regions by default. A native app built on top of Snowflake gets that resilience automatically. When Snowflake is running, DataRadar™ is running.

External tools have their own uptime record. When they go down, your monitoring goes down with them, even if Snowflake is running perfectly. The Anodot incident showed exactly this. When Anodot’s connectors failed, customers lost all visibility into their own Snowflake data across every region at the same time.¹

One Platform Instead of Two

Most data monitoring tools solve one problem: either data quality or compute cost. Buying one means you still have a gap. DataRadar™ covers both in a single native app inside Snowflake.


The data monitoring market split into two camps years ago. Data quality tools on one side. Cost control tools on the other. Neither camp was built to talk to the other.

Here is how the market breaks down today:

  • Data Quality Camp: A set of well-known data observability vendors are built to monitor data accuracy, freshness, schema drift, and pipeline anomalies. They are good at finding problems in your data. None of them were built to manage what that bad data costs you in Snowflake compute.

  • Cost Optimization Camp: A separate group of vendors is built to control Snowflake spending, right-size warehouses, and reduce credit waste. They are good at finding problems in your bill. None of them were built to tell you why your data is wrong.

  • Trying to Straddle Both: A smaller group of vendors attempts to cover both camps. Neither approach does it natively inside Snowflake. Both require external connections and access tokens to reach your data. That is exactly the attack surface the Anodot breach exploited.

  • No Longer in the Race: Select Star, once a data catalog and governance tool, was acquired by Snowflake in November 2025 and is being folded into Horizon Catalog. It is no longer an independent option.

Most enterprises ended up buying one tool from the quality camp and one from the cost camp. That means two products, two setups, two outside connections, two access tokens, and two invoices. And the two tools still do not share a common view of what is happening inside your Snowflake environment.

The DataRadar™ Observability Framework covers five dimensions inside one native app:

  1. Data Reliability: Is the data itself accurate, fresh, and complete? The widely reported 88% AI project failure rate is a data-quality crisis, not a machine-learning crisis; high-quality data enables confident, informed decisions and is critical for AI projects.
  2. Pipeline Health: Are jobs running? Are there failures or delays in your pipelines?
  3. Performance Optimization: Are queries running well? Are there expensive jobs that could run better?
  4. Usage Intelligence: Who is accessing what data, and how often?
  5. Cost Visibility: Where are Snowflake credits going? Which teams and pipelines drive the most spend?

These five areas are deeply linked. A pipeline failure that causes a data reliability problem will almost always cause a compute spike at the same time. Seeing all five in one place, against live data, means you can connect the dots before the damage reaches a business user or a budget review.

That is what a unified native platform makes possible. One app. One setup. One security surface. One invoice.

Four Questions Every CDO Should Ask Right Now

If you run Snowflake and you have outside tools connected to it, these four questions are worth asking before next week:

  1. Does this tool run inside my Snowflake account, or does it connect to an outside platform?
  2. What access tokens or keys does this tool hold, and where are they stored?
  3. If this tool’s systems were breached, would an attacker be able to reach my Snowflake data using those keys?
  4. Am I buying two tools to cover what one native platform could handle?

These are not abstract questions. They are the exact questions that data teams at more than a dozen companies wish they had asked before Good Friday 2026.

Key Insight:

No security architecture can eliminate all risk. What a true Native App does is reduce exposure to third-party token compromise. DataRadar™ runs inside Snowflake. There is no external cloud involved.

Summary

The April 4th, 2026, Snowflake ecosystem breach was not a platform failure. It was a failure of the integration layer that sits between Snowflake and the tools bolted on top of it. Every outside tool connected to your Snowflake account is a risk point. One effective way to mitigate this class of risk is to run natively, inside Snowflake, with no outside systems involved.
That is why we built DataRadar™.

Confie built DataRadar™ because distributing insurance at a national scale means your data must be trusted, protected, and AI-ready. We knew the problem. We built the fix.

Ready to dive deeper?

Download our comprehensive guide, Data Observability in 2026: The Enterprise Playbook for Trusted AI-Ready Data, for a complete framework for building data foundations that enable AI success.

Frequently Asked Questions

  1. How did the Anodot breach reach Snowflake customer data? The attack worked because Anodot held access tokens that gave it a direct link to customer Snowflake data. Once attackers had those tokens, they had the same access Anodot had. Snowflake confirmed the breach was tied to third-party integration activity, not to any flaw in Snowflake itself.¹
  2. What is a Snowflake Native App, and how is it different from a SaaS connector? Snowflake defines native apps through its Native App Framework. True native apps use the customer’s own compute. They do not send data or queries to outside servers. SaaS connectors work the other way: they pull data out of Snowflake and process it in the vendor’s own cloud.⁴
  3. How does DataRadar™ reduce third-party integration security risk? Because DataRadar™ is a true Native App, every query runs on the customer’s compute against their live data. This can be verified directly in Snowflake’s Marketplace and in Snowflake’s Native App Framework documentation.⁴
  4. Why do companies still use two separate data observability tools for Snowflake? The data monitoring market split into two camps over the past few years. Quality-focused tools handle freshness, volume, and schema checks. Cost tools handle query performance and credit management. DataRadar™ covers all five key dimensions natively: data reliability, pipeline health, performance optimization, usage intelligence, and cost visibility. One platform, inside Snowflake, with no outside connections required.
  5. What is the real cost of connecting outside tools to Snowflake? IBM’s 2025 Cost of a Data Breach Report found that third-party and supply chain breaches cost an average of $4.91 million per incident. In the United States, the average breach cost hit a record $10.22 million in 2025. Those figures do not include the ongoing integration tax of maintaining multiple outside tool connections day to day.²
  6. Has Snowflake been targeted before through third-party tools? The 2024 campaign and the 2026 Anodot breach both exploited the connection layer between Snowflake and outside tools. In both cases, Snowflake’s own platform was not compromised. The risk in both cases came from what was connected to Snowflake, not from Snowflake itself. This pattern makes the case for native deployment clearer with every repeat incident.¹

Sources

¹Abrams, L. (2026, April 9). Snowflake customers hit in data theft attacks after SaaS integrator breach. BleepingComputer. https://www.bleepingcomputer.com/news/security/snowflake-customers-hit-in-data-theft-attacks-after-saas-integrator-breach

²IBM Security & Ponemon Institute. (2025). Cost of a Data Breach Report 2025. IBM.

³Verizon. (2025). Data Breach Investigations Report 2025. Verizon. https://www.verizon.com/business/resources/reports/dbir

⁴Snowflake. (2024). About the Snowflake Native App Framework. Snowflake Documentation. https://docs.snowflake.com/en/developer-guide/native-apps/native-apps-about

⁵Anomalo. (2025, April 1). Announcing Snowflake Ventures’ strategic investment in Anomalo. Anomalo Blog. https://www.anomalo.com/blog/announcing-snowflake-ventures-strategic-investment-in-anomalo/

⁶Krantz, T., & Jonker, A. (2026, January 23). The true cost of poor data quality. IBM Think. https://www.ibm.com/think/insights/cost-of-poor-data-quality

⁷Integrate.io. (2026, January 12). Data quality improvement stats from ETL: 50+ key facts every data leader should know in 2026. Integrate.io Blog. https://www.integrate.io/blog/data-quality-improvement-stats-from-etl

Ken Kasee

Author

Ken Kasee is a 3x Telly Award-winning content marketer and digital strategist with 25+ years turning complex technology into clear, engaging stories. At DataRadar™, he oversees educational content and research that helps data and analytics leaders understand the full scope of modern data observability, including pipeline health, data integrity, cost visibility, and AI readiness. Ken has built marketing functions from the ground up across healthcare, life sciences, insurance, and financial services, where data quality is a regulatory and operational necessity. Previously, he led US Marketing Operations at IQVIA using an AI-driven approach and helped scale InsurTech Ensurem 20x before its acquisition by HealthPlanOne. Ken earned a bachelor’s degree in economics with minors in art history and creative writing, as well as an MBA in Digital Marketing from The University of Illinois.