CoCo Skills Pulse
Tracking skills shipped in the public Cortex Code CLI, installed via `curl -LsS https://ai.snowflake.com/static/cc-scripts/install.sh`.
Latest Changes
[v1.0.73] 2026-04-28 ADDED: +5 REMOVED: -1 MODIFIED: ~4

- **ai-data-share** (added): Make a listing or data share AI-Ready. Use when: creating semantic views for listings, creating cortex agents for data shares, making data AI-ready. Triggers: AI-ready listing, share agent, data share semantic view, marketplace AI.
- **alert** (added): Snowflake alert management - create, alter, suspend, resume alerts. Use when: user wants to create a new alert, modify an existing alert, set up monitoring, suspend or resume alerts. Triggers: create alert, new alert, add alert, alter alert, modify alert, change alert, suspend alert, resume alert, monitor with alert, set up alert, alert condition.
  - alert-create-alter: Create and alter Snowflake alerts with condition queries. Use when: us…
- **attach-ai-products-to-share** (added): Attach AI products to Snowflake shares. Use when: adding semantic views, cortex agents, or cortex search services to a share. Triggers: share semantic view, share agent, share cortex search. Invoke this skill to add AI products to a share as a step of sharing AI products or creating a listing to share an AI product.
- **event-table** (added): Manage Snowflake event tables and telemetry configuration. Use when: viewing/configuring event tables, checking telemetry setup, getting/setting telemetry levels, querying event table data, understanding telemetry formats. Triggers: event table, get event table, show event table, current event table, event table setup, event table configuration, telemetry, telemetry setup, telemetry configuration, telemetry levels, get telemetry, show telemetry, check telemetry, log level, trace level, metric level, logging setup, tracing setup, observability setup, event table format, telemetry format, log format, trace format, metric format.
  - event-table-get-setup: Get/show current Snowflake event table configuration and telemetry lev…
  - event-table-modify-setup: Set up or verify Snowflake event table configuration and telemetry lev…
  - event-table-telemetry-format: Parse and explain telemetry formats (logs, metrics, traces, events) fr…
- **notification** (added): Router for Snowflake notification skills. Routes to integration creation/management, content formatting, or sending. Triggers: notification, notification integration, email notification, webhook, slack, teams, pagerduty, send notification, notification content.
  - notification-content: Generate notification content for SYSTEM$SEND_SNOWFLAKE_NOTIFICATION f…
  - notification-integration: Create and manage Snowflake notification integrations for email and we…
  - notification-send: Send notifications. Takes content and an integration name, wraps the c…
- **snowpark-connect** (removed)
- **cortex-ai-functions** (description changed): Use Snowflake Cortex AI Functions for text/image analytics. Use when: classifying content, extracting entities, sentiment analysis, summarizing text, translating, filtering, embedding, parsing documents, redacting PII, aggregating data, document intelligence workflows, content insight workflows, fine-tuning arctic-extract for domain-specific extraction. Triggers: AI_CLASSIFY, AI_COMPLETE, AI_EXTRACT, AI_FILTER, AI_SENTIMENT, AI_SUMMARIZE, AI_TRANSLATE, AI_EMBED, AI_AGG, AI_REDACT, AI_PARSE_DOCUMENT, classify text, data, documents, extract from text, extract text from document, extract text from PDF, extract text from image, extracting, invoices, sentiment, summarize, translate, OCR, which AI function, cortex function, process documents, label content, analyze text, OCR, read PDF, read document, get text from PDF, get text from document, pull text from file, extract data from files, extract from my files, process my files, my files, my documents, read my documents, get data from document, file extraction, document processing, file processing, get information from documents, analyze files, parse files, data from PDF, invoice processing, contract extraction, receipt extraction, form extraction, extract fields, document data, file data, stage files, files on stage, PDF extraction, image extraction, document OCR, scan documents, digitize documents, fine-tune, fine-tuning, custom model, train arctic-extract, improve extraction accuracy, domain-specific extraction, FINETUNE, better extraction results.
- **data-products** (description changed): Create organizational listings to share data products via Internal Marketplace. Triggers: create data product, share to internal marketplace, publish to internal marketplace, share to other accounts, share with other accounts, organization listing, org listing, share across accounts, internal marketplace, cross-account sharing, share my agent to other accounts. WHEN TO USE THIS SKILL: - User wants to share with OTHER ACCOUNTS → Use this skill - User mentions "internal marketplace" or "data product" (even for same account) → Use this skill WHEN TO USE RBAC INSTEAD (not this skill): - User wants to share with roles in SAME account only - User does NOT mention "internal marketplace" or "data product" or "listing" - Example: "share this table with ANALYST role" → Use GRANT, not this skill WHEN NOT TO USE THIS SKILL: - User wants to migrate an EXISTING direct share to an org listing → Use the direct-share-to-org-listing-migration skill instead - User wants to migrate an EXISTING personalized listing to an org listing → Use the personalized-listing-to-org-listing-migration skill instead - User wants to migrate an EXISTING private data exchange (PDX) listing to an org listing → Use the pdx-listing-to-org-listing-migration skill instead KEY: If user says "share via internal marketplace" or "as a data product" even for same-account roles, use this skill. Otherwise, same-account = regular RBAC grants.
- **data-quality** (description changed): Schema-level data quality monitoring, table comparison, dataset popularity analysis, and ad-hoc column quality assessment using Snowflake Data Metric Functions (DMFs) and Access History, and LLM prompt quality scoring and rewriting. Use when user asks about: data quality, schema health, DMF results, quality score, trust my data, quality regression, quality trends, SLA alerting, data metric functions, failing metrics, quality issues, compare tables, data diff, validate migration, table comparison, popular tables, most used tables, unused data, dataset usage, table popularity, listing quality, listing health, listing freshness, provider data quality, consumer data quality, one-time quality check, quick quality scan, check data quality without DMFs, recommend monitors, what should I monitor, DQ coverage gaps, unmonitored tables, DMF coverage report, monitoring health, noisy monitors, silent monitors, misconfigured monitors, DMF cost optimization, investigate DQ incident, why did freshness drop, why did row count drop, correlate violation, multi-dimensional root cause, circuit breaker, pause pipeline on violation, halt bad data propagation, custom DMF, format validation DMF, email format check, value range check, DMF expectations, set threshold, tune DMF threshold, DMF expectation management, attach DMFs, set up DMFs for first time, DMF setup wizard, accepted values, ACCEPTED_VALUES, validate column values, allowed values check, value in set, categorical validation, referential integrity, REFERENTIAL_INTEGRITY_COUNT, orphaned rows, foreign key validation, FK check, cross-table integrity, prompt quality, score my prompt, prompt score, improve prompt, rewrite prompt, prompt linter, prompt engineering, prompt regression, compare prompts, prompt scoring dimensions.
- **native-app-provider** (description changed): Use for **ALL** Snowflake Native App Framework tasks: creating app packages, writing manifest files, writing setup scripts, sharing data, testing, versioning, publishing, configuring telemetry and health status reporting, monitoring app health and lifecycle events, setting up event sharing, and debugging apps. Also use for **ALL** SPCS (Snowpark Container Services) work within native apps: adding containers, upgrading container services, building and pushing images, writing service specs, configuring compute pools, and managing service lifecycle. This is the **REQUIRED** entry point for any native app work. DO NOT attempt native app development manually - invoke this skill first. Triggers: native app, app package, application package, manifest.yml, setup script, CREATE APPLICATION, Snowflake marketplace, listing, native app framework, build native app, walk me through, guide me, get started, add version, register version, add patch, release channel, release directive, publish app, publish version, upgrade consumers, telemetry, health status, SYSTEM$REPORT_HEALTH_STATUS, log_level, trace_level, event definitions, event sharing, APPLICATION_STATE, lifecycle events, monitor app, debug app, observability, add streamlit, streamlit dashboard, add dashboard, streamlit UI, add UI to native app, native app streamlit, streamlit frontend, get_active_session, default_streamlit, SPCS native app, container native app, native app containers, native app SPCS, add containers, container_services, grant_callback, specification file, version_initializer.
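The new alert skill wraps Snowflake's standard alert DDL. As a rough sketch of the lifecycle it manages (the warehouse, table, and condition query below are illustrative placeholders, not taken from the tracker):

```sql
-- Placeholder names: my_wh, gauge, gauge_alert_log.
CREATE OR REPLACE ALERT temperature_alert
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
  IF (EXISTS (SELECT 1 FROM gauge WHERE reading > 200))
  THEN INSERT INTO gauge_alert_log VALUES (CURRENT_TIMESTAMP());

-- Alerts are created suspended; RESUME activates the schedule.
ALTER ALERT temperature_alert RESUME;
ALTER ALERT temperature_alert SUSPEND;
```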
[v1.0.71] 2026-04-24 ADDED: +1 MODIFIED: ~3

- **access-troubleshooter** (added): Debug authorization and permission issues in Snowflake. Use when: access denied, insufficient privileges, permission errors, role issues, missing grants, privilege analysis, least-privilege role creation, find authorizing roles. Triggers: access denied, insufficient privileges, permission error, authorization failed, can't access, missing permission, grant needed, role recommendation, SQL access control error, does not exist or not authorized, EXPLAIN_PRIVILEGES, SYSTEM$ANALYZE_ROLE_ACCESS, SYSTEM$SUGGEST_ROLE_GRANTS.
- **data-quality** (description changed): Schema-level data quality monitoring, table comparison, dataset popularity analysis, and ad-hoc column quality assessment using Snowflake Data Metric Functions (DMFs) and Access History. Use when user asks about: data quality, schema health, DMF results, quality score, trust my data, quality regression, quality trends, SLA alerting, data metric functions, failing metrics, quality issues, compare tables, data diff, validate migration, table comparison, popular tables, most used tables, unused data, dataset usage, table popularity, listing quality, listing health, listing freshness, provider data quality, consumer data quality, one-time quality check, quick quality scan, check data quality without DMFs, recommend monitors, what should I monitor, DQ coverage gaps, unmonitored tables, DMF coverage report, monitoring health, noisy monitors, silent monitors, misconfigured monitors, DMF cost optimization, investigate DQ incident, why did freshness drop, why did row count drop, correlate violation, multi-dimensional root cause, circuit breaker, pause pipeline on violation, halt bad data propagation, custom DMF, format validation DMF, email format check, value range check, referential integrity DMF, DMF expectations, set threshold, tune DMF threshold, DMF expectation management, attach DMFs, set up DMFs for first time, DMF setup wizard, accepted values, ACCEPTED_VALUES, validate column values, allowed values check, value in set, categorical validation, referential integrity, REFERENTIAL_INTEGRITY_COUNT, orphaned rows, foreign key validation, FK check, cross-table integrity.
- **dbt-projects-on-snowflake** (description changed): ONLY for dbt projects deployed INTO Snowflake as native objects via the `snow dbt` CLI, OR for authoring dbt models using Snowflake-native features (e.g., semantic_view materialization via the dbt_semantic_view package). NOT for normal dbt development. Invoke ONLY when the user explicitly mentions: `snow dbt` commands (deploy, execute, list), `EXECUTE DBT PROJECT` SQL, a deployed dbt project object (e.g., DB.SCHEMA.MY_PROJECT), `ALTER/DROP/DESCRIBE/SHOW DBT PROJECT` SQL, scheduling a deployed dbt project with CREATE TASK, OR generating documentation/catalog/lineage for a deployed project, OR authoring Snowflake-specific dbt materializations (semantic_view, dbt_semantic_view), OR adding a semantic view to an existing dbt project. Do NOT invoke for standard dbt workflows: dbt run, dbt build, dbt test, dbt seed, dbt init, dbt compile, dbt debug, dbt snapshot, dbt deps, dbt clean, dbt retry, dbt ls, profiles.yml, dbt_project.yml, model editing, source freshness, Jinja/macro development, CI/CD pipelines, or any dbt command run from a terminal. The key distinction: this skill is about dbt-as-a-Snowflake-object (snow dbt deploy), not dbt-as-a-CLI-tool (dbt run). Triggers: snow dbt, snow dbt deploy, snow dbt execute, snow dbt list, EXECUTE DBT PROJECT, deployed dbt project, ALTER DBT PROJECT, DROP DBT PROJECT, DESCRIBE DBT PROJECT, SHOW DBT PROJECTS, VERSION$, external-access-integration, dbt project object, migrate, prepare for snowflake, docs generate deployed, documentation deployed project, data catalog deployed, lineage deployed project, generate documentation for deployed, semantic_view materialization, dbt_semantic_view, semantic view in dbt project, add semantic view to dbt, dbt project semantic view, analytical access dbt project.
- **semantic-view** (description changed): **[REQUIRED]** Use for ALL requests that mention: create, build, debug, fix, troubleshoot, optimize, improve, or analyze a semantic view — AND for requests about VQR suggestions, verified queries, verified query representations, seeding/generating queries, suggesting metrics, suggesting filters, recommending metrics/filters/facts, or enriching a semantic view. This is the REQUIRED entry point - even if the request seems simple. DO NOT attempt to create, debug, or generate suggestions for semantic views manually - always invoke this skill first. This skill guides users through creation, setup, auditing, VQR suggestion generation, filter & metric suggestions, and SQL generation debugging workflows for semantic views with Cortex Analyst.
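Several of the data-quality triggers above (attach DMFs, set threshold, ACCEPTED_VALUES, REFERENTIAL_INTEGRITY_COUNT) map onto Snowflake's data metric function DDL. A minimal sketch, with the table and column names invented for illustration:

```sql
-- A schedule must be set before metrics start running.
ALTER TABLE orders SET DATA_METRIC_SCHEDULE = '60 MINUTE';
ALTER TABLE orders
  ADD DATA METRIC FUNCTION SNOWFLAKE.CORE.NULL_COUNT ON (customer_email);

-- Measurements land in the account-local monitoring view.
SELECT measurement_time, metric_name, value
FROM SNOWFLAKE.LOCAL.DATA_QUALITY_MONITORING_RESULTS
WHERE table_name = 'ORDERS';
```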
[v1.0.66] 2026-04-22 ADDED: +2 MODIFIED: ~6

- **billing** (added): Org-level spend in currency via SNOWFLAKE.ORGANIZATION_USAGE. Covers USAGE_IN_CURRENCY_DAILY, REMAINING_BALANCE_DAILY, CONTRACT_ITEMS, RATE_SHEET_DAILY. Invoices, charges, contracts, balance, reconciliation, rate comparison, spend by account. Not for single-account credit analytics (cost-intelligence) or warehouse config (warehouse).
- **ctx-workflow** (added): Multi-phase team orchestration for feature implementation. Supports two entry paths: explicit user request for teammates, or autonomous complexity-based assessment after entering plan mode. HIGHEST PRIORITY — must be loaded FIRST (before any domain skills) when user asks to use teammates, teams, or parallel agents. Triggers: use teammates, use a team, work in parallel with agents, delegate to teammates, swarm this, swarm, team up on this, team up, orchestrate with subagents, subagent-orchestrated, gated workflow, multi-phase workflow, coordinate agents, spawn workers, worker/verifier, parallel agents, run as a team, investigate with agents, research with agents, explore with agents.
- **cost-intelligence** (description changed): **[REQUIRED]** Use for ALL Snowflake account-level cost and billing questions: spending, credits, costs, warehouse costs, compute costs, serverless credits; analytics via SNOWFLAKE.ACCOUNT_USAGE. Credit usage by warehouse, user, service. Budgets, spending limits, custom budgets. Resource monitors, credit quotas, suspend triggers. Anomaly detection, Cortex AI costs, storage costs, budgets, resource monitors, metering, consumption, billing, chargeback, storage, serverless, containers, data transfer, top user spending, top spenders, who is spending, expensive queries, spend, query costs, budget actions, budget notifications, budget alerts, spending limit, create budget, set budget, drop budget, delete budget, remove budget, where is my money going, cost breakdown, credits by service, overall spending, cost increase, why did costs go up, unusual spending, cost spikes, cost anomaly, anomaly notification, anomaly email, cost spike alert, cortex cost, cortex credits, cortex spend, cortex AI cost, cortex AI function cost, cortex AI function costs, cortex AI function credits, AI function cost, AI function costs, analyst cost, analyst credits, LLM cost, cortex search cost, cortex search credits, cortex agents cost, cortex agents credits, cortex code cost, cortex code CLI cost, cortex code Snowsight cost, snowflake intelligence cost, snowflake intelligence credits, fine-tuning cost, model training cost, provisioned throughput cost, PTU cost, team costs, department spending, cost center, chargeback, showback, SPCS cost, compute pool credits, container services cost, data transfer cost, cross-region cost, cross-cloud cost, egress cost, budget status, budget spend, over budget, at risk budget, grouping. Not for org-wide currency spend or multi-account billing (billing/organization-management) or warehouse DDL (warehouse).
- **data-products** (description changed): Create organizational listings to share data products via Internal Marketplace. Triggers: create data product, share to internal marketplace, publish to internal marketplace, share to other accounts, share with other accounts, organization listing, org listing, share across accounts, internal marketplace, cross-account sharing, share my agent to other accounts. WHEN TO USE THIS SKILL: - User wants to share with OTHER ACCOUNTS → Use this skill - User mentions "internal marketplace" or "data product" (even for same account) → Use this skill WHEN TO USE RBAC INSTEAD (not this skill): - User wants to share with roles in SAME account only - User does NOT mention "internal marketplace" or "data product" or "listing" - Example: "share this table with ANALYST role" → Use GRANT, not this skill WHEN NOT TO USE THIS SKILL: - User wants to migrate an EXISTING direct share to an org listing → Use the direct-share-to-org-listing-migration skill instead - User wants to migrate an EXISTING personalized listing to an org listing → Use the personalized-listing-to-org-listing-migration skill instead KEY: If user says "share via internal marketplace" or "as a data product" even for same-account roles, use this skill. Otherwise, same-account = regular RBAC grants.
- **dynamic-tables** (description changed): **[REQUIRED]** Use for **ALL** Snowflake Dynamic Table operations: creating, optimizing, monitoring, troubleshooting, and pipeline diagnostics. This is the required entry point for any dynamic table related tasks (DT is an acronym for dynamic table). Triggers: dynamic table, data pipeline, incremental pipeline, DT pipeline, incremental refresh, target lag, UPSTREAM_FAILED, refresh failing, full refresh instead of incremental, DT health, create DT, debug DT, pipeline timeline, Gantt chart, trace pipeline, critical path, why was DT skipped.
- **iceberg** (description changed): Use for **ALL** Iceberg table requests in Snowflake. This is the **REQUIRED** entry point for catalog integrations, catalog-linked databases, external volumes, auto-refresh issues, and Snowflake Intelligence. DO NOT work with Iceberg manually - invoke this skill first. Triggers: iceberg, iceberg table, apache iceberg, catalog integration, REST catalog, ICEBERG_REST, glue, AWS glue, glue IRC, lake formation, unity catalog, databricks, polaris, opencatalog, open catalog, onelake, OneLake, microsoft fabric, fabric, fabric lakehouse, onelake REST, SAP, SAP BDC, SAP Business Data Cloud, CLD, catalog-linked database, linked catalog, auto-discover tables, sync tables, LINKED_CATALOG, external volume, storage access, S3, Azure blob, GCS, IAM role, trust policy, Access Denied, 403 error, ALLOW_WRITES, storage permissions, auto-refresh, autorefresh, stale data, refresh stuck, delta direct, snowflake intelligence, text-to-SQL iceberg, query iceberg natural language.
- **snowflake-postgres** (description changed): **[REQUIRED]** Use for **ALL** requests involving Snowflake Postgres, and for general help working with any PostgreSQL database through standard PG tooling (psql, ~/.pg_service.conf, ~/.pgpass, pg_doctor diagnostics). Triggers: 'postgres', 'postgresql', 'pg', 'psql', 'create postgres instance', 'show postgres instances', 'suspend postgres', 'resume postgres', 'reset postgres credentials', 'rotate postgres password', 'reset access', 'import postgres connection', 'postgres network policy', 'my IP', 'postgres health check', 'diagnose', 'insights', 'pg_doctor', 'pg_lake', 'postgres iceberg', 'pg iceberg', 'postgres slow queries', 'cache hit', 'bloat', 'vacuum', 'dead rows', 'locks', 'postgres locks', 'blocking queries', 'blocked', 'postgres disk usage', 'what's running', 'active postgres queries', 'postgres connection count', 'neon', 'supabase', 'rds postgres', 'aurora postgres', 'azure postgres', 'crunchy bridge', 'external postgres', 'my postgres'. Do NOT use for generic Iceberg / catalog integration / storage integration / data lake requests — those are owned by the `iceberg` skill. Only handle Iceberg when it is scoped to pg_lake (Postgres-resident Iceberg tables).
- **warehouse** (description changed): **[REQUIRED]** Use for **ALL** Snowflake warehouse questions (except interactive warehouses, which use the interactive-warehouse skill): warehouse configuration, DDL, Gen2 creation/conversion, performance tuning, DML optimization, ETL workloads, sizing, credit-per-hour rates from the Credit Consumption Table, resume behavior, region availability, Snowpark-optimized limitations. Not for cost analytics or historical spend (cost-intelligence) or org billing (billing). Covers: Gen2 warehouses, warehouse credits, warehouse cost, how much does a warehouse cost, warehouse size, create warehouse, alter warehouse, warehouse generation, warehouse performance, warehouse types. Triggers: gen2, generation 2, GENERATION = '2', gen2 credit rate, convert to gen2, gen1 to gen2, gen2 regions, gen2 limitations, gen2 performance, warehouse generation, gen2 benchmark, compare gen1 gen2, DML performance, slow DELETE, slow MERGE, slow resume, resume time, warehouse resume, warehouse credits, warehouse cost, how much is a warehouse, warehouse pricing, warehouse size, create warehouse, alter warehouse.
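The cost-intelligence and billing entries above split along the ACCOUNT_USAGE / ORGANIZATION_USAGE line. A sketch of the kind of account-level credit query the cost-intelligence skill automates (the view and columns are standard ACCOUNT_USAGE; the 30-day window is an arbitrary choice for illustration):

```sql
-- Credits consumed per warehouse over the last 30 days.
SELECT warehouse_name,
       SUM(credits_used) AS credits
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY credits DESC;
```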
[v1.0.60] 2026-04-18 ADDED: +3 REMOVED: -3 MODIFIED: ~4

- **declarative-sharing** (added): **[REQUIRED]** Use for **ALL** declarative sharing and application packages with TYPE=DATA (i.e. data apps). Share data products across Snowflake accounts with versioning. Default choice when user wants to share data with another account. Also use when converting an existing data share to declarative sharing. Triggers: declarative, data product, native app, data app, data application, share, sharing, another account, cross account, cross region, application package, manifest, marketplace, listing, publish, share a table, share data, manifest from share, share to manifest, generate manifest from share, inspect share, share to yaml, introspect share, convert share, migrate share, existing share, secure share to declarative, upgrade share, future-proof share, multiple shares, combine shares, merge shares, multiple data shares.
- **migration-guide** (added)
- **setup-snowflake-sso** (added): Set up Single Sign-On (SSO) for Snowflake with your Identity Provider (IdP). Supports Microsoft Entra ID (Azure AD), Okta, and other SAML 2.0 providers including OneLogin, Ping Identity, Google Workspace, Auth0, Duo, JumpCloud, and more. Includes advanced scenarios: Allowed Interfaces, Auto Redirect, and Snowflake Intelligence tile setup.
- **dashboard** (removed)
- **declarative** (removed)
- **snowconvert-assessment** (removed)
- **cortex-ai-functions** (description changed): Use Snowflake Cortex AI Functions for text/image analytics. Use when: classifying content, extracting entities, sentiment analysis, summarizing text, translating, filtering, embedding, parsing documents, redacting PII, aggregating data, document intelligence workflows, content insight workflows. Triggers: AI_CLASSIFY, AI_COMPLETE, AI_EXTRACT, AI_FILTER, AI_SENTIMENT, AI_SUMMARIZE, AI_TRANSLATE, AI_EMBED, AI_AGG, AI_REDACT, AI_PARSE_DOCUMENT, classify text, data, documents, extract from text, extract text from document, extract text from PDF, extract text from image, extracting, invoices, sentiment, summarize, translate, OCR, which AI function, cortex function, process documents, label content, analyze text, OCR, read PDF, read document, get text from PDF, get text from document, pull text from file, extract data from files, extract from my files, process my files, my files, my documents, read my documents, get data from document, file extraction, document processing, file processing, get information from documents, analyze files, parse files, data from PDF, invoice processing, contract extraction, receipt extraction, form extraction, extract fields, document data, file data, stage files, files on stage, PDF extraction, image extraction, document OCR, scan documents, digitize documents.
- **machine-learning** (description changed): **[REQUIRED]** For **ALL** data science and machine learning tasks. This skill should ALWAYS be loaded in, even if only a portion of the workflow is related to machine learning. Use when: analyzing data, training models, deploying models to Snowflake, registering models, working with ML workflows, running ML jobs on Snowflake compute, model registry, model service, model inference, log model, deploy pickle file, experiment tracking, model monitoring, ML observability, tracking drift, model performance analysis, distributed training, XGBoost, LightGBM, PyTorch, DPF, distributed partition function, many model training, hyperparameter tuning, HPO, compute pools, train at scale, feature store, feature views, entities, training datasets, online features, pipeline orchestration, DAG, task graph, schedule training, datasets, dataset versioning, DataConnector, ML lineage, model lineage, GET_LINEAGE, trace lineage, forecast, forecasting, time series, anomaly detection, outlier, predict, predictions, backtest, classify, classification, regression, clustering, build a model, create a model, sklearn, scikit-learn, tensorflow, ML, mlops, ray, GPU, deep learning, neural network, explain model, SHAP, Shapley, feature importance, model explainability, interpret model, preprocessing, preprocessor, scaling, encoding, imputation, normalize, transform data before training, preprocessing pipeline. Routes to specialized sub-skills.
- **native-app-consumer** (description changed): **[REQUIRED]** Use for **ALL** Snowflake Native App consumer tasks: installing apps from listings as a consumer, configuring installed apps (granting privileges, approving specifications, reviewing references), managing maintenance policies, understanding native app cost and credit usage, and adding native apps to budgets. Triggers: native app, install native app, configure native app, approve spec, decline spec, maintenance policy, maintenance window, upgrade schedule, control upgrades, app cost, app budget, app spending, native app cost, native app credits, how much does my app cost.
- **native-app-provider** (description changed): Use for **ALL** Snowflake Native App Framework tasks: creating app packages, writing manifest files, writing setup scripts, sharing data, testing, versioning, publishing, configuring telemetry and health status reporting, monitoring app health and lifecycle events, setting up event sharing, and debugging apps. This is the **REQUIRED** entry point for any native app work. DO NOT attempt native app development manually - invoke this skill first. Triggers: native app, app package, application package, manifest.yml, setup script, CREATE APPLICATION, Snowflake marketplace, listing, native app framework, build native app, walk me through, guide me, get started, add version, register version, add patch, release channel, release directive, publish app, publish version, upgrade consumers, telemetry, health status, SYSTEM$REPORT_HEALTH_STATUS, log_level, trace_level, event definitions, event sharing, APPLICATION_STATE, lifecycle events, monitor app, debug app, observability, add streamlit, streamlit dashboard, add dashboard, streamlit UI, add UI to native app, native app streamlit, streamlit frontend, get_active_session, default_streamlit.
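The cortex-ai-functions triggers above name the AI_* SQL family directly. A sketch of the row-wise usage pattern (the table and column are hypothetical, and exact signatures can vary by release, so treat this as illustrative rather than definitive):

```sql
-- Classify and score support tickets in place.
SELECT
  AI_CLASSIFY(ticket_text, ['billing', 'outage', 'feature request']) AS category,
  AI_SENTIMENT(ticket_text) AS sentiment
FROM tickets;
```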
Base Version (when this tracker started)

[v1.0.58+183258.b9d8c6466577] 41 bundled skills
- • build-react-appBuild React/Next.js apps with Snowflake data. Use when: building dashboards, cre… +Build React/Next.js apps with Snowflake data. Use when: building dashboards, creating data apps, making analytics tools.
- • cortex-agent
**[REQUIRED]** Use for ALL requests that mention agents: list, show, create, bui… +**[REQUIRED]** Use for ALL requests that mention agents: list, show, create, build, set up, edit, modify, update, delete, drop, remove, download, export, debug, fix, troubleshoot, optimize, improve, evaluate, or analyze a (Cortex) agent. Also use when user wants to: chat with, talk to, converse with, send messages to, have a conversation with an agent, or run a lite/objectless agent. Also use when debugging Snowflake Intelligence with a request ID (SI is powered by Cortex Agents). This is the REQUIRED entry point - even if the request seems simple. DO NOT attempt to manage (Cortex) agents manually - always invoke this skill first.adhoc-testing-for-cortex-agent Interactive testing of Cortex Agents. Use this when you want to test s…agent-observability-report Generate comprehensive observability reports for Cortex Agents using A…agent-system-of-record Establish a consistent protocol for tracking agent optimization work b…best-practices This skill contains best practices on how to write good system prompt/…chat-with-agent Interactive chat with a Cortex Agent. Supports object-based and lite (…create-cortex-agent Create and administer Cortex Agents. Use for: creating agents, adding …dataset-curation Create and manage evaluation datasets for Cortex Agents. Use this to b…debug-single-query-for-cortex-agent Interactively debug specific agent query failures to identify and fix …delete-cortex-agent Delete (DROP) an existing Cortex Agent. Use for: delete agent, drop ag…edit-cortex-agent Edit an existing Cortex Agent's configuration (instructions, tools, co…evaluate-cortex-agent Run formal evaluations on Cortex Agents using Snowflake's native Agent…investigate-cortex-agent-evalslist-cortex-agents List Cortex Agents in a Snowflake account, database, or schema. Use fo…manage-agent-threads Manage Cortex Agent conversation threads. 
Create, list, describe, upda…optimize-cortex-agent This goes through a workflow to guide AI assistants through optimizing…optimize-cortex-search-service
- • cortex-ai-functions
Use Snowflake Cortex AI Functions for text/image analytics. Use when: classifying content, extracting entities, sentiment analysis, summarizing text, translating, filtering, embedding, parsing documents, redacting PII, aggregating data, document intelligence workflows, content insight workflows. Triggers: AI_CLASSIFY, AI_COMPLETE, AI_EXTRACT, AI_FILTER, AI_SENTIMENT, AI_SUMMARIZE, AI_TRANSLATE, AI_EMBED, AI_AGG, AI_REDACT, AI_PARSE_DOCUMENT, classify text, data, documents, extract from text, extract text from document, extract text from PDF, extract text from image, extracting, invoices, sentiment, summarize, translate, which AI function, cortex function, process documents, label content, analyze text, OCR, read PDF, read document, get text from PDF, get text from document, pull text from file, extract data from files, extract from my files, process my files, my files, my documents, read my documents, get data from document, file extraction, document processing, file processing, get information from documents, analyze files, parse files, data from PDF, invoice processing, contract extraction, receipt extraction, form extraction, extract fields, document data, file data, stage files, files on stage, PDF extraction, image extraction, document OCR, scan documents, digitize documents.
  - document-intelligence: Extract, parse, analyze and classify documents using Snowflake Cortex …
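For a flavor of what these functions look like in practice, here is a minimal sketch; the `tickets` table and its columns are hypothetical, and exact signatures should be checked against the AISQL function reference:

```sql
-- Classify free-text support tickets and score their sentiment.
-- The TICKETS table and its columns are hypothetical example names.
SELECT
    ticket_id,
    AI_CLASSIFY(body, ['billing', 'bug report', 'feature request']) AS category,
    AI_SENTIMENT(body) AS sentiment
FROM tickets;
```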
- • cortex-code-guide
Complete reference guide for Cortex Code (CoCo) CLI. Use when: learning cortex features, understanding commands, troubleshooting setup, exploring Snowflake tools, managing sessions, configuring agents, keyboard shortcuts, MCP integration. Triggers: how to use cortex, cortex guide, cortex help, cortex commands, getting started, snowflake tools, #table syntax, subagents, sessions, resume, fork, rewind, compact, /agents, configuration.
- • cortex-secrets
MUST consult whenever any command needs a credential, secret, API key, token, or password — whether discovered from an error, source code, --help output, or any other signal. MUST also consult when the user shares, pastes, or includes a secret value directly in their message. Also use when: the user asks about /secrets, storing credentials, secret scopes, or consent modes. Triggers: secret, secrets, /secrets, API key, credential, token, password, authentication, unauthorized, 401, 403, forbidden, EACCES, permission denied, access denied, missing key, invalid token, auth error, connection refused, login failed, .env, environment variable, env var, keychain, export SECRET, cortex secret list, inline secret injection, pasted secret, shared secret, my key is, my password is, my token is, here is my, use it to.
- • cost-intelligence
**[REQUIRED]** Use for ALL Snowflake cost and billing questions: spending, credits, costs, warehouse costs, compute costs, serverless credits, AI costs, storage costs, budgets, resource monitors, metering, consumption, billing, user spending, top spenders, who is spending, expensive queries, query costs, budget actions, budget notifications, budget alerts, spending limit, create budget, set budget, drop budget, delete budget, remove budget, where is my money going, cost breakdown, credits by service, overall spending, cost increase, why did costs go up, unusual spending, cost spikes, cost anomaly, anomaly notification, anomaly email, cost spike alert, cortex cost, cortex credits, cortex spend, cortex AI cost, cortex AI function cost, cortex AI function costs, cortex AI function credits, AI function cost, AI function costs, analyst cost, analyst credits, LLM cost, cortex search cost, cortex search credits, cortex agents cost, cortex agents credits, cortex code cost, cortex code CLI cost, cortex code Snowsight cost, snowflake intelligence cost, snowflake intelligence credits, fine-tuning cost, model training cost, provisioned throughput cost, PTU cost, team costs, department spending, cost center, chargeback, showback, SPCS cost, compute pool credits, container services cost, data transfer cost, cross-region cost, cross-cloud cost, egress cost, budget status, budget spend, over budget, at risk budget.
- • dashboard
Create, modify, and answer questions about interactive dashboards with charts, tables, and markdown widgets. Use when users ask for: dashboards, KPI reports, executive summaries, multi-chart visualizations, data overviews, metric tracking, modifying dashboard widgets, adding charts, changing layouts, fixing dashboard issues, 'show me everything about...'. Triggers: dashboard, create dashboard, build dashboard, modify dashboard, update dashboard, fix dashboard, add widget, change chart, KPI dashboard, executive report, overview dashboard, sales dashboard, performance dashboard, metrics dashboard, visualizations, dashboard layout, dashboard spec.
- • data-cleanrooms
Use for ALL requests related to Snowflake Data Clean Rooms (DCR): clean room, cl… +Use for ALL requests related to Snowflake Data Clean Rooms (DCR): clean room, cleanroom, DCR, collaboration(s), view/list collaborations, join/review collaboration, invitation, data offering(s), template(s), register, share table, run analysis, run activation, audience overlap, activation, export segment, create collaboration, create cleanroom, measure overlap. Covers browsing, joining, registering, running analysis/activation, and creating collaborations via the DCR Collaboration API.browse Browse Clean Room Environment - explore collaborations, data offerings…create Create a new DCR collaboration or clean room - gather collaborators, c…register Register data offerings and templates for use in DCR collaborations. T…review-join Review and Join Collaborations - review invitations, check status, and…run Run analysis or activation templates on DCR collaborations. Triggers: …activation Run activation templates - standard audience overlap activation or cus…analysis Run analysis templates - standard audience overlap or custom sql_analy…
- • data-governance
**[REQUIRED]** for all Snowflake data governance tasks. Routes to six sub-skills: (1) horizon-catalog — access history, users, roles, grants, permissions, query history, compliance, catalog; (2) data-policy — [REQUIRED] masking, row access, projection policies, tag-based masking, protect sensitive data, column/TIMESTAMP masking; (3) sensitive-data-classification — [REQUIRED for ALL classification] PII, classify, data classification, manual/automatic classification, Classification Profile, auto_tag, custom classifiers, regex, semantic/privacy category, IDENTIFIER, QUASI_IDENTIFIER, SENSITIVE, SYSTEM$CLASSIFY, DATA_CLASSIFICATION_LATEST, GDPR/CCPA/PCI; (4) governance-maturity-score — governance posture, maturity score, assessment, recommendations; (5) observability-maturity-score — data observability, DMF coverage, quality monitoring maturity, lineage usage, observability assessment; (6) object-contacts — [REQUIRED] assign data steward, create contact, object contact, contact report, who owns this table, SET CONTACT, data stewardship. MUST be used for classification or masking tasks — do not answer from general knowledge. horizon-catalog is the fallback. Triggers: governance, access history, permissions, grants, roles, audit, compliance, catalog, masking policy, row access policy, PII, sensitive data, classification, run classification, SYSTEM$CLASSIFY, classifier, classification profile, DATA_CLASSIFICATION_LATEST, detect PII, GDPR, CCPA, PCI, tag sensitive columns, governance maturity score, governance posture, how well governed, data observability, observability maturity, DMF coverage, lineage usage, observability assessment, data steward, object contact, assign contact, who owns this table, contact report, SET CONTACT.
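For the data-policy sub-area, a representative masking task looks like this minimal sketch (the policy, role, table, and column names are hypothetical):

```sql
-- Mask the email column for every role except PII_ADMIN.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
         ELSE '*** MASKED ***'
    END;

ALTER TABLE customers MODIFY COLUMN email
    SET MASKING POLICY email_mask;
```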
- • data-products
Create organizational listings to share data products via Internal Marketplace. Triggers: create data product, share to internal marketplace, publish to internal marketplace, share to other accounts, share with other accounts, organization listing, org listing, share across accounts, internal marketplace, cross-account sharing, share my agent to other accounts.
WHEN TO USE THIS SKILL:
- User wants to share with OTHER ACCOUNTS → Use this skill
- User mentions "internal marketplace" or "data product" (even for same account) → Use this skill
WHEN TO USE RBAC INSTEAD (not this skill):
- User wants to share with roles in SAME account only
- User does NOT mention "internal marketplace" or "data product" or "listing"
- Example: "share this table with ANALYST role" → Use GRANT, not this skill
KEY: If user says "share via internal marketplace" or "as a data product" even for same-account roles, use this skill. Otherwise, same-account = regular RBAC grants.
  - certification: Recommend the best Snowflake table(s) to answer a user's data question…
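The same-account RBAC path referred to above is an ordinary grant, e.g. (object and role names are hypothetical):

```sql
-- Same-account sharing: a plain grant, no listing or data product needed.
GRANT SELECT ON TABLE sales_db.public.orders TO ROLE analyst;
```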
- • data-quality
Schema-level data quality monitoring, table comparison, dataset popularity analysis, and ad-hoc column quality assessment using Snowflake Data Metric Functions (DMFs) and Access History. Use when user asks about: data quality, schema health, DMF results, quality score, trust my data, quality regression, quality trends, SLA alerting, data metric functions, failing metrics, quality issues, compare tables, data diff, validate migration, table comparison, popular tables, most used tables, unused data, dataset usage, table popularity, listing quality, listing health, listing freshness, provider data quality, consumer data quality, one-time quality check, quick quality scan, check data quality without DMFs, recommend monitors, what should I monitor, DQ coverage gaps, unmonitored tables, DMF coverage report, monitoring health, noisy monitors, silent monitors, misconfigured monitors, DMF cost optimization, investigate DQ incident, why did freshness drop, why did row count drop, correlate violation, multi-dimensional root cause, circuit breaker, pause pipeline on violation, halt bad data propagation, custom DMF, format validation DMF, email format check, value range check, referential integrity DMF, DMF expectations, set threshold, tune DMF threshold, DMF expectation management, attach DMFs, set up DMFs for first time, DMF setup wizard, accepted values, ACCEPTED_VALUES, validate column values, allowed values check, value in set, categorical validation.
- • dbt-projects-on-snowflake
ONLY for dbt projects deployed INTO Snowflake as native objects via the `snow dbt` CLI — NOT for normal dbt development. Invoke ONLY when the user explicitly mentions: `snow dbt` commands (deploy, execute, list), `EXECUTE DBT PROJECT` SQL, a deployed dbt project object (e.g., DB.SCHEMA.MY_PROJECT), `ALTER/DROP/DESCRIBE/SHOW DBT PROJECT` SQL, scheduling a deployed dbt project with CREATE TASK, OR generating documentation/catalog/lineage for a deployed project. Do NOT invoke for standard dbt workflows: dbt run, dbt build, dbt test, dbt seed, dbt init, dbt compile, dbt debug, dbt snapshot, dbt deps, dbt clean, dbt retry, dbt ls, profiles.yml, dbt_project.yml, model editing, source freshness, Jinja/macro development, CI/CD pipelines, or any dbt command run from a terminal. The key distinction: this skill is about dbt-as-a-Snowflake-object (snow dbt deploy), not dbt-as-a-CLI-tool (dbt run). Triggers: snow dbt, snow dbt deploy, snow dbt execute, snow dbt list, EXECUTE DBT PROJECT, deployed dbt project, ALTER DBT PROJECT, DROP DBT PROJECT, DESCRIBE DBT PROJECT, SHOW DBT PROJECTS, VERSION$, external-access-integration, dbt project object, migrate, prepare for snowflake, docs generate deployed, documentation deployed project, data catalog deployed, lineage deployed project, generate documentation for deployed.
  - deploy: Deploy dbt projects to Snowflake
  - execute: Execute dbt commands on Snowflake (run, test, build, seed, snapshot, s…
  - manage: Manage dbt projects in Snowflake (list, rename, drop, describe, add ve…
  - migrate: Migrate dbt projects to run on Snowflake. Triggers: migrate, env_var, …
  - monitoring: Monitor dbt project executions: get logs, locate artifacts, download a…
  - schedule: **[REQUIRED]** Schedule dbt project execution via Snowflake Tasks. Inv…
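The dbt-as-a-Snowflake-object side of that distinction reduces to SQL like the following sketch (the project name is hypothetical; check the EXECUTE DBT PROJECT reference for exact argument syntax):

```sql
-- Run a dbt project that was deployed into Snowflake via `snow dbt deploy`.
EXECUTE DBT PROJECT analytics_db.dbt.my_project args='run --select staging';
```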
- • dcm
Use for **ALL** requests that mention: create, build, set up, debug, fix, troubleshoot, optimize, improve, evaluate, or analyze a DCM project. This is the **REQUIRED** entry point - even if the request seems simple. DO NOT attempt to create DCM projects manually or search for DCM documentation - always invoke this skill first. This skill guides users through creating, auditing, evaluating, and debugging workflows for DCM (Database Change Management) projects. Triggers: DCM, DCM project, Database Change Management, snow dcm, manifest.yml with DEFINE, infrastructure-as-code, three-tier role pattern, database roles, DEFINE TABLE, DEFINE SCHEMA.
  - create-project: Create new DCM projects from scratch. Triggers: new project, create dc…
  - deploy-project: Safe deployment workflow for DCM projects. Triggers: deploy dcm, apply…
  - modify-project: Modify existing DCM projects. Triggers: modify dcm, update project, ad…
  - roles-and-grants: Best practices for roles and grants in DCM projects. Triggers: dcm rol…
- • declarative
**[REQUIRED]** Use for **ALL** declarative sharing and application packages with TYPE=DATA (i.e., data apps). Share data products across Snowflake accounts with versioning. Default choice when user wants to share data with another account. Also use when converting an existing data share to declarative sharing. Triggers: declarative, data product, native app, data app, data application, share, sharing, another account, cross account, cross region, application package, manifest, marketplace, listing, publish, share a table, share data, manifest from share, share to manifest, generate manifest from share, inspect share, share to yaml, introspect share, convert share, migrate share, existing share, secure share to declarative, upgrade share, future-proof share, multiple shares, combine shares, merge shares, multiple data shares
- • deploy-to-spcs
Deploy containerized apps to Snowpark Container Services. Use when: deploying Docker apps, creating SPCS services, pushing images to Snowflake registry, granting role access to SPCS service endpoints. Triggers: SPCS, Snowpark Container Services, deploy to Snowflake, container deployment, grant access to service, grant role access, service role, consumer access, SPCS service, service endpoints.
- • developing-with-streamlit
**[REQUIRED]** Use for ALL Streamlit tasks: creating, editing, debugging, beautifying, styling, theming, optimizing, or deploying Streamlit applications. Also required for building custom components (inline or packaged), using st.components.v2, or any HTML/JS/CSS component work. Triggers: streamlit, st., dashboard, app.py, beautify, style, CSS, color, background, theme, button, widget styling, custom component, st.components, packaged component, pyproject.toml, asset_dir, CCv2, HTML/JS component.
- • dynamic-tables
**[REQUIRED]** Use for **ALL** Snowflake Dynamic Table operations: creating, optimizing, monitoring, and troubleshooting. This is the required entry point for any dynamic table related tasks (DT is an acronym for dynamic table). Triggers: dynamic table, data pipeline, incremental pipeline, DT pipeline, incremental refresh, target lag, UPSTREAM_FAILED, refresh failing, full refresh instead of incremental, DT health, create DT, debug DT.
  - create: Create new Snowflake dynamic tables with proper configuration
  - dt-alerting: Set up monitoring and alerting for dynamic table refreshes using event…
  - monitor: Monitor health, status, and refresh performance of Snowflake dynamic t…
  - optimize: Optimize Snowflake dynamic table performance and cost
  - permissions: Troubleshoot dynamic table failures because of permissions/privilege i…
  - task-to-dt: Convert streams and tasks pipelines to dynamic tables. Use when: repla…
  - troubleshoot: Diagnose and fix dynamic table refresh failures, UPSTREAM_FAILED error…
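A minimal create sketch for orientation (the table, warehouse, and source names are hypothetical):

```sql
-- Incremental pipeline: Snowflake keeps this table within 5 minutes of its source.
CREATE OR REPLACE DYNAMIC TABLE daily_orders
    TARGET_LAG = '5 minutes'
    WAREHOUSE = transform_wh
AS
    SELECT order_date, COUNT(*) AS order_count
    FROM raw_orders
    GROUP BY order_date;
```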
- • error-tables-ops
Assess, enable, monitor, and manage Error Tables (DML Error Logging) across your Snowflake account. Use when: error tables, error logging, ERROR_TABLE, DML errors, which tables should I enable, which tables have error logging, analyze errors, error table storage, error table retention, clean up errors, monitor errors, error table health, error table report, set up alerting, failed DML queries, string truncation, NOT NULL violation, numeric overflow, check constraint violation, constraint failed.
- • iceberg
Use for **ALL** Iceberg table requests in Snowflake. This is the **REQUIRED** entry point for catalog integrations, catalog-linked databases, external volumes, auto-refresh issues, and Snowflake Intelligence. DO NOT work with Iceberg manually - invoke this skill first. Triggers: iceberg, iceberg table, apache iceberg, catalog integration, REST catalog, ICEBERG_REST, glue, AWS glue, glue IRC, lake formation, unity catalog, databricks, polaris, opencatalog, open catalog, CLD, catalog-linked database, linked catalog, auto-discover tables, sync tables, LINKED_CATALOG, external volume, storage access, S3, Azure blob, GCS, IAM role, trust policy, Access Denied, 403 error, ALLOW_WRITES, storage permissions, auto-refresh, autorefresh, stale data, refresh stuck, delta direct, snowflake intelligence, text-to-SQL iceberg, query iceberg natural language.
  - auto-refresh: Debug auto-refresh issues for Iceberg and Delta Direct tables in Snowf…
  - catalog-linked-database: Setup, verify, and troubleshoot catalog-linked databases (CLD) for RES…
    - create: Create and execute catalog-linked database SQL
    - setup: Gather configuration options for catalog-linked database setup
    - verify: Verify catalog-linked database sync status and table health
  - cld-snowflake-intelligence: Surface Iceberg tables from Catalog-Linked Databases (CLD) in Snowflak…
  - external-volume: Use for **ALL** requests related to debugging, troubleshooting, or dia…
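For orientation, a hedged sketch of a Glue catalog integration (the role ARN, catalog ID, and namespace are placeholders; verify parameter names against the CREATE CATALOG INTEGRATION reference):

```sql
-- Catalog integration pointing at an AWS Glue Data Catalog.
CREATE CATALOG INTEGRATION glue_catalog
    CATALOG_SOURCE = GLUE
    CATALOG_NAMESPACE = 'my_glue_db'
    TABLE_FORMAT = ICEBERG
    GLUE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-glue'
    GLUE_CATALOG_ID = '123456789012'
    ENABLED = TRUE;
```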
- • integrations
Create, replace, alter, drop, describe, and show Snowflake integrations. Covers API, catalog, external access, notification, security, and storage integration types. Use when the user wants to manage integrations or asks about integration SQL commands.
  - alter-api-integration: Modify properties of an existing API integration (AWS API Gateway, Azu…
  - alter-catalog-integration: Modify properties of an existing catalog integration for Apache Iceber…
  - alter-external-access-integration: Modify properties of an existing external access integration used for …
  - alter-integration: Modify properties of an existing integration (generic). Use a type-spe…
  - alter-notification-integration: Modify properties of an existing notification integration (cloud messa…
  - alter-security-integration: Modify properties of an existing security integration (SCIM, SAML2, Sn…
  - alter-storage-integration: Modify properties of an existing storage integration (Amazon S3, Googl…
  - create-api-integration: Create a new API integration for AWS API Gateway, Azure API Management…
  - create-catalog-integration: Create a new catalog integration for Apache Iceberg tables (AWS Glue, …
  - create-external-access-integration: Create a new external access integration for network access to externa…
  - create-integration: Create a new integration (generic overview). Use a type-specific CREAT…
  - create-notification-integration: Create a new notification integration for cloud message queuing servic…
  - create-security-integration: Create a new security integration (SCIM, SAML2, Snowflake OAuth, Exter…
  - create-storage-integration: Create a new storage integration for Amazon S3, Google Cloud Storage, …
  - describe-catalog-integration: Describe the properties of a specific catalog integration
  - describe-integration: Describe the properties of a specific integration of any type
  - describe-notification-integration: Describe the properties of a specific notification integration
  - drop-catalog-integration: Remove a catalog integration from the Snowflake account
  - drop-integration: Remove any type of integration from the Snowflake account. Syntax: DRO…
  - show-catalog-integrations: List catalog integrations in the account with their metadata and prope…
  - show-delegated-authorizations: List active delegated authorizations for a user, integration, or the e…
  - show-integrations: List integrations in the account, optionally filtered by type. Syntax:…
  - show-notification-integrations: List notification integrations in the account
- • interactive
**[REQUIRED]** Use for **ALL** Snowflake Interactive Table and Interactive Warehouse operations. Triggers: interactive table, interactive warehouse, low-latency queries, high-concurrency dashboard, TARGET_LAG for interactive.
  - clustering: Choose optimal clustering keys for interactive tables using query anal…
  - create: Create Snowflake interactive tables (static, dynamic). Triggers: creat…
  - getting-started: Getting started with interactive tables - convert existing tables, est…
  - query: Query patterns, JOINs, and benchmarking for Snowflake interactive tabl…
  - troubleshoot: Troubleshoot errors and performance issues with interactive tables/war…
  - update-delete: UPDATE/DELETE operations for interactive tables via standard + dynamic…
  - warehouse: Create and manage Snowflake interactive warehouses. Triggers: create i…
- • investigation
Comprehensive Snowflake security investigation and threat detection. Use for: login anomalies, IP analysis, brute force detection, impossible travel, data exfiltration, bulk exports, unauthorized sharing, privilege escalation, RBAC violations, suspicious grants, backdoor accounts. This is the REQUIRED entry point for all security investigations. Routes to specialized sub-skills for focused analysis.
  - exfiltration-detection: Detect data exfiltration attempts in Snowflake. Use when: investigatin…
  - login-ip-anomaly: Detect IP address anomalies in Snowflake LOGIN_HISTORY. Use when: logi…
  - privilege-escalation: Detect privilege escalation attempts in Snowflake. Use when: investiga…
- • key-and-secret-management
Use for **ALL** requests that mention Tri-Secret Secure, customer-managed key operations, or periodic data rekeying in Snowflake. Handles CMK status checks, registration, activation (standard, Postgres, private connectivity), deactivation, key rotation, change history, and periodic data rekeying. DO NOT attempt TSS, CMK, or periodic rekeying operations manually - invoke this skill first. Triggers: tri-secret secure, TSS, CMK, BYOK, encryption key, key rotation, CMK history, activate CMK, deactivate CMK, periodic rekeying, periodic data rekeying, PERIODIC_DATA_REKEYING, data rekey, enable rekeying, disable rekeying.
  - tri-secret-secure: Manage Tri-Secret Secure (TSS) encryption with customer-managed keys (…
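The periodic-rekeying piece reduces to a single account parameter, sketched here (requires an appropriately privileged role):

```sql
-- Re-encrypt data protected by keys older than a year, on an ongoing basis.
ALTER ACCOUNT SET PERIODIC_DATA_REKEYING = TRUE;
```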
- • lineage
Analyze data lineage and dependencies in Snowflake. Use for: impact analysis, root cause debugging, data discovery, column-level tracing. Triggers: 'what depends on', 'what breaks', 'where does this come from', 'is this trustworthy', 'column lineage'. For quality issues (missing data, wrong values, DMF failures) use the data_quality skill first, then this skill to trace upstream.
- • machine-learning
**[REQUIRED]** For **ALL** data science and machine learning tasks. This skill should ALWAYS be loaded even if only a portion of the workflow is related to machine learning. Use when: analyzing data, training models, deploying models to Snowflake, registering models, working with ML workflows, running ML jobs on Snowflake compute, model registry, model service, model inference, log model, deploy pickle file, experiment tracking, model monitoring, ML observability, tracking drift, model performance analysis, distributed training, XGBoost, LightGBM, PyTorch, DPF, distributed partition function, many model training, hyperparameter tuning, HPO, compute pools, train at scale, feature store, feature views, entities, training datasets, online features, pipeline orchestration, DAG, task graph, schedule training, datasets, dataset versioning, DataConnector, ML lineage, model lineage, GET_LINEAGE, trace lineage, forecast, forecasting, time series, anomaly detection, outlier, predict, predictions, backtest, classify, classification, regression, clustering, build a model, create a model, sklearn, scikit-learn, tensorflow, ML, mlops, ray, GPU, deep learning, neural network, explain model, SHAP, Shapley, feature importance, model explainability, interpret model. Routes to specialized sub-skills.
  - batch-inference-jobs: Run batch inference on models in Snowflake Model Registry. Covers BOTH…
    - non-template: Batch inference on media files (images, audio, video) using InputSpec …
    - template: Batch inference with multimodal LLMs using OpenAI chat message format.…
  - datasets: Snowflake Datasets for ML workflows. Use when: creating versioned data…
  - debug-inference: Debug model inference issues for both warehouse and SPCS. Covers dtype…
  - distributed-training: Distributed ML training on Snowpark Container Services. Routes to spec…
    - dpf: General-purpose distributed processing with DPF. Custom distributed wo…
    - estimators: Distributed model training with XGBEstimator, LightGBMEstimator, and P…
    - mmt: Train and run inference on one model per data partition using ManyMode…
    - tuner: Distributed hyperparameter tuning with Ray Tune on Snowpark Container …
  - experiment-tracking: Track ML experiments in Snowflake. Use when: logging metrics, logging …
  - feature-store: **[REQUIRED]** Use for **ALL** Snowflake Feature Store operations: cre…
    - create: Create Snowflake Feature Store, register entities, create feature view…
    - lineage: Feature lineage analysis (which models consume which features) and cre…
    - migrate: Migrate to Snowflake Feature Store from Feast, Tecton, or custom featu…
    - monitor: Monitor, audit, validate, and promote Snowflake Feature Store: health …
    - online: Enable and use Snowflake Feature Store online serving for low-latency …
    - pipelines: Build and manage Snowflake Feature Store pipelines — managed (Dynamic …
    - training: Generate training datasets from Snowflake Feature Store with point-in-…
  - inference-logs: View and analyze captured inference data from model services with Auto…
  - ml-development: **[REQUIRED]** for ALL data science, machine learning, data analysis, …
  - ml-jobs: Transform local Python scripts into Snowflake ML Jobs. Use when: runni…
  - ml-lineage: Query and manage ML Lineage in Snowflake. Use when: tracing model trai…
  - ml-pipeline-orchestration: Create and deploy ML pipelines using Snowflake Task Graphs (DAGs). Use…
  - model-monitor: Set up and manage ML Observability for Snowflake Model Registry models…
  - model-registry: Deploy models to Snowflake Model Registry and route to inference deplo…
    - hugging-face-models: Deploy Hugging Face models to Snowflake Model Registry. Use when: Logg…
    - partitioned-inference: Partitioned inference with CustomModel and @partitioned_api decorator
    - spcs-inference: Deploy models from Snowflake Model Registry to Snowpark Container Serv…
- • native-app-consumer
Use for **ALL** Snowflake Native App consumer tasks: installing apps from listings as a consumer, configuring installed apps (granting privileges, approving specifications, reviewing references), and managing maintenance policies. Triggers: install native app, configure native app, approve spec, decline spec, maintenance policy, maintenance window, upgrade schedule, control upgrades.
  - configure-app: Review and configure an installed Snowflake Native App as a consumer: …
  - install-app: Install a Snowflake Native App from a Marketplace listing as a consume…
  - manage-maintenance-policy: Create, apply, and manage consumer-controlled maintenance policies for…
- • native-app-provider
Use for **ALL** Snowflake Native App Framework tasks: creating app packages, writing manifest files, writing setup scripts, sharing data, testing, versioning, publishing, configuring telemetry and health status reporting, monitoring app health and lifecycle events, setting up event sharing, and debugging apps. This is the **REQUIRED** entry point for any native app work. DO NOT attempt native app development manually - invoke this skill first. Triggers: native app, app package, application package, manifest.yml, setup script, CREATE APPLICATION, Snowflake marketplace, listing, native app framework, build native app, walk me through, guide me, get started, add version, register version, add patch, release channel, release directive, publish app, publish version, upgrade consumers, telemetry, health status, SYSTEM$REPORT_HEALTH_STATUS, log_level, trace_level, event definitions, event sharing, APPLICATION_STATE, lifecycle events, monitor app, debug app, observability.
  - app-version-release: Manage versions, patches, and release channels for a Snowflake Native …
  - configure-event-sharing: Configure event sharing for a Snowflake Native App: set up event accou…
  - configure-telemetry-event-and-health-update: Configure telemetry levels, event definitions, health status reporting…
  - debug-app: Debug a Snowflake Native App in a developer account: session debug mod…
  - deploy-test: Deploy and test a Snowflake Native App: create application package, up…
  - monitor-app-telemetry-event-and-status: Query and monitor Snowflake Native App health status, lifecycle events…
  - request-account-privilege: Configure the account-level privileges a Snowflake Native App requests…
  - request-external-access-integration: Configure External Access Integrations (EAI) for a Snowflake Native Ap…
  - request-listing: Configure a Listing (data sharing) app specification for a Snowflake N…
  - request-object-access: Configure a Snowflake Native App to request access to consumer-owned o…
  - request-security-integration: Configure a Security Integration app specification for a Snowflake Nat…
  - setup-app: Prepare local files for a new Snowflake Native App: write manifest.yml…
  - shared-data: Share data content with consumers in a Snowflake Native App: tables in…
- • network-security
Recommend, evaluate, and migrate Snowflake network policies using built-in security procedures. Use when: generating network policy recommendations from access history, evaluating candidate policies before deployment, migrating existing policies to use Snowflake-managed SaaS rules, creating hybrid policies combining custom rules with SaaS rules. Triggers: recommend network policy, evaluate network policy, candidate policy, migrate policy, SaaS rules, hybrid policy.
- • openflow
Openflow data integration operations. Openflow is a Snowflake NiFi-based product for data replication and transformation. Use for connector deployment, configuration, diagnostics, and custom flows.
- • organization-management
Snowflake organization management — accounts, org users, org insights, org spending, org security, globalorgadmin. ORGANIZATION_USAGE views, cross-account analytics, org-wide metrics. Use when the user asks about: 30 day summary of my organization, 30-day summary, accounts in my organization, list accounts, how many accounts, account editions, account regions, account inventory, organization users, organization user groups, executive summary of my org, org overview, org spending, org cost, org security posture, org reliability, org auth posture, org hub, org usage views, trust center, MFA readiness, login failures, warehouse credits, storage trends, edition distribution, who has globalorgadmin, what is globalorgadmin, globalorgadmin role, orgadmin role, organization administrator, org admin, enable orgadmin, disable orgadmin, org admin permissions, account admins, ORGANIZATION_USAGE, org-level, cross-account, org-wide.
  - accounts: Snowflake account inventory, editions, and role analytics across your …
  - globalorgadmin: GLOBALORGADMIN and ORGADMIN role reference — when to use each role, ho…
  - org-hub: Executive organization summaries, org-wide spending, reliability, and …
  - org-usage-view: ORGANIZATION_USAGE view discovery, access troubleshooting, and feature…
- • semantic-view
**[REQUIRED]** Use for ALL requests that mention: create, build, debug, fix, troubleshoot, optimize, improve, or analyze a semantic view — AND for requests about VQR suggestions, verified queries, verified query representations, seeding/generating queries, suggesting metrics, suggesting filters, recommending metrics/filters/facts, or enriching a semantic view. This is the REQUIRED entry point - even if the request seems simple. DO NOT attempt to create, debug, or generate suggestions for semantic views manually - always invoke this skill first. This skill guides users through creation, setup, auditing, VQR suggestion generation, filter & metric suggestions, and SQL generation debugging workflows for semantic views with Cortex Analyst.
  - audit: Comprehensive audit system for semantic views with multiple audit type…
  - best_practices: Verify semantic view compliance with established best practices includ…
  - custom_criteria: Evaluate semantic view against user-defined validation rules and custo…
  - vqr_testing: Systematically test semantic views by evaluating all verified queries …
  - creation: Create new semantic views using FastGen system function for automated …
  - debug: Debug and fix specific SQL generation issues in semantic views. Diagno…
  - filters_and_metrics_suggestions: Suggest filters, metrics, and facts for a semantic view by analyzing q…
  - import_tableau: Import Tableau workbooks (.twb/.twbx) and datasources (.tds/.tdsx) int…
  - optimization: Library of optimization patterns for dimensions, metrics, filters, rel…
  - setup: Initial setup for all semantic view workflows. Creates session directo…
  - time_tracking: Track and report execution time for workflow steps including setup, au…
  - upload: Upload a semantic view YAML file to Snowflake database.schema
  - validation: Validate semantic model changes by comparing SQL execution results wit…
  - vqr_suggestions: Generate verified query (VQR) suggestions for a semantic view by analy…
- • skill_development
Create, document, or audit skills for Cortex Code. Use when: creating new skills, capturing session work as skills, reviewing skills. Triggers: create skill, build skill, new skill, summarize session, capture workflow, audit skill, review skill.
  - audit-skill: Audit and lint skills against best practices. Use when: reviewing skil…
  - create-from-scratch: Create new skills from scratch. Use when user wants to build a new ski…
  - summarize-session: Capture current session as reusable skill. Use when: user wants to tur…
- • snowconvert-assessment
Analyzes workloads to be migrated to Snowflake using SnowConvert assessment reports. Routes to specialized sub-skills for high-quality assessments. Use this skill when the user wants an assessment of their code or ETL workload, waves generation, object exclusion, or dynamic SQL and/or ETL analysis (SSIS).
  - analyzing-sql-dynamic-patterns: Analyzes Dynamic SQL occurrences from SnowConvert issues, classifies p…
  - etl-assessment
  - object_exclusion_detection: Analyze SnowConvert reports for naming conventions to identify tempora…
  - waves-generator: Analyze SQL object dependencies and create deployment waves/partitions…
- • snowflake-notebooks
Create and edit Workspace notebooks (.ipynb files) for Snowflake. Use when: creating workspace notebooks, editing notebooks, debugging notebook issues, converting code to notebooks, multi-step workflows that combine SQL queries with Python code execution and visualization, step-by-step data analysis requiring both SQL and Python, interactive data exploration with code and charts. Do NOT use for: static SQL-only dashboards (use dashboard skill), Streamlit apps, standalone Python scripts, or stored procedures. Triggers: notebook, .ipynb, snowflake notebook, workspace notebook, create notebook, edit notebook, jupyter, ipynb file, notebook cell, SQL cell, step-by-step analysis with SQL and Python, data exploration with code and visualization, combine SQL and Python.
- • snowflake-postgres
**[REQUIRED]** Use for **ALL** requests involving Snowflake Postgres: create instance, list instances, suspend, resume, reset credentials, describe instance, import connection, health check, diagnostics, pg_lake, iceberg tables, data lake, storage integration. Triggers: 'postgres', 'pg', 'create instance', 'show instances', 'suspend', 'resume', 'reset credentials', 'rotate password', 'reset access', 'import connection', 'network policy', 'my IP', 'health check', 'diagnose', 'insights', 'pg_doctor', 'slow queries', 'cache hit', 'bloat', 'vacuum', 'dead rows', 'locks', 'blocking queries', 'blocked', 'disk usage', 'what's running', 'active queries', 'connection count', 'pg_lake', 'iceberg', 'data lake', 'storage integration', 'parquet', 'COPY to S3', 'export to S3', 'lake'.
  - connect: Network policy setup and connectivity checks. Triggers: 'my IP', 'netw…
  - diagnose: Run Postgres health diagnostics via pg_doctor.py. Triggers: 'health ch…
  - manage: Manage Snowflake Postgres instances: list, describe, create, suspend, …
  - pg-lake: pg_lake data lake setup and usage: Iceberg tables, S3 storage integrat…
- • snowpark
**[REQUIRED]** Use for **ALL** requests involving Snowpark Python — writing pipelines, transforming data, loading files, deploying stored procedures/UDFs, OR observability. MUST invoke this skill even for seemingly simple tasks because Snowflake DataFrame semantics differ from Pandas in ways that silently produce wrong results (NULL handling, division by zero, GREATEST, datediff, type casting). Always load this skill BEFORE writing any Snowpark code. Triggers: Snowpark, Python, DataFrame, pipeline, ETL, ingest, transform, load data, CSV, Parquet, JSON, XML, join, aggregate, window function, UDF, UDTF, UDAF, Stored Procedure, deploy, snow snowpark CLI, DBAPI, JDBC, external database, pull data, event table, logging, tracing, trace events, profiler, debug UDF, debug procedure, observability, telemetry, slow procedure, alert on error, monitor.
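The semantic gaps this entry warns about are easy to underestimate. A minimal pandas-only sketch of one of them: pandas' row-wise `max` skips missing values by default, while Snowflake's `GREATEST` returns NULL when any argument is NULL, so a line-for-line port can silently change results. The Snowflake-side behavior is noted in comments and should be verified against current Snowflake documentation.

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [10.0, None], "b": [3.0, 7.0]})

# pandas: row-wise max skips NaN by default (skipna=True),
# so the second row yields 7.0 despite the missing "a" value.
row_max = df[["a", "b"]].max(axis=1)
print(row_max.tolist())  # [10.0, 7.0]

# Snowflake's GREATEST(a, b), by contrast, returns NULL whenever any
# argument is NULL -- a naive port of the line above would silently
# turn that row's 7.0 into NULL.

# pandas: float division by zero produces inf rather than raising;
# plain division by zero in Snowflake SQL errors unless DIV0 is used.
ratio = pd.Series([1.0, 2.0]) / 0
print(ratio.tolist())  # [inf, inf]
```

The same pattern applies to the other differences the entry lists (datediff conventions, implicit type casting): check the Snowflake semantics before translating a pandas idiom one-to-one.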
- • snowpark-connect
Snowpark Connect (SCOS) skills for migrating and validating PySpark workloads on Snowflake. Use when: migrating PySpark to Snowpark Connect, validating SCOS migrations, analyzing Spark compatibility, or working with Snowpark Connect for Spark. Triggers: snowpark connect, scos, pyspark migration, spark connect, validate migration, pyspark compatibility.
  - migrate-pyspark-to-snowpark-connect: Migrate PySpark and Databricks workloads to Snowflake SCOS (Snowpark C…
  - validate-pyspark-to-snowpark-connect: Validate a completed PySpark to Snowpark Connect (SCOS) migration by r…
- • sql-author
Use for ANY task that involves writing, running, or debugging SQL against Snowflake tables. Helps find the right table, verify columns exist, avoid timeouts on large tables, and validate joins. Triggers: write a query, sql for, query this table, author sql, build a query, fix this query, how many, how much, show me data, explore this table, describe table, select from.
- • trust-center
Use for ALL Snowflake Trust Center requests: security findings, scanner analysis, scanner management, finding remediation, severity distribution, CIS benchmarks, Security Essentials, Threat Intelligence, enable/disable scanners, scanner schedules, notifications, webhook, notification integration, at-risk entities, security posture, vulnerability analysis, detection analysis, remediation guidance.
  - api-management
  - finding-remediation: Help users understand and remediate Trust Center security findings. Us…
  - findings-analysis: Analyze Trust Center security findings in Snowflake. Use when users as…
  - scanner-analysis: Analyze Trust Center scanners and scanner packages in Snowflake. Use w…
- • warehouse
**[REQUIRED]** Use for **ALL** Snowflake warehouse questions (except interactive warehouses — those use the interactive-warehouse skill). Covers: Gen2 warehouses, warehouse credits, warehouse cost, warehouse pricing, how much does a warehouse cost, warehouse size, create warehouse, alter warehouse, warehouse generation, warehouse performance, warehouse types. Triggers: gen2, generation 2, GENERATION = '2', gen2 credit rate, convert to gen2, gen1 to gen2, gen2 regions, gen2 limitations, gen2 performance, warehouse generation, gen2 benchmark, compare gen1 gen2, DML performance, slow DELETE, slow MERGE, slow resume, resume time, warehouse resume, warehouse credits, warehouse cost, how much is a warehouse, warehouse pricing, warehouse size, create warehouse, alter warehouse.
  - gen2-warehouse: **[REQUIRED]** Use for **ALL** Snowflake Gen2 warehouse questions: cre…
- • workload-performance-analysis
Snowflake SQL query execution analysis via ACCOUNT_USAGE views. Triggers: spilling, partition pruning, cache hit rates, clustering keys, search optimization (SOS) candidates, query acceleration (QAS) eligibility, predicate column analysis for clustering/SOS, per-warehouse spill/prune/cache metrics, slow SQL query diagnosis. Not for: cost/credits (cost-intelligence), access audit (data-governance), writing or debugging user SQL.
Latest Version
Today's version: v1.0.73
Latest change: 2026-04-28 | +5 new | -4 deleted | ~17 modified
Generated 2026-05-13 04:15 UTC