When I select these columns, I see that the results contain some NULL values, and they don't seem to be strings. However, when I try to filter the NULL values out with a WHERE statement, I see no results at all. What might be the possible causes? I also tried filtering with 'NULL', but that throws an error since the column
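The question is cut off, but the usual cause is comparing against NULL with `=` or `!=`, which yields NULL rather than TRUE and so filters every row out. A minimal sketch, assuming placeholder names `my_table` and `col`:

```sql
-- my_table and col are placeholder names.
-- "col = NULL" / "col != NULL" evaluate to NULL, never TRUE, so no rows match;
-- use IS [NOT] NULL instead.
SELECT col
FROM my_table
WHERE col IS NOT NULL;

-- If the column actually stores the 4-character string 'NULL':
SELECT col
FROM my_table
WHERE NULLIF(col, 'NULL') IS NOT NULL;
```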
Tag: snowflake-cloud-data-platform
Snowflake Scripting in SQL – how to iterate over the results of a SHOW command?
I'm checking out the new Snowflake Scripting for SQL (in preview), and I can't figure out how to iterate over the results of a SHOW command, especially since some of its column names are lower-case. https://docs.snowflake.com/en/developer-guide/snowflake-scripting/ https://hoffa.medium.com/sql-scripting-live-in-snowflake-288ef8c272fa Answer You can use this example as a template for iterating over the results of SHOW. First, note that you can get the result set from show
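The answer is truncated, but the usual pattern is to run SHOW, capture its output with RESULT_SCAN(LAST_QUERY_ID()), and loop over that with a cursor. A minimal sketch, where `my_db.my_schema` is a placeholder:

```sql
-- Sketch of a Snowflake Scripting block; my_db.my_schema is a placeholder.
-- SHOW output column names are lower-case, so they must be double-quoted.
DECLARE
  res   RESULTSET;
  names STRING DEFAULT '';
BEGIN
  SHOW TABLES IN SCHEMA my_db.my_schema;
  -- RESULT_SCAN(LAST_QUERY_ID()) reads the SHOW output as a regular table
  res := (SELECT "name" FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())));
  LET c CURSOR FOR res;
  FOR r IN c DO
    names := names || r."name" || ';';
  END FOR;
  RETURN names;
END;
```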
Update Table T1 column to be the quotient of two queries each leveraging ID from T1
I am trying to update a column to be the quotient of two queries that count records in two other tables for each ID in T1. It seems like this should work (but doesn't). Edit, to add a data sample and expected output. T1 is like:

ID  COLUMN1
0   1

T2 and T3 are both like this, where ID can be
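The question's SQL is not shown, but one way to express this in Snowflake is `UPDATE ... FROM` with pre-aggregated counts, so each ID's quotient is computed once. A sketch under the names from the question (T1, T2, T3, ID, COLUMN1):

```sql
-- Sketch under assumed names from the question (T1, T2, T3, ID, COLUMN1).
UPDATE t1
SET column1 = c2.cnt / NULLIF(c3.cnt, 0)   -- NULLIF avoids division by zero
FROM (SELECT id, COUNT(*) AS cnt FROM t2 GROUP BY id) c2
JOIN (SELECT id, COUNT(*) AS cnt FROM t3 GROUP BY id) c3
  ON c2.id = c3.id
WHERE t1.id = c2.id;
```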
concatenate date + time to make timestamp
I am using dbt and Snowflake to parse a JSON. Currently, I parse two cols, date and time, separately. Now I want to concatenate both and assign them the type TIMESTAMP. I tried this: However, the col "REQUIRED_TIMESTAMP" is always empty in my final table. What else can I try? Answer I assume JSON_DATA:"Required_Collect_Time_From":time has a typo in here,
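A common way to build the timestamp is to cast both JSON fields to strings, concatenate them with a space, and pass the result to TO_TIMESTAMP with an explicit format. A sketch; `json_data`, `my_source`, and the date key name are assumptions (only the time key appears in the question):

```sql
-- Sketch; json_data, my_source, and "Required_Collect_Date" are assumed names.
SELECT
  TO_TIMESTAMP(
    json_data:"Required_Collect_Date"::STRING
    || ' ' ||
    json_data:"Required_Collect_Time_From"::STRING,
    'YYYY-MM-DD HH24:MI:SS'
  ) AS required_timestamp
FROM my_source;
```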
Issues Splitting values separated by delimiters and creating columns for each split in snowflake
I am new to Snowflake and I am trying to run a SQL query that splits values with delimiters (semicolons) and creates columns for each of them. Table name: lexa

ID   Value
001  2021-02-13 18:17:43;83.89.250.196;10.10.11.29
002  2021-02-13 17:47:56;5.33.18.24;10.10.11.28

What I am trying to achieve:

ID   register             IP1            IP2
001  2021-02-13 18:17:43  83.89.250.196  10.10.11.29
002  2021-02-13 17:47:56  5.33.18.24     10.10.11.28

Answer Snowflake split
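The truncated answer points at Snowflake's split functions; SPLIT_PART is the simplest fit for a fixed number of pieces. A sketch using the question's table:

```sql
-- SPLIT_PART(value, ';', n) returns the n-th piece (1-based).
SELECT
  id,
  SPLIT_PART(value, ';', 1)::TIMESTAMP AS register,
  SPLIT_PART(value, ';', 2)            AS ip1,
  SPLIT_PART(value, ';', 3)            AS ip2
FROM lexa;
```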
Group by name and return row with most recent date
Suppose you have the following data: what would be the best way to group by name and return the row with the most recent date in Snowflake (ANSI SQL)? Expected output: Answer With QUALIFY you can keep the newest row per name. As you will see in the docs, it's the same as Tim's answer without the need for the nested
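The QUALIFY pattern the answer refers to looks like this; the column names `name` and `dt` are assumptions, since the sample data is not shown:

```sql
-- Column names (name, dt) are assumed; QUALIFY filters on the window result,
-- keeping only the newest row per name.
SELECT *
FROM my_table
QUALIFY ROW_NUMBER() OVER (PARTITION BY name ORDER BY dt DESC) = 1;
```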
I am having issues counting values in a row with separators using SQL
I am new to Snowflake and trying to count the number of values in a row with separators using SQL. I am not sure how to go about it. I've googled solutions for this but have not been able to find one. Table name: Lee_tab

user  names
id01  Jon;karl;lee;
id02  Abi;jackson;
id03  don;
id04

What I want to achieve: user
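One way to count the entries is to count runs of non-separator characters with REGEXP_COUNT, which naturally ignores the trailing semicolons in the sample. A sketch against the question's table:

```sql
-- REGEXP_COUNT counts runs of non-semicolon characters, so trailing ';'
-- is harmless; COALESCE maps NULL (no names, like id04) to 0.
SELECT
  "user",
  COALESCE(REGEXP_COUNT(names, '[^;]+'), 0) AS name_count
FROM lee_tab;
```

`user` is quoted because it is a reserved word in Snowflake.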
How do I set every row in a column equal to a specific string?
I have some columns I want to set to the string 'redacted' because they contain personal information. Currently I am creating views in Snowflake from the original tables via a SELECT statement. How can I add a specified string to the particular columns? Original: Desired state: Current code to materialize views: Answer I think you want this, just define name
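The answer is cut off, but the idea is to replace the sensitive columns with a string literal aliased to the original column name in the view definition. A sketch with hypothetical table and column names:

```sql
-- Hypothetical names; the literal simply shadows the original column.
CREATE OR REPLACE VIEW customers_redacted AS
SELECT
  id,
  'redacted' AS name,     -- personal data replaced with a constant
  'redacted' AS email
FROM customers;
```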
Self joining next date in table as another date field
I am trying to work with an exchange rate table and a transactions table; how each transaction joins to the exchange table depends on when the most recent exchange rate for that currency was recorded relative to the transaction. The table contains many duplicates, many currencies, and other fields not relevant to this issue. I plan on joining
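The question is truncated, but a common shape for "most recent rate at or before the transaction" is a range join trimmed down with QUALIFY. All names here are assumptions:

```sql
-- All table/column names are assumptions. Each transaction joins to every
-- rate at or before its date; QUALIFY keeps only the most recent one.
SELECT t.*, r.rate
FROM transactions t
JOIN exchange_rates r
  ON  r.currency   = t.currency
  AND r.rate_date <= t.txn_date
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY t.txn_id
  ORDER BY r.rate_date DESC
) = 1;
```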
Find the difference between 1 column depending on date
When I run this: I see 62 rows, but when I do I see 59. I want to see which NAMEs are missing when it ran for _LOAD_DATETIME::date = '2022-02-01'. I thought this would work, but it doesn't: Answer You have to use MINUS for your purposes. If we are talking about PostgreSQL, you have to use EXCEPT instead of
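The MINUS pattern subtracts the smaller result from the larger one; whatever remains are the missing names. A sketch; the table name and the second date are placeholders, since the original queries are not shown:

```sql
-- Sketch; my_table and the comparison date are placeholders.
-- Larger set MINUS smaller set leaves the names missing from the smaller run.
SELECT name
FROM my_table
WHERE _load_datetime::date = '2022-02-02'   -- placeholder: the 62-row run
MINUS
SELECT name
FROM my_table
WHERE _load_datetime::date = '2022-02-01';  -- the 59-row run
```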