I have two tables that I need to join, but the common field is of different data types: in Table A the ID field is an array of strings, while in Table B the ID is INT64. I tried to cast the string array to INT64 and get the error "Invalid cast from ARRAY to INT64". Is there any way I can convert the types and join the tables?
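One possible approach (a sketch; the table and column names `TableA`, `TableB`, `ids`, and `id` are placeholders): you cannot cast the array itself, but you can unnest it so each element becomes its own row, then cast the element:

```sql
-- Sketch: join on individual array elements, assuming TableA.ids is
-- ARRAY<STRING> and TableB.id is INT64. SAFE_CAST yields NULL for
-- elements that are not valid integers instead of raising an error.
SELECT a.* EXCEPT (ids), b.*
FROM TableA AS a,
     UNNEST(a.ids) AS id_str
JOIN TableB AS b
  ON SAFE_CAST(id_str AS INT64) = b.id;
```

Note that a row in Table A with N array elements can match up to N rows in Table B; deduplicate afterwards if that is not desired.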
Tag: google-bigquery
How do I count the different strings in a list found within each cell?
I have a column such as the one below, and I'm trying to write a query that creates a column showing the count of items in each list:

String
apple, orange, peach, peach, peach
potato, cucumber, pepper

So the final table for the example above should look like this:

String                              Count
apple, orange, peach, peach, peach  5
potato, cucumber,
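A sketch of one way to do this, assuming the column is named `str` in a table `t` and the items are separated by a comma and a space:

```sql
-- SPLIT turns the string into an array; ARRAY_LENGTH counts its items.
SELECT str,
       ARRAY_LENGTH(SPLIT(str, ', ')) AS Count
FROM t;
-- "apple, orange, peach, peach, peach" -> 5
```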
Count Distinct IDs in a date range given a start and end time
I have a BigQuery table like this:

id  start_date  end_date    location  type
1   2022-01-01  2022-01-01  MO        mobile
1   2022-01-01  2022-01-02  MO        mobile
2   2022-01-02  2022-01-03  AZ        laptop
3   2022-01-03  2022-01-03  AZ        mobile
3   2022-01-03  2022-01-03  AZ        mobile
3   2022-01-03  2022-01-03  AZ        mobile
2   2022-01-02  2022-01-03  CA        laptop
4   2022-01-02  2022-01-03  CA        mobile
5   2022-01-02  2022-01-03  CA        laptop

I want to
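The question is cut off, but a common shape for this problem is to expand each row into the individual days its interval covers, then count distinct ids per day. A sketch under that assumption (the table name `my_table` and the per-day grouping are guesses):

```sql
-- GENERATE_DATE_ARRAY expands [start_date, end_date] into single days;
-- UNNEST gives one row per (id, day) pair.
SELECT day,
       COUNT(DISTINCT id) AS distinct_ids
FROM my_table,
     UNNEST(GENERATE_DATE_ARRAY(start_date, end_date)) AS day
GROUP BY day
ORDER BY day;
```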
How to create a column that shows if a date appears in the last 7 days in BQ?
I've got a table that shows a user_id and the dates they were active (derived from a massive events table). The table looks like this:

user_id  active_date
1        2022-06-16
2        2022-06-02
1        2022-06-14
1        2022-05-01

I need to create a query to find whether a user has been active in the last 7 days, 8-14 days ago, 15-21
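A sketch of the usual bucketing approach, assuming the table is called `activity` and recency is measured from CURRENT_DATE() against each user's most recent active_date:

```sql
-- Bucket each user by how many days ago they were last active.
SELECT user_id,
       MAX(active_date) AS last_active,
       CASE
         WHEN DATE_DIFF(CURRENT_DATE(), MAX(active_date), DAY) <= 7
           THEN 'active in last 7 days'
         WHEN DATE_DIFF(CURRENT_DATE(), MAX(active_date), DAY) <= 14
           THEN 'active 8-14 days ago'
         WHEN DATE_DIFF(CURRENT_DATE(), MAX(active_date), DAY) <= 21
           THEN 'active 15-21 days ago'
         ELSE 'inactive for over 21 days'
       END AS recency
FROM activity
GROUP BY user_id;
```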
Query records in one table that exist in either of two columns in another table
I have two tables: one with user info, one with payment info. I would like to find the users that are either the sender or the receiver of a payment. Example data:

users:
id  other columns
1
2
3

payments:
sender  receiver  other columns
1       4
1       3
5       3
4       5

Ideal output:

id
1
3

What I tried:
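A sketch that matches the sample data (the table names `users` and `payments` are assumptions):

```sql
-- A user qualifies if their id appears in either column of payments.
SELECT u.id
FROM users AS u
WHERE u.id IN (SELECT sender FROM payments)
   OR u.id IN (SELECT receiver FROM payments);
-- With the sample data above, this returns ids 1 and 3.
```

User 2 is excluded because it never appears as a sender or receiver; senders 4 and 5 are excluded because they are not in the users table.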
BigQuery SQL Regex_extract repeated pattern
New to regexp; below is the sample query and our attempt (the opening of the CTE was cut off in the excerpt and is restored here from the closing parenthesis and the `from string_tbl`):

with string_tbl as (
  select 'https://www.this-is-pqrs.com/<some_text>/ab.abc.ef.gh.ij/123456.csv' str union all
  select 'https://www.this-is-pqrs.com/<some_text>/ab.abd.ef.gh.ij/123456.csv' str union all
  select 'https://www.this-is-abcd.com/<some_text>/ab.abc.ef.gh.ij/123456.csv' str
)
select REGEXP_EXTRACT(string_tbl.str, r"ab[^/]*") from string_tbl;

Output we are getting:
Required output:

Answer: Use below with output
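Since the title asks about a repeated pattern, the likely fix is REGEXP_EXTRACT_ALL, which returns every non-overlapping match as an array rather than only the first match. A sketch against the same `string_tbl`:

```sql
-- Returns ARRAY<STRING> of all substrings starting with "ab" and
-- running up to the next "/".
SELECT REGEXP_EXTRACT_ALL(string_tbl.str, r"ab[^/]*") AS matches
FROM string_tbl;
```

Note that with this pattern the "ab" inside "this-is-abcd.com" also matches; anchor the pattern more tightly (e.g. require a preceding "/") if that is unwanted.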
BigQuery – Picking the latest non-null value within a 28-day interval
I'm trying to add a column to this table and have been stuck for a little while:

ID  Category 1  Date        Data1
A   1           2022-05-30  21
B   2           2022-05-21  15
A   2           2022-05-02  33
A   1           2022-02-11  3
B   2           2022-05-01  19
A   1           2022-05-15  null
A   1           2022-05-20  11
A   2           2022-04-20  22

to:

ID  Category 1  Date  Data1  Picked_Data
A
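A sketch of one way to pick the latest non-null Data1 within the preceding 28 days, using a RANGE window over the date converted to a day number (column names follow the table above; "latest non-null within 28 days" is my reading of the truncated question):

```sql
-- LAST_VALUE ... IGNORE NULLS skips null Data1 rows; RANGE ... 28
-- PRECEDING limits the lookback to 28 days, since UNIX_DATE gives
-- a numeric day count the RANGE frame can work with.
SELECT *,
  LAST_VALUE(Data1 IGNORE NULLS) OVER (
    PARTITION BY ID, `Category 1`
    ORDER BY UNIX_DATE(Date)
    RANGE BETWEEN 28 PRECEDING AND CURRENT ROW
  ) AS Picked_Data
FROM t;
```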
Aggregating or nesting STRUCTs in Big Query SQL
I have the following table:

ClientId (Integer)  EmailCampaign (String)  CampaignDetails (STRUCT)
235                 Campaign 32             SentOn: 2020-01-22, Email addresses: 2, SuccessRate: 1
235                 Campaign 22             SentOn: 2021-02-02, Email addresses: 2, SuccessRate: 0.5
235                 Campaign 23             SentOn: 2022-05-11, Email addresses: 2, SuccessRate: 0.3
235                 Campaign 55             SentOn: 2020-11-03, Email addresses: 2, SuccessRate: 0.9
122                 Campaign 22             SentOn: 2022-01-03, Email addresses: 2, SuccessRate: 0.9

And I would like to process the data in
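The usual way to nest rows like these is ARRAY_AGG over a STRUCT, collapsing to one row per client. A sketch (the table name `t` and the output column name `Campaigns` are assumptions):

```sql
-- One row per ClientId, with an array of (campaign, details) structs.
SELECT ClientId,
       ARRAY_AGG(STRUCT(EmailCampaign, CampaignDetails)
                 ORDER BY EmailCampaign) AS Campaigns
FROM t
GROUP BY ClientId;
```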
Output number of non-consecutive failures from historical data in Bigquery
This is related to my previous scenario. I have a dataset like this: Aside from outputting the timestamp at which a user first committed a failure and then consecutively commits a failure status every day, leading up to today (2022-04-29), I also want to output the non-consecutive blocks of days in which Karl or Andrea commits a failure. In this case,
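Non-consecutive blocks of days are usually found with the gaps-and-islands trick: subtract a per-user row number (in days) from each date, so consecutive days collapse to the same group key. A sketch, assuming a table `failures` with columns (name, day) holding one row per failure day:

```sql
-- Consecutive days share the same grp value, so each grp is one
-- unbroken block of failure days.
SELECT name,
       MIN(day) AS block_start,
       MAX(day) AS block_end
FROM (
  SELECT name, day,
         DATE_SUB(day, INTERVAL
           ROW_NUMBER() OVER (PARTITION BY name ORDER BY day) DAY) AS grp
  FROM failures
)
GROUP BY name, grp
ORDER BY name, block_start;
```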
BigQuery – SQL UPDATE and JOIN
I have two tables. Table1 = dalio, an event list with select customers. Table2 = master_list, a master customer list from all past events. dalio has an "id" column that needs to be filled in with customer numbers, which can be pulled from the master_list column called "customer_no". All rows in the "id" column are currently blank.
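BigQuery's UPDATE statement supports a FROM clause, so the fill-in can be expressed as a joined update. A sketch (the join key `email` is a placeholder, since the question does not say which column links the two tables):

```sql
-- Fill dalio.id from master_list.customer_no for matching rows.
UPDATE dalio AS d
SET id = m.customer_no
FROM master_list AS m
WHERE d.email = m.email;  -- hypothetical join key
```

If a dalio row can match more than one master_list row, BigQuery rejects the statement, so the join key must identify customers uniquely.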