So I want to calculate cumulative users per day, but users who already appeared on previous days should not be counted again. On a daily basis we can get the counts, but if we simply calculate a cumulative total we get 2, 4, 6, 8 for each day. The goal is to get a table like this. I'm using this query to get the result, since the data is
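A minimal sketch of the usual approach, assuming a hypothetical table events(user_id, event_date): attribute each user to the first day they appear, then take a running sum, so repeat users are never counted twice.

WITH first_seen AS (
  -- Each user contributes only the first date on which they appear
  SELECT user_id, MIN(event_date) AS first_date
  FROM events
  GROUP BY user_id
)
SELECT
  first_date AS day,
  -- Running total of newly seen users up to and including this day
  SUM(COUNT(user_id)) OVER (ORDER BY first_date) AS cumulative_users
FROM first_seen
GROUP BY first_date
ORDER BY day;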
Tag: google-bigquery
BigQuery: JOIN on single matching row
I have two tables, one containing orders with a nested line_items structure and another with a pricing history for each product SKU code.

Orders Table
order_id  order_date  item_sku  item_quantity  item_subtotal
1         2022-23-07  SKU1      7              12.34
                      SKU2      1              9.99
2         2022-12-07  SKU1      1              1.12
                      SKU3      5              32.54

Price History Table
item_sku  effective_date  cost
SKU1      2022-20-07      0.78
SKU2      2022-02-03      4.50
SKU1      2022-02-03
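One way to join each line item to its single matching price row (the latest effective_date on or before the order date); the table names follow the excerpt, but the exact schema is assumed.

SELECT
  o.order_id,
  li.item_sku,
  li.item_quantity,
  ph.cost
FROM orders AS o
CROSS JOIN UNNEST(o.line_items) AS li
LEFT JOIN price_history AS ph
  ON ph.item_sku = li.item_sku
  AND ph.effective_date <= o.order_date
WHERE TRUE  -- QUALIFY has historically required a WHERE, GROUP BY, or HAVING clause
QUALIFY ROW_NUMBER() OVER (
  -- Keep only the most recent applicable price row per order line
  PARTITION BY o.order_id, li.item_sku
  ORDER BY ph.effective_date DESC
) = 1;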
Query that counts total records per day and total records with the same timestamp and id per day in BigQuery
I have timeseries data like this:

time                     id  value
2018-04-25 22:00:00 UTC  A   1
2018-04-25 23:00:00 UTC  A   2
2018-04-25 23:00:00 UTC  A   2.1
2018-04-25 23:00:00 UTC  B   1
2018-04-26 23:00:00 UTC  B   1.3

How do I write a query to produce an output table with these columns: date: the truncated time; records: the number of records during this date
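A minimal sketch, assuming a hypothetical table timeseries(time, id, value): count all records per date, plus the records that share both timestamp and id.

WITH per_key AS (
  -- How many rows share each exact (time, id) pair
  SELECT DATE(time) AS date, time, id, COUNT(*) AS n
  FROM timeseries
  GROUP BY date, time, id
)
SELECT
  date,
  SUM(n) AS records,
  -- Only (time, id) pairs occurring more than once count as duplicates
  SUM(IF(n > 1, n, 0)) AS duplicate_records
FROM per_key
GROUP BY date
ORDER BY date;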
BigQuery finding sessions that have visited both pageA (contains keyword “main”) and pageB (contains keyword “side”)
On BQ I’m trying to find sessions that have visited both pageA (URL contains keyword “main”) and pageB (URL contains keyword “side”), along with the pages each such session visited. Here is my logic: I first wanted to find the sessions that visited pageAs (URL contains keyword “main”), then I wanted to do a join to find those sessions
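A join is one route, but a single aggregation also works. A sketch, assuming the pageviews are already flattened into a hypothetical table pageviews(session_id, page_url):

SELECT
  session_id,
  ARRAY_AGG(DISTINCT page_url) AS pages_visited
FROM pageviews
GROUP BY session_id
HAVING
  -- Keep sessions with at least one "main" page AND at least one "side" page
  LOGICAL_OR(CONTAINS_SUBSTR(page_url, 'main'))
  AND LOGICAL_OR(CONTAINS_SUBSTR(page_url, 'side'));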
BigQuery Error When Using CAST and Determining Decimals
I have linked a BigQuery project to my Google Ads account. Within that, we have a campaignBasicStats table. I want to pull the cost of a campaign from my Google Ads account into a BigQuery workspace to apply some additional logic. The cost column is coming through as an INTEGER and is described like this: INTEGER NULLABLE The
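Google Ads cost fields are typically stored in micros, so the INTEGER usually needs dividing by 1,000,000 before use. A hedged sketch (the table path and column names here are illustrative, not the exact transfer schema):

SELECT
  campaign_id,
  -- Convert micros to currency units with two decimal places
  ROUND(SAFE_CAST(cost AS NUMERIC) / 1000000, 2) AS cost_in_account_currency
FROM `my_project.google_ads.campaignBasicStats`;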
How to calculate a rolling timestamp sum from a table partitioned by a specific column? – SQL
I have a table with a series of timelines that are normalized to start from 00:00:00.00000. I want to sum them sequentially and stitch them together based on my order_key value. Sample Data: Desired Output: My Attempt: Answer Consider the query below: Recursive Approach Non-recursive Approach
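A non-recursive illustration of the stitching idea, assuming a hypothetical table timelines(order_key, ts) holding TIME values that each start at 00:00:00: shift every timeline by the total duration of all timelines before it.

WITH part_ends AS (
  -- Duration of each timeline = its latest timestamp
  SELECT order_key, MAX(ts) AS part_end
  FROM timelines
  GROUP BY order_key
),
offsets AS (
  -- Cumulative duration of all preceding timelines, in microseconds
  SELECT
    order_key,
    COALESCE(SUM(TIME_DIFF(part_end, TIME '00:00:00', MICROSECOND)) OVER (
      ORDER BY order_key ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING), 0) AS offset_us
  FROM part_ends
)
SELECT
  t.order_key,
  -- Note: TIME arithmetic wraps at midnight; for totals over 24h use TIMESTAMP instead
  TIME_ADD(t.ts, INTERVAL o.offset_us MICROSECOND) AS stitched_ts
FROM timelines AS t
JOIN offsets AS o USING (order_key)
ORDER BY t.order_key, t.ts;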
SQL query – What does taking the MIN of this boolean expression mean?
Excuse my ignorance about this… I’m taking a data analysis course and I stumbled upon this query in an exercise: ActivityDate is a field that contains date-type data and DATE_REGEX is a regular expression variable for a date format string. What I don’t know is what taking the MIN() of this boolean REGEXP_CONTAINS expression does or means. I
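For intuition, a standalone example (not the course's query): BigQuery sorts FALSE before TRUE, so MIN over a boolean expression returns TRUE only when the expression is TRUE on every row, behaving like LOGICAL_AND.

SELECT
  MIN(REGEXP_CONTAINS(d, r'^\d{4}-\d{2}-\d{2}$')) AS every_row_matches
FROM UNNEST(['2024-01-01', 'not-a-date']) AS d;
-- 'not-a-date' fails the pattern, so the MIN over (TRUE, FALSE) is FALSE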
How do I extract the string after 0: in BigQuery SQL
I want to extract “f9sdsdsd-1375-41f7-8c4c-ereb20ad3843c” from “0:f9696a03-1375-41f7-8c4c-34b20ad3843c”. I am currently using TRIM(REGEXP_EXTRACT("0:f9696a03-1375-41f7-8c4c-34b20ad3843c", r"0:[^:]+)")). However, I am not able to extract it with the current syntax. Can someone please help me here? Thanks. Answer You are simply missing a ( – use the query below
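With the missing opening parenthesis of the capturing group restored, the extraction works:

SELECT
  REGEXP_EXTRACT('0:f9696a03-1375-41f7-8c4c-34b20ad3843c', r'0:([^:]+)') AS extracted;
-- Returns: f9696a03-1375-41f7-8c4c-34b20ad3843c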
CURRENT in BigQuery?
I’ve noticed that CURRENT is a reserved keyword for BigQuery at: https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical. What exactly does CURRENT do? I’ve only seen it as a prefix for things such as CURRENT_TIME(), CURRENT_DATE(), and so on, but I have never seen it by itself. Is this just reserved for future use, or do any SQL statements contain it as a keyword? Answer Just
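For illustration, one place the bare keyword does surface in BigQuery syntax today is the CURRENT ROW boundary of a window frame clause:

SELECT
  x,
  -- CURRENT appears as part of the frame boundary "CURRENT ROW"
  SUM(x) OVER (ORDER BY x ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS running_total
FROM UNNEST([1, 2, 3]) AS x;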
Starting and Ending a row-count based on values in another column
There is a need to monitor the performance of a warehouse of goods. Please refer to the table containing data for one warehouse below: WK_NO: week number; Problem: problem faced in that particular week. Empty cells are NULLs. I need to create the 3rd column, Weeks on list: a column indicating the number of weeks that a particular warehouse is
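One plausible reading (the excerpt is truncated, so the table name warehouse_log and the reset-on-NULL semantics below are assumptions): count consecutive problem weeks, restarting whenever Problem returns to NULL.

WITH grp AS (
  SELECT
    wk_no,
    problem,
    -- Each NULL week closes the previous streak and opens a new group
    COUNTIF(problem IS NULL) OVER (ORDER BY wk_no) AS reset_group
  FROM warehouse_log
)
SELECT
  wk_no,
  problem,
  -- Running count of problem weeks inside the current streak
  IF(problem IS NULL, NULL,
     COUNTIF(problem IS NOT NULL) OVER (PARTITION BY reset_group ORDER BY wk_no)) AS weeks_on_list
FROM grp
ORDER BY wk_no;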