On Google Search Central, the Search Console team announced the availability of “Bulk Data Export”, a new feature allowing the daily export of bulk data from Search Console to Google BigQuery.
Say goodbye to row limits on your data!
Google now lets you configure an export in Search Console that performs a daily data dump into a BigQuery project. The export contains site performance data, except for anonymized queries, which are filtered out for privacy reasons. The good news is that this export is not subject to the daily cap of 50,000 rows per day, per site and per search type that applies to the Search Analytics API. A blessing for those who want to take full advantage of the collected data, without limitation.
Obviously, as the Search Console team clarifies in its post, this update is aimed primarily at the largest sites, those "which consist of tens of thousands of pages or which receive traffic from tens of thousands of queries per day (or both)". Smaller sites can already access this data using Looker Studio (formerly Google Data Studio) or via the Search Analytics API.
How to set up a data export to BigQuery?
Google provides the procedure for setting up a new export. First of all, you must prepare the BigQuery account to receive the data and configure Search Console correctly:
- Prepare the Cloud project in the Google Cloud Console: this includes enabling the BigQuery API and authorizing the Search Console service account.
- Set an export destination in Search Console: this means entering the ID of the Google Cloud project in question and choosing a location for the data. Important note: only property owners can perform this step.
For more details, Google refers to its step-by-step guide.
After the information is submitted in Search Console, the service simulates an export. If the simulation succeeds, owners are notified by email and the scheduled exports begin within 48 hours. If it fails, owners immediately receive an alert specifying the detected problem so they can resolve it; to help, the article provides a list of possible errors.
The data available in bulk data exports
Once the export has been successfully set up, all that remains is to log in to your BigQuery account and consult the available data. For more technical details, Google points to its guidelines and invites us to look at the differences between data aggregated by property and by page, but it also offers a brief description of the three available tables:
- searchdata_site_impression: this table contains data aggregated by property, including query, country, search type and device used.
- searchdata_url_impression: this table contains data aggregated by URL and offers a more detailed view of queries and rich results.
- ExportLog: this table records which data was saved on a given day. It does not record failed exports.
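Once the export is live, exploring these tables takes only a few lines of code. Here is a minimal sketch using the google-cloud-bigquery client library; the project ID is an assumption you must replace with your own, and `searchconsole` is the dataset name used by the export:

```python
# Minimal sketch: preview a few rows from the bulk export tables in BigQuery.
# Requires the `google-cloud-bigquery` package and an environment authenticated
# against the Cloud project that receives the export.

PROJECT_ID = "my-project"   # assumption: replace with your Cloud project ID
DATASET = "searchconsole"   # dataset name used by the Search Console export

# The three tables described above.
TABLES = ["searchdata_site_impression", "searchdata_url_impression", "ExportLog"]

PREVIEW_QUERY = (
    f"SELECT * FROM `{PROJECT_ID}.{DATASET}.searchdata_site_impression` "
    "WHERE data_date = DATE_SUB(CURRENT_DATE(), INTERVAL 2 DAY) "
    "LIMIT 10"
)

def preview(project_id: str = PROJECT_ID):
    """Run the preview query; needs valid credentials, so it is not run here."""
    from google.cloud import bigquery  # pip install google-cloud-bigquery
    client = bigquery.Client(project=project_id)
    return list(client.query(PREVIEW_QUERY).result())
```

The query looks two days back because export runs can lag; adjust the window to taste.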
To help users get started with querying the data, Google provides some sample queries. In particular, the Search Console team shares an example that extracts totals per URL-query combination for pages whose FAQ rich results received more than 100 clicks over the last two weeks:
SELECT
  url,
  query,
  sum(impressions) AS impressions,
  sum(clicks) AS clicks,
  sum(clicks) / sum(impressions) AS ctr,
  /* Added one below, because position is zero-based */
  ((sum(sum_position) / sum(impressions)) + 1.0) AS avg_position
/* Remember to update the table name to your table */
FROM searchconsole.searchdata_url_impression
WHERE search_type = "WEB"
  AND is_tpf_faq = true
  AND data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 14 day) AND CURRENT_DATE()
  AND clicks > 100
GROUP BY 1, 2
ORDER BY clicks
LIMIT 1000
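The two derived metrics in that query are worth unpacking: ctr is total clicks divided by total impressions, and avg_position is the impression-weighted average position plus one, because the exported sum_position is zero-based. A small Python sketch of the same arithmetic on made-up rows (the numbers are illustrative, not real export data):

```python
# Reproduce the query's derived metrics in plain Python on illustrative rows.
# Each row mimics (impressions, clicks, sum_position) for one URL/query pair
# on one day; the values are invented for the example.
rows = [
    {"impressions": 100, "clicks": 12, "sum_position": 250},
    {"impressions": 300, "clicks": 45, "sum_position": 600},
]

impressions = sum(r["impressions"] for r in rows)
clicks = sum(r["clicks"] for r in rows)
sum_position = sum(r["sum_position"] for r in rows)

ctr = clicks / impressions
# Position in the export is zero-based, hence the +1, exactly as in the SQL.
avg_position = (sum_position / impressions) + 1.0

print(ctr, avg_position)  # → 0.1425 3.125
```

Note that the division happens after aggregation, as in the SQL: averaging per-row CTRs would weight low-traffic days the same as high-traffic ones and give a different, misleading number.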