Introduction to web scraping with goquery

April 17, 2021
go scraping

Using an API is the preferred way to retrieve data, but when none exists, when it is incomplete, or when its limitations are too restrictive, scraping the associated website can be a good alternative.

What we call web scraping is the collection (via the writing of bots called scrapers) of targeted data present on a set of web pages. A scraper is designed specifically for a given site, unlike a crawler, which is more generalist (Google Shopping uses scrapers, whereas Google search uses crawlers).

The uses are multiple (aggregation of ads from several job boards, competitive intelligence, …), and are not necessarily reserved for the professional world.

We can indeed find several situations where writing a personal scraper can be useful, for instance monitoring the price of a product while waiting for a discount.

Let’s illustrate this use case.

Looking for a cheap Raspberry Pi 4

Let’s take this Raspberry Pi 4B: it is on sale for 83.99 € but we would like to wait until it drops below 75 € before ordering. Since a price drop can quickly lead to a stockout, we might as well be warned quickly to avoid missing a good deal.

The information we are looking for here is the price of the item:


Let’s see how it looks in the HTML:

<p class="our_price_display" itemprop="offers" itemscope="" itemtype="">
  <link itemprop="availability" href="">
  <span id="our_price_display" class="price" itemprop="price" content="83.99">83,99 €</span> TTC
  <meta itemprop="priceCurrency" content="EUR">
</p>

From there, we need to find an anchor point, that is to say a criterion that will allow us to locate the price (i.e. the content attribute of the span tag) reliably and without duplicates.

After analyzing the complete page, it turns out that the itemprop="price" attribute is a good candidate (we only find one span tag meeting this criterion).

Our scraper (in go) can now take shape:

package main

import (
	"log"
	"net/http"
	"strconv"
	"time"

	"github.com/PuerkitoBio/goquery"
)

func main() {
	goodPrice := false

	// Stay in the loop as long as we have not passed below our threshold
	for !goodPrice {

		// Loading of the product page
		res, err := http.Get("")
		if err != nil {
			log.Printf("Unable to load the product page, we will try again later: %v", err)
			time.Sleep(30 * time.Minute)
			continue
		}
		if res.StatusCode != 200 {
			log.Printf("Unable to load the product page, we will try again later, status code: %d", res.StatusCode)
			res.Body.Close()
			time.Sleep(3 * time.Hour)
			continue
		}

		// HTML parsing through goquery
		doc, err := goquery.NewDocumentFromReader(res.Body)
		res.Body.Close()
		if err != nil {
			log.Panicf("Unable to parse HTML content: %v", err)
		}

		// Price retrieval
		priceStr, ok := doc.Find("span[itemprop='price']").Attr("content")
		if !ok {
			log.Panic("Price not found, scraper should be updated")
		}

		// Price conversion
		price, err := strconv.ParseFloat(priceStr, 64)
		if err != nil {
			log.Panicf("Unable to read the price, scraper should be updated: %v", err)
		}

		if price >= 75 {
			log.Printf("%0.2f € is still too expensive!", price)
		} else {
			log.Printf("Only %0.2f €, go go go!!!", price)
			// TODO: send an email, an SMS, call a Discord webhook, trigger the alarm, flash the connected bulbs, etc.
			goodPrice = true
		}

		// We try again in 30 minutes
		time.Sleep(30 * time.Minute)
	}
}

The logic is quite simple: every 30 minutes we fetch the product page, extract the price, and stop (after alerting) once it drops below our threshold.

Note that the search for the famous <span> tag containing our price is easily done via the excellent goquery library, which allows us, as in JavaScript, to use CSS selectors to locate DOM elements.


A small VPS, a NAS, an old Raspberry Pi (hence the purchase of a new one!), or simply your PC (if you’re the kind of person who keeps it on 24 hours a day), and you’re ready to deploy your first scraper.

Feel free to adapt this example to your needs but, word to the wise, stay reasonable with the request frequency: some sites do not like bots that monopolize too many server resources to the detriment of real visitors. You can hardly blame them if they end up blacklisting your IP because you tried to retrieve a price every 2 seconds instead of every 30 minutes…