how to scrape a website in go

To scrape data from websites in Go, you can use a scraping library such as goquery (github.com/PuerkitoBio/goquery) or Colly. Here's an example using goquery:

main.go
package main

import (
    "fmt"
    "log"
    "net/http"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    // Fetch the page over HTTP
    res, err := http.Get("https://www.example.com")
    if err != nil {
        log.Fatal(err)
    }
    defer res.Body.Close()
    if res.StatusCode != http.StatusOK {
        log.Fatalf("status code error: %d %s", res.StatusCode, res.Status)
    }

    // Parse the response body into a goquery document
    doc, err := goquery.NewDocumentFromReader(res.Body)
    if err != nil {
        log.Fatal(err)
    }

    // Find and print the title of the page
    title := doc.Find("title").Text()
    fmt.Println("Page title:", title)

    // Find and print all links on the page
    doc.Find("a").Each(func(i int, s *goquery.Selection) {
        href, exists := s.Attr("href")
        if exists {
            fmt.Println("Link:", href)
        }
    })
}

This example fetches the HTML document from "https://www.example.com" with http.Get, parses the response body with goquery.NewDocumentFromReader(), prints the page title, and then prints every link on the page. The doc.Find() method selects elements using CSS selectors, and s.Attr() reads the value of an attribute on a matched element.
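Colly, the other library mentioned above, takes a callback-based approach: you register handlers for CSS selectors and then visit the page. The following is a minimal sketch, assuming the github.com/gocolly/colly/v2 module has been added to your go.mod; it prints the same title and links as the goquery version.

package main

import (
    "fmt"
    "log"

    "github.com/gocolly/colly/v2"
)

func main() {
    // Create a collector that manages requests and parsing
    c := colly.NewCollector()

    // Print the page title when the <title> element is parsed
    c.OnHTML("title", func(e *colly.HTMLElement) {
        fmt.Println("Page title:", e.Text)
    })

    // Print the href attribute of every anchor element
    c.OnHTML("a[href]", func(e *colly.HTMLElement) {
        fmt.Println("Link:", e.Attr("href"))
    })

    // Log request or parse failures
    c.OnError(func(r *colly.Response, err error) {
        log.Println("request failed:", err)
    })

    // Visit the page, which triggers the callbacks above
    if err := c.Visit("https://www.example.com"); err != nil {
        log.Fatal(err)
    }
}

Because the handlers are registered before Visit is called, the same callbacks run for every page the collector visits, which makes Colly a natural fit when you later want to crawl more than one URL.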
