$ pip install -r requirements.txt
- Assuming you have already registered with SerpApi (serpapi.com) and have your API key.
- Create a creds.py file in your project's root directory.
- Inside creds.py, define your credentials like this:

```python
apikey = "YOUR API KEY HERE"
```
- Copy search_query_sample.csv and rename it to search_query.csv.
- Inside search_query.csv are the two input parameters:
  - query - The search query you would type into Google's search field.
  - location - The location the request should appear to come from.
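For reference, search_query.csv might look like the following (the queries and locations are illustrative; location strings should match SerpApi's canonical location names):

```
query,location
coffee shops,"Austin, Texas, United States"
best pizza,"New York, New York, United States"
```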
- In your project's root directory, create a folder named csvfiles.
- The output CSV files will be written to the csvfiles folder.
$ python scraper.py
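A minimal sketch of what the scraper's flow might look like, assuming the setup above (creds.py, search_query.csv, and the csvfiles folder). The actual SerpApi call is shown as a comment so the sketch runs offline; function and column names here are illustrative, not necessarily those used in scraper.py:

```python
import csv
import os

def build_params(query, location, api_key):
    # Same shape as the params object used for the SerpApi Google engine
    return {
        "engine": "google",
        "q": query,
        "location": location,
        "gl": "us",
        "api_key": api_key,
    }

# Create a sample input file mirroring search_query.csv's two columns
with open("search_query.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["query", "location"])
    writer.writeheader()
    writer.writerow({"query": "coffee shops",
                     "location": "Austin, Texas, United States"})

# Output folder for the result CSVs
os.makedirs("csvfiles", exist_ok=True)

with open("search_query.csv", newline="") as f:
    for row in csv.DictReader(f):
        params = build_params(row["query"], row["location"], "YOUR API KEY HERE")
        # With the real client (pip install google-search-results):
        #   from serpapi import GoogleSearch
        #   results = GoogleSearch(params).get_dict()
        #   organic = results.get("organic_results", [])
        print(params["q"], "->", params["location"])
```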
- To include paginated SERP results (more than 10), add a num key to the params object, up to the 10th page:

```python
params = {
    "engine": "google",
    "q": query,
    "num": 200,
    "location": location,
    "gl": "us",
    "api_key": creds.apikey
}
```