How to make an after-hours market scanner with Selenium and Yahoo
This article describes how Selenium in Python can be used to easily obtain pre- and post-market quotes
These days, keeping track of pre- and post-market prices can mean the difference between an appreciable gain and a fantastic one. For example, when AMD announced their earnings on February 1st, the post-market price reached above $135, but opened at $129.89 the next day.
In this article, I will go step by step through a Python notebook that automates the extraction of pre- and post-market stock quotes. Stay tuned for a future Medium Post from me that combines this pre-/post-market scanner with e-mail notifications and a visual dashboard.
This post will show you how to retrieve pre- and post-market stock quotes, store the data locally, and watch trends as they appear.
Below is all the code you need to get this done from within a Jupyter Notebook on your local computer. I have not tried this on any cloud-based notebook services.
As with most Python projects, the most difficult part will be making sure you have all the packages installed.
Here is what you’ll need to be able to do without errors:
from selenium import webdriver
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By

driver = webdriver.Chrome(ChromeDriverManager().install())
A warning about versions is fine, but there should be no error message. If you run into trouble here, search online and do the necessary pip installs to get to the right place.
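If the imports fail, installing the two third-party packages usually resolves it (assuming a standard pip setup; adjust for your environment):

```shell
pip install selenium webdriver-manager
```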
If you are already running Selenium without issue, the rest is three easy steps.
I’ll lead you through how I came up with the code (so you can customize) or just skip to the bottom for all the code so you can run it immediately.
With any web scraping project, you'll want to figure out how to locate the elements you want to extract. I am breaking this down into three steps: get the generic container, home in on the target text, and output price changes to a dataframe.
0. Send the Selenium driver to the page containing your ticker of interest. (This is not really a step!)
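For example (the ticker below is just an illustration; Yahoo Finance quote pages follow this URL pattern):

```python
TICKER = 'AMD'  # example ticker; swap in your own symbol
URL = f'https://finance.yahoo.com/quote/{TICKER}'

# Assumes `driver` was created in the setup cell earlier:
# driver.get(URL)
```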
1. Locate the category of element that contains the pre-market price.
# Find all of the fin-streamer elements on the page (there are many)
fs = driver.find_elements(By.TAG_NAME, 'fin-streamer')
The output of this, fs, is a list. Each item in that list is a WebDriver element.
By is an incredibly useful module of Selenium. Unfortunately, there isn't a lot of documentation about it, so if you want to see what else it can do, I recommend running help(By). I may write a tutorial about it at some point!
2. Home in on the exact field by finding something specific to the pre-market price: in this case, an element whose 'data-field' attribute equals 'preMarketPrice' or 'postMarketPrice'. Using the same approach, we'll extract the price-change field.
price_field = [c for c, x in enumerate(fs) if x.get_attribute('data-field') == 'preMarketPrice'
               or x.get_attribute('data-field') == 'postMarketPrice']
change_field = [c for c, x in enumerate(fs) if x.get_attribute('data-field') == 'preMarketChangePercent'
                or x.get_attribute('data-field') == 'postMarketChangePercent']
I am using a list comprehension with enumerate to get the numerical index of the list item corresponding to the price (either pre- or post-market; it will always be one or the other).
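To illustrate the pattern without a live page, here is a toy stand-in (the FakeElement class and the field names in the list are purely illustrative) showing how the comprehension returns the index of the matching element:

```python
# Purely illustrative stand-in for Selenium elements, exposing get_attribute
class FakeElement:
    def __init__(self, field):
        self._field = field

    def get_attribute(self, name):
        return self._field if name == 'data-field' else None

fs = [FakeElement('regularMarketPrice'),
      FakeElement('preMarketPrice'),
      FakeElement('preMarketChangePercent')]

price_field = [c for c, x in enumerate(fs)
               if x.get_attribute('data-field') == 'preMarketPrice'
               or x.get_attribute('data-field') == 'postMarketPrice']
print(price_field)  # [1]
```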
3. I use a little trick to only log when the price changes. A lot of people may already have a solution for this that’s more sophisticated (like an SQL database). Here is what I do to save all the changes in a list called values_list:
import time
from datetime import datetime

values_list = list()
price0 = None  # no previous price yet, so the first reading always gets logged
while True:
    fs = driver.find_elements(By.TAG_NAME, 'fin-streamer')
    price1 = fs[price_field[0]].text   # price_field is a one-element list of indices
    change1 = fs[change_field[0]].text
    if price0 != price1:  # detects a change in price
        localtime = datetime.now().strftime('%H:%M:%S %m-%d-%y')
        values_list.append([localtime, price1, change1])
        price0, change0 = price1, change1
    time.sleep(1)  # brief pause so we don't hammer the page
Afterwards, values_list can be converted to a dataframe using Pandas, written to a text file, used to create an alert, or plotted as a graph.
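As a quick sketch, the logged rows can be loaded into a Pandas DataFrame and saved to disk (the sample rows, column names, and filename here are made up for illustration):

```python
import pandas as pd

# Sample rows in the same shape the loop appends: [timestamp, price, change]
values_list = [['09:12:01 02-02-24', '131.50', '+1.24%'],
               ['09:13:05 02-02-24', '131.62', '+1.33%']]

df = pd.DataFrame(values_list, columns=['time', 'price', 'change_pct'])
df.to_csv('premarket_log.csv', index=False)  # hypothetical output file
print(df.shape)  # (2, 3)
```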
Good luck to all scalping those earnings pops in the after hours!