EBay-email-tracker - Scrapes an entire eBay search page for a particular item and sends regular updates to an email address

Overview

Introduction

This is a project I built with the sole intent of learning more about scraping websites, manipulating data, and delivering it through a medium. It is not intended for commercial use.

The program tracks an entire eBay search page for a particular item and sends automated updates to an email address with the link and price of each entry.

The program asks you for:

  • The item you want to track
  • How many hours you want between updates
  • The price range
  • The email address you would like to receive the updates at
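
For illustration, here is a minimal sketch of how those prompts could be collected with input(); the exact wording and variable names are assumptions, not the package's actual interface:

# Hypothetical prompts; the real program's wording may differ.
item = input("Which item do you want to track? ")
hours_between_updates = float(input("How many hours between updates? "))
min_price = float(input("Minimum price: "))
max_price = float(input("Maximum price: "))
recipient = input("Email address to receive the updates: ")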

Installation

The program only works locally. It will create a CSV file, containing all the data, in the directory the program is run from.

If you would like to test the program, install the package:

pip install eBay_email_tracker

Then, from Python:

from ebay_email_tracker import tracker
tracker()

The program only works with a Gmail account. You have to authorize access to less secure apps in your Gmail settings.

I recommend creating a separate password for this, which you can do under "App passwords". App passwords require two-factor authentication to be enabled on your Gmail account, and they allow any login attempt with the correct credentials to connect without completing the two-factor step.

Set environment variables in your Windows operating system: (Control Panel -> View advanced system settings -> Environment Variables -> User variables for...)

Variable                Value
EBAY_TRACKER_EMAIL      The email address you want to send the updates from
EBAY_TRACKER_PASSWORD   The password for that email (preferably an app password)
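
For reference, a minimal sketch of how these variables could be read and used to send an update through Gmail's SMTP server; the function name and message text are illustrative, not the package's actual code:

import os
import smtplib
from email.message import EmailMessage

def send_update(recipient, body):
    # Credentials come from the environment variables described above.
    sender = os.environ["EBAY_TRACKER_EMAIL"]
    password = os.environ["EBAY_TRACKER_PASSWORD"]

    msg = EmailMessage()
    msg["Subject"] = "eBay tracker update"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(body)

    # Gmail's SMTP server over SSL; an app password is expected here.
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
        server.login(sender, password)
        server.send_message(msg)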

The program depends on these packages:

  • NumPy
  • BeautifulSoup
  • lxml
  • requests
  • pandas
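
Since these packages are exactly what a requests + BeautifulSoup scraper needs, here is a minimal sketch of how a search page could be scraped into a DataFrame; the URL parameters and CSS class names are assumptions about eBay's current markup, not necessarily what the package uses:

import requests
import pandas as pd
from bs4 import BeautifulSoup

def scrape_search_page(item):
    # "_nkw" is eBay's search-keyword parameter; the selectors below are assumptions.
    response = requests.get("https://www.ebay.com/sch/i.html", params={"_nkw": item})
    soup = BeautifulSoup(response.text, "lxml")

    rows = []
    for listing in soup.select("li.s-item"):
        title = listing.select_one(".s-item__title")
        price = listing.select_one(".s-item__price")
        link = listing.select_one("a.s-item__link")
        if title and price and link:
            rows.append({"title": title.get_text(strip=True),
                         "price": price.get_text(strip=True),
                         "link": link["href"]})

    # Keeping the entries in a DataFrame makes it easy to write them out later.
    return pd.DataFrame(rows)

The returned DataFrame could then be written out with DataFrame.to_csv to produce the CSV file described in the Installation section.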

Improvements:

  • I realized that the program will often send entries that are no longer available. While the program sleeps, I could check each entry that was not seen in the most recent search and see whether it still has a price listed; if not, delete it from the data (see the sketch after this list).
  • Currently, the only way to close the program is to interrupt it. I could implement a way for users to stop the program gracefully if they want to.
  • While the program is running, nothing tells the user that it is working apart from the first email sent; the console stays unchanged for the whole duration. I could print status messages so the user knows the program is working as intended.
  • The program only works locally. I could implement a database that tracks the items of different users and sends that information without storing each entry locally, but that would be a completely different project.
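
As an illustration of the first improvement, here is a minimal sketch of how stale entries could be pruned between searches; the "link" column name and the availability check are assumptions, not existing code:

import requests
from bs4 import BeautifulSoup

def still_has_price(link):
    # Treat an entry as available if its listing page still shows a price.
    # The ".x-price-primary" selector is an assumption about eBay's item-page markup.
    response = requests.get(link)
    soup = BeautifulSoup(response.text, "lxml")
    return soup.select_one(".x-price-primary") is not None

def prune_stale_entries(data, seen_links):
    # Keep entries that appeared in the most recent search or whose listing
    # page still lists a price; drop everything else from the DataFrame.
    keep = data["link"].apply(lambda link: link in seen_links or still_has_price(link))
    return data[keep]
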
Owner
Luis M. Capdevielle
Business Analyst Graduate.
Bulk download tool for the MyMedia platform

MyMedia Bulk Content Downloader This is a bulk download tool for the MyMedia platform. USE ONLY WHERE ALLOWED BY THE COPYRIGHT OWNER. NOT AFFILIATED W

Ege Feyzioglu 3 Oct 14, 2022
Linkedin webscraping - Linkedin web scraping with python

linkedin_webscraping This is the first step of a full project called "LinkedIn J

Pedro Dib 4 Apr 24, 2022
12306 ticket-grabbing script

12306 ticket-grabbing script

罐子里的茶 457 Jan 05, 2023
A list of Python Bots used to extract data from several websites

A list of Python Bots used to extract data from several websites. Data extraction is for products on e-commerce (ecommerce) websites. Data fetched i

Sahil Ladhani 1 Jan 14, 2022
This script is intended to crawl license information of repositories through the GitHub API.

GithubLicenseCrawler This script is intended to crawl license information of repositories through the GitHub API. Taking a csv file with requirements.

schutera 4 Oct 25, 2022
Console application for downloading images from Reddit in Python

RedditImageScraper Console application for downloading images from Reddit in Python Introduction This short Python script was created for the mass-dow

James 0 Jul 04, 2021
Latest optimized version of the JD.com Maotai purchase script: JD Maotai flash sale with an optimized purchase process queue

Latest optimized version of the JD.com Maotai purchase script: JD Maotai flash sale with an optimized purchase process queue

MaoTai 129 Dec 14, 2022
Explore scraping with BeautifulSoup!

beautifulsoup-scrape Explore scraping with BeautifulSoup! Part One: Start from Shakespeare As my professor is a poet (yes, and he teaches me data and

Chuqin 2 Oct 05, 2022
A simple app to scrape data from Twitter.

Twitter-Scraping-App A simple app to scrape data from Twitter. Available Features Search query. Select number of data you want to fetch from twitter. C

Davis David 2 Oct 31, 2022
WebScraper - A script that prints out a list of all EXTERNAL references in the HTML response to an HTTP/S request

Project A: WebScraper A script that prints out a list of all EXTERNAL references

2 Apr 26, 2022
VG-Scraper is a Python program using the BeautifulSoup module, which allows anyone to scrape something off a website. This program lets you put in a number through an input, and each number corresponds to one news article.

VG-Scraper VG-Scraper is a convenient program where you can find all the news articles instead of finding one yourself. Installing [Linux] Open a term

3 Feb 13, 2022
Web scraping tool written in Python 3, using regex, to get CVEs, sources and URLs.

searchcve Web scraping tool written in Python 3, using regex, to get CVEs, sources and URLs. Generates a CSV file in the current directory. Uses the NI

32 Oct 10, 2022
Jobinja.ir jobs scraper.

Jobinja.ir Dataset Introduction This project is a simple web scraper that scraps pages of jobinja.ir concurrently and writes and update (if file gets

Iman Kermani 3 Apr 15, 2022
Crawler for the Fundamentus.com site using the Scrapy framework, covering both the detailed tab and the summary tab.

Crawler for the Fundamentus.com site using the Scrapy framework, covering both the detailed tab and the summary tab. (All information)

Guilherme Silva Uchoa 3 Oct 04, 2022
A python module to parse the Open Graph Protocol

OpenGraph is a module of python for parsing the Open Graph Protocol, you can read more about the specification at http://ogp.me/ Installation $ pip in

Erik Rivera 213 Nov 12, 2022
JD.com Maotai purchasing, many discussions of successful flash-sale purchases, Tmall purchasing, money-making exchange, etc.

Jd_Seckill Special statement: please add the personal WeChat 19972009719 to join the group for discussion. Many people in the group have already grabbed one [just scan WeChat to join the group; the group closes at 200 members; anyone interested in credit-card perks can also contact me]. Any scripts involved in the jd_seckill project published in this repository are for testing, learning, and research only; commercial use is prohibited, and their legality and accuracy cannot be guaranteed

50 Jan 05, 2023
Find papers by keywords and venues. Then download it automatically

paper finder Find papers by keywords and venues. Then download it automatically. How to use this? Search CLI python search.py -k "knowledge tracing,kn

Jiahao Chen (TabChen) 2 Dec 15, 2022
Free-Game-Scraper is a useful script that allows you to track down free games and DLCs on many platforms.

Game Scraper Free-Game-Scraper is a useful script that allows you to track down free games and DLCs on many platforms. Join the discord About The Proj

KursK 2 Mar 28, 2022
This program will help you to properly scrape all data from a specific website

This program will help you to properly scrape all data from a specific website

MD. MINHAZ 0 May 15, 2022
Scrapy, a fast high-level web crawling & scraping framework for Python.

Scrapy Overview Scrapy is a fast high-level web crawling and web scraping framework, used to crawl websites and extract structured data from their pag

Scrapy project 45.5k Jan 07, 2023