https://github.com/austinjhunt/ibew-data-scraper

Automation of merged data collection and cleaning from ibew.org and unionfacts.com for an Upwork job.

Science Score: 26.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (16.1%) to scientific vocabulary

Keywords

beautifulsoup multithreading python scraper union upwork web
Last synced: 5 months ago

Repository

Automation of merged data collection and cleaning from ibew.org and unionfacts.com for an Upwork job.

Basic Info
  • Host: GitHub
  • Owner: austinjhunt
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 222 KB
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Topics
beautifulsoup multithreading python scraper union upwork web
Created over 1 year ago · Last pushed over 1 year ago
Metadata Files
Readme

README.md

Upwork Job - Build simple Python script to query specific website’s API directly, with Excel output

Client: *******

Start Date: September 17, 2024

Est. Time: 2-3 hrs.

Detailed Client Requirements

Hi there:

There are two websites this will sift through. The first is a union directory search tool (URL: ibew.org/Tools/Local-Union-Directory) that allows users, on the front end, to search by Local (which is a number), by VP District, and by State/Province. First, the script needs to hit the API (which I’ve already done, but not on an iterative level) by querying the following states: NY, CT, RI, MA, VT, NH, and ME. When you do, you’ll get a JSON response with the local union ID and city/state; I need those three fields added to a data frame. Then, for each local union ID found in the first pull, it needs to do the search by Local, pull in the classifications from the output, click "show county information," and capture the counties listed along with the population, square miles, percent, and jurisdiction per county.

Next, after appending that data, the script will go to the website containing the number of members per local union ID (Unionfacts.com/locals/InternationalBrotherhoodofElectricalWorkers). It will search for the union and output the number of members. After consolidating all of that into a single dataframe, it will provide it as an Excel output.

As I am very pressed for time and don’t have time to code it right now, please let me know how soon you can get this to me, and the number of hours you intend to take to work on it.

Best, Michael

Solution: IBEW Data Scraper

Overview

This Python script (main.py) scrapes and merges data about local unions from the IBEW (International Brotherhood of Electrical Workers) directory and the UnionFacts website. Data is retrieved through a combination of API requests and web scraping (where no API is available), then processed and saved as an Excel file. A sketch of the overall flow appears after the feature list below.

Features:

  • Queries IBEW API by state to retrieve local union details.
  • Scrapes the UnionFacts website for additional union data (no API is available, so BeautifulSoup is used). The table on this site has data for 800+ unions, so only rows with a Local identifier are kept.
  • Enhances union data by adding classifications and county information using multithreading for efficiency.
  • Cleans and flattens nested JSON structures (like Counties) for easier data processing.
  • Outputs the result as an Excel file.
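
To make the workflow concrete, here is a minimal sketch of how these pieces could fit together. The IBEW endpoint, query parameter, JSON field names, and join key below are placeholders for illustration only; the actual main.py may structure this differently.

```python
import concurrent.futures

import pandas as pd
import requests
from bs4 import BeautifulSoup

# Placeholder endpoint, params, and field names -- the real API details may differ.
IBEW_DIRECTORY_URL = "https://www.ibew.org/Tools/Local-Union-Directory"
UNIONFACTS_URL = "https://unionfacts.com/locals/InternationalBrotherhoodofElectricalWorkers"
DEFAULT_STATES = ["NY", "CT", "RI", "MA", "VT", "NH", "ME"]


def fetch_locals_for_state(state):
    """Query the directory API for one state and return its JSON payload."""
    resp = requests.get(IBEW_DIRECTORY_URL, params={"state": state}, timeout=30)
    resp.raise_for_status()
    return resp.json()  # assumed: a list of dicts with local union ID, city, state


def scrape_unionfacts_members():
    """Scrape the UnionFacts table and keep only rows with a Local identifier."""
    html = requests.get(UNIONFACTS_URL, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for tr in soup.select("table tr"):
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if len(cells) >= 2 and "Local" in cells[0]:
            rows.append({"local": cells[0], "members": cells[1]})
    return pd.DataFrame(rows)


def main():
    # Query each state concurrently (the multithreading mentioned above).
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(DEFAULT_STATES)) as pool:
        batches = pool.map(fetch_locals_for_state, DEFAULT_STATES)
    # json_normalize flattens nested dicts into dot-separated columns; list-valued
    # fields such as Counties would need extra handling (record_path or a custom flatten).
    locals_df = pd.json_normalize([record for batch in batches for record in batch])

    members_df = scrape_unionfacts_members()
    # "LocalUnion" is a placeholder join key; the real JSON field name may differ.
    merged = locals_df.merge(members_df, how="left", left_on="LocalUnion", right_on="local")
    merged.to_excel("merged_union_data.xlsx", index=False)


if __name__ == "__main__":
    main()
```

A thread pool suits this job because the per-state queries are I/O-bound HTTP requests, so they can overlap without any CPU-level parallelism.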

Installation

Requirements:

Make sure you have Python (>= 3.6) installed, then install the required dependencies listed in requirements.txt:

```bash
pip install -r requirements.txt
```

Script Arguments

The script can be passed 3 optional arguments: states, logfile, and output. See the help output below for a description of these arguments.

```bash
(venv) austinhunt@Austins-MBP-2 unionfacts % python main.py -h
usage: main.py [-h] [--states STATES] [--logfile LOGFILE] [--output OUTPUT]

IBEW Data Scraper

options:
  -h, --help         show this help message and exit
  --states STATES    Comma-separated list of state abbreviations to query, e.g. NY,CT,RI
  --logfile LOGFILE  Optional log file name
  --output OUTPUT    Output file name (must end with .xlsx)
```
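
These three flags map directly onto argparse options. A minimal sketch reproducing the interface shown above (the default values are assumptions based on the description further down) could look like this:

```python
import argparse


def parse_args():
    # Sketch of a parser matching the help output above; main.py may differ.
    parser = argparse.ArgumentParser(description="IBEW Data Scraper")
    parser.add_argument(
        "--states",
        default="NY,CT,RI,MA,VT,NH,ME",
        help="Comma-separated list of state abbreviations to query, e.g. NY,CT,RI",
    )
    parser.add_argument("--logfile", default=None, help="Optional log file name")
    parser.add_argument(
        "--output",
        default="merged_union_data.xlsx",
        help="Output file name (must end with .xlsx)",
    )
    args = parser.parse_args()
    if not args.output.endswith(".xlsx"):
        parser.error("--output must end with .xlsx")
    return args
```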

Example with arguments:

```bash
(venv) austinhunt@Austins-MBP-2 unionfacts % python main.py --states=NY,CT,RI,MA,VT,NH,ME --output mergeduniondata.xlsx --logfile main.log
```

Example without arguments:

```bash
(venv) austinhunt@Austins-MBP-2 unionfacts % python main.py
```

If you run without arguments, the data will be written to merged_union_data.xlsx (a sample file is included in the repo), and data will be collected for the following default states: ["NY", "CT", "RI", "MA", "VT", "NH", "ME"].

Logging

You can optionally pass --logfile <log file path> to output logs to a file in addition to standard output. A sample log file is included in the repo.
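
A minimal sketch of that setup, assuming the standard logging module with a stream handler plus an optional file handler (the actual format and level in main.py may differ):

```python
import logging
import sys


def configure_logging(logfile=None):
    # Always log to stdout; add a file handler only when --logfile is given.
    handlers = [logging.StreamHandler(sys.stdout)]
    if logfile:
        handlers.append(logging.FileHandler(logfile))
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
        handlers=handlers,
    )
```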

Owner

  • Name: Austin Hunt
  • Login: austinjhunt
  • Kind: user
  • Location: Greenville, SC
  • Company: College of Charleston

Portrait-artist-turned-computer-geek with a fused love for the visual and the technical, bringing experience with and excitement for web dev, automation, & art

Committers

Last synced: 7 months ago

All Time
  • Total Commits: 22
  • Total Committers: 1
  • Avg Commits per committer: 22.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 22
  • Committers: 1
  • Avg Commits per committer: 22.0
  • Development Distribution Score (DDS): 0.0
Top Committers
  • Austin Hunt (a****s@g****m): 22 commits

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0

Dependencies

requirements.txt pypi
  • beautifulsoup4 ==4.12.3
  • bs4 ==0.0.2
  • certifi ==2024.8.30
  • charset-normalizer ==3.3.2
  • et-xmlfile ==1.1.0
  • idna ==3.10
  • numpy ==2.1.1
  • openpyxl ==3.1.5
  • pandas ==2.2.2
  • python-dateutil ==2.9.0.post0
  • pytz ==2024.2
  • requests ==2.32.3
  • six ==1.16.0
  • soupsieve ==2.6
  • tzdata ==2024.1
  • urllib3 ==2.2.3