Your First Taste of Automation: Scripts That Work for You
Why Automate?
You’ve spent countless hours copying data between files, fixing messy spreadsheets, and hunting for typos. The loop feels endless, and it’s not the work you signed up for. Automation offers an escape. When the computer handles the boring, predictable parts, you save time and reduce mistakes.
Picture a Monday morning report. Instead of wrestling with data while half-asleep, a small script can prepare and email it while you fetch your coffee. That simple shift lowers stress and keeps errors out of your inbox.

Writing Your First Python Script
Start with one folder and a file named sales.csv. You want to read it, clean it, and save a new cleaned_sales.csv. The short Python script below does the job—read, filter, write—so you never repeat a tedious copy-paste cycle again.
import csv

with open('sales.csv', 'r', newline='') as infile, open('cleaned_sales.csv', 'w', newline='') as outfile:
    reader = csv.reader(infile)
    writer = csv.writer(outfile)
    for row in reader:
        # Clean up: skip rows with any empty fields
        if all(field.strip() for field in row):
            writer.writerow(row)

The script opens the source file, skips rows with blanks, then writes clean rows to the new file. Save it as clean.py, run python clean.py in a terminal, and the task finishes in seconds—no more manual fixes.
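To see the filter in action without touching real data, here is a self-contained sketch: it builds a tiny sample sales.csv (the file names and sample values are just for demonstration), applies the same all-fields-non-blank test, and counts the rows that survive.

```python
import csv

# Build a small sample file for demonstration
with open('sales.csv', 'w', newline='') as f:
    csv.writer(f).writerows([
        ['date', 'item', 'amount'],
        ['2024-01-01', 'widget', '19.99'],
        ['2024-01-02', '', '5.00'],        # empty field -> will be skipped
        ['2024-01-03', 'gadget', '7.50'],
    ])

# Same cleaning logic as the script above
with open('sales.csv', newline='') as infile, open('cleaned_sales.csv', 'w', newline='') as outfile:
    writer = csv.writer(outfile)
    for row in csv.reader(infile):
        if all(field.strip() for field in row):
            writer.writerow(row)

with open('cleaned_sales.csv', newline='') as f:
    print(sum(1 for _ in csv.reader(f)))  # 3: header plus two complete rows
```

The row with the blank item never reaches the output file, which is exactly the behavior you want for a first cleaning pass.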
Making Scripts Flexible with Parameters
Hard-coding file names is fine once, but tomorrow you might clean a new file. Command-line parameters make your script reusable. Python’s argparse module captures those inputs so you can point the same code at any dataset.
import argparse
import csv
parser = argparse.ArgumentParser(description='Clean a CSV by removing empty rows.')
parser.add_argument('input_csv', help='Path to input CSV file.')
parser.add_argument('output_csv', help='Path to cleaned CSV file.')
args = parser.parse_args()
with open(args.input_csv, 'r', newline='') as infile, open(args.output_csv, 'w', newline='') as outfile:
    reader = csv.reader(infile)
    writer = csv.writer(outfile)
    for row in reader:
        if all(field.strip() for field in row):
            writer.writerow(row)
Now run python clean.py sales.csv cleaned_sales.csv, and the same logic cleans any file you specify.
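argparse also handles optional flags with defaults, which is handy once files stop being comma-separated. The sketch below adds a hypothetical --delimiter option (not part of the script above); passing a list to parse_args() lets you try the parser without a terminal.

```python
import argparse

parser = argparse.ArgumentParser(description='Demo of an optional flag.')
parser.add_argument('input_csv', help='Path to input CSV file.')
parser.add_argument('--delimiter', default=',',
                    help='Field separator (default: comma).')

# parse_args() accepts an explicit list, so the parser is easy to test
args = parser.parse_args(['sales.tsv', '--delimiter', '\t'])
print(args.input_csv)   # sales.tsv
```

On the command line the same call would look like python clean.py sales.tsv --delimiter $'\t', and omitting the flag falls back to the comma default.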

Simple ETL with Pandas
When data grows or rules get tricky, reach for pandas, a third-party library you can install with pip install pandas. With a few lines, you can remove blanks, drop duplicates, and tidy column names—classic ETL in minutes.
import pandas as pd
import argparse
parser = argparse.ArgumentParser(description='Clean a CSV using pandas.')
parser.add_argument('input_csv', help='Path to input CSV file.')
parser.add_argument('output_csv', help='Path to cleaned CSV file.')
args = parser.parse_args()
df = pd.read_csv(args.input_csv)
df = df.dropna() # Remove rows with any missing values
df = df.drop_duplicates() # Remove duplicate rows
# Tidy column names: strip whitespace and title-case each word
df.columns = [col.strip().title() for col in df.columns]
df.to_csv(args.output_csv, index=False)
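You can watch each pandas step work on an in-memory frame instead of a file. The small DataFrame below (invented for illustration) has a missing value, a duplicate row, and an untidy column name, so all three cleaning steps fire.

```python
import pandas as pd

# A messy stand-in for a real CSV: one missing amount, one duplicate row,
# and a column name with stray spaces
df = pd.DataFrame({
    ' order id ': [1, 2, 2, 3],
    'amount': [10.0, 5.0, 5.0, None],
})

df = df.dropna()            # drops the row with the missing amount
df = df.drop_duplicates()   # collapses the repeated order
df.columns = [col.strip().title() for col in df.columns]

print(df.columns.tolist())  # ['Order Id', 'Amount']
print(len(df))              # 2
```

Two clean rows with readable column names remain, which is the same result df.to_csv() would write to disk in the script above.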

Automation, Step by Step
First, write one useful script. Next, add parameters so you can reuse it freely. Then, adopt pandas when tasks grow. Each small win returns time to you, and your growing toolkit opens new chances to streamline your day.
