Modules & Imports
import, packages, virtualenvs, the standard library.
A Python file is a module. A folder containing an __init__.py (or, since 3.3, just a folder — a namespace package) is a package. Both are imported the same way.
Importing
import math
math.pi # use the qualified name
from math import pi, sqrt
sqrt(2) # bring names into the current namespace
import math as m
m.pi # alias
from math import * # everything — please don't
The first form (import x) is the safest. The second (from x import y) is fine for a few well-known names. The third is for renaming. The wildcard form pollutes your namespace and is discouraged.
Where Python looks
When you write import foo, Python walks sys.path looking for foo.py or foo/__init__.py. sys.path is built from:
- The current script's directory.
- The PYTHONPATH environment variable.
- Installed packages (via pip / virtualenv / system).
If you're getting ModuleNotFoundError, your file isn't on the path. Common fixes: install the package (pip install foo), activate the right virtualenv, or restructure your project so it's importable.
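To see exactly where Python is looking, print sys.path yourself — a minimal check before reaching for any other fix:

```python
import sys

# sys.path is a plain list of directories, searched in order on import.
# The first hit wins, which is why a local file named json.py can shadow
# the stdlib module of the same name.
for entry in sys.path:
    print(entry)
```

If the directory containing your module isn't in that list, no amount of re-running will make the import work.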
Packages
A package is a directory:
mypkg/
__init__.py # makes mypkg a package
auth.py
db.py
utils/
__init__.py
strings.py
Imports:
from mypkg.auth import login
from mypkg.utils.strings import slugify
__init__.py runs once when the package is first imported. Use it for package-level setup or to re-export names from submodules:
# mypkg/__init__.py
from .auth import login, logout
from .db import connect
# now consumers can write `from mypkg import login`
The leading dot in from .auth import login marks a relative import — resolved relative to the current package. Prefer absolute imports (from mypkg.auth import login) most of the time; relative imports get confusing fast, and they fail when a module is run directly as a script, because there's no package context.
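The re-export pattern above can be demonstrated end to end. This sketch builds the mypkg/auth layout from the example in a throwaway temp directory (the temp-dir setup is just scaffolding for the demo, not something you'd do in a real project):

```python
import sys
import tempfile
from pathlib import Path

# Scaffold:  <tmp>/mypkg/__init__.py  re-exports login from  <tmp>/mypkg/auth.py
root = Path(tempfile.mkdtemp())
pkg = root / "mypkg"
pkg.mkdir()
(pkg / "__init__.py").write_text("from .auth import login\n")
(pkg / "auth.py").write_text("def login():\n    return 'logged in'\n")

# Make the temp dir importable, then use both spellings.
sys.path.insert(0, str(root))
from mypkg import login                   # via the __init__.py re-export
from mypkg.auth import login as login2    # absolute import, same function

print(login())        # 'logged in'
print(login is login2)  # True — both names resolve to one object
```

Consumers only see `from mypkg import login`; the internal layout stays free to change.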
Virtualenvs
Don't install packages globally. Each project gets its own virtualenv:
python -m venv .venv
source .venv/bin/activate # macOS/Linux
.venv\Scripts\activate # Windows
pip install -r requirements.txt
Modern alternatives: uv, poetry, pipenv, conda. They all solve the same "isolate dependencies per project" problem.
requirements.txt vs pyproject.toml
The old way is requirements.txt — a flat list of pinned versions:
requests==2.32.0
pydantic==2.5.0
The modern way is pyproject.toml — a single file that declares your project, its dependencies, and its build config:
[project]
name = "mypkg"
version = "0.1.0"
dependencies = [
"requests>=2.32",
"pydantic>=2.5",
]
Tools like uv, pip, poetry, and hatch can all consume pyproject.toml.
__name__ == "__main__"
Every module has a __name__ attribute. When run directly (python script.py), it's "__main__". When imported, it's the module's name. This idiom lets a file double as a library and a script:
def main():
...
if __name__ == "__main__":
main()
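You can watch __name__ take both values in one snippet — the running script sees "__main__", while anything it imports sees its own module name:

```python
import math

# For an imported module, __name__ is the module's dotted name.
print(math.__name__)  # math

# For the file being executed directly, __name__ is "__main__",
# which is exactly what the `if __name__ == "__main__":` guard tests.
print(__name__)
```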
The standard library
Python ships with a huge standard library. Before you pip install anything, check if the stdlib has it:
- pathlib — modern file path handling
- json, csv, tomllib (3.11+) — common formats
- re — regular expressions
- argparse — command-line interfaces
- dataclasses, enum, typing — type machinery
- itertools, functools, collections — functional helpers
- subprocess, os, shutil — OS interaction
- urllib.request, urllib.parse — basic HTTP
- unittest — testing
- logging — structured logging
- concurrent.futures, asyncio, threading — concurrency
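As a taste of how far the stdlib gets you, here's a config file written and read back with pathlib and json alone — zero pip installs (the temp directory and the config keys are just for the demo):

```python
import json
import tempfile
from pathlib import Path

# Write a JSON config with pathlib's one-shot write_text...
cfg = Path(tempfile.mkdtemp()) / "config.json"
cfg.write_text(json.dumps({"debug": True, "retries": 3}))

# ...and read it back the same way.
loaded = json.loads(cfg.read_text())
print(loaded["retries"])  # 3
```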