Python interview questions focus on basics, data types, OOP, data structures, and frameworks like Django and Flask.
Python is a high-level, interpreted, general-purpose programming language created by Guido van Rossum and first released in 1991. Key features include simple, readable syntax, dynamic typing, automatic memory management, and a large standard library.
Python is dynamically typed. Variable types are determined at runtime, not at compile time. You do not declare a variable's type; it is inferred from the assigned value.
x = 10 # x is int
x = "Hello" # x is now str — perfectly valid
x = [1,2,3] # x is now list
Python's built-in data types fall into these categories:
- Numeric: int, float, complex
- Sequence: str, list, tuple, range
- Mapping: dict
- Set: set, frozenset
- Boolean: bool (True / False)
- Binary: bytes, bytearray, memoryview
- None type: NoneType
PEP 8 (Python Enhancement Proposal 8) is the official style guide for Python code. Key conventions include: 4-space indentation; snake_case for variables and functions; PascalCase for classes. Tools like flake8 and black enforce PEP 8 automatically.
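A minimal sketch of these naming conventions in practice (all names below are illustrative, not from any real library):

```python
MAX_RETRIES = 3  # constants: UPPER_SNAKE_CASE

def fetch_user_data(user_id):  # functions and variables: snake_case
    return {"id": user_id}

class HttpClient:  # classes: PascalCase
    def __init__(self, base_url):
        self.base_url = base_url
        self._timeout = 30  # leading underscore: internal by convention
```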
What is the difference between is and ==?
- == checks value equality — whether two objects hold the same value.
- is checks identity — whether two variables point to the exact same object in memory (compare with id()).
a = [1, 2, 3]
b = [1, 2, 3]
print(a == b)  # True — same value
print(a is b)  # False — different objects in memory
c = a
print(c is a)  # True — same object
Type casting (type conversion) converts a value from one data type to another using built-in constructor functions.
int("42") # 42 (str → int)
float("3.14") # 3.14 (str → float)
str(100) # "100" (int → str)
list((1,2,3)) # [1,2,3] (tuple → list)
bool(0) # False
bool("hello") # True
Implicit casting (coercion) happens
automatically, e.g. int + float → float.
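A quick illustration of implicit coercion (note that bool is a subclass of int, so it coerces to 1 or 0 in arithmetic):

```python
result = 3 + 2.5      # int + float is coerced to float
print(result)         # 5.5
print(type(result))   # <class 'float'>
print(True + 1)       # 2  (True coerces to the int 1)
```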
Which built-in types are mutable and which are immutable?
- Mutable: list, dict, set, bytearray.
- Immutable: int, float, str, tuple, frozenset, bytes.
# Mutable
lst = [1, 2, 3]
lst[0] = 99 # OK — [99, 2, 3]
# Immutable
t = (1, 2, 3)
t[0] = 99 # TypeError!
Immutable objects are hashable and can be used as dictionary keys or set elements.
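For example, a tuple works as a dict key while a list does not:

```python
coords = {(40.7, -74.0): "New York"}   # tuple key — hashable, OK
print(coords[(40.7, -74.0)])           # New York
try:
    bad = {[40.7, -74.0]: "New York"}  # list key — unhashable
except TypeError as e:
    print(e)                           # unhashable type: 'list'
```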
What is None in Python?
None is a singleton constant of type NoneType. It represents the absence of a value — similar to null in other languages. It is the implicit return value of functions that have no return statement.
x = None
print(type(x))    # <class 'NoneType'>
print(x is None)  # True ← preferred identity check
print(x == None)  # True ← works but not PEP 8 preferred
What is the difference between the / and // operators?
- /: true (float) division. Always returns a float.
- //: floor division. Returns the largest integer less than or equal to the result.
print(7 / 2)    # 3.5
print(7 // 2)   # 3
print(-7 // 2)  # -4 ← floors toward negative infinity
f-strings (formatted string literals, Python
3.6+) embed expressions directly inside strings using f"...{expr}...".
They are generally the fastest and most readable formatting option.
name, age = "Alice", 30
# f-string (preferred)
print(f"Name: {name}, Age: {age}")
# %-formatting (old style)
print("Name: %s, Age: %d" % (name, age))
# .format()
print("Name: {}, Age: {}".format(name, age))
# Expression support in f-strings
print(f"Next year: {age + 1}") # Next year: 31
What is the difference between break, continue, and pass?
- break: immediately exits the enclosing loop.
- continue: skips the rest of the current iteration and jumps to the next.
- pass: a null statement / placeholder — does nothing. Used where syntax requires a statement but no action is needed.
for i in range(5):
    if i == 2: continue  # skip 2
    if i == 4: break     # stop at 4
    print(i)             # 0, 1, 3
class Empty: pass        # valid empty class
How does the for-else / while-else construct work?
In Python, loops can have an else clause. The else block executes only if the loop completes without hitting a break. If break is triggered, the else is skipped.
for n in range(2, 10):
    for x in range(2, n):
        if n % x == 0:
            print(f"{n} is not prime")
            break
    else:
        print(f"{n} is prime")  # runs only when no break
Python's ternary expression evaluates to one of two values based on a condition, all in a single line.
# Syntax: value_if_true if condition else value_if_false
age = 20
status = "adult" if age >= 18 else "minor"
print(status)  # adult
# Equivalent to:
if age >= 18:
    status = "adult"
else:
    status = "minor"
What changed about range() between Python 2 and Python 3?
In Python 2, range() returns a list (all values stored in memory), while xrange() returns a lazy iterator (memory-efficient). In Python 3, range() behaves like Python 2's xrange() — it is a lazy range object — and xrange() was removed.
r = range(1_000_000)
print(type(r))  # <class 'range'>
print(r[5])     # 5 — supports indexing
# Memory: stores only start, stop, step — not all values
Python uses try / except / else / finally blocks
for exception handling:
try:
    result = 10 / 0
except ZeroDivisionError as e:
    print(f"Error: {e}")  # Error: division by zero
except (TypeError, ValueError):
    print("Type or Value error")
else:
    print("No exception occurred")  # runs if no exception
finally:
    print("Always runs")  # cleanup code
Custom exceptions are created by subclassing
Exception (or any built-in exception). Use raise to throw
them.
class InsufficientFundsError(Exception):
    def __init__(self, amount, balance):
        super().__init__(f"Need {amount}, have {balance}")
        self.amount = amount
        self.balance = balance

def withdraw(amount, balance):
    if amount > balance:
        raise InsufficientFundsError(amount, balance)
    return balance - amount

try:
    withdraw(500, 100)
except InsufficientFundsError as e:
    print(e)  # Need 500, have 100
What is the assert statement?
assert is a debugging aid that tests a condition. If the condition is False, it raises an AssertionError with an optional message. Assertions are disabled when Python is run with the -O (optimize) flag.
def divide(a, b):
    assert b != 0, "Denominator must not be zero"
    return a / b

divide(10, 2)  # 5.0
divide(10, 0)  # AssertionError: Denominator must not be zero
What is the match-case statement (structural pattern matching)?
Introduced in Python 3.10, match-case is a structural pattern matching construct — more powerful than a simple switch statement because it can match types, shapes, and values.
def handle_command(command):
    match command:
        case "quit":
            return "Quitting..."
        case "hello" | "hi":
            return "Hello!"
        case {"action": action, "object": obj}:
            return f"Do {action} on {obj}"
        case _:
            return "Unknown command"
print(handle_command("quit")) # Quitting...
What is the difference between while True and for loops?
- for loop: iterates over a sequence or iterable a known number of times. Automatically stops when the iterable is exhausted.
- while True: runs indefinitely until a break statement is encountered. Best used when the number of iterations is unknown (e.g. waiting for user input).
# Event loop pattern using while True
while True:
    user_input = input("Command: ")
    if user_input == "exit":
        break
    process(user_input)
What does enumerate() do?
enumerate() adds a counter to an iterable and returns (index, value) pairs. It avoids manually maintaining a counter variable. You can specify the start index (default 0).
fruits = ["apple", "banana", "cherry"]
for i, fruit in enumerate(fruits, start=1):
    print(f"{i}. {fruit}")
# 1. apple
# 2. banana
# 3. cherry
What are *args and **kwargs?
*args collects extra positional arguments into a tuple. **kwargs collects extra keyword arguments into a dict.
def my_func(*args, **kwargs):
    print(args)    # (1, 2, 3)
    print(kwargs)  # {'name': 'Alice', 'age': 25}

my_func(1, 2, 3, name="Alice", age=25)
The names args and kwargs are
convention; only * and ** are syntactically required.
A lambda is an anonymous, single-expression
function. Its syntax: lambda args: expression. Commonly used with
map(), filter(), and sorted().
add = lambda x, y: x + y
print(add(3, 5)) # 8
nums = [3, 1, 4, 1, 5]
nums.sort(key=lambda x: -x) # descending sort
print(nums) # [5, 4, 3, 1, 1]
A closure is a nested function that remembers variables from its enclosing scope even after the outer function has finished executing. The inner function "closes over" the free variables.
def make_multiplier(factor):
    def multiply(x):
        return x * factor  # factor is a free variable
    return multiply

double = make_multiplier(2)
triple = make_multiplier(3)
print(double(5))  # 10
print(triple(5))  # 15
# factor is still accessible via __closure__
Python resolves variable names in LEGB order: Local scope first, then Enclosing function scope, then Global (module) scope, then Built-ins (len, print, etc.).
x = "global"
def outer():
    x = "enclosing"
    def inner():
        # x = "local"  # uncomment to use local
        print(x)  # enclosing (LEGB order)
    inner()

outer()
What is the difference between global and nonlocal?
- global: allows a function to modify a variable at module (global) scope.
- nonlocal: allows an inner function to modify a variable in the nearest enclosing function scope (not global).
count = 0
def increment():
    global count
    count += 1

def outer():
    x = 0
    def inner():
        nonlocal x
        x += 1
    inner()
    print(x)  # 1
Default argument values are evaluated once at function definition time, not each call. Using a mutable default (e.g. a list) causes it to be shared across all calls.
# BUG
def append_item(item, lst=[]):
    lst.append(item)
    return lst

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2] ← shared list!

# FIX — use None as sentinel
def append_item(item, lst=None):
    if lst is None:
        lst = []
    lst.append(item)
    return lst
What is the difference between map(), filter(), and reduce()?
- map(fn, iterable): applies fn to every item → returns a map object (lazy).
- filter(fn, iterable): keeps items where fn returns truthy → returns a filter object.
- reduce(fn, iterable): cumulatively applies fn to reduce the iterable to a single value (from functools).
from functools import reduce
nums = [1, 2, 3, 4, 5]
squares = list(map(lambda x: x**2, nums)) # [1,4,9,16,25]
evens = list(filter(lambda x: x%2==0, nums)) # [2,4]
total = reduce(lambda a,b: a+b, nums) # 15
Recursion is when a function calls itself.
Every recursive function needs a base case to stop. Python's default
recursion limit is 1000 frames (configurable via
sys.setrecursionlimit()). Deep recursion can cause a
RecursionError.
def factorial(n):
    if n <= 1:  # base case
        return 1
    return n * factorial(n - 1)

print(factorial(5))  # 120
Python does not perform tail-call optimization, so iterative approaches are preferred for deep stacks.
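Since there is no tail-call optimization, the same factorial can be written iteratively so deep inputs do not grow the call stack; a sketch:

```python
def factorial_iter(n):
    result = 1
    for i in range(2, n + 1):  # no new stack frame per step
        result *= i
    return result

print(factorial_iter(5))       # 120
_ = factorial_iter(10_000)     # fine; the recursive version would hit the limit
```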
Type hints (PEP 484, Python 3.5+) document expected parameter
and return types. They are not enforced at runtime but help IDEs,
linters (mypy), and readers understand code intent.
def greet(name: str, times: int = 1) -> str:
    return f"Hello, {name}! " * times

from typing import List, Dict, Optional

def process(items: List[int]) -> Dict[str, int]:
    return {"sum": sum(items), "count": len(items)}

def find_user(uid: int) -> Optional[str]:
    users = {1: "Alice"}
    return users.get(uid)
A higher-order function is one that either takes a function as an argument or returns a function. Python treats functions as first-class objects, making this a native pattern.
# Takes a function as argument
def apply_twice(fn, value):
    return fn(fn(value))

print(apply_twice(lambda x: x * 2, 3))  # 12

# Returns a function
def make_adder(n):
    return lambda x: x + n

add5 = make_adder(5)
print(add5(10))  # 15
What is the __init__ method in Python?
__init__ is the instance initializer (commonly called the constructor). It is automatically called when a new instance is created. Use it to set the initial state of the object.
class Car:
    def __init__(self, color, speed=0):
        self.color = color
        self.speed = speed

my_car = Car("Red")
print(my_car.color)  # Red
Python enforces encapsulation by convention rather than keywords: a single underscore (_name) marks an attribute as internal, and a double underscore (__name) triggers name mangling.
How do inheritance and super() work?
Inheritance lets a child class reuse and extend a parent class. super() returns a proxy object that delegates method calls to the parent class, following the MRO.
class Animal:
    def __init__(self, name):
        self.name = name
    def speak(self):
        return "..."

class Dog(Animal):
    def __init__(self, name, breed):
        super().__init__(name)  # call parent __init__
        self.breed = breed
    def speak(self):
        return "Woof!"

d = Dog("Rex", "Labrador")
print(d.name, d.speak())  # Rex Woof!
What is the difference between instance methods, @classmethod, and @staticmethod?
- Instance method: first arg is self (the instance). Has access to instance and class state.
- @classmethod: first arg is cls (the class). Has access to class state, not instance state. Often used for alternative constructors.
- @staticmethod: no self or cls. A utility function grouped inside the class for logical organization.
class Date:
    def __init__(self, y, m, d):
        self.y, self.m, self.d = y, m, d

    @classmethod
    def from_string(cls, s):  # alternative constructor
        return cls(*map(int, s.split('-')))

    @staticmethod
    def is_leap_year(year):  # no self/cls needed
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

d = Date.from_string("2024-03-15")
print(Date.is_leap_year(2024))  # True
Dunder methods (double underscore, e.g.
__str__) allow classes to implement Python's built-in protocols
(operator overloading, context managers, iteration, etc.).
class Vector:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __repr__(self):
        return f"Vector({self.x}, {self.y})"
    def __add__(self, other):
        return Vector(self.x + other.x, self.y + other.y)
    def __len__(self):
        return 2

v1 = Vector(1, 2)
v2 = Vector(3, 4)
print(v1 + v2)  # Vector(4, 6)
print(len(v1))  # 2
MRO defines the order in which Python
searches base classes to find a method. Python uses the C3 linearization
algorithm to compute a consistent order. Use
ClassName.__mro__ to inspect it.
class A: pass
class B(A): pass
class C(A): pass
class D(B, C): pass
print(D.__mro__)
# (<class '__main__.D'>, <class '__main__.B'>, <class '__main__.C'>,
#  <class '__main__.A'>, <class 'object'>)
# D → B → C → A → object
What is the difference between __str__ and __repr__?
- __str__: called by str() and print(). Meant to be a human-readable string.
- __repr__: called by repr() and in the REPL. Meant to be an unambiguous, developer-facing representation — ideally one that could recreate the object.
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __str__(self):
        return f"({self.x}, {self.y})"
    def __repr__(self):
        return f"Point({self.x!r}, {self.y!r})"

p = Point(1, 2)
print(str(p))   # (1, 2) ← __str__
print(repr(p))  # Point(1, 2) ← __repr__
What is the @property decorator?
@property turns a method into a read-only attribute. Combined with @name.setter and @name.deleter, it implements controlled attribute access (Pythonic getters/setters).
class Circle:
    def __init__(self, radius):
        self._radius = radius

    @property
    def radius(self):
        return self._radius

    @radius.setter
    def radius(self, value):
        if value < 0:
            raise ValueError("Radius cannot be negative")
        self._radius = value

c = Circle(5)
print(c.radius)  # 5 — accessed like an attribute
c.radius = 10    # calls setter
c.radius = -1    # ValueError
Abstract classes (from abc module) define a
common interface but cannot be instantiated. Subclasses must
implement all @abstractmethod methods or they too remain abstract.
from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self) -> float: ...
    @abstractmethod
    def perimeter(self) -> float: ...

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h
    def area(self):
        return self.w * self.h
    def perimeter(self):
        return 2 * (self.w + self.h)

# Shape()  # TypeError: Can't instantiate abstract class
r = Rectangle(3, 4)
print(r.area())  # 12
"If it walks like a duck and quacks like a duck, it's a duck." Python checks whether an object supports the required operations, not whether it's an instance of a specific type. This is the foundation of Python's polymorphism.
class Duck:
    def quack(self):
        return "Quack!"

class Person:
    def quack(self):
        return "I'm quacking like a duck!"

def make_it_quack(obj):
    print(obj.quack())  # doesn't care about type

make_it_quack(Duck())    # Quack!
make_it_quack(Person())  # I'm quacking like a duck!
What is the difference between a list and a tuple?
- List: mutable (created with []), can be modified after creation, uses slightly more memory.
- Tuple: immutable (created with ()), cannot be changed, hashable (can be dict keys/set members), slightly faster.
lst = [1, 2, 3]; lst[0] = 99  # OK
tup = (1, 2, 3); tup[0] = 99  # TypeError
# Tuple as dict key (hashable)
d = {(0,0): "origin", (1,2): "point A"}
A concise way to create lists:
[expression for item in iterable if condition]. Faster and more
readable than equivalent for loops.
squares = [x**2 for x in range(10) if x % 2 == 0]
# [0, 4, 16, 36, 64]
# Nested comprehension — flatten a 2D list
matrix = [[1,2],[3,4],[5,6]]
flat = [n for row in matrix for n in row]
# [1, 2, 3, 4, 5, 6]
A dictionary is a collection of key: value pairs. It preserves insertion order since Python 3.7 (before that, ordering was an implementation detail). Keys must be hashable (in practice, immutable). Lookups are O(1) on average.
d = {"name": "Alice", "age": 30}
d["city"] = "NYC" # add/update
print(d.get("salary", 0)) # 0 (safe get with default)
print(d.keys()) # dict_keys([...])
print(d.items()) # dict_items([...])
del d["age"] # delete key
A set is an unordered collection of unique, hashable elements; duplicates are removed automatically.
s = {1, 2, 2, 3, 3}
print(s) # {1, 2, 3} — duplicates removed
# Set operations
a, b = {1,2,3}, {2,3,4}
print(a | b) # union: {1,2,3,4}
print(a & b) # intersection: {2,3}
print(a - b) # difference: {1}
print(a ^ b) # symmetric diff:{1,4}
Slicing extracts a sub-sequence using
seq[start:stop:step] (all optional). It works on lists, tuples,
strings, and any sequence type. Returns a new object; does not modify the original.
lst = [0,1,2,3,4,5,6,7,8,9]
print(lst[2:6]) # [2,3,4,5]
print(lst[::2]) # [0,2,4,6,8] every 2nd
print(lst[::-1]) # [9,8,...,0] reversed
print(lst[-3:]) # [7,8,9] last 3
What is the difference between append(), extend(), and insert()?
- append(x): adds x as a single item to the end.
- extend(iterable): adds each element of the iterable to the end.
- insert(i, x): inserts x at index i.
lst = [1, 2, 3]
lst.append([4, 5])  # [1, 2, 3, [4, 5]]
lst = [1, 2, 3]
lst.extend([4, 5])  # [1, 2, 3, 4, 5]
lst.insert(0, 0)    # [0, 1, 2, 3, 4, 5]
What is the difference between pop() and remove()?
- pop(index): removes and returns the element at the given index (default: last).
- remove(value): removes the first occurrence of the specified value. Raises ValueError if not found.
lst = [1, 2, 3, 2]
lst.pop()      # returns 2, lst = [1, 2, 3]
lst.pop(0)     # returns 1, lst = [2, 3]
lst = [1, 2, 3, 2]
lst.remove(2)  # lst = [1, 3, 2] — first match only
Like list comprehension but creates a dict:
{key_expr: val_expr for item in iterable if condition}.
words = ["hello", "world", "python"]
lengths = {w: len(w) for w in words}
# {'hello': 5, 'world': 5, 'python': 6}
# Invert a dict
original = {"a": 1, "b": 2, "c": 3}
inverted = {v: k for k, v in original.items()}
# {1: 'a', 2: 'b', 3: 'c'}
What is collections.defaultdict?
defaultdict is a dict subclass that automatically provides a default value for missing keys, using a factory function, instead of raising KeyError.
from collections import defaultdict
# Group words by first letter
words = ["apple", "avocado", "banana", "blueberry"]
groups = defaultdict(list)
for w in words:
    groups[w[0]].append(w)
print(dict(groups))
# {'a': ['apple', 'avocado'], 'b': ['banana', 'blueberry']}
What is collections.Counter?
Counter counts hashable objects and stores them as a dictionary of {element: count}. Very useful for frequency analysis.
from collections import Counter
text = "abracadabra"
c = Counter(text)
print(c) # Counter({'a': 5, 'b': 2, 'r': 2, 'c': 1, 'd': 1})
print(c.most_common(2)) # [('a', 5), ('b', 2)]
words = "the cat sat on the mat".split()
Counter(words) # counts word frequencies
What is collections.deque and when should you use it?
deque (double-ended queue) supports O(1) appends
and pops from both ends. Python lists have O(n)
insert(0, x) and pop(0). Use deque for
queues, BFS, and sliding-window problems.
from collections import deque
dq = deque([1, 2, 3])
dq.appendleft(0) # O(1): [0, 1, 2, 3]
dq.append(4) # O(1): [0, 1, 2, 3, 4]
dq.popleft() # O(1): returns 0
dq.pop() # O(1): returns 4
# Fixed-size sliding window (maxlen)
recent = deque(maxlen=3)
for x in range(6):
    recent.append(x)
print(recent) # deque([3, 4, 5], maxlen=3)
What is a frozenset?
A frozenset is an immutable version of a set. Because it is hashable, it can be used as a dictionary key or as an element of another set — unlike a regular set.
fs = frozenset([1, 2, 3, 2])
print(fs) # frozenset({1, 2, 3})
# As a dict key
graph = {frozenset({"A","B"}): 5, frozenset({"B","C"}): 3}
# Set of frozensets
visited = {frozenset({1,2}), frozenset({3,4})}
Use sorted() or list.sort() with a
key function. operator.itemgetter is slightly faster than
a lambda for simple key access.
from operator import itemgetter
people = [
    {"name": "Charlie", "age": 30},
    {"name": "Alice", "age": 25},
    {"name": "Bob", "age": 35},
]
by_age = sorted(people, key=itemgetter("age"))
by_name = sorted(people, key=lambda p: p["name"])
oldest = sorted(people, key=itemgetter("age"), reverse=True)
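Note the difference between sorted() (returns a new list, works on any iterable) and list.sort() (sorts in place and returns None); a quick sketch:

```python
nums = [3, 1, 2]
new = sorted(nums)    # new list; original untouched
print(nums, new)      # [3, 1, 2] [1, 2, 3]
result = nums.sort()  # in-place; returns None
print(nums, result)   # [1, 2, 3] None
```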
What is zip() and how does it work?
zip() takes multiple iterables and returns a lazy iterator of tuples pairing their elements. It stops at the shortest iterable. Use zip_longest from itertools to continue to the longest.
names = ["Alice", "Bob", "Charlie"]
scores = [92, 85, 78]
for name, score in zip(names, scores):
    print(f"{name}: {score}")
# Unzip / transpose
pairs = [(1,"a"), (2,"b"), (3,"c")]
nums, letters = zip(*pairs)
# nums=(1,2,3), letters=('a','b','c')
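The zip_longest variant mentioned above pads the shorter iterables with a fillvalue instead of stopping early; a sketch:

```python
from itertools import zip_longest

names = ["Alice", "Bob", "Charlie"]
scores = [92, 85]
print(list(zip(names, scores)))
# [('Alice', 92), ('Bob', 85)]  <- stops at shortest
print(list(zip_longest(names, scores, fillvalue=0)))
# [('Alice', 92), ('Bob', 85), ('Charlie', 0)]
```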
What is the difference between shallow and deep copy?
- Shallow copy (copy.copy(), list[:], list.copy()): creates a new container but copies only references to the nested objects. Nested mutables are still shared.
- Deep copy (copy.deepcopy()): recursively copies the object and all objects it references. Fully independent.
import copy
original = [[1, 2], [3, 4]]
shallow = copy.copy(original)
deep = copy.deepcopy(original)
original[0][0] = 99
print(shallow[0][0])  # 99 ← shared inner list
print(deep[0][0])     # 1 ← independent copy
A decorator is a higher-order function that
wraps another function or class to extend its behaviour without modifying its source
code. Applied with the @decorator syntax.
import functools
import time

def timer(func):
    @functools.wraps(func)  # preserves metadata
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__}: {time.perf_counter()-t0:.4f}s")
        return result
    return wrapper

@timer
def slow_sum(n):
    return sum(range(n))

slow_sum(1_000_000)  # e.g. slow_sum: 0.0312s
The GIL is a mutex in CPython that allows only one thread to execute Python bytecode at a time, even on multi-core machines. It simplifies memory management (reference counting) but limits CPU-bound parallelism via threads.
Workarounds: use multiprocessing (separate processes, each with its own GIL) or C extensions that release the GIL.
How do generators work?
Generators produce values lazily using the yield keyword. They maintain state between next() calls, using O(1) memory instead of storing the entire sequence.
def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

gen = fibonacci()
print([next(gen) for _ in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
# Generator expression (like list comp but lazy)
squares = (x**2 for x in range(10_000_000)) # no memory blown
What is a context manager and the with statement?
A context manager manages resource setup and teardown via the __enter__ and __exit__ dunder methods. The with statement ensures __exit__ is always called (even on exceptions), preventing resource leaks.
# Built-in usage
with open("file.txt", "r") as f:
    data = f.read()
# file auto-closed here

# Custom context manager
import time

class Timer:
    def __enter__(self):
        self.start = time.perf_counter()
        return self
    def __exit__(self, *args):
        print(f"Elapsed: {time.perf_counter()-self.start:.4f}s")

with Timer():
    sum(range(1_000_000))
What is @contextmanager from contextlib?
A decorator that lets you write a context manager as a generator function instead of a class. Code before yield runs on enter; code after yield runs on exit.
from contextlib import contextmanager

@contextmanager
def managed_resource(name):
    print(f"Opening {name}")
    try:
        yield name.upper()  # value bound to 'as' target
    finally:
        print(f"Closing {name}")

with managed_resource("db_connection") as res:
    print(f"Using {res}")  # Using DB_CONNECTION
What is asyncio and how does async/await work?
asyncio is Python's framework for writing concurrent I/O-bound code using coroutines. An async def function defines a coroutine; await suspends it until the awaitable completes, yielding control back to the event loop.
import asyncio

async def fetch(url):
    await asyncio.sleep(1)  # simulate I/O
    return f"Data from {url}"

async def main():
    results = await asyncio.gather(
        fetch("https://api.example.com/1"),
        fetch("https://api.example.com/2"),
    )
    print(results)  # both complete in ~1s total

asyncio.run(main())
What is the difference between threading and multiprocessing?
Threads share one process and its memory, but the GIL allows only one to run Python code at a time, so they suit I/O-bound work. Processes each have their own interpreter and GIL, giving true parallelism for CPU-bound work at the cost of higher startup and communication overhead.
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

# I/O-bound — use threads
with ThreadPoolExecutor() as ex:
    results = list(ex.map(download, urls))

# CPU-bound — use processes
with ProcessPoolExecutor() as ex:
    results = list(ex.map(compute, data))
A metaclass is the "class of a class" — it
controls how classes are created. type is the default metaclass for all
Python classes. Custom metaclasses override __new__ or
__init__ to modify class creation (used in ORMs, frameworks like
Django).
class SingletonMeta(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class Database(metaclass=SingletonMeta):
    pass

db1 = Database()
db2 = Database()
print(db1 is db2)  # True — same instance
What is __slots__ and why use it?
By default, Python stores instance attributes in a per-instance __dict__. __slots__ replaces this with a fixed set of descriptors, saving memory (no dict per instance) and speeding up attribute access. Useful when creating millions of instances.
class Point:
    __slots__ = ('x', 'y')
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
print(p.x)  # 1
p.z = 3     # AttributeError — no __dict__!
# Memory saving: often roughly half the size per instance
How does Python manage memory? Python uses two mechanisms:
1. Reference counting: every object tracks how many references point to it; when the count drops to zero, the object is freed immediately.
2. Cyclic garbage collection: reference counting alone cannot free reference cycles (a → b → a). The gc module runs a generational collector that detects and collects these cycles.
import gc
import sys
x = [1, 2, 3]
print(sys.getrefcount(x))  # 2 (x + the temporary argument reference)
del x         # ref count → 0 → freed
gc.collect()  # manually trigger cyclic GC
What is memoization and functools.lru_cache?
Memoization caches the results of expensive function calls and returns the cached result for the same inputs. @lru_cache provides a decorator that implements this with a Least Recently Used eviction policy.
from functools import lru_cache

@lru_cache(maxsize=None)  # None = unlimited cache
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

print(fib(50))           # instant; naive recursion makes on the order of 2^50 calls
print(fib.cache_info())  # hits, misses, size
Pickling is serializing a Python object into
a byte stream. Unpickling restores the object. Used for caching,
IPC, and saving model state (e.g. ML models with joblib).
import pickle
data = {"model": "GBM", "accuracy": 0.95, "params": [1,2,3]}
# Serialize
with open("model.pkl", "wb") as f:
    pickle.dump(data, f)
# Deserialize
with open("model.pkl", "rb") as f:
    loaded = pickle.load(f)
print(loaded["model"]) # GBM
⚠️ Never unpickle data from untrusted sources — it can execute arbitrary code.
Monkey patching is dynamically modifying a class or module at runtime — replacing or adding attributes/methods without altering the original source code. Common in testing (mocking) and hotfixes.
import json
# Save original
_original_dumps = json.dumps
def my_dumps(obj, **kwargs):
    kwargs.setdefault("indent", 2)
    return _original_dumps(obj, **kwargs)
# Patch at runtime
json.dumps = my_dumps
print(json.dumps({"a": 1})) # pretty-printed by default
What is the difference between __new__ and __init__?
- __new__(cls): a static method that creates and returns the new instance. Called before __init__. Used for immutable types, singletons, and metaclasses.
- __init__(self): initializes the already-created instance. Returns None.
class Singleton:
    _instance = None
    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

a, b = Singleton(), Singleton()
print(a is b)  # True
A descriptor is any object that defines
__get__, __set__, or __delete__. It enables
customized attribute access. Properties, class methods, and static methods are all
implemented as descriptors.
class Positive:
    """Descriptor: enforces positive numbers."""
    def __set_name__(self, owner, name):
        self.name = name
    def __get__(self, obj, objtype=None):
        return obj.__dict__.get(self.name)
    def __set__(self, obj, value):
        if value <= 0:
            raise ValueError(f"{self.name} must be positive")
        obj.__dict__[self.name] = value

class Product:
    price = Positive()
    stock = Positive()

p = Product()
p.price = 9.99  # OK
p.price = -1    # ValueError
Interning is an optimization where Python
reuses the same object for identical immutable values. CPython interns small
integers (-5 to 256) and short identifier-like strings, so
is may return True unexpectedly.
a = 256; b = 256
print(a is b) # True — interned
a = 257; b = 257
print(a is b) # False (CPython) — not guaranteed
import sys
s = sys.intern("long string repeated many times")
# Explicitly intern for dict-key performance
What is EAFP vs LBYL? LBYL ("Look Before You Leap") checks preconditions before acting; EAFP ("Easier to Ask Forgiveness than Permission") tries the operation and handles the exception. Python idiom generally prefers EAFP.
# LBYL
if "key" in d:
    value = d["key"]

# EAFP (preferred in Python)
try:
    value = d["key"]
except KeyError:
    value = default_value
# EAFP is cleaner and avoids race conditions in concurrent code
What is functools.wraps and why is it important?
When you wrap a function in a decorator, the wrapper replaces it — hiding its __name__, __doc__, __annotations__, etc. @functools.wraps(func) copies these attributes from the original onto the wrapper, preserving introspection.
import functools

def my_deco(func):
    @functools.wraps(func)  # ← critical!
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@my_deco
def add(a, b):
    """Adds two numbers."""
    return a + b

print(add.__name__)  # add (not 'wrapper')
print(add.__doc__)   # Adds two numbers.
What is the difference between a generator and a coroutine?
- Generator: uses yield to produce values. Data flows out of it.
- Coroutine: uses yield / await to both produce and receive values (bidirectional). Used for cooperative multitasking in asyncio.
# Classic coroutine (send-based)
def accumulator():
    total = 0
    while True:
        value = yield total  # receive via send()
        total += value

acc = accumulator()
next(acc)            # prime the coroutine
print(acc.send(10))  # 10
print(acc.send(5))   # 15
# Modern: async coroutine
async def fetch_data():
    await asyncio.sleep(0)
    return 42
What is the difference between type() and isinstance()?
- type(obj): returns the exact type. Does not consider inheritance.
- isinstance(obj, cls): returns True if obj is an instance of cls or any subclass. Preferred in OOP code.
class Animal: pass
class Dog(Animal): pass
d = Dog()
print(type(d) == Animal) # False
print(type(d) == Dog) # True
print(isinstance(d, Dog)) # True
print(isinstance(d, Animal)) # True ← follows inheritance
The @dataclass decorator (Python 3.7+)
auto-generates __init__, __repr__, and __eq__
from class variable annotations, reducing boilerplate for data-holder classes.
from dataclasses import dataclass, field

@dataclass(order=True, frozen=True)
class Point:
    x: float
    y: float
    label: str = "origin"
    tags: list = field(default_factory=list)

p1 = Point(1.0, 2.0)
p2 = Point(1.0, 2.0)
print(p1 == p2)  # True — auto __eq__
print(repr(p1))  # Point(x=1.0, y=2.0, label='origin', tags=[])
What is itertools? Give key examples.
itertools provides memory-efficient building blocks for iterators, implementing many ideas from functional programming.
import itertools
# chain — flatten iterables
list(itertools.chain([1,2], [3,4], [5])) # [1,2,3,4,5]
# groupby — group consecutive elements
data = [("A",1),("A",2),("B",3)]
for key, grp in itertools.groupby(data, key=lambda x: x[0]):
    print(key, list(grp))
# product — Cartesian product
list(itertools.product("AB", [1,2]))
# [('A',1),('A',2),('B',1),('B',2)]
# islice — lazy slice of iterator
list(itertools.islice(iter(range(100)), 5)) # [0,1,2,3,4]
A virtual environment is an isolated Python environment with its own interpreter and packages. It prevents dependency conflicts between projects ("Project A needs Django 3, Project B needs Django 5").
# Create
python -m venv .venv
# Activate
source .venv/bin/activate # macOS/Linux
.venv\Scripts\activate # Windows
# Install packages (isolated)
pip install requests
# Deactivate
deactivate
# Modern alternative: uv (much faster)
uv venv && uv pip install requests
What are __iter__ and __next__ (the iterator protocol)?
The iterator protocol requires two methods:
- __iter__: returns the iterator object itself (allows use in for loops).
- __next__: returns the next value; raises StopIteration when exhausted.
class CountUp:
    def __init__(self, start, stop):
        self.current, self.stop = start, stop
    def __iter__(self):
        return self
    def __next__(self):
        if self.current >= self.stop:
            raise StopIteration
        val = self.current
        self.current += 1
        return val

for n in CountUp(1, 4):
    print(n)  # 1, 2, 3
What is the __call__ dunder method?
Implementing __call__ makes an instance callable — you can invoke it like a function. Useful for stateful callables, function-like objects, and class-based decorators.
class Multiplier:
    def __init__(self, factor):
        self.factor = factor
    def __call__(self, x):
        return x * self.factor

triple = Multiplier(3)
print(triple(5))         # 15
print(callable(triple))  # True
What is a weakref in Python?
A weak reference refers to an object without incrementing its reference count, so the object can still be garbage-collected. Useful for caches and avoiding circular references.
import weakref

class BigObject:
    def __del__(self):
        print("Deleted!")

obj = BigObject()
ref = weakref.ref(obj)
print(ref())  # <__main__.BigObject object at 0x...> — alive
del obj       # prints "Deleted!" — collected immediately in CPython
print(ref())  # None — object gone

# WeakValueDictionary for auto-evicting caches
cache = weakref.WeakValueDictionary()
What is the difference between __getattr__ and __getattribute__?
- __getattribute__: called on every attribute access. Override with extreme caution — infinite recursion is easy.
- __getattr__: called only when the normal lookup fails (attribute not found). Used for lazy attributes, proxies, and dynamic attribute generation.
class LazyProxy:
    def __init__(self, data):
        self._data = data
    def __getattr__(self, name):
        # Only called when name isn't found normally
        if name in self._data:
            return self._data[name]
        raise AttributeError(f"No attribute '{name}'")

p = LazyProxy({"x": 10, "y": 20})
print(p.x)  # 10
What is the difference between an egg and a wheel?
- Egg (.egg): the old distribution format (introduced by setuptools). Mostly obsolete.
- Wheel (.whl): the modern, standard built-distribution format (PEP 427). A zip archive of pre-built files that installs faster than source distributions (no compilation step). Platform wheels include native code (e.g. cp311-win_amd64); pure-Python wheels work everywhere (py3-none-any).
# Build a wheel
pip install build
python -m build --wheel
# Install from wheel
pip install mypackage-1.0-py3-none-any.whl
Python's data model describes the rules by which Python objects interact with the language. By implementing dunder methods, any class can integrate with Python's syntax and built-ins seamlessly.
- Operator overloading: __add__, __mul__, __abs__, etc.
- Container protocol: __len__, __getitem__, __contains__.
- Iteration: __iter__, __next__.
- Context managers: __enter__, __exit__.
- Comparison and hashing: __eq__, __lt__, __hash__.
This is why you can use len(), for, with, +, etc. on custom objects.
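A sketch of one class plugging into several of these protocols at once (Playlist is a made-up example class):

```python
class Playlist:
    def __init__(self, *songs):
        self._songs = list(songs)
    def __len__(self):             # enables len(p)
        return len(self._songs)
    def __getitem__(self, i):      # enables p[i], iteration, list(p)
        return self._songs[i]
    def __contains__(self, song):  # enables 'song in p'
        return song in self._songs

p = Playlist("Song A", "Song B")
print(len(p))         # 2
print(p[0])           # Song A
print("Song B" in p)  # True
print(list(p))        # ['Song A', 'Song B'] — iteration via __getitem__
```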
How do you optimize Python code?
- Profile first with cProfile or line_profiler before optimizing. Never guess.
- Prefer built-ins: map(), filter(), and itertools are implemented in C.
- Use __slots__ for memory-intensive object creation.
import cProfile
cProfile.run("my_function()") # find real bottlenecks