On this page I will be going over some interesting Python concepts for more advanced tasks. These notes are based on the article '5 must-know Python concepts for experts' by Vivek Shrivastava (link below).
Context Managers
Context managers are handy when you are reading from and writing to files. They allow the programmer to open and close a file or connection object using the with and as keywords.
The object is automatically taken care of after execution has finished, making sure the connection or file object is safely released or closed afterwards. The object is safely closed even when an error occurs while executing the processing logic within the block.
So, mostly it replaces this code:
someFile = open('some_file', 'w')
try:
    someFile.write('Hello World!')
finally:
    someFile.close()
with this code:
with open('some_file', 'w') as someFile:
    someFile.write('Hello World!')
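You can also write your own context manager by defining __enter__ and __exit__ on a class. As a minimal sketch (the Timer class here is my own invention, not from the article):

```python
import time

class Timer:
    """A minimal hand-rolled context manager (hypothetical example)."""

    def __enter__(self):
        # Called when the 'with' block starts
        self.start = time.time()
        return self  # this is what gets bound after 'as'

    def __exit__(self, exc_type, exc_value, traceback):
        # Called when the block exits, even if an exception was raised
        self.elapsed = time.time() - self.start
        return False  # don't suppress exceptions from the block

with Timer() as t:
    total = sum(range(1000))

print('Block took', t.elapsed, 'seconds')
```

The standard library's contextlib module offers a shorter, generator-based way to do the same thing.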
Implicit Tuple Unpacking
Python allows for multiple assignments, meaning something like this is possible:
x, y = 10, 20
Here, Python implicitly creates a tuple (10, 20) out of the supplied values and iterates over the variables on the left for individual assignment. The creation of a temporary tuple also means that a copy of the supplied values is used, and if the right-hand values are variables (e.g. x, y = amount, name), it behaves like a pass-by-value technique. This means you can do the following without creating a race condition:
x, y = y, x
This can be used to swap two variables without needing a third, temporary variable.
But also this functionality can be extended over different data types. Doing this:
x, y = 'OK'
results in x = 'O' and y = 'K'. Python also allows you to return multiple values from a function without needing to define a 'structure', populate all of its fields, and return that object. You can simply do this:
def function():
    # Some processing
    return name, salary, employeeID

x, y, z = function()
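As a concrete, runnable sketch of the same idea (the employee values here are made up for illustration):

```python
def getEmployee():
    # In a real program these values would come from a database or file
    name, salary, employeeID = 'Alice', 10000, 'EMP001'
    return name, salary, employeeID  # packed into a single tuple

# The returned tuple is unpacked back into three variables
x, y, z = getEmployee()
print(x, y, z)  # → Alice 10000 EMP001
```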
Magic Methods
Magic methods are functions that are implicitly invoked when certain operations occur on an object of a particular class.
They are surrounded by double-underscores and each can be defined while creating your own class to easily impart certain properties to it.
Magic methods let the programmer define what happens when some of the common operators and functions are used on the object. For example:
class Employee:
    def __init__(self, name, ID, salary):
        self.empName = name
        self.empID = ID
        self.empSalary = salary

    def __add__(self, secondObject):
        return self.empSalary + secondObject.empSalary

objBob = Employee('Bob', 'EMP002', 5000)
objAlice = Employee('Alice', 'EMP001', 10000)
print('Sum: ', objAlice + objBob)

>>> Sum: 15000
Without the __add__() method definition, the interpreter won't know what to do when two objects of the class are added together.
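Other magic methods work the same way. For example, defining __str__ controls what print() shows for an object. A small sketch reusing the Employee idea from above:

```python
class Employee:
    def __init__(self, name, ID, salary):
        self.empName = name
        self.empID = ID
        self.empSalary = salary

    def __str__(self):
        # Invoked implicitly by print() and str()
        return f'{self.empName} ({self.empID})'

objBob = Employee('Bob', 'EMP002', 5000)
print(objBob)  # → Bob (EMP002)
```

Without __str__, print() would fall back to the default representation, something like <__main__.Employee object at 0x...>.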
Generators
Generators are lazy iterators that produce each element only when it is used.
# Normal function
def getList(limit):
    listVal = list()
    for i in range(limit):
        listVal.append(i)
    return listVal

# Generator function
def genList(limit):
    for i in range(limit):
        yield i
Because a generator only produces each element when it is its turn to be processed, and not before, it starts delivering results sooner and puts far less load on memory.
Another big advantage of generators is that at the yield statement, control is passed back from the function to the calling program, and the state of the local variables is remembered for the next iteration. This means that if you need to, say, conditionally look for prime numbers in the generated stream and stop processing when 3 consecutive non-primes are detected, you don't need to load a very large list of 1000 numbers from a file first. Using generators you can load the elements one by one, process them until 3 consecutive non-primes appear, and then terminate the program.
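That scenario can be sketched roughly like this. The isPrime helper and the stop-after-3 logic are my own illustration, not code from the article:

```python
def isPrime(n):
    if n < 2:
        return False
    # Trial division up to the square root is enough
    return all(n % d != 0 for d in range(2, int(n ** 0.5) + 1))

def numberStream(limit):
    # Generator: yields numbers one at a time instead of building a list
    for i in range(2, limit):
        yield i

primesFound = []
nonPrimeRun = 0
for n in numberStream(1000):
    if isPrime(n):
        primesFound.append(n)
        nonPrimeRun = 0  # reset the streak on every prime
    else:
        nonPrimeRun += 1
        if nonPrimeRun == 3:
            break  # stop early; the rest of the stream is never generated

print(primesFound)  # primes seen before 3 consecutive non-primes
```

Here the loop stops at 10 (after the non-primes 8, 9, 10), so only a handful of numbers out of the 1000 are ever generated.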
Decorators
In Python, functions are objects, meaning they can be passed as arguments and returned from other functions. Decorators take advantage of this and provide a way to wrap one function inside another to add extra functionality without changing the behaviour of the original function.
Let's say, for example, you want to measure the time taken by several different operations. The common way to do this would be to use the time module:
import time

if __name__ == '__main__':
    startTime = time.time()
    loadLargeFile('abc.txt')
    endTime = time.time()
    print('Time: ', endTime - startTime, ' seconds')
This is fine, but what if we want to do this for multiple different methods? And what if, later on, you no longer wanted to analyse the time taken? That would require adding and then removing a lot of lines of code. This is where we can use decorators:
import time

def timeIt(func):
    def wrapper(*args, **kwargs):
        startTime = time.time()
        func(*args, **kwargs)
        endTime = time.time()
        print('Time: ', endTime - startTime, ' seconds')
    return wrapper

@timeIt
def loadLargeFile(filename):
    print('Loading file: ', filename)
    time.sleep(2)

@timeIt
def makeAPICall():
    print('Making an API call and waiting for the response...')
    time.sleep(1.5)

@timeIt
def generateSummaryReport():
    print('Generating summary report...')
    time.sleep(5)

if __name__ == '__main__':
    loadLargeFile('abc.txt')
    makeAPICall()
    generateSummaryReport()
Just by adding @timeIt before the defined functions we wrap each function inside the timeIt() wrapper. This is equivalent to doing:

loadLargeFile = timeIt(loadLargeFile)
makeAPICall = timeIt(makeAPICall)
generateSummaryReport = timeIt(generateSummaryReport)
And when we no longer need to wrap the functions, we can just go to their definitions and remove the preceding @timeIt to reverse the 'decoration'.
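One caveat worth knowing: a plain wrapper like the one above hides the original function's name and docstring, and it also discards the wrapped function's return value. A common fix, assuming you want both preserved, is to use functools.wraps and return the result from the wrapper:

```python
import time
import functools

def timeIt(func):
    @functools.wraps(func)  # keeps func.__name__ and __doc__ intact
    def wrapper(*args, **kwargs):
        startTime = time.time()
        result = func(*args, **kwargs)  # capture the return value
        print('Time: ', time.time() - startTime, ' seconds')
        return result  # pass it back to the caller
    return wrapper

@timeIt
def addNumbers(a, b):
    """Add two numbers."""
    return a + b

print(addNumbers(2, 3))          # → 5
print(addNumbers.__name__)       # → addNumbers, not 'wrapper'
```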