The Python3 import playground

Overview


I have been confused about Python modules and packages, so this text tries to clear up the topic a bit.

Sources:

https://chrisyeh96.github.io/2017/08/08/definitive-guide-python-imports.html
https://abarker.github.io/understanding_python_imports/ 
https://stackoverflow.com/questions/44834/can-someone-explain-all-in-python
https://docs.python.org/3/reference/import.html
https://stackoverflow.com/questions/43059267/how-to-do-from-module-import-using-importlib

Modules

Each file with extension .py can be used as a module. A module whose source is in the file module_source.py is imported as import module_source. In this example, both the module source file and the importing file are in the same directory.

Let's import the module. (The module source includes a print statement: print("module_foo is being imported").)

>>> print("before: import module_foo")
before: import module_foo
>>> import module_foo
module_foo is being imported
>>> print("after: import module_foo")
after: import module_foo

The imported module is parsed and run during the import statement. Python is a dynamic language, and there is no way to determine the interface of a module without running it. This can have its uses: you can add module-specific initialisations in the global scope of the module, and these are run just once at import time.
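For reference, here is a minimal sketch of what module_foo.py could look like; this is an assumption, the actual example source is linked from the playground and may differ:

# module_foo.py - hypothetical sketch of the module used in the examples
import datetime

print("module_foo is being imported")   # runs once, at import time

class Foo:
    def __init__(self, name):
        self.name = name
        self.created = datetime.datetime.now()

    def __repr__(self):
        return "Foo({!r}, created {})".format(self.name, self.created)

def _internal_print(*args):
    print(*args)

def print_foo(*args):
    _internal_print(*args)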

Often you can see the following lines in some python code:

if __name__ == '__main__':
  run_main_function()

This means that the function run_main_function will be run only when the file is run as a script (meaning it is run as python3 module_file.py). __name__ is a built-in variable that holds the name of the current module; it is set to "__main__" for the file that is directly run by the Python interpreter.
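A minimal sketch of the difference (runnable_module.py is a hypothetical file name):

# runnable_module.py
print("module name is:", __name__)

if __name__ == '__main__':
    print("running as a script")

# python3 runnable_module.py   prints: module name is: __main__  /  running as a script
# import runnable_module       prints: module name is: runnable_module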

What does the module look like on the importing side?

>>> import module_foo
>>>
>>> print(type(module_foo))
<class 'module'>

A variable with the same name as the imported module is defined implicitly by the Python runtime, and it is of type <class 'module'>.

  • Let's use the interface exported by the module; all functions defined in the module are accessed via the module name followed by a dot.
foo = module_foo.Foo("gadget")
print(foo)

module_foo.print_foo("some stuff: ", 42)
  • Let's take a look at the properties of the module_foo variable.
>>> print("module_foo.__dict__ keys: ", ", ".join(module_foo.__dict__.keys()))
module_foo.__dict__ keys:  __name__, __doc__, __package__, __loader__, __spec__, __file__, __cached__, __builtins__, datetime, Foo, print_foo, _internal_print

This print statement shows all keys of the __dict__ attribute of the imported module variable. The __dict__ attribute is a dictionary that maps the names of an object's instance variables to their values.

A more cultured way of accessing this information is the built-in dir function; the documentation says that for a module object the following info is returned: "If the object is a module object, the list contains the names of the module's attributes."
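For example (a short sketch; the exact output depends on the module's contents):

>>> dir(module_foo)
['Foo', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__spec__', '_internal_print', 'datetime', 'print_foo']

Iterating over __dict__ directly also shows the type of each attribute: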

>>> for key, value in module_foo.__dict__.items():
...     print("module_foo.__dict__ key: ", key, "value-type: ", type(value))
...

module_foo.__dict__ key:  datetime value-type:  <class 'module'>
module_foo.__dict__ key:  Foo value-type:  <class 'type'>
module_foo.__dict__ key:  print_foo value-type:  <class 'function'>
module_foo.__dict__ key:  _internal_print value-type:  <class 'function'>

That makes sense: the call module_foo.print_foo("some stuff: ", 42) is just a short form for the regular dictionary lookup module_foo.__dict__['print_foo']("some stuff: ", 42). An imported module is just an instance of a module object, where each exported class or function is a member of that module object!

Interestingly, even names with a leading underscore are visible when importing a module (although pylint gives a warning if you use them, and doing so is regarded as very bad style). Importing from a package does not expose these symbols (unless they are defined in the __init__.py module).

Please note: in this case module_foo also lists all modules imported by the imported module itself (like the datetime module).

Where do we put the module source file?

An imported module must live in a directory listed in sys.path; the current directory is always part of this list. You can add directories to sys.path by setting the PYTHONPATH environment variable before running the Python executable, or by explicitly adding your directory to sys.path before calling import. (See the example source importing the module and the source of the module.)
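A minimal sketch of the second approach (the directory path is a placeholder):

import sys

# make a hypothetical directory visible to the import machinery before importing
sys.path.append("/path/to/my/modules")

import module_foo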

Import renames

There are other forms of import:

import module_foo as mfoo

Here the variable defined by the runtime is renamed to mfoo, and the code that uses the module looks as follows

mfoo.print_foo("some stuff: ", 42)

You will sometimes see the following kind of imports in both modules and packages.

import math as _math
import os as _os

This turns the imported module name into a private symbol, so that the imported modules do not leak into the caller's namespace when your module is itself imported with from module_name import * - this form of import adds all public symbols from the module to the current namespace. See the example.
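A short sketch of the effect (helpers.py is a hypothetical module name):

# helpers.py
import os as _os

def list_dir(path):
    return _os.listdir(path)

# client code: "from helpers import *" brings list_dir into the caller's namespace,
# but not _os, because names with a leading underscore are skipped by the * import.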

Import renames with directories

The import-with-rename feature can be used to access Python files in subdirectories: here module_foo is in the subdirectory module_foo_src, relative to the importing file. See the module source and module usage.

import module_foo_src.module_foo as mfoo

Please note that this form only reaches one directory level beneath a directory listed on the Python import path. The import path includes the directory of the main module.

Importing symbols into the namespace of the caller

You can import symbols selectively into the calling program, as follows:

from  module_foo import print_foo, Foo

print_foo("some stuff: ", 42)

However some say that this kind of import does not make the code more readable. The Google style guide does not recommend this approach.

You can also import all symbols from module_foo right into your own namespace

from module_foo import *

See example module and usage

Now this form of import has an interesting case: if the module source defines a list variable named __all__, this variable lists the symbols exported by the module and limits the set of names that can be imported with the * import. However, this variable is only used for the from module_foo import * form. The example module does not list print_foo in its __all__ variable, yet it is still possible to import it by means of from module_foo import print_foo.

Also, the * import does not import symbols with leading underscores; these are treated as module-private symbols.
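A small sketch of how __all__ affects the * import (export_demo.py is a hypothetical module name):

# export_demo.py
__all__ = ["Foo"]

class Foo:
    pass

def print_foo(*args):
    print(*args)

def _internal_print(*args):
    print(*args)

# from export_demo import *           - only Foo is added to the caller's namespace
# from export_demo import print_foo   - still works, __all__ does not forbid it
# _internal_print stays reachable only as export_demo._internal_print after "import export_demo"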

Lots of details here...

Multiple imports

You can also import several packages in the same import statement; technically you can do:

import os, sys as system, pathlib

However, pylint gives you a warning for multiple imports on the same line, so it is not a good thing to do.

Exceptions that occur during module import

You get an ImportError exception if the Python runtime runs into a problem during import. This can be used to choose between alternative versions of a library.

try:
    import re2 as re
except ImportError:
    import re

This example imports the re2 regular expression engine; if it is not installed, the code falls back to the compatible regular expression module re.

However, when the imported module runs code in its global scope that raises a regular error such as ValueError, then the import statement will raise that ValueError instead of an ImportError.
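A small self-contained sketch of that case (broken_module.py is a hypothetical module created on the fly; this assumes the current directory is on sys.path, as it is for a script):

# create a module whose top-level code raises ValueError
with open("broken_module.py", "w") as f:
    f.write('raise ValueError("bad configuration value")\n')

try:
    import broken_module
except ValueError as err:
    print("import of broken_module failed:", err)   # a ValueError, not an ImportError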

Packages

Here again is an example package: the source of package package_foo and an example using package package_foo.

A directory with an __init__.py file is a Python package. This directory can include more than one Python file; the idea of a package is to treat all the Python files in this directory as a whole.

Once a package is imported, the __init__.py in that directory is implicitly run, in order to determine the interface of that package.

The __init__.py module is run when a package is imported. The namespace of this module is made available to the importer of the package. Technically, importing a package is the same as importing the __init__.py module of that package. It's the same as:

import package_name.__init__  as package_name

Most of the following information will be very familiar from the previous explanation of modules:

An imported package foo must be a subdirectory directly under any one of the directories listed in sys.path; the current directory is always part of that list.

>>> import sys
>>> print(sys.path)
['', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python39.zip', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/lib-dynload', '/Users/michaelmo/Library/Python/3.9/lib/python/site-packages', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages']

The first entry is '', meaning the directory that the script file is in.

You can add directories to sys.path by setting the PYTHONPATH environment variable before running the Python executable, or by explicitly adding your directory to sys.path before calling import.

>>> import sys
>>> print(type(sys))
<class 'module'>

At the importing side, an imported package is represented by a variable of type <class 'module'>; the namespace of that package (including built-in classes and functions) is part of package_name.__dict__.

>>> import sys as system
>>> print(system.path)
['', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python39.zip', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/lib-dynload', '/Users/michaelmo/Library/Python/3.9/lib/python/site-packages', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages']

import sys as system - this construct renames the variable of type <class 'module'>, so that it acts as an alias for the import name.

sys.modules is a global variable: a dictionary that maps the import name to the corresponding variable of type <class 'module'>. It holds all currently imported modules and packages; an import first checks whether a module is already in this dictionary, to avoid loading the same module twice.
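A short sketch of the effect, reusing the hypothetical module_foo from above (the print statement in its global scope runs only once):

import sys

import module_foo                     # first import: "module_foo is being imported" is printed
print("module_foo" in sys.modules)    # True - the module is now cached
import module_foo                     # already in sys.modules: nothing is printed, no code is re-run

The contents of sys.modules can be inspected directly: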

>>> import sys
>>> print(sys.modules.keys())
dict_keys(['sys', 'builtins', '_frozen_importlib', '_imp', '_thread', '_warnings', '_weakref', '_io', 'marshal', 'posix', '_frozen_importlib_external', 'time', 'zipimport', '_codecs', 'codecs', 'encodings.aliases', 'encodings', 'encodings.utf_8', '_signal', 'encodings.latin_1', '_abc', 'abc', 'io', '__main__', '_stat', 'stat', '_collections_abc', 'genericpath', 'posixpath', 'os.path', 'os', '_sitebuiltins', '_locale', '_bootlocale', 'site', 'readline', 'atexit', 'rlcompleter'])

This map also lists all of the built-in modules.

>>> import _frozen_importlib
>>> print(_frozen_importlib.__doc__)
Core implementation of import.

This module is NOT meant to be directly imported! It has been designed such
that it can be bootstrapped into Python as the implementation of import. As
such it requires the injection of specific modules and attributes in order to
work. One should use importlib as the public-facing version of this module.

The __doc__ member of the module variable is the docstring defined for the module. Function objects also have such a member variable.

Writing the __init__.py file

The tricky part in writing a package is the __init__.py file; this file has to import all other files as modules, as follows:

from  .file1 import  *

This is a relative import: it imports the module file1 from file1.py in the current directory, and adds all its symbols to the namespace of the __init__.py file (except for names with a leading underscore, which are treated as package-private names). Having these symbols as part of the __init__.py namespace is the condition for making them available upon import of the package.

A generic __init__.py file

I sometimes forget to include a module from the __init__.py file, so let's make a generic __init__.py file. See the result of this effort in this example; _import_all is a function that imports all modules in the same directory as __init__.py, except for modules with a leading underscore in their name and the __init__.py file itself. First it enumerates all files with extension .py in that directory. Each relevant module is then loaded explicitly via importlib.import_module; this function returns the module variable for the imported module.

Next, the namespace of that module is merged with the current namespace: it does so by enumerating all entries of the module variable's __dict__ member and adding these to the global namespace returned by the globals() built-in function.

The function also builds the __all__ member of the package: if an __all__ global variable has been defined in a module, then its contents are appended to the __all__ list of the __init__.py file.

The _import_all function from this example is a nice generic function; it buys you some convenience at the expense of the time needed to load the modules, but this kind of trade-off is very frequent in computing...
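A minimal sketch of how such a helper could look, placed directly inside __init__.py; the details are assumptions, and the linked example may differ:

# package_foo/__init__.py - hypothetical generic version
import importlib
import pathlib

def _import_all(init_file, package_name, target_globals):
    """Import every module next to __init__.py and merge its names into target_globals."""
    all_names = []
    for path in pathlib.Path(init_file).parent.glob("*.py"):
        if path.name.startswith("_"):
            continue                       # skip __init__.py and private modules
        module = importlib.import_module("." + path.stem, package_name)
        exported = getattr(module, "__all__", None)
        if exported is None:
            exported = [name for name in vars(module) if not name.startswith("_")]
        for name in exported:
            target_globals[name] = getattr(module, name)
        all_names.extend(exported)
    return all_names

__all__ = _import_all(__file__, __name__, globals())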

Packages with sub packages

An example of a package with sub-packages: see the package source and package usage.

├── package_foo
│   ├── __init__.py
│   ├── sub_package_one
│   │   ├── __init__.py
│   │   └── file1.py
│   └── sub_package_two
│       ├── __init__.py
│       └── file2.py
├── use_foo.py
└── use_module_import.py

Here the __init__.py file of the main package needs to import the sub-packages into its namespace. It is not possible to import a sub-package by its bare name: you can only import package directories that are directly under one of the directories in the module search path (which includes the current directory).

from  .sub_package_one import  *
from  .sub_package_two import  *

The curious case of the empty __init__.py file

Sometimes there is an empty __init__.py file in the package_foo directory. That enables us to directly import the sub-package files:

import package_foo.sub_package_one as sub_package_one

foo = sub_package_one.Foo("gadget")

The idea is that package_foo needs an __init__.py file in order to count as a package; without such a file, a package import would fail prior to Python version 3.3, and you could not resolve the import path package_foo.sub_package_one for this reason. This problem is then solved with an empty __init__.py file in the package_foo directory. However, this changed with Python 3.3: for later versions you no longer need the empty __init__.py file, and an import of a directory without __init__.py does not fail.

Conclusion

I hope that this text has cleared up the topic of the Python import system. Python is a relatively simple language, however there are a lot of usage patterns that one has to get used to. These are not always obvious from the Python documentation.
