What is the advantage of setting zip_safe to True when packaging a Python project?

Tags: Python · Packaging · Setuptools

Python Problem Overview


The setuptools documentation only states:

> For maximum performance, Python packages are best installed as zip files. Not all packages, however, are capable of running in compressed form, because they may expect to be able to access either source code or data files as normal operating system files. So, setuptools can install your project as a zipfile or a directory, and its default choice is determined by the project's zip_safe flag ([reference][1]).

In practical terms, what is the performance benefit gained? Is it worth investigating whether my projects are zip-safe, or are the benefits generally minimal?

[1]: http://peak.telecommunity.com/DevCenter/setuptools#setting-the-zip-safe-flag "Setting the zip-safe flag"
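For context, the flag is passed as a keyword argument to `setup()`. A minimal sketch (the package name `example_pkg` is a placeholder):

```python
# setup.py -- minimal sketch; "example_pkg" is a hypothetical project name.
from setuptools import setup, find_packages

setup(
    name="example_pkg",
    version="0.1",
    packages=find_packages(),
    zip_safe=True,  # tell setuptools the project may be installed as a zipped .egg
)
```

If `zip_safe` is omitted, setuptools tries to guess by scanning the project for telltale signs (such as use of `__file__`) that it needs real files on disk.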

Python Solutions


Solution 1 - Python

Zip files take up less space on disk, which also means they're more quickly read from disk. Since imports are typically I/O bound, the overhead of decompressing the package may be less than the overhead of reading a larger file from disk. Moreover, a single, small-ish zip file is likely stored sequentially on disk, while a collection of smaller files may be more spread out. On rotational media, this also improves read performance by cutting down the number of seeks. So you generally optimize your disk usage at the cost of some CPU time, which may dramatically improve your import and load times.
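The mechanism behind this is Python's built-in `zipimport` support: any zip archive on `sys.path` is treated like a directory of modules. A self-contained sketch (module name `greet` is made up for illustration):

```python
import os
import sys
import tempfile
import zipfile

# Pack a small module into a compressed zip archive.
tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "pkg.zip")
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("greet.py", "def hello():\n    return 'hello from a zip'\n")

# Adding the archive to sys.path makes its modules importable:
# one open() and one sequential read replace per-file lookups.
sys.path.insert(0, zip_path)
import greet

print(greet.hello())  # hello from a zip
```

The whole archive's table of contents is read once, so importing many modules from one zip avoids a separate `stat`/`open` per source file.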

Solution 2 - Python

There are several advantages, in addition to the ones already mentioned.

Reading a single large .egg file (and unzipping it) may be significantly faster than loading multiple (potentially many) smaller .py files, depending on the storage medium/filesystem on which it resides.

Some filesystems have a large block size (e.g., 1MB), which means that dealing with small files can be expensive. Even though your files are small (say, 10KB), you may actually be loading a 1MB block from disk when reading one. Typically, filesystems combine multiple small files into a large block to mitigate this a bit.

On filesystems where access to file metadata is slow (which sometimes happens with shared filesystems, like NFS), accessing a large number of files may be very expensive too.

Of course, zipping the whole bunch also helps, since that means that less data will have to be read in total.

Long story short: it may matter a lot if your filesystem is better suited to a small number of large files than to many small ones.
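The catch, as the setuptools documentation notes, is that not every package works from inside an archive: code that builds paths from `__file__` and opens them directly will break, while loader-aware APIs such as `pkgutil.get_data` keep working. A sketch demonstrating both behaviors (the `demo` package and its data file are invented for this example):

```python
import os
import pkgutil
import sys
import tempfile
import zipfile

# Build a zipped package containing a data file (hypothetical "demo" package).
tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "demo.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("demo/__init__.py", "")
    zf.writestr("demo/data.txt", "hello")

sys.path.insert(0, zip_path)
import demo

# pkgutil.get_data asks the package's loader (here, zipimport) for the
# resource, so it works even from inside the archive:
print(pkgutil.get_data("demo", "data.txt"))  # b'hello'

# A naive open() built from __file__ fails, because the path points
# *inside* the zip and the OS cannot traverse it:
try:
    open(os.path.join(os.path.dirname(demo.__file__), "data.txt"))
except OSError as exc:
    print("direct open failed:", type(exc).__name__)
```

A package that only ever reads its resources through loader-aware APIs is a good candidate for `zip_safe=True`; one that assumes real files on disk is not.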

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
| --- | --- | --- |
| Question | saffsd | View Question on Stackoverflow |
| Solution 1 - Python | Livius | View Answer on Stackoverflow |
| Solution 2 - Python | Kenneth Hoste | View Answer on Stackoverflow |