Imagine you've written a script that does something awesome, and like a good engineer, you've used or created a bunch of modules to help you do it. Now you want to run that script on some other machine. How do you distribute the script and all the modules? I think the easiest answer is to package them into a single file. It turns out that Python supports executing .zip archives, so I wrote a program that takes a Python executable script and outputs an executable .zip that contains all the modules it depends on, excluding any "system" modules. The result: copy one file and run it with ./some_package_name.
How does this work? If the first argument to the Python interpreter is a .zip file, it looks inside it for __main__.py. If found, it executes it, with the contents of the .zip added to the module search path. To make the .zip directly executable, you just need to "cheat" by prepending the standard Unix script header, "#!/usr/bin/python\n", to the front of the archive. Most .zip decoders are smart enough to skip the extra bytes and only start decoding at the .zip data itself, so the archive still works. My script automates creating this kind of .zip.
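To see the trick in isolation, here is a minimal sketch of the mechanism (not my packaging script); the names app.zip, app, and the printed message are placeholders I made up for the example:

import os
import stat
import zipfile

# Build an archive whose entry point is __main__.py.
with zipfile.ZipFile('app.zip', 'w') as zf:
    zf.writestr('__main__.py', 'print("hello from inside the zip")\n')

# Prepend the shebang so the shell hands the file to the Python interpreter.
with open('app.zip', 'rb') as f:
    data = f.read()
with open('app', 'wb') as f:
    f.write(b'#!/usr/bin/python\n' + data)

# Mark it executable: ./app now runs __main__.py from inside the archive.
os.chmod('app', os.stat('app').st_mode | stat.S_IEXEC)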
Usage:
./packagepy (executable script) (output package)
Code: packagepy.py
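If you just want the flavor of the approach without reading the real code, here is a simplified sketch of the idea: use the standard modulefinder module to walk the script's imports, drop anything that looks like a "system" module (here a crude check against sys.prefix, which is not necessarily how packagepy.py decides), and zip the rest up behind a shebang line. It is only meant to illustrate the shape of the solution, not to be a drop-in replacement.

import modulefinder
import os
import stat
import sys
import zipfile

def package(script, output):
    # Find everything the script imports, directly or indirectly.
    finder = modulefinder.ModuleFinder()
    finder.run_script(script)

    with zipfile.ZipFile(output, 'w') as zf:
        # The script itself becomes __main__.py inside the archive.
        zf.write(script, '__main__.py')
        for name, module in finder.modules.items():
            path = getattr(module, '__file__', None)
            if path is None or name == '__main__':
                continue  # built-in module, or the script itself
            if path.startswith(sys.prefix):
                continue  # crude "system module" check; assume it exists on the target
            # Store foo.bar as foo/bar.py (or foo/__init__.py for packages)
            # so imports resolve inside the zip.
            arcname = name.replace('.', '/')
            if getattr(module, '__path__', None):
                arcname += '/__init__.py'
            else:
                arcname += '.py'
            zf.write(path, arcname)

    # Prepend the shebang and mark the result executable.
    with open(output, 'rb') as f:
        data = f.read()
    with open(output, 'wb') as f:
        f.write(b'#!/usr/bin/python\n' + data)
    os.chmod(output, os.stat(output).st_mode | stat.S_IEXEC)

if __name__ == '__main__':
    package(sys.argv[1], sys.argv[2])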
This has only been lightly tested, so it is likely to break. If you find problems, please email me (my address is on my home page) and I'll see if I can figure it out. There are probably better ways of solving this problem. For example, distutils has some way of doing this, but it seems to require a lot of setup. For Windows, py2exe can package Python and all required modules into a single Windows EXE. Finally, CDE is a tool that packages all the code and data needed to run any command. It solves the same problem for arbitrary binaries, which seems like it could be extremely useful. The USENIX 2011 paper is worth skimming through, if you are into that sort of thing.