How do I call an external command within Python as if I had typed it in a shell or command prompt?
Use `subprocess.run`:

```python
import subprocess
subprocess.run(["ls", "-l"])
```

Another common way is `os.system`, but you shouldn't use it because it is unsafe if any part of the command comes from outside your program or can contain spaces or other special characters. Also, `subprocess.run` is generally more flexible (you can get the stdout, stderr, the "real" status code, better error handling, etc.). Even the documentation for `os.system` recommends using `subprocess` instead.
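As a hedged sketch of that flexibility (assuming Python 3.7+ for `capture_output` and a system where `echo` exists), here is how you can capture the output and status code instead of letting them go to the terminal:

```python
import subprocess

# Run a command and capture its stdout/stderr as text
result = subprocess.run(["echo", "hello"], capture_output=True, text=True)

status = result.returncode   # 0 on success
output = result.stdout       # "hello\n"
```

The same object also carries `result.stderr`, so error output can be inspected separately.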
On Python 3.4 and earlier, use `subprocess.call` instead of `.run`:

```python
subprocess.call(["ls", "-l"])
```
Here is a summary of ways to call external programs, including their advantages and disadvantages:
`os.system` passes the command and arguments to your system's shell. This is nice because you can actually run multiple commands at once in this manner and set up pipes and input/output redirection. For example:

```python
os.system("some_command < input_file | another_command > output_file")
```

However, while this is convenient, you have to manually handle the escaping of shell characters such as spaces, et cetera. On the other hand, this also lets you run commands which are simply shell commands and not actually external programs.
`os.popen` will do the same thing as `os.system` except that it gives you a file-like object that you can use to access standard input/output for that process. There are 3 other variants of popen that all handle the i/o slightly differently. If you pass everything as a string, then your command is passed to the shell; if you pass them as a list then you don't need to worry about escaping anything. Example:

```python
print(os.popen("ls -l").read())
```

`subprocess.Popen`. This is intended as a replacement for `os.popen`, but has the downside of being slightly more complicated by virtue of being so comprehensive. For example, you'd say:

```python
print(subprocess.Popen("echo Hello World", shell=True, stdout=subprocess.PIPE).stdout.read())
```

instead of

```python
print(os.popen("echo Hello World").read())
```

but it is nice to have all of the options there in one unified class instead of 4 different popen functions. See the documentation.
`subprocess.call`. This is basically just like the `Popen` class and takes all of the same arguments, but it simply waits until the command completes and gives you the return code. For example:

```python
return_code = subprocess.call("echo Hello World", shell=True)
```

`subprocess.run`. Python 3.5+ only. Similar to the above but even more flexible and returns a `CompletedProcess` object when the command finishes executing.

`os.fork`, `os.exec`, `os.spawn` are similar to their C language counterparts, but I don't recommend using them directly.
The `subprocess` module should probably be what you use.
Finally, please be aware that for all methods where you pass the final command to be executed by the shell as a string, you are responsible for escaping it. There are serious security implications if any part of the string that you pass cannot be fully trusted. For example, if a user is entering some/any part of the string. If you are unsure, only use these methods with constants. To give you a hint of the implications, consider this code:
```python
print(subprocess.Popen("echo %s" % user_input, stdout=PIPE).stdout.read())
```

and imagine that the user enters something like "my mama didnt love me && rm -rf /" which could erase the whole filesystem.
Typical implementation:
```python
import subprocess

p = subprocess.Popen('ls', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in p.stdout.readlines():
    print(line)
retval = p.wait()
```

You are free to do what you want with the stdout data in the pipe. In fact, you can simply omit those parameters (`stdout=` and `stderr=`) and it'll behave like `os.system()`.
Some hints on detaching the child process from the calling one (starting the child process in background).
Suppose you want to start a long task from a CGI script. That is, the child process should live longer than the CGI script execution process.
The classical example from the subprocess module documentation is:
```python
import subprocess
import sys

# Some code here

pid = subprocess.Popen([sys.executable, "longtask.py"])  # Call subprocess

# Some more code here
```

The idea here is that you do not want to wait in the line 'call subprocess' until the longtask.py is finished. But it is not clear what happens after the line 'some more code here' from the example.
My target platform was FreeBSD, but the development was on Windows, so I faced the problem on Windows first.
On Windows (Windows XP), the parent process will not finish until the longtask.py has finished its work. It is not what you want in a CGI script. The problem is not specific to Python; in the PHP community the problems are the same.
The solution is to pass the DETACHED_PROCESS Process Creation Flag to the underlying CreateProcess function in the Windows API. If you happen to have pywin32 installed, you can import the flag from the win32process module; otherwise you should define it yourself:
```python
DETACHED_PROCESS = 0x00000008
pid = subprocess.Popen([sys.executable, "longtask.py"],
                       creationflags=DETACHED_PROCESS).pid
```

/* UPD 2015.10.27: @eryksun notes in a comment below that the semantically correct flag is CREATE_NEW_CONSOLE (0x00000010) */
On FreeBSD we have another problem: when the parent process is finished, it finishes the child processes as well. And that is not what you want in a CGI script either. Some experiments showed that the problem seemed to be in sharing sys.stdout. And the working solution was the following:
```python
pid = subprocess.Popen([sys.executable, "longtask.py"],
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE,
                       stdin=subprocess.PIPE)
```

I have not checked the code on other platforms and do not know the reasons for the behaviour on FreeBSD. If anyone knows, please share your ideas. Googling on starting background processes in Python does not shed any light yet.
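On current Python versions a related sketch (POSIX-only because of `start_new_session=True`, and using an inline `-c` script in place of longtask.py) detaches the child from both the parent's session and its standard streams:

```python
import subprocess
import sys

# start_new_session=True calls setsid() in the child (POSIX only),
# and DEVNULL keeps it from sharing the parent's streams.
proc = subprocess.Popen(
    [sys.executable, "-c", "print('long task running')"],
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    start_new_session=True,
)
rc = proc.wait()  # a real CGI script would not wait; shown only for completeness
```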
```python
import os
os.system("your command")
```

Note that this is dangerous, since the command isn't cleaned. See the documentation of the `os` and `sys` modules. There are a bunch of functions (`exec*` and `spawn*`) that will do similar things.
I'd recommend using the `subprocess` module instead of `os.system` because it does shell escaping for you and is therefore much safer.
```python
subprocess.call(['ping', 'localhost'])
```
```python
import os
cmd = 'ls -al'
os.system(cmd)
```

If you want to return the results of the command, you can use `os.popen`. However, this is deprecated since version 2.6 in favor of the `subprocess` module, which other answers have covered well.
There are lots of different libraries which allow you to call external commands with Python. For each library I've given a description and shown an example of calling an external command. The command I used as the example is `ls -l` (list all files). If you want to find out more about any of the libraries, I've listed and linked the documentation for each of them.
Sources
- subprocess: https://docs.python.org/3.5/library/subprocess.html
- shlex: https://docs.python.org/3/library/shlex.html
- os: https://docs.python.org/3.5/library/os.html
- sh: https://amoffat.github.io/sh/
- plumbum: https://plumbum.readthedocs.io/en/latest/
- pexpect: https://pexpect.readthedocs.io/en/stable/
- fabric: http://www.fabfile.org/
- envoy: https://github.com/kennethreitz/envoy
- commands: https://docs.python.org/2/library/commands.html
These are all the libraries covered in this answer.
Hopefully this will help you make a decision on which library to use :)
subprocess
Subprocess allows you to call external commands and connect them to their input/output/error pipes (stdin, stdout, and stderr). Subprocess is the default choice for running commands, but sometimes other modules are better.
```python
subprocess.run(["ls", "-l"])  # Run command
subprocess.run(["ls", "-l"], stdout=subprocess.PIPE)  # This will run the command and return any output
subprocess.run(shlex.split("ls -l"))  # You can also use the shlex library to split the command
```

os
os is used for "operating system dependent functionality". It can also be used to call external commands with `os.system` and `os.popen` (Note: There is also a subprocess.popen). os will always run the shell and is a simple alternative for people who don't need to, or don't know how to use, `subprocess.run`.
```python
os.system("ls -l")  # Run command
os.popen("ls -l").read()  # This will run the command and return any output
```

sh
sh is a subprocess interface which lets you call programs as if they were functions. This is useful if you want to run a command multiple times.
```python
sh.ls("-l")  # Run command normally
ls_cmd = sh.Command("ls")  # Save command as a variable
ls_cmd()  # Run command as if it were a function
```

plumbum
plumbum is a library for "script-like" Python programs. You can call programs like functions as in `sh`. Plumbum is useful if you want to run a pipeline without the shell.
```python
ls_cmd = plumbum.local["ls"]["-l"]  # Get command
ls_cmd()  # Run command
```

pexpect
pexpect lets you spawn child applications, control them and find patterns in their output. This is a better alternative to subprocess for commands that expect a tty on Unix.
```python
pexpect.run("ls -l")  # Run command as normal
child = pexpect.spawn('scp foo [email protected]:.')  # Spawns child application
child.expect('Password:')  # When this is the output
child.sendline('mypassword')
```

fabric
fabric is a Python 2.5 and 2.7 library. It allows you to execute local and remote shell commands. Fabric is a simple alternative for running commands in a secure shell (SSH).
```python
fabric.operations.local('ls -l')  # Run command as normal
fabric.operations.local('ls -l', capture=True)  # Run command and receive output
```

envoy
envoy is known as "subprocess for humans". It is used as a convenience wrapper around the `subprocess` module.
```python
r = envoy.run("ls -l")  # Run command
r.std_out  # Get output
```

commands
commands contains wrapper functions for `os.popen`, but it has been removed from Python 3 since `subprocess` is a better alternative.
With the standard library
Use the `subprocess` module (Python 3):

```python
import subprocess
subprocess.run(['ls', '-l'])
```

It is the recommended standard way. However, more complicated tasks (pipes, output, input, etc.) can be tedious to construct and write.
Note on Python version: If you are still using Python 2, `subprocess.call` works in a similar way.
ProTip: `shlex.split` can help you to parse the command for `run`, `call`, and other `subprocess` functions in case you don't want (or you can't!) provide them in the form of lists:

```python
import shlex
import subprocess
subprocess.run(shlex.split('ls -l'))
```

With external dependencies
If you do not mind external dependencies, use `plumbum`:

```python
from plumbum.cmd import ifconfig
print(ifconfig['wlan0']())
```

It is the best `subprocess` wrapper. It's cross-platform, i.e. it works on both Windows and Unix-like systems. Install by `pip install plumbum`.
Another popular library is `sh`:

```python
from sh import ifconfig
print(ifconfig('wlan0'))
```

However, `sh` dropped Windows support, so it's not as awesome as it used to be. Install by `pip install sh`.
I always use `fabric` for doing these things. Here is a demo code:

```python
from fabric.operations import local

result = local('ls', capture=True)
print("Content:\n%s" % (result,))
```

But this seems to be a good tool: `sh` (Python subprocess interface).
Look at an example:
```python
from sh import vgdisplay

print(vgdisplay())
print(vgdisplay('-v'))
print(vgdisplay(v=True))
```
Check the "pexpect" Python library, too.
It allows for interactive controlling of external programs/commands, even ssh, ftp, telnet, etc. You can just type something like:
```python
child = pexpect.spawn('ftp 192.168.0.24')
child.expect('(?i)name .*: ')
child.sendline('anonymous')
child.expect('(?i)password')
```
If you need the output from the command you are calling, then you can use `subprocess.check_output` (Python 2.7+).

```python
>>> subprocess.check_output(["ls", "-l", "/dev/null"])
'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
```

Also note the `shell` parameter.
If shell is True, the specified command will be executed through the shell. This can be useful if you are using Python primarily for the enhanced control flow it offers over most system shells and still want convenient access to other shell features such as shell pipes, filename wildcards, environment variable expansion, and expansion of ~ to a user's home directory. However, note that Python itself offers implementations of many shell-like features (in particular, `glob`, `fnmatch`, `os.walk()`, `os.path.expandvars()`, `os.path.expanduser()`, and `shutil`).
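For example, a shell wildcard can be replaced by `glob` plus list arguments. A sketch, assuming a POSIX `ls` and using a throwaway temporary directory:

```python
import glob
import os
import subprocess
import tempfile

# Make a scratch directory with a couple of files to match
tmp = tempfile.mkdtemp()
for name in ("a.py", "b.py", "notes.txt"):
    open(os.path.join(tmp, name), "w").close()

# Expand "*.py" in Python instead of handing it to a shell
files = sorted(glob.glob(os.path.join(tmp, "*.py")))
result = subprocess.run(["ls", "-l"] + files, capture_output=True, text=True)
```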
Update:
`subprocess.run` is the recommended approach as of Python 3.5 if your code does not need to maintain compatibility with earlier Python versions. It's more consistent and offers similar ease-of-use as Envoy. (Piping isn't as straightforward though. See this question for how.)
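One way to emulate a shell pipe without the shell (a sketch; the first command uses the interpreter itself, so only the POSIX `sort` utility is assumed) is to feed one `run`'s captured stdout into the next `run`'s stdin:

```python
import subprocess
import sys

# Equivalent of: python -c "..." | sort
first = subprocess.run(
    [sys.executable, "-c", "print('b'); print('a')"],
    capture_output=True,
)
second = subprocess.run(["sort"], input=first.stdout, capture_output=True)
print(second.stdout)  # b'a\nb\n'
```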
Here are some examples from the documentation.
Run a process:
```python
>>> subprocess.run(["ls", "-l"])  # Doesn't capture output
CompletedProcess(args=['ls', '-l'], returncode=0)
```

Raise on failed run:
```python
>>> subprocess.run("exit 1", shell=True, check=True)
Traceback (most recent call last):
  ...
subprocess.CalledProcessError: Command 'exit 1' returned non-zero exit status 1
```

Capture output:
```python
>>> subprocess.run(["ls", "-l", "/dev/null"], stdout=subprocess.PIPE)
CompletedProcess(args=['ls', '-l', '/dev/null'], returncode=0,
stdout=b'crw-rw-rw- 1 root root 1, 3 Jan 23 16:23 /dev/null\n')
```

Original answer:
I recommend trying Envoy. It's a wrapper for subprocess, which in turn aims to replace the older modules and functions. Envoy is subprocess for humans.
Example usage from the README:
```python
>>> r = envoy.run('git config', data='data to pipe in', timeout=2)
>>> r.status_code
129
>>> r.std_out
'usage: git config [options]'
>>> r.std_err
''
```

Pipe stuff around too:
```python
>>> r = envoy.run('uptime | pbcopy')
>>> r.command
'pbcopy'
>>> r.status_code
0
>>> r.history
[<Response 'uptime'>]
```
This is how I run my commands. This code has pretty much everything you need:

```python
from subprocess import Popen, PIPE

cmd = "ls -l ~/"
p = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE)
out, err = p.communicate()
print("Return code:", p.returncode)
print(out.rstrip(), err.rstrip())
```
How to execute a program or call a system command from Python
Simple, use `subprocess.run`, which returns a `CompletedProcess` object:

```python
>>> from subprocess import run
>>> from shlex import split
>>> completed_process = run(split('python --version'))
Python 3.8.8
>>> completed_process
CompletedProcess(args=['python', '--version'], returncode=0)
```

(`run` wants a list of lexically parsed shell arguments - this is what you'd type in a shell, separated by spaces, but not where the spaces are quoted - so use a specialized function, `split`, to split up what you would literally type into your shell)
Why?
As of Python 3.5, the documentation recommends `subprocess.run`:
The recommended approach to invoking subprocesses is to use the run() function for all use cases it can handle. For more advanced use cases, the underlying Popen interface can be used directly.
Here's an example of the simplest possible usage - and it does exactly as asked:
```python
>>> from subprocess import run
>>> from shlex import split
>>> completed_process = run(split('python --version'))
Python 3.8.8
>>> completed_process
CompletedProcess(args=['python', '--version'], returncode=0)
```

`run` waits for the command to successfully finish, then returns a `CompletedProcess` object. It may instead raise `TimeoutExpired` (if you give it a `timeout=` argument) or `CalledProcessError` (if it fails and you pass `check=True`).
As you might infer from the above example, stdout and stderr both get piped to your own stdout and stderr by default.
We can inspect the returned object and see the command that was given and the returncode:
```python
>>> completed_process.args
['python', '--version']
>>> completed_process.returncode
0
```

Capturing output
If you want to capture the output, you can pass `subprocess.PIPE` to the appropriate `stderr` or `stdout`:
```python
>>> from subprocess import PIPE
>>> completed_process = run(shlex.split('python --version'), stdout=PIPE, stderr=PIPE)
>>> completed_process.stdout
b'Python 3.8.8\n'
>>> completed_process.stderr
b''
```

And those respective attributes return bytes.
Pass a command list
One might easily move from manually providing a command string (like the question suggests) to providing a string built programmatically. Don't build strings programmatically. This is a potential security issue. It's better to assume you don't trust the input.
```python
>>> import textwrap
>>> args = ['python', textwrap.__file__]
>>> cp = run(args, stdout=subprocess.PIPE)
>>> cp.stdout
b'Hello there.\n  This is indented.\n'
```

Note, only `args` should be passed positionally.
Full Signature
Here's the actual signature in the source and as shown by `help(run)`:

```python
def run(*popenargs, input=None, timeout=None, check=False, **kwargs):
```
The `popenargs` and `kwargs` are given to the `Popen` constructor. `input` can be a string of bytes (or unicode, if you specify an encoding or `universal_newlines=True`) that will be piped to the subprocess's stdin.
The documentation describes `timeout=` and `check=True` better than I could:
The timeout argument is passed to Popen.communicate(). If the timeout expires, the child process will be killed and waited for. The TimeoutExpired exception will be re-raised after the child process has terminated.

If check is true, and the process exits with a non-zero exit code, a CalledProcessError exception will be raised. Attributes of that exception hold the arguments, the exit code, and stdout and stderr if they were captured.
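A minimal sketch of the timeout behavior, using `sleep` (so it assumes a POSIX system):

```python
import subprocess

timed_out = False
try:
    # The child would run for 5 seconds, but we only allow 1
    subprocess.run(["sleep", "5"], timeout=1)
except subprocess.TimeoutExpired:
    timed_out = True  # the child was killed and waited for
```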
and this example for `check=True` is better than one I could come up with:

```python
>>> subprocess.run("exit 1", shell=True, check=True)
Traceback (most recent call last):
  ...
subprocess.CalledProcessError: Command 'exit 1' returned non-zero exit status 1
```
Expanded Signature
Here's an expanded signature, as given in the documentation:
```python
subprocess.run(args, *, stdin=None, input=None, stdout=None, stderr=None,
               shell=False, cwd=None, timeout=None, check=False,
               encoding=None, errors=None)
```
Note that this indicates that only the args list should be passed positionally. So pass the remaining arguments as keyword arguments.
Popen
When should you use `Popen` instead? I would struggle to find a use case based on the arguments alone. Direct usage of `Popen` would, however, give you access to its methods, including `poll`, `send_signal`, `terminate`, and `wait`.
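Those methods can be sketched like this (the short-lived child is only for illustration and assumes a POSIX `sleep`):

```python
import subprocess

proc = subprocess.Popen(["sleep", "0.2"])

# poll() returns None while the child is still running,
# then its exit code once it has finished.
early = proc.poll()

# wait() blocks until the child exits and returns the exit code.
rc = proc.wait()
```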
Here's the `Popen` signature as given in the source. I think this is the most precise encapsulation of the information (as opposed to `help(Popen)`):

```python
def __init__(self, args, bufsize=-1, executable=None,
             stdin=None, stdout=None, stderr=None,
             preexec_fn=None, close_fds=True,
             shell=False, cwd=None, env=None, universal_newlines=None,
             startupinfo=None, creationflags=0,
             restore_signals=True, start_new_session=False,
             pass_fds=(), *, user=None, group=None, extra_groups=None,
             encoding=None, errors=None, text=None, umask=-1, pipesize=-1):
```

But more informative is the `Popen` documentation:
```python
subprocess.Popen(args, bufsize=-1, executable=None, stdin=None, stdout=None,
                 stderr=None, preexec_fn=None, close_fds=True, shell=False,
                 cwd=None, env=None, universal_newlines=None, startupinfo=None,
                 creationflags=0, restore_signals=True, start_new_session=False,
                 pass_fds=(), *, group=None, extra_groups=None, user=None,
                 umask=-1, encoding=None, errors=None, text=None)
```

Execute a child program in a new process. On POSIX, the class uses os.execvp()-like behavior to execute the child program. On Windows, the class uses the Windows CreateProcess() function. The arguments to Popen are as follows.
Understanding the remaining documentation on `Popen` will be left as an exercise for the reader.
As of Python 3.7.0, released on June 27th 2018 (https://docs.python.org/3/whatsnew/3.7.html), you can achieve your desired result in the most powerful while equally simple way. This answer intends to show you the essential summary of various options in a short manner. For in-depth answers, please see the other ones.
TL;DR in 2021
The big advantage of `os.system(...)` was its simplicity. `subprocess` is better and still easy to use, especially as of Python 3.5.
```python
import subprocess
subprocess.run("ls -a", shell=True)
```

Note: This is the exact answer to your question - running a command like in a shell.
Preferred Way
If possible, remove the shell overhead and run the command directly (requires a list).
```python
import subprocess
subprocess.run(["help"])
subprocess.run(["ls", "-a"])
```

Pass program arguments in a list. Don't include `\"`-escaping for arguments containing spaces.
Advanced Use Cases
Checking The Output
The following code speaks for itself:
```python
import subprocess

result = subprocess.run(["ls", "-a"], capture_output=True, text=True)
if "stackoverflow-logo.png" in result.stdout:
    print("You're a fan!")
else:
    print("You're not a fan?")
```

`result.stdout` is all normal program output excluding errors. Read `result.stderr` to get them.
`capture_output=True` - turns capturing on. Otherwise `result.stderr` and `result.stdout` would be `None`. Available from Python 3.7.
`text=True` - a convenience argument added in Python 3.7 which converts the received binary data to Python strings you can easily work with.
Checking the returncode
Do
```python
if result.returncode == 127:
    print("The program failed for some weird reason")
elif result.returncode == 0:
    print("The program succeeded")
else:
    print("The program failed unexpectedly")
```

If you just want to check if the program succeeded (returncode == 0) and otherwise throw an Exception, there is a more convenient function:
```python
result.check_returncode()
```

But it's Python, so there's an even more convenient argument `check` which does the same thing automatically for you:
```python
result = subprocess.run(..., check=True)
```

stderr should be inside stdout
You might want to have all program output inside stdout, even errors. To accomplish this, run
```python
result = subprocess.run(..., stderr=subprocess.STDOUT)
```

`result.stderr` will then be `None` and `result.stdout` will contain everything.
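Here is a complete sketch of that merge, using the Python interpreter itself as the child so nothing platform-specific is needed:

```python
import subprocess
import sys

# The child writes one line to stdout and one to stderr;
# stderr=subprocess.STDOUT folds both streams into result.stdout.
code = "import sys; print('out'); print('err', file=sys.stderr)"
result = subprocess.run(
    [sys.executable, "-c", code],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
```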
Using shell=False with an argument string
`shell=False` expects a list of arguments. You might, however, split an argument string on your own using shlex.
```python
import subprocess
import shlex
subprocess.run(shlex.split("ls -a"))
```

That's it.
Common Problems
Chances are high you just started using Python when you come across this question. Let's look at some common problems.
FileNotFoundError: [Errno 2] No such file or directory: 'ls -a': 'ls -a'
You're running a subprocess without `shell=True`. Either use a list (`["ls", "-a"]`) or set `shell=True`.
TypeError: [...] NoneType [...]
Check that you've set `capture_output=True`.
TypeError: a bytes-like object is required, not [...]
You always receive byte results from your program. If you want to work with it like a normal string, set `text=True`.
subprocess.CalledProcessError: Command '[...]' returned non-zero exit status 1.
Your command didn't run successfully. You could disable returncode checking or check your actual program's validity.
TypeError: __init__() got an unexpected keyword argument [...]
You're likely using a version of Python older than 3.7.0; update it to the most recent one available. Otherwise there are other answers in this Stack Overflow post showing you older alternative solutions.
`os.system` is OK, but kind of dated. It's also not very secure. Instead, try `subprocess`. `subprocess` does not call sh directly and is therefore more secure than `os.system`.
Get more information here.
There is also Plumbum
```python
>>> from plumbum import local
>>> ls = local["ls"]
>>> ls
LocalCommand(<LocalPath /bin/ls>)
>>> ls()
u'build.py\ndist\ndocs\nLICENSE\nplumbum\nREADME.rst\nsetup.py\ntests\ntodo.txt\n'
>>> notepad = local["c:\\windows\\notepad.exe"]
>>> notepad()    # Notepad window pops up
u''              # Notepad window is closed by user, command returns
```
Use:
```python
import os
cmd = 'ls -al'
os.system(cmd)
```

os - This module provides a portable way of using operating system-dependent functionality.
For more `os` functions, here is the documentation.
It can be this simple:
```python
import os
cmd = "your command"
os.system(cmd)
```

There is another difference here which is not mentioned previously.
subprocess.Popen executes the <command> as a subprocess. In my case, I need to execute file <a> which needs to communicate with another program, <b>.
I tried subprocess, and execution was successful. However, <b> could not communicate with <a>. Everything is normal when I run both from the terminal.
One more: (NOTE: kwrite behaves different from other applications. If you try the below with Firefox, the results will not be the same.)
If you try `os.system("kwrite")`, program flow freezes until the user closes kwrite. To overcome that I tried instead `os.system("konsole -e kwrite")`. This time the program continued to flow, but kwrite became the subprocess of the console.
The aim is to run kwrite without it being a subprocess (i.e., in the system monitor it must appear at the leftmost edge of the tree).
`os.system` does not allow you to store results, so if you want to store results in some list or something, `subprocess.call` works.
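For instance, a sketch of storing a command's output in a list (using the interpreter as the child so the output is predictable):

```python
import subprocess
import sys

# check_output captures stdout; splitlines() turns it into a list
output = subprocess.check_output(
    [sys.executable, "-c", "print('one'); print('two')"],
    text=True,
)
lines = output.splitlines()  # ['one', 'two']
```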
`subprocess.check_call` is convenient if you don't want to test return values. It throws an exception on any error.
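A short sketch of both outcomes (using the interpreter to produce a failing exit code):

```python
import subprocess
import sys

# Succeeds: check_call returns 0 and raises nothing
ok = subprocess.check_call([sys.executable, "-c", "pass"])

# Fails: raises CalledProcessError instead of returning 1
try:
    subprocess.check_call([sys.executable, "-c", "raise SystemExit(1)"])
    failed_rc = None
except subprocess.CalledProcessError as exc:
    failed_rc = exc.returncode
```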
I tend to use `subprocess` together with `shlex` (to handle escaping of quoted strings):
```python
>>> import subprocess, shlex
>>> command = 'ls -l "/your/path/with spaces/"'
>>> call_params = shlex.split(command)
>>> print(call_params)
['ls', '-l', '/your/path/with spaces/']
>>> subprocess.call(call_params)
```
I wrote a library for this, `shell.py`.
It's basically a wrapper for popen and shlex for now. It also supports piping commands, so you can chain commands easier in Python. So you can do things like:
```python
ex('echo hello shell.py') | "awk '{print $2}'"
```
Under Linux, in case you would like to call an external command that will execute independently (will keep running after the Python script terminates), you can use a simple queue astask spooler or theat command.
An example with task spooler:
```python
import os
os.system('ts <your-command>')
```

Notes about task spooler (ts):
You could set the number of concurrent processes to be run ("slots") with:
```
ts -S <number-of-slots>
```

Installing
`ts` doesn't require admin privileges. You can download and compile it from source with a simple `make`, add it to your path, and you're done.
Invoke is a Python (2.7 and 3.4+) task execution tool and library. It provides a clean, high-level API for running shell commands:
```python
>>> from invoke import run
>>> cmd = "pip install -r requirements.txt"
>>> result = run(cmd, hide=True, warn=True)
>>> print(result.ok)
True
>>> print(result.stdout.splitlines()[-1])
Successfully installed invocations-0.13.0 pep8-1.5.7 spec-1.3.1
```
In Windows you can just import the `subprocess` module and run external commands by calling `subprocess.Popen()`, `subprocess.Popen().communicate()` and `subprocess.Popen().wait()` as below:
```python
# Python script to run a command line
import subprocess

def execute(cmd):
    """
    Purpose  : To execute a command and return exit status
    Argument : cmd - command to execute
    Return   : exit_code
    """
    process = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    (result, error) = process.communicate()
    rc = process.wait()
    if rc != 0:
        print("Error: failed to execute command:", cmd)
        print(error)
    return result
# def

command = "tasklist | grep python"
print("This process detail: \n", execute(command))
```

Output:
```
This process detail:
python.exe  604 RDP-Tcp#0  4  5,660 K
```
You can use Popen, and then you can check the procedure's status:
```python
from subprocess import Popen

proc = Popen(['ls', '-l'])
if proc.poll() is None:
    proc.kill()
```

Check out `subprocess.Popen`.