
python - Why is my FastAPI process being suspended, and how can I avoid this? - Stack Overflow


I'm working on a web app using FastAPI that uses ffmpeg to overlay audio onto video for the user. I'm running into an issue where, when I use subprocess.run(cmd), it automatically suspends the process running my FastAPI app. I can't figure out how to get the error logs to help deduce why this is, and I haven't found anything online talking about this.

@app.get("/overlay-audio/")
async def get_video(audio_file: str, forged_name: Annotated[str, Query()] = "default"):
    video_path = os.path.join(output_path, "sample.mp4")
    audio_path = os.path.join(output_path, audio_file)
    forged_path = os.path.join(output_path, forged_name + ".mp4")
    print("Video path: " + video_path)
    print("Audio path: " + audio_path)
    print("Output path: " + forged_path)

    # command to recreate
    # ffmpeg -i input.mp4 -i input.wav -c:v copy -map 0:v:0 -map 1:a:0 -c:a aac -b:a 192k output.mp4

    cmd = ["/opt/homebrew/bin/ffmpeg", 
           "-i", video_path,
           "-i", audio_path,
           "-c:v", "copy",
           "-map", "0:v:0",
           "-map", "1:a:0",
           "-c:a", "aac",
           "-b:a", "192k",
           forged_path]
    
    subprocess.run(cmd)
           
    return {"forged_vid": f"forged_{forged_name}"}


if __name__ == "__main__":
    uvicorn.run("main:app", host="127.0.0.1", port=8000, reload=True)

I've tried suppressing output to the terminal, as I've read that could be a reason for the suspension, using result = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE), and I've also tried running it asynchronously to avoid blocking the event loop using

result = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE
    )

but nothing works. Any help or other possible ways to go about this would be greatly appreciated. Terminal output about the suspension:

[1] + 12526 suspended (tty output) "/Users//Tech Projects/project/tts/videnv/bin/python"


asked Jan 19 at 1:24 by blermen; edited Jan 19 at 20:08 by S.B

2 Answers

subprocess.run blocks the current process. In other words, your script will stop on that line until the run command has finished and the new process has terminated. This is documented behavior.

Using asyncio won't solve the problem because any function you try to run with asyncio will also run in the current process. The same blocking behavior will occur in that case.

What you need to do is to replace subprocess.run(cmd) with something like this:

process2 = subprocess.Popen(cmd)  # returns immediately; the child runs concurrently

The two processes will now run concurrently. The object process2 will be an instance of subprocess.Popen, and provides methods (such as poll(), wait(), and communicate()) that you can use to interact with the new process.

I can't run your program so I can't be more specific. But the documentation of the subprocess module contains a lot of useful information.
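As a sketch of that approach (using a short placeholder command in place of ffmpeg, since the question's paths aren't reproducible here), note that redirecting the child's output away from the controlling terminal may also help with the "suspended (tty output)" symptom, which is the shell stopping a backgrounded job that tries to write to the tty:

```python
import subprocess

# Launch the child without blocking the caller. Sending stdout/stderr to
# DEVNULL keeps the child from writing to the controlling terminal, which
# can suspend a backgrounded job (SIGTTOU).
proc = subprocess.Popen(
    ["echo", "hello"],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)

# Later, check whether it has finished (poll() returns None while running)...
status = proc.poll()

# ...or block until it exits and collect the return code.
code = proc.wait()
print(code)  # 0 on success
```

In a request handler you would typically store `proc` somewhere (or hand it to a background task) rather than calling wait() inline, since wait() blocks just like subprocess.run().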

  • If you define your path operation function as async def, it's gonna be running in the same thread as the event loop. This means you must not have any blocking calls in it. Instead you should use the async version of functions in libraries.

  • If you define your path operation function as def, it's gonna be running in a thread pool executor. You can have blocking calls there, and you're ensured they won't block the main thread on which the event loop listens for incoming connections. (Not to mention that you cannot use await inside normal def functions.)

The above means you have two solutions. Either define your path operation function as def and run your subprocess with subprocess.run() (not the recommended solution), or define it as async def and use await asyncio.create_subprocess_exec(), probably followed by await proc.wait().
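A minimal sketch of the recommended async def route, pulled out into a standalone coroutine (the helper name and the echo command are illustrative, not from the question):

```python
import asyncio

async def overlay_audio(cmd: list[str]) -> int:
    # create_subprocess_exec hands control back to the event loop while the
    # child runs, so other requests keep being served.
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    # communicate() drains both pipes and waits for the child to exit,
    # which avoids a full pipe buffer blocking the child.
    stdout, stderr = await proc.communicate()
    return proc.returncode

# Standalone usage; inside an async def endpoint you'd just
# `await overlay_audio([...])`.
rc = asyncio.run(overlay_audio(["echo", "ok"]))
print(rc)  # 0
```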

Although you mentioned the second approach didn't work for you, I don't think that's the case. Here is a simple proof demonstrating that create_subprocess_exec does give control back to the event loop:

import asyncio

async def just_run() -> None:
    # Keeps printing forever; if the event loop were blocked,
    # this output would stop.
    while True:
        print("just printing")
        await asyncio.sleep(0.5)

async def main() -> None:
    t = asyncio.create_task(just_run())

    # Each subprocess runs without blocking the loop: "just printing"
    # keeps appearing every 0.5 s while `sleep 2` runs.
    proc = await asyncio.create_subprocess_exec("sleep", "2")
    await proc.wait()

    proc = await asyncio.create_subprocess_exec("ls")
    await proc.wait()

    proc = await asyncio.create_subprocess_exec("sleep", "2")
    await proc.wait()

if __name__ == "__main__":
    asyncio.run(main())

I guess it has something to do with the PIPE you're writing to. Test it without redirecting stdout to PIPE and see if it makes any difference, or try reading the data from the pipe afterwards by calling await proc.stdout.readline() in a loop and see the difference.
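That readline loop might look like this (the helper name is hypothetical, and a printf command stands in for ffmpeg; note that ffmpeg writes its progress to stderr, hence merging it into stdout here):

```python
import asyncio

async def run_and_stream(cmd: list[str]) -> list[str]:
    # Read the child's output line by line so the pipe buffer never fills
    # up; a full, unread pipe would block the child process.
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,  # merge stderr into stdout
    )
    lines = []
    while True:
        line = await proc.stdout.readline()
        if not line:  # empty bytes means EOF: the child closed the pipe
            break
        lines.append(line.decode().rstrip())
    await proc.wait()
    return lines

out = asyncio.run(run_and_stream(["printf", "a\nb\n"]))
print(out)  # ['a', 'b']
```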
