Comment by KraftyOne

1 month ago

Yeah, you can launch workflows directly from an HTTP handler. So here's some code that idempotently launches a background task from a FastAPI endpoint:

    from dbos import DBOS, SetWorkflowID

    @app.get("/background/{task_id}/{n}")
    def launch_background_task(task_id: str, n: int) -> None:
        # background_task is assumed to be a @DBOS.workflow() function defined elsewhere
        with SetWorkflowID(task_id):  # Set an idempotency key
            DBOS.start_workflow(background_task, n)  # Start the workflow in the background

Does that answer your question?

Not OP, but I don't think that's it.

Suppose you have an existing Postgres-backed CRUD app with its own Postgres transactions, and you want to add a feature that launches a workflow _atomically_ from within one of those existing transactions. Can you do that? (I.e. can the DBOS transaction be nested inside a transaction defined outside the DBOS library?)

  • Got it! I'm not sure if that was what OP was asking, but it's a really interesting question.

    We don't currently support launching a workflow atomically from within an existing database transaction. I'd love to learn about the use case for that!

    We do support calling a database transaction as a workflow step, which executes entirely atomically and exactly-once: https://docs.dbos.dev/python/tutorials/transaction-tutorial
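
    For example, a rough sketch of a transaction step inside a workflow (the `uploads` table and the function names here are just illustrative, not taken from the docs):

        from dbos import DBOS
        from sqlalchemy import text

        @DBOS.transaction()
        def record_upload(filename: str) -> None:
            # Runs inside a Postgres transaction managed by DBOS; the step's completion
            # is recorded in that same transaction, so it executes exactly once even if
            # the surrounding workflow is interrupted and retried.
            DBOS.sql_session.execute(
                text("INSERT INTO uploads (filename) VALUES (:filename)"),
                {"filename": filename},
            )

        @DBOS.workflow()
        def process_upload(filename: str) -> None:
            record_upload(filename)
            # ... further steps (e.g., actually processing the file) go here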

    • In the apps I've written, the user interaction with the API is generally synchronous and has some immediate effect. For example, on a file upload, the file is committed and guaranteed accessible by the time the HTTP success response is sent, giving the system strong causal consistency. Within that same transaction I enqueue the related background task (e.g. processing the file), so we never get an uploaded file with no associated background task (or vice versa).

      (The background task may involve its own transactions when dequeued later, and spawn further background tasks, etc)
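
      Roughly, the pattern looks like this (a sketch using SQLAlchemy; the `files` and `background_tasks` tables are hypothetical):

          from sqlalchemy import create_engine, text

          engine = create_engine("postgresql://localhost/myapp")

          def upload_file(name: str, contents: bytes) -> None:
              with engine.begin() as conn:  # one transaction, committed on exit
                  # Commit the file itself...
                  conn.execute(
                      text("INSERT INTO files (name, contents) VALUES (:n, :c)"),
                      {"n": name, "c": contents},
                  )
                  # ...and enqueue its processing task in the same transaction, so the
                  # file and its task are created together or not at all.
                  conn.execute(
                      text("INSERT INTO background_tasks (kind, file_name) VALUES ('process_file', :n)"),
                      {"n": name},
                  )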

  • It's kind of weird to do that, though; it mixes concerns. Why should launching an operation be coupled to a specific db integration?