Backend-Only
You can run Langflow in --backend-only mode to expose your Langflow app as an API without running the frontend UI.
Start Langflow in backend-only mode with python3 -m langflow run --backend-only. The terminal prints Welcome to ⛓ Langflow, and a blank window opens at http://127.0.0.1:7864/all.
Langflow will now serve requests to its API without the frontend running.
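If you want to confirm from a script that the backend is actually listening, a quick sketch like the one below just sends an HTTP request to the address printed at startup. This is only a reachability check under the assumption that any HTTP response (even a 404 for the bare path, since no frontend is served) means the API process is up.

```python
import requests

# Address printed when Langflow starts in backend-only mode.
BASE_URL = "http://127.0.0.1:7864"

try:
    # Any HTTP response means the backend is listening; the bare path
    # may return 404 because the frontend is not being served.
    response = requests.get(BASE_URL, timeout=5)
    print(f"Langflow backend reachable (HTTP {response.status_code})")
except requests.exceptions.ConnectionError:
    print("Langflow backend not reachable - is it running?")
```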
Prerequisites
- Langflow is installed.
- You have a flow built in the Langflow UI that you want to serve.
Download your flow's curl call
- Click API.
- Click curl > Copy code and save the code to your local machine. It will look something like this:
```bash
curl -X POST \
  "http://127.0.0.1:7864/api/v1/run/ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{"input_value": "message",
  "output_type": "chat",
  "input_type": "chat",
  "tweaks": {
    "Prompt-kvo86": {},
    "OpenAIModel-MilkD": {},
    "ChatOutput-ktwdw": {},
    "ChatInput-xXC4F": {}
}}'
```
Note the flow ID of ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef. You can find this ID in the UI as well to ensure you're querying the right flow.
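For reference, the URL in that curl command is just the run endpoint with your flow ID appended. A minimal sketch of how it's composed, using the host, port, and flow ID from the example above:

```python
# Compose the run endpoint used in the curl call above.
BASE_API_URL = "http://127.0.0.1:7864/api/v1/run"
FLOW_ID = "ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef"

# stream=false requests a single JSON response rather than a stream.
run_url = f"{BASE_API_URL}/{FLOW_ID}?stream=false"
print(run_url)
# http://127.0.0.1:7864/api/v1/run/ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef?stream=false
```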
Start Langflow in backend-only mode
- Stop Langflow with Ctrl+C.
- Start Langflow in backend-only mode with python3 -m langflow run --backend-only. The terminal prints Welcome to ⛓ Langflow, and a blank window opens at http://127.0.0.1:7864/all. Langflow will now serve requests to its API.
- Run the curl code you copied from the UI. You should get a result like this:
```json
{"session_id":"ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef:bf81d898868ac87e1b4edbd96c131c5dee801ea2971122cc91352d144a45b880","outputs":[{"inputs":{"input_value":"hi, are you there?"},"outputs":[{"results":{"result":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?"},"artifacts":{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI"},"messages":[{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI","component_id":"ChatOutput-ktwdw"}],"component_display_name":"Chat Output","component_id":"ChatOutput-ktwdw","used_frozen_result":false}]}]}
```
Again, note that the flow ID matches. Langflow is receiving your POST request, running the flow, and returning the result, all without running the frontend. Cool!
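Because the response is plain JSON, you can also pull out just the chat reply (and double-check that session_id starts with your flow ID) in a few lines. A minimal sketch, assuming you saved the curl output to a file named response.json with the same shape as the result above:

```python
import json

# response.json is assumed to hold the curl output shown above.
with open("response.json") as f:
    data = json.load(f)

# session_id is "<flow-id>:<session-hash>", so the prefix should match your flow ID.
print("Flow ID:", data["session_id"].split(":")[0])

# Drill down to the text produced by the Chat Output component.
print("Reply:", data["outputs"][0]["outputs"][0]["results"]["result"])
```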
Download your flow's Python API call
Instead of using curl, you can download your flow as a Python API call.
- Click API.
- Click Python API > Copy code and save the code to your local machine. The code will look something like this:
```python
import requests
from typing import Optional

BASE_API_URL = "http://127.0.0.1:7864/api/v1/run"
FLOW_ID = "ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef"
# You can tweak the flow by adding a tweaks dictionary
# e.g. {"OpenAI-XXXXX": {"model_name": "gpt-4"}}

def run_flow(message: str,
             flow_id: str,
             output_type: str = "chat",
             input_type: str = "chat",
             tweaks: Optional[dict] = None,
             api_key: Optional[str] = None) -> dict:
    """Run a flow with a given message and optional tweaks.

    :param message: The message to send to the flow
    :param flow_id: The ID of the flow to run
    :param tweaks: Optional tweaks to customize the flow
    :return: The JSON response from the flow
    """
    api_url = f"{BASE_API_URL}/{flow_id}"
    payload = {
        "input_value": message,
        "output_type": output_type,
        "input_type": input_type,
    }
    headers = None
    if tweaks:
        payload["tweaks"] = tweaks
    if api_key:
        headers = {"x-api-key": api_key}
    response = requests.post(api_url, json=payload, headers=headers)
    return response.json()

# Setup any tweaks you want to apply to the flow
message = "message"

print(run_flow(message=message, flow_id=FLOW_ID))
```
- Run your Python app:

```bash
python3 app.py
```
The result is similar to the curl call:
```text
{'session_id': 'ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef:bf81d898868ac87e1b4edbd96c131c5dee801ea2971122cc91352d144a45b880', 'outputs': [{'inputs': {'input_value': 'message'}, 'outputs': [{'results': {'result': "Arrr matey! What be yer message for this ol' pirate? Speak up or walk the plank!"}, 'artifacts': {'message': "Arrr matey! What be yer message for this ol' pirate? Speak up or walk the plank!", 'sender': 'Machine', 'sender_name': 'AI'}, 'messages': [{'message': "Arrr matey! What be yer message for this ol' pirate? Speak up or walk the plank!", 'sender': 'Machine', 'sender_name': 'AI', 'component_id': 'ChatOutput-ktwdw'}], 'component_display_name': 'Chat Output', 'component_id': 'ChatOutput-ktwdw', 'used_frozen_result': False}]}]}
```
Your Python app POSTs to your Langflow server, and the server runs the flow and returns the result.
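The run_flow() helper above already accepts tweaks and an api_key, so you can adjust components per request without editing the flow. Here's a short sketch of such a call, placed at the bottom of the same script; the component ID comes from the tweaks block in the curl example, while the gpt-4 override (taken from the code comment) and the LANGFLOW_API_KEY environment variable name are just illustrative.

```python
import os

# Assumes run_flow() and FLOW_ID from the script above are in scope.
# Component IDs come from the "tweaks" section of the copied curl call.
tweaks = {
    "OpenAIModel-MilkD": {"model_name": "gpt-4"},  # illustrative override
}

result = run_flow(
    message="hi, are you there?",
    flow_id=FLOW_ID,
    tweaks=tweaks,
    # Only needed if your server requires an API key; the variable name is illustrative.
    api_key=os.environ.get("LANGFLOW_API_KEY"),
)
print(result)
```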