I created an HTTP function app in Azure, and it was working well when I wasn't relying on additional packages such as psycopg2, azure.storage.blob, or pandas. Now I keep getting:
ModuleNotFoundError: No module named 'psycopg2'
[2025-03-03T12:56:56.243Z] No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).
When I publish it, it works but locally it doesn't. And I need to test locally against my local postgres db before testing against my deployed one.
I used a virtual environment and installed my packages using pip install -r requirements.txt
but I still get the same error.
This is my requirements.txt
azure-functions
azure-storage-blob
pandas
psycopg2-binary
openpyxl
1 Answer
I created a sample HTTP trigger function to test the Postgres DB locally, and I tested it successfully.
Make sure you have activated the virtual environment before running the function app:
.venv\Scripts\activate
Then reinstall the dependencies with pip install -r requirements.txt, and check whether psycopg2 is installed by running pip list.
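If pip list shows the package but the error persists, the function host is usually picking up a different interpreter than the venv. Here is a minimal sanity check (a sketch; the module list simply mirrors the requirements.txt above) to run inside the activated venv:

```python
import importlib.util
import sys

# The interpreter path should point inside .venv; if it doesn't,
# the virtual environment is not actually active.
print("Interpreter:", sys.executable)

for mod in ("psycopg2", "pandas", "azure.functions", "azure.storage.blob"):
    try:
        found = importlib.util.find_spec(mod) is not None
    except ModuleNotFoundError:
        # find_spec raises if a parent package (e.g. azure) is missing
        found = False
    print(f"{mod}: {'OK' if found else 'MISSING'}")
```

If the interpreter path points outside .venv, re-activate the environment and start the host again from the same shell.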
requirements.txt :
azure-functions
azure-storage-blob
pandas
psycopg2-binary
openpyxl
function_app.py :
import logging
import psycopg2
import pandas as pd
from azure.storage.blob import BlobServiceClient
import azure.functions as func
import os
import json
app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="upload-data", methods=["POST"])
def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        req_body = req.get_json()
        data = req_body.get("data")
        if not data:
            return func.HttpResponse("Invalid input: 'data' field is required.", status_code=400)

        conn = psycopg2.connect(os.getenv("PostgresConnectionString"))
        cur = conn.cursor()
        cur.execute("""
            CREATE TABLE IF NOT EXISTS people (
                id SERIAL PRIMARY KEY,
                name VARCHAR(100),
                age INT
            )
        """)

        # Insert only rows that are not already present
        for entry in data:
            name, age = entry.get("name"), entry.get("age")
            if name and age is not None:
                cur.execute("SELECT COUNT(*) FROM people WHERE name=%s AND age=%s", (name, age))
                if cur.fetchone()[0] == 0:
                    cur.execute("INSERT INTO people (name, age) VALUES (%s, %s)", (name, age))
        conn.commit()

        cur.execute("SELECT name, age FROM people")
        rows = cur.fetchall()
        df = pd.DataFrame(rows, columns=["Name", "Age"]).drop_duplicates()
        logging.info(f"Data fetched from Postgres:\n{df}")

        blob_service = BlobServiceClient.from_connection_string(os.getenv("AzureWebJobsStorage"))
        container_name = "kamcontainer"
        blob_name = "people_data.csv"
        blob_client = blob_service.get_blob_client(container=container_name, blob=blob_name)
        csv_data = df.to_csv(index=False)
        blob_client.upload_blob(csv_data, overwrite=True)

        cur.close()
        conn.close()
        logging.info("Data successfully inserted into Postgres and uploaded to Blob Storage")
        return func.HttpResponse("Success! Data inserted into Postgres and uploaded to Blob Storage.", status_code=200)
    except Exception as e:
        logging.error(f"Error: {e}")
        return func.HttpResponse(f"Internal Server Error: {e}", status_code=500)
local.settings.json :
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "<StorageConneString>",
"FUNCTIONS_WORKER_RUNTIME": "python",
"PostgresConnectionString": "postgresql://<UserName>:<Password>@<postgresName>.postgres.database.azure.com:5432/<DBName>"
}
}
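To test against a local Postgres before the deployed one (as the question asks), point PostgresConnectionString at the local server instead. A sketch with placeholder credentials:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "PostgresConnectionString": "postgresql://<LocalUser>:<LocalPassword>@localhost:5432/<LocalDBName>"
  }
}
```

UseDevelopmentStorage=true routes blob calls to the local Azurite storage emulator; keep a real storage connection string if you are not running Azurite.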
I sent a POST request to insert the data into Postgres and save that data to Blob Storage.
http://localhost:7071/api/upload-data
{
"data": [
{ "name": "Kamali", "age": 25 },
{ "name": "Vyshu", "age": 30 }
]
}
Output :
You can run the function using the func start command or by pressing Fn + F5.
Azure Blob Storage :
Activate the virtual environment (.venv\Scripts\activate), then reinstall dependencies with pip install -r requirements.txt, and run func start inside the activated environment. – Dasari Kamali Commented Mar 5 at 9:08